TY - JOUR
T1 - An Improved Training Algorithm for Nonlinear Kernel Discriminants
AU - Abdallah, Fahed
AU - Richard, Cédric
AU - Lengellé, Régis
PY - 2004
Y1 - 2004
N2 - A simple method for deriving nonlinear discriminants is to map the samples into a high-dimensional feature space F using a nonlinear function and then perform a linear discriminant analysis in F. Clearly, if F is a very high-dimensional, or even infinite-dimensional, space, designing such a receiver may be computationally intractable. However, using Mercer kernels, this problem can be solved without explicitly mapping the data to F. Recently, a powerful method for obtaining nonlinear kernel Fisher discriminants (KFDs) was proposed, and very promising results were reported in comparison with other state-of-the-art classification techniques. In this paper, we present an extension of the KFD method that is also based on Mercer kernels. Our approach, called the nonlinear kernel second-order discriminant (KSOD), consists of determining a nonlinear receiver by optimizing a general form of second-order performance measures. We also propose a complexity control procedure to improve the performance of these classifiers when few training data are available. Finally, simulations compare our approach with the KFD method. © 2004, IEEE. All rights reserved.
KW - Kernel Fisher discriminant
KW - learning machine
KW - second-order criteria
KW - support vector machines
UR - https://www.mendeley.com/catalogue/fa6607bd-bdf4-3fd4-983b-64e061503fcc/
U2 - 10.1109/TSP.2004.834346
DO - 10.1109/TSP.2004.834346
M3 - Article
SN - 1053-587X
VL - 52
SP - 2798
EP - 2806
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 10
ER -