TY - JOUR
T1 - Learning general Gaussian kernel hyperparameters of SVMs using optimization on symmetric positive-definite matrices manifold
AU - Laanaya, Hicham
AU - Abdallah, Fahed
AU - Snoussi, Hichem
AU - Richard, Cédric
PY - 2011/10/1
Y1 - 2011/10/1
N2 - We propose a new method for optimizing the hyperparameters of a general Gaussian kernel for support vector machine classification. The hyperparameters are constrained to lie on a differentiable manifold. The proposed optimization technique is based on a gradient-like descent algorithm adapted to the geometrical structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with the classical support vector machine for classification and with other state-of-the-art methods on toy data and on real-world data sets.
AB - We propose a new method for optimizing the hyperparameters of a general Gaussian kernel for support vector machine classification. The hyperparameters are constrained to lie on a differentiable manifold. The proposed optimization technique is based on a gradient-like descent algorithm adapted to the geometrical structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with the classical support vector machine for classification and with other state-of-the-art methods on toy data and on real-world data sets.
KW - General Gaussian kernel
KW - Kernel optimization
KW - Support vector machines
KW - Symmetric positive-definite matrices manifold
UR - http://www.scopus.com/inward/record.url?scp=79960257350&partnerID=8YFLogxK
U2 - 10.1016/j.patrec.2011.05.009
DO - 10.1016/j.patrec.2011.05.009
M3 - Article
AN - SCOPUS:79960257350
SN - 0167-8655
VL - 32
SP - 1511
EP - 1515
JO - Pattern Recognition Letters
JF - Pattern Recognition Letters
IS - 13
ER -