Learning general Gaussian kernel hyperparameters of SVMs using optimization on symmetric positive-definite matrices manifold

Hicham Laanaya, Fahed Abdallah, Hichem Snoussi, Cédric Richard

Research output: Contribution to journal › Article › peer-review

10 Citations (Scopus)

Abstract

We propose a new method for optimizing the hyperparameters of a general Gaussian kernel for support vector machine classification. The hyperparameters are constrained to lie on a differentiable manifold. The proposed optimization technique is based on a gradient-like descent algorithm adapted to the geometric structure of the manifold of symmetric positive-definite matrices. We compare the performance of our approach with that of the classical support vector machine and other state-of-the-art methods on toy data and on real-world data sets.
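The record does not include the algorithm itself, so the following is only a minimal Python sketch, under stated assumptions, of the two ingredients the abstract names: a general Gaussian kernel parameterized by a symmetric positive-definite matrix M, k(x, y) = exp(-(x - y)ᵀ M (x - y)), and a gradient-like descent step that stays on the SPD manifold by retracting along the matrix exponential map at M. All function names, the step size, and the placeholder gradient are hypothetical illustrations, not the authors' implementation.

```python
# Minimal sketch (not the paper's implementation) of a general Gaussian kernel
# and a gradient-like step on the SPD manifold. Placeholder objective/gradient.
import numpy as np
from scipy.linalg import expm, sqrtm


def general_gaussian_kernel(X, Y, M):
    """Gram matrix of k(x, y) = exp(-(x - y)^T M (x - y)) for SPD M."""
    K = np.empty((X.shape[0], Y.shape[0]))
    for i, x in enumerate(X):
        for j, y in enumerate(Y):
            d = x - y
            K[i, j] = np.exp(-d @ M @ d)
    return K


def spd_exp_step(M, euclidean_grad, step=0.1):
    """One gradient-like descent step that keeps M symmetric positive-definite.

    Symmetrizes the (placeholder) Euclidean gradient to get a tangent direction G,
    then retracts along the exponential map on the SPD manifold:
        M_new = M^{1/2} expm(-step * M^{-1/2} G M^{-1/2}) M^{1/2},
    which is SPD by construction for any symmetric G and step size.
    """
    G = 0.5 * (euclidean_grad + euclidean_grad.T)  # tangent space of symmetric matrices
    M_half = np.real(sqrtm(M))
    M_half_inv = np.linalg.inv(M_half)
    inner = expm(-step * M_half_inv @ G @ M_half_inv)
    return M_half @ inner @ M_half


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 3))
    M = np.eye(3)                        # start at the identity (isotropic RBF, unit scale)
    grad = rng.standard_normal((3, 3))   # placeholder for d(objective)/dM
    M = spd_exp_step(M, grad)
    print(np.linalg.eigvalsh(M))         # eigenvalues remain strictly positive
    print(general_gaussian_kernel(X, X, M).shape)
```

The exponential-map retraction is what makes the update "adapted to the geometrical structure" of the manifold: unlike a plain additive gradient step, it cannot leave the SPD cone, so no projection or eigenvalue clipping is needed.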

Original language: English
Pages (from-to): 1511-1515
Number of pages: 5
Journal: Pattern Recognition Letters
Volume: 32
Issue number: 13
DOIs
Publication status: Published - 1 Oct 2011
Externally published: Yes

Keywords

  • General Gaussian kernel
  • Kernel optimization
  • Support vector machines
  • Symmetric positive-definite matrices manifold
