Analysis of SVM with Indefinite Kernels

Yiming Ying, I. C. G. Campbell, Mark Girolami

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

45 Citations (Scopus)


The recent introduction of indefinite SVM by Luss and d'Aspremont [15] has effectively demonstrated SVM classification with a non-positive semi-definite kernel (an indefinite kernel). This paper studies the properties of the objective function introduced there. In particular, we show that the objective function is continuously differentiable and that its gradient can be computed explicitly; indeed, we further show that its gradient is Lipschitz continuous. The main idea behind our analysis is that, in its saddle (min-max) representation, the objective function is smoothed by the penalty term measuring the distance between the indefinite kernel matrix and the proxy positive semi-definite one. This elementary result greatly facilitates the application of gradient-based algorithms. Based on our analysis, we further develop Nesterov's smooth optimization approach [17, 18] for indefinite SVM, which has an optimal convergence rate for smooth problems. Experiments on various benchmark datasets validate our analysis and demonstrate the efficiency of our proposed algorithms.
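
To make the "explicit gradient" claim concrete, the sketch below assumes the saddle formulation of Luss and d'Aspremont: maximize over alpha the value min_{K PSD} [alpha'e - (1/2) alpha' diag(y) K diag(y) alpha + rho ||K - K0||_F^2], where K0 is the given indefinite kernel matrix and rho > 0 the penalty parameter. Under that formulation the inner minimization has the closed-form solution K*(alpha) = proj_PSD(K0 + vv'/(4 rho)) with v = diag(y) alpha, and Danskin's theorem gives the gradient e - diag(y) K*(alpha) diag(y) alpha. This is a minimal illustrative sketch, not the authors' code; the function names are hypothetical.

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone by eigenvalue thresholding."""
    w, V = np.linalg.eigh(M)
    return (V * np.maximum(w, 0.0)) @ V.T

def indefinite_svm_objective(alpha, K0, y, rho):
    """Smoothed objective f(alpha) and its explicit gradient.

    The inner problem  min_{K >= 0} -0.5 * v'Kv + rho * ||K - K0||_F^2,
    with v = diag(y) * alpha, is solved in closed form by projecting
    K0 + v v' / (4 rho) onto the PSD cone; Danskin's theorem then yields
    grad f(alpha) = e - diag(y) K*(alpha) diag(y) alpha.
    """
    v = y * alpha
    K_star = project_psd(K0 + np.outer(v, v) / (4.0 * rho))
    f = alpha.sum() - 0.5 * v @ K_star @ v \
        + rho * np.linalg.norm(K_star - K0, "fro") ** 2
    grad = 1.0 - y * (K_star @ v)
    return f, grad
```

Because the gradient is Lipschitz continuous, f can then be maximized over the feasible box {0 <= alpha <= C, y'alpha = 0} by projected gradient ascent or, as the paper proposes, by Nesterov's smooth optimization method at its optimal O(1/k^2) rate.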
Translated title of the contribution: Analysis of SVM with Indefinite Kernels
Original language: English
Title of host publication: NIPS 22
Pages: 2205-2213
Number of pages: 8
Publication status: Published - 2009

Bibliographical note

Conference Proceedings/Title of Journal: Advances in Neural Information Processing Systems
Conference Organiser: NIPS Foundation

