Abstract
The recent introduction of indefinite SVM by Luss and d'Aspremont [15] has effectively
demonstrated SVM classification with a kernel that is not positive semi-definite
(an indefinite kernel). This paper studies the properties of the objective function
introduced there. In particular, we show that the objective function is continuously
differentiable and that its gradient can be computed explicitly. Moreover, we show
that the gradient is Lipschitz continuous. The main idea behind our analysis is that,
in its saddle (min-max) representation, the objective function is smoothed by the
penalty term measuring the distance between the indefinite kernel matrix and
the proxy positive semi-definite one. This elementary result greatly facilitates the
application of gradient-based algorithms. Building on our analysis, we develop
Nesterov's smooth optimization approach [17, 18] for indefinite SVM, which enjoys
an optimal convergence rate for smooth problems. Experiments on various
benchmark datasets validate our analysis and demonstrate the efficiency of the
proposed algorithms.
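For concreteness, the saddle (min-max) representation referred to above can be sketched as follows, assuming the Luss–d'Aspremont formulation. The notation here (indefinite kernel matrix K₀, Y = diag(y), penalty weight ρ > 0, feasible set Q) is ours and is not fixed by the abstract:

```latex
% Sketch of the saddle (min-max) representation of indefinite SVM,
% assuming the Luss--d'Aspremont formulation: K_0 is the given indefinite
% kernel matrix, Y = diag(y), rho > 0 the penalty weight, and
% Q = { alpha : 0 <= alpha <= C, alpha^T y = 0 } the usual SVM feasible set.
\[
  \max_{\alpha \in Q} \; \min_{K \succeq 0} \;
  \alpha^\top e \;-\; \tfrac{1}{2}\,(Y\alpha)^\top K\,(Y\alpha)
  \;+\; \rho\,\lVert K - K_0 \rVert_F^2 .
\]
% The strongly convex inner problem has a unique closed-form minimizer,
\[
  K^*(\alpha) \;=\; \Bigl( K_0 + \tfrac{1}{4\rho}\,(Y\alpha)(Y\alpha)^\top \Bigr)_{+} ,
\]
% where (.)_+ projects onto the PSD cone by zeroing negative eigenvalues.
% Uniqueness of K^*(\alpha) then yields, via Danskin's theorem, the explicit gradient
\[
  \nabla f(\alpha) \;=\; e \;-\; Y K^*(\alpha)\, Y \alpha ,
\]
% whose Lipschitz continuity is what the paper establishes and what makes
% Nesterov's smooth optimization method applicable.
```

In this reading, the penalty term acts like a proximal smoothing term on the inner problem, which is the sense in which the abstract says the objective is "smoothed by the penalty term".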
| Translated title of the contribution | Analysis of SVM with Indefinite Kernels |
| --- | --- |
| Original language | English |
| Title of host publication | NIPS 22 |
| Pages | 2205-2213 |
| Number of pages | 8 |
| Volume | 22 |
| Publication status | Published - 2009 |
Bibliographical note
Conference Proceedings/Title of Journal: Advances in Neural Information Processing Systems
Conference Organiser: NIPS Foundation