Learning the Coordinate Gradients

Yiming Ying, Qiang Wu, Colin Campbell

Research output: Contribution to journal › Article (Academic Journal) › peer-review

9 Citations (Scopus)


Abstract

In this paper we study the problem of learning the gradient function with application to variable selection and determining variable covariation. Firstly, we propose a novel unifying framework for coordinate gradient learning from the perspective of multi-task learning. Various variable selection algorithms can be regarded as special instances of this framework. Secondly, we formulate the dual problems of gradient learning with general loss functions. This enables the direct application of standard optimization toolboxes to the case of gradient learning. For instance, gradient learning with SVM loss can be solved by quadratic programming (QP) routines. Thirdly, we propose a novel gradient learning algorithm which can be cast as learning the kernel matrix problem. Its relation with sparse regularization is highlighted. A semi-infinite linear programming (SILP) approach and an iterative optimization approach are proposed to efficiently solve this problem. Finally, we validate our proposed approaches on both synthetic and real datasets.
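To illustrate the idea of coordinate gradient learning for variable selection, the following sketch fits a single global gradient vector by a weighted least-squares fit of pairwise first-order Taylor residuals, then ranks coordinates by gradient magnitude. This is a much-simplified linear stand-in for the paper's RKHS-based, function-valued formulation; the bandwidth `s`, regularizer `lam`, and synthetic target here are illustrative choices, not the paper's settings.

```python
import numpy as np

# Toy sketch: the target depends only on the first two of five coordinates,
# so the learned gradient should be large in those coordinates and near
# zero elsewhere.
rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.uniform(-1, 1, size=(n, d))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + 0.05 * rng.standard_normal(n)

# Minimize sum_{i,j} w_ij * (y_j - y_i - g . (x_j - x_i))^2 + lam * ||g||^2
# over a single gradient vector g, with locality weights
# w_ij = exp(-||x_i - x_j||^2 / (2 s^2)) so only nearby pairs contribute.
s, lam = 0.5, 1e-3
A = lam * np.eye(d)
b = np.zeros(d)
for i in range(n):
    diff = X - X[i]                               # rows are x_j - x_i
    w = np.exp(-np.sum(diff**2, axis=1) / (2 * s**2))
    A += (w[:, None] * diff).T @ diff
    b += (w * (y - y[i])) @ diff

g = np.linalg.solve(A, b)                         # ridge solution for g
ranking = np.argsort(-np.abs(g))                  # coordinates by |g_k|
print("learned gradient:", np.round(g, 2))
print("selected variables:", sorted(ranking[:2]))  # should be [0, 1]
```

The closed-form ridge solve replaces the QP/SILP machinery the paper develops for general losses and kernel-matrix learning; with the squared loss and a constant (linear) gradient model, the problem reduces to a single linear system.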
Translated title of the contribution: Learning the Coordinate Gradients
Original language: English
Pages (from-to): 355-378
Journal: Advances in Computational Mathematics
Issue number: 3
Publication status: Published - Sept 2011

Bibliographical note

Author of Publication Reviewed: Yiming Ying, Qiang Wu and Colin Campbell

