Rademacher chaos complexities for learning the kernel

Yiming Ying, Colin Campbell

Research output: Contribution to journal › Article (Academic Journal), peer-reviewed

26 Citations (Scopus)


In this paper we develop a novel generalization bound for the kernel learning problem. First, we show that the generalization analysis of kernel learning reduces to investigating the suprema of a Rademacher chaos process of order two over the candidate kernels, which we refer to as the Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity by well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our methodology mainly depends on the theory of U-processes and entropy integrals. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
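The empirical Rademacher chaos complexity discussed in the abstract can be approximated numerically. Below is a minimal Monte Carlo sketch (not the paper's method, and all function names are illustrative), assuming the order-two chaos complexity of a finite kernel family K is defined as U(K) = E_ε sup_{k∈K} |(1/n) Σ_{i≠j} ε_i ε_j k(x_i, x_j)|, with ε a vector of independent Rademacher (±1) signs, and using Gaussian kernels of different widths as the candidate set:

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_gram(X, sigma):
    """Gram matrix of the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def empirical_chaos_complexity(X, sigmas, n_draws=2000):
    """Monte Carlo estimate of the order-two Rademacher chaos complexity
    U(K) = E_eps sup_k |(1/n) sum_{i != j} eps_i eps_j k(x_i, x_j)|
    over a finite family of Gaussian kernels (one per value of sigma)."""
    n = X.shape[0]
    grams = [gaussian_gram(X, s) for s in sigmas]
    for G in grams:
        np.fill_diagonal(G, 0.0)  # the chaos sum excludes the diagonal terms i = j
    total = 0.0
    for _ in range(n_draws):
        eps = rng.choice([-1.0, 1.0], size=n)
        # eps @ G @ eps computes sum_{i != j} eps_i eps_j k(x_i, x_j)
        stats = [abs(eps @ G @ eps) / n for G in grams]
        total += max(stats)  # supremum over the candidate kernels
    return total / n_draws

X = rng.standard_normal((50, 3))
print(empirical_chaos_complexity(X, sigmas=[0.5, 1.0, 2.0]))
```

For a richer candidate class (e.g. all convex combinations of base kernels), the inner supremum would require an optimization step per draw rather than a maximum over a finite list; the paper's bounds control exactly this supremum via entropy integrals instead of simulation.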
Original language: English
Pages (from-to): 2858-2886
Number of pages: 28
Journal: Neural Computation
Issue number: 11
Publication status: Published - 2010


