Kernel Reference Discriminant Analysis

Alexandros Iosifidis, Anastasios Tefas, Ioannis Pitas

Research output: Contribution to journal › Article (Academic Journal) › peer-review

25 Citations (Scopus)
270 Downloads (Pure)


Linear Discriminant Analysis (LDA) and its nonlinear version, Kernel Discriminant Analysis (KDA), are well-known and widely used techniques for supervised feature extraction and dimensionality reduction. They determine an optimal discriminant space for (non)linear data projection based on certain assumptions, e.g. normally distributed classes (either in the input or in the kernel space) and class representation by the corresponding class mean vectors. However, other vectors may serve as class representatives and increase class discrimination in the resulting feature space. In this paper, we propose an optimization scheme that seeks the optimal class representation, in terms of Fisher ratio maximization, for nonlinear data projection. Compared to the standard approach, the proposed optimization scheme increases class discrimination in the reduced-dimensionality feature space and achieves higher classification rates on publicly available data sets.
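The abstract refers to Fisher ratio maximization with classes represented by their mean vectors, which is the standard criterion the paper generalizes. A minimal NumPy sketch of that baseline criterion (the class-mean Fisher ratio and the closed-form LDA direction, not the paper's optimized class representations) might look like:

```python
import numpy as np

def fisher_ratio(X, y, w):
    """Fisher ratio of projection direction w: between-class scatter
    over within-class scatter, each class represented by its mean."""
    z = X @ w                                  # 1-D projections
    mu = z.mean()                              # overall projected mean
    classes = np.unique(y)
    Sb = sum(np.sum(y == c) * (z[y == c].mean() - mu) ** 2 for c in classes)
    Sw = sum(np.sum((z[y == c] - z[y == c].mean()) ** 2) for c in classes)
    return Sb / Sw

# Toy two-class data (illustrative only, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Closed-form LDA direction: w = Sw^{-1} (m1 - m0)
m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
Sw = np.cov(X[y == 0].T) * 49 + np.cov(X[y == 1].T) * 49
w_lda = np.linalg.solve(Sw, m1 - m0)

# The LDA direction yields a far larger Fisher ratio than the
# direction orthogonal to it.
w_perp = np.array([w_lda[1], -w_lda[0]])
print(fisher_ratio(X, y, w_lda), fisher_ratio(X, y, w_perp))
```

KDA applies the same criterion after a kernel-induced mapping; the paper's contribution is to replace the class mean vectors above with representatives optimized to maximize this ratio.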
Original language: English
Pages (from-to): 85-91
Number of pages: 7
Journal: Pattern Recognition Letters
Early online date: 9 Jul 2014
Publication status: Published - 1 Nov 2014


  • Kernel Discriminant Analysis
  • Kernel Spectral Regression
  • Class Representation


