Class-specific nonlinear subspace learning based on optimized class representation

Alexandros Iosifidis, Anastasios Tefas, Ioannis Pitas

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

In this paper, a new nonlinear subspace learning technique for class-specific data representation based on an optimized class representation is described. An iterative optimization scheme is formulated in which both the optimal nonlinear data projection and the optimal class representation are determined at each optimization step. The approach is evaluated on human face and action recognition problems, where its performance is compared with that of the standard class-specific subspace learning approach, as well as with other nonlinear discriminant subspace learning techniques. Experimental results demonstrate the effectiveness of the new approach: it consistently outperforms the standard one and outperforms the other nonlinear discriminant subspace learning techniques in most cases.
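The abstract's alternating scheme can be illustrated with a minimal linear sketch (the paper itself works with a nonlinear, kernel-based projection). The idea assumed here: fix a class representation vector, solve a class-specific discriminant problem (out-of-class scatter over in-class scatter, both measured around that vector), then re-estimate the representation and repeat. The update rule for the representation below (picking the class sample nearest the projected class centroid) is a hypothetical stand-in, not the paper's actual optimized update.

```python
import numpy as np
from scipy.linalg import eigh

def class_specific_projection(X, y, target, m, n_dims=2):
    """Given a class representation m, find the projection W maximizing
    out-of-class scatter over in-class scatter, both computed around m."""
    Xp = X[y == target]          # samples of the class of interest
    Xn = X[y != target]          # all other samples
    Sp = (Xp - m).T @ (Xp - m)   # in-class scatter around m
    Sn = (Xn - m).T @ (Xn - m)   # out-of-class scatter around m
    # Generalized eigenproblem Sn w = lambda Sp w; small ridge keeps
    # Sp positive definite. eigh returns ascending eigenvalues, so the
    # top eigenvectors are the last columns.
    vals, vecs = eigh(Sn, Sp + 1e-6 * np.eye(X.shape[1]))
    return vecs[:, ::-1][:, :n_dims]

def iterative_class_specific(X, y, target, n_dims=2, n_iters=10):
    """Alternate between solving for the projection and re-estimating
    the class representation (initialised at the class mean)."""
    m = X[y == target].mean(axis=0)
    for _ in range(n_iters):
        W = class_specific_projection(X, y, target, m, n_dims)
        # Hypothetical representation update: choose the class sample
        # whose projection lies closest to the projected class centroid.
        Xp = X[y == target]
        proj = Xp @ W
        centre = proj.mean(axis=0)
        m = Xp[np.argmin(np.linalg.norm(proj - centre, axis=1))]
    return W, m

# Usage on synthetic two-class data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (30, 5)),
               rng.normal(3.0, 1.0, (30, 5))])
y = np.array([0] * 30 + [1] * 30)
W, m = iterative_class_specific(X, y, target=0, n_dims=2)
```

In the kernel (nonlinear) setting described in the paper, the scatter matrices would be formed in a feature space via a kernel matrix, but the alternating structure is the same.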
Original language: English
Title of host publication: 2015 23rd European Signal Processing Conference (EUSIPCO)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 2491-2495
Number of pages: 5
ISBN (Electronic): 9780992862633
ISBN (Print): 9781479988518
DOIs
Publication status: Published - 28 Dec 2015
Event: 23rd European Signal Processing Conference, EUSIPCO 2015 - Nice, France
Duration: 31 Aug 2015 - 4 Sep 2015

Publication series

Name: Proceedings of the European Signal Processing Conference (EUSIPCO)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN (Print): 2219-5491

Conference

Conference: 23rd European Signal Processing Conference, EUSIPCO 2015
Country/Territory: France
City: Nice
Period: 31/08/15 - 4/09/15

Keywords

  • Class-specific discriminant learning
  • Nonlinear subspace learning
  • Action recognition
  • Face recognition
