Class-specific Reference Discriminant Analysis with application in Human Behaviour Analysis

Ioannis Pitas, Alexandros Iosifidis, Anastasios Tefas

Research output: Contribution to journal › Article (Academic Journal) › peer-review

32 Citations (Scopus)
235 Downloads (Pure)

Abstract

In this paper, a novel nonlinear subspace learning technique for class-specific data representation is proposed. A novel data representation is obtained by applying a nonlinear class-specific projection to a discriminant feature space, in which samples of the class under consideration are mapped close to their class representation, while samples of the remaining classes are mapped as far from it as possible. The class is represented by an optimized class vector, enhancing class discrimination in the resulting feature space. An iterative optimization scheme is proposed to this end, in which both the optimal nonlinear data projection and the optimal class representation are determined at each optimization step. The proposed approach is evaluated on three problems relating to human behaviour analysis: face recognition, facial expression recognition and human action recognition. Experimental results demonstrate the effectiveness of the proposed approach, since the proposed Class-specific Reference Discriminant Analysis outperforms Kernel Discriminant Analysis, Kernel Spectral Regression and Class-specific Kernel Discriminant Analysis, as well as Support Vector Machine-based classification, in most cases.
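To illustrate the class-specific projection step described above, the following is a minimal sketch in Python/NumPy of a linear class-specific discriminant projection with a fixed reference vector (here simply the class mean). The paper's method additionally optimizes the class reference vector jointly with a kernel (nonlinear) projection in an iterative scheme; that joint optimization and the kernel mapping are not reproduced here, and the function name and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def class_specific_projection(X, y, target_class, n_dims=2, reg=1e-6):
    """Linear class-specific discriminant projection around a reference vector.

    Maximizes the out-of-class scatter while minimizing the in-class scatter,
    both measured around the reference vector m, by solving the generalized
    eigenproblem S_out w = lambda * S_in w.
    """
    pos = X[y == target_class]        # samples of the class under consideration
    neg = X[y != target_class]        # samples of all remaining classes
    m = pos.mean(axis=0)              # reference vector (class mean; the paper optimizes this)

    dp = pos - m
    dn = neg - m
    S_in = dp.T @ dp + reg * np.eye(X.shape[1])   # in-class scatter around m (regularized)
    S_out = dn.T @ dn                             # out-of-class scatter around m

    # Solve S_in^{-1} S_out w = lambda w and keep the leading eigenvectors.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(S_in, S_out))
    order = np.argsort(eigvals.real)[::-1][:n_dims]
    W = eigvecs[:, order].real
    return W, m

# Toy usage: project data so that class 0 clusters near its reference vector
# while the other classes are pushed away from it.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(20, 5)) for c in range(3)])
y = np.repeat(np.arange(3), 20)
W, m = class_specific_projection(X, y, target_class=0)
Z = (X - m) @ W                      # class-specific low-dimensional representation
```

In the full method sketched in the abstract, this projection step would alternate with an update of the reference vector itself, so that both the projection and the class representation improve the class-specific discrimination criterion at each iteration.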
Original language: English
Pages (from-to): 315-326
Journal: IEEE Transactions on Human-Machine Systems
Volume: 45
Issue number: 3
Early online date: 25 Dec 2014
DOIs
Publication status: Published - 1 Jun 2015

Keywords

  • Class-Specific Kernel Discriminant Analysis
  • Class-Specific Kernel Spectral Regression
  • Optimized Class Representation
  • Human-Computer Interaction
