Constructing Nonlinear Discriminants from Multiple Data Views

Tom Diethe*, David Roi Hardoon, John Shawe-Taylor

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

31 Citations (Scopus)

Abstract

There are many situations in which we have more than one view of a single data source, or in which we have multiple aligned sources of data. We would like to build classifiers that incorporate these views to enhance classification performance. Kernel Fisher Discriminant Analysis (KFDA) can be formulated as a convex optimisation problem, which we extend to the multiview setting (MFDA) and for which we introduce a sparse version (SMFDA). We show that our formulations are justified from both probabilistic and learning-theory perspectives. We then extend the optimisation problem to account for directions unique to each view (PMFDA). We show experimental validation on a toy dataset, and then give experimental results on a brain imaging dataset and on part of the PASCAL VOC 2007 challenge dataset.
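To make the starting point of the abstract concrete, the sketch below implements classical single-view, two-class KFDA in its dual form (the baseline the paper extends to the multiview setting). This is an illustrative sketch only, not the paper's MFDA/SMFDA/PMFDA formulation; the function names (`rbf_kernel`, `kfda_fit`, `kfda_predict`) and the RBF kernel choice are assumptions for the example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_fit(X, y, gamma=1.0, reg=1e-3):
    """Single-view kernel Fisher discriminant for two classes (0/1).

    Returns dual coefficients alpha and a bias b so that a new point x
    is scored as sum_i alpha_i k(x_i, x) + b.
    """
    K = rbf_kernel(X, X, gamma)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)  # kernelised class means
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter in the dual: N = sum_j K_j (I - 1/n_j) K_j^T
    N = np.zeros_like(K)
    for idx in (idx0, idx1):
        Kj = K[:, idx]
        n = len(idx)
        N += Kj @ (np.eye(n) - np.full((n, n), 1.0 / n)) @ Kj.T
    # Regularised Fisher direction in the dual space
    alpha = np.linalg.solve(N + reg * np.eye(len(y)), m1 - m0)
    b = -0.5 * alpha @ (m0 + m1)  # threshold midway between projected class means
    return alpha, b

def kfda_predict(X_train, alpha, b, X_new, gamma=1.0):
    # Label 1 if the projection lies on the class-1 side of the threshold
    return (rbf_kernel(X_new, X_train, gamma) @ alpha + b > 0).astype(int)
```

The multiview extension in the paper replaces this generalised-eigenvalue-style solution with a convex optimisation over per-view weight vectors; the sparse variant additionally penalises the dual coefficients to select few kernel basis functions.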

Original language: English
Title of host publication: MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT I
Editors: J. L. Balcazar, F. Bonchi, A. Gionis, M. Sebag
Place of Publication: Berlin
Publisher: Springer Berlin Heidelberg
Pages: 328-343
Number of pages: 16
Publication status: Published - 2010
Event: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD) - Barcelona, Spain
Duration: 20 Sep 2010 - 24 Sep 2010

Publication series

Name: Lecture Notes in Artificial Intelligence
Publisher: Springer-Verlag Berlin
Volume: 6321
ISSN (Print): 0302-9743

Conference

Conference: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)
Country: Spain
Period: 20/09/10 - 24/09/10

Keywords

  • Fisher Discriminant Analysis
  • Convex Optimisation
  • Multiview Learning
  • Kernel Methods
