Inter-battery Topic Representation Learning

Cheng Zhang, Hedvig Kjellström, Carl Henrik Ek

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

In this paper, we present the Inter-Battery Topic Model (IBTM). Our approach extends traditional topic models by learning a factorized latent variable representation. The structured representation leads to a model that marries the benefits traditionally associated with a discriminative approach, such as feature selection, with those of a generative model, such as principled regularization and the ability to handle missing data. The factorization is provided by representing data in terms of aligned pairs of observations as different views. This provides a means for selecting a representation that separately models topics present in both views from topics unique to a single view. This structured consolidation allows for efficient and robust inference and provides a compact and efficient representation. Learning is performed in a Bayesian fashion by maximizing a rigorous bound on the log-likelihood. We first illustrate the benefits of the model on a synthetic dataset. The model is then evaluated in both uni- and multi-modality settings on two different classification tasks using off-the-shelf convolutional neural network (CNN) features, achieving state-of-the-art results with extremely compact representations.
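The shared/private factorization the abstract describes can be illustrated with a toy generative sketch. The code below is not the authors' IBTM or its inference procedure; it is a minimal, hypothetical two-view model in which each aligned document pair mixes "shared" topics, visible in both views, with "private" topics unique to one view. All dimensions, priors, and the 50/50 mixing weight are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the authors' IBTM): a two-view topic
# factorization with topics shared across views and topics private to
# each view. Vocabulary size, topic counts, and priors are arbitrary.
rng = np.random.default_rng(0)

V = 50            # vocabulary size per view (hypothetical)
K_shared = 4      # topics common to both views
K_private = 2     # topics unique to each view

# Topic-word distributions: shared topics get one matrix per view,
# private topics belong to a single view only.
shared_a = rng.dirichlet(np.ones(V), K_shared)    # shared topics, view A
shared_b = rng.dirichlet(np.ones(V), K_shared)    # shared topics, view B
private_a = rng.dirichlet(np.ones(V), K_private)  # private to view A
private_b = rng.dirichlet(np.ones(V), K_private)  # private to view B

def sample_document(n_words=100):
    """Sample an aligned pair of bag-of-words count vectors (view A, view B)."""
    theta_shared = rng.dirichlet(np.ones(K_shared))  # shared proportions
    theta_a = rng.dirichlet(np.ones(K_private))      # private proportions, view A
    theta_b = rng.dirichlet(np.ones(K_private))      # private proportions, view B

    # Each view mixes its shared and private topics (50/50 here, arbitrary);
    # the result is a valid word distribution over the vocabulary.
    word_dist_a = 0.5 * theta_shared @ shared_a + 0.5 * theta_a @ private_a
    word_dist_b = 0.5 * theta_shared @ shared_b + 0.5 * theta_b @ private_b
    return (rng.multinomial(n_words, word_dist_a),
            rng.multinomial(n_words, word_dist_b))

doc_a, doc_b = sample_document()
```

Because `theta_shared` appears in both views while `theta_a` and `theta_b` do not, correlations between the two views of a pair are explained only by the shared topics, which is the separation the factorized representation is designed to recover.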
Original language: English
Title of host publication: Computer Vision - ECCV 2016 - 14th European Conference, Amsterdam, The Netherlands, October 11-14, 2016, Proceedings, Part VIII
Editors: Bastian Leibe, Jiri Matas, Nicu Sebe, Max Welling
Publisher: Springer
Pages: 210-226
Number of pages: 17
Volume: 9912
ISBN (Print): 9783319464831
DOIs
Publication status: Published - 17 Sep 2016

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer

Keywords

  • Factorized Representation
  • Topic Model
  • Multi-View Model
  • CNN Feature
  • Image Classification

