A fine-grained perspective onto object interactions from first-person views

Dima Damen*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)


Abstract

This extended abstract summarises the works relevant to the keynote lecture at VISAPP 2019. The talk discusses understanding object interactions from wearable cameras, focusing on fine-grained understanding of interactions in realistic unbalanced datasets recorded in the wild.

Original language: English
Title of host publication: VISIGRAPP 2019 - Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications
Editors: Andreas Kerren, Christophe Hurter, Jose Braz
Publisher: SciTePress
Pages: 11-13
Number of pages: 3
ISBN (Electronic): 9789897583544
DOIs
Publication status: Published - 25 Feb 2019
Event: 10th International Conference on Information Visualization Theory and Applications, IVAPP 2019 - Part of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2019 - Prague, Czech Republic
Duration: 25 Feb 2019 - 27 Feb 2019

Publication series

Name: Advances in Intelligent Systems & Computing
Publisher: Springer Nature
ISSN (Print): 2194-5357
ISSN (Electronic): 2194-5365

Conference

Conference: 10th International Conference on Information Visualization Theory and Applications, IVAPP 2019 - Part of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2019
Country: Czech Republic
City: Prague
Period: 25/02/19 - 27/02/19

Keywords

  • Action anticipation
  • Action completion
  • Action recognition
  • Egocentric vision
  • EPIC-kitchens
  • Fine-grained recognition
  • First-person datasets
  • First-person vision
  • Object interaction recognition
  • Skill determination
  • Wearable cameras
