EPIC-Tent: An Egocentric Video Dataset for Camping Tent Assembly

Research output: Contribution to conference › Paper

Original language: English
Number of pages: 9
Date accepted / in press: 21 Aug 2019
Date published (current): 2 Nov 2019
Event: The Fifth International Workshop on Egocentric Perception, Interaction and Computing, Seoul, Korea, Republic of
Duration: 2 Nov 2019 - 2 Nov 2019
Conference number: 5
Web address (URL): https://eyewear-computing.org/EPIC_ICCV19/

Conference

Conference: The Fifth International Workshop on Egocentric Perception, Interaction and Computing
Abbreviated title: EPIC@ICCV19
Conference number: 5
Country: Korea, Republic of
City: Seoul
Period: 2/11/19 - 2/11/19
Web address (URL): https://eyewear-computing.org/EPIC_ICCV19/
Degree of recognition: International event

Abstract

This paper presents an outdoor video dataset annotated with action labels, collected from 24 participants wearing two head-mounted cameras (a GoPro and an SMI eye tracker) while assembling a camping tent. In total, the dataset comprises 5.4 hours of recordings. Tent assembly involves manual interactions with non-rigid objects, through actions such as spreading the tent, securing guylines, reading instructions, and opening a tent bag. An interesting aspect of the dataset is that it reflects participants' proficiency in completing or understanding the task, which leads to differences between participants in action sequences and action durations. Our dataset, called EPIC-Tent, also has several new types of annotations for the two synchronised egocentric videos: task errors, self-rated uncertainty and gaze position, in addition to the task action labels. We present baseline results on EPIC-Tent using a state-of-the-art method for offline and online action recognition and detection.
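
As a minimal sketch of how such annotations might be organised, the hypothetical Python below models action segments (label, task-error flag, self-rated uncertainty) alongside timestamped gaze samples for a pair of synchronised videos, and looks up the gaze that falls within one segment. The actual EPIC-Tent annotation schema is not described on this page, so every field name, the uncertainty scale, and the file-path layout here are illustrative assumptions.

    # A minimal sketch, assuming a hypothetical annotation schema: the page
    # does not specify how EPIC-Tent stores its labels, so every field name,
    # the uncertainty scale, and the paths below are illustrative only.
    from dataclasses import dataclass, field


    @dataclass
    class ActionSegment:
        label: str         # task action label, e.g. "spread tent" (assumed name)
        start_s: float     # segment start time in seconds
        end_s: float       # segment end time in seconds
        error: bool        # whether a task error was annotated in this segment
        uncertainty: int   # self-rated uncertainty (assumed 1-5 scale)


    @dataclass
    class GazeSample:
        t_s: float         # timestamp in seconds
        x: float           # normalised gaze x position in [0, 1]
        y: float           # normalised gaze y position in [0, 1]


    @dataclass
    class Recording:
        participant_id: int
        gopro_video: str        # path to the GoPro head-mounted video
        eyetracker_video: str   # path to the synchronised SMI eye-tracker video
        segments: list[ActionSegment] = field(default_factory=list)
        gaze: list[GazeSample] = field(default_factory=list)


    def gaze_for_segment(rec: Recording, seg: ActionSegment) -> list[GazeSample]:
        """Return the gaze samples that fall inside one annotated action segment."""
        return [g for g in rec.gaze if seg.start_s <= g.t_s < seg.end_s]


    # Example: attach one annotated segment and query its gaze samples.
    rec = Recording(1, "P01_gopro.mp4", "P01_smi.mp4")
    rec.segments.append(ActionSegment("read instructions", 12.0, 30.5, False, 2))
    rec.gaze.append(GazeSample(15.2, 0.48, 0.61))
    print(len(gaze_for_segment(rec, rec.segments[0])))  # -> 1

Keeping gaze as a separate timestamped stream, rather than storing it per segment, mirrors the fact that the eye-tracker and action-label annotations come from two distinct but synchronised recordings.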

Structured keywords

  • Visual Perception

Event

The Fifth International Workshop on Egocentric Perception, Interaction and Computing

Abbreviated titleEPIC@ICCV19
Conference number5
Duration2 Nov 20192 Nov 2019
CitySeoul
CountryKorea, Republic of
Web address (URL)
Degree of recognitionInternational event

Event: Conference
