VideoHandles: Replicating gestures to search through action-camera video

Jarrod Knibbe, Sue Ann Seah, Mike Fraser

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

5 Citations (Scopus)


We present VideoHandles, a novel interaction technique that supports rapid review of wearable video camera data by re-performing gestures as search queries. The availability of wearable video capture devices has led to a significant increase in activity logging across a range of domains. However, searching through and reviewing footage for data curation can be a laborious and painstaking process. In this paper, we showcase the use of gestures as search queries to support review and navigation of video data. By exploring example self-captured footage across a range of activities, we propose two gesture-based video navigation styles: prospective gesture tagging and retrospective gesture searching. We describe VideoHandles' interaction design and motivation, and report the results of a pilot study.
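The paper does not include an implementation, but the retrospective style it describes amounts to matching a re-performed query gesture against the recorded footage. The sketch below is a minimal illustration of one plausible approach: comparing per-frame motion features with dynamic time warping (DTW) and ranking candidate video segments by distance. Everything here is an assumption for illustration, not the authors' method: the function names (`dtw_distance`, `search_video`), the sliding-window search, the stride, and the use of optical-flow-style motion vectors as features.

```python
import numpy as np

def dtw_distance(query: np.ndarray, window: np.ndarray) -> float:
    """Dynamic time warping distance between two motion traces.

    Each trace is an (n, d) array of per-frame motion features,
    e.g. mean optical-flow vectors (an assumed feature choice;
    the paper does not specify one).
    """
    n, m = len(query), len(window)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(query[i - 1] - window[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a query frame
                                 cost[i, j - 1],      # skip a video frame
                                 cost[i - 1, j - 1])  # align the two frames
    return cost[n, m]

def search_video(query: np.ndarray, video_features: np.ndarray,
                 stride: int = 15) -> list[tuple[int, float]]:
    """Slide the query over the footage and rank candidate matches.

    Returns (start_frame, distance) pairs, best match first.
    """
    win = len(query)
    scores = []
    for start in range(0, len(video_features) - win + 1, stride):
        d = dtw_distance(query, video_features[start:start + win])
        scores.append((start, d))
    return sorted(scores, key=lambda s: s[1])
```

Under this sketch, prospective gesture tagging would reuse the same matcher: a distinctive marker gesture performed at capture time becomes the stored query, and its matches serve as navigation bookmarks during review.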

Original language: English
Title of host publication: SUI 2014 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction
Publisher: Association for Computing Machinery (ACM)
Number of pages: 4
ISBN (Print): 9781450328203
Publication status: Published - 4 Oct 2014
Event: 2nd ACM Symposium on Spatial User Interaction, SUI 2014 - Honolulu, United States
Duration: 4 Oct 2014 – 5 Oct 2014


Conference: 2nd ACM Symposium on Spatial User Interaction, SUI 2014
Country/Territory: United States


