Interaction between hand and wearable camera in 2D and 3D environments

WW Mayol, AJ Davison, BJ Tordoff, ND Molton, DW Murray

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

This paper is concerned with allowing the user of a wearable, portable vision system to interact with the visual information using hand movements and gestures. Two example scenarios are explored. The first, in 2D, uses the wearer's hand both to guide an active wearable camera and to highlight objects of interest using a grasping vector. The second is based in 3D, and builds on earlier work which recovers 3D scene structure at video-rate, allowing real-time purposive redirection of the camera to any scene point. Here, a range of hand gestures is used to highlight and select 3D points within the recovered structure, in this instance to insert 3D graphical objects into the scene. Structure recovery, gesture recognition, scene annotation and augmentation are achieved in parallel and at video-rate.
Translated title of the contribution: Interaction between hand and wearable camera in 2D and 3D environments
Original language: English
Title of host publication: British Machine Vision Conference, BMVC 2004, Kingston University, London, 7-9 September
Editors: A Hoppe, S Barman, T Ellis
Publisher: BMVA
Pages: 1-10
Number of pages: 10
ISBN (Print): 1901725251
Publication status: Published - Sep 2004

Bibliographical note

Other: http://www.cs.bris.ac.uk/Publications/pub_info.jsp?id=2000270

