Directing the Attention of a Wearable Camera by Pointing Gestures

Teofilo E. de Campos, Walterio Mayol-Cuevas, David W. Murray

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

5 Citations (Scopus)

Abstract

Wearable visual sensors provide views of the environment that are rich in information about the wearer's location, interactions and intentions. In the wearable domain, hand gesture recognition is the natural replacement for keyboard input. We describe a framework combining a coarse-to-fine method for shape detection with a 3D tracking method that can identify pointing gestures and estimate their direction. The low computational complexity of both methods allows a real-time implementation that is used to estimate the user's focus of attention and to control fast redirections of gaze of a wearable active camera. Experiments demonstrate the robustness of this system on long and noisy image sequences.
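The abstract does not reproduce the paper's detection or tracking algorithms, but the core geometry it relies on, turning a tracked 3D pointing gesture into camera redirection commands, can be sketched. The following is an illustrative sketch only, assuming two tracked 3D points (a hand base such as the wrist, and the fingertip) in the camera's coordinate frame; the function names and the pan/tilt convention are hypothetical, not the paper's control law.

```python
import math

def pointing_direction(hand_base, fingertip):
    """Unit 3D vector from the hand base (e.g. wrist) to the fingertip.

    Both arguments are (x, y, z) tuples in the same coordinate frame.
    """
    d = [f - b for f, b in zip(fingertip, hand_base)]
    norm = math.sqrt(sum(c * c for c in d))
    if norm == 0.0:
        raise ValueError("coincident points give no pointing direction")
    return [c / norm for c in d]

def pan_tilt(direction):
    """Pan and tilt angles (radians) that would align a camera whose
    optical axis is +z with the given unit direction.

    This is a simplified saccade target for an active camera, not the
    paper's actual controller.
    """
    x, y, z = direction
    pan = math.atan2(x, z)                    # rotation about the vertical axis
    tilt = math.atan2(y, math.hypot(x, z))    # elevation above the x-z plane
    return pan, tilt
```

For example, a fingertip directly along the optical axis of the camera yields zero pan and tilt, i.e. no redirection is needed, while a fingertip offset to the side produces a nonzero pan command.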
Translated title of the contribution: Directing the Attention of a Wearable Camera by Pointing Gestures
Original language: English
Title of host publication: Brazilian Symposium on Computer Graphics and Image Processing, SIBGRAPI
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Publication status: Published - 2006

Bibliographical note

Conference Proceedings/Title of Journal: Brazilian Symposium on Computer Graphics and Image Processing, SIBGRAPI
Other identifier: 2000557

