Head pose estimation for wearable robot control

Ben J Tordoff, Walterio Mayol-Cuevas, Teofilo de Campos, David W. Murray

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

Abstract

Recent advances in wearable sensing allow active control of the orientation of a body-mounted camera worn by a remote user. In this paper we consider controlling the active camera from head movements. In the context of teleoperation, these may be the head movements of a remote operator, perhaps acting as the wearer's assistant. The movements are likely to be larger than those in video-conferencing applications, so frontal facial features alone are insufficient. The paper presents a method which incrementally combines a fixed 3D shape model with specific features found on the observed head. Robust methods, including the incorporation of a colour model, are used to mitigate the effect of mismatches, the main contribution being the use of both interest-point and colour features within a single random-sampling framework.
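The key idea in the abstract, that point matches and colour cues can vote inside one random-sampling consensus loop, can be illustrated with a toy sketch. The code below is not the paper's algorithm: it uses synthetic 2D data, a rigid-motion model in place of 3D head pose, and invented thresholds, but it shows the joint-consensus mechanism, where a correspondence supports a sampled hypothesis only if it is both geometrically and photometrically consistent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "head" features: 2D points, each carrying an RGB colour.
n_pts = 60
src = rng.uniform(-1, 1, size=(n_pts, 2))
colours = rng.uniform(0, 1, size=(n_pts, 3))

# Ground-truth rigid motion (rotation + translation) of the head.
theta = 0.3
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
t = np.array([0.2, -0.1])
dst_obs = src @ R.T + t
col_obs = colours.copy()

# Corrupt 30% of the matches: wrong position AND wrong colour (mismatches).
out_idx = rng.choice(n_pts, 18, replace=False)
dst_obs[out_idx] = rng.uniform(-2, 2, size=(18, 2))
col_obs[out_idx] = rng.uniform(0, 1, size=(18, 3))

def fit_rigid(a, b):
    """Least-squares rigid transform from a to b (Kabsch algorithm)."""
    ca, cb = a.mean(0), b.mean(0)
    H = (a - ca).T @ (b - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R_est = Vt.T @ np.diag([1.0, d]) @ U.T
    return R_est, cb - R_est @ ca

best_inliers = None
for _ in range(200):
    # Minimal sample: two correspondences fix a 2D rigid motion.
    s = rng.choice(n_pts, 2, replace=False)
    R_h, t_h = fit_rigid(src[s], dst_obs[s])
    geom_err = np.linalg.norm(src @ R_h.T + t_h - dst_obs, axis=1)
    col_err = np.linalg.norm(colours - col_obs, axis=1)
    # Joint consensus: a match votes only if it agrees with the sampled
    # motion geometrically and keeps a consistent colour.
    inliers = (geom_err < 0.05) & (col_err < 0.1)
    if best_inliers is None or inliers.sum() > best_inliers.sum():
        best_inliers = inliers

# Refit on the winning consensus set for the final estimate.
R_fin, t_fin = fit_rigid(src[best_inliers], dst_obs[best_inliers])
theta_est = float(np.arctan2(R_fin[1, 0], R_fin[0, 0]))
print(theta_est, t_fin)
```

Requiring both residuals to be small makes a coincidental geometric fit by a colour-inconsistent mismatch far less likely than with geometry alone, which is the benefit the abstract claims for pooling the two feature types in one framework.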
Translated title of the contribution: Head pose estimation for wearable robot control
Original language: English
Title of host publication: Proc. British Machine Vision Conference
Publisher: BMVA
Publication status: Published - 2002

Bibliographical note

Other page information: -
Conference Proceedings/Title of Journal: Proc. British Machine Vision Conference
Other identifier: 2000864
