Real-time Visual SLAM with Resilience to Erratic Motion

ML Pupilli, AD Calway

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

33 Citations (Scopus)

Abstract

Simultaneous localisation and mapping using a single camera becomes difficult when erratic motions violate predictive motion models. This problem needs to be addressed when visual SLAM algorithms are transferred from robots or mobile vehicles onto hand-held or wearable devices. In this paper we describe a novel SLAM extension to a camera localisation algorithm based on particle filtering, which provides resilience to erratic motion. The mapping component is based on auxiliary unscented Kalman filters coupled to the main particle filter via measurement covariances. This coupling allows the system to survive unpredictable motions such as camera shake, and enables a return to full SLAM operation once normal motion resumes. We present results demonstrating the effectiveness of the approach when operating within a desktop environment.
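The coupling the abstract describes — a particle filter over camera pose, with per-landmark filters whose measurement covariances feed back into the particle weights — can be illustrated with a toy sketch. This is not the authors' implementation: it is a 1-D simulation with scalar linear Kalman filters standing in for the paper's unscented Kalman filters, a deliberately broad motion prior standing in for the resilience mechanism, and all landmark positions, noise levels, and step counts invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D world: a camera position and two fixed landmarks.
true_cam = 0.0
landmarks_true = np.array([5.0, -3.0])

# Per-landmark filters (scalar linear KFs as stand-ins for the paper's
# UKFs): a mean and variance per landmark, initially uncertain.
lm_mean = landmarks_true + rng.normal(0.0, 0.5, size=2)
lm_var = np.full(2, 0.5)

# Particle filter over camera position.
N = 500
particles = rng.normal(0.0, 1.0, N)
weights = np.full(N, 1.0 / N)

R = 0.1  # sensor noise variance for range measurements (landmark - camera)

for step in range(30):
    # Erratic motion: an unpredicted jump the motion model cannot explain.
    jump = 1.0 if step == 15 else 0.0
    true_cam += 0.1 + jump

    # Broad (weakly predictive) motion prior -- this spread is what lets
    # some particles survive the unmodelled jump.
    particles = particles + 0.1 + rng.normal(0.0, 0.5, N)

    # Noisy range measurements to each landmark.
    z = (landmarks_true - true_cam) + rng.normal(0.0, np.sqrt(R), 2)

    # Coupling via measurement covariances: each landmark filter's total
    # innovation covariance S (landmark variance + sensor noise) sets how
    # sharply that landmark's measurement weights the pose particles.
    for j in range(2):
        S = lm_var[j] + R
        pred = lm_mean[j] - particles          # predicted measurement per particle
        weights = weights * np.exp(-0.5 * (z[j] - pred) ** 2 / S)
    weights = weights + 1e-300
    weights = weights / weights.sum()

    cam_est = float(np.sum(weights * particles))

    # Update the landmark filters against the current pose estimate
    # (standard scalar Kalman update).
    for j in range(2):
        S = lm_var[j] + R
        K = lm_var[j] / S
        innov = z[j] - (lm_mean[j] - cam_est)
        lm_mean[j] += K * innov
        lm_var[j] *= (1.0 - K)

    # Systematic resampling.
    idx = np.searchsorted(np.cumsum(weights),
                          (rng.random() + np.arange(N)) / N)
    particles = particles[np.minimum(idx, N - 1)]
    weights = np.full(N, 1.0 / N)

print(f"camera error after erratic jump: {abs(cam_est - true_cam):.3f}")
```

Because the landmark filters shrink their variance over time, a wrong pose estimate during the erratic episode perturbs the map only mildly, so the system can re-localise against the map once normal motion resumes — the qualitative behaviour the paper's full 6-DoF, UKF-based formulation is designed to achieve.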
Translated title of the contribution: Real-time Visual SLAM with Resilience to Erratic Motion
Original language: English
Title of host publication: IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 17-22 June 2006
Pages: 1244-1249
Number of pages: 6
Publication status: Published - Jun 2006

Bibliographical note

Conference Organiser: IEEE

