Abstract

Visual Simultaneous Localisation and Mapping (SLAM) is a vital technology for the advancement of autonomous robotics, providing a means of both mapping and pose estimation from visual data. Many advances in visual and RGB-D SLAM, however, have built upon established point-feature and dense reconstruction methodologies, leaving other approaches relatively uncharted in comparison.
This thesis instead explores semi-dense, edge-based approaches to RGB-D SLAM, in addition to path planning approaches for SLAM-based indoor aerial robotics.
We first present an edge-based RGB-D SLAM approach utilising edges from both the depth and RGB channels, along with a method of depth edge detection that exploits the temporal similarity between RGB-D video frames. Detected edge pixels are back-projected to form semi-dense point clouds, and a fast edge-based Iterative Closest Point (ICP) algorithm is used for sensor pose estimation. Evaluation of the proposed SLAM system demonstrates compelling results, achieving accuracy similar to that of a number of alternative systems while having significantly lower computational cost, making it suitable for real-time application on small robotic platforms. An extension of this SLAM system to 3D line features is then presented, along with methods of extracting and registering such line features from semi-dense edge point clouds.
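The pose estimation step above rests on ICP over back-projected edge points. The following is a minimal point-to-point ICP sketch using brute-force nearest-neighbour correspondences and the closed-form Kabsch alignment; the thesis's edge-based ICP differs in its correspondence search and edge handling, and all function names here are illustrative:

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Kabsch: least-squares R, t such that dst ~ src @ R.T + t.
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50):
    # Iterate: nearest-neighbour correspondences, then closed-form alignment.
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # Brute-force nearest neighbour; a k-d tree would be used in practice.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

Running ICP on only the semi-dense edge points, rather than a full dense cloud, is what keeps the per-frame cost low enough for small platforms.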
Secondly, we present a belief space planning approach for autonomous indoor MAVs that rely on bearing measurements to environmental landmarks for localisation in cluttered environments. This approach produces trajectories which exploit such bearing measurements to minimise uncertainty in the vehicle's state while reaching an end goal. The approach allows a tunable trade-off between minimising trajectory length and state uncertainty, and example trajectories are given comparing these two extrema in various 3D environments and landmark layouts.
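The length-versus-uncertainty trade-off above can be sketched with a simple weighted cost. The 2D example below propagates an EKF covariance along candidate waypoints, applies a bearing-measurement update at each one, and scores the trajectory as a convex combination of path length and final uncertainty; the noise model, weights, and function names are illustrative assumptions, not the thesis's formulation:

```python
import numpy as np

def bearing_update(P, x, landmark, sigma_b=np.deg2rad(2.0)):
    # EKF covariance update for one bearing measurement to a known 2D landmark.
    dx, dy = landmark - x
    q = dx * dx + dy * dy
    # Jacobian of atan2(ly - y, lx - x) w.r.t. the robot position (x, y).
    H = np.array([[dy / q, -dx / q]])
    S = H @ P @ H.T + sigma_b ** 2
    K = P @ H.T / S
    return (np.eye(2) - K @ H) @ P

def trajectory_cost(waypoints, landmark, P0, alpha):
    # alpha near 1 favours short paths; alpha near 0 favours low uncertainty.
    P = P0.copy()
    length = 0.0
    for a, b in zip(waypoints[:-1], waypoints[1:]):
        step = np.linalg.norm(b - a)
        length += step
        P = P + 0.01 * step * np.eye(2)  # motion noise grows with distance
        P = bearing_update(P, b, landmark)
    return alpha * length + (1.0 - alpha) * np.trace(P)
```

Sweeping `alpha` between 0 and 1 reproduces the two extrema compared in the thesis: shortest paths on one end, and detours that gather more informative bearings on the other.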
Lastly, a SLAM-aware path planning approach is presented for indoor MAVs, using the proposed edge-based SLAM system for localisation. In addition to avoiding collisions, this approach ensures that SLAM localisation is maintained by producing trajectories along which the vehicle's RGB-D sensor always observes sufficient information from the existing SLAM map. These trajectories are "keyframe centric": they consist of many connected "segment" trajectories, each defined within the reference frame of a specific keyframe, together forming a complete trajectory across the map. This scheme allows the MAV to navigate across the SLAM map even if the map has become globally inconsistent due to issues such as accumulated sensor drift.
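The keyframe-centric idea can be made concrete with a small 2D sketch: each segment stores waypoints in its keyframe's local frame, and handing over between segments uses only the relative pose between the two keyframes, so drift in the global map never enters the computation. The `Keyframe`, `Segment`, and `handover` names below are hypothetical, chosen for illustration:

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    # Pose in a possibly drifting global frame: heading plus 2D position.
    yaw: float
    pos: np.ndarray

def to_global(kf, p_local):
    c, s = np.cos(kf.yaw), np.sin(kf.yaw)
    R = np.array([[c, -s], [s, c]])
    return R @ p_local + kf.pos

def to_local(kf, p_global):
    c, s = np.cos(kf.yaw), np.sin(kf.yaw)
    R = np.array([[c, -s], [s, c]])
    return R.T @ (p_global - kf.pos)

@dataclass
class Segment:
    keyframe: Keyframe
    waypoints_local: list = field(default_factory=list)  # in the keyframe's frame

def handover(seg_a, seg_b):
    # Re-express the end of one segment in the next keyframe's frame via the
    # relative transform between the two (adjacent, locally consistent) keyframes.
    last = seg_a.waypoints_local[-1]
    return to_local(seg_b.keyframe, to_global(seg_a.keyframe, last))
```

Because only adjacent keyframe pairs are composed at each handover, the chained segments remain flyable even when distant parts of the map disagree globally.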
Date of Award: 7 Mar 2017
Sponsors: Defence Science and Technology Laboratory
Supervisor: Arthur G Richards (Supervisor)
- path planning
- computer vision