Abstract
We introduce a framework for predicting the landing behaviour of a Micro Air Vehicle (MAV) from the appearance of the landing surface. We approach this problem by learning a mapping from the visual texture observed by an onboard camera to the landing behaviour on a set of sample materials. We exemplify the framework by predicting the yaw angle of the MAV after landing. Our framework demonstrates the applicability of established texture classification methods, usually evaluated with stationary camera setups, to the more challenging case of textures observed from a MAV. Results for supervised training demonstrate good estimation of the landing behaviour and motivate future work on autonomous decision-making strategies and other image-based behaviour predictions.
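Since this record does not describe the paper's implementation, the following is a minimal sketch of the general idea of mapping onboard texture observations to a post-landing yaw estimate via supervised training. The choice of LBP histograms as the texture descriptor and a k-nearest-neighbour regressor as the learner are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch: learn a mapping from surface texture to post-landing yaw.
# LBP histograms and k-NN regression are stand-ins for illustration only.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.neighbors import KNeighborsRegressor

def texture_histogram(gray_image, points=8, radius=1.0):
    """LBP histogram as a simple texture descriptor (illustrative choice)."""
    lbp = local_binary_pattern(gray_image, points, radius, method="uniform")
    # Uniform LBP with P sampling points yields P + 2 distinct codes.
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def train_yaw_predictor(training_images, measured_yaw_deg):
    """Supervised training on sample materials with measured yaw after landing."""
    X = np.array([texture_histogram(img) for img in training_images])
    model = KNeighborsRegressor(n_neighbors=3)
    model.fit(X, measured_yaw_deg)
    return model

def predict_yaw(model, onboard_image):
    """Predict the expected yaw angle after landing from a new onboard image."""
    return model.predict(texture_histogram(onboard_image)[None, :])[0]
```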
| Original language | English |
| --- | --- |
| Title of host publication | IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) |
| Publisher | IEEE Computer Society |
| Pages | 4550-4556 |
| Number of pages | 7 |
| Publication status | Published - 1 Oct 2012 |