Abstract
Robotic pushing can be a complicated process that is indicative of the techniques needed for general object manipulation. Here we propose a novel coarse-to-fine approach that combines visual localization with a pushing strategy using tactile and proprioceptive feedback. In the coarse control stage, visual feedback continuously adjusts the relative pose between the end-effector and the object, which serves as an operational point from which to start the fine control. In the fine control stage, relative sensor-object pose information from tactile sensing is used to accurately control the end-effector to push the object to the target pose. The visual and tactile feedback are integrated into a multi-stage control process so that the object can be moved to a target position and orientation; in contrast, a single-stage pushing method does not permit control of the object orientation. Our study confirms that combining tactile and visual approaches is more efficient and accurate for a fairly complex manipulation task. We expect that related methods will extend to more challenging prehensile object manipulation tasks, improving the dexterous capabilities of robots.
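The abstract describes a two-stage controller: a coarse stage that visually servos the end-effector into an operational pose relative to the object, followed by a fine stage that pushes the object to the target pose under tactile feedback. The sketch below is purely illustrative and is not the authors' implementation: the sensor functions, gains, tolerances, and the toy quasi-static contact model are all assumptions chosen only to show how the stage switch might be structured.

```python
"""Minimal sketch of a coarse-to-fine push controller (illustrative only).

All sensor readings are simulated; `visual_pose_error`, `tactile_pose_error`,
and the gains are hypothetical stand-ins, not the paper's method.
"""
import numpy as np

rng = np.random.default_rng(0)
ee_pose = np.zeros(3)                         # simulated end-effector pose (x, y, theta)
obj_pose = np.array([0.40, 0.10, 0.30])       # simulated object pose
target_pose = np.array([0.60, 0.00, 0.00])    # desired final object pose


def visual_pose_error():
    """Coarse stage: noisy vision estimate of the end-effector-to-object error."""
    return (obj_pose - ee_pose) + rng.normal(0.0, 0.005, 3)


def tactile_pose_error():
    """Fine stage: noisy tactile estimate of the object-to-target error."""
    return (target_pose - obj_pose) + rng.normal(0.0, 0.001, 3)


def coarse_stage(tol=0.01, gain=0.5, max_steps=200):
    """Visually servo the end-effector to an approach pose at the object."""
    global ee_pose
    for _ in range(max_steps):
        err = visual_pose_error()
        if np.linalg.norm(err[:2]) < tol:     # close enough to start pushing
            return True
        ee_pose = ee_pose + gain * err        # proportional correction
    return False


def fine_stage(tol=0.005, gain=0.3, max_steps=500):
    """Push the object toward the target pose using tactile feedback."""
    global ee_pose, obj_pose
    for _ in range(max_steps):
        err = tactile_pose_error()
        if np.linalg.norm(err) < tol:         # object at target pose
            return True
        step = gain * err                     # small corrective push
        ee_pose = ee_pose + step
        obj_pose = obj_pose + step            # toy quasi-static contact model
    return False


if __name__ == "__main__":
    if coarse_stage():
        print("coarse alignment reached, switching to tactile control")
        print("target reached:", fine_stage())
```

The key design point mirrored here is the hand-off: vision only has to bring the end-effector to a workable contact configuration, after which the fine stage relies on tactile pose estimates to regulate both object position and orientation.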
| Original language | English |
| --- | --- |
| Pages (from-to) | 1545-1552 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 10 |
| Issue number | 2 |
| Early online date | 1 Dec 2024 |
| DOIs | |
| Publication status | Published - 1 Feb 2025 |
Bibliographical note
Publisher Copyright: © 2016 IEEE.
Keywords
- Deep learning in grasping and manipulation
- Force and tactile sensing