Coarse-to-Fine Robotic Pushing Using Touch, Vision and Proprioception

Bowen Deng, Yijiong Lin, Max Yang, Nathan F. Lepora*

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review


Abstract

Robotic pushing can be a complicated process that is indicative of the techniques needed for general object manipulation. Here we propose a novel coarse-to-fine approach that combines visual localization with a pushing strategy using tactile and proprioceptive feedback. In the coarse control stage, visual feedback continuously adjusts the relative pose between the end-effector and the object. This serves as an operational point to start the fine control. In the fine control phase, relative sensor-object pose information from tactile sensing is used to accurately control the end-effector to push the object to the target pose. The visual and tactile feedback are integrated into a multi-stage control process so that the object can be moved to a target position and orientation; in contrast, using only a single-stage pushing method does not permit control of object orientation. Our study confirms that combining tactile and visual approaches is more efficient and accurate for a fairly complex manipulation task. We expect that related methods will extend to more challenging prehensile object manipulation tasks to improve dexterous capabilities of robots.
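To make the two-stage structure described in the abstract concrete, below is a minimal, hypothetical Python sketch of a coarse-to-fine pushing loop: a coarse stage that servos the end-effector toward a pre-push pose using visual pose estimates, followed by a fine stage that combines a tactile sensor-object pose error with the remaining object-target error. The function names, gains, pre-push offset, and the toy quasi-static push model are illustrative assumptions, not the authors' controller or learned models.

```python
import numpy as np

# Poses are (x, y, theta) in the world frame (illustrative assumption).
Pose = np.ndarray


def coarse_visual_stage(ee_pose: Pose, object_pose: Pose,
                        gain: float = 0.5, tol: float = 0.01) -> Pose:
    """Coarse stage sketch: proportional visual servoing of the
    end-effector toward a pre-push pose just behind the object."""
    # Hypothetical pre-push offset along the object's heading.
    offset = 0.05 * np.array([np.cos(object_pose[2]),
                              np.sin(object_pose[2]), 0.0])
    pre_push_pose = object_pose - offset
    error = pre_push_pose - ee_pose
    if np.linalg.norm(error[:2]) < tol:
        return ee_pose  # close enough: hand over to the fine stage
    return ee_pose + gain * error


def fine_tactile_stage(ee_pose: Pose, contact_pose_error: Pose,
                       target_error: Pose, k_contact: float = 0.8,
                       k_target: float = 0.3) -> Pose:
    """Fine stage sketch: keep the tactile sensor aligned with the
    object (contact error) while advancing it toward the target."""
    correction = k_contact * contact_pose_error + k_target * target_error
    return ee_pose + correction


if __name__ == "__main__":
    ee = np.array([0.0, 0.0, 0.0])
    obj = np.array([0.3, 0.1, np.pi / 6])
    target = np.array([0.6, 0.2, 0.0])

    # Coarse stage: visual feedback brings the end-effector to an
    # operational point near the object.
    for _ in range(50):
        ee = coarse_visual_stage(ee, obj)

    # Fine stage: tactile/proprioceptive feedback pushes the object
    # toward the target pose (toy quasi-static push model).
    for _ in range(100):
        contact_error = obj - ee       # stand-in for a tactile pose estimate
        target_error = target - obj
        ee = fine_tactile_stage(ee, contact_error, target_error)
        obj = obj + 0.3 * target_error
    print("final object pose:", np.round(obj, 3))
```

In this toy setup the coarse stage only positions the end-effector, while the fine stage is the only part that moves the object, mirroring the division of labour between visual localization and tactile pushing described above.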
Original language: English
Pages (from-to): 1545-1552
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 10
Issue number: 2
Early online date: 1 Dec 2024
DOIs
Publication status: Published - 1 Feb 2025

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Deep learning in grasping and manipulation
  • Force and tactile sensing
