Attention for Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control

Yijiong Lin*, Mauro Comi, Alex Church, Dandan Zhang, Nathan F. Lepora

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Contribution (Conference Proceeding)

3 Citations (Scopus)

Abstract

High-resolution tactile sensing can provide accurate information about local contact in contact-rich robotic tasks. However, the deployment of such tasks in unstructured environments remains under-investigated. To improve the robustness of tactile robot control in unstructured environments, we propose and study a new concept: tactile saliency for robot touch, inspired by the human touch attention mechanism from neuroscience and the visual saliency prediction problem from computer vision. In analogy to visual saliency, this concept involves identifying key information in tactile images captured by a tactile sensor. While visual saliency datasets are commonly annotated by humans, manually labelling tactile images is challenging due to their counterintuitive patterns. To address this challenge, we propose a novel approach comprising three interrelated networks: 1) a Contact Depth Network (ConDepNet), which generates a contact depth map to localize deformation in a real tactile image that contains target and noise features; 2) a Tactile Saliency Network (TacSalNet), which predicts a tactile saliency map to describe the target areas for an input contact depth map; and 3) a Tactile Noise Generator (TacNGen), which generates noise features to train the TacSalNet. Experimental results in contact pose estimation and edge-following in the presence of distractors showcase the accurate prediction of target features from real tactile images. Overall, our tactile saliency prediction approach gives robust sim-to-real tactile control in environments with unknown distractors. Project page: https://sites.google.com/view/tactile-saliency/.
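The abstract's three-stage pipeline can be illustrated with a minimal sketch. This is not the authors' implementation: the function names mirror the paper's network names, but the bodies are hypothetical stand-ins (normalization, thresholding, and random noise) that only show how the stages compose at inference and training time.

```python
import numpy as np

def con_dep_net(tactile_image):
    # Stand-in for ConDepNet: map a raw tactile image (with target and
    # noise features) to a contact depth map localizing deformation.
    # Here: simple min-max normalization to [0, 1].
    img = tactile_image.astype(np.float64)
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

def tac_sal_net(depth_map, threshold=0.5):
    # Stand-in for TacSalNet: predict a saliency map describing the
    # target areas of a contact depth map. Here: plain thresholding.
    return (depth_map > threshold).astype(np.float64)

def tac_n_gen(shape, rng):
    # Stand-in for TacNGen: synthesize distractor (noise) features that
    # would be mixed into training data for TacSalNet.
    return rng.random(shape)

rng = np.random.default_rng(0)
target = rng.random((64, 64))                  # simulated target contact
noisy = target + tac_n_gen((64, 64), rng)      # add distractor features
depth = con_dep_net(noisy)                     # stage 1: depth map
saliency = tac_sal_net(depth)                  # stage 2: saliency map
```

In the paper each stage is a learned network; the point of the sketch is only the data flow: raw tactile image → contact depth map → saliency map, with TacNGen supplying the noise needed to train the saliency stage without manual labels.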
Original language: English
Title of host publication: 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 10806-10812
Number of pages: 7
ISBN (Electronic): 9781665491907
ISBN (Print): 9781665491914
DOIs
Publication status: Published - 13 Dec 2023
Event: 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023 - Detroit, United States
Duration: 1 Oct 2023 - 5 Oct 2023
https://2023.ieee-iros.org/

Publication series

Name: IEEE International Conference on Intelligent Robots and Systems
Publisher: IEEE
ISSN (Print): 2153-0858
ISSN (Electronic): 2153-0866

Conference

Conference: 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2023
Country/Territory: United States
City: Detroit
Period: 1/10/23 - 5/10/23

Bibliographical note

Publisher Copyright:
© 2023 IEEE.
