Dual-modal tactile perception and exploration

Dataset

Description

EPSRC grant on tactile superresolution sensing (EP/M02993X/1) and Leverhulme Grant on 'Biomimetic Forebrain for Robot Touch' (RL-2016-39).

Tactile sensing is required for human-like control with robotic manipulators. Multimodality is an essential component of these tactile sensors if robots are to achieve both the perceptual accuracy required for precise control and the robustness to maintain a stable grasp without causing damage to the object or the robot itself. In this study, we present a cheap, 3D-printed, compliant, dual-modal optical tactile sensor capable of both high-speed (temporal) sensing, analogous to pain reception in humans, and high-resolution (spatial) sensing, analogous to the sensing provided by Merkel cell complexes in the human fingertip. We apply three tasks designed to test the sensing capabilities of both modalities: i) a depth-modulation task, in which a robot is required to follow a target trajectory using the high-speed modality; ii) an off-line high-resolution perception task, in which the sensor perceives angle and radial position relative to an object edge; and iii) a tactile exploration task, in which the robot uses the high-resolution modality to perceive an edge and subsequently follow the object contour. The robot is capable of modulating contact depth using the high-speed mode, and attains both a high level of accuracy in the perception task and accurate control using the high-resolution mode. The control method is successfully applied to an unseen object at an arbitrary depth using the high-speed and high-resolution modalities in combination.
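
The following is a minimal sketch, not the authors' implementation, of the kind of contour-following loop the tactile exploration task describes: perceive the edge angle and radial position from the high-resolution modality, then step tangentially along the contour while correcting the radial offset. All function names, gains, and units below are illustrative assumptions.

import numpy as np

TARGET_RADIAL_POS = 0.0   # desired radial offset from the edge (mm), assumed
RADIAL_GAIN = 0.5         # proportional gain on radial error, assumed
STEP_SIZE = 2.0           # tangential step per control cycle (mm), assumed

def perceive_edge(tactile_image: np.ndarray) -> tuple[float, float]:
    """Placeholder for the high-resolution perception stage: returns
    (edge_angle_rad, radial_position_mm) relative to the sensor.
    In practice this would be a trained perception model."""
    return 0.0, 0.0  # stub values for the sketch

def exploration_step(tactile_image: np.ndarray, pose_xy: np.ndarray) -> np.ndarray:
    """One contour-following step: move along the perceived edge direction
    and correct the offset from the target radial position."""
    angle, radial = perceive_edge(tactile_image)
    tangent = np.array([np.cos(angle), np.sin(angle)])
    normal = np.array([-np.sin(angle), np.cos(angle)])
    radial_error = TARGET_RADIAL_POS - radial
    return pose_xy + STEP_SIZE * tangent + RADIAL_GAIN * radial_error * normal

if __name__ == "__main__":
    pose = np.array([0.0, 0.0])
    for _ in range(5):
        dummy_image = np.zeros((64, 64))  # stand-in for a sensor frame
        pose = exploration_step(dummy_image, pose)
        print(pose)

A depth-modulation loop in the high-speed mode would take the same proportional-correction form, with contact depth in place of radial position.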
Date made available: 6 Dec 2017
Publisher: University of Bristol
