TY - GEN
T1 - Joint Action Understanding improves Robot-to-Human Object Handover
AU - Grigore, Elena Corina
AU - Eder, Kerstin I
AU - Pipe, Anthony
AU - Melhuish, Christopher R
AU - Leonards, Ute B
PY - 2013/11
Y1 - 2013/11
N2 - The development of trustworthy human-assistive robots is a challenge that goes beyond the traditional boundaries of engineering. Essential components of trustworthiness are safety, predictability and usefulness. In this paper we demonstrate that the integration of joint action understanding from human-human interaction into the human-robot context can significantly improve the success rate of robot-to-human object handover tasks. We take a two-layer approach. The first layer handles the physical aspects of the handover. The robot's decision to release the object is informed by a Hidden Markov Model that estimates the state of the handover. We then introduce a higher-level cognitive layer that models the behaviour to be expected from the human user in a handover situation, inspired by observations of human-human handovers. In particular, we focus on the inclusion of eye gaze/head orientation into the robot's decision making. Our results demonstrate that by integrating these non-verbal cues the success rate of robot-to-human handovers can be significantly improved, resulting in a more robust and therefore safer system.
KW - Human Robot Interaction
KW - Safety
KW - Verification and Validation
KW - Joint Action
KW - Joint Attention
M3 - Conference Contribution (Conference Proceeding)
SN - tbc
VL - tbc
SP - 4622
EP - 4629
BT - IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
CY - Tokyo
T2 - IEEE/RSJ International Conference on Intelligent Robots and Systems
Y2 - 3 November 2013 through 8 November 2013
ER -