Interacting with a Handheld Robot: Prediction of User Intention and Assisted Remote Collaboration

  • S Stolzenwald

Student thesis: Doctoral Thesis › Doctor of Philosophy (PhD)

Abstract

Handheld robots are intelligent tools that can process task and context information to help the user in a manual task. This concept bridges the gap between two traditional categories of robots: fully autonomous independent robots and highly controlled, dependent wearable devices. Handheld robots provide the task knowledge and accuracy that enable novice users to complete a task, while the user effortlessly navigates through uncontrolled environments. Recent work shows that the robot's autonomy can improve performance; however, it also presents new challenges, as mismatches between the robot's plans and the user's intention lead to user frustration. To overcome this obstacle, we explore interaction concepts that suit the requirements of handheld robots.

The first part of this work concerns how the system perceives the user and their intention. We introduce a tool-mounted gaze tracking system, which serves as a proxy for estimating the user's attention. This information is then used for cooperation with users in a generic reaching task, where we test various degrees of robot autonomy. We measure performance and subjective metrics, and our results show how the attention model benefits both the interaction and users' preference.
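
To make the attention-proxy idea concrete, here is a minimal sketch of one plausible realisation, assuming gaze has already been projected into the task plane; the function name, the Gaussian weighting, and the sigma value are illustrative assumptions, not the thesis's published model:

```python
import numpy as np

def attention_scores(gaze_points, targets, sigma=0.05):
    """Turn recent gaze samples into attention weights over task targets.

    gaze_points: (N, 2) gaze positions projected into the task plane (m).
    targets:     (M, 2) candidate target positions (m).
    sigma:       assumed spatial tolerance of the gaze estimate (m).
    """
    # Squared distance between every gaze sample and every target: (N, M).
    d2 = ((gaze_points[:, None, :] - targets[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian weighting: gaze samples close to a target count more.
    w = np.exp(-d2 / (2 * sigma**2)).sum(axis=0)
    return w / w.sum()

# Toy example: gaze dwells near the first target, so it gets most weight.
gaze = np.array([[0.10, 0.02], [0.11, 0.03], [0.10, 0.01]])
targets = np.array([[0.10, 0.02], [0.40, 0.20], [0.70, 0.10]])
print(attention_scores(gaze, targets))
```

A robot with such a weighting could, for instance, bias its reaching plan toward the highest-weighted target while leaving coarse positioning to the user.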

In the second part, we use the attention profile over time to make predictions about the user's decisions. Using a support vector machine and the mounted eye tracker, the model derives users' intention from perceived gaze patterns. It runs in real time and achieves reliable accuracy up to 1.5 s before the predicted action is executed. That way, the robot predicts one step ahead in the task and can align its plans accordingly. We assess the model in an assisted pick-and-place task and show how the robot's obedience to, or rebellion against, the user's intention affects cooperation.
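
As a sketch of how such a predictor could be assembled with scikit-learn, assuming windowed gaze features such as per-target dwell fractions and saccade statistics (the actual feature set and training pipeline in the thesis may differ), consider:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data standing in for recorded trials: each row is one
# sliding window of gaze features (e.g. per-target dwell fractions and
# saccade statistics -- the real feature set is an assumption here),
# labelled with the action the user executed ~1.5 s after the window.
X_train = rng.normal(size=(200, 6))
y_train = rng.integers(0, 3, size=200)  # e.g. 0=pick A, 1=pick B, 2=place

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_train, y_train)

# At runtime, classify the newest gaze window so the robot can align its
# plan with the predicted action before the user executes it.
latest_window = rng.normal(size=(1, 6))
probs = clf.predict_proba(latest_window)[0]
print("predicted action:", clf.classes_[probs.argmax()], "p =", probs.max())
```

The probability output matters for this setting: a planner can act on a prediction only when its confidence clears a threshold, and otherwise defer to the user.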

In the third part, we go one step further in the dimension of human interaction and propose a system that involves three collaborating parties: a local worker, a remote helper, and the handheld robot carried by the local worker. The system enables the remote user to assist the local user through diagnosis, guidance, and physical interaction via telemanipulation, with the robot completing subtasks autonomously. We show that the handheld robot can mediate the helper's remote instructions and actions, while the robot's semi-autonomous features improve task performance by 24%, reduce the remote user's workload, and decrease the required communication bandwidth between the two users.
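
One way to picture how semi-autonomy can reduce both workload and bandwidth is a simple control-arbitration loop: the helper transmits only sparse goals or direct commands, and the robot fills the gaps autonomously. The sketch below is purely illustrative; the mode names, signature, and threshold are assumptions, not the thesis's control scheme:

```python
from enum import Enum, auto

class Mode(Enum):
    REMOTE_TELEOP = auto()  # helper drives the robot tip directly
    AUTONOMOUS = auto()     # robot completes a delegated subtask locally
    IDLE = auto()

def arbitrate(helper_command, subtask_goal, tip_error_m):
    """Pick who controls the robot tip for this control cycle.

    helper_command: latest telemanipulation input from the remote helper,
                    or None when the helper is silent (saving bandwidth).
    subtask_goal:   goal pose of a delegated subtask, or None.
    tip_error_m:    current distance from the tip to the subtask goal.
    """
    if helper_command is not None:
        return Mode.REMOTE_TELEOP   # explicit remote input takes priority
    if subtask_goal is not None and tip_error_m > 0.005:
        return Mode.AUTONOMOUS      # finish the delegated subtask locally
    return Mode.IDLE

# Between helper messages the robot acts on the delegated goal by itself,
# which is one way a design like this can cut communication bandwidth.
print(arbitrate(None, subtask_goal=(0.3, 0.1, 0.05), tip_error_m=0.02))
```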

In this work, we have explored new ways to interact with a handheld robot. We demonstrate that a tool that makes task decisions collaborates more effectively when it takes user intention into account during real-time task planning. Moreover, this study is a first attempt to evaluate how this new type of collaborative robot works in a remote assistance scenario, a setup that we believe is important given current robot constraints and existing communication technologies.
Date of Award: 23 Mar 2021
Original language: English
Awarding Institution
  • The University of Bristol
Supervisors: Walterio W Mayol-Cuevas, David A W Barton & François Dupressoir

Keywords

  • Robotics
  • Human-Robot Interaction
  • Machine Learning
  • Human-Computer Interaction
