Robots in Need: What can they do to get our help? The Role of Emotion, Empathy and Ethics

  • Joseph E Daly

Student thesis: Doctoral Thesis, Doctor of Philosophy (PhD)

Abstract

It is inevitable that robots will encounter situations where they need help from humans. Issues can arise from physical or software limitations, and any robot designed to work collaboratively with people needs the ability to elicit assistance from a person. In interactions between people, emotion and empathy can motivate helping others, and this could offer a potential route for robots to gain assistance. This thesis aimed to explore in detail the role of emotions in motivating people to help a robot, specifically comparing the roles of negative and positive emotions, the latter having been largely ignored in previous research.
Robots performing behaviours perceived as emotional had a subtle effect on how quickly people went to assist a zoomorphic robot when it became stuck. However, people’s perceptions of robot behaviour were influenced by context. In particular, when ‘happy’ behaviour was seen in the context of the robot needing help, people perceived it instead as ‘distress’, which also increased their intentions to help the robot. This research also investigated repeated interactions over a longer period and the role of ‘helper’s high’, a positive feeling that can arise after helping others. People interacted with a robot at home over a week, and their moods improved after interacting with it. Qualitative data indicated that the robot’s feedback contributed to the emergence of a form of ‘helper’s high’ and was a strong motivator to keep helping.
Robots using emotion to influence people’s behaviour raises several ethical implications. We investigated people’s perceptions of these interactions and found that people generally thought it was ethically acceptable for robots to use emotion behaviours to seek help. There was also an association between how likeable people found the robot and how ethically acceptable they thought its behaviour was. People also displayed some awareness of the issues that could arise when vulnerable individuals interact with the robot.
Overall, emotions, both positive and negative, did play a role in interactions where people assisted a robot; however, many questions remain about exactly how and when it is ethically acceptable for robots to use emotion to gain assistance.
Date of Award: 1 Oct 2024
Original language: English
Awarding Institution
  • University of Bristol
Supervisors: Paul Bremner (Supervisor) & Ute B Leonards (Supervisor)
