Machines with high levels of autonomy, such as robots, and our growing need to interact with them create challenges for ensuring safe operation. The recent interest in creating autonomous vehicles through the integration of control and decision-making systems makes such vehicles robots too. We therefore apply estimation and decision-making mechanisms currently investigated for human-robot interaction to human-vehicle interaction. In other words, we define the vehicle as an autonomous agent with which the human driver interacts, and focus on understanding the human's intentions and decision-making processes. These are then integrated into the robot's/vehicle's own control and decision-making system, not only to understand human behaviour as it occurs but also to predict the next actions. To obtain knowledge about the human's intentions, this work relies heavily on motion tracking data (i.e. skeletal tracking, body posture) gathered from drivers whilst driving. We use a data-driven approach both to classify current driving manoeuvres and to predict future manoeuvres, using a fixed prediction window and augmenting a standard set of manoeuvres. Results are validated against drivers of different sizes, seat preferences and levels of driving expertise to evaluate the robustness of the methods; precision and recall metrics higher than 95% for manoeuvre classification and 90% for manoeuvre prediction with time windows of up to 1.3 seconds are obtained. The notion of prediction adds a highly novel aspect to human-robot/human-vehicle interaction, allowing for decision and control at a later point.
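The fixed-prediction-window idea described above can be sketched as a simple relabelling of the training data: each frame of motion-tracking features at time t is paired with the manoeuvre label observed at t plus the prediction horizon, so a standard classifier trained on these pairs learns to anticipate the upcoming manoeuvre rather than recognise the current one. This is a minimal illustrative sketch, not the paper's implementation; the function name, frame rate, and default horizon are assumptions.

```python
def make_prediction_dataset(features, labels, fps=30.0, horizon_s=1.3):
    """Shift labels forward by a fixed prediction window.

    features: per-frame feature vectors (e.g. skeletal joint positions)
    labels:   per-frame manoeuvre labels
    fps:      capture frame rate (assumed value; depends on the tracker)
    horizon_s: prediction horizon in seconds (up to 1.3 s in the paper)

    Returns (X, y) where X[i] are the features at time t and y[i] is the
    manoeuvre label at t + horizon_s, ready for any supervised classifier.
    """
    shift = int(round(horizon_s * fps))
    X = features[:len(features) - shift]
    y = labels[shift:]
    return X, y
```

For example, with a 2 Hz feature stream and a 1.5 s horizon, each feature frame is paired with the label three frames ahead, and the last three frames (which have no future label) are dropped.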
Name: IEEE International Conference on Intelligent Robots and Systems
- Brain and Behaviour
- Cognitive Science
- Visual Perception