Regulatory requirements for evolving functionality in autonomous systems

Project Details

Description

Existing regulation of the functionality of autonomous systems (what they are meant to do, what they do, and what they could do) is based on quasi-static design principles, which assume that technologies learn so slowly that their autonomous capability can be fully understood and, if needed, intervened upon. However, concerns have been raised about the adequacy of such regulation for emerging autonomous systems whose functionality can change on much faster time scales through autonomous learning and adaptation - sometimes referred to as 'evolving functionality'. Moreover, the binding principles of specification and verification that once served to build and test autonomous systems need to be re-thought in the context of faster functional evolution, as do the many ethical principles on autonomous systems that have since become outdated. The proposed study brings these three areas together to examine the regulation of evolving functionality with regard to three autonomous technologies being developed at Bristol Robotics Laboratory (BRL), and in doing so will produce new empirical guidance on the requirement for technologies with evolving functionality to be safe, reliable, resilient, ethical and trustworthy.
Status: Finished
Effective start/end date: 1/04/21 to 30/04/24