Digital technologies are becoming pervasive in society, from online shopping and social interactions through to finance, banking and transportation. With a future vision of smart cities, driven by a real-time, data-driven digital economy, privacy is paramount. It is critical to engendering trust in the digital fabric on which society relies, and it is enshrined as a fundamental human right in the Universal Declaration of Human Rights and in regulations such as the GDPR. Significant efforts have been made, from end-to-end encryption and anonymous communication to privacy nutrition labels in iOS and Android, to give users more agency in understanding, controlling and assuring the way their data and information are processed and shared.
However, this ability to control, understand and assure is not equitably experienced across society. Examples include individuals from lower-income groups who must share devices to access services that may involve sensitive information, and victims of intimate partner violence, for whom an innocuous app (such as a find-my-phone service) or digital device (such as a smart doorbell) may be used to monitor their activities, and who cannot use online reporting tools for fear of traceability. Such vulnerable and marginalised populations have specific privacy and information-control needs and threat models, in which different types of privacy controls may serve as both protection mechanisms and attack vectors. These needs and requirements are not typically foregrounded to software developers. The challenge is compounded by the fact that developers are not privacy experts and typically lack the training, tools, support and guidance to design for the diverse privacy needs of marginalised and vulnerable groups.
We argue that, for privacy to be of meaningful and equitable value in our pervasive digital economy, everyone must be able to easily control how they share personal information, understand with whom they are sharing it, and ensure that sharing is limited to the intended purpose.
The project will work hand-in-hand with third sector organisations supporting such communities to develop:
New methods: a threat modelling approach, supported by a set of threat catalogues, that enables different "modalities" of protection logic, whereby one can switch between attackers, contextualise vulnerabilities, and acknowledge that different types of controls can act as both protection mechanisms and attack vectors.
New digital tools: a privacy-in-use nutrition framework that promotes privacy literacy in vulnerable and marginalised populations, identifies privacy concerns in use, and facilitates developer responses; the framework will be built through new application programming interfaces and evaluated through novel metrics supporting equitable privacy.
New processes: co-created, stakeholder-led revisions to the AREA framework for Responsible Innovation to lend structure to the way in which individuals, teams, and organisations approach deep thinking about equitable digital futures.
Our research will make the privacy needs of marginalised and vulnerable populations first-class considerations in the design and development of software applications and services, enabling equitable privacy experiences. This, in turn, will allow universal privacy responses to work in concert with targeted responses to the privacy issues experienced by vulnerable users.