‘We have opened a can of worms’: using collaborative ethnography to advance responsible artificial intelligence innovation

Andrés Domínguez Hernández*, Richard Owen

*Corresponding author for this work

Research output: Contribution to journal › Article (Academic Journal) › peer-review


Abstract

With the recent rapid developments in artificial intelligence (AI), social scientists and computational scientists have approached overlapping questions about ethics, responsibility, and fairness. Joined-up efforts between these disciplines have nonetheless been scarce due to, among other factors, unfavourable institutional arrangements, unclear publication avenues, and sometimes incompatible normative, epistemological and methodological commitments. In this paper, we offer collaborative ethnography as one concrete methodology to address some of these challenges. We report on an interdisciplinary collaboration between science and technology studies scholars and data scientists developing an AI system to detect online misinformation. The study combined description, interpretation, and (self-)critique throughout the design and development of the AI system. We draw three methodological lessons to move from critique to action for interdisciplinary teams pursuing responsible AI innovation: (1) collective self-critique as a tool to resist techno-centrism and relativism, (2) moving from strategic vagueness to co-production, and (3) using co-authorship as a method.
Original language: English
Article number: 2331655
Number of pages: 21
Journal: Journal of Responsible Innovation
Volume: 11
Issue number: 1
Early online date: 23 Apr 2024
DOIs
Publication status: Published - 31 Dec 2024

Bibliographical note

Publisher Copyright:
© 2024 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
