Abstract

This briefing paper addresses concerns about the rise of predictive analytics systems used in child welfare contexts, primarily in the UK but also drawing on some international examples. Councils in the UK are using predictive algorithms for everything from traffic management to benefit sanctions, and police forces are using them for tasks such as guiding decisions about which crimes should be investigated. Machine learning (ML) solutions that use predictive algorithms have become commonplace in many aspects of public life (Siegel, 2016). However, this paper focuses on the use of these systems in the field of child welfare. We define predictive analytics as a ‘system that combines data, algorithms, machine learning and statistical techniques to predict what may happen in the future’ (Redden, 2020: 102). The term is often used interchangeably with associated concepts, such as predictive modelling and automated or algorithmic decision-making, to denote the ethical and social aspects of ML:

The use of predictive analytics in child welfare is part of the continuum of the history of the computerization, automation, rationalization of social work, in which a range of tools has been introduced to support decision-making processes in the past three decades (Rahman and Keseru, 2021: 103).

In this briefing paper, we trace the social policy drivers across a short policy and practice history spanning nearly three decades – from 1998 and the publication of the influential Supporting Families consultation report to the current time (2025) – specifically in the UK and particularly in relation to the increasing demands for information sharing across professions (Miller and Cameron, 2011). It is important to understand how we have reached the current situation, whereby, despite early critiques of this trend (Church and Fairchild, 2017), predictive analytics is increasingly seen as a solution by authorities concerned about how to make decisions on the allocation of resources in their child and family services, and how to ensure they target such finite resources to the right children, to avoid unnecessary harm and potential child deaths (Connell and Crowley, 2023; Hall et al., 2024).

In reviewing this history, we identify critiques of the underlying bias and potential errors that can occur in predictive datasets, while recognising that such systems are usually built with good intentions (for example, improved efficiency and the protection of vulnerable children). Throughout this review, we emphasise concerns about inequality, transparency, public accountability and oversight (Redden, 2020). We also explore the challenges professionals face in trying to make critical and urgent decisions using predictive information, the underlying assumption that machines can process more data in more sophisticated and allegedly less biased ways than humans, and the implications these claims have for professional judgement (Keddell, 2019; 2023).

Finally, we consider forms of resistance that have emerged in response to the data-driven solutions used by statutory service providers globally. These include public resistance, the efforts of investigative journalists to reveal the inequities in predictive systems through methods such as Freedom of Information requests, and alternative approaches to mitigating bias, both conceptually and computationally, in predictive systems. We term these Indigenous Data Sovereignty, Civic Resistance, and Algorithmic Reparations approaches, and it is in these subsections that we particularly emphasise learning from other parts of the world, including Canada, the United States (US), New Zealand, Denmark, and the Netherlands.

We hope that this paper will be of use to researchers, policymakers and children’s service professionals in supporting more critical consideration of both how and why predictive analytics solutions have found their way into the child welfare context. It also provides a deeper understanding of the critiques and concerns that have been raised both in the UK and in other parts of the world where similar debates are taking place. Ultimately, we hope this will prove a useful summary to help inform critical decisions by public service providers as to whether they invest in the development of predictive solutions in child welfare contexts in future.
Original language: English
Place of publication: Bristol
Publisher: ESRC Centre for Sociodigital Futures
Number of pages: 30
Publication status: Published - Jan 2026

Title: The (problematic) rise of predictive analytics in child welfare: How did we get here?