World Social Work Day 2022. Artificial Intelligence in CSC.
Blog / March 15, 2022
“Now is a good time to stop. With the global coronavirus pandemic, everything has been changed, all our data scrambled to the point of uselessness in any case. Let those who believe in these approaches reflect on what to do next. Let those who believe they have already cracked it, prove it.”
Michael Sanders, What Works for Children’s Social Care, Chief Executive (blog) 10 September 2020
In September 2020, What Works for Children’s Social Care published a research report. After working with four local authorities and analysing thousands of case notes relating to tens of thousands of children, the researchers tried to make a series of predictions about those children’s futures. “What we find is not encouraging.”
“Across 32 models, none meet the threshold we set in advance for success, with most of them falling far short of it. Models that attempt to predict the future – i.e. those that are actually useful in practice – do even worse – meaning that more families could see unnecessary intervention in their lives, and more opportunities for support could be missed. The models don’t perform any worse for specific groups – defined by race, age, or disability – but this is a cold comfort when the models don’t perform well anyway. It seems that increasing the sample size may help but the population changes quickly enough that in waiting for more data the previous data becomes obsolete and local authorities have different enough contexts that combining data is unlikely to help.”
Earlier, in January 2020, The Alan Turing Institute and the University of Oxford’s Rees Centre had carried out extensive research and published a report, the Ethics Review of Machine Learning in Children’s Social Care. The researchers found: “These issues related to the safe and ethical functioning of a predictive ML model are magnified in high-impact and safety-critical domains such as CSC, for system errors, unreliable performance, and lurking biases may have life and death consequences.”
We took part in a round-table event organised during The Alan Turing Institute’s research. We asked one of the key system providers attending how they measured error rates, or how they verified that the system worked. What was their measure of failure? The CEO replied, “if we don’t save the Local Authority money.”
The WWCSC’s polling of social workers “shows that only 10% of them believe these tools are appropriate in social work, a profession in which human relationships are key, while the Oxford Internet Institute have recently concluded that the hoped for financial benefits are unlikely to be realised.”
The Independent review of children’s social care has so far not engaged with this issue. It must. AI today has no safe place in children’s social care.
On World Social Work Day 2022, our question is why such tools are still deployed in children’s social care at all. Is the government really going to wait until the life and death consequences affect a child before it acts to protect children from bad decisions that remove humanity and accountability from the systems and institutions that need them most?
In Michael Sanders’ words, “At the moment, the case has not been made.”