From The Guardian, 16 September 2018.
“Exclusive: Use of algorithms to identify families for attention raises stereotyping and privacy fears.
Vast quantities of data on hundreds of thousands of people are being used to construct computer models in an effort to predict child abuse and intervene before it can happen, the Guardian has learned.
Amid mounting financial pressure, local councils are developing “predictive analytics” systems to algorithmically identify families for attention from child services, allowing them to focus resources more effectively.
But while the new algorithmic profiling could be one way of helping social workers, it is likely to prove hugely controversial because of its potential to intrude on individual privacy.
The Guardian has discovered that at least five local authorities have developed or implemented a predictive analytics system for child safeguarding. At least 377,000 people’s data has been incorporated into the different predictive systems.
Hackney and Thurrock councils have both hired a private company, Xantura, to develop a predictive model for their children’s services teams. Two other councils, Newham and Bristol, have developed their own systems internally. Brent council is developing a system to predict vulnerability to gang exploitation.
One contract obtained by the Guardian reveals the sheer range of council data being considered for inclusion in a predictive model: school attendance and exclusion data, housing association repairs and arrears data, and police records on antisocial behaviour and domestic violence are all among the desired datasets, though some were later excluded from final models.
The Information Commissioner’s Office (ICO), which regulates the use of personal data by public and private bodies, said it would be asking questions of councils using predictive analytics to ensure they were compliant with data protection law.
“All organisations have a duty to look after personal information in their care but records involving children – often sensitive personal data – require particularly robust measures,” an ICO spokesperson said.
Advocates of predictive analytics argue that such systems help councils target limited resources, so they can act before tragedies happen.
“It’s not beyond the realms of possibility that one day we’ll know exactly what services or interventions work, who needs our help and how to support them earlier,” wrote Richard Selwyn, a civil servant at the Ministry of Housing, Communities and Local Government, earlier this year.
But others warn that the systems inevitably incorporate the biases of their designers, and risk perpetuating stereotyping and discrimination while effectively operating without any public scrutiny.
“We talk about them like no human decisions have been made, and it’s purely objective, but it’s human all the way through, with assumptions about what a safe family looks like,” said Virginia Eubanks, an associate professor at the University at Albany and author of the book Automating Inequality.
In Thurrock and Hackney, the Xantura system operates by running its predictive model against a household in response to warning signs, such as a child being excluded from school or a report of domestic violence. The model’s prediction is then passed to a social worker for potential action.
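Neither council has published the model’s internals, so any code can only be a guess at the general pattern the article describes: a warning sign triggers a score, and a high score produces an alert for human review. The sketch below is purely illustrative; every name, feature, threshold and weight in it is hypothetical.

```python
# Illustrative sketch only: Xantura's model is proprietary and unpublished.
# All names, features, weights and thresholds here are hypothetical, based
# solely on the workflow the article describes (event -> score -> referral).

from dataclasses import dataclass

@dataclass
class Household:
    household_id: str
    school_exclusions: int     # hypothetical feature
    dv_reports: int            # hypothetical feature
    rent_arrears_months: int   # hypothetical feature

# Warning signs that would trigger a scoring run, per the article.
TRIGGER_EVENTS = {"school_exclusion", "domestic_violence_report"}

def risk_score(h: Household) -> float:
    """Toy linear score; a real system would use a trained model."""
    return (0.4 * h.school_exclusions
            + 0.5 * h.dv_reports
            + 0.1 * h.rent_arrears_months)

def on_event(h: Household, event: str, threshold: float = 1.0) -> None:
    # The model is only run in response to a warning sign, not continuously.
    if event in TRIGGER_EVENTS:
        score = risk_score(h)
        if score >= threshold:
            # The alert goes to a social worker for potential action;
            # nothing is automated beyond the referral itself.
            print(f"ALERT: household {h.household_id} scored "
                  f"{score:.2f}; refer for social worker review")

on_event(Household("H-001", school_exclusions=1, dv_reports=1,
                   rent_arrears_months=2), "school_exclusion")
```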
Hackney council said its system had generated 350 risk alerts for families in potential need of attention, while Thurrock said its system generated 300. A Thurrock council memo said earlier this year that all of its referrals to the government’s Troubled Families scheme were now identified by the data analytics system and predictive model provided by Xantura.
Councils are adopting predictive systems at a time when local government budgets are under unprecedented pressure. By 2020 government funding for councils will have been cut by £16bn under the Conservative government’s austerity programme, the Local Government Association has said.
The software can be used to generate revenue for the council through the Troubled Families payments-by-results scheme. Xantura’s website advertises how its products can help local authorities “maximise PBR [payment-by-results] payments” from the government as well as reduce child safeguarding costs.
Under the Troubled Families scheme, councils are paid £1,000 for each family they sign up to the programme, with a further payment of £800 when the family meets certain criteria.
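Those two figures allow a rough, back-of-the-envelope calculation of what payments-by-results could be worth to a council. The sketch below simply applies the amounts quoted above; the family counts are chosen purely for illustration.

```python
# Back-of-the-envelope arithmetic using the payment figures quoted in the
# article: £1,000 per family signed up to the programme, plus £800 per
# family that later meets the scheme's criteria.

ATTACHMENT_FEE = 1_000   # £ per family signed up
RESULTS_PAYMENT = 800    # £ per family meeting the criteria

def pbr_revenue(families_attached: int, families_meeting_criteria: int) -> int:
    return (families_attached * ATTACHMENT_FEE
            + families_meeting_criteria * RESULTS_PAYMENT)

# Hypothetical example: 300 families referred (the number of alerts
# Thurrock's system generated), of which half go on to meet the criteria.
print(pbr_revenue(300, 150))  # -> 420000, i.e. £420,000
```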
Last month, Northamptonshire county council, which is technically insolvent, announced plans to cut a further £70m from its budget and said that services for children and vulnerable adults could not be protected. East Sussex council also said it would cut services to “the legal minimum”.
Northamptonshire council explored implementing a predictive analytics scheme several years ago, even commissioning a private provider to produce a pilot. It later discontinued the project, in part due to a lack of funding.”