Open letter: Data Use and Access Bill

Support Not Separation & Disabled Mothers’ Rights Campaign have signed this open letter about the likely impact of the Data (Use & Access) Bill on children & families.

4th March 2025

Rt Hon. Peter Kyle MP, Secretary of State for Science, Innovation and Technology, Department for Science, Innovation and Technology

Dear Minister,

We are alarmed at the negative impact the Data (Use and Access) Bill is liable to have on the well-being, rights and protections of children and their families.

Children and their families already experience significant negative impacts arising from the increasing shift towards the sharing of information without consent across services, particularly in the name of safeguarding[1]. There is a lack of evidence that this increase in information sharing keeps children safe, but we know there are real-world harms.[2],[3] There is a troubling lack of transparency about the automated decision-making systems being used[4] in sensitive areas including safeguarding and policing, and many families are unaware of how information about them[5] and their children is shared and used[6],[7].

Automated decision-making systems are not able to understand the messy complexity of people’s lives[8]. These systems fail to predict risk: within safeguarding, significant numbers of children who do need help are missed, while children who do not are falsely flagged[9]. Children and their families can find themselves subject to traumatic state intrusion not for anything they have done but because of how they are profiled. There is evidence that no amount of data makes such systems usable or accurate.[10]

Removal of protection
Given this, we are particularly concerned at the measures which permit the use of so-called ‘artificial intelligence’ in a wide range of scenarios, and at the creation of a category of ‘recognised legitimate interest’, which has been carried over from the previous government’s Data Protection and Digital Information Bill. The latter allows the processing of children’s data in circumstances which fall under the broad scope of safeguarding[11], national security and law enforcement, without any need to consider whether this processing impacts “the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”.[12]

Far from the “specific protection” for children outlined in Recital 38 of the GDPR[13], “recognised legitimate interest” enables those processing information to ignore children’s fundamental rights[14] in the very circumstances where they are most likely to need protection. The clause that has been added pertaining specifically to children’s information does not remedy this harm.

We are concerned too at the designation of any person under 18 as ‘vulnerable’, both in terms of how this will affect the data-sharing regime around families and how the focus on children who do need help will be lost.

No consent for measures
While this legislation is being rushed through Parliament, we already know that there is public concern rather than consent for increased data sharing[15] and the rollout of ‘artificial intelligence’.[16] Governmental research into public attitudes reported that: “As understanding and experience of AI grows, the public is becoming more pessimistic about its impact on society”, and that “a growing proportion of the public think that AI will have a net negative impact on society, with words such as ‘scary’, ‘worry’ and ‘unsure’ commonly used to express feelings associated with it.” Research commissioned by the government into proposed quality assurance terms for AI reported that people repeatedly came back to the question of whether it should be used at all in areas such as education and healthcare.[17] While there has been insufficient consultation with children, what consultation there has been tells us that they are concerned about the confidentiality of their information, especially around sensitive areas[18] such as healthcare and education.[19]

We share these concerns. Where these systems have been used in areas including safeguarding and policing, we see the amplification and deepening of already unjust processes.[20],[21] We also see public funds wasted when local authorities abandon these systems after they fail to deliver as promised. The discrimination and harm amplified through the use of automated decision-making systems are not technical issues which can be resolved, but reflect a more fundamental problem: an ever-greater focus on the people struggling rather than on changing the circumstances in which they live. Automated decision-making systems measure people’s usage of public services, whether voluntary or not, meaning increasing intervention into the lives of those who are already marginalised.

Rights protect children
We are standing at a fork in the road. Year after year of cuts to the support and services to which families are entitled has left too many children and their families in desperately dangerous circumstances. The path we are on is one of increasing surveillance and loss of privacy for all, with increasing injustice, scapegoating and policing of the most marginalised. We can choose instead a path of greater respect for the rights of children and of their families, with measures in line with children’s social and economic rights, such as removing the two-child benefit cap, which would immediately improve the well-being and lives of 1.6 million children[22].

Rights protect children and we must do all that we can to protect these rights and protections. This bill does not do that. We call for a rights-centred approach to data and information which prioritises protection rather than use.

Signatories

  • Professor Louise Amoore, Political Geography, Durham University
  • ATD Fourth World
  • Dr Chris Bagley, Educational Psychologist and Lecturer, University College London
  • Professor Andy Bilson, Emeritus Professor University of Central Lancashire, Visiting Researcher University of Cambridge, Adjunct Professor University of Western Australia
  • Professor Luke Clements, Cerebra Professor of Law and Social Justice at the School of Law, Leeds University
  • Dr Stephen Crossley, Assistant Professor, Sociology, Durham University
  • Define Fine
  • Professor Lina Dencik, Professor and University Research Leader in AI Justice, Goldsmiths, University of London
  • Disabled Mothers’ Rights Campaign (Tracey Norton, WinVisible)
  • Professor Ros Edwards, Sociology, Social Policy and Criminology, University of Southampton
  • Educational Freedom (Michelle Zaher)
  • Professor Brid Featherstone, Emeritus Professor of Social Work, University of Huddersfield
  • Professor Val Gillies, Social Policy and Criminology, Westminster University
  • Professor Anna Gupta, Department of Law and Criminology, Royal Holloway, University of London
  • Professor Augustine John, Equity and Human Rights Campaigner, Honorary Fellow UCL Institute of Education
  • Dr Remi Joseph-Salisbury, Reader in Sociology, University of Manchester
  • Professor Emily Keddell, Social and Community Work, University of Otago
  • Dan McQuillan, Lecturer in Creative & Social Computing, Goldsmiths, University of London
  • Maslaha (Raheel Mohammed, Director)
  • No More Exclusions
  • Allan Norman, Independent Social Worker, Non-practising Solicitor, Celtic Knot
  • Northern Police Monitoring Project
  • Parents, Families and Allies Network
  • Dr Harriet Pattison, Senior Lecturer, Childhood Studies, Liverpool Hope University
  • Prevent Watch
  • Dr Joanna Redden, Associate Professor, Faculty of Information & Media Studies, Western University
  • Professor Katherine Runswick-Cole, School of Education, Sheffield University
  • Support Not Separation (Anne Neale, Legal Action for Women)
  • The Victoria Climbié Foundation UK
  • Professor Debbie Watson, Professor of Child and Family Welfare, University of Bristol
  • Professor Sue White, Emeritus Professor, Sociology, Sheffield University
  • Dr Lauren Wroe, Associate Professor, Contextual Safeguarding, Durham University
  • York Travellers Trust

[1] L Wroe Journal of Children’s Services: Young people and “county lines”: a contextual and social account (2021)

[2] V Gillies, R Edwards and H Vannier Ducasse, De Gruyter Oldenbourg: Calibrating families: Data behaviourism and the new algorithmic logic (2024)

[3] Allan Norman Pink Tape: The ‘Named Persons’ Scheme – When Protecting Wellbeing Is Totalitarian (2016)

[4] Public Law Project: The Tracking Automated Government register

[5] Alan Turing Institute and Ada Lovelace Institute How do people feel about AI? A nationally representative survey of public attitudes to artificial intelligence in Britain (2023)

[6] Written evidence submitted on the right to privacy by the House of Commons Science and Technology Committee by Professor Rosalind Edwards, Professor Val Gillies and Dr Sarah Gorin (DDA0005) (2022)

[7] Parental Societal License for Data Linkage for Service Intervention

[8] M Broussard, MIT Technology Review: Meet the AI expert who says we should stop using AI so much (2023)

[9] J Redden, L Dencik and H Warne, Policy Studies: Datafied child welfare services: unpacking politics, economics and power (2020)

[10] M Salganik et al, Proceedings of the National Academy of Sciences: Measuring the predictability of life outcomes with a scientific mass collaboration (2020)

[11] L Devine, Journal of Social Welfare and Family Law: Considering social work assessment of families (2015)

[12] ICO UK GDPR a guide to lawful basis: legitimate interests

[13] GDPR-text: Recital 38 GDPR

[14] Defend Digital Me: KC opinion DPDI Bill 27112023 Stephen Cragg (2023)

[15] A Bowyer et al Conference on Human Factors in Computing Systems: Understanding the Family Perspective on the Storage, Sharing and Handling of Family Civic Data (2018)

[16] Written evidence submitted on the right to privacy by the House of Commons Science and Technology Committee by Professor Rosalind Edwards, Professor Val Gillies and Dr Sarah Gorin (DDA0005) (2022)

[17] Thinks Insight and Strategy (Report prepared for Responsible Technology Adoption Unit, UK Govt) Public Attitudes to AI Assurance (2024)

[18] Children consulted on ContactPoint quoted, Roger Morgan, Children’s Rights Director for England, OFSTED: Making ContactPoint Work: Children’s views on the government guidance (2007)

[19] S Livingstone and K Pothong Digital Futures Commission: What do children think of EdTech or know of its data sharing? Read our survey findings (2022)

[20] E Keddell, Social Sciences: Algorithmic Justice in Child Protection: Statistical Fairness, Social Justice and the Implications for Practice (2019)

[21] V Gillies, R Edwards and H Vannier Ducasse, ibid.

[22] Gov.uk: Universal Credit and Child Tax Credit claimants: statistics related to the policy to provide support for a maximum of two children, April 2024