The Trouble with the Data Bill and Children’s Data

Part 1: The Trouble with the Data Bill and Children’s Data

Will the Data Use and Access Bill fall at the final hurdle? The popular consensus on ping-pong is no, but the stand-off between the government's insistence that AI companies should trump creators, with no protection for today's status quo in copyright law, and the Lords' defence of transparency duties on the sources of training data, has been the last stand of democratic checks and balances on an executive giddy with its Commons majority.

This year the government scrapped the Privacy and Consumer Advisory Group (PCAG), which advised it on how to provide users with a simple, trusted and secure means of accessing public services. It then went on to scrap the body PCAG had been merged into, the One Login Inclusion and Privacy Advisory Group (OLIPAG), which advised the Government Digital Service's GOV.UK One Login programme on inclusion, privacy, data usage, equality and digital identity.

Despite wide-ranging concerns from civil society in data and technology, the government treated engagement on the Bill as a mere tick-box exercise, showing no meaningful willingness to revise the draft inherited from the previous administration.

The parts that concern children, in particular though not explicitly, include (a) the changes to purpose limitation and the extension of 'consent' for research purposes that explicitly include commercial use and are broadly drawn; (b) the removal of the balancing test as a protection explicitly for vulnerable people (undefined, but one could assume children, the elderly or minoritised groups) under the weak lawful basis of legitimate interests, which the Bill elevates into its own condition for processing; and (c) the rules on when and how fair processing and informing people can be bypassed if the controller in effect decides it would involve too many people (and there is no objective test for that).

Procedurally, the fact that the new law's powers will apply to all data already held on the commencement date undermines fair processing carried out in the past and, combined with these three changes, means personal data may now be used in ways that were not made clear, or allowed, at the time of collection.

This is significant. And that is without counting what the government chose to leave out: addressing adequacy properly; the suitable safeguards in automated decision-making missed in the 2018 UK drafting; protections against new and emerging misuses of the law and of personal data in targeted advertising and in technology that undermines freedom of thought; and clarification of the growing uses of bodily data that are not used for 'singling out', which companies claim are not biometrics but are, and which normalise very intrusive technology, even inferring emotion, that may be covered by the EU AI Act but remains a free-for-all here outside it.

Other divergences may begin if the Bill does pass with some of its late additions. Clause 81, 'Data protection by design and default: children's higher protection matters' (p100 of the Bill), is one.

This in effect elevates part of recital 38 onto the face of the Bill, introducing an explicit acknowledgment that data is a child's data in impact assessments and in the obligations owed to the child under data protection by design and default.

However, it has two challenges. The first is a somewhat puzzling caveat that excludes preventive or counselling services, when it is precisely those services that are often processing health and other sensitive data and should require the highest standards of data protection by design and default. (Not forgetting that the NSPCC, a children's data controller, was one of 11 major charities fined by the ICO for unlawful practices in 2017.)

Second, the Bill as now drafted starts to bring with it a new problem for UK data protection law: expanded expectations to treat 'children's data' differently from adults' data.

There is no definition of data from children, and that is a problem. Is it a quality of the data, or of the person it comes from? If personal data was collected when a child did not know about it or understand it, does that duty of extra consideration wear off if you wait long enough to use the data?

Do these protections apply only at the time of collection, because the person the data is about is then aged under 18, or do they persist as a characteristic of the data even after the person it is from ages into adulthood? How does this interface with the rights of parents who perhaps made a consent choice, or were informed, 'on behalf of' their child, when that child is now an adult?

Given the volume of data now collected about children that fails to respect data minimisation and persists into adulthood, this is a new-ish set of problems that we need to address clearly if it is to guide practice or be used in court.

Furthermore, for data controllers to know who is and who is not a child, and to process data accordingly, means knowing who is not a child as well. It will be problematic if these Data Use and Access Bill changes come into UK data protection law without defining these additional duties of consideration towards "data from children", and without stating that meeting the duty should require no additional personal data (recital 57 should have been put on the face of the Bill); they will no doubt become a blueprint for others beyond the UK.

The Data Use and Access Bill brings nothing that enhances UK data protection law. It set out instead to create something, somewhere, that someone could label a Brexit dividend, drafted by people who do not see data protection law as it was designed: to protect people from secret intrusion into their lives by others, and to uphold the fundamental rights and freedoms that others can restrict because of the power that information through data gives them, all while enabling the free flow of data through a consolidated framework for its operationalisation. The GDPR has been successfully painted as red tape to be circumvented. But we remain signatories to the first legally binding international instrument in the data protection field. I, for one, would be glad to see this Bill fall and to keep the data and IP laws we already have today.

Better law is both necessary and possible, but it must start with proper routes to drafting, consultation and non-partisan collaboration. Expert groups outside the political process need to be reinstated for prior consultation. And pre-legislative scrutiny, with expertise in data and technology, must happen, with evidence taken only after a bill has had its final drafting but before it is laid before Parliament, and with a window for change. By the time it comes to asking two chambers of largely non-specialists to put lipstick on a pig, it is too late for adequate data and technology scrutiny to make amendments.

========================

Part 2: The Trouble with the Data Bill and the Screen Time Debate

Perhaps separately, one amendment that did not make it into the Bill still stole a lot of oxygen from matters that merited more attention: the proposal to raise the age in Article 8 of the UK GDPR, whose populist support I cannot share. Introduced late, again, after it failed to gain traction as the private member's Safer Phones Bill and after the social media ban failed to get discussed under the Sunak government, it fell away, but government murmurings continue in the media about ideas to limit screen time, like the Cinderella law that was tried and failed in South Korea.

I have already discussed the reasons why with the drafters of the original PMB proposals at UsForThem, and I can be quite open. Wearing a non-partisan hat, I find it an ill-thought-out, rather authoritarian, ideology-based approach that restricts children's rights rather than enhancing them, including the right of access to information and the right to play, with unintended consequences (i) for the most vulnerable children, who would simply be bought 'adult phones' by unscrupulous or abusive adults, and (ii) in the age-gating of everyone, without any objective evidence that the proposals to change Article 8 of the GDPR, or the connected changes, will make anything better for children on the issues they seek to address.

I fully recognise the validity of concerns about children's use of social media, but the expert evidence here, from those who have studied children and media for decades such as Professor Sonia Livingstone or Candice Odgers, does not in my view support raising the age of data processing by information society services (ISS) as an evidence-based approach if the goal is only to restrict access to services. Why? The parts of the various proposed changes which, as drafted, included banning the processing of children's data were also illogical on two grounds.

(a) A company cannot know that a data subject is a child without processing personal data that reveals their age or date of birth, or at the very least a credential that says 'I am under X age'. How can the company not process the data of a child, yet have to process their data to prevent their access? (See the sketch after point (b).)

(b) Under current law and under the new clauses, such services can continue to be used and accessed without hindrance by children as long as those ISS process no personal data: Article 8 is not engaged unless personal data is processed, and then the arguments made about screen time and harmful content are all irrelevant to the clause. This demonstrates again that the aim of these proposals is not children's rights, or really children's data processing, but only to restrict access to content, which is the role of the Online Safety Act, not data protection law. Here too, online safety law needs to consider defining "children's" data and content, and the changing nature of that characteristic over time.
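To make the circularity in (a) and (b) concrete, here is a minimal, purely illustrative sketch (the names AgeClaim, is_child and gate_access are hypothetical, not drawn from the Bill or any real service): to decide whether a user is a child, a service has to receive and read an age-related attribute from every user, which is itself processing personal data, while a user who supplies no age data at all is simply not caught.

```python
# Minimal sketch, assuming Python 3.10+. Illustrative only: it shows that any
# age-gate has to process an age-related attribute from every user, adult or
# child, and that with no age data processed the gate cannot engage at all.

from dataclasses import dataclass
from datetime import date


@dataclass
class AgeClaim:
    """Whatever the user presents: a date of birth, or a bare 'I am under 18'
    credential. Either way it is data relating to that individual user."""
    date_of_birth: date | None = None
    asserted_under_18: bool | None = None


def is_child(claim: AgeClaim, today: date) -> bool:
    """Deciding 'child or not' requires reading the claim, i.e. processing it."""
    if claim.asserted_under_18 is not None:
        return claim.asserted_under_18
    if claim.date_of_birth is not None:
        # Compute age in whole years from the supplied date of birth.
        age = today.year - claim.date_of_birth.year - (
            (today.month, today.day)
            < (claim.date_of_birth.month, claim.date_of_birth.day)
        )
        return age < 18
    # No age data supplied or processed: the service cannot tell, and the
    # child can carry on using it unhindered.
    return False


def gate_access(claim: AgeClaim) -> str:
    # Every single user has to pass through this check for the gate to work.
    return "blocked" if is_child(claim, date.today()) else "allowed"
```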

There is also the rarely discussed issue that age-assurance and age-verification tools do not achieve their aims. France's data protection authority, the CNIL, found on age assurance that "all the solutions proposed can easily be circumvented."

But such changes would have substantial unintended consequences.

I remember sitting in the 2015 Brussels CPDP audience, at a panel event run by Google, in which the age of 13 was being debated as the right one for the GDPR to adopt for children's data processing by information society services without parental consent. I remember thinking, 'but that means they'll need to know everyone else is not a child in order to identify those who are. That's not good'. Age verification and age assurance are not "child safety" tools. They are measures that must be applied to every user of a service to treat them differently by identifying or inferring who is and who is not a child.

The GDPR differed from past regulation of children's rights in Europe in that it is age-based, not capacity-based. As such, it remains out of sync with the protection, participation and empowerment rights for children that are embodied in many countries' domestic law, based on the UN Convention on the Rights of the Child.
