Category Archives: children

The contest and clash of child rights and parent power

What does the U.S. election outcome mean for education here? One aspect is that while the ‘Christian right’ in the UK may not be as powerful as its U.S. counterpart, it still exerts influence on public policy. While far from new, it has become more prominent in parliament since the 2019 election. But even in 2008, Channel 4 Dispatches broadcast an investigation into the growth of Christian fundamentalism in the UK. The programme, “In God’s Name”, highlighted the political lobbying by pro-life groups behind changes to tighten abortion law in the Human Fertilisation and Embryology Bill, including collaboration between their then key lobbyist and the MP Nadine Dorries.

The programme highlighted the fears of some of their members, based on the “great replacement” conspiracy theory, of the rising power of Islam from the East replacing Christianity in the West. It also showed how the ADF from the U.S. was funding strategic litigation to challenge and change UK laws, including in McClintock v Department of Constitutional Affairs [2008].

The work of Sian Norris today highlights why this U.S. election result is likely to mean more of all of that over here. As the rights environment moves towards an ever greater focus on protection and protectionism, I make the case for why this is all relevant to the education sector in England, and why we must far better train and support school staff in practice to manage competing sources of authority, interests and rights.


Child rights supported by parent power

Over the last ten years, since I began working in this field, there has been a noticeable shift in public discourse in the UK parliament and media around child rights, shaping public policy. It is visible in the established print, radio and TV media, and in social media. It is in the language used, the funding available, and the parliamentary space and time taken up by new stakeholder groups and individuals, crowding out more moderate or established voices. On the one hand, this is greater pluralism and democracy in action. On the other, where its organisation is orchestrated, are the aims and drivers transparent and in the public interest?

When it comes to parents, those behind many seemingly grassroots, small-p “parent power” groups are opaque, often backed by large, well-funded organisations, many of them from the U.S.

The challenge for established academics and think tanks in this closed and crowded policy advisory space is that these new arrivals, astroturf ‘grassroots’ groups and offshoots of existing ones, bring loud voices that co-opt the language of child rights, and are adept in policy and media spaces previously given to expert, evidence-based child rights academics.

Emerging voices are given authority by a very narrow group of parliamentarians, and are lent support by institutional capture, as a growing number of individuals from industry, or with conservative religious views, are hired into positions of authority. As a result, there is a shift in the weight given to views and opinions compared with facts and research, and cherry-picked evidence informs institutional positions and consultations.

The new players bring no history of interest in children’s rights. In fact, many act in opposition to equality rights or access to information, and appear more interested in the control of children than in universal human rights and fundamental freedoms. The shift in the balance of discussion from child rights to child protection above all else is happening not only in the UK but in mainland Europe, the U.S. and Australia, the latest to plan a ban on under-16s’ access to social media.

Whose interests do these groups really serve, packaged as they are in the language of child rights?

Taking back parent and teacher control

Parallel arguments in the public sphere have grown. The first is that authority must be taken away from parents and teachers and returned to the State, over fears of lost parental control of children’s access to information and of children’s ‘safety’, including calls for state-imposed bans on mobile phones for children or enforced parental surveillance tools. At the same time, parents want fewer state interventions. Arguments include that, “over the last few years the State has been assuming ever greater control, usurping the rights of parents over their children.”

The political football of the day moves regularly from ‘ban mobile phones in schools’, or at all, to the content of classroom materials: ‘give parents a right to withdraw children from sex ed and relationships teaching’ (RSE, not biology). But perhaps more important than the substance is that the essence of what the Brexit vote tapped into, a sense that Big Tech and the State, ‘others’, interfere with everyday life in ways from which people want to ‘take back control’, is not going away.

Opening up classroom content opens a can of worms

The challenge for teachers plays out in their schools every day. Parents have a right to request that their child is withdrawn from sex education, but not from relationships education. In 2023, the DfE published refreshed guidance saying, “parents should be able to see what their children are being taught in RSHE lessons. Schools must share teaching materials with parents.”

I often argue that there is too little transparency and parental oversight of what is taught and how, and that parents should be able to see what is being taught and its sources: not with regard to RSE, but when it comes to edTech. We need a more open classroom when it comes to content from companies of all kinds.

But this also means addressing how far the rights of parents and the rights of the child complement or compete with one another, when it comes to Article 26(3) of the UDHR on education: “Parents have a prior right to choose the kind of education that shall be given to their children.” And how does this affect teachers’ agency and authority?

These clashes are starting to overlap, with a troubling lack of ethical oversight: from intrusive national pupil data-gathering exercises in England and in Scotland, both of which have left parents furious, to the data grab planned from GPs in Wales. Complaints will without a doubt become louder and more widespread, and public trust will be lost.

When interests are contested and not aligned, who decides what is in a child’s best interests for their protection in a classroom?

When does the public interest kick in, alongside individual interests in the public good of having children attend school and present to health services, and how are collective losses taken into account?

In the law today, responsibility for fulfilling a child’s right to education rests with parents, not schools. So what happens when decisions by schools interfere with parents’ views? When I think about children in the context of AI, automated decisions and design in edTech shaping child development, I think about child protection from strangers engineering a child’s development in closed systems. It matters to protect a child from an unknown and unlimited number of persons interfering with who they will become.

But even the most basic child protections are missing in the edTech environment today, without any public sector standards or oversight. I might object to the school about a product. My child might have a right to object in data protection law. But in practice, objection is impossible to exercise.

The friction this creates is going to grow, and there is no good way to deal with it right now. As the education sector is opened up to a wider range of commercial, outside parties, it is also opened up to the risks and challenges that brings. It can no longer be put in the box marked ‘too difficult’; it needs attention.

The backlash will only grow if the sense of ‘overreach’ continues.

Built-in political and cultural values

The values coming over here from the U.S. are not only arriving through parents’ grassroots groups, the religious right, or anti-LGBTQ voices in media of all kinds, but are coming directly into the classroom, embedded in edTech products. The values underpinning AI or any other technology used in the classroom are opaque because the people behind the product are usually hidden. We cannot therefore separate the products from their designers’ politics. If those products are primarily U.S.-made, it is unsurprising if the values of their education and political systems are those embedded in their pedagogy. Many of these seem less about the UNCRC Article 29 aims of education, and far more about purposes of education centred on creating human capital via “an emphasis on the knowledge economy that can reduce both persons and education to economic actors and be detrimental to wider social and ethical goals.”

This is nothing new.

In 2013, Michael Gove gave a keynote speech in the U.S. to the National Summit for Education Reform, hosted by an organisation set up by Governor Jeb Bush. He talked about edTech too, and the knowledge economy of education, and needing “every pair of hands” to “rebuild our economies”. Aside from his normalisation of ‘badging’ children in the classroom with failure (32:15) (“rankings of the students in the test were posted with the students name with colour codes… and some of the lower performers would wear a sticker on a ribbon with the colour code of their performance”), he also shared a view with echoes of the “great replacement theory”: that “the 20th century may be the last American Century … we face the fact that the West and the values that we associate with it, liberalism, openness, decency, democracy, the rule of law, risks being eclipsed by a Rising Sun from the East.” We could well ask, whose flavour of ‘liberalism’ is that?

The fight for or against a progressive future

Today, anti-foreign, anti-abortion, pro-natalist and conservative Christian values all meet in a Venn diagram of organisations pushing to undermine classically liberal aspects of teaching in England’s education system. And before this sounds a bit extreme, consider how these conspiracy theories and polarised views have been normalised. Listen (25:00) to the end of the discussion on “the nation state” at the 2023 NatCon UK Conference, co-badged with the Edmund Burke Foundation. Becoming a parent is followed by discussion of housing pressure *from migrants*, as well as a more-than-slightly eugenic-themed discussion of longevity, and then, in passing, AI. At the same event, the MP Miriam Cates claimed the UK’s low birthrate is the most pressing policy issue of the generation and is caused in part by “cultural Marxism”, as reported by the Guardian. Orbán in Hungary claimed in 2022 that he was fighting against “the great European population exchange … a suicidal attempt to replace the lack of European, Christian children with adults from other civilisations – migrants”.

These debates are inextricably linked in a fight for or against a progressive future. We have a Westminster Opposition now fighting for its own future, and the ‘culture wars’ have been a routine part of its frontbenchers’ media discussions for some time. Much of this is likely to continue to be played out in the education system, starting with the challenge to the Higher Education (Freedom of Speech) Act 2023, which always seemed to me more about the control of content on campus than its freedoms.

In today’s information society, Castells’ argument seems timely: that cultural battles for power are primarily fought in the media, where identity plays a critical role in influencing public policy and societal norms, where politics becomes theatre and, “citizens around the world react defensively, voting to prevent harm from the state in place of entrusting it with their will” (End of Millennium, p.383). Companies and vested interests have actual power, and elected leaders are left only with influence. This undermines the spirit of a democratic society.

The future of authority and competing interests

After the U.S. election result, that influence coming from across the Pond into UK public policy will not only find itself more motivated and more empowered, but likely, better funded.

Why all this matters for schools is that we are likely to see more of these polarised value-sets imported from the U.S., and there is no handbook for school governors or staff, of whatever background, on managing parents and the strong feelings it can all create. Nor does the sector understand the legal framework it needs to withstand it.

Having opened up classrooms to outside interests in classroom content, some families are pulling children out of school over fundamental disagreements with those values and the vehicles for their delivery: from the contents of teaching, to intrusive data surveys, to concerns over commercialisation and the screen time of tech-based tools without proven beneficial outcomes. Whose best interests does the system serve, and who decides whose interests come first when they are in conflict? How are these to be assessed and explained to parents and children, together with their rights?

How do teachers remain in authority where they are perceived as overstepping what parents reasonably expect, or where AI manages curriculum content and teachers cannot explain its assessment scoring or benchmarking profile of a pupil? What should the boundaries be, especially as edTech blurs them between school and home, teachers and parents? We need to far better train and support educational staff in practice, to be prepared to manage competing sources of authority, and the emerging fight over interests and rights.

School absence: Bums-in-seats-thinking is the wrong problem, and AI the wrong solution

What school absence trends do you believe AI can spot that current methods cannot, to improve coordination between education, social care and the wider services that support families?

Since the pandemic, conservative-leaning organisations, most notably the CSJ, have been banging a drum very loudly about children not in school. This is not about privately educated children, but children they expect to be in state educational settings aged 5-18. Often, other people’s children.

In recent debate, various sub-categories of children not in school are conflated. In 2022, various stakeholders got together and, with Defend Digital Me, we worked to point out in a non-partisan way where that thinking was wrong.

It is wrong on who is not counted (and who is). It is wrong in the assumptions built into conflating children not in school with children ‘not in receipt of suitable education’. It is wrong in assuming that children not in school are not already on registers and recorded in Local Authority databases. And it is wrong to ignore that the definition of persistent absence means children are classified as persistently absent more quickly now than previously: the same number of children could be absent for the same number of school days as they were a decade ago, but it will be reported as having doubled.

And it is arguably morally wrong in ignoring a very significant part of the problem as they see it: why children are not in school, once you strip out all that is wrong with the counting and assumptions. And what are the consequences for children of forcing them to attend without their consent, or of failing to respect families’ choice and agency?

Absence data (on school roll, not attending)

In the 2023/24 academic year up to 8 December 2023, DfE data shows that the attendance rate across the academic year to date was 93.4%. The absence rate was, therefore, 6.6% across all schools and the unauthorised rate was far less. By school type, the absence rates across the academic year 2023/24 to date were:

  • 5.1% in state-funded primary schools (3.7% authorised and 1.4% unauthorised)
  • 8.3% in state-funded secondary schools (5.2% authorised and 3.1% unauthorised)
  • 12.6% in state-funded special schools (9.5% authorised and 3.0% unauthorised)

Over 1.5 million pupils in England have special educational needs (SEN)

An increase of 87,000 from 2022. Both the number of pupils with an education, health and care (EHC) plan and the number of pupils [recorded] with SEN support have increased:

  • The percentage of pupils with an EHC plan has increased to 4.3%, from 4.0% in 2022.
  • The percentage of pupils with SEN but no EHC plan (SEN support) has increased to 13.0%, from 12.6% in 2022.

Both continue a trend of increases since 2016. As does the number of stories you hear of parents asked to bring children in only part-time because the school cannot get an EHC plan approved (Local Councils have no money), without which schools cannot access the funds or allocate the staff and resources needed for that child.

Which children are not in school?

The concept of children-not-in-school should have nothing at all to do with Elective Home Education (“EHE”). The premise of ‘not in school’ is that children are not attending school ‘but they should be’, and that action will be taken as a result. EHE children are not on a school roll and are not expected to be. No action should be taken to get them into schools.

A Guardian article today (Jan 9th) quotes Wendy Charles-Warner, chair of home education charity Education Otherwise, who sums up one problem here: “Yet again we see an inappropriate and frankly mangled conflation of [elective] home education and absenteeism.

“Home education is of equal legal status to school education and it is certainly not ‘non-attendance’. Home educated children are in full-time education, they are not school pupils let alone absent school pupils.

“A register of home-educated children will make no difference whatsoever to school absenteeism and, before proposing such a significant step, the Labour party should educate itself to the very basic facts of the matter.”

It was frustrating to hear Bridget Phillipson today give the same impression that many other MPs had in the 2022 debates on the Schools Bill, using selective evidence: that no one knows how many children are home educated. Every Local Authority that replied to our 2021-22 data requests already had a register of EHE children.

Proposals to legislate for a new national register of children not in school were part of the Government’s now-scrapped Schools Bill and were hotly contested and debated in the House of Lords. There is no compelling case to have one.

We assessed the plans at Defend Digital Me as part of the Counting Children coalition: they were not only fundamentally flawed on policy, but practically flawed too. The legislation on the database as set out would, for example, have meant double counting a whole swathe of children already on school registers but in Alternative Provision or attending part-time.

The plans conflated Elective Home Education (“EHE”, not on a school roll) with absenteeism (pupils registered on a school roll but absent), and would have double counted some children attending alternative settings part-time. They conflated these children with Children Missing Education (“CME”: not on a school roll, assessed as ‘not in receipt of a suitable education otherwise’, so they should be on a school roll but are not, and are known to the State). And they further conflated those three groups with children not on any database at all, unknown to the state education system.

Piling elective home educators in with children at home waiting for places or suitable state school services, with children already on roll but part-time, and with truancy, would have created a toxic ‘victims and perpetrators’ style narrative like the Met Police Gangs Matrix: people to be treated with suspicion and subjected to additional (often centralised) state surveillance, all on the basis of what would have been bad numbers.

The not-in-school register plans angered many, the Charedi community among others. Home educators protested outside parliament and filled the public gallery in the House of Lords on the day the plans were most specifically debated in the Schools Bill.

I know of cases where children are wrongly labelled CME by Local Authorities. Some LAs cannot (for what seems like nothing but stubborn jobsworth bureaucracy) accept that children who are home educated can be in receipt of suitable education, even if an LA’s opaque methods of measuring ‘suitability’ are so arbitrary, intrusive and out of step with the law as to be understandable only to them. Such records create fundamentally flawed and inaccurate family portraits, turning EHE records into CME records mid-year (spot the double counted child), without recourse or route for redress. Some of these data are therefore opinions, not facts. Automating any of this would be a worse mess.

Known-to-the-state CME children are on Local Authority databases even if they were never enrolled at a school. If they were in school and left, even if they then ‘fall off the radar’, one of fifteen tick-box reasons-for-leaving is recorded on their detailed named record and kept. If they disappear without a known next educational setting, they are in addition added to the part of the Common Transfer File system that posts children’s records into the Lost Pupils Database (LPD). They are pulled out of the LPD once a state school ‘receives’ them again.

Children about whom nothing is known by the state, the so-called ‘invisible children’, cannot be magically added to any database. If they were known today, they would already be on the existing databases. It is believed there are very few of these but, of course, the number is unknown, and by definition it can never be known. The NCB and Children’s Commissioner have made guesstimates of around 3,000 individual children.

The CSJ has, in my view, whether accidentally or by intent, wrongly hyped up the perceived numbers of those children by inventing the new term “ghost children”. This has made the everyday listener or MP think of these children as ‘unseen by the state’. The CSJ term has been used sweepingly in the media and parliament to cover any child not in school, so the perceived “problem” is wrongly seen as (a) one and the same thing and (b) much larger than it is in reality.

That reshaping of reality matters. It has been a persistently retold half-truth since 2021 (the Telegraph published my letter to the editor on it in March 2022). Still, it seems not easily fixed by fact alone. The costs of new databases duplicating data that already exist would be far better spent on patching up the 70% cuts to Local Authorities’ youth services, CAMHS, or the Early Intervention Grant, or basically anything else for children, young people and families.

Which children do we know should be in school?

Remembering the claims that we need new registers of children not-in-school: how many children do you think are known to be missing education, recorded as CME, in any one Local Authority? Yes, these children are already recorded by name at LAs.

Local authorities have a duty under section 436A of the Education Act 1996 to make arrangements to identify, as far as it is possible to do so, children missing education. What is possible, is already done.

There are currently 152 local education authorities in England and through the dedicated volunteer effort from the Counting Children coalition, we asked all of them for their data (as of June 30, 2021).

There were zero Children Missing Education (“CME”) in Powys, Wales. In Blackpool, by contrast, there were many more: 45 CME (in the area waiting for provision to start, mainly recently arrived), 112 children missing “out” (left the area, being tracked, of whom 61 had been located), and 307 Elective Home Education (“EHE”). Across the academic year September 2020 to July 2021, the Isle of Wight recorded 49 children as CME, and in East Riding there were 17. Leicester even noted that children have been recorded on its registers in these ways since 2003.

Do those numbers surprise you? Local Authorities also already collect a lot of data about each child out of school. For example, Harrow’s central database on children not in school includes Family Name, Forename, Middle name, DOB, Unique Pupil Number (“UPN”), and Former UPN. (For adopted children and children-at-risk a former UPN should not be retained, given the risks UPNs create for those children, but who knows if that is respected; see 6.5 and 6.6. The UPN is supposed to be a secure state identifier and, as such, has special protections, including being a blind identifier, and it should lapse when children leave school; see pages 6-8.) There is also the Unique Learner Number (ULN).

Any new number policymakers suggest inventing would need to be subject to the same protections for the child (and throughout their adult life), and therefore it would serve little purpose to create yet another new number.

The list goes on, of what is collected in CME Local Authority databases on each named child: Address (multi-field), Chosen surname, Chosen given name, NCY (year group), Gender, Ethnicity, Ethnicity source, Home Language, First Language, EAL (English as additional language), Religion, Medical flag, Connexions Assent, School name, School start date, School end date, Enrol Status, Ground for Removal, Reason for leaving, Destination school, Exclusion reason, Exclusion start date, Exclusion end date, SEN Stage, SEN Needs, SEN History, Mode of travel, FSM History, Attendance, Student Service Family, Carer details, Carer address details, Carer contact details, Hearing Impairment and Visual Impairment, Education Psychology support, and Looked After status. (For in-school children, the list is even longer, it lasts a lifetime, and it’s given away too.)

Yet the Schools Bill would have granted Local Authorities powers to expand this already incredibly intrusive list to any further data at all of their choosing, without any limitation.

The CSJ, perhaps most accurately, states in one report that it is vulnerable children who are most affected by missing school time, but this must not be conflated with Children Missing Education:

“In Autumn 2022, the latest term for which data is available, children in receipt of Free School Meals (FSM) had a severe absence rate which was triple the rate for children who were not eligible for FSM. Children in receipt of special educational needs (SEN) support are also more likely to be severely absent than their peers.”

Absenteeism and Children Missing Education are NOT the same. From the numbers above, I hope it is clear why.

The perception of reality matters in this topic area specifically because it is portrayed by the CSJ as an outcome of the pandemic. The CSJ is not politically neutral, given its political founders, steering group and senior leadership with strong ties to the lockdown-sceptic COVID Recovery Group. That matters because it influences, and enables other influencers, to set the agenda on what is seen as the cause of and solution to a set of problems, and the public policy interventions that are taken or funded as a result. At the 2022 Tory party conference event on this subject, which I also wrote up afterwards here, Iain Duncan Smith failed to acknowledge, even once, that thousands of people in the UK have died and continue to die of or with COVID-19, or to live with its lasting effects.

It was such a contrast, and a welcome difference, from the tone of Bridget Phillipson MP’s speech today at the CSJ, that she acknowledged what the pandemic reality was for thousands of families.

And after all, according to a King’s Fund report, “Overall, the number of people who have died from Covid-19 to end-July 2022 is 180,000, about 1 in 8 of all deaths in England and Wales during the pandemic.” Furthermore, in England and Wales, “The pandemic has resulted in about 139,000 excess deaths”, and “Among comparator high-income countries (other than the US), only Spain and Italy had higher rates of excess mortality in the pandemic to mid-2021 than the UK.”

At the 2022 Conservative Conference fringe event chaired by IDS, while the panel made several references to the impact of the pandemic on children’s poor mental health, no one mentioned the 70% cut to youth services’ funding over ten years that allowed CAMHS funding and service provision to wither and fail children well before 2020. The pandemic exacerbated children’s pre-existing needs, which the government has not only failed to meet since, but for which it has actively rationed and reduced provision. The event chair, Iain Duncan Smith, is also the architect of Universal Credit. And this matters in this very closely connected policy area for measuring and understanding the effectiveness of all these interventions.

Poverty and school attendance can be, but are not always, correlated or causally linked. But while we focus on the (inaccurately presented) number of children not in school, we fail to pay attention to the big picture and the conflated causes of children not being in school. Missing bums on seats is not the problem, but a symptom. In some cases, literally: historically, the main driver of absence is illness. In 2020/21, illness absence was 2.1% across the full year, a reduction on the rates seen before the pandemic (2.5% in 2018/19).

What does persistent absentee mean?

Another part of these numbers often presented in the media is the ‘persistent absence’ rate. But is it meaningful? In 2018/19 the rate of persistent absentees (missing 10% of possible sessions, the equivalent of one morning or one afternoon every week) was 10.8%. Now it is reported as around double that. But bear in mind that, at any point in time, the label ‘persistent absentee’ may mislead the average man on the street.

Don’t forget this is also a movable numbers game. Understanding the definition of persistent absence, and that it has changed three times since 2010, is critical to appreciating the numbers and data used in discussing this subject. Children are classified as persistently absent more quickly now than previously. The same number of children could be absent for the same number of school days as they were a decade ago, but it will be reported as having doubled.

A child who misses a full day in the term through unauthorised absence (illness) now stays marked as ‘persistently’ absent until their sessions in school outweigh the 10% missed. So if they have 3.5 days of tummy bug or flu, which every parent knows is pretty much the norm at “Back-to-School”, then even the most dedicated pupils look ‘persistently’ absent for quite a while. It is not only guilt-inducing for the students who care; it creates stress to go in at all costs (including before recovery, with risk of infection) for the ill, and stigmatises the disabled and those with long-term health conditions.
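
To make the arithmetic concrete, here is a minimal sketch of how the current 10% threshold plays out, assuming two sessions per school day, a five-day week, and the seven missed sessions (3.5 days) from the example above. The code and figures are mine, for illustration only, not the DfE’s own calculation.

```python
# Minimal sketch of the persistent absence arithmetic described above.
# Assumptions (mine, illustrative): two sessions per school day, a five-day
# week, and the current threshold of missing 10% or more of possible
# sessions to date. Not the DfE's official calculation.

def is_persistently_absent(missed: int, possible: int, threshold: float = 0.10) -> bool:
    """Classed as persistently absent once missed sessions reach 10% of possible."""
    return missed >= threshold * possible

missed = 7  # 3.5 days of back-to-school flu = 7 sessions
for week in range(1, 10):
    possible = week * 10  # 5 days x 2 sessions per week
    label = "persistently absent" if is_persistently_absent(missed, possible) else "clear"
    print(f"Week {week}: {missed}/{possible} sessions missed -> {label}")

# The label sticks until week 8: at week 7, 7/70 is exactly 10%, so the pupil
# is still flagged; only at week 8 (7/80 = 8.75%) does the flag clear.
```

With the higher thresholds used in earlier years, the same seven missed sessions would stop counting much sooner, which is how a definition change alone can inflate the reported rate.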

A label unfit for purpose, it could usefully be re-named and the topic re-framed to rebuild trust with learners and families.

[Table: definitions of persistent absence by number of daily missed sessions. Credit to George Stephenson High School, Newcastle upon Tyne. https://www.gshs.org.uk/attendance/what-persistent-absence-student]

Another technical thing that could usefully be updated in terms of data collection is the inconsistency across Local Authorities in which data they record for which age group. For example, children missing education (“CME”) is often recorded from Reception through to age 16, but other categories cover children to age 18, and to age 25 for young people with a special educational needs and disability (SEND) plan. Many Local Authorities use the definition in section 8 of the Education Act 1996, which is out of step with the more recently revised school leaving age in England.

Data-led decisions are not smart solutions

This is fundamentally not about data, but children’s lives. In these debates there is a grave risk that a focus on the numbers, because of the way the data is presented, perceived, or used, means that reducing the numbers themselves becomes the goal. The data is there. Joining it all up may feel like doing ‘something’, but it is not going to contribute anything to getting bums on seats or to delivering a quality education to every child. It won’t contribute to a solution, except perhaps to some AI company CEO’s bottom line. And at what cost to children, whether through direct harm or through what you choose not to fund instead? AI is not the solution, or even a reliable tool, when it comes to children’s social issues.

Entering data into a system in the hope of ‘spotting patterns’, without asking precise questions of the data, is rather like gazing into a crystal ball. The computer cannot ‘guess’ what you are looking for. Failing to design for the outcomes you want the system to achieve is symptomatic of the real analysis of the problem: a lack of human authority and accountability.

It was said in the Laming report of the Victoria Climbié Inquiry that referrals could come in by fax, streaming onto the floor, with nobody picking them up: “It was not my job to pick up the fax from the fax machine. It was not my role. I had other things to do.” (5.22) Staff working in children’s and families’ services do not all need to be on one database to record data; they can work on decentralised systems with role-based access, as sketched below. Systems can draw together data and present it without the need for a single record. Those are design questions, not a justification for building more databases and more national identifiers, which may do nothing but duplicate the existing dysfunctions.
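
As one illustration of that design point, here is a minimal sketch of a role-based, decentralised presentation layer. Every service, role and field in it is hypothetical: it shows only the shape of the idea, a joined-up view assembled per query while the data stays in each service’s own system, rather than being copied into one national record.

```python
# Sketch only: a joined-up view over decentralised records with role-based
# access, instead of one central database. All services, roles and fields
# here are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class RecordSource:
    service: str              # e.g. "education", "social care", "health"
    allowed_roles: set[str]   # roles permitted to read this source

    def fetch(self, child_ref: str) -> dict:
        # In a real system this would query the service's own database in place.
        return {"service": self.service, "ref": child_ref, "summary": "..."}

def joined_up_view(child_ref: str, role: str, sources: list[RecordSource]) -> list[dict]:
    """Assemble a per-query view from only the sources this role may access.
    No single central record is created; data stays with each service."""
    return [s.fetch(child_ref) for s in sources if role in s.allowed_roles]

sources = [
    RecordSource("education", {"teacher", "social_worker"}),
    RecordSource("social care", {"social_worker"}),
    RecordSource("health", {"school_nurse", "social_worker"}),
]

# A teacher sees one source; a social worker's view draws together all three.
print(joined_up_view("ref-001", "teacher", sources))
print(joined_up_view("ref-001", "social_worker", sources))
```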

The Laming Review found that by the late 1990s social services had lost all of its human resources and training staff. Workers were overworked and missed things the computer showed them. Computers cannot make people do their job. Access to information does not create accountability for action.

In Victoria’s case, one example given was of a computer printout displaying a unique child reference number that noted physical bruising. The links were there for anyone with access to see:

“but it seems likely that as the administrative staff were struggling to cope with the backlog of work at the time, it was simply overlooked.”

Children can experience the concentrated harms of poverty more than many in society, some by government policy design. The data on child poverty is sometimes contradictory, but it is obvious that child poverty is not only more widespread but deeper now than it was when the Conservative party took power in 2010. And it is important to consider a third factor in the overlap between children’s school attendance and child poverty. Amos Toh, senior tech and human rights researcher at Human Rights Watch, wrote recently on AI and public policy:

“As part of the welfare system since 2010, the government has ceded control of the country’s social assistance system to algorithms that often shrink people’s benefits in unpredictable ways. The system, known as Universal Credit, was rolled out in 2013 to improve administrative efficiency and save costs, primarily through automating benefits calculations and maximizing digital “self-service” by benefit claimants.”

Universal Credit has had a range of contested outcomes, but what should be uncontested is that the algorithms it uses are flawed in various ways, in various parts of the system.

So when I heard Shadow Education Secretary Bridget Phillipson say today that “artificial intelligence (AI) will be used by Labour to spot absence trends to improve coordination between education, social care and the wider services that support families,” and announce “plans to legislate for a new register of children in home education,” what I think to myself is this. Regardless of which political party is in power, imagine there were a (duplicated) new national database of children who are home educated and/or known to be missing education. Imagine these statistics available at national level in one database (after all, only statistics, not named records, might be seen as necessary and proportionate beyond direct care, and in practice the data will always be out of sync with local data), and imagine the money has not been spent on youth services or Early Years intervention, but on ‘fix-it-all’ AI. What will change?

Fixing Britain is a People Problem

If you have followed the BBC Radio 4 Louise Casey ‘Fixing Britain’ series, you may or may not agree with all the suggestions, but in the episode on Universal Credit the summary is relevant to all of them. A public policy focus on money and legislation often forgets what policy is about: people. “Policy disconnected from its purpose [people] is going to fail.” Policymakers often fail to understand the lives of the people the policy is intended to affect. Some of that today looks like this:

On poverty: There are far more food banks in the UK than branches of McDonalds.

On school leavers’ aspirations and opportunity: One third of children fail to get a pass in maths and English GCSE, the gatekeeper to many jobs. AI-supported recruiting tools simply sort and remove those without the qualifications in the application process. Today one third of children are excluded from education and job opportunities not because they are necessarily unsuitable applicants, but because the grade boundaries are set so that one third get a D or lower.

On bad parenting: Too often conflated into this debate by the Children’s Commissioner; a child not in school is not a sign of bad parenting, any more than a child sent into school is a sign of good parenting. Even joined-up professional services and home visits can be fobbed off and still fail to act on signs of neglect and abuse. Parents and children on the radar of social services, on school rolls and in school, are known to the system, and yet it still fails, due to ‘underfunding of social services and the court system’.

On cuts to human support: Social worker vacancies were reported at a record high of 7,900 in 2022, a 21% rise on 2021. “The risks have been shown in safeguarding reviews after a series of scandals. A review of Bradford’s children’s services following Star Hobson’s death found record levels of vacancies and sickness among social workers.” No amount of data or artificial intelligence can plug that hole in human capacity.

On policy aims: Much of this has been debated again and again. From twenty years ago, to the 2023 House of Commons Committee report on Persistent absence and support for disadvantaged pupils.

On children: Above all, contrary to some narratives, being in school is not always in a child’s best interests. If you ask primary children what they like and don’t like about school, the answers may have changed little over time, because what matters most to them is how school makes them feel. Some love sport, drama, music and art, and are frustrated there is so little of it, and none at all from age 13, when the curriculum narrows to KS4 too early. For many it is not safe or supportive of their needs. Some are not fine in school and some need specialist support. Expert individuals and organisations identify those needs and are there to help. Children are rarely offered a choice or asked if they want to be in school, but without a consensual part in it, it doesn’t work.

On fault: Blame is too often laid at parents’ feet, whether in the narrative that parents are feckless or that they fail to teach children to brush their teeth. In the year of a General Election, how will this land with people who voted for the narrative, “Take back control”?

Control and choice

It was refreshing to hear Phillipson move at least a little away from blame, towards responsibility and trust. The roles of responsibility and trust in the system are, however, unevenly distributed and possibly underappreciated. Perhaps coincidentally, many parents and teachers today are among the first who paid the highest costs for, and still carry the largest debts from, their Higher Education. As reported by politics.co.uk, “the results of the annual Higher Education Policy Unit and the Higher Education Academy student experience study in 2017 showed that just 35% of respondents believed their higher education experience represented ‘good’ or ‘very good’ value for money.” If parents see and act as if education is less a gift in life or a public good, and more a package that comes with consumer rights attached, can you blame them? It was Labour that introduced the first student fees for Higher Education.

It was the Conservatives who made ‘choice’ in the schools market central to their messaging on the role of parents in education for a decade. In the U.S. they are now seeing the results of that ‘choice’ message, made even more extreme through per-pupil cash transfers and by the political culture-war divisions driven between communities and state schools, which have helped steer state money away from the mainstream state school system.

Phillipson is right about why the current government approach isn’t working: “that broader reality is why the government’s approach – an Attendance Action Alliance – falls so far short of the challenge. Insofar as it tackles anything, it tackles the symptom, not the causes.”

But tackling things with different but equally wrong tools won’t be better.

Failing to heed the lessons of infrastructure projects, of pupil data, and of AIs past and present dooms us to repeat the same mistakes. Who is in school is an outcome of the experience of the system at individual level, and of whether it delivers in the context of each child’s family and community life and the aims and quality of education. Focusing only on getting children into the classroom is of little value without understanding what the experience is like for them once there. The outcome of children not being in school is not only a societal question, but one of long-term sustainability for England’s state school system as a whole.


 

Ensuring people have a say in future data governance

Based on a talk prepared for an event in parliament, hosted by Connected By Data and chaired by Lord Tim Clement-Jones, focusing on the Data Protection and Digital Information Bill, on Monday 5th December 17:00-19:00. “Ensuring people have a say in future data governance”.

Some reflections on data in schools: (a) general issues; (b) the direction of travel the Government is going in; and (c) what should happen, in the Bill or more widely.

Following Professor Sonia Livingstone, who focussed primarily on the issues connected with edTech, I focussed on the historical and political context of where we are today on ‘having a say’ in education data and its processing in, across, and out of the public sector.


What should be different with or without this Bill?

Since I ran out of time yesterday, I’m going to put first what I didn’t get around to: the key conclusions that point to what is possible with or without new Data Protection law. We should be better at enabling the realisation of existing data rights in the education sector today. The state and extended services could build tools for schools to help them act as controllers, and for children to realise their rights, like a PEGE (a personalised exam grade explainer to show exam candidates what data was used to calculate their grade, and how). Data usage reports should be made available at least annually by schools, to help families understand what data about their children has gone where. Methods that enable the child or family to correct errors or exercise a Right to Object should be mandatory in schools’ information management systems. Supplier standards on accuracy and error notifications should be made explicit and statutory, and supplier service level agreements should be affected by repeated failures.
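
To illustrate (and only illustrate) the idea, a school’s annual data usage report to a family might look something like the sketch below. The fields, recipients, purposes and lawful bases are hypothetical placeholders, not a proposed schema.

```python
# Sketch of what an annual "data usage report" from a school to a family
# might contain, as proposed above. Hypothetical throughout: field names,
# recipients, purposes and lawful bases are illustrative, not a real schema.

import json
from dataclasses import dataclass, asdict

@dataclass
class DataFlow:
    recipient: str       # who received the data
    items: list[str]     # which data items were shared
    purpose: str         # stated purpose of the transfer
    lawful_basis: str    # basis relied on under data protection law
    can_object: bool     # whether a route to object or correct is offered

report = {
    "pupil": "UPN withheld",
    "school_year": "2022/23",
    "flows": [asdict(f) for f in (
        DataFlow("DfE school census", ["name", "DOB", "SEN status"],
                 "statutory return", "legal obligation", False),
        DataFlow("EdTech supplier X", ["attainment scores", "login activity"],
                 "homework platform analytics", "public task", True),
    )],
}
print(json.dumps(report, indent=2))
```

A report like this would make each onward flow, its purpose, and the available routes to object visible to the family at a glance, which is the point of the proposal.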

Where is the change needed to create the social license for today’s practice, even before we look to the future?

“Ensuring people have a say in future data governance.” There has been a lot of asking lots of people for a say over the last decade. When asked, the majority of people generally want the same things, whether they are willing or less willing to have personal data about them re-used that was collected for administrative purposes in the public sector: to be told what data is collected for and how it is used, to opt in to re-use, to be able to control distribution, and to have protections for redress and against misuse strengthened in legislation.

Read Doteveryone’s public attitudes work, the Ipsos MORI polls, or the work by Wellcome (see below). Or even the care.data summaries.

The red lines in the “Dialogues on Data” report, from workshops carried out across different devolved regions of the UK for the 2013 ADRN, remain valid today (and that was about the reuse of de-identified, linked public admin datasets by qualified researchers in safe settings, not even raw identifying data), in particular in relation to:

  • Creating large databases containing many variables/data from a large number of public sector sources
  • Allowing administrative data to be linked with business data
  • Linking of passively collected administrative data, in particular geo-location data

“All of the above were seen as having potential privacy implications or allowing the possibility of reidentification of individuals within datasets. The other ‘red-line’ for some participants was allowing researchers for private companies to access data, either to deliver a public service or in order to make profit. Trust in private companies’ motivations were low.”

Much of this reflects what children and young people say as well. RAEng (2010) carried out engagement work with children on health data, Privacy and Prejudice: young people’s views on the development and use of Electronic Patient Records. They are very clear about wanting to keep their medical details under their own control and away from the ‘wrong hands’, which include potential employers, commercial companies and parents.

Our own small-scale engagement work with a youth group aged 14-25 was published in 2020 in The Words We Use in Data Policy: Putting People Back in the Picture, and reflected what the Office for Statistics Regulation went on to publish in its 2022 report, Visibility, Vulnerability and Voice (a framework to explore whether current statistics are helping society to understand the experiences of children and young people in all aspects of their lives). Young people worry about misrepresentation, about data being used in place of conversations with them to take decisions that affect their lives, and about the power imbalance this creates without practical routes for complaint or redress. We all agree children’s voice is left out of the debate on data about them.

Parents are left out too. Defenddigitalme commissioned a parental survey via Survation (2018): under 50% of parents felt they had sufficient control of their child’s digital footprint, and two-thirds had not heard of the National Pupil Database or its commercial reuse.

So why is it that the public voice, loud and clear, is ignored in public policy and ignored in the drafting of the Data Protection and Digital Information Bill?

When it comes to education, debate should start with children’s and family rights in education, and education policy, not about data produced as its by-product.

Article 26 of the Universal Declaration of Human Rights grafts a parent’s right onto the child’s right to education, to choose the type of that education, and it defines the purposes of education:

“Education shall be directed to the full development of the human personality and to the strengthening of respect for human rights and fundamental freedoms. It shall promote understanding, tolerance and friendship among all nations, racial or religious groups, and shall further the activities of the United Nations for the maintenance of peace.”

Becoming a set of data points for product development or research is not the reason children go to school and hand over their personal details in the admissions process at all.

The State of the current landscape
To realise change, we must accept the current state of play and current practice. This includes a backdrop of trying to manage data well in the perilous state of public infrastructure, shrinking legal services and legal aid for children, ever-shrinking educational services in and beyond mainstream education, staff shortages and retention issues, and the lack of ongoing training or suitable and sustainable IT infrastructure for staff and learners.

Current institutional guidance and national data policy in the field is poor and takes the perspective of the educational setting but not the person.

Three key issues are problems from the top down and across systems:

  • Data repurposing: e.g. SATs Key Stage 2 tests, which are supposed to measure school performance, not individual attainment, are re-used as risk indicators in Local Authority datasets to identify families for intervention, which they were not designed for.
  • Vast amounts of data distribution and linkage with other data: policing, economic drivers (LEO), and broad Local Authority data linkage without consent, for purposes that exceed the original collection purpose parents are told of, used, as in Kent or Camden, “for profiling the needs of the 38,000 families across the borough”, plus further automated decision-making.
  • Accuracy in education data is a big issue, in part because families never get to see the majority of data created about a child, much of which is opinion and not submitted by them: e.g. the Welsh government fulfilled a Subject Access Request from one parent concerned about their own child’s record, and ended up revealing that every child in 2010 had been wrongly recorded, thanks to a Capita SIMS coding error, as having been in care at some point in the past. Procurement processes should build penalties for systemic mistakes, and lessons learned like this, into service level agreements, but instead we seem to allow the same issues to repeat over and over again.

What the DfE Does today

Government needs to embrace the fact that it can only get data right if it does the right thing. That includes policy that upholds the law by design. This needs change in its own purposes and practice.

National Pupil Data is a bad example from the top down. The ICO’s 2019-20 audit of the Department for Education has not yet been published in full, but its findings included failings such as no Record of Processing Activity (ROPA), an inability to demonstrate compliance, and no fair processing. All of which will be undermined further by the Bill.

The Department for Education has been giving away 15 million people’s personal confidential data since 2012 and has never told them. It knows this. It chooses to ignore it. And on top of that, it did not inform people who have been in school since then that Mr Gove changed the law. So now over 21 million people’s pupil records are being given away to companies and other third parties, for use in ways we do not expect, and they are misused too. In 2015, more secret data sharing began, with the Home Office, and another pilot followed in 2018 with the DWP.

Government set out to change the law on education admin data in 2012 and got it wrong. Education data alone is a sin bin of bad habits and a complete lack of public and professional engagement, before even starting to address data quality, accuracy, and backwards-looking policy built on bad historic data.

“The Commercial department do not have appropriate controls in place to protect personal data being processed on behalf of the DfE by data processors.” (ICO audit of the DfE, 2020)

Gambling companies ended up misusing access to learner records for over two years, exposed in 2020 by journalists at the Sunday Times.

The government wanted nationality data to be collected by the Department for Education for the purposes of another department (the Home Office) and got it very wrong. People boycotted the collection until it was killed off, and the data was later destroyed.

Government changed the law on Higher Education in 2017 and got it wrong. Now third parties pass around named equality monitoring records, like religion, sexual orientation, and disability, which are stored forever on named national pupil records. The Department for Education (DfE) now holds sexual orientation data on almost 3.2 million people, and religious belief data on 3.7 million.

After the publication of the summary findings of the ICO’s compulsory audit of the Department for Education, the question now is what the Department and government will do to address the 139 recommendations for improvement, over 60% of which were classified as urgent or high priority. Is the government intentional about change? At defenddigitalme we don’t think so; so we are, and we welcome any support for our legal challenge.

Before we write new national law we must recognise and consider UK inconsistency and differences across education

Existing frameworks, law, statutory guidance and recommendations need to be understood in the round: e.g. devolved education, including the age (16) at which a child has the capacity to undertake a contract in Scotland; the geographical application of the Protection of Freedoms Act 2012; and the Prevent Duty since 2015 and its wider effects, as a result of profiling children in counter-terrorism, that reach beyond poor data protection and impacts on privacy (see the UN Special Rapporteur’s 2014 report on children’s rights and freedom of expression). A plethora of Council of Europe work is applicable to education in the UK as a member state: guidelines on data protection, AI, human rights, the rule of law, and the role of education in the promotion of democratic citizenship and as a protection against authoritarian regimes and extreme nationalism.

The Bill itself
The fundamental principles of the GDPR and Data Protection law are undermined further from an already weak starting point, since the 2018 Act adopted exemptions, in immigration and law enforcement, that other countries did not introduce.

  • The very definitions of personal and biometric data need close scrutiny.
  • Accountability is weakened (DPO, DPIA and prior consultation for high risk no longer necessary, ROPA)
  • Purpose limitation is weakened (legitimate interests and additional conditions for LI)
  • Redress is missing (Children and routes for child justice)
  • Henry VIII powers on customer data and business data must go.
  • And of course it only covers the living. What about misuse of children’s data that causes distress and harms human dignity but is not strictly covered by UK Data Protection law, such as the children whose identities were used by undercover police in the SpyCops scandal? Recital 27 of the GDPR permits a possible change here.

Where are the Lessons Learned reflected in the Bill?

This Bill should be able to look at recent ICO enforcement action or judicial reviews to learn where and what is working, and not working, in data protection law. Lessons should be plentiful on public communications and fair processing, on the definitions of research, on discrimination, accuracy and bad data policy decisions. But where in the Bill are the lessons learned from health data sharing: why the care.data programme ran into trouble and similar failures were repeated in the most recent GP patient data grab, or Google DeepMind and the Royal Free? In policing, from the Met Police Gangs Matrix? In Home Affairs, from the judicial review launched to challenge the lawfulness of an algorithm used by the Home Office to process visa applications? Or in education, from the summer 2020 exams fiasco?

The major data challenges as a result of government policy are not about data at all, but bad policy decisions which invariably mean data is involved because of ubiquitous digital first policy, public administration, and the nature of digital record keeping. In education examples include:

  • Partisan political agendas: e.g. the narrative around absence numbers makes no attempt to disaggregate the “expected” absence rate from anything on top of it, and presenting as fact the idea that 100,000 children have not returned to school “as a result of all of this” is badly misleading, to the point of being a lie.
  • Policy that ignores the law: the biggest driver of profiling children in the state education sector, despite the law that profiling children should not be routine, is the Progress 8 measure, about which Leckie and the late Harvey Goldstein (2017) concluded, in their work on the evolution of school league tables in England 1992-2016 (‘Contextual value-added’, ‘expected progress’ and ‘progress 8’), that “all these progress measures and school league tables more generally should be viewed with far more scepticism and interpreted far more cautiously than [they] have often been to date.”

The Direction of Travel
Can any new consultation or debate on the changes promised in data protection reform ensure people have a say in future data governance, the topic for today? And what, if any, difference would it make?

Children’s voice, and the framing of children in the National Data Strategy, is woeful: children are projected either as victims or as potential criminals. That must change.

Data protection law has existed in much the same form as today since 1984. Yet we pay scant attention to it in ways that meet public expectations, fulfil parental and children’s expectations, or respect the basic principles of the law today. In England we have enabled technologies to enter classrooms without any grasp of their scale or risks, lacking even the Local Authority oversight and controls over procurement standards that Scotland has. Emerging technologies are all in classrooms now, interfering with real children’s lives and development, not in some far-off imagined future: tools that claim to identify emotion and mood and use brain scanning, the adoption of e-proctoring, and mental health prediction apps that are treated very differently from how they would be in the NHS Digital environment, with its ethical oversight and quality standards to meet.

This goes beyond data protection into procurement, standards, safety, understanding pedagogy, behavioural influence, and policy design and digital strategy. It is, furthermore, naive to think this legislation, if it happens at all, is going to be the piece of law that promotes children’s rights when the others in play from the current government do not: the revision of the Human Rights Act, the recent PCSC Bill clauses on data sharing, and the widespread use of exemptions and excuses around data for immigration enforcement.

Conclusion
If policymakers who want more data usage treat people as producers of a commodity, and continue to ignore the public’s “say in future data governance”, then we will keep seeing the boycotts and the opt-outs, creating mistrust in government as well as in data conveners and controllers, and widening the data trust deficit. The culture must change in education and other departments.

Overall, we must reconcile the focus of the UK national data strategy with a rights-based governance framework, to move the conversation forward in ways that work for the economy and research, with the human flourishing of our future generations at its heart. Education data plays a critical role in social, economic, democratic and even security policy today, and needs urgent and critical attention.


References:

Local Authority algorithms

The Data Justice Lab has researched how public services are increasingly automated and how government institutions at different levels are using data systems and AI. However, our latest report, Automating Public Services: Learning from Cancelled Systems, looks at another current development: the cancellation of automated decision-making systems (ADS) that did not fulfil their goals, led to serious harm, or met significant opposition through community mobilization, investigative reporting, or legal action. The report provides the first comprehensive overview of systems being cancelled across western democracies.

New Research Report: Learning from Cancelled Systems

The Children of Covid: Where are they now? #CPC22

At Conservative Party Conference (“CPC22”) yesterday, the CSJ Think Tank hosted an event called, The Children of Lockdown: Where are they now?

When the speakers were finished, and other questions had been asked, I had the opportunity to raise the following three points.

They matter to me because I am concerned that bad policy-making for children will come from the misleading narrative based on bad data. The data used in the discussion is bad data for a number of reasons, based on our research over the last 4 years at defenddigitalme, and previously as part of the Counting Children coalition with particular regard to the Schools Bill.

The first is a false fact that has often been bandied about over the last year in the media and in Parliamentary debate, and that the Rt Hon Sir Iain Duncan Smith MP repeated in opening the panel discussion: that 100,000 children have not returned to school, “as a result of all of this”.

Full Fact has sought to correct this misrepresentation by individuals and institutions in the public domain several times, including one year ago today, when a Sunday Times article published on 3 October 2021 claimed new figures showed “that between 95,000 and 135,000 children did not return to school in the autumn term”, credited to the Commission on Young Lives, a task force headed up by Anne Longfield, the former Children’s Commissioner for England. Longfield had then told Full Fact that on 16 September 2021, “the rate of absence was around 1.5 percentage points higher than would normally be expected in the autumn term pre-pandemic”.

Full Fact wrote, “This analysis attempts to highlight an estimated level of ‘unexplained absence’, and comes with a number of caveats—for example it is just one day’s data, and it does not record or estimate persistent absence.”

There was no attempt made in the CPC22 discussion to disaggregate the “expected” absence rate from anything on top, and presenting the idea as fact, that 100,000 children have not returned to school, “as a result of all of this”, is misleading.

Suggesting this causation for 100,000 children is wrong for two reasons. The first is that it ignores the number of children within that figure who were out of school before the pandemic, and the reasons for that. The CSJ’s own report, published in 2021, said that, “In the autumn term of 2019, i.e pre-Covid 60,244 pupils were labeled as severely absent.”

Whether or not it is the same children who were out of school before and afterwards also matters in order to claim causation. This named, pupil-level absence data is already available for every school child at national level on a termly basis, alongside the other personal details collected termly in the school census, among other collections.

Full Fact went on to say, “The Telegraph reported in April 2021 that more than 20,000 children had “fallen off” school registers when the Autumn 2020 term began. The Association of Directors of Children’s Services projected that, as of October 2020, more than 75,000 children were being educated at home. However, as explained above, this is not the same as being persistently absent.”

The second point I made yesterday was that the definition of persistent absence has changed three times since 2010, so that children are classified as persistently absent more quickly now, at 10% of sessions missed, than when the threshold was 20% or more.

(It’s also worth noting that data are inconsistent over time in another way too. The 2019 Guide to Absence Statistics draws attention to the fact that, “Year on year comparisons of local authority data may be affected by schools converting to academies.”)

And third and finally, I pointed out a further problem we have found in counting children correctly. Local Authorities do this in different ways. Some count each actual child once in the year in their data; some count each time a child changes status (i.e. a move from mainstream into Alternative Provision and then to Elective Home Education could see the same child counted three times in total, once in each dataset across the same year); and some count full-time equivalent funded places (i.e. if five children each have one day a week outside mainstream education, they would be counted as only one single full-time child in total in the reported data).
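
To make the discrepancy concrete, here is a minimal sketch, with entirely invented records, of how the three counting conventions above turn the same five children into three different totals:

```python
# Hypothetical placement records: (child_id, setting, days per week
# outside mainstream education). child-1 changed status once in the year.
placements = [
    ("child-1", "Alternative Provision", 1),
    ("child-1", "Elective Home Education", 1),
    ("child-2", "Alternative Provision", 1),
    ("child-3", "Alternative Provision", 1),
    ("child-4", "Alternative Provision", 1),
    ("child-5", "Alternative Provision", 1),
]

# Convention A: each actual child counted once in the year.
children_once = len({child for child, _, _ in placements})

# Convention B: every change of status counted, so one child can
# appear several times across the same year.
status_counts = len(placements)

# Convention C: full-time equivalent funded places
# (five days a week = one full-time child).
fte_places = sum(days for _, _, days in placements) / 5

print(children_once, status_counts, fte_places)  # 5 6 1.2
```

Three conventions, three different totals from identical underlying facts; aggregating figures built on mixed conventions can only mislead.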

Put together, this all means not only that the counts are wrong, but that the very idea of “ghost children” who simply ‘disappear’ from school, without anything known about them anywhere at all, is a fictitious and misleading presentation.

All schools (including academies and independent schools) must notify their local authority when they are about to remove a pupil’s name from the school admission register under any of the fifteen grounds listed in Regulation 8(1) a-n of the Education (Pupil Registration) (England) Regulations 2006. On top of that, children are recorded as Children Missing Education, “CME” where the Local Authority decides a child is not in receipt of suitable education.

For those children, processing of personal data of children not-in-school by Local Authorities is already required under s436A of the Education Act 1996, the duty to make arrangements to identify children not receiving education.

Research done as part of the Counting Children coalition with regard to the Schools Bill has found that every Local Authority that has replied to date (a 67% response rate to FOI requests sent on July 5, 2022) upholds its statutory duty to record these children who either leave state education, or who are found to be otherwise missing education. Every Local Authority has a record of these children, by name, together with much more detailed data.** The GB News journalist on the panel said she had taken her children out of school and the Local Authority had not contacted her. But as a home-educating audience member then pointed out, that does not therefore mean the LA did not know about her decision, since it would already have her children’s details recorded. There is law in place already on what LAs must track. Whether or not, and how, the LA is doing its job was beyond this discussion, but the suggestion that more law is needed to make LAs collect the same data as is already required is superfluous.

This is not only a matter of context and nuance in the numbers and their debate; it substantially alters the understanding of the facts. It matters to get this right, so that bad policy does not get made based on bad data and a misunderstanding of conflated causes.

Despite this, in closing Iain Duncan Smith asked the attendees to go out from the meeting and evangelise about these issues. If they do so based on his selection of ‘facts’ they will spread misinformation.

At the event, I did not mention two further parts of this context that matter if policy makers and the public are to find solutions to what is, no doubt, an important series of problems; problems that must not be manipulated and presented as if they are entirely a result of the pandemic. And not only the pandemic, but lockdowns specifically.

Historically, the main driver for absence is illness. In 2020/21, this was 2.1% across the full year. This was a reduction on the rates seen before the pandemic (2.5% in 2018/19).

A pupil on-roll is identified as a persistent absentee if they miss 10% or more of their possible sessions (one school day has two sessions, morning and afternoon). 1.1% of pupil enrolments missed 50% or more of their possible sessions in 2020/21. Children with additional educational and health needs or disability have higher rates of absence. During Covid, the absence rate for pupils with an EHC plan was 13.1% across 2020/21.
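
As a minimal sketch of how the changing definition reclassifies the same attendance record (the pupil and session counts here are invented, and the intermediate 15% step is an assumption; the text above states only the move from 20% to 10%):

```python
# One hypothetical pupil: 380 possible sessions in a year
# (190 school days x 2 sessions), of which 45 were missed.
possible_sessions = 380
missed_sessions = 45
absence_rate = missed_sessions / possible_sessions  # ~11.8%

# The same record against a tightening threshold: not persistently
# absent at 20%, but persistently absent under the current 10% rule.
for threshold in (0.20, 0.15, 0.10):
    print(f"{threshold:.0%} threshold: {absence_rate >= threshold}")
```

The same child, with the same attendance, moves in and out of the headline “persistently absent” category purely as the definition changes, which is why year-on-year comparisons need such care.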

“Authorised other reasons” absence has risen to 0.9% from 0.3%, “reflecting that vulnerable children were prioritised to continue attending school but where parents did not want their child to attend, schools were expected to authorise the absence.” (DfE data, academic year 2020/21)

While the panel made several references to the impact of the pandemic on children’s poor mental health, no one mentioned the 70% cut to youth services’ funding over ten years, which allowed CAMHS funding and service provision to wither and fail children well before 2020. The pandemic has exacerbated children’s pre-existing needs, for which the government has since not only failed to provide, but actively reduced provision.

It was further frustrating to hear, as someone with Swedish relatives, Sweden’s pandemic approach presented as comparable with the UK’s and, in effect, as having been managed ‘better’. It seems absurd to me to compare the UK uncritically with a country with the population density of Sweden. But if we *are* going to do comparisons with other countries, they should be made with a fuller understanding of context, all of their data, and the caveats needed for comparison to be meaningful.

I was somewhat surprised that Iain Duncan Smith also failed to acknowledge, even once, that thousands of people in the UK have died, and continue to die or to suffer lasting effects, as a result of and with COVID-19. According to the King’s Fund report, “Overall, the number of people who have died from Covid-19 to end-July 2022 is 180,000, about 1 in 8 of all deaths in England and Wales during the pandemic.” Furthermore, in England and Wales, “The pandemic has resulted in about 139,000 excess deaths”. “Among comparator high-income countries (other than the US), only Spain and Italy had higher rates of excess mortality in the pandemic to mid-2021 than the UK.” I believe that if we are going to compare ‘lockdown success’ at all, we should look at the wider comparable data before doing so. He might also have chosen to mention, alongside this, the UK success story of research and discovery, and the NHS vaccination programme.

And no mention at all was made of the further context that, while much was made of the economic harm of the impact of the pandemic on children, “The Children of Lockdown” are also “The Children of Brexit”. It is non-partisan to point out this fact and, I would suggest, disingenuous to leave it out entirely of any discussion of the reasons for, or impact of, the economic downturn in the UK in the last three years. In fact, the FT recently called it a “deafening silence.”

At defenddigitalme, we raised the problem of this inaccurate “counting” narrative numerous times including with MPs, members of the House of Lords in the Schools Bill debate as part of the Counting Children coalition, and in a letter to The Telegraph in March this year. More detail is here, in a blog from April.


Update May 23, 2023

Today I received the DfE-held figures of the number of children who leave an educational setting for an unknown onward destination: a section of the Common Transfer Files holding space, in effect a digital limbo after leaving an educational setting until the child is ‘claimed’ by the destination. It is known as the Lost Pupils Database.

Furthermore, the DfE has published exploratory statistics on EHE and ad hoc stats on CME too.

October 2022. More background:

The panel was chaired by the Rt Hon Sir Iain Duncan Smith MP and other speakers included Fraser Nelson, Editor of The Spectator Magazine; Kieron Boyle, Chief Executive Officer of Guy’s & St Thomas Foundation; the Rt Hon Robert Halfon MP, Education Select Committee Chair; and Mercy Muroki, Journalist at GB News.

We have previously offered to share our original research data with the Department for Education and to discuss it, and repeated this offer to the panel to help correct the false facts. I hope they will take it up.

** Data collected in the record by Local Authorities when children are deregistered from state education (including to move to private school) may include a wide range of personal details, including as an example in Harrow: Family Name, Forename, Middle name, DOB, Unique Pupil Number (“UPN”), Former UPN, Unique Learner Number, Home Address (multi-field), Chosen surname, Chosen given name, NCY (year group), Gender, Ethnicity, Ethnicity source, Home Language, First Language, EAL (English as an additional language), Religion, Medical flag, Connexions Assent, School name, School start date, School end date, Enrol Status, Ground for Removal, Reason for leaving, Destination school, Exclusion reason, Exclusion start date, Exclusion end date, SEN Stage, SEN Needs, SEN History, Mode of travel, FSM History, Attendance, Student Service Family, Carer details, Carer address details, Carer contact details, Hearing Impairment and Visual Impairment, Education Psychology support, and Looked After status.

On #IWD2022 gender bias in #edTech

I’m a mother of three girls at secondary school. For international women’s day 2022 I’ve been thinking about the role of school technology in my life.

Could some of it be improved, to stop baking gender-discrimination norms into home-school relationships?

Families come in all shapes and sizes, and not every family has defined Mum and Dad roles. I wonder if edTech could be better at supporting families if it offered the choice of a multi-parent-per-child relationship by default?

School-home communications rarely come home in school bags anymore; they arrive digitally, and are routinely sent to one parent per child. If something needs actioning, it typically goes to one parent, not both. The design of digital tools can lock in the responsibility for action to a single nominated person. Schools send the edTech company the ‘pupil parent contact’ email but, at least in my experience, never ask what it should be after it has been collected once. (And they don’t do a good job of communicating data rights each time before doing so either, but that’s another story.)

Whether it’s about learning updates with report cards about the child, or weekly newsletters, changes of school clubs, closures, events or other ‘things you should know’ I filter emails I get daily from a number of different email accounts for relevance, and forward them on to Dad.

To administer cashless payments to school for contributions to art, cooking, science and technology lessons, school trips, other extras or to manage my child’s lunch money, there is a single email log-in and password for a parent role allocated to the child’s account.

And it might be just my own unrepresentative circle of friends, but it’s usually Mum who’s on the receiving end of demands at all hours.

In case of illness, work commitments, or otherwise being unable to carry on as usual, it is no longer easy for a second designated parent role to automatically pick up or share the responsibilities.

One common cashless payment system’s approach does permit more than one parent role, but it’s manual and awkward to set up. “For a second parent to have access it is necessary for the school to send a second letter with a second temporary username and password combo to activate a second account. In short, the only way to do this is to ask your school.”

Some messaging services allow a school-to-multiple-parent email, but the message itself often forms an individual, not a group, thread with the teacher, i.e. designed for a class, not a family.

Some might suggest it is easy enough to set up automatic email forwarding, but again this pushes the onus back onto the parent, and does not solve the problem of only one person being able to perform transactions.

I wonder what difference it would make to overall parental engagement if one-way communications tools offered a second email address by default?

What if, for financial management, edTech permitted an option to have a ‘temporary re-route’ to another email address, or a default second role with a notification to the other that something had been paid?

Why can’t one parent, once confirmed with secure access to the child-parent account, add a second parent role? This need not be a parent, but could be another relation managing the outgoing money. You can only make outgoing payments to the school, or withdraw money to the same single bank account it came from, so fraud isn’t likely.
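
As a sketch of that alternative: a child-centred account model in which any confirmed guardian can add another, and notifications go to every opted-in contact by default. This is a minimal illustration with hypothetical names, not any vendor’s actual API:

```python
from dataclasses import dataclass, field

@dataclass
class Guardian:
    name: str
    email: str
    can_pay: bool = False   # may make outgoing payments to the school
    notify: bool = True     # receives school-home communications

@dataclass
class ChildAccount:
    child_name: str
    guardians: list[Guardian] = field(default_factory=list)

    def add_guardian(self, added_by: Guardian, new: Guardian) -> None:
        # Any existing guardian can add another; no second letter with a
        # temporary username/password combo from the school is needed.
        if added_by not in self.guardians:
            raise PermissionError("only an existing guardian may add another")
        self.guardians.append(new)

    def notification_list(self) -> list[str]:
        # Everyone opted in gets the message, not one-parent-per-child.
        return [g.email for g in self.guardians if g.notify]

mum = Guardian("Parent A", "parent.a@example.com", can_pay=True)
account = ChildAccount("Child One", [mum])
account.add_guardian(mum, Guardian("Parent B", "parent.b@example.com", can_pay=True))
print(account.notification_list())  # both addresses, by default
```

Because outgoing money can only go to the school, or back to the bank account it came from, widening who holds the role adds little fraud risk.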

I wonder what research into each of these tools would look like, assessing whether there is a gender divide built into the default admin?

What could it improve in work-life balance for staff and families, if emails were restricted to send or receive in preferred time windows?

Technology can be amazing and genuinely make life easier for some. But not everyone fits the default, and I believe the defaults are rarely built to best suit users, but rather the institutions that procure them. In many cases edTech isn’t working well for the parents who make up its main user base.

If I were designing these, they would be school-based rather than third-party cloud-based, distributed systems, centred on the child. I think we can do better, not only for women, but for everyone.


PS When my children come home from school today, I’ll be showing them the Gender Pay Gap Bot @PayGapApp thread, with its explanations of mode, mean and median. It’s worth a look.

Man or machine: who shapes my child? #WorldChildrensDay 2021

A reflection for World Children’s Day 2021. In ten years’ time my three children will be in their twenties. What will they and the world around them have become? What will shape them in the years in between?


Today when people talk about AI, we hear fears of consciousness in AI. We see I, Robot. The reality of any AI that will touch their lives in the next ten years is very different. The definition may be contested, but artificial intelligence in schools already involves automated decision-making at speed and scale, without compassion or conscience, but with outcomes that affect children’s lives for a long time.

The guidance of today (in policy documents, well-intentioned toolkits and guidelines and, oh yes, yet another ‘ethics’ framework) is all fairly same-y in terms of the issues identified.

Bias in training data. Discrimination in outcomes. Inequitable access or treatment. Lack of understandability or transparency of decision-making. Lack of routes for redress. More rarely, thoughts on exclusion, disability and accessible design, and the digital divide. In seeking to fill that divide, the call can conclude with a cry to ensure ‘AI for all’.

Most of these issues fail to address the key questions in my mind, with regards to AI in education.

Who gets to shape a child’s life and the environment they grow up in? The special case of children is often used for special pleading in government tech issues. Despite this, in policy discussion and documents, government fails over and over again to address children as human beings.

Children are still developing: physically, emotionally, in their sense of fairness and justice, of humour, of politics, and of who they are.

AI is shaping children in ways that schools and parents cannot see. And the issues go beyond limited agency and autonomy; beyond UNCRC Articles 8 and 18, the role of the parent and the lost boundaries between school and home; and beyond Articles 23 and 29. (See the detail at the end.)

Published concerns about accessibility and AI are often about the individual and inclusion, in terms of design that enables participation. But once children can participate, where is the independent measurement and evaluation of the impact on their educational progress, or on their physical and mental development? What is the effect of these tools?

From the overhyped, like Edgenuity, to the oversold, like ClassCharts (which didn’t actually have any AI in it, but still won Bett Show Awards), frameworks often mention, but still have no meaningful solutions for, the products that don’t work and fail.

But what about the harms from products that work as intended? Products that fail human dignity or create a chilling effect, like exam proctoring tech. Safety tech that infers things and causes staff to intervene even if the child was only chatting about ‘a terraced house.’ Punitive systems that keep profiles of behaviour points long after a teacher would have let them go. What about tools shaping the developing child’s emotions and state of mind by design, which claim to operate within data protection law? Tools that measure and track mental health, or make predictions for interventions by school staff?

Brain headbands that transfer neurosignals do not process biometric data in data protection terms, if the data is not used to, or able to, uniquely identify a child.

“Wellbeing” apps are not regulated as medical devices, and yet they are designed to profile and influence mental health and mood, and schools adopt them at scale.

If AI is being used to deliver a child’s education, but only in the English language, what risk does this tech-colonialism create of evangelising children in non-native English-speaking families through AI: not only in access to teaching, but in reshaping culture and identity?

At the institutional level, concerns are only addressed after the fact. But how should these products be assessed as part of procurement, when many AI tools are marketed as never stopping “learning about your child”? Tech needs full life-cycle oversight, but what companies claim their products do is often only assessed to pass accreditation at a single point in time.

But the biggest gap in governance is not going to be fixed by audits or accreditation of algorithmic fairness. It is the failure to recognise the redistribution not only of agency but of authority: from individuals to companies (the teacher doesn’t decide what you do next, the computer does); from public interest institutions to companies (company X determines the curriculum content, not the school); and from State to companies (accountability for outcomes has fallen through the gap in outsourcing activity to the AI company). We are automating authority and, with it, shirking responsibility and the liability for the machine’s flaws, accepting it as the only way thanks to our automation bias. Accountability must be human, but whose?

Around the world the rush to regulate AI, or related tech in Online Harms, or Digital Services, or Biometrics law, is going to embed, not redistribute power, through regulatory capitalism.

We have regulatory capture, including on the government boards and bodies that shape the agenda; unrealistic expectations of competition shaping the market; and we’re ignoring the transnational colonisation of whole schools, or even regions and countries, shaping the delivery of education at scale.

We’re not regulating the questions: who does the AI serve, and how do we deal with conflicts of interest between the child’s rights, the family, school staff, the institution or State, and the company’s wants? Where do we draw the line between public interest and private interests, and who decides what are the best interests of each child?

We’re not managing the implications of the datafied child being mined and analysed in order to train companies’ AI. Is it ethical or desirable to use children’s behaviour as a source of business intelligence: free labour performed in school systems for companies to profit from, without any choice (see UNCRC Article 32)?

As parents, we’re barely aware whether a company will decide how a child is tested in a certain way, asked certain questions about their mental health, or given nudges to ‘improve’ their performance or mood. It’s not a question of ‘is it in the best interests of a child’, but rather: who designs it, and can schools assess its compatibility with a child’s fundamental rights and freedoms to develop free from interference?

It’s not about protection of ‘the data’ although data protection should be about the protection of the person, not only enabling data flows for business.

It’s about protection from strangers engineering a child’s development in closed systems.

It is about child protection from an unknown and unlimited number of persons interfering with who they will become.

Today’s laws and debate are too often about regulating someone else’s opinion; how it should be done, not if it should be done at all.

It is rare we read any challenge of the ‘inevitability’ of AI [in education] narrative.

Who do I ask my top two questions on AI in education?
(a) Who gets and grants permission to shape my developing child, and
(b) what happens to the duty of care in loco parentis as schools outsource authority to an algorithm?


UNCRC

Article 8

1. States Parties undertake to respect the right of the child to preserve his or her identity, including nationality, name and family relations as recognised by law without unlawful interference.

Article 18

1. States Parties shall use their best efforts to ensure recognition of the principle that both parents have common responsibilities for the upbringing and development of the child. Parents or, as the case may be, legal guardians, have the primary responsibility for the upbringing and development of the child. The best interests of the child will be their basic concern.

Article 29

1. States Parties agree that the education of the child shall be directed to:

(a) The development of the child’s personality, talents and mental and physical abilities to their fullest potential;

(c) The development of respect for the child’s parents, his or her own cultural identity, language and values, for the national values of the country in which the child is living, the country from which he or she may originate, and for civilizations different from his or her own;

Article 30

In those States in which ethnic, religious or linguistic minorities or persons of indigenous origin exist, a child belonging to such a minority or who is indigenous shall not be denied the right, in community with other members of his or her group, to enjoy his or her own culture

 

Data-Driven Responses to COVID-19: Lessons Learned OMDDAC event

A slightly longer version of a talk I gave at the launch event of the OMDDAC Data-Driven Responses to COVID-19: Lessons Learned report on October 13, 2021. I was asked to respond to the findings presented on Young People, Covid-19 and Data-Driven Decision-Making by Dr Claire Bessant at Northumbria Law School.

[ ] indicates text I omitted for reasons of time, on the day.

Their final report is now available to download from the website.

You can also watch the full event here via YouTube. The part on young people, presented by Claire and which I then follow, is at the start.

—————————————————–

I’m really pleased to congratulate Claire and her colleagues today at OMDDAC and hope that policy makers will recognise the value of this work and it will influence change.

I will reiterate three things they found or included in their work.

  1. Young people want to be heard.
  2. Young people’s views on data and trust include concerns about conflated data purposes.
  3. The concept of being “data driven under COVID conditions”.

This OMDDAC work, together with Investing in Children, is very timely as a rapid response, but I think it is also important to set it in context, and to recognise that some of its significance is that it reflects a continuum of similar findings over time, largely unaffected by the pandemic.

Claire’s work comprehensively backs up the consistent findings of over ten years of public engagement, including with young people.

The 2010 study with young people conducted by The Royal Academy of Engineering, supported by three Research Councils and Wellcome, discussed attitudes towards the use of medical records, and concluded that these questions and concerns must be addressed by policy makers, regulators, developers and engineers before progressing with the design and implementation of record-keeping systems and the linking of any databases.

In 2014, the House of Commons Science and Technology Committee, in its report Responsible Use of Data, said the Government has a clear responsibility to explain to the public how personal data is being used.

The same Committee’s Big Data Dilemma 2015-16 report (p9) concluded that “data (some collected many years before and no longer with a clear consent trail) […] is unsatisfactory left unaddressed by Government and without a clear public-policy position”.

Or see the 2014 Royal Statistical Society and Ipsos MORI work on the data trust deficit, with lessons for policymakers; DotEveryone’s 2019 work on Public Attitudes; or the ICO’s 2020 Annual Track survey results.

There is also a growing body of literature demonstrating the implications of being a ‘data-driven’ society for the datafied child, as described by Deborah Lupton and Ben Williamson in their own research in 2017.

[This year our own work with young people, published in our report on data metaphors “the words we use in data policy”, found that young people want institutions to stop treating data about them as a commodity and start respecting data as extracts from the stories of their lives.]

The UK government and policy makers are simply ignoring the inconvenient truth that the legislation and governance frameworks that exist today, such as the UN General Comment No 25 on children in relation to the digital environment, demand that people know what is done with data about them, and must be applied to address children’s right to be heard and to enable them to exercise their data rights.

The public perceptions study within this new OMDDAC work, shows that it’s not only the views of children and young people that are being ignored, but adults too.

And perhaps it is worth reflecting here that people often don’t think about all this in terms of data rights and data protection, but rather in terms of human rights, and protections for the human being from the use of data that gives other people power over our lives.

This project found young people’s trust in the use of their confidential personal data was affected by understanding who would use the data and why, and how people will be protected from prejudice and discrimination.

We could build easy-reporting mechanisms at public points of contact with state institutions (in education, in social care, in welfare and policing) to produce, on demand, reports of the information held about me, and to enable corrections. It would benefit institutions by giving them more accurate data, and make them more trustworthy, if people could see: here’s what you hold on me, and here’s what you did with it.

Instead, we’re going in the opposite direction. New government proposals suggest making that process harder, by charging for Subject Access Requests.

This research shows that current policy is not what young people want. People want the ability to choose between granular levels of control in the data that is being shared. They value having autonomy and control, knowing who will have access, maintaining records accuracy, how people will be kept informed of changes, who will maintain and regulate the database, data security, anonymisation, and to have their views listened to.

Young people also fear the power of data to speak for them: that the data about them is taken at face value, and listened to by those in authority more than the child in their own voice.

What do these findings mean for public policy? Without respect for what people want, and for the fundamental human rights and freedoms of all, there is no social licence for data policies.

Whether it’s confidential GP records or the school census expansion in 2016, when public trust collapses so does your data collection.

Yet the government stubbornly refuses to learn and seems to believe it’s all a communications issue, a bit like the ‘Yes Minister’ English approach to foreigners when they don’t understand: just shout louder.

No, this research shows data policy failures are not fixed by, “communicate the benefits”.

Nor is it fixed by changing Data Protection law. As a comment in the report says, UK data protection law offers a “how-to” not a “don’t-do”.

Data protection law is designed to be enabling of data flows. But that can mean that where state data processing rightly avoids relying on consent as its lawful basis, in data protection terms, the data use is nonetheless not consensual.

[For the sake of time, I didn’t include this thought, in the next two paragraphs, in the talk, but I think it is important to mention that in our own work we find that this contradiction is not lost on young people. Against the backdrop of the efforts after the MeToo movement, and all that was said by Ministers in Education and at the DCMS about the Everyone’s Invited work earlier this year to champion consent in the relationships, sex and health education (RSHE) curriculum, adults in authority keep saying consent matters, but don’t demonstrate it; when it comes to data, they use people’s data in ways people do not want.

The report picks up that young people, and disproportionately those communities that experience harm from authorities, mistrust data sharing with the police. This is now set against the backdrop of not only the recent, Wayne Couzens case, but a series of very public misuses of police power, including COVID powers.]

The data powers used “under COVID conditions” are now being used as cover for an attack on data protections in the future. The DCMS consultation on changing UK data protection law, open until November 19th, suggests that the similarly reduced protections on data distribution in the emergency should become the norm. While DP law is written expressly to permit things that are out of the ordinary in extraordinary circumstances, those permissions are limited in time. The government is proposing that some things that were found convenient to do under COVID now become commonplace.

The proposals include things such as removing Article 22 from the UK GDPR, with its protections for people in processes involving automated decision-making.

Young people were those who felt first-hand the risks and harms of those processes in the summer of 2020, and the “mutant algorithm” is something this Observatory Report work also addressed in its research. Again, it found young people felt left out of those decisions about them, despite being the group that would feel the negative effects.

[Data protection law may be enabling increased lawful data distribution across the public sector, but it is not offering people, including young people, the protections they expect of their human right to privacy. We are on a dangerous trajectory for public interest research and for society, if the “new direction” this government goes in, for data and digital policy and practice, goes against prevailing public attitudes and undermines fundamental human rights and freedoms.]

The risks and benefits of the power obtained from the use of admin data are felt disproportionately across different communities including children, who are not a one size fits all, homogenous group.

[While views across groups will differ — and we must be careful to understand any popular context at any point in time on a single issue and unconscious bias in and between groups — policy must recognise where there are consistent findings across this research with that which has gone before it. There are red lines about data re-uses, especially on conflated purposes using the same data once collected by different people, like commercial re-use or sharing (health) data with police.]

The golden thread that runs through time and across different sectors’ data use, are the legal frameworks underpinned by democratic mandates, that uphold our human rights.

I hope the powers-that-be in the DCMS consultation, and wider policy makers in data and digital policy, take this work seriously and not only listen, but act on its recommendations.


2024 updates: opening paragraph edited to add current links.
A chapter written by Rachel Allsopp and Claire Bessant discussing OMDDAC’s research with children will also be published on 21st May 2024 in Governance, democracy and ethics in crisis-decision-making: The pandemic and beyond (Manchester University Press) as part of its Pandemic and Beyond series https://manchesteruniversitypress.co.uk/9781526180049/ and an article discussing the research is available in the open access European Journal of Law and Technology here https://www.ejlt.org/index.php/ejlt/article/view/872.

Facebook View and Ray-Ban glasses: here’s looking at your kid

Ray-Ban (EssilorLuxottica) is selling glasses with ‘Facebook View’. Questions have already been asked about whether they can be lawful in Europe, including in the UK, in particular as regards enabling the processing of children’s personal data without consent.

The Italian data authority has asked the company to explain via the Irish regulator:

  • the legal basis on which Facebook processes personal data;
  • the measures in place to protect people recorded by the glasses, children in particular;
  • questions of anonymisation of the data collected; and
  • the voice assistant connected to the microphone in the glasses.

While the first questions in Europe may be bound up with data protection law and privacy, there are also questions of why Facebook has gone ahead despite Google Glass, which was removed from the market in 2013. You can see a pair displayed in a surveillance exhibit at the Victoria and Albert Museum (September 2021).

“We can’t wait to see the world from your perspective”, says Ray-Ban Chief Wearables Officer Rocco Basilico in the promotional video together with Mark Zuckerberg. I bet. But not as much as Facebook.

With cameras and microphones built in, up to around 30 videos or 500 photos can be stored on the glasses and shared with the Facebook companion app. While the teensy light on one corner is supposed to indicate that recording is in progress, the glasses look much like any others, indistinguishable within the Ray-Ban range. You can even buy them as prescription glasses, which intrigues me as to how that recording looks on playback, or shared via the companion apps.

While the Data Policy doesn’t explicitly mention Facebook View in the wording on how it uses data to “personalise and improve our Products,” and the privacy policy is vague on Facebook View, it seems pretty clear that Facebook will use the video capture to enhance its product development in augmented reality.

“We believe this is an important step on the road to developing the ultimate augmented reality glasses,” says Mark Zuckerberg. (05:46)

The company needs a lawful basis to be able to process the data it receives for those purposes. It determines those purposes, and is therefore a data controller for that processing.

In the supplemental policy the company says that “Facebook View is intended solely for users who are 13 or older.” Data protection law does not care about the age of the product user, but it does regulate on what basis a child’s data may be processed, and the child may be the user setting up an account. It is also concerned with the data of the children who are recorded. By recognising the legal limitations on who can be an account owner, the company has a bit of a self-own here on what the law says about children’s data.

Personal privacy may have weak protection in data protection laws that offer the wearer exemptions for domestic** or journalistic purposes, but neither the user nor the company can avoid the fact that processing video and audio recordings may happen (a) without adequately informing the people whose data is processed, or (b) without appropriate purpose limitation for any processing that Facebook the company performs, across all of its front-end apps and platforms or back-end processes.

I’ve asked Facebook how I would, as a parent or child, be able to get a wearer to destroy a child’s images and video or voice recorded in a public space, to which I did not consent. How would I get to see that content once held by Facebook, or request its processing be restricted by the company, or user, or the data destroyed?

Testing the Facebook ‘contact our DPO’ process as if I were a regular user fails. It has sent me round the houses via automated forms.

Facebook is clearly wrong here on privacy grounds, but if you can afford the best in the world on privacy law, why would you go ahead anyway? Might they believe, after nearly twenty years of privacy-invasive practice and a booming bottom line, that there is no risk to reputation, no risk to their business model, and no real risk to the company from regulation?

It’s an interesting partnership, since Ray-Ban has no history in understanding privacy, while Facebook has a well-known, controversial one. Reputational risk shared will not be reputational risk halved. And EssilorLuxottica has a share price to consider. I wonder if they carried out any due diligence risk assessment for their investors?

If and when enforcement catches up and the product is withdrawn, regulators must act as the FTC did on the development of a product (in that case, algorithms) from “ill gotten data” (In the Matter of Everalbum and Paravision, Commission File No. 1923172).

Destroy the data, destroy the knowledge gained, and remove it from any product development to date. All “Affected Work Product.”

Otherwise, any penalty Facebook gets from this debacle will be just the cost of doing business, having bought itself a very nice training dataset for its AR product development.

Ray-Ban, of course, will take all the reputational hit if found enabling strangers to take covert video of our kids. No one expects any better from Facebook. After all, we all know, Facebook takes your privacy, seriously.


Reference:  Rynes: On why your ring video doorbell may make you a controller under GDPR.

https://medium.com/golden-data/rynes-e78f09e34c52 (Golden Data, 2019)

Judgment of the Court (Fourth Chamber), 11 December 2014 František Ryneš v Úřad pro ochranu osobních údajů Case C‑212/13. Case file


Data Protection law is being set up as a patsy.

After Dominic Cummings’ marathon session at the Select Committee, the Times published an article on “The heroes and villains of the pandemic, according to Dominic Cummings”.

One of Dom’s villains, left out of that list, was data protection law. He claimed, “if someone somewhere in the system didn’t say, ‘ignore GDPR’ thousands of people were going to die,” and that “no one even knew if that itself was legal—it almost definitely wasn’t.”

Thousands of people have died since that event he recalled from March 2020, but as a result of Ministers’ decisions, not data laws.

Data protection laws are *not* barriers, but permissive laws that *enable* the use of personal data within a set of standards and safeguards designed to protect people. The opposite of what their detractors would have us believe.

The starting point is fundamental human rights. Common law confidentiality. But the GDPR and its related parts on public health are in fact specifically designed to enable data processing that overrules those principles for pandemic response purposes. In recognition of emergency needs, for a limited time period, data protection laws permit interference with our fundamental rights and freedoms, including overriding privacy.

We need that protection of our privacy sometimes from government itself. And sometimes from those who see themselves as “the good guys” and above the law.

The Department of Health appears to have no plan to tell people about care.data 2, the latest attempt at an NHS data grab, despite the fact that data protection laws require that they do. From September 1st (delayed to enable it to be done right, thanks to campaign efforts from medConfidential and supporters), all our GP medical records will be copied into a new national database for re-use, unless we actively opt out.

It’s Groundhog Day for the Department of Health. It is baffling why the government cannot understand or accept the need to do the right thing, and instead is repeating the same mistake of recent memory all over again. Why the rush, without due process, steamrollering any respect for the rule of law?

Were it not so serious, it might amuse me that some academic researchers appear to fail to acknowledge that this matters, and are getting irate on Twitter that *privacy* or ‘campaigners’ will prevent them getting hold of the data they appear to feel entitled to. Blame the people who designed a policy that will breach human rights and the law, not the people who want your rights upheld. And to blame the right itself is just, frankly, bizarre.

Such rants prompt me to recall the time, early on in my lay role on the Administrative Data Research Network approvals panel, when a Director attending the meeting *as a guest* became so apoplectic with rage that his face was nearly purple. He screamed, literally, at a panel of over ten well-respected academics and experts in research and/or data, because he believed the questions being asked over privacy and ethics principles in designing governance documents were unnecessary.

Or I might recall the request at my final meeting two years later, in 2017, by another then-Director, for access to highly sensitive, linked children’s health and education data to do (what I believed was valuable) public interest research involving the personal data of children with Down Syndrome. But the request came through the process with no ethical review: a necessary step before it should even have reached the panel for discussion.

I was left feeling from those two experiences that both considered themselves and their work to be, in effect, “above the law”, and expected special treatment and a free pass without challenge. And that things had not improved over the two years.

If anyone in the research community cannot support due process, law, and human rights when it comes to admin data access (research using highly sensitive data about people’s lives, with potential for significant community and personal impacts), then they are part of the problem. There was extensive public outreach in 2012-13 across the UK about the use of personal, if de-identified, data in safe settings. And in 2014 the same concerns and red lines were raised by hundreds of people in person, almost universally with the same reactions, at a range of care.data public engagement events. Feedback which institutions say matters, but continue to ignore.

It seems nothing has changed since I wrote,

“The commercial intermediaries still need to be told, don’t pee in the pool. It spoils it, for everyone else.”

We could also look back to when Michael Gove, as Secretary of State for Education, changed the law in 2012 to permit pupil-level, identifying and sensitive personal data to be given away to third parties: journalists, charities, commercial companies, even, pre-pandemic, an online tutoring business, and an agency making heat maps of school catchment areas for estate agents from identifying pupil data (notably, without any SEND pupils’ data). (Cummings was coincidentally a Gove SpAd at the Department for Education.) As a direct result of that decision to give away pupils’ personal data in 2012 (in effect ‘re-engineering’ how the education sector was structured and the roles of the local authority and non-state providers, and creating a market for pupil data), an ICO audit of the DfE in February 2020 found unlawful practice and made 139 recommendations for change. We are still waiting to see if and how it will be fixed. At the moment it is business as usual. Literally. The ICO does not appear even to have stopped further data distribution until it is made lawful.

In April 2021, in answer to a written Parliamentary Question, Nick Gibb, Schools Minister, made a commitment to “publish an update to the audit in June 2021 and further details regarding the release mechanism of the full audit report will be contained in this update.” Will they promote openness, transparency and accountability, or continue to skulk from publishing the whole truth?

Children have lost control of their digital footprint in state education by their fifth birthday. The majority of parents polled in 2018 did not know the National Pupil Database even exists. 69% of over 1,004 parents asked replied that they had not been informed that the Department for Education may give away children’s data to third parties at all.

Thousands of companies continue to exploit children’s school records, without opt-in or opt-out, including special educational needs, ethnicity, and other sensitive data at pupil level.

Data protection law alone is in fact so enabling of data flows that it is inadequate to protect children’s rights and freedoms across the state education sector in England, whether from public interest, charity or commercial research interventions, without opt-in or opt-out, and without parental knowledge. We shouldn’t need to understand our rights, or to be proactive, in order to have them protected by default; but data protection law, and the ICO in particular, have been captured by the siren call of data as a source of ‘innovation’ and economic growth.

Throughout 2018 and the questions over Vote Leave’s data use, Cummings claimed to know GDPR well. It was everyone else who didn’t. On his blog that July he suggested, “MPs haven’t even bothered to understand GDPR, which they mis-explain badly,” and in April he wrote, “The GDPR legislation is horrific. One of the many advantages of Brexit is we will soon be able to bin such idiotic laws.” He lambasted the Charter of Fundamental Rights, the protections of which the government went on to take away from us under the European Union (Withdrawal) Act.

But suddenly, come 2020/21, he is suggesting he didn’t know the law that well after all: “no one even knew if that itself was legal—it almost definitely wasn’t.”

Data Protection law is being set up as a patsy, while our confidentiality is commodified. The problem is not the law. The problem is those in power who fail to respect it, those who believe themselves to be above it, and who feel an entitlement to exploit that for their own aims.


Added 21/06/2021: Today I again came across a statement that I thought worth mentioning, from the Explanatory Notes for the Data Protection Bill from 2017:

Accordingly, Parliament passed the Data Protection Act 1984 and ratified the Convention in 1985, partly to ensure the free movement of data. The Data Protection Act 1984 contained principles which were taken almost directly from Convention 108 – including that personal data shall be obtained and processed fairly and lawfully and held only for specified purposes.”

The Data Protection Directive (95/46/EC) (“the 1995 Directive”) provides the current basis for the UK’s data protection regime. The 1995 Directive stemmed from the European Commission’s concern that a number of Member States had not introduced national law related to Convention 108 which led to concern that barriers may be erected to data flows. In addition, there was a considerable divergence in the data protection laws between Member States. The focus of the 1995 Directive was to protect the right to privacy with respect to the processing of personal data and to ensure the free flow of personal data between Member States. “

The Rise of Safety Tech

At the CRISP-hosted Rise of Safety Tech event this week, the moderator asked an important question: what is Safety Tech? Very honestly, Graham Francis of the DCMS answered, among other things, “It’s an answer we are still finding a question to.”

From ISP level to individual users, and from the limitations of mobile phone battery power to app size compatibility, a variety of aspects across a range of technology were discussed. There is a wide range of technology within this conflated set of products packaged under the same umbrella term. Each can be very different from the others, even within one set of similar applications, such as school Safety Tech.

It worries me greatly that, in parallel with the run-up to the Online Harms legislation, their promotion appears to have assumed the character of a done deal. Some of these tools are toxic to children’s rights because of the policy that underpins them. Legislation should not be gearing up to make the unlawful lawful, but to fix what is broken.

The current drive is towards the normalisation of the adoption of such products in the UK, and to make them routine. It contrasts with the direction of travel of critical discussion outside the UK.

Some Safety Tech companies have human staff reading flagged content and making decisions on it, while others claim to use only AI. Both might be subject to any future EU AI Regulation for example.

In the U.S. they also come under more critical scrutiny. “None of these things are actually built to increase student safety, they’re theater,” Lindsay Oliver, project manager for the Electronic Frontier Foundation, was quoted as saying in an article just this week.

Here in the U.K. their regulatory oversight is not only startlingly absent, but the government is becoming deeply invested in cultivating the sector’s growth.

The big questions include who watches the watchers, with what scrutiny and safeguards? Is it safe, lawful, ethical, and does it work?

Safety Tech isn’t only an answer we are still finding a question to. It is a world view, with a particular value set: perhaps the only lens through which its advocates believe the world wide web should be seen, not only by children, but by anyone. And it is one that the DCMS is determined to promote, with “the UK as a world-leader” in a worldwide export market.

As an example, one of the companies the DCMS champions in its May 2020 report, ‘Safer technology, safer users’, claims to export globally already. eSafe Global now provides a service to about 1 million students and schools throughout the UK, UAE, Singapore and Malaysia, and has been used in schools in Australia since 2011.

But does the Department understand what it is promoting? The DCMS Minister responsible, Oliver Dowden, said in Parliament on December 15th 2020: “Clearly, if it was up to individuals within those companies to identify content on private channels, that would not be acceptable—that would be a clear breach of privacy.”

He’s right. It is. And yet he and his Department are promoting it.

So how will this play out, if at all, in the Online Harms legislation expected soon, which he owns together with the Home Office? Sadly, the needed level of understanding, in the Minister, in the third sector, and in much of the policy debate in the media, is not only missing but actively suppressed by the moral panic whipped up in emotive personal stories around a Duty of Care and social media platforms. Discussion is siloed into identifying CSAM, or grooming, or bullying, or self-harm, and actively ignores the joined-up, wider context within which Safety Tech operates.

That context is the world of the Home Office. Of anti-terrorism efforts. Of mass surveillance and efforts to undermine encryption that are nearly as old as the Internet. The efforts to combat CSAM or child grooming online operate in the same space. WePROTECT, for example, sits squarely amid it all, established in 2014 by the UK Government and the then UK Prime Minister, David Cameron. Scrutiny of UK breaches of human rights law is well documented in ECHR rulings. Other state members of the alliance, including the UAE, stand accused of buying spyware to breach activists’ encrypted communications. It is disingenuous for any school Safety Tech actors to talk only of child protection without mention of this context. School Safety Tech products, while all different, operate by tagging digital activity with categories of risk, and these tags can include terrorism and extremism.

Once upon a time, school filtering and blocking services meant only denying access to online content that had no place in the classroom. Now it can mean monitoring all the digital activity of individuals, online and offline, on school or personal devices, working around encryption, whenever they are connected to the school network. And it’s not all about in-school activity: no matter where a child’s account is connected to the school network, or who is actually using it, its activity might be monitored 24/7, 365 days a year. Activity that matches any of the thousands of words or phrases on watchlists and in keyword libraries gets logged, profiles the individual with ‘vulnerable’ behaviour tags, and sometimes creates alerts. The scope has crept from flagging up content to flagging up children. Some schools, risk-averse, retain everything and so create permanent records, false positives included, even things a child typed and subsequently deleted; these records may be distributed to and accessible by an indefinite number of school IT staff, and stored in further third parties’ systems such as CPOMS or Capita SIMS.
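The mechanics behind that monitoring are simpler than the marketing suggests. Below is a minimal sketch in Python of the keyword-watchlist pattern described above; the phrases, categories and names are invented for illustration, not any vendor’s actual lists or code:

```python
# A minimal, hypothetical sketch of keyword-watchlist flagging.
# The categories and phrases are invented; real products use libraries
# of thousands of words and phrases.
from dataclasses import dataclass, field
from datetime import datetime, timezone

WATCHLISTS = {
    "self_harm": {"hurt myself", "want to disappear"},
    "extremism": {"join the cause"},
}

@dataclass
class Flag:
    user_id: str          # the child's account, not necessarily the actual typist
    category: str         # the risk tag attached to the activity
    matched_phrase: str
    text: str             # retained verbatim, even if the child later deletes it
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def scan_activity(user_id: str, text: str) -> list[Flag]:
    """Return a Flag for every watchlist phrase found in the text."""
    lowered = text.lower()
    return [
        Flag(user_id, category, phrase, text)
        for category, phrases in WATCHLISTS.items()
        for phrase in phrases
        if phrase in lowered
    ]

# Every match is appended to the child's profile log.
profile_log: list[Flag] = []
profile_log += scan_activity(
    "pupil-042",
    "For English we read a poem where the narrator says I want to disappear",
)
for flag in profile_log:
    print(flag.category, "->", flag.matched_phrase)
```

What the sketch makes obvious is that a line from a poem matches exactly like a cry for help; there is no notion of context, or of who was at the keyboard, and the flag persists on the profile either way.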

A wide range of the rights of the child are breached by mass monitoring in the UK, as outlined in the UN Committee on the Rights of the Child General Comment No. 25, which states that, “Any digital surveillance of children, together with any associated automated processing of personal data, should respect the child’s right to privacy and should not be conducted routinely, indiscriminately or without the child’s knowledge or, in the case of very young children, that of their parent or caregiver; nor should it take place without the right to object to such surveillance, in commercial settings and educational and care settings, and consideration should always be given to the least privacy-intrusive means available to fulfil the desired purpose.” (para 75)

Even the NSPCC, despite its recent public policy opposing secure messaging using end-to-end encryption, recognises on its own Childline webpage the risk to children from the content monitoring of their digital spaces, and that such monitoring may make them less safe.

In my work at defenddigitalme in 2018, one school Safety Tech company accepted our objections that this monitoring went too far in its breach of children’s confidentiality and safe spaces, and it agreed to stop monitoring counselling services. But there are roughly fifteen active companies here in the UK, and the data protection regulator, the ICO, despite being publicly so keen to be seen to protect children’s rights, has declined to act to protect children from the breach of their privacy and data protection rights across this field.

There are questions that should be straightforward to ask and answer, and while some CEOs are more willing than others to engage constructively with criticism and ideas for change, there is reluctance to address the key question: what is the lawful basis for monitoring children in school, at home, inside or outside school hours?

Another important question, often without an answer, is how these companies train their algorithms, whether in age verification or child Safety Tech. How accurate are the language inferences of an AI designed to catch out children being deceitful, and where are its assumptions, machine- or man-made, wrong or discriminatory? It is overdue for our regulator, the ICO, to do what the FTC did with Paravision, and require companies that develop tools through unlawful data processing to delete the output from it: the trained algorithm, plus any products created from it.
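Routine, published error audits broken down by pupil group would be one way to start answering the discrimination question. A minimal sketch of what such a check could look like; the data, group labels and figures here are entirely hypothetical:

```python
# Hypothetical audit of false positive rates by pupil group.
from collections import defaultdict

# Each record: (group, flagged_by_tool, actually_at_risk).
# In a real audit the group might be age band, first language, or ethnicity,
# and the labels would come from independently reviewed cases.
records = [
    ("english_first_language", True, False),
    ("english_first_language", False, False),
    ("english_first_language", True, True),
    ("english_additional_language", True, False),
    ("english_additional_language", True, False),
    ("english_additional_language", False, False),
]

def false_positive_rate_by_group(records):
    """Share of not-at-risk children wrongly flagged, per group."""
    flagged_innocent = defaultdict(int)
    innocent = defaultdict(int)
    for group, was_flagged, at_risk in records:
        if not at_risk:
            innocent[group] += 1
            if was_flagged:
                flagged_innocent[group] += 1
    return {group: flagged_innocent[group] / innocent[group] for group in innocent}

print(false_positive_rate_by_group(records))
# -> {'english_first_language': 0.5, 'english_additional_language': 0.666...}
```

A persistent gap in false positive rates between groups would be evidence that the tool’s language assumptions fail some children more than others.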

Many of the harms from profiling children were recognised by the ICO in the Met Police gangs matrix: discrimination, conflation of victim and perpetrator, notions of ‘pre-crime’ without independent oversight, data distributed out of context, and excessive retention.

Harm is, after all, why the profiling of children should be prohibited. And where, in exceptional circumstances, States may lift this restriction, it is on condition that appropriate safeguards are provided for by law.

While I believe any of the Safety Tech-generated category profiles could harm a child, through mistaken interventions, through being treated differently by staff as a result, or through damage to a trusted relationship, perhaps the most devastating to a child’s prospects are mistakes that could be made under the Prevent duty.

The UK Home Office has pushed its Prevent agenda through schools since 2015, and it has been built into school Safety Tech by design: those risk tags, as noted above, include terrorism and extremism. I know of schools where children’s records carry terrorism-related flags although there has been no Prevent referral. But there is no transparency about these numbers at all, and no oversight to ensure children do not stay wrongly tagged with those labels. Families may never know.

Perhaps the DCMS needs to ask itself: are the values of the UK Home Office really what the UK should export to children globally as “the UK as a world-leader”, without independent legal analysis, without safeguards, and without taking accountability for their effects?

Those values are demonstrated in the Home Office’s approach to the life and death of migrants at sea, to children with no recourse to public funds, and to discriminatory stop and search; it is a Department that does not care enough even to understand or publish the impact of its interventions on children and their families.

The Home Office talk is of safeguarding children, but it opposes them having safe spaces online. School Safety Tech tools actively work around children’s digital security, can act as a man-in-the-middle, and can create new risks. I have seen no evidence that, on balance, convinces me that school Safety Tech does in fact make children safer, but plenty that the Home Office wants to create the conditions in which such tools can thrive, by weakening the security of digital activity through its assault on end-to-end encryption. My question is whether Online Harms is to be the excuse to give it a lawful basis.

Today there are zero statutory transparency obligations, testing or safety standards required of school Safety Tech before it can be procured in UK state education at scale.

So what would a safe and lawful framework for operation look like? It would need to be open to scrutiny, and it would require both regulatory action and law.

There are no published numbers of how many records are created about how many school children each year. There are no safeguards to protect children’s rights, or to protect them from harm, in terms of false positives, error retention, transfer of records to the U.S. or to third-party companies, or how many covert photos school staff have been enabled to take of children via webcam. There is no equivalent of the medical device ‘foreseeable misuse risk assessment’ that ISO 14971 would require, despite systems being used for mental health monitoring with suicide risk flags. Children need to know what is on their record, and to be able to seek redress when it is wrong. The law would set boundaries and safeguards, and both existing and future law would need to be enforced. And we need independent research on the effects of school surveillance, and its chilling effects on the mental health and behaviour of developing young people.

Companies may argue they are transparent, and seek to prove how accurate their tools are. Perhaps they may even become highly accurate.
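Even high accuracy would not settle the question, because flagging rare events across whole school populations runs into the base-rate problem: the rarer the behaviour being looked for, the more of the flags are false alarms. A back-of-the-envelope illustration, with entirely hypothetical figures:

```python
# Back-of-the-envelope base-rate arithmetic; all figures are hypothetical.
pupils = 1_000_000
prevalence = 1 / 10_000   # assume 1 in 10,000 pupils genuinely needs intervention
sensitivity = 0.99        # at-risk pupils the tool correctly flags
specificity = 0.99        # not-at-risk pupils the tool correctly ignores

at_risk = pupils * prevalence                             # 100 children
true_positives = at_risk * sensitivity                    # 99 correct flags
false_positives = (pupils - at_risk) * (1 - specificity)  # ~9,999 wrong flags

precision = true_positives / (true_positives + false_positives)
print(f"{false_positives:,.0f} false alarms for {true_positives:.0f} real cases")
print(f"Only {precision:.1%} of flagged children are actually at risk")
```

On those assumptions, roughly ninety-nine in every hundred flags point at a child who needed no intervention at all, and each of those children may carry the tag regardless.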

But no one in the school Safety Tech sector is yet willing to say: these are the thousands of words that may trigger a flag if your child types them; or indeed, here is an annual report of all the flags triggered, and your own or your child’s saved profile. A school’s interactions with children’s social care already offer a framework for dealing with information that could put a child at risk from family members, so reporting should be doable.
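None of that would be hard to produce. Continuing the hypothetical sketch from earlier, an annual per-child summary of triggered flags is a few lines of code; the obstacle is will and policy, not engineering:

```python
from collections import Counter

def annual_report(profile_log, user_id, year):
    """Summarise every flag raised against one child's account in a given year.
    Assumes the hypothetical Flag records from the earlier sketch, whose
    timestamps are ISO 8601 strings beginning with the year."""
    flags = [f for f in profile_log
             if f.user_id == user_id and f.timestamp.startswith(str(year))]
    return {
        "user_id": user_id,
        "year": year,
        "total_flags": len(flags),
        "flags_by_category": dict(Counter(f.category for f in flags)),
        "entries": [(f.timestamp, f.category, f.matched_phrase) for f in flags],
    }
```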

At the end of the event this week, the CRISP event moderator said of their own work, outside schools: “we are infiltrating bad actor networks across the globe and we are looking at everything they are saying. […] We have a viewpoint that there are certain lines where privacy doesn’t exist anymore.”

Their company website says their work involves “uncovering and predicting the actions of bad actor, activist, agenda-driven and interest groups”. That’s a pretty broad conflation right there. Their case studies include countering social media activism against a luxury apparel brand. And their legal basis of ‘legitimate interests‘ for their data processing might seem flimsy at best for such wide-ranging surveillance activity where ‘privacy doesn’t exist anymore’.

I must often remind myself that the people behind Safety Tech may epitomise the very best of what some believe is making the world safer online, as they see it. But it is *as they see it*. And if policy makers or CEOs have convinced themselves that breaking the law is OK because ‘we are doing it for good, for social impact, or to safeguard children’, then it should be a red flag that these self-appointed ‘good guys’ appear to think themselves above the law.

My takeaway, time and time again, is that companies, alongside governments, policy makers, and a range of lobbying interests globally, want to redraw the lines around human rights so that they can overstep them. There are “certain lines” that don’t suit their own business models or agendas. The DCMS may talk about seeing its first safety tech unicorn, but not about the private equity funding, or where these companies pay their taxes. Children may be the only thing they talk about protecting, but they never talk of protecting children’s rights.

In the school Safety Tech sector, there is activity that I believe is unsafe, or unethical, or unlawful. There is no appetite or motivation so far to fix it. If, in the upcoming Online Harms legislation, the government seeks to make lawful what is unlawful today, I wonder who will be held accountable for the unsafe and the unethical that come with the package deal, and whether the Minister will run that reputational risk.