
Farming out our children. AI AI Oh. (2/2)

Today Keir Starmer talked about us having more control in our lives. “Taking back control is a Labour argument”, he said. So let’s see it in education tech policy, where in 2018 fewer than half of parents told us they felt they had sufficient control of their child’s digital footprint.

Not only has the UK lost control of which companies run large parts of the state education infrastructure and its delivery; the state is *literally* giving away control of our children’s lives, recorded in identifiable data at national level, and since 2012 that has included giving the data to journalists, think tanks, and companies.

Why it matters is less about the data per se than about what is done with it without our permission, and how that affects our lives.

Politicians’ love affair with AI (undefined) seems to be as ardent as under the previous government. The State appears to have chosen to further commercialise children’s lives in data, having announced towards the end of the school summer holidays that the DfE and DSIT will give pupils’ assessment data to companies for AI product development. I get angry about this because the data is badly misunderstood: it is not a product to pass around but the stories of children’s lives, and control of those stories belongs to them.

Are we asking the right questions today about AI and education? In 2016, in a post for Nesta, Sam Smith foresaw the algorithmic fiasco that would happen in the summer of 2020, pointing out that exam-marking algorithms, like any other decisions, have unevenly distributed consequences. What prevents that happening daily, but behind closed doors and in closed systems? The answer is: nothing.

Both the adoption of AI in education and education about AI are unevenly distributed. Driven largely by commercial interests, some are co-opting teaching unions for access to the sector; others, more cautious, have focused on the challenges of bias, discrimination and plagiarism. As I recently wrote in Schools Week, the influence of corporate donors and their interests in shaping public sector procurement, such as the Tony Blair Institute’s backing by Oracle owner Larry Ellison, therefore demands scrutiny.

Should society allow its public sector systems and laws to be shaped primarily to suit companies? The users of the systems are shaped by how those companies work, so who keeps the balance in check?

In a 2021 reflection here on World Children’s Day, I asked the question, Man or Machine: who shapes my child? Three years later, I am still concerned about the failure to recognise and address the redistribution of not only pupils’ agency but teachers’ authority: from individuals to companies (pupils and teachers don’t decide what is ‘right’ to do next; the ‘computer’ does); from public interest institutions to companies (company X, not the school, determines the curriculum content of what the computer does and how); and from State to companies (accountability for outcomes falls through the gap when activity is outsourced to the AI company).

Why it matters is that these choices influence not only how we teach and learn, but how children feel about it and develop.

The human response to surveillance (and that is what much of AI relies on: massive data-veillance and dashboards) is a result of the chilling effect of being ‘watched’ by known or unknown persons behind the monitoring. We modify our behaviours to comply with their expectations. We try not to stand out from the norm, to protect ourselves from the resulting effects.

The second reason we modify our behaviour is to comply with the machine itself. In the absence of a responsible human in an interaction mediated by an AI tool, we are forced to change what we do to fit what the machine can manage. How AI is changing human behaviour is not confined to where we walk, meet, play and are overseen, indoors or out. It is in how we respond to it and, ultimately, how we think.

In the simplest examples, using voice assistants shapes how children speak, and in prompting generative AI applications we can see how we are forced to adapt how we think, to frame the questions best suited to getting the output we want. We are changing how we behave to suit machines. How we change our behaviour is therefore determined by the design choices of the company behind the product.

There is as yet limited public debate on the effects of this on education, on how children act, interact, and think using machines, and no consensus in the UK education sector on whether it is desirable to introduce these companies, and their steering, bringing such changes to teaching and learning and, as a result, to the future of society.

Now, I would go further than I did in 2021. The neo-liberal approach to education, with its emphasis on the efficiency of human capital and productivity, on individualism and personalisation, on producing ‘labour market value’ and measurable outcomes, is commonly at the core of AI teaching and learning platforms.

Many tools dehumanise children into data dashboards, rank and spank their behaviours and achievements, punish outliers and praise norms, and expect nothing but strict adherence to rules (sometimes incorrect ones, like mistakes in maths apps). As some companies have expressly said, the purpose of this is to normalise such behaviours to make children ready to be the employees of the future, and the reason their tools are free is to normalise their adoption for life.

AI, through the normalisation of values built into tools by design, is even seen by some as encouraging fascistic solutions to social problems.

But the purpose of education is not only about individual skills and producing human capital to exploit. Education is a vital gateway to rights and the protection of a democratic society. When we talk about AI and learners, education must not be framed only as skills, an economic driver, and human capital; it must include rights, championing the development of a child’s personality to their fullest potential, intercultural understanding, digital citizenship covering dis- and misinformation and discrimination, and the promotion and protection of democracy and the natural world. “It shall promote understanding, tolerance and friendship among nations, racial or religious groups, and shall further the activities of the United Nations for the maintenance of peace.”

Peter Kyle, the UK DSIT Secretary of State, said last week that, “more than anything else, it is growth that will shape those young people’s future.” But what will be used to power all this growth in AI, at what environmental and social cost, and will we get a say?

Don’t forget, in this project announcement the Minister said, “This is the first of many projects that will transform how we see and use public sector data.” That’s our data, about us. And when it comes to schools, that’s not only the millions of learners who’ve already left, but those who are schoolchildren today. Are we really going to accept turning them into data fodder for AI without a fight? As Michael Rosen summed up so perfectly in 2018, “First they said they needed data about the children to find out what they’re learning… then the children became data.” If this is to become the new normal, where is the mechanism for us to object? And why this, now, in such a hurry?

Purpose limitation should also prevent retrospective reuse of learners’ records and data, but so far it has not, whether for identifying and sensitive data distributed from the NPD at national level or from edTech in schools. The project details, scant as they are, suggest parents were asked for consent in this particular pilot, but the Faculty AI notice seems legally weak for schools. And when it comes to using pupil data to build AI products, the question is whether consent can ever be valid, since it cannot be withdrawn once given, and the nature of being ‘freely given’ is affected by the power imbalance.

So far there is no field to record an opt-out in any schools’ Information Management System, though many discussions suggest it would be relatively straightforward to add one. It is important to note, however, that DSIT’s own public engagement work on that project says opt-in is what those parents told the government they would expect. And there is a decade of UK public engagement on data telling government that opt-in is what we want.

The regulator has so far been silent on the DSIT/DfE announcement, despite lack of fair processing and failures under Articles 12, 13 and 14 of the GDPR being among the key findings of its 2020 DfE audit. I can use a website to find children’s school photos, scraped without our permission. What about our school records?

Will the government consult before commercialising children’s lives in data to feed AI companies and ‘the economy’, or any of the other “many projects that will transform how we see and use public sector data“? How is it different from the existing ONS, ADR, or SAIL databank access points and processes? Will the government evaluate the impact of increasing surveillance in schools on child development, behaviour or mental health? Will MPs get an opt-in, or even an opt-out, of the commercialisation of their own school records?

I don’t know about ‘Britain belongs to us‘, but my own data should.


See also Part 1: The New Normal is Not Inevitable.

A fresh start for edtech? Maybe. But I wouldn’t start from here.

In 1924 the Hibbert Journal published what is accepted as the first printed copy of a well-known joke.

A genial Irishman, cutting peat in the wilds of Connemara, was once asked by a pedestrian Englishman to direct him on his way to Letterfrack. With the wonted enthusiasm of his race the Irishman flung himself into the problem and, taking the wayfarer to the top of a hill commanding a wide prospect of bogs, lakes, and mountains, proceeded to give him, with more eloquence than precision, a copious account of the route to be taken. He then concluded as follows: ‘Tis the divil’s own country, sorr, to find your way in. But a gintleman with a face like your honour’s can’t miss the road; though, if it was meself that was going to Letterfrack, faith, I wouldn’t start from here.’

Ty Goddard asked some sensible questions in TES on April 4 on the UK edTech strategy, under the overarching question, ‘A fresh start for edtech? Maybe. But the road is bumpy.’

We’d hope so, since he’s on the DfE edTech board and aims “to accelerate the edtech sector in Britain and globally.”

“The questions now being asked are whether you can protect learning at a time of national emergency? Can you truly connect educators working from home with their pupils?”

and he rightly noted that,

“One problem schools are now attempting to overcome is that many lack the infrastructure, experience and training to use digital resources to support a wholesale move to online teaching at short notice.”

He calls for “bold investment and co-ordination across Whitehall led by Downing Street to really set a sprint towards super-fast connectivity to schools, pupils’ homes and investment in actual devices for students. The Department for Education, too, has done much to think through our recent national edtech strategy – now it needs to own and explain it.”

But ‘own and explain it’ is the same problematic starting point that care.data had in the NHS in 2014. And we know how that went.

The edTech demands and drive for the UK are not a communications issue. Nor are they simply problems of infrastructure, or the age-old idea of shipping suitable tech at scale. The ‘fresh start’ isn’t going to be what anyone wants, least of all the edTech evangelists, if we start from where they are.

Demonstrating certain programmes, platforms, and products to promote them to others and drive adoption is ‘the divil’s own country‘.

The edTech UK strategy in effect avoided online learning, and the reasons for that were not public knowledge but likely well founded: such products are mostly unevidenced, and what research is available often comes from the companies themselves, their partners, promoter think tanks, or other related and self-interested bodies.

I’ve not seen anyone yet talk about the disadvantage and deprivation that come from not issuing curriculum-standard textbooks to every child. Why on earth can secondary schools not afford to let each child take their textbook home? It is a darn sight cheaper than tech, independent of data costs, and a guide to exactly what the exams will demand. Should we not seek to champion the most appropriate and equitable learning solutions in addition to, rather than exclusively, the digital ones? The GCSE children I support(ed) in foreign languages each improved once they had written materials. Getting out Chromebooks, by contrast, simply interfered in the process and wasted valuable classroom time.

Technology can deliver most vital communications at speed and scale. It can support admin, expand learning, and level the playing field through accessible tools. But done wrongly, it makes things worse than doing without.

Its procurement must assess any potentially harmful consequences and safeguard against them, and must not accept short-term benefits at the cost of long-term harm. It should be safe, fair, and transparent.

“Responsible technology is no longer a nice thing to do to look good, it’s becoming a fundamental pillar of corporate business models. In a post-Cambridge Analytica world, consumers are demanding better technology and more transparency. Companies that do create those services are the ones that will have a better, brighter future.”

Kriti Sharma, VP of AI, Sage, (Doteveryone 2019 event, Responsible Technology)

The hype of ‘edTech’ achievement in the classroom so far, far outweighs the evidence of delivery. Neil Selwyn, Professor in the Faculty of Education, Monash University, Australia, writing in the Impact magazine of the Chartered College in January 2019 summed up:

“the impacts of technology use on teaching and learning remain uncertain. Andreas Schleicher – the OECD’s director of education – caused some upset in 2015 when suggesting that ICT has negligible impact on classrooms. Yet he was simply voicing what many teachers have long known: good technology use in education is very tricky to pin down.”

That won’t stop edTech being part of the mainstay of the UK export strategy post-Brexit, whenever that may now be. But let’s be very clear: if the Department wants to be a world leader, it shouldn’t promote products whose founders were last most notably interviewing fellow students online about their porn preferences, or who are based in offshore organisations with very odd financial structures. Do your due diligence. Work with reputable people and organisations, and build a trustworthy network of trustworthy products framed by the rule of law, rights-respecting and appropriate for children. But don’t start with the products.

Above all, build a strategy for education, for administrative support, for respecting rights, and for teaching, in which tools that may or may not be technology-based add value; but don’t start with product promotion.

To date the aims serve two masters: our children’s education, and the UK edTech export strategy. You can serve both if you’re prepared to do the proper groundwork, but that groundwork is lacking right now. What is certain is that if you get it wrong for UK children, the export strategy will inevitably fail.

Covid-19 must not be misused to direct our national edTech strategy. ‘I wouldn’t start from here’ isn’t a joke; it’s a national call for change.

Here are ten pointers on where, why, and how to start instead.

1. The national edTech strategy board should start by demonstrating what it wants to see from others, with full transparency over its members, aims, terms of reference, partners and meeting minutes. There should be no need for FOI requests to obtain them; far more sensitive subjects operate in the open. It unfortunately emulates other DfE strategy, and the UK edTech network, which has an in-crowd and long-standing controlling members. Both would be the richer for transparency and openness.

2. Stop bigging up the ‘Big Three’ and doing their market monopolisation for them, unless you want people to see you simply as promoting your friends’-on-the-board/foundation/ethics-committee’s products. Yes, “many [educational settings] lack the infrastructure”, but that should never mean encouraging ownership and delivery by closed commercial partners alone. That is the route to losing control of your state education curriculum, staff training and (e)quality, its delivery, risk management, data, and cost control.

3. Start by designing for fairness in public sector systems. Minimum acceptable ethical standards could be framed around, for example, accessibility, design, and restrictions on commercial exploitation and in-product advertising. These need to be in place first, before fitting products ‘on top’ of an existing unfair and imbalanced system, to avoid embedding disadvantage and the commodification of children in education even further.

5. Accessibility and Internet access are a social justice issue. Again, as we’ve argued at defenddigitalme for some time, these come *before* promoting products on top of the delivery systems:

  • Accessibility standards for all products used in state education should be defined and made compulsory in procurement processes, to ensure access for all and reduce digital exclusion.
  • All schools must be able to connect to high-speed broadband services to ensure equality of access and participation in the educational, economic, cultural and social opportunities of the world wide web.
  • Ensure a substantial improvement in support available to public and school library networks. CILIP has pointed to CIPFA figures of a net reduction of 178 libraries in England between 2009-10 and 2014-15.

6. Core national education infrastructure must be put on the national risk register, as we’ve argued previously at defenddigitalme (see 6.6). Dependencies such as MS Office 365, major cashless payment systems, and Google for Education all need to be assessed as part of planning for regular and exceptional delivery of education. We currently operate in the dark. And it should be unthinkable that companies get seats at the national UK edTech strategy table without full transparency over questions on their practices, policy and compliance with the rule of law.

7. Shift the power balance back to schools and families, so that they can trust an approved procurement route, and children and legal guardians can trust school staff to work only with suppliers that do not overstep the boundaries of lawful processing. Incorporate (1) Recommendation CM/Rec(2018)7 of the Committee of Ministers to member States on Guidelines to respect, protect and fulfil the rights of the child in the digital environment, and (2) the UN General Comment No. 16 (2013) on State obligations regarding the impact of the business sector on children’s rights, across the education and wider public sector.

8. Start with teacher training. Why on earth is the national strategy all about products, when it should be starting with people?

  • Introduce data protection and pupil privacy into basic teacher training, to support a rights-respecting environment in policy and practice in edTech use and broader data processing, giving staff the clarity, consistency and confidence they need to apply high standards.
  • Ensure ongoing training is available and accessible to all staff for continuous professional development.
  • A focus on people, not products, will deliver the fundamental basics needed for good tech use.

9. Safe data by design and default. I’m tired of hearing from CEOs of companies who claim to be social entrepreneurs, or non-profits, or teachers who’ve designed apps, how well intentioned their products are. Show me instead. Meet the requirements of the rule of law.

  • Local systems must stop shipping out (often sensitive) pupil data at scale and speed to companies, and instead stay in control of terms and conditions and data purposes, and ban product development, for example.
  • Companies must stop using pupil data for their own purposes for profit, or to make inferences about autism or dyslexia, for example; if that’s not the stated product aim, it’s likely unlawful.
  • Stop national pupil data distribution for third-party reuse; start safe access instead. And get the Home Office out of education.
  • Establish fair and independent oversight mechanisms of national pupil data, so that transparency and trust are consistently maintained across the public sector, and throughout the chain of data use, from collection, to the end of its life cycle, including annual data usage reports for each child.

10. We need a law that works for children’s rights. Develop a legislative framework for the fair use of a child’s digital footprint from the classroom, for direct educational and administrative purposes at local level, including commercially acceptable use policies. Build the national edTech strategy on a rights-based framework and lawful basis in an Education and Privacy Act. Without this, you are building on sand.