All posts by jenpersson

Ethics washing in AI. Any colour as long as it’s dark blue?

The opening discussion from the launch of the Institute for Ethics in AI in the Schwarzman Centre for the Humanities in Oxford both asked many questions and left many open.

The panel event is available to watch on YouTube.

The Director recognised in his opening remarks where he expected their work to differ from the talk of ethics in AI that can become ‘matters of facile mottos hard to distinguish from corporate PR’, like “Don’t be evil.” I would like to have heard him go on to point out the reasons why, because I fear this whole enterprise is founded on just that.

My first question is whether the Institute will ever challenge its own need for existence. It is funded, therefore it is. An acceptance of the technological value and inevitability of AI is, after all, built into the name of the Institute.

As Powles and Nissenbaum wrote in 2018, “the endgame is always to ‘fix’ A.I. systems, never to use a different system or no system at all.”

My second question is on the three drivers they went on to identify, in the same article, “Artificial intelligence… is backed by real-world forces of money, power, and data.”

So let’s follow the money.

The funder of the Schwarzman Centre for the Humanities, the home of the new Institute, is also funding AI ethics work across the Atlantic, at Harvard, Yale and other renowned institutions that you might expect to lead in the publication of influential research. The intention at the MIT Schwarzman College of Computing is that his investment “will reorient MIT to address the opportunities and challenges presented by the rise of artificial intelligence including critical ethical and policy considerations to ensure that the technologies are employed for the common good.” Quite where does that ‘reorientation’ seek to end up?

The panel discussed power.

The idea of ‘citizens representing citizens rather than an elite class representing citizens’, should surely itself be applied to challenge who funds work that shapes public debate. How much influence is democratic for one person to wield?

“In 2007, Mr. Schwarzman was included in TIME’s “100 Most Influential People.” In 2016, he topped Forbes Magazine’s list of the most influential people in finance and in 2018 was ranked in the Top 50 on Forbes’ list of the “World’s Most Powerful People.” [Blackstone]

The panel also talked quite a bit about data.

So I wonder what work the Institute will do in this area and the values that might steer it.

In 2020 Schwarzman’s private equity company Blackstone, acquired a majority stake in Ancestry, a provider of ‘digital family history services with 3.6 million subscribers in over 30 countries’. DNA. The Chief Financial Officer of Alphabet Inc. and Google Inc sits on Blackstone’s board. Big data. The biggest. Bloomberg reported in December 2020 that, ‘Blackstone’s Next Product May Be Data From Companies It Buys’. “Blackstone, which holds stakes in about 97 companies through its private equity funds, ramped up its data push in 2015.”

It was Nigel Shadbolt who picked up the issues of data and of representation as they relate to putting human values at the centre of design. He suggested there is growing disquiet that, rather than everyday humans’ self-governance or the agency of individuals, this can mean the values of ‘organised group interests’ assert control. He picked up on the values that we most prize as things that matter in value-based computing, and later on transparency of data flows as a form of power that is important to understand. Perhaps the striving for open data as revealing power should also apply to funding, in a more transparent, publicly accessible model?

AI in a democratic culture.

Those whose lives are most influenced by AI are often those most excluded from discussing its harms, and rarely involved in its shaping or application. Prof Hélène Landemore (Yale University) asked perhaps the most important question in the discussion, given its wide-ranging dance around the central theme of AI and its role or effects in a democratic culture, a theme that included Age Appropriate Design, technical security requirements, surveillance capitalism and fairness. Do we in fact have democracy or agency today at all?

It is after all not technology itself that has any intrinsic ethics but those who wield its power, those who are designing it, and shaping the future through it, those human-accountability-owners who need to uphold ethical standards in how technology controls others’ lives.

The present is already one in which human rights are infringed by machine-made and data-led decisions about us without us, without fairness, without recourse, and without redress. It is a world that includes a few individuals in control of a lot. A world in which Yassen Aslam this week said, “the conditions of work, are being hidden behind the technology.”

The ethics of influence.

I want to know what’s in it for this funder to pivot from his work life, past and present, to funding ethics in AI, and why now? He’s not renowned for his ethical approach in the world. Rather, from his past at Lehman Brothers to the funding of Donald Trump, he is better known for his reported “inappropriate analogy” on Obama’s tax policies, or when he reportedly compared ‘Blackstone’s unsuccessful attempt to buy a mortgage company in the midst of the subprime homeloans crisis to the devastation wreaked by an atomic bomb dropped on Hiroshima in 1945.’

In the words of the 2017 International Business Times article, ‘How Billionaire Trump Adviser Evades Ethics Law While Shaping Policies That Make Money For His Wall Street Firm’, “Schwarzman has long been a fixture in Republican politics.” “Despite Schwarzman’s formal policy role in the Trump White House, he is not technically on the White House payroll.” Craig Holman of Public Citizen was reported as saying, “We’ve never seen this type of abuse of the ethics laws”. While politics may have moved on, we are arguably now in a time Schwarzman described as “a golden age that arrives when you have a mess.”

The values behind the money, power, and data matter in particular because it is Oxford. Emma Briant has raised her concerns in Wired about the report from the separate Oxford Internet Institute, Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation, because of how influential the institute is.

Will the work alone at the new ethics Institute be enough to prove that its purpose is not for the funder or his friends to use their influence to have their business interests ethics-washed in Oxford blue?  Or might what the Institute chooses not to research, say just as much? It is going to have to prove its independence and own ethical position in everything it does, and does not do, indefinitely. The panel covered a wide range of already well-discussed, popular but interesting topics in the field, so we can only wait and see.

I still think, as I did in 2019, that corporate capture is unhealthy for UK public policy. If done at scale, with added global influence, it is not only unhealthy for the future of public policy, but for academia. In this case it has the potential in practice to be at best irrelevant corporate PR, but at worst to be harmful for the direction of travel in the shaping of global attitudes towards a whole field of technology.

Mutant algorithms, roadmaps and reports: getting real with public sector data

The CDEI has published ‘new analysis on the use of data in local government during the COVID-19 crisis’ (the Report), and it shares some similarities in discussing data with what the Office for AI roadmap (the Roadmap) did in January on machine learning.

A notable feature is that the CDEI work includes a public poll. Nearly a quarter of 2,000 adults said that the most important thing for them, to trust the council’s use of data, would be “a guarantee that information is anonymised before being shared, so your data can’t be linked back to you.”

Both the Report and the Roadmap shy away from that problematic gap in their conclusions: the gap between public expectations and reality in the application of data used at scale in public service provision, especially in identifying vulnerability and risk prediction.
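That gap is not abstract. The guarantee people say they want — that ‘data can’t be linked back to you’ — is very hard to honour in practice once data is shared at scale. A minimal sketch below, using entirely made-up data and hypothetical field names, illustrates why stripping names alone is not anonymisation: records can often be re-linked through quasi-identifiers shared with other datasets.

```python
# Illustrative sketch only (fabricated data): removing names does not
# prevent linkage when quasi-identifiers (postcode, birth year, sex)
# survive in both the "anonymised" dataset and a public register.
anonymised_health = [
    {"postcode": "BS1 4ND", "birth_year": 1980, "sex": "F", "condition": "asthma"},
    {"postcode": "BS1 4ND", "birth_year": 1975, "sex": "M", "condition": "diabetes"},
]
public_register = [
    {"name": "Jane Doe", "postcode": "BS1 4ND", "birth_year": 1980, "sex": "F"},
]

def relink(health_rows, register_rows):
    """Join the two datasets on shared quasi-identifiers."""
    keys = ("postcode", "birth_year", "sex")
    matches = []
    for h in health_rows:
        for r in register_rows:
            if all(h[k] == r[k] for k in keys):
                # A unique combination of quasi-identifiers
                # re-identifies the supposedly anonymous record.
                matches.append({"name": r["name"], "condition": h["condition"]})
    return matches

reidentified = relink(anonymised_health, public_register)
```

This is why ‘anonymised so it can’t be linked back to you’ is a far stronger claim than most de-identified administrative datasets can support.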

Both seek to provide vision and aims around the future development of data governance in the UK.

The fact is that everyone must take off their rose-tinted spectacles on data governance to accept this gap, and get basics fixed in existing practice to address it. In fact, as academic Michael Veale wrote, often the public sector is looking for the wrong solution entirely: “The focus should be on taking off the ‘tech goggles’ to identify problems, challenges and needs, and to not be afraid to discover that other policy options are superior to a technology investment.”

But as things stand, the public sector procurement and use of big data at scale, whether in AI and Machine Learning or other systems, require significant changes in approach.

The CDEI poll asked, “If an organisation is using an algorithmic tool to make decisions, what do you think are the most important safeguards that they should put in place?” 68% rated “that humans have a key role in overseeing the decision-making process, for example reviewing automated decisions and making the final decision” in their top three safeguards.

So what is this post about? Why our arms length bodies and various organisations’ work on data strategy are hindering the attainment of the goals they claim to promote, and what needs fixed to get back on track. Accountability.

Framing the future governance of data

On Data Infrastructure and Public Trust, the AI Council Roadmap stated an ambition to, “Lead the development of data governance options and its uses. The UK should lead in developing appropriate standards to frame the future governance of data.”

To suggest not only that we should be a world leader, but to imagine that the capability to do so exists, suggests a disconnect from current reality; none of this was mentioned in the Roadmap, though it is drawn out a little more in the CDEI Report from local authority workshops.

When it comes to data policy and Artificial Intelligence (AI) or Machine Learning (ML) based on data processing, and therefore dependent on its infrastructure, suggesting we should lead on data governance as if separate from the existing standards and frameworks set out in law would be disastrous for the UK and businesses in it. Exports need to meet standards in the receiving countries. You cannot just ‘choose your own adventure’ here.

The CDEI Report says both that participants in their workshops found a lack of legal clarity “in the collection and use of data” and, “Participants finished the Forum by discussing ways of overcoming the barriers to effective and ethical data use.”

Lack of understanding of the law is a lack of competence and capability that I have seen and heard time and time and time again over the last five years, in participants at workshops, events and webinars, some of whom are in charge of deciding what tools are procured and how to implement public policy using administrative data. The law on data processing is accessible and generally straightforward.

If your work involves “overcoming barriers” then either there is not competence to understand what is lawful to proceed with confidence using data protections appropriately, or you are trying to avoid doing so. Neither is a good place to be in for public authorities, and bodes badly for the safe, fair, transparent and lawful use of our personal data by them.

But it is also a lack of data infrastructure that increases the skills gap and leaves a bigger need to know what is lawful or not, because if your data is held in “excessive use of excel spreadsheets” then ‘sharing’ means making decisions about distributing copies of the data. Data access can instead be controlled through role-based access models, which make it clear when someone is working around their assigned security role, and which create an audit trail of access. You reduce risk by distributing access, not distributing data.

The CDEI Report quotes as a ‘concern’ that data access granted under emergency powers in the pandemic will be taken away. This is a mistaken view that should be challenged. That access was *always* conditional and time limited. It is not something that will be ‘taken away’ but an exceptional use only granted because it was temporary, for exceptional purposes in exceptional times. Had it not been time limited, you wouldn’t have had access. Emergency powers in law are not ‘taken away’, but can only be granted at all in an emergency. So let’s not get caught up in artificial imaginings of what could change and what ifs, but change what we know is necessary.

We would do well to get away from the hyperbole of being world-leading, and aim for a minimum high standard of competence and capability in all staff who have any data decision-making roles and invest in the basic data infrastructure they need to do a good job.

Appropriate standards to frame the future governance of data

The AI Council Roadmap suggested that, “The UK should lead in developing appropriate standards to frame the future governance of data.” Let’s stop and really think for a minute: what did the Roadmap writers think they meant by that?

Because we have law that frames ‘appropriate standards.’ The UK government just seems unable or unwilling to meet it. And not only in these examples, in fact I’d challenge all the business owners on the AI Council to prove their own products meet it.

You could start with the Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (wp251rev.01). Or consider any of the policy recommendations, declarations, guidelines and other legal instruments issued by Council of Europe bodies or committees on artificial intelligence. Or, valuable for export standards, ensure respect for the Convention 108 standards to which we are a signed-up State party, among over 50 countries and growing. That’s all before the simplicity of the UK Data Protection Act 2018 and the GDPR.

You could start with auditing current practice for lawfulness. The CDEI Report says, “The CDEI is now working in partnership with local authorities, including Bristol City Council, to help them maximise the benefits of data and data-driven technologies.” I might suggest that includes a good legal team, as I think the Council needs one.

The UK is already involved in supporting the development of guidelines (as I was alongside UK representatives of government and the data regulator the ICO among hundreds of participants in drawing out Convention 108 Guidelines on data processing in education) but to suggest as a nation state that we have the authority to speak on the future governance of data without acknowledging what we should already be doing and where we get it wrong, is an odd place to start.

The current state of reality in various sectors

Take for example the ICO audit of the Department for Education.

Failures to meet basic principles of data protection law include not knowing what data they’ve got, lacking appropriate controls on distribution, and failure of fair processing (telling people you process their data). This is no small stuff. And these are only highlights from the eight-page summary.

The DfE doesn’t adequately understand what data it holds, and not having a record of processing is a direct breach of #GDPR. Did you know the Department is not able to tell you to which third parties your own or your child’s sensitive, identifying personal data (from over 21m records) was sent, among 1000s of releases?

The approach on data releases has been to find a way to fit the law to suit data requests, rather than to assess if data distribution should be approved at all. This ICO assessment covered only 400 applications; there have been closer to 2,000 approved since 2012. One refusal was to the US. Another, to the MOD.


For too long, the DfE’s ‘internal cultural barriers and attitudes’ have meant it hasn’t cared about your rights and freedoms or meeting its lawful obligations. That is a national government Department in charge of over fifty such mega databases, of which the NPD is only one. This is a systemic and structural set of problems, a direct result of Ministerial decisions that changed the law in 2012 to give away our personal data from state education. It was a choice made not to tell the people whom the data were about. This continues to be in breach of the law. And that is the same across many government departments.

Why does it even matter some still ask? Because there is harm to people today. There is harm in history that must not be possible to repeat. And some of the data held could be used in dangerous ways.

You only need to glance at other applications in government departments and public services to see bad policy, bad data and bad AI or machine learning outcomes. And all of those lead to breakdowns in trust and relations between people and the systems meant to support them, which in turn lead to bad data, and bad policy.

Unless government changes its approach, the direction of travel is towards less trust; in public health, for example, we see the consequences in disastrous responses, ranging from people not attending for vaccination, based on mistrust of proven data sharing, to COVID conspiracy theories.

Commercial reuse of public admin data is a huge mistake and the direction of travel is damaging.

“Survey responses collected from more than 3,000 people across the UK and US show that in late 2018, some 95% of people were not willing to share their medical data with commercial industries. This contrasts with a Wellcome study conducted in 2016 which found that half of UK respondents were willing to do so.” (July 2020, Imperial College)

Mutant algorithms

Summer 2020 first saw no human accountability for grades “derailed by a mutant #algorithm”, then the resignation of two Ofqual executives. What aspects of the data governance failures will be addressed this year? Where’s the *fairness*? There is a legal duty to tell people how and what data is used, especially in its automated aspects.

Misplaced data and misplaced policy aims

In June 2020 the DWP argued in a court case that “to change the way the benefit’s online computer calculation system worked in line with the original court ruling would undermine the principle of universal credit”. Not only does it fail its public interest purpose and do harm, but it is lax on its own #data governance controls. World leading is far, far, far away.

Entrenched racism

In August 2020, “The Home Office [has] agreed to stop using a computer algorithm to help decide visa applications after allegations that it contained ‘entrenched racism’”. How did it ever get approved for use?

That entrenched racism is found in policing too. The Gangs Matrix’s use of data required an Enforcement Notice from the ICO, and how it continues to operate at all, given its recognised discrimination and harm to young lives, is shocking.

Policy makers seem fixated on quick fixes that for the most part exist only in the marketing speak of the sellers of the products, while ignoring real problems in ethics and law, and denying harm.

“Now is a good time to stop.”

The most obvious case for me, where the Office for AI should step in, and where the CDEI Report from workshops with Local Authorities was most glaringly remiss, is where there is evidence of failure of efficacy and proven risk of danger to life through the procurement of technology in public policy. Don’t forget to ask what doesn’t work.

In January 2020, researchers at The Turing Institute, the Rees Centre and the What Works Centre published a report on ethics in Machine Learning in Children’s Social Care (CSC), raising the “dangerous blind spots” and “lurking biases” in the application of machine learning in UK children’s social care: totally unsuitable for life and death situations. Its later evidence showed models that do not work or would not reach the threshold they set for defining ‘success’.

Out of the thirty-four councils who said they had acute difficulties in recruiting children’s social workers in the December 2020 Local Government survey, 50 per cent said they had both difficulty recruiting generally and difficulty recruiting the required expertise, experience or qualification. Can staff in such challenging circumstances really have capacity to understand the limitations of developing technology on top of their everyday expertise?

And when it comes to focusing on the data, there are problems too. By focusing on the data held, and using only that to make policy decisions rather than on-the-ground expertise, we end up in situations where only “those who get measured, get helped”.

As Michael Sanders wrote, on CSC, “Now is a good time to stop. With the global coronavirus pandemic, everything has been changed, all our data scrambled to the point of uselessness in any case.”

There is no short cut

If the Office for AI Roadmap is to be taken seriously outside its own bubble, the board need to be, and be seen to be, independent of government. It must engage with the reality of applied AI in practice in public services, getting basics fixed first. Otherwise all its talk of “doubling down” and suggesting the UK government can build public trust and position the UK as a ‘global leader’ on Data Governance is misleading and a waste of everyone’s time and capacity.

I appreciate that it says, “This Roadmap and its recommendations reflects the views of the Council as well as 100+ additional experts.” All of whom I imagine are more expert than me. If so, which of them is working on fixing the basic underlying problems with data governance within public sector data, how and by when? If they are not, why are they not, and who is?

The CDEI report published today identified in local authorities that, “public consultation can be a ‘nice to have’, as it often involves significant costs where budgets are already limited.” If it’s a position the CDEI does not say is flawed, it may as well pack up and go home. On page 27 it reports, “When asked about their understanding of how their local council is currently using personal data and presented with a list of possible uses, 39% of respondents reported that they do not know how their personal data is being used.” The CDEI should be flagging this with a great big red pen as an indicator of unlawful practice.

The CDEI Report also draws on the GDS Ethical Framework, but that will be forever flawed as long as its own users, not the used, are the principal focus underpinning its aims. It starts with “Define and understand public benefit and user need.” There’s very little about ethics, and it’s much more about “justifying our project”.

The Report did not appear to have asked the attendees what impact they think their processes have on everyday lives, and social justice.

Without fixes in these approaches, we will never be world leading, but will lag behind, because we haven’t built the safe infrastructure necessitated by our vast public administrative data troves. We must end bad data practice, which includes getting right the basic principles on retention, data minimisation and security (all of which would be helped if we started by reducing those ‘vast public administrative data troves’, much of which ranges from poor to abysmal data quality anyway). Start proper governance and oversight procedures. And put in place all the communication channels, tools, policy and training to make telling people how data are used, and fair processing, happen. It is not a ‘nice to have’, but is required in all data processing laws around the world.

Any genuine “barriers” to data use in data protection law are designed as protections for people; the people the public sector, its staff and these arms length bodies are supposed to serve.

Blaming algorithms, blaming lack of clarity in the law, blaming “barriers” is avoidance of one thing. Accountability. Accountability for bad policy, bad data and bad applications of tools is a human responsibility. The systems you apply to human lives affect people, sometimes forever and in the most harmful ways.

What would I love to see led from any of these arms length bodies?

  1. An audit of existing public admin data held, by national and local government, and consistent published registers of databases and algorithms / AI / ML currently in use.
  2. Expose where your data system is nothing more than excel spreadsheets and demand better infrastructure.
  3. Identify the lawful basis for each set of data processes, their earliest records dates and content.
  4. Publish that resulting ROPA and the retention schedule.
  5. Assign accountable owners to databases, tools and the registers.
  6. Sort out how you will communicate with people whose data you unlawfully process to meet the law, or stop processing it.
  7. And above all, publish a timeline for data quality processes and show that you understand how the degradation of data accuracy, quality, and storage limitations all affect the rights and responsibilities in law that change over time, as a result.
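To make the list above concrete, here is a hypothetical sketch of what a single published register entry (covering steps 1 to 5) might hold. Every field name is illustrative, invented for this example rather than drawn from any existing standard or department.

```python
# Hypothetical sketch: one entry in a published register of databases
# and their lawful bases. Field names and values are illustrative only.
register_entry = {
    "database": "Example pupil records database",
    "accountable_owner": "Named senior officer",
    "lawful_basis": "Statutory provision to be identified and cited",
    "earliest_record_date": "2002-01-01",
    "content_summary": ["attainment", "absence", "exclusions"],
    "retention_schedule": "Review and minimise on a published cycle",
    "algorithms_in_use": [],
    "last_reviewed": "2021-02-01",
}

def is_publishable(entry):
    """An entry is only complete when every accountability field is filled."""
    required = ("database", "accountable_owner", "lawful_basis",
                "retention_schedule")
    return all(entry.get(field) for field in required)
```

The design point is step 5 in miniature: no database should appear in the register without a named accountable owner and a cited lawful basis attached to it.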

There is no short cut to doing a good job, only to a bad one.

If organisations and bodies are serious about “good data” use in the UK, they must stop passing the buck and spreading the hype. Let’s get on with what needs fixed.

In the words of Gavin Freeguard, then let’s see how it goes.

“Michal Serzycki” Data Protection Award 2021

It is a privilege to be a joint-recipient in the fourth year of the “Michal Serzycki” Data Protection Award, and I thank the Data Protection Authority in Poland (UODO) for the recognition of work for the benefit of promoting data protection values and the right to privacy.

I appreciate the award in particular as the founder of an NGO, and the indirect acknowledgement of the value of NGOs to be able to contribute to public policy, including openness towards international perspectives, standards, the importance of working together, and our role in holding the actions of state authorities and power to account, under the rule of law.

The award is shared with Mrs Barbara Gradkowska, Director of the Special School and Educational Center in Zamość, whose work in Poland has been central to the initiative, Your Data — Your Concern, an educational Poland-wide programme for schools that is supported and recognized by the UODO. It offers support to teachers in vocational training centres, primary, middle and high schools related to personal data protection and the right to privacy in education.

And it is also shared with Mr Maciej Gawronski, Polish legal advisor and authority in data protection, information technology, cloud computing, cybersecurity, intellectual property and business law.

The UODO has long been a proactive advocate in the schools’ sector in Poland for the protection of children’s data rights, including recent enforcement after finding unlawful the processing of children’s biometric data using fingerprint readers at a school canteen, and ensuring destruction of pupil data obtained unlawfully.

In the rush to remote learning in 2020 in response to school closures in COVID-19, the UODO warmly received our collective international call for action, a letter in which over thirty organisations worldwide called on policy makers, data protection authorities and technology providers, to take action, and encouraged international collaboration to protect children around the world during the rapid adoption of digital educational technologies (“edTech”). The UODO issued statements and a guide on school IT security and data protection.

In September 2020, I worked with their Data Protection Office at a distance, in delivering a seminar for teachers, on remote education.

The award also acknowledges my part in the development of the Guidelines on Children’s Data Protection in an Education Setting adopted in November 2020, working in collaboration with country representatives at the Council of Europe Committee for Convention 108, as well as with observers, and the Committee’s incredible staff.

2020 was a difficult year for people around the world under COVID-19 to uphold human rights and hold the space to push back on encroachment, especially for NGOs, and in community struggles from the Black Lives Matter movement, to environmental action, to UK students on the streets of London protesting algorithmic unfairness. In Poland the direction of travel is to reduce women’s rights in particular. Poland’s ruling Law and Justice (PiS) party has been accused of politicising the constitutional tribunal and using it to push through its own agenda on abortion, and the government appears set on undermining the rule of law, creating a ‘chilling effect’ for judges. The women of Poland are again showing the world what it means, and what it can cost, to lose progress made.

In England at defenddigitalme, we are waiting to hear later this month, what our national Department for Education will do to better protect millions of children’s rights, in the management of national pupil records, after our Data Protection regulator, the ICO’s audit and intervention. Among other sensitive content, the National Pupil Database holds sexual orientation data on almost 3.2 million students’ named records, and religious belief on 3.7 million.

defenddigitalme is a call to action to protect children’s rights to privacy across the education sector in England, and beyond. Data protection has a role to play within the broader rule of law to protect and uphold the right to privacy, to prevent state interference in private and family life, and in the protection of the full range of human rights necessary in a democratic society. Fundamental human rights must be universally protected to foster human flourishing, to protect the personal dignity and freedoms of every individual, and to promote social progress and better standards of life in larger freedoms.


The award was announced at the conference, “Real personal data protection in remote reality,” organized by the Personal Data Protection Office (UODO), as part of the celebration of the 15th Data Protection Day on 28th January 2021, with an award ceremony held on its eve in Warsaw.

Is the Online Harms ‘Dream Ticket’ a British Green Dam?

The legal duty in Online Harms government proposals is still vague.

For some it may sound like the “dream ticket”: a framework of censorship decided by companies, enabled through the IWF and the government in Online Safety laws, and ‘free’ to all. What companies are already doing today in surveillance of all outgoing and *incoming* communications, which is unlawful, made lawful. Literally, the nanny state could decide what content will be blocked, if such software should “absolutely” be pre-installed on all devices for children at point of sale and “…people could run it the other side to measure what people are doing as far as uploading content.”

From Parliamentary discussion it was clear that the government will mandate platforms, “to use automated technology…, including, where proportionate, on private channels,” even when services are encrypted.

No problem, others might say, there’s an app for that. “It doesn’t matter what program the user is typing in, or how it’s encrypted.”

But it was less clear in the consultation outcome updated yesterday, which closed in July 2019 and still says, “we are consulting on definitions of private communications, and what measures should apply to these services.” (4.8)

Might government really be planning to impose or incentivise surveillance on [children’s] mobile phones at the point of sale in the UK? This same ‘dream ticket’ company was the only company mentioned by the Secretary of State for DCMS yesterday. After all, it is feasible. In 2009 Chinese state media reported that the Green Dam Youth Escort service was installed in 20 million computers in internet cafes and schools alone.

If government thinks it would have support for such proposals, it may have overlooked the outrage that people feel about companies prying into our everyday lives. Or it has already forgotten the summer 2020 student protests over the ‘mutant algorithm’.

There is, conversely, already incidental harm, and there are opaque error rates, from the profiling of UK children’s behaviour while their online and offline computer activity is monitored and logged against thousands of words in opaque keyword libraries. School safeguarding services are already routine in England, and are piggybacked by the Prevent programme. Don’t forget that one third of referrals to Prevent come from education and over 70% are not followed through with action. Your child and mine might already be labelled with ‘extremism’, ‘terrorism’, ‘suicide’ or ‘cyberbullying’, or have had their photos taken by the webcam of their device an unlimited number of times, thanks to some of these ‘safeguarding’ software products and services, and the child and parents never know.

Another thing that was not clear yesterday, but will matter, is whether the ‘harm’ of the Online Harms proposals will be measured by intent, or measured by the response to it. What is harm or hate, and what is not, is contested across different groups online, and weaponised, at scale.

The wording of the Law Commission consultation on communications offences, closing on Friday, also matters. It asks about intention to harm a likely audience, where harm is defined as any non-trivial emotional, psychological, or physical harm, but should not require proof of actual harm. Together with any changes on hate crime and on intimate images, this in effect proposes changes to ‘what’ can be said, how, and ‘to whom’, and to what is considered ‘harmful’ or ‘hateful’ conduct. It will undoubtedly have massive implications for the digital environment once all joined up. It matters when ‘culture wars’ online can catch children in the crossfire.

I’ve been thinking about all this against the backdrop of the Bell v Tavistock [2020] EWHC 3274 judgment, with its implications from the consideration of psychological harm, children’s evolving capacity, the right to be heard and their autonomy, a case in which a parent involved reportedly has not even told their own child.

We each have a right to respect for our private life, our family life, our home and our correspondence. Children are rights holders in their own right. Yet it appears the government and current changes in lawmaking may soon interfere with that right in a number of ways, while children are used at the heart of everyone’s defence.

For an interference to be “necessary in a democratic society”, any interference with rights and freedoms should be necessary and proportionate for each individual, not justified by some sort of ‘collective’ harm that permits a rolling, collective interference.

Will the proposed outcomes prevent children from exercising their views or full range of rights, and restrict online participation? There may be a chilling effect on speech. There is in schools. Sadly these effects may well be welcomed by those who believe not only that some rights are more equal than others, but some children, more than others.

We’ll have to wait for more details. As another MP in debate noted yesterday, “The Secretary of State rightly focused on children, but this is about more than children; it is about the very status of our society ….”

The National Data Strategy. Rewiring State power.

The National Data Strategy is not about “the data”. People need to stop thinking of data only as abstract information, or even as personal data, when it comes to national policy. Administrative data is the knowledge about the workings of the interactions between the public and the State, and about us as people. It is the story of selected life events. To the State it is business intelligence. What resources are used, where, by whom, and who costs The Treasury how much? How any government is permitted to govern that shapes our relationship with the State and the nature of the governance we get, of people and of public services. How much power we cede to the State or retain at national, local, and individual levels over our communities and our lives matters. Any change in National Data Strategy is about the rewiring of state power, and we need to understand its terms and conditions very, very carefully.


What government wants

“It’s not to say we don’t see privacy and data protection as not important,” said Phil Earl, Deputy Director at DCMS, in the panel discussion hosted by techUK as part of Birmingham Tech Week, exploring the UK Government’s recently released National Data Strategy.

I sighed so loudly I was glad to be on mute. The first of many big watch-outs for the messaging around the National Data Strategy was already touted in the text, as “the high watermark of data use set during the pandemic.” In response to COVID “a few of the perceived barriers seem to have melted away,” said Earl, who saw this reduced state of data protections as desirable beyond the pandemic. “Can we maintain that level of collaboration and willingness to share data?” he asked.

Data protection laws are at their heart protections for people, not data, and if any government is seeking to reduce those protections for people we should pay attention to messaging very carefully.

This positioning fails to recognise that data protection law is more permissive in exceptional circumstances such as pandemics, with a recognition by default that the tests in law of necessity and proportionality are different from usual, and are time bound to the pandemic.

“What’s the equivalent? How do we empower people to feel that that greater good [felt in the pandemic] outweighs their legitimate concerns about data being shared,” he said. “The whole trust thing is something that must be constantly maintained,” but you may hear between the lines, ‘preferably on our [government] terms.’

The idea that the public is ignorant about data is often repeated and still wrong. The same old mantras resurfaced. If people can make more informed decisions and understand “the benefits”, then the government can influence their judgements, trusting us to “make the decisions that we want them to make [to agree to data re-use].”

If *this* is the government set course (again), then watch out.

What people want

In fact when asked, the majority of people, both those who are willing and those less willing to have data about them reused, generally want the same things: safeguards, opt-in to re-use, restricted distribution, and protections for redress and against misuse strengthened in legislation.

Read Doteveryone’s public attitudes work. Or the Ipsos MORI polls or work by Wellcome. (see below). Or even the care.data summaries.

The red lines in the “Dialogues on Data” report from workshops carried out across different regions of the UK for the 2013 ADRN (about reuse of de-identified linked public admin datasets by qualified researchers in safe settings) remain valid today, in particular in relation to:

  • Creating large databases containing many variables/data from a large number of public sector sources

  • Allowing administrative data to be linked with business data

  • Linking of passively collected administrative data, in particular geo-location data

“All of the above were seen as having potential privacy implications or allowing the possibility of reidentification of individuals within datasets. The other ‘red-line’ for some participants was allowing researchers for private companies to access data, either to deliver a public service or in order to make profit. Trust in private companies’ motivations were low.”

The BT spokesperson on the panel went on to say that their own survey showed 85% of people say their data is important to them, and 75% believe they have too little control.

Mr. Earl was absolutely correct in saying it puts the onus on government to be transparent and show how data will be used. But we hear *nothing* about concrete plans to deliver that. What does that look like? Saying it three times out loud doesn’t make it real.

What government does

Government needs to embrace the fact it can only get data right, if it does the right thing. That includes upholding the law. This includes examining its own purposes and practice.

The Department for Education has been giving away 15 million people’s personal confidential data since 2012 and never told them. They knew this. They chose to ignore it. And on top of that, they didn’t inform people who have been in school since then that Mr Gove changed the law. So now over 21 million people’s pupil records are being given away to companies and other third parties, for use in ways we do not expect, and the data is misused too. In 2015, more secret data sharing began, with the Home Office. And another pilot in 2018 with the DWP. And in 2019, sharing with the police.

Promises on government data policy transparency right now are worth less than zero. What matters now is government actions. Trust will be based on what you do, not what you say. Is the government trustworthy?

After the summary findings published by the ICO of their compulsory audit of the Department for Education,  the question now is what will the Department and government do to address the 139 recommendations for improvement, with over 60% classified as urgent or high priority. Is the government intentional about change?

What will government do?

So I had a question for the panel: Is the government serious about its point in the strategy, 7.1.2 “Our data regime should empower individuals and groups to control and understand how their data is used.”

I didn’t get an answer.

I want to know if the government is prepared to build the necessary infrastructure to enable that understanding and control?

  • Enhance and build the physical infrastructure:
      • access to personal reports of what data is held and how it is used.
      • management controls and communications over reuse [opt-in to meet the necessary permissions of legitimate interests or consent as lawful basis for further data processing, conditions for sensitive data processing, or at very least opt-out to respect objections].
      • secure systems (not just Excel, and WannaCry-resistant)
  • Enable the necessary transparency tools and create demonstrable accountability through registers of algorithms and data sharing with oversight functions and routes for redress.
  • Empower staff with the necessary human skills at all levels in the public sector on the laws around data, skills that do not just consist of a SharePoint page on GDPR: what about privacy law, communications law, and equality and discrimination laws, among others?
  • Empower the public with the controls they want to have their rights respected.
  • Examine toxic policy that drives bad data collection and re-use.

Public admin data is all about people

Coming soon, we were told, is the publication of an Integrated Review, in which ‘data and security’ and other joined-up issues will feature.

A risk of this conflation is seeing the national data strategy as another dry review about data as ‘a thing’, or its management.

It should be about people. The people our public admin data are about. The people who want access to it. The people making the policy decisions. And its human infrastructure. Data governance, the controls on power over the purposes of data reuse, is all about the roles and responsibilities of people, the laws that oversee them, and the human accountability they require.

These NDS strategy missions, and pillars and aims are all about “the data”.

For a national data strategy to be realised and to thrive in all of our wide-ranging societal ‘data’ interests, it cannot be all about data as a commodity. Or all about what government wants. Or seen through the lens of research only. Allow that, and they divide and conquer. It must be founded on building a social contract between government and the public in a digital environment, and on setting the expectations of these multi-purpose relationships, at national and local levels.

For a forward-thinking data strategy truly building something in the long-term public interest, it is about understanding ‘wider public need’. The purpose of ‘the data’ and its strategy is as much about the purpose of the government behind it. Data is personal. But when used to make decisions it is also business intelligence. How does the government make the business of governing work, through data?

Any national data strategy does not sit in a vacuum of other policy and public experience of government either. If Manchester‘s second lockdown funding treatment is seen as setting the expectation of how local needs get trumped by national wants, and of how people’s wishes will be steamrollered, then a national approach will not win support. A long list of bad government engagement over recent months is a poor foundation, and you don’t fix that by talking about “the benefits”.

Will government listen?

Edgenuity, the U.S.-based school assessment system using AI for marking, made the news this summer, when parents found it could be gamed by simply packing essays with all the right keywords; the responses didn’t need to make sense or give accurate answers. To be received well and get a good grade, students were expected simply to tell the system the words ‘it wanted to hear’.

“If the teachers were looking at the responses, they didn’t care,” one student said.
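The mechanism behind that gaming is easy to see. This is a hypothetical sketch of keyword-overlap marking, not Edgenuity’s actual algorithm, with an invented rubric; it shows how a nonsense answer stuffed with the right words scores as well as a genuine one:

```python
# Hypothetical keyword-overlap marker (illustrative only): score an answer
# purely by how many rubric keywords it contains, ignoring sense or accuracy.

RUBRIC_KEYWORDS = {"photosynthesis", "chlorophyll", "sunlight", "glucose", "oxygen"}

def keyword_score(answer: str) -> float:
    """Return the fraction of rubric keywords present in the answer."""
    words = set(answer.lower().replace(".", " ").replace(",", " ").split())
    return len(RUBRIC_KEYWORDS & words) / len(RUBRIC_KEYWORDS)

genuine = "Photosynthesis uses sunlight and chlorophyll to make glucose and oxygen."
stuffed = "oxygen glucose sunlight chlorophyll photosynthesis"  # word salad

print(keyword_score(genuine))  # 1.0
print(keyword_score(stuffed))  # 1.0 - the nonsense answer scores just as well
```

Any marker that reduces a response to a bag of expected words has this weakness by construction; no understanding is measured, so none is needed to get full marks.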

Will the government actually look at responses to the National Data Strategy and care about getting it right? Not just care about getting what they want? Or about what commercial actors will do with it?

Government wanted to change the law on education admin data in 2012, did so, and got it wrong. Education data alone is a sin bin of bad habits and a complete lack of public and professional engagement, before even starting to address data quality and accuracy, and backwards-looking policy built on bad historic data.

“The Commercial department do not have appropriate controls in place to protect personal data being processed on behalf of the DfE by data processors.” (ICO audit of the DfE, 2020)

Gambling companies ended up misusing learner records.

Government wanted data from one Department to be collected for the purposes of another and got it wrong. People boycotted the collection until it was killed off.

Government changed the law on Higher Education in 2017 and got it wrong. Now third parties pass around named equality monitoring records, such as religion, sexual orientation, and disability, and the data is stored forever on named national pupil records. The Department for Education (DfE) now holds sexual orientation data on almost 3.2 million people, and religious belief data on 3.7 million.

What could possibly go wrong?

If the current path is any indicator, this government is little interested in local power, or people, and certainly not in our human rights. They are interested in centralised power. We should be very cautious about giving that all away to the State on its own terms.


 

The national data strategy consultation is open for submissions until

Samples of public engagement on data reuse

 

 

What happens when a Regulator doesn’t regulate

The news is full of the exam Regulator Ofqual right now, since yesterday’s A-Level results came out. In the outcry over the clear algorithmic injustice and inexplicable data-driven results, the data regulator, the Information Commissioner (ICO) remains silent.**

I have been told the Regulators worked together from early on in the process. So did this collaboration help or hinder the thousands of students and children whose rights the Regulators are supposed to work to protect?

I have my doubts, and here is why.

My child’s named national school records

On April 29, 2015 I wrote to the Department for Education (DfE) to ask for a copy of the data that they held about my eldest child in the National Pupil Database (NPD). A so-called Subject Access Request. The DfE responded on 12 May 2015 and refused, claiming an exemption, section 33(4) of the Data Protection Act 1998. In effect saying it was a research-only, not operational database.

Despite being a parent of three children in state education in England, there was no clear information available to me what the government held in this database about my children. Building on what others in civil society had done before, I began research into what data was held. From where it was sourced and how often it was collected. Who the DfE gave the data to. For what purposes. How long it was kept. And I discovered a growing database of over 20 million individuals, of identifying and sensitive personal data, that is given away to commercial companies, charities, think tanks and press without suppression of small numbers and is never destroyed.
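Suppression of small numbers, the safeguard described above as missing from these releases, is a simple, well-established disclosure control. A minimal illustrative sketch (the table is invented; a threshold around 5 is a common convention in UK official statistics, though the exact rule varies by publisher):

```python
# Illustrative small-number suppression: counts below a threshold are
# replaced with a placeholder so that rare characteristics in a released
# table cannot be used to single out individuals.

def suppress(counts: dict[str, int], threshold: int = 5) -> dict[str, object]:
    """Replace any count below `threshold` with the placeholder '<5'."""
    return {k: (v if v >= threshold else "<5") for k, v in counts.items()}

table = {"School A": 120, "School B": 3, "School C": 47}
print(suppress(table))  # {'School A': 120, 'School B': '<5', 'School C': 47}
```

Real statistical disclosure control goes further (secondary suppression, rounding), but even this basic step is what the raw, identifying extracts described here lacked.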

My children’s confidential records that I entrusted to school, and much more information they create that I never see, are given away for commercial purposes, and I don’t get told which companies have it, why, or have any control over it. What about my Right to Object? I had imagined a school would share only statistics with third parties without parents’ knowledge or being asked. Well, that’s nothing compared with what the Department does with it all next.

My 2015 complaint to the ICO

On October 6, 2015 I made a complaint to the Information Commissioner’s Office (the ICO). Admittedly, I was more naïve and less well informed than I am today, but the facts were clear.

Their response in April 2016, was to accept the DfE position, “at the stage at which this data forms part of its evidence base for certain purposes, it has been anonymised, aggregated and is statistical in nature.  Therefore, for the purposes of the DPA, at the stage at which the DfE use NPD data for such purposes, it no longer constitutes personal data in any event.”

The ICO was “satisfied that the DfE met the criteria needed to rely on the exemption contained at section 33(4) of the DPA” and was justified in not fulfilling my request.

And “in relation to your concerns about the NPD and the adequacy of the privacy notice provided by the DfE, in broad terms, we consider it likely that this complies with the relevant data protection principles of the DPA.”

The ICO claimed “the processing does not cause any substantial damage or distress to individuals and that any results of the research/statistics are not made available in a form which identifies data subjects.”

The ICO kept its eyes wide shut

In secret in July 2015, the DfE had started to supply the Home Office with the matched personal details of children from the NPD, including home address. The Home Office requested this for purposes including furthering the Hostile Environment (15.1.2), which I only discovered in detail one year to the day after my ICO complaint, on October 6, 2016. The rest is public.

Had the ICO investigated the uses of national pupil data a year earlier in 2015-16, might it have prevented this ongoing gross misuse of children’s personal data and public and professional trust?

The ICO made no public statement, despite widespread media coverage throughout 2016 and legal action on the expansion of the data and the intended use of children’s nationality and country-of-birth data.

Identifying and sensitive, not aggregated and statistical

Since 2012 the DfE has given away the sensitive and identifying personal confidential data of over 23 million people without their knowledge, in over 1600 unique requests, that are not anonymous.

In 2015 there was no Data Protection Impact Assessment. The Department had done zero audits of data users after sending them identifying pupil data. There was no ethics process or paperwork.

Today in England pupil data are less protected than across the rest of the UK. The NPD is being used as a source for creating a parent polling panel. Onward data sharing is opaque, but some companies continue to receive named data and have done so for many years. It is a linked dataset with over 25 different collections, and includes highly sensitive children’s social care data. It is frequently expanded, its content grows increasingly sensitive in scope, it facilitates linkage to external datasets including children at risk and for policing, and it has been used in criminology interventions which did harm and told children they were involved because they were “the worst kids.” Data has been given to journalists and a tutoring company. It has been sought after by Americans, even if not (yet?) given to them.

Is the ICO a help or hindrance to protect children and young people’s data rights?

Five years ago the ICO told me the named records in the national pupil database were not personal data. Five years on, my legal team and I await a final regulatory response from the ICO that I hope will protect the human rights of my children; of the millions currently in education in England whose data are actively collected; of the millions aged 18-37 affected, whose data were collected 1996-2012 but who don’t know; and of those to come.

It has come at significant personal and legal costs and I welcome any support. It must be fixed. The question is whether the information rights Regulator is a help or hindrance?

If the ICO is working with organisations that have broken the law, or that plan dubious or unethical data processing, why is the Regulator collaborating to enable processing and showing them how to smooth off the edges rather than preventing harm and protecting rights? Can the ICO be both a friend and independent enforcer?

Why does it decline to take up complaints, made on behalf of parents, about issues that similarly affect millions of children in the UK and worldwide, such as companies that claim to use AI on their websites but tell the ICO it’s just computing really? Or why has it given a green light to the reuse of religion and ethnicity data from schools without consent, telling the organisation it can later process the data to make it anonymous, and keep all the personal data indefinitely?

I am angry at their inaction, but not as angry as the thousands of children and their parents who know they have been let down by data-led decisions this month that, to them, are inexplicable.

Thousands of children caught up in the algorithmic A-Level debacle, and those who will be in next week’s GCSE processes, believe they have been unfairly treated through the use of their personal data and have no clear route of redress. Where is the voice of the Regulator? What harm should it have prevented, but didn’t, through inaction?

What is the point of all the press and posturing on an Age Appropriate Code of Practice which goes beyond the scope of data protection, if the ICO cannot or will not enforce on its core remit or support the public it is supposed to serve?


Update: This post was published at midday on Friday August 14. In the late afternoon the ICO did post a short statement on the A-levels crisis, and also wrote to me regarding one of these cases via email.

Damage that may last a generation.

Hosted by the Mental Health Foundation, it’s Mental Health Awareness Week until 24th May, 2020. The theme for 2020 is ‘kindness’.

So let’s not comment on the former Education Ministers and MPs, the great-and-the-good and the-recently-resigned, involved in the Mail’s continued hatchet job on teachers. They probably believe that they are standing up for vulnerable children when they talk about the “damage that may last a generation“. Yet the evidence of much of their voting, and policy design to-date, suggests it’s much more about getting people back to work.

Of course there are massive implications for children in families unable to work or living with the stress of financial insecurity on top of limited home schooling. But policy makers should be honest about the return to school as an economic lever, not use children’s vulnerability to pressure professionals to return to full-school early, or make up statistics to up the stakes.

The rush to get back to full-school for the youngest of primary age pupils has been met with understandable resistance, and too few practical facts. Going back to a school in COVID-19 measures for very young children, will take tonnes of adjustment, to the virus, to seeing friends they cannot properly play with, to grief and stress.

When it comes to COVID-19 risk, many countries with similar population density to the UK locked down earlier and tighter, and now have lower rates of community transmission than we do. Or compare a country that didn’t, Sweden, which has a population density of 24 people per km². The population density of the United Kingdom is 274 people per square kilometre. In Italy, with 201 inhabitants per square kilometre, you needed a permission slip to leave home.

And that’s leaving aside the unknowns on COVID-19 immunity, or identifying it, and the lack of a testing offer for over a million children under 5, the very group expected to be the first to return to full-school.

Children have rights to education, and to life, survival and development. But the blanket target groups and target date don’t appear to take the Best Interests of the Child, for each child, into account at all. ‘Won’t someone think of the children?’ may never have been more apt.

Parenting while poor is highly political

What’s the messaging in the debate, even leaving media extremes aside?

The sweeping assumption by many commentators that ‘the poorest children will have learned nothing‘ (BBC Newsnight, May 19) is unfair. But its blind acceptance as fact, a politicisation of parenting while poor conflated with poor parenting, enables the claimed concern for their vulnerability to pass without question.

Many of these most vulnerable children were not receiving full-time education *before* the pandemic, but look at how the story is told.

It would be more honest, in discussion or in publishing ‘statistics’ around the growing gap expected if children are out of school, to consider what the ‘excess’ gap will be and why. (Just like measuring excess deaths, not only those people who died and had been tested for COVID-19.) Thousands of vulnerable children were out of school already, due to ‘budget decisions that had left local authorities unable to fulfil their legal obligation to provide education.’

Pupil Referral Units were labelled “a scandal” in 2012, and only last year the constant “gangs at the gates” narrative was highly political.

“The St Giles Trust research provided more soundbites. Pupils involved in “county lines” are in pupil referral units (PRUs), often doing only an hour each day, and rarely returning into mainstream education.” (Steve Howell, Schools Week)

Nearly ten years on, there is still a lack of adequate support for children in Alternative Provision and a destructive narrative of “us versus them”.

Source: @sarahkendzior

The value of being in school

Schools have remained open for children of key workers and for more than half a million pupils labelled as ‘vulnerable’, which includes those classified as “children in need” as well as 270,000 children with an education, health and care (EHC) plan for special educational needs. Not all of those are ‘at risk’ of domestic violence, abuse or neglect. The reasons why there is low turnout tend to be conflated.

Assumptions abound about the importance of formal education, and about school being the best place for very young children in Early Years (age 2-5) to be at all, despite conflicting UK evidence, thin on the ground though it is. Research for the NFER [the same organisation running the upcoming Baseline Test of four-year-olds, still due to begin this year] (Sharp, 2002) found:

“there would appear to be no compelling educational rationale for a statutory school age of five or for the practice of admitting four-year-olds to school reception classes.” And “a late start appears to have no adverse effect on children’s progress.”

Later research from 2008, from the IoE, Research Report No. DCSF-RR061 (Sylva et al, 2008), commissioned before the then ‘new’ UK Government took office in 2010, suggested better outcomes for children who are in excellent Early Years provision, but also pointed out that the most vulnerable are more often not those in the best provision.

“quality appears to be especially important for disadvantaged groups.”

What will provision quality be like, under Coronavirus measures? How much stress-free space and time for learning will be left at all?

The questions we should be asking are a) What has been learned for the second wave and b) Assume by May 2021 nothing changes. What would ideal schooling look like, and how do we get there?

Attainment is not the only gap

While it is not compulsory in England to be in any form of education, including home education, until your fifth birthday, most children start school at age four and turn five in the course of the year. It is one of the youngest starts in Europe. Many hundreds of thousands of children in the UK start formal education even younger, from age two or three. Yet is it truly better for children? We are way down the PISA attainment scores, and comparable regional measures. There has been little change in those outcomes in 13 years, except to find that our children are measured as being progressively less happy.

“As Education Datalab points out, the PISA 2018 cohort started school around 2008, so their period at school not only lines up with the age of austerity and government cuts, but with the “significant reforms” to GCSEs introduced by Michael Gove while he was Education Secretary.”  [source: Schools Week, 2019]

There’s no doubt that some of the harmful economic effects of Brexit will be attributed to the effects of the pandemic. Similarly, many of the outcomes of ten years of policy that have increased  children’s vulnerability and attainment gap, pre-COVID-19, will no doubt be conflated with harms from this crisis in the next few years.

The risk of the acceptance of misattributing this gap in outcomes, is a willingness to adopt misguided solutions, and deny accountability.

Children’s vulnerability

Many experts in children’s needs have been in their jobs much longer than most MPs, and have told them for years about the harm their policies are doing to the very children those voices now claim to want to protect. Will the MPs look at that evidence and act on it?

More than a third of babies are living below the poverty line in the UK. The common thread in many [UK] families’ lives, as Helen Barnard, deputy director for policy and partnerships at the Joseph Rowntree Foundation, described in 2019, is “a rising tide of work poverty sweeping across the country.” Now the Coronavirus is hitting those families harder too. The ONS found that in England the death rate in the most deprived areas is 118% higher than in the least deprived.

Charities speaking out this week said that in the decade since 2010, local authority spending on early intervention services dropped by 46%, while spending on late intervention rose from 58% to 78% of spending on children and young people’s services over the same period.

If those advocating for a return to school, for a month before the summer, really want to reduce children’s vulnerability, they might sort out CAMHS to support the return to school, and address those areas in which government must first do no harm. Fix the things that increase the “damage that may last a generation“.


Case studies in damage that may last

Adoption and Children (Coronavirus) (Amendment) Regulations 2020

Source: Children’s Commissioner (April 2020)

“These regulations make significant temporary changes to the protections given in law to some of the most vulnerable children in the country – those living in care.” “I would like to see all the regulations revoked, as I do not believe that there is sufficient justification to introduce them. This crisis must not remove protections from extremely vulnerable children, particularly as they are even more vulnerable at this time. As an urgent priority it is essential that the most concerning changes detailed above are reversed.”

CAMHS: Mental health support

Source: Local Government Association CAMHS Facts and Figures

“Specialist services are turning away one in four of the children referred to them by their GPs or teachers for treatment. More than 338,000 children were referred to CAMHS in 2017, but less than a third received treatment within the year. Around 75 per cent of young people experiencing a mental health problem are forced to wait so long their condition gets worse or are unable to access any treatment at all.”

“Only 6.7 per cent of mental health spending goes to children and adolescent mental health services (CAMHS). Government funding for the Early Intervention Grant has been cut by almost £500 million since 2013. It is projected to drop by a further £183 million by 2020.”

“Public health funding, which funds school nurses and public mental health services, has been reduced by £600 million from 2015/16 to 2019/20.”

Child benefit two-child limit

Source: Child Poverty Action Group, May 5

“You could not design a policy better to increase child poverty than this one.” Source: House of Commons Work and Pensions Committee, The two-child limit: Third Report of Session 2019 (HC 51, PDF, 1 MB)

“Around sixty thousand families forced to claim universal credit since mid-March because of COVID-19 will discover that they will not get the support their family needs because of the controversial ‘two-child policy’.”

Housing benefit

Source: the Poverty and Social Exclusion in the United Kingdom research project funded by the Economic and Social Research Council.

“The cuts [introduced from 2010 to the 2012 budget] in housing benefit will adversely affect some of the most disadvantaged groups in society and are likely to lead to an increase in homelessness, warns the homeless charity Crisis.”

Legal Aid for all children

Source: The Children’s Society, Cut Off From Justice, 2017

“The enactment of the Legal Aid, Sentencing and Punishment of Offenders Act 2012 (LASPO) has had widespread consequences for the provision of legal aid in the UK. One key feature of the new scheme, of particular importance to The Children’s Society, was the changes made to the eligibility criteria around legal aid for immigration cases. These changes saw unaccompanied and separated children removed from scope for legal aid unless their claim is for asylum, or if they have been identified as victims of child trafficking.”

“To fulfill its obligations under the UNCRC, the Government should reinstate legal aid for all unaccompanied and separated migrant children in matters of immigration by bringing it back within ‘scope’ under the Legal Aid, Sentencing and Punishment of Offenders Act 2012. Separated and unaccompanied children are super-vulnerable.”

Library services

Source: CIPFA’s annual library survey 2018

“The number of public libraries and paid staff has fallen every year since 2010, with spending reduced by 12% in Britain in the last four years.” “We can view libraries as a bit of a canary in the coal mine for what is happening across the local government sector…” “There really needs to be some honest conversations about the direction of travel of our councils and what their role is, as the funding gap will continue to exacerbate these issues.”

No recourse to public funds: FSM and more

Source: NRPF Network

“No recourse to public funds (NRPF) is a condition imposed on someone due to their immigration status. Section 115 of the Immigration and Asylum Act 1999 states that a person will have ‘no recourse to public funds’ if they are ‘subject to immigration control’.”

“Children only get the opportunity to apply for free school meals if their parents already receive certain benefits. This means that families who cannot access these benefits – because they have what is known as “no recourse to public funds” as part of their immigration status – are left out from free school meal provision in England.”

Sure Start

Source: Institute for Fiscal Studies (2019)

“the reduction in hospitalisations at ages 5–11 saves the NHS approximately £5 million, about 0.4% of average annual spending on Sure Start. But the types of hospitalisations avoided – especially those for injuries – also have big lifetime costs both for the individual and the public purse”.

Youth Services

Source: Barnardo’s (2019) New research draws link between youth service cuts and rising knife crime.

“Figures obtained by the All-Party Parliamentary Group (APPG) on Knife Crime show the average council has cut real-terms spending on youth services by 40% over the past three years. Some local authorities have reduced their spending – which funds services such as youth clubs and youth workers – by 91%.”

Barnardo’s Chief Executive Javed Khan said:

“These figures are alarming but sadly unsurprising. Taking away youth workers and safe spaces in the community contributes to a ‘poverty of hope’ among young people who see little or no chance of a positive future.”

A fresh start for edtech? Maybe. But I wouldn’t start from here.

In 1924 the Hibbert Journal published what is accepted as the first printed copy of a well-known joke.

A genial Irishman, cutting peat in the wilds of Connemara, was once asked by a pedestrian Englishman to direct him on his way to Letterfrack. With the wonted enthusiasm of his race the Irishman flung himself into the problem and, taking the wayfarer to the top of a hill commanding a wide prospect of bogs, lakes, and mountains, proceeded to give him, with more eloquence than precision, a copious account of the route to be taken. He then concluded as follows: ‘Tis the divil’s own country, sorr, to find your way in. But a gintleman with a face like your honour’s can’t miss the road; though, if it was meself that was going to Letterfrack, faith, I wouldn’t start from here.’

Ty Goddard asked some sensible questions in TES on April 4 on the UK edTech strategy, under the overarching question, ‘A fresh start for edtech? Maybe. But the road is bumpy.’

We’d hope so, since he’s on the DfE edTech board and aims “to accelerate the edtech sector in Britain and globally.”

“The questions now being asked are whether you can protect learning at a time of national emergency? Can you truly connect educators working from home with their pupils?”

and he rightly noted that,

“One problem schools are now attempting to overcome is that many lack the infrastructure, experience and training to use digital resources to support a wholesale move to online teaching at short notice.”

He calls for “bold investment and co-ordination across Whitehall led by Downing Street to really set a sprint towards super-fast connectivity to schools, pupils’ homes and investment in actual devices for students. The Department for Education, too, has done much to think through our recent national edtech strategy – now it needs to own and explain it.”

But ‘own and explain it’ is the same problematic starting point that care.data had in the NHS in 2014. And we know how that went.

The edTech demands and drive for the UK are not a communications issue. Nor are they simply problems of infrastructure, or the age-old idea of shipping suitable tech at scale. The ‘fresh start’ isn’t going to be what anyone wants, least of all the edTech evangelists, if we start from where they are.

Demonstrating certain programmes, platforms, and products to promote to others and drive adoption is ‘the divil’s own country’.

The UK edTech strategy in effect avoided online learning; the reasons for that were not made public, but were likely well founded. Such products are mostly unevidenced, and what research is available often comes from the companies themselves, or from their partners, promoter think tanks, and related or self-interested bodies.

I’ve not seen anyone yet talk about the disadvantage and deprivation caused by not issuing standard course curriculum textbooks to every child. Why on earth can secondary schools not afford to give each child a textbook to take home? A darn sight cheaper than tech, independent of data costs, and a guide to exactly what the exams will demand. Should we not seek to champion the most appropriate and equitable learning solutions in addition to, rather than exclusively, the digital ones? The GCSE children I support(ed) in foreign languages each improved once they had written materials. Getting out Chromebooks, by contrast, simply interfered in the process and wasted valuable classroom time.

Technology can deliver vital communications at speed and scale. It can support admin, expand learning, and level the playing field through accessible tools. But done wrongly, it makes things worse than doing without.

Its procurement must assess any potential harmful consequences and safeguard against them, and not accept short term benefits, at the cost of long term harm. It should be safe, fair, and transparent.

“Responsible technology is no longer a nice thing to do to look good, it’s becoming a fundamental pillar of corporate business models. In a post-Cambridge Analytica world, consumers are demanding better technology and more transparency. Companies that do create those services are the ones that will have a better, brighter future.”

Kriti Sharma, VP of AI, Sage, (Doteveryone 2019 event, Responsible Technology)

To date, the hype of ‘edTech’ achievement in the classroom far outweighs the evidence of delivery. Neil Selwyn, Professor in the Faculty of Education, Monash University, Australia, writing in the Impact magazine of the Chartered College in January 2019, summed up:

“the impacts of technology use on teaching and learning remain uncertain. Andreas Schleicher – the OECD’s director of education – caused some upset in 2015 when suggesting that ICT has negligible impact on classrooms. Yet he was simply voicing what many teachers have long known: good technology use in education is very tricky to pin down.”

That won’t stop edTech being a mainstay of the UK export strategy post-Brexit, whenever that may now be. But let’s be very clear: if the Department wants to be a world leader it shouldn’t promote products whose founders were last most notably interviewing fellow students online about their porn preferences, or who are based in offshore organisations with very odd financial structures. Do your due diligence. Work with reputable people and organisations, and build a trustworthy network of trustworthy products framed by the rule of law, rights-respecting and appropriate to children. But don’t start with the products.

Above all, build a strategy for education, for administrative support, for respecting rights, and for teaching, in which tools that may or may not be technology-based add value; but don’t start with product promotion.

To date the aims are to serve two masters: our children’s education, and the UK edTech export strategy. You can serve both if you’re prepared to do the proper groundwork, but that groundwork is lacking right now. What is certain is that if you get it wrong for UK children, the export strategy will inevitably fail.

Covid19 must not be misused to direct our national edTech strategy. ‘I wouldn’t start from here’ isn’t a joke, it’s a national call for change.

Here’s where, why, and how to start instead.

1. The national edTech strategy board should start by demonstrating what it wants to see from others, with full transparency of its members, aims, terms of reference, partners and meeting minutes. There should be no need to use FOI requests to ask for them; far more sensitive subjects operate in the open. It unfortunately emulates other DfE strategy, and the UK edTech network, which has an in-crowd and long-standing controlling members. Both would be the richer for transparency and openness.

2. Stop bigging up the ‘Big Three’ and doing their market monopolisation for them, unless you want people to see you as simply promoting your friends’-on-the-board/foundation/ethics-committee’s products. Yes, “many [educational settings] lack the infrastructure”, but that should never mean encouraging ownership and delivery by only closed commercial partners. That is the route to losing control of your state education curriculum, staff training and (e)quality, its delivery, risk management, data, and cost control.

3. Start with designing for fairness in public sector systems. Minimum acceptable ethical standards could be framed around, for example, accessibility, design, and restrictions on commercial exploitation and in-product advertising. This needs to be in place first, before fitting products ‘on top’ of an existing unfair and imbalanced system, to avoid embedding disadvantage and the commodification of children in education even further.

5. Accessibility and Internet access are a social justice issue. Again, as we’ve argued at defenddigitalme for some time, these come *before* you promote products on top of the delivery systems:

  • Accessibility standards for all products used in state education should be defined and made compulsory in procurement processes, to ensure access for all and reduce digital exclusion.
  • All schools must be able to connect to high-speed broadband services to ensure equality of access and participation in the educational, economic, cultural and social opportunities of the world wide web.
  • Ensure a substantial improvement in support available to public and school library networks. CILIP has pointed to CIPFA figures of a net reduction of 178 libraries in England between 2009-10 and 2014-15.

6. Core national education infrastructure must be put on the national risk register, as we’ve argued previously at defenddigitalme (see 6.6). Dependencies such as MS Office 365, major cashless payment systems, and Google for Education all need to be assessed as part of planning for both regular and exceptional delivery of education. We currently operate in the dark. And it should be unthinkable that companies get seats at the national UK edTech strategy table without full transparency over questions on their practices, policy and meeting the rule of law.

7. Shift the power balance back to schools and families, where they can trust an approved procurement route, and children and legal guardians can trust school staff to work only with suppliers that are not overstepping the boundaries of lawful processing. Incorporate (1) Recommendation CM/Rec(2018)7 of the Committee of Ministers to member States on Guidelines to respect, protect and fulfil the rights of the child in the digital environment, and (2) respect the UN General comment No. 16 (2013) on State obligations regarding the impact of the business sector on children’s rights, across the education and wider public sector.

8. Start with teacher training. Why on earth is the national strategy all about products, when it should be starting with people?

  • Introduce data protection and pupil privacy into basic teacher training, to support a rights-respecting environment in policy and practice, using edTech and broader data processing, to give staff the clarity, consistency and confidence in applying the high standards they need.
  • Ensure ongoing training is available and accessible to all staff for continuous professional development.
  • A focus on people, not products, will deliver the fundamental basics needed for good tech use.

9. Safe data by design and default. I’m tired of hearing from CEOs of companies that claim to be social entrepreneurs or non-profits, or from teachers who’ve designed apps, how well intentioned their products are. Show me instead. Meet the requirements of the rule of law.

  • Local systems must stop shipping out (often sensitive) pupil data at scale and speed to companies, and instead stay in control of terms and conditions, data purposes, and ban product development uses, for example.
  • Companies must stop using pupil data for their own purposes for profit, or to make inferences about autism or dyslexia, for example; if that’s not the stated product aim, it’s likely unlawful.
  • Stop national pupil data distribution for third-party reuse. Start safe access instead. And get the Home Office out of education.
  • Establish fair and independent oversight mechanisms of national pupil data, so that transparency and trust are consistently maintained across the public sector, and throughout the chain of data use, from collection, to the end of its life cycle, including annual data usage reports for each child.

10. We need a law that works for children’s rights. Develop a legislative framework for the fair use of a child’s digital footprint from the classroom for direct educational and administrative purposes at local level, including commercial acceptable use policies. Build the national edTech strategy on a rights-based framework and a lawful basis in an Education and Privacy Act. Without this, you are building on sand.

If schools close, what happens to children who need free school meals?

Here are some collated questions, views and ideas from teachers and eduTwitter, and my thoughts on what could be done by government and schools. What is missing? What else is possible?

[Last edit: March 31, 11am, working document*, input welcome].

*Today’s guidance states a new policy position

Our school is open over the Easter holidays and our food supplier is able to continue to provide meals for children eligible for free school meals who are not in school. Is that allowed?

“Whilst the vouchers are for term time only, if there is a local arrangement to supply food that the school and the supplier intend to continue over this period then that can be agreed and managed locally. This would need to be manageable within schools’ existing resources, as there will not be additional funding available for this purpose.”

This is unacceptable. At our tiny rural primary school, parents have donated hundreds of pounds of their own money in the last month to feed local families’ children and to support the school with its extra costs. This is unsustainable, as many of them are now themselves out of work or on reduced pay. Ministers do not appear to understand the gravity of the situation.

Unless the government scraps FSM eligibility criteria (as set out in the actions for government below) and allows schools to order the vouchers families need, rather than only for those children who meet eligibility test criteria, children will go hungry. Schools, already starved of funds, will feed them because they must, through volunteer support where they can, but will have to do so at their own expense.

This is wrong and must be fixed. The virus and its economic effects on millions of families do not respect a two-week school holiday. Children in families with no recourse to public funds have nothing, and their parents now have no work, or will have to go out to work to feed their children and jeopardise their own, their families’, and the community’s public health, because the system puts them in an impossible position.

The well-documented five-week delays in Universal Credit applications, which are rising steeply, will mean children have nothing for five weeks, although their poverty is clear, while the eligibility ‘criteria’ are met in the system.

Government must scrap eligibility tests and criteria and fund schools for every FSM they provide to any child in need, at any time.

Previous question asked:
How will the DfE know how much money a school needs in order to meet growing demand for FSM without knowing how many children at each school need FSM?

Suggested answer:
They won’t. There will be an inevitable lag. The DfE must offer schools funding as demand grows, and allow them to plan securely. Schools must be able to offer families a way to indicate need, and be able to meet it, even if families are not ‘eligible’ for FSM.

Assumptions:

(1) The number of children in need of an FSM will grow over the next few weeks and months.
(2) The school census is the mechanism for telling the DfE how many children are FSM eligible, and it is not taken again until 21 May.
(3) The January 2021 school census is the next count of FSM-eligible children to be used as the basis for pupil premium school funding.

Public Health England has updated its guidance for schools today. As school closures at scale look increasingly likely, many in civil society have called on the Government to offer cash measures to ensure that children do not go hungry.

Health and education are both devolved issues. Who takes leadership here? It is also a question of interaction with DWP.

About 1.5 million children across the UK, in families living on very low incomes, are currently eligible for free school meals. The precarious nature of many parents’ employment in the gig economy and service industry will push that number higher due to the economic and health effects of the virus. Children must not face barriers to accessing food and support.

Child Poverty Action Group is calling on the government to match the support it is providing to small business and boost the income of struggling families with children by increasing child benefit by £10 per week for the duration of the pandemic response, for example. This is in addition to and not instead of the actions needed on FSM. This should be step zero for the government to action.

Now is not a time for eligibility tests, conditionality or exclusion about feeding children.

    • What are the implications for eligibility of the Budget 2020 changes in welfare criteria and coronavirus support measures?
    • How can children who become newly eligible find out that they are, and access the support available to them, if out of school? Who is responsible for approvals, and for communications between families and schools if schools are closed?
    • Many families will now be staying at home for all meals, without access to meals at work, in canteens, or staff discounts. Where supermarket shelves are empty, an increase in the number of people needing to be fed at home may put additional strain on families’ supplies and budgets.

We already know that while 1.1 million children in English primary and secondary schools were eligible for and claiming free school meals, there were between 2.2 million and 4.1 million children living in poverty in 2016/17, depending on the measure used. [Source: The Children’s Society.]

Table numbers* are estimates, as counts may have been taken on different dates, and eligibility criteria vary by location.

The Government needs to do everything within its power to mitigate the effects of Coronavirus on children’s nutrition, with a sustainable solution beyond the short term. School staff, across Twitter at least, seem to have plenty of ad hoc ideas, but there is no public guidance from the Department at the time of writing. Local areas need to be empowered to support their own families based on local needs and knowledge.

Some schools are already closing. Some parents are withdrawing children as a precautionary measure. All may need support.


A. Ten things government could do quickly

1. Appoint a dedicated Local Authority central contact (telephone and email) for (a) families and (b) schools separately — local knowledge is needed to answer questions and offer support. (Note challenge D2)

2. Remove eligibility and conditionality requirements to allow all children to access FSM based on need, not current criteria. This change would remove any questions or confusion over ‘do I qualify?’, especially for families newly claiming welfare payments as part of coronavirus support measures. The government may simply need to treble the FSM allocation, so schools can help their wider community, including children not classed as eligible but in need.

3. Make funding available now and quick to access for:

  • Breakfast club bags
  • All FSM eligible children (2-18), including infants
  • meeting need at aggregated, not individual level.

4. Emergency funding must be made accessible and quick to claim for families who are going to slide into poverty and become FSM eligible, but may not yet be able to demonstrate Universal Credit eligibility, for example. (Delays in UC must not delay getting FSM to a child.) Schools must have discretion based on need.

5. Empower local schools to decide how best to distribute this – as cash transfers, emergency feeding programmes, vouchers, or otherwise.

6. *Unlink FSM funding, eligibility, and individual-level pupil premium (PP) registration. [This may be a longer-term issue that can be ignored for now, if not counted until the January 2021 census.] Clarify any short-term and further implications. There may be interconnected systems and implications for algorithms (at LA level) of PP registration. Schools will need to know whether they must register pupils with PP status at an individual level, or can simply meet pupils’ FSM needs.

7. Introduce a business rate relief on state schools, as afforded to private schools operating as charities.

8. The intention of any top-down imposed closures and these changes will need to be made very clear, to set staff, families and suppliers’ expectations for the potential time periods involved, and to allow school staff to plan capacity and funding accordingly as best they can. (Flatten the curve? Slow the spread? etc.)

9. Scrap the next 21 May 2020 school census day “FSM meals taken” count.

10. Give schools an extra supplies fund with flexibility, including for unexpected additional hygiene costs and temporary staff.

And don’t forget step zero, in addition to FSM needs. Many families are soon going to be in dire straits as services and shops stop paying staff. Child Poverty Action Group is calling on the government to match the support it is providing to small business and boost the income of struggling families with children by increasing child benefit by £10 per week for the duration of the pandemic response.


B. Things schools could do

1. Appoint a dedicated school contact for FSM questions and support, for (a) families and (b) other organisations who may want to refer or reach out (telephone and email), with an allocated back-up chain in case of illness — local knowledge is needed to answer questions and offer support.

2. ‘Cash transfers direct to individuals or households are the most effective tool in order to aid families to weather the storm (not vouchers for food aid, or financial or in-kind support for food aid providers including lunch clubs)’ [Letter to Rishi Sunak MP from civil society, March 12, 2020] (Recommendation from multiple civil society orgs/charities.)

3. Schools stay open on a skeleton schedule as meal collection points, distributing meals from usual suppliers (cold alternatives). Schools are best placed to define and decide for themselves what this looks like.

4. For [rural] children on school bus routes who cannot access school, or for individuals whose SEND special transport stops, could the buses continue to run and deliver meals to bus stop collection points (routes may need re-timetabling, contingent on safe staffing)?

5. Some schools are looking at supermarket vouchers. They will need support to transfer funding from school meal suppliers if so. The least disruptive model would keep existing provision from current contracted suppliers. Flexibility is a must.

6. Other schools are preparing food packages as a contingency to safeguard children who would not be able to access FSMs in the possible event of any future closure.

7. Contingency planning may be needed where schools plan to provide food, not cash transfers. (a) Self-isolation and (b) sickness may prevent or disincentivise families from receiving physical food transfers. Schools need to plan for the case where physical food transfers are no longer feasible due to (a) or (b).

8. Recognise that other partner organisations (churches, food banks, local charities, youth groups) may themselves have reduced capacity, and this may change over time. Self-isolation and sickness may reduce staffing. Contingency thinking is needed.


C. What is missing and questions?

    1. Contracts between schools and suppliers
      • Force majeure termination rights?
      • Safeguarding supplies: can suppliers get guaranteed/prioritised food deliveries?
      • Suppliers have staff to pay – will they be paid for services they don’t provide if schools close?
    2. Delivery
      • What contracts are in place with suppliers?
      • Are school bus companies viable for drop off deliveries?
      • Could/should they enable children to get to school, if that defeats the aims of social distancing and self-isolation, or could school buses deliver meals to bus collection stops?
    3. Can schools stay open for provision
      • Assuming contingency for safe staffing: what sort of numbers of pupils / staff is viable for in-school collection of grab bags?
      • Should schools act like local food banks to support a community?
      • How will children in families that are sick, or all in self-isolation, who cannot access the school, get support?
    4. Children’s FSM Eligibility
      • Are there implications of the Budget 2020 changes in Universal Credit and welfare criteria for pupil premium calculations and school funding? If ‘UC eligible’ status takes five weeks to reach, what does this mean for FSM? The advance payment in the five-week wait must be a grant, not a loan.
      • How are newly eligible children brought into the system whilst out of school? Who is responsible for the eligibility tests, and for communications between families and schools if closed?
      • Destitute families with no recourse to public funds have no welfare safety net to fall back on. “As a result, there will be an increase in homelessness, hunger and health issues amongst these families.” [Eve Dickson, Project 17]
      • This matters to the DfE and the Treasury because if a child is *ever* registered as FSM eligible during their period of education, they keep that eligibility for six years (i.e. across primary, or all of secondary school). Pupil premium is paid to schools accordingly. (Goodness knows our children’s schools need the cash; those I teach don’t even have a textbook each.) There are many interconnected systems, and knock-on implications at LA level for the algorithms that use PP registration.
    5. The arbitrariness of taking the total number of children who eat a school meal on the school census date, next Thursday 21 May 2020, as a measure of need, is likely to be evidenced at scale. Where ‘free school meals taken’ or ‘school lunches taken’ are affected by unusual events, a day and time when the situation is regarded as normal is to be substituted: “You could use the next normal day, an earlier day in census week or the previous Thursday where that reflects the normal situation. Where other days or times are used, schools must record these for audit purposes.” [DfE school census guidance]
    6. Beyond FSM, and of secondary importance, but nonetheless important for families that will now need to spend money twice in the same period on children’s lunches: will regular school meals that were pre-ordered and pre-paid by parents be fulfilled at a later date?
    7. Recovery volunteers: if people assume they have had the virus but have not been tested, can they volunteer to offer support?
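The ‘ever eligible’ rule mentioned in point 4 above can be sketched as a simple check. This is an illustrative sketch only, not DfE code: the function name and the approximation of six years as a day count are my assumptions.

```python
from datetime import date

def attracts_pupil_premium(fsm_registration_dates, census_date):
    """Return True if the pupil was registered as FSM eligible at any
    point in the six years before the census date ('Ever 6 FSM').
    Six years is approximated here as 6 * 365 days (assumption)."""
    window_days = 6 * 365
    return any(
        0 <= (census_date - registered).days <= window_days
        for registered in fsm_registration_dates
    )
```

So a child registered once in, say, Year 2 continues to attract pupil premium funding years later, which is one reason a single census-day count carries such long-lasting funding consequences.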


Thoughts from the YEIP Event: Preventing trust.

Here are some thoughts about the Prevent programme, after the half day I spent this week at the event Youth Empowerment and Addressing Violent Youth Radicalisation in Europe.

It was hosted by the Youth Empowerment and Innovation Project at the University of East London, to mark the launch of the European study on violent youth radicalisation from YEIP.

Firstly, I appreciated the dynamic and interesting youth panel: young people themselves involved in youth work, or early-career researchers on a range of topics. Panelists shared their thoughts on:

  • Removal of gang databases and systemic racial targeting
  • Questions over online content takedown with the general assumption that “someone’s got to do it.”
  • The purposes of Religious Education and lack of religious understanding as cause of prejudice, discrimination, and fear.

From these connections comes trust.

Next, Simon Chambers, from the British Council, UK National Youth Agency, and Erasmus UK, talked about the Erasmus Plus programme, under the striking sub-theme, from these connections comes trust.

  • 42% of the world’s population are under 25
  • Young people understand that there are wider, underlying complex factors in this area and are disproportionately affected by conflict, economic change and environmental disaster.
  • Many young people struggle to access education and decent work.
  • Young people everywhere can feel unheard and excluded from decision-making — their experience leads to disaffection and grievance, and sometimes to conflict.

We then heard a senior Home Office presenter speak about Radicalisation: the threat, drivers and Prevent programme.

On Contest 2018 Prevent / Pursue / Protect and Prepare

What was perhaps most surprising was his statement that the programme believes there is no checklist [though in reality there are checklists], no single profile, and no conveyor belt towards radicalisation.

“This shouldn’t be seen as some sort of predictive model,” he said. “It is not accurate to say that somehow we can predict who is going to become a terrorist, because they’ve got poor education levels, or because [they] necessarily have a deprived background.”

But he then went on to again highlight the list of identified vulnerabilities in Thomas Mair‘s life, which suggests that these characteristics are indeed seen as indicators.

When I look at the ‘safeguarding-in-school’ software that uses vulnerabilities as signals for exactly that kind of prediction of intent, the gap between theory and practice here is deeply problematic.

One slide covered Internet content takedowns, and suggested 300K pieces of illegal terrorist material have been removed since February 2010. He later suggested that this number reflects referrals via contact with the CTIRU, rather than content removals of any defined form (for example, it isn’t clear whether a ‘piece’ is a picture, a page, or a whole site). This remains somewhat unclear, and important questions stay open, given its prominence in the online harms policy and discussion.

The big gap that was not discussed, and that I believe matters, is how much autonomy teachers have, for example, to make a referral. He suggested “some teachers may feel confident” to do what is needed on their own, but others “may need help” and therefore make a referral. Statistics on those decision processes are missing. I believe over-referral is in part a result of fearing that non-referral, once a computer has tagged issues as Prevent-related, would be seen as negligent, or as failing to meet the statutory Prevent duty as it applies to schools.

On the Prevent Review, he suggested that the current timeline of August 2020 still stands, even though there is currently no Reviewer. It is for Ministers to decide who will replace Lord Carlile.

Safeguarding children and young people from radicalisation

Mark Chalmers of Westminster City Council then spoke about ‘safeguarding children and young people from radicalisation.’

He started off with a profile of the local authority demographic: poverty and wealth, migrant turnover, and the proportion of non-English-speaking households. This in itself may seem indicative of deliberate or unconscious bias.

He suggested that Prevent is not a security response, and expects that the policing role in Prevent will be reduced over time, as more is taken over by Local Authority staff and the public services. [Note: this seems inevitable after the changes in the 2019 Counter Terrorism Act, which enable local authorities, as well as the police, to refer persons at risk of being drawn into terrorism to local Channel panels. Whether this should have happened at all was not, as far as I know, consulted on.] This claim that Prevent is not a security response appears different in practice, when Local Authorities refuse FOI questions on the basis of security exemptions in the FOI Act, Section 24(1).

Both speakers declined to accept my suggestion that Prevent and Channel are not consensual. Participation in the programme, they were adamant, is voluntary and confidential. The reality is that children do not feel they can make a freely given, informed choice, in the face of an authority and the severity of the referral. They also do not understand where their records go, how confidential they really are, or how long and why they are kept.

The recently concluded legal case, and the lengths one individual had to go to in order to remove their personal record from the Prevent national database, show just how problematic the authorities’ mistaken perception of a consensual programme is.

I knew nothing of the Prevent programme at all in 2015. I only began to hear about it once I started mapping the data flows into, across and out of the state education sector, and teachers started coming to me with stories from their schools.

I found it fascinating to hear those speak at the conference who are so embedded in the programme. They seemed unable to see it objectively, or to accept others’ critical points of view as valid. It stems perhaps from the luxury of believing that you yourself will be unaffected by its consequences.

“Yes,” said O’Brien, “we can turn it off. We have that privilege” (1984)

There was no ground given at all for accepting that there are deep flaws in practice. That in fact ‘Prevent is having the opposite of its intended effect: by dividing, stigmatising and alienating segments of the population, Prevent could end up promoting extremism, rather than countering it’ as concluded in the 2016 report Preventing Education: Human Rights and Countering terrorism in UK Schools by Rights Watch UK.

Mark Chalmers’ conclusion was to suggest that perhaps Prevent will not always take its current form of a bolt-on ‘big programme’, and would instead become just like any other form of child protection, like FGM. That would mean every public sector worker becomes an extended arm of Home Office policy, expected to act in counter-terrorism efforts.

But the training, the nuance, and the level of autonomy that the speakers believe staff and children can apply, are imagined. The trust between authorities and people who need shelter, safety, medical care or schooling must be upheld for the public good.

No one asked if and how children should be seen through the lens of terrorism, extremism and radicalisation at all. No one asked if and how every child should be able to be surveilled online by school-imposed software, with covert photos taken through the webcam, in the name of children’s safeguarding. Or be labelled in school as associated with ‘terrorist.’ What happens when that prevents trust, and who measures its harm?

[Image: Smoothwall monitor dashboard with ‘terrorist’ labels on a child’s profile]


Far too little is known about who makes decisions about the lives of others and how, about the criteria for defining inappropriate activity or referrals, or about the opacity of decisions on online content.

What effects will the Prevent programme have on our current and future society, where everyone is expected to surveil and inform upon each other? Failure to do so, to uphold the Prevent duty, becomes civic failure. How are curiosity and intent separated? How do we safeguard children from risk (that is not harm) and protect their childhood experiences, their free and full development of self?

No one wants children to be caught up in activities or radicalisation into terror groups. But is this the correct way to solve it?

This comprehensive new research by the YEIP suggests otherwise. The fact that the Home Office disengaged from the project in its last year speaks volumes.

“The research provides new evidence that by attempting to profile and predict violent youth radicalisation, we may in fact be breeding the very reasons that lead those at risk to violent acts.” (Professor Theo Gavrielides).

Current case studies of lived experience, and history, also suggest it is mistaken. Prevent, when it comes to children and schools, needs massive reform at the very least; but those most in favour of how it works today cannot be the only ones involved in its reshaping.

“Who denounced you?” said Winston.

“It was my little daughter,” said Parsons with a sort of doleful pride. “She listened at the keyhole. Heard what I was saying, and nipped off to the patrols the very next day. Pretty smart for a nipper of seven, eh? I don’t bear her any grudge for it. In fact I’m proud of her. It shows I brought her up in the right spirit, anyway.” (1984).


The event was the launch of the European study on violent youth radicalisation from YEIP: the project investigated the attitudes and knowledge of young Europeans, youth workers and other practitioners, while testing tools for addressing the phenomenon through positive psychology and the application of the Good Lives Model.

Its findings include that young people at risk of violent radicalisation are “managed” by the existing justice system as “risks”. This creates further alienation and division, while recidivism rates continue to spiral.