Tag Archives: National Pupil Database

Failing a generation is not what post-Brexit Britain needs

Basically Britain needs Prof. Brian Cox shaping education policy:

“If it were up to me I would increase pay and conditions and levels of responsibility and respect significantly, because it is an investment that would pay itself back many times over in the decades to come.”

Don’t use children as ‘measurement probes’ to test schools

What effect does using school exam results to reform the school system have on children? And what effect does it have on society?

Last autumn Ofqual published a report and study on the consistency of exam marking and its metrics.

The report concluded that in English Literature, for example, half of pupils are not awarded the “correct” grade on a given exam paper, due to marking inconsistencies and the design of the tests.
Given the complexity and sensitivity of the data, Ofqual concluded, it is essential that the metrics stand up to scrutiny and that there is a very clear understanding of the meaning and application of any quality-of-marking measure. They wrote that, “there are dangers that information from metrics (particularly when related to grade boundaries) could be used out of context.”

Context and accuracy are fundamental to the value of, and trust in, these tests. And at the moment, trust in the system behind them is not high. There must also be trust in the policy behind the system.

This summer two sets of UK school tests will come under scrutiny: GCSEs and SATs. The goalposts are moving for children and schools across the country, and it’s bad for children and bad for Britain.

Grades A*-G will be swapped for numbers 9-1

Fifteen and sixteen year olds sitting GCSEs will see their exams shift to a numerical system, scored from the highest Grade 9 down to Grade 1, with the three top grades replacing the current A* and A. The alphabetical grading system will be fully phased out by 2019.

The plans intend that roughly the same proportion of students as currently achieve a Grade C will be awarded the new Grade 4, and as Schools Week reported: “There will be two GCSE pass rates in school performance tables.”

One will measure grade 5s or above, and this will be called the ‘strong’ pass rate. And the other will measure grade 4s or above, and this will be the ‘standard’ pass rate.
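For illustration only – a minimal sketch, not an official DfE calculation – this is how the two headline figures could be computed from the same set of results on the new 9-1 scale (the `pass_rates` helper and the grades are invented for the example):

```python
# Minimal sketch: the two pass rates described above, computed from
# one hypothetical set of results on the new 9-1 scale.
# Grades 9-7 broadly replace A* and A; Grade 4 maps roughly to a C.

def pass_rates(grades):
    """Return the ('standard', 'strong') pass rates as percentages.

    'standard' counts Grade 4 or above; 'strong' counts Grade 5 or above.
    """
    total = len(grades)
    standard = 100 * sum(1 for g in grades if g >= 4) / total
    strong = 100 * sum(1 for g in grades if g >= 5) / total
    return standard, strong

grades = [9, 7, 5, 5, 4, 4, 3, 2]  # invented cohort of eight results
standard, strong = pass_rates(grades)
print(f"standard pass rate (4+): {standard:.1f}%")  # 75.0%
print(f"strong pass rate (5+):   {strong:.1f}%")    # 50.0%
```

The same cohort yields two different headline figures; which one gets quoted changes the story a school’s results appear to tell.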

Laura McInerney summed up, “in some senses, it’s not a bad idea as it will mean it is easier to see if the measures are comparable. We can check if the ‘standard’ rate is better or worse over the next few years. (This is particularly good for the DfE who have been told off by the government watchdog for fiddling about with data so much that no one can tell if anything has worked anymore).”

There’s plenty of confusion among parents about how the numerical grading system will work. The confusion you can gauge in playground conversations is also reflected nationally, in a more measurable way.

Market research among a range of audiences – including businesses, head teachers, universities, colleges, parents and pupils – found that just 31 per cent of secondary school pupils and 30 per cent of parents were clear on the new numerical grading system.

So that’s a change in the GCSE grading structure. But why? If more differentiators are needed, why not add one or two more letters and shift grade boundaries? The policy need for these changes is unclear.

Machine marking is training on ten-year-olds

I wonder if the shift to numerical marking is due in any part to a desire to move GCSEs to machine marking in future?

This year, ten and eleven year olds, children in their last year of primary school, will have their SATs tests computer marked.

That’s everything in maths and English. Not multiple choice papers or one-word answers, but full written responses. If their f, b or g doesn’t look like the correct letter in the correct place in the sentence, it gains no marks.

Parents are concerned about children whose handwriting is awful but whose knowledge is not. How well can they hope to be assessed? If exams are increasingly machine marked out of sight, with many sent to India, where is our oversight of the marking process and its accuracy?

The concerns I’ve heard simply among local parents and staff seem reflected in national discussions and at the assessor, Ofsted. TES has reported Ofsted’s most senior officials as saying that the inspectorate is just as reluctant to use this year’s writing assessments as it was in 2016. Teachers and parents locally are united in feeling it is not accurate, not fair, and not right.

The content is also to be tougher.

How will we know what is being accurately measured, and the accuracy of the metrics, when content changes at the same time? How will we know if children didn’t make the mark, or if the marks were simply not awarded?

The accountability of the process is less than transparent to pupils and parents. We have little opportunity for the scrutiny of these metrics that Ofqual recommends, or of the data behind the system built on our kids.

Causation, correlation and why we should care

The real risk is that no one will be able to tell if there is an error, where it stems from, or whether there is a reason, if pass rates turn out markedly different from what was expected.

After the wide range of changes across pupil attainment, exam content and school progress scores, and their interactions and dependencies, can they all fit together and be comparable with the past at all?

If the SATs are making lots of mistakes simply through being bad at reading ten-year-olds’ handwriting, how will we know?

Or if GCSE scores are lower, will we be able to see if it is because they have genuinely differentiated the results in a wider spread, and stretched out the fail, pass and top passes more strictly than before?

What is likely is that this year’s set of children who were expecting As and A*s at GCSE, but fail to be one of the two children nationally who get the new Grade 9, will be disappointed to feel they are not, after all, as great as they thought they were.

And next year, if you can’t be the one or two to get the top mark, will the best simply stop stretching themselves and rest a bit easier, because, whatever, you won’t get those straight top grades anyway?

Even if children would not change their behaviour were they to know, the target range scoring sent by third party data processors to schools discourages teachers from stretching those at the top.

Politicians look for positive progress, but policies are changing that will increase the number of schools deemed to have failed. Why?

Our children’s results are being used to reform the school system.

Coasting and failing schools can be compelled to become academies.

Government policy on this forced academisation was rejected by popular revolt. It appears that the government is determined that schools *will* become academies with the same fervour that they *will* re-introduce grammar schools. Both are unevidenced and unwanted. But there is a workaround.  Create evidence. Make the successful scores harder to achieve, and more will be seen to fail.

A total of 282 secondary schools in England were deemed to be failing by the government this January, as they “have not met a new set of national standards”.

It is expected that even more will attain ‘less’ this summer. Tim Leunig, Chief Analyst and Chief Scientific Adviser at the Department for Education, made a personal guess that two would reach the top mark.

The context of this GCSE ‘failure’ is the changes in how schools are measured. Children’s progress across eight subjects, or “P8”, is being used as an accountability measure of overall school quality.

But it’s really just: “a school’s average Attainment 8 score adjusted for pupils’ Key Stage 2 attainment.” [Dave Thomson, Education Datalab]
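To make the arithmetic concrete, here is a minimal sketch of that definition, under a deliberately simplified assumption: each pupil’s expected Attainment 8 is the national average for pupils with the same KS2 prior attainment band. The real measure has more moving parts (subject slots, scaling, confidence intervals), and every figure below is invented:

```python
# Simplified sketch of a Progress 8-style score: a pupil's actual
# Attainment 8 compared with the (hypothetical) national average for
# pupils with the same KS2 prior attainment; the school score is the
# mean of those differences. All numbers are invented for illustration.

from statistics import mean

national_avg_a8 = {"low": 30.0, "middle": 48.0, "high": 65.0}

pupils = [          # (KS2 prior attainment band, actual Attainment 8)
    ("low", 35.0),
    ("middle", 45.0),
    ("high", 70.0),
]

progress = [a8 - national_avg_a8[band] for band, a8 in pupils]
school_score = mean(progress)
print(f"school progress score: {school_score:+.2f}")  # +2.33
```

The point to notice is that the score depends entirely on what counts as ‘expected’ for each pupil, which is why contextualising the expected values can move a school’s score so much.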

Work done by FFT Education Datalab showed that contextualising P8 scores can lead to large changes for some schools.  (Read more here and here). You cannot meaningfully compare schools with different types of intake, but it appears that the government is determined to do so. Starting ever younger if new plans go ahead.

Data is being reshaped to tell stories to fit to policy.

Shaping children’s future

What this reshaping doesn’t factor in at all, is the labelling of a generation or more, with personal failure, from age ten and up.

All this tinkering with the data isn’t just about data.

It’s tinkering badly with our kids’ sense of self, their sense of achievement and aspiration, and with that, the country’s future.

Education reform has become the aim, and it has replaced the aims of education.

Post-Brexit Britain doesn’t need policy that delivers ideology. We don’t need “to use children as ‘measurement probes’ to test schools”.

Just as we shouldn’t use children’s educational path to test their net worth or cost to the economy. Or predict it in future.

Children’s education and human value cannot be measured in data.

The perfect storm: three bills that will destroy student data privacy in England

Lords have voiced criticism and concern at plans for ‘free market’ universities that will prioritise competition over collaboration, and private interests over social good. But while both Houses have identified the institutional effects, they are yet to discuss the effects on individuals of a bill in which “too much power is concentrated in the centre”.

The Higher Education and Research Bill sucks in personal data to the centre, as well as power. It creates an authoritarian panopticon of the people within the higher education and further education systems. Section 1, parts 72-74 creates risks but offers no safeguards.

Applicants’ and students’ personal data is being shifted into a top-down management model, at the same time as the horizontal safeguards for its distribution are to be scrapped.

Through deregulation and the building of a centralised framework, these bills will weaken the purposes for which personal data are collected, and weaken existing requirements on consent for the ways the data may be used at national level. Without amendments, every student who enters this system will find their personal data used at the discretion of any future Secretary of State for Education, without safeguards or oversight, and forever. Goodbye privacy.

Part of the data extraction plans is for use in public interest research in safe settings, with published purpose, governance, and benefit. These are well intentioned, and this year’s intake of students will have had to accept that use as part of the service, in the privacy policy.

But in addition, and separately, the Bill will permit data to be used at the discretion of the Secretary of State, which waters down and removes the nuances of consent over what data may or may not be used that apply today when applicants sign up to UCAS.

Applicants today are told in the privacy policy that they can consent separately to sharing their data with the Student Loans Company, for example. This Bill will remove that right when it permits all applicant data to be used by the State.

This removal of today’s consent process denies all students their rights to decide who may use their personal data beyond the purposes for which they permit its sharing.

And it explicitly overrides the express wishes registered by the 28,000 applicants – 66% of respondents to a 2015 UCAS survey – who said, for example, that they should be asked before any data was provided to third parties for student loan applications (or even that their data should never be provided for this).

Not only can the future purposes be changed without limitation; combined with other legislation – namely the Digital Economy Bill, in the Lords at the same time right now – this shift will pass personal data, together with DWP data and in connection with HMRC data, expressly to the Student Loans Company.

In just this one example, the Higher Education and Research Bill is being used as a man in the middle. But it will enable all the data to be used for broad purposes, and if those expand in future, we’ll never know.

This change, far from making more data available to public interest research, shifts the balance of power between state and citizen and undermines the very fabric of its source of knowledge; the creation and collection of personal data.

Further, a number of amendments have been proposed in the Lords to clause 9 (the transparency duty) which raise more detailed privacy issues for all prospective students – concerns that UCAS shares.

Why this lack of privacy by design is damaging

This shift takes away our control, and gives it to the State at the very time when ‘take back control’ is in vogue. These bills are building a foundation for a data Brexit.

If the public do not trust who will use their data and why, or are told that when they provide data they must waive any rights to its future control, they will withhold or fake data. 8% of applicants even said it would put them off applying through UCAS at all.

And without future limitation, what might be imposed is unknown.

This shortsightedness will ultimately damage data integrity, and the damage won’t come to education data from the Higher Education and Research Bill alone. It is one of three bills sweeping through Parliament right now which together build a cumulative anti-privacy storm, whether labelled overtly as data sharing legislation or hidden in tucked-away clauses.

The Technical and Further Education Bill – Part 3

In addition to the entirely new applicant datasets moving from UCAS to the DfE under clauses 73 and 74 of the Higher Education and Research Bill, apprentice and FE student data already held by the Secretary of State for Education will see potentially broader use under the changed purposes of Part 3 of the Technical and Further Education Bill.

Unlike the Higher Education and Research Bill, it may not fundamentally change how the State gathers information on further education, but it has the potential to change how that information is used.

The change is a generalisation of purposes. Currently, subsection 1 of section 54 refers to “purposes of the exercise of any of the functions of the Secretary of State under Part 4 of the Apprenticeships, Skills, Children and Learning Act 2009”.

Therefore, the government argues, “it would not hold good in circumstances where certain further education functions were transferred from the Secretary of State to some combined authorities in England, which is due to happen in 2018.”

This is why clause 38 will amend that wording to “purposes connected with further education”.

Whatever the details of the reason, the purposes are broader.

Again, combined with the Digital Economy Bill’s open ended purposes, it means the Secretary of State could agree to pass these data on to every other government department, a range of public bodies, and some private organisations.

The TFE Bill is at Report stage in the House of Commons on January 9, 2017.

What could possibly go wrong?

These loose purposes – with no future restrictions, no definitions of the third parties data could be given to or why, and no clear need to consult the public or Parliament on future scope changes – are a repeat of similar legislative changes which have resulted in poor practices in the use of school pupil data in England, ages 2-19, since 2000.

Policy makers should consider whether the intent of these three bills is to give out identifiable, individual level, confidential data of young people under 18 for commercial use, without their consent. Or to give journalists and charities access? Should it mean unfettered access by government departments and agencies, such as police and Home Office Removals Casework teams, without any transparent register of access, any oversight, or accountability?

These are today’s uses by third parties of school children’s individual, identifiable and sensitive data from the National Pupil Database.

Uses of data not as statistics, but as named individuals, for interventions in individual lives.

Home Secretaries past and present have put international students at the centre of plans to cut migration to the tens of thousands, and government refuses to take student numbers out of migration figures, despite their being seen as irrelevant to the substance of the numbers debate. This should be deeply worrying.

Will the MOU between the DfE and the Home Office Removals Casework team be a model for access to all student data held at the Department for Education, even all areas of public administrative data?

Hoping that the data transfers to the Home Office won’t result in the deportation of thousands we would not predict today may be naive.

Under the new open wording, the Secretary of State for Education might even decide to sell the nation’s entire Technical and Further Education student data to Trump University, for the purposes of its ‘research’ to target marketing at UK students, or at institutions that may have potential US post-grad applicants. The Secretary of State will have the data simply because she “may require [it] for purposes connected with further education.”

And it is too late to think that US buyers, or others, would not be interested.

In 2015 Stanford University made a request to the National Pupil Database for both academic staff’s and students’ data. It was rejected. We know this only from the third party release register. Without any duty to publish requests, approved users or purposes of data release, where is the oversight for the use of these other datasets?

If these are not the intended purposes of these three bills, and if there should be any limitation on purposes of use and future scope change, then safeguards and oversight need to be built into the face of the bills, to ensure data privacy is protected and to avoid repeating the same mistakes again.

Hoping that the decision is always going to be ‘they wouldn’t approve a request like that’ is not enough to protect millions of students’ privacy.

The three bills are a perfect privacy storm

As other Europeans seek to strengthen the fundamental rights of their citizens to take back control of their personal data under the GDPR coming into force in May 2018, the UK government is pre-emptively undermining ours in these three bills.

Young people, and data dependent institutions, are asking for solutions to show what personal data is held where, and used by whom, for what purposes. That buys in to the benefit message and builds trust: showing that what you said you’d do with my data is what you did with my data. [1] [2]

The reality is that in post-truth politics it seems anything goes, on both sides of the Pond. So how will we trust what our data is used for?

Advice in 2015-16 from the cross-party Science and Technology Committee suggested the state of data privacy is unsatisfactory, left “unaddressed by Government and without a clear public-policy position set out”. We hear the need for data privacy debated around the use of consumer data, social media, and age verification. It is necessary to secure the public trust needed for long term public benefit, and for the economic value derived from data to be achieved.

But the British government seems intent on shortsighted legislation which does entirely the opposite for its own uses of data: in the Higher Education and Research Bill, the Technical and Further Education Bill and the Digital Economy Bill.

These bills share what Baroness Chakrabarti said of the Higher Education Bill in its Lords second reading on the 6th December, “quite an achievement for a policy to combine both unnecessary authoritarianism with dangerous degrees of deregulation.”

Unchecked, these Bills create the conditions needed for catastrophic failure of public trust. They shift ever more personal data away from personal control into the centralised control of the Secretary of State, for unclear purposes and use by undefined third parties. They jeopardise the collection and integrity of public administrative data.

To future-proof the immediate integrity of student personal data collection and use, the DfE’s reputation, and public and professional trust in DfE political leadership, action must be taken on safeguards and oversight, and should consider:

  • Transparency register: a public record of access, purposes, and benefits to be achieved from use
  • Subject Access Requests: Providing the public ways to access copies of their own data
  • Consent procedures for collection should be strengthened, and cannot say one thing and do another
  • Ability to withdraw consent from secondary purposes should be built in by design, looking to GDPR from 2018
  • The legislative purpose of intended current use by the Secretary of State, and its boundaries, should be made clear
  • Future purpose and scope change limitations should require consultation – data collected today must not be used quite differently tomorrow without scrutiny and the ability to opt out (i.e. population wide registries of religion, ethnicity, disability)
  • Review or sunset clause

If the legislation in these three bills pass without amendment, the potential damage to privacy will be lasting.


[1] http://www.parliament.uk/business/publications/written-questions-answers-statements/written-question/Commons/2016-07-15/42942/  Parliamentary written question 42942 on the collection of pupil nationality data in the school census starting in September 2016:   “what limitations will be placed by her Department on disclosure of such information to (a) other government departments?”

Schools Minister Nick Gibb responded on July 25th 2016:

“These new data items will provide valuable statistical information on the characteristics of these groups of children […] The data will be collected solely for internal Departmental use for the analytical, statistical and research purposes described above. There are currently no plans to share the data with other government Departments.”

[2] The December 15 publication of the MOU between the Home Office Casework Removals Team and the DfE reveals that the previous agreement “did state that DfE would provide nationality information to the Home Office”, but that this was changed “following discussions” between the two departments. http://schoolsweek.co.uk/dfe-had-agreement-to-share-pupil-nationality-data-with-home-office/

The agreement was changed on 7th October 2016 to not pass nationality data over. It makes no mention of not using the data within the DfE for the same purposes.

Datasharing, lawmaking and ethics: power, practice and public policy

“Lawmaking is the Wire, not Schoolhouse Rock. It’s about blood and war and power, not evidence and argument and policy.”

"We can't trust the regulators," they say. "We need to be able to investigate the data for ourselves." Technology seems to provide the perfect solution. Just put it all online - people can go through the data while trusting no one.  There's just one problem. If you can't trust the regulators, what makes you think you can trust the data?" 

Extracts from The Boy Who Could Change the World: The Writings of Aaron Swartz. Chapter: ‘When is Technology Useful?’, June 2009.

The question keeps getting asked, is the concept of ethics obsolete in Big Data?

I’ve come to some conclusions about why ‘Big Data’ use keeps pushing the boundaries of what many people find acceptable, and yet the people doing the research, the regulators and lawmakers often express surprise at negative reactions. Some even express disdain for public opinion, dismissing it as ignorant, as not ‘understanding the benefits’, as yet to be convinced. I’ve decided why I think what is considered ‘ethical’ in data science does not meet public expectation.

It’s not about people.

Researchers using large datasets often have a foundation in data science, applied computing or maths, and don’t see data as people. It’s only data. Creating patterns, correlations, and analyses of individual level data is not seen as research involving human subjects.

This is embodied in the many research ethics reviews I have read in the last year in which the question is asked: does the research involve people? The answer given is invariably ‘no’.

And these data analysts, using, let’s say, health data, are not working in a subject founded on any ethical principle, in contrast with the medical world the data come from.

The public feel differently about information that is about them, which may be known only to them or to select professionals. The values that we as the public attach to our data, and our expectations of its handling, may reflect the expectations we have of how we are handled as the people connected to it. We see our data as all about us.

The values that are therefore put on data, and on how it can and should be used, can be at odds with one another; the public’s perception is not reciprocated by the researchers. This may be especially true if researchers are using data which has been de-identified, although it may not be anonymous.

New legislation on the horizon, the Better Use of Data in Government, intends to fill the [loop]hole between what was legal to share in the past and what some want to exploit today, and emphasises a gap between the uses of data by public interest, academic researchers and the uses by government actors. The first by and large incorporate privacy and anonymisation techniques by design; the second is designed for the applied use of identifiable data.

Government departments and public bodies want to identify and track people who are somehow misaligned with the values of the system; either through fraud, debt, Troubled Families, or owing Student Loans. All highly sensitive subjects. But their ethical data science framework will not treat them as individuals, but only as data subjects. Or as groups who share certain characteristics.

The system again intrinsically fails to see these uses of data as being about individuals, but sees them as categories of people – “fraud” “debt” “Troubled families.” It is designed to profile people.

Services that were built not for people but for government processes result in datasets, used in research, that aren’t well designed for research. So we now see attempts to shoehorn historical practices into data use by modern data science practitioners, with policy that is shortsighted.

We can’t afford for these things to be so off axis, if civil service thinking is exploring “potential game-changers such as virtual reality for citizens in the autism spectrum, biometrics to reduce fraud, and data science and machine-learning to automate decisions.”

In an organisation such as DWP this must be really well designed since “the scale at which we operate is unprecedented: with 800 locations and 85,000  colleagues, we’re larger than most retail operations.”

The power to affect individual lives through poor technology is vast, and some impacts seem to be being badly ignored. The ‘real time earnings’ database, which improved the accuracy of benefit payments, was widely agreed to have been harmful to some individuals through the Universal Credit scheme, with delayed payments meaning families at foodbanks, and contributing to worse.

“We believe execution is the major job of every business leader” is perhaps not the best wording in the context of DWP data uses.

What accountability will be built in by design?

I’ve been thinking recently about drawing a social ecological model of personal data empowerment or control. Thinking about visualisation of wants, gaps and consent models, to show rather than tell policy makers where these gaps exist in public perception and expectations, policy and practice. If anyone knows of one on data, please shout. I think it might be helpful.

But the data *is* all about people

Regardless of whether they are in front of you or numbers on a screen, big or small datasets using data about real lives are data about people. And that triggers a need to treat the data with the same ethical approach as you would people involved face-to-face.

Researchers need to stop treating data about people as meaningless data, because that’s not how people think about their own data being used. Not only that, but if the whole point of your big data research is to have impact, your data outcomes will change lives.

Tosh, I know some say. But, I have argued, the reason is that the applications of the data science / research / policy findings / impact-of-immigration-in-education review / [insert purposes of the data user’s choosing] are designed to have impact on people – often people on whom the research is done without their knowledge or consent. And while most people say that is OK where it’s public interest research, the possibilities are outstripping what the public has expressed as acceptable, and few seem to care.

Evidence from public engagement and ethics all says that hidden pigeon-holing – profiling – is unacceptable. Data Protection law has special requirements for it, on automated decisions. ‘Profiling’ is now clearly defined under article 4 of the GDPR as “any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”

Using big datasets for research that ‘isn’t interested in individuals’ may still intend to create results profiling groups for applied policing, or to discriminate, by making knowledge available by location. The data may have been de-identified, but in application it is no longer anonymous.

Big Data research that results in profiling groups may intend applied health policy impacts for good – that may be the very point of the research, with the intent of improving a particular ethnic minority’s access to services, for example.

Then look at the voting process changes in North Carolina, and see how that same data, the same research knowledge, might be applied to exclude, to restrict rights, and to disempower.

Is it possible to have ethical oversight that can protect good data use and protect people’s rights if they conflict with the policy purposes?

The “clear legal basis” is not enough for public trust

Data use can be legal and still be unethical, harmful and shortsighted in many ways – both for the impacts on research, in terms of people withholding data, falsifying data, or avoiding the system so as not to give data at all, and for the lives it will touch.

What education has to learn from health is whether it will permit uses by ‘others’ outside education to jeopardise the collection of school data intended in the best interests of children, not the system. In England it must start to analyse what is needed versus wanted; what is necessary and proportionate, and what justifies maintaining named data indefinitely, exposed to changing scope.

In health, the most recent Caldicott review suggests scope change by design – that is a red line for many: “For that reason the Review recommends that, in due course, the opt-out should not apply to all flows of information into the HSCIC. This requires careful consideration with the primary care community.”

The community already spoke out, and strongly, in spring and summer 2014: there must be an absolute right to confidentiality to protect patients’ trust in the system. Scope that ‘sounds’ like it might sneakily change in future will be a death knell to public interest research, because repeated trust erosion will be fatal.

Laws change to allow scope change without informing people whose data are being used for different purposes

Regulators must be seen to be trusted if the data they regulate are to be trustworthy. Laws and regulators that plan scope for the future watering down of public protection water down public trust from today. Unethical policy and practice will not be saved by pseudo-data-science ethics.

Will those decisions in private political rooms be worth the public cost to research, to policy, and to the lives it will ultimately affect?

What happens when the ethical black holes in policy, lawmaking and practice collide?

At the last UK HealthCamp towards the end of the day, when we discussed the hard things, the topic inevitably moved swiftly to consent, to building big databases, public perception, and why anyone would think there is potential for abuse, when clearly the intended use is good.

The answer came back from one of the participants, “OK now it’s the time to say. Because, Nazis.” Meaning, let’s learn from history.

Given the state of UK politics, Go Home van policies, restaurant raids, the possibility of Trump getting access to UK sensitive data of all sorts from across the Atlantic, given recent policy effects on the rights of the disabled and others, I wonder if we would hear the gentle laughter in the room in answer to the same question today.

With today’s reported sharp change in Whitehall’s digital leadership, the future of digital in government services, policy and lawmaking does indeed seem to be more “about blood and war and power” than “evidence and argument and policy”.

The concept of ethics in datasharing using public data in the UK is far from becoming obsolete. It has yet to begin.

We have ethical black holes in big data research, in big data policy, and in big data practices in England. The conflicts are there: between public interest research and government uses of population-wide datasets; between how the public perceive the use of our data and how they are actually used; and in the gaps and tensions between policy and practice.

We are simply waiting for the Big Bang. Whether it will be creative, or destructive we are yet to feel.

*****

image credit: LIGO – graphical visualisation of black holes on the discovery of gravitational waves

References:

Report: Caldicott review – National Data Guardian for Health and Care Review of Data Security, Consent and Opt-Outs 2016

Report: The OneWay Mirror: Public attitudes to commercial access to health data

Royal Statistical Society Survey carried out by Ipsos MORI: The Data Trust Deficit

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

I’ve been struck by stories I’ve heard on the datasharing consultation, on data science, and on data infrastructures as part of ‘government as a platform’ (#GaaPFuture) in recent weeks. The audio recorded by the Royal Statistical Society on March 17th is excellent, and there were some good questions asked.

There were even questions from insurance-backed panellists about opening up more data for commercial users, calls for journalists to be seen as accredited researchers, and calls to include health data sharing. Three things that some stakeholders – all users of data – feel are missing from the consultation, and possibly among those with the most widespread public concern and lowest levels of public trust. [1]

What I feel is missing in consultation discussions are:

  1. a representative range of independent public voice
  2. a compelling story of needs – why tailored public services benefits citizens from whom data is taken, not only benefits data users
  3. the impacts we expect to see in local government
  4. any cost/risk/benefit assessment of those impacts, or for citizens
  5. how the changes will be independently evaluated – as some are to be reviewed

The Royal Statistical Society and ODI have good summaries here of their thoughts, more geared towards the statistical and research aspects of data,  infrastructure and the consultation.

I focus on the other strands that use identifiable data for targeted interventions. Tailored public services, Debt, Fraud, Energy Companies’ use. I think we talk too little of people, and real needs.

Why the State wants more datasharing is not yet a compelling story and public need and benefit seem weak.

So far the creation of new data intermediaries, giving copies of our personal data to other public bodies – and let’s be clear that this often means through commercial representatives like G4S, Atos, management consultancies and more – has yet to convince me of true public need for the people, versus wants from parts of the State.

What the consultation hopes to achieve is new powers in law, to give increased data sharing increased legal authority. However this alone will not bring about the social legitimacy of datasharing that the consultation appears to seek through ‘open policy making’.

Legitimacy is badly needed if there is to be public and professional support for change and increased use of our personal data as held by the State, which is missing today,  as care.data starkly exposed. [2]

The gap between Social Legitimacy and the Law

Almost 8 months ago now, before I knew about the datasharing consultation work-in-progress, I suggested to BIS that there was an opportunity for the UK to drive excellence in public involvement in the use of public data by getting real engagement, through pro-active consent.

The carrot for this is achieving the goal that government wants – greater legal clarity, and the use of a significant number of consented people’s personal data for a complex range of secondary uses, as a secondary benefit.

It was ignored.

If some feel entitled to the right to infringe on citizens’ privacy through a new legal gateway, because they believe the public benefit outweighs private rights, then they must also take on the increased balance of risk of doing so, and a responsibility to do so safely. It is in principle a slippery slope. Any new safeguards and ethics for how this will be done are, however, unclear in those data strands which are for targeted individual interventions – especially if predictive.

Upcoming discussions on codes of practice [which have still to be shared] should demonstrate how this is to happen in practice, but codes are not sufficient. Laws which enable will be pushed to the borderline of what is legal, and beyond the borderline of what is ethical.

In England who would have thought that the 2013 changes that permitted individual children’s data to be given to third parties [3] for educational purposes, would mean giving highly sensitive, identifiable data to journalists without pupils’ or parental consent? The wording allows it. It is legal. However it fails the Data Protection Act’s legal requirement of fair processing. Above all, it lacks social legitimacy and common sense.

In Scotland, there is current anger over the intrusive ‘named person’ laws which lack both professional and public support and intrude on privacy. Concerns raised should be lessons to learn from in England.

Common sense says laws must take into account social legitimacy.

We have been told at the open policy meetings that this change will not remove the need for informed consent. To be informed means creating the opportunity for proper communications, and also knowing how you can use the service without coercion, i.e. not having to consent to secondary data uses in order to get the service, and knowing you can withdraw consent at any later date. How will that be offered, with ways of achieving the removal of data after sharing?

The stick for change is the legal duty to fair processing that the recent 2015 CJEU ruling [4] reiterated. Not just a nice-to-have, but State bodies’ responsibility to inform citizens when their personal data are used for purposes other than those for which those data had initially been consented and given. New legislation will not remove this legal duty.

How will it be achieved without public engagement?

Engagement is not PR

Failure to act on what you hear from listening to the public is costly.

Engagement is not done *to* people; don’t think ‘explain why we need the data and its public benefit’ will work. Policy makers must engage with fears, not seek to dismiss or diminish them, but acknowledge and mitigate them by designing technically acceptable solutions. Solutions that enable data sharing in a strong framework of privacy and ethics, not one that sees these concepts as barriers. Solutions that have social legitimacy because people support them.

Mr Hunt’s promised February 2014 opt out of anonymised data being used in health research has yet to be put in place, and has had immeasurable costs for delayed public research, and for public trust.

How long before people consider suing the DH as data controller for misuse? From where does the arrogance stem that decides to ignore legal rights, moral rights, and the public opinion of more people than those who voted for the Minister responsible for its delay?


This attitude is what failed care.data, and the harm is ongoing, to public trust and to confidence in researchers’ continued access to data.

The same failure was pointed out by the public members of the tiny Genomics England public engagement meeting two years ago in March 2014, called to respond to concerns over the lack of engagement and potential harm for existing research. The comms lead made a suggestion that the new model of the commercialisation of the human genome in England, to be embedded in the NHS by 2017 as standard clinical practice, was like steam trains in Victorian England opening up the country to new commercial markets. The analogy was felt by the lay attendees to be, and I quote, ‘ridiculous.’

Exploiting confidential personal data for public good needs good two-way engagement if it is to win support, and what is said and agreed must be acted on to be trustworthy.

Policy makers must take into account broad public opinion, and that is unlikely to be submitted to a Parliamentary consultation. (Personally, I first knew such processes existed only when care.data was brought before the Select Committee in 2014.) We already know what many in the public think about sharing their confidential data from the work on care.data, and the objections to third party access and to lack of consent. Just because some policy makers don’t like what was said doesn’t make that public opinion any less valid.

We must bring to the table the public voice from recent public engagement work on administrative datasharing [5], the voice of the non-research community, and the voices not of the stakeholders who will use the data, but of the ‘data subjects’: the public whose data are to be used.

Policy Making must be built on Public Trust

Open policy making is not open just because it says it is. Who has been invited, who has participated, and whether their views actually make a difference to content and implementation is what matters.

Adding controversial ideas at the last minute is terrible engagement; it makes the process less trustworthy and diminishes its legitimacy.

This last minute change suggests some datasharing will be dictated despite critical views in the policy making and without any public engagement. If so, we should ask policy makers on what mandate?

Democracy depends on social legitimacy. Once you lose public trust, it is not easy to restore.

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

In my next post I’ll look at some of the public engagement work done on datasharing to date, and think about ethics in how data are applied.

*************

References:

[1] The Royal Statistical Society data trust deficit

[2] “The social licence for research: why care.data ran into trouble,” by Carter et al.

[3] FAQs: Campaign for safe and ethical National Pupil Data

[4] CJEU Bara 2015 Ruling – fair processing between public bodies

[5] Public Dialogues using Administrative data (ESRC / ADRN)

img credit: flickr.com/photos/internetarchivebookimages/

The front door to our children’s personal data in schools

“EdTech UK will be a pro-active organisation building and accelerating a vibrant education and learning technology sector and leading new developments with our founding partners. It will also be a front door to government, educators, companies and investors from Britain and globally.”

Ian Fordham, CEO, EdTech UK

This front door is a gateway to access our children’s personal data and through it some companies are coming into our schools and homes and taking our data without asking.  And with that, our children lose control over their safeguarded digital identity. Forever.

Companies are all “committed to customer privacy” in those privacy policies which exist at all. However, typically this means they also share your information with ‘our affiliates, our licensors, our agents, our distributors and our suppliers’ and their circles are wide and often in perpetuity. Many simply don’t have a published policy.

Where do they store any data produced in the web session? Who may access it and use it for what purposes? Or how may they use the personal data associated with staff signing up with payment details?

According to research from London & Partners, championed by Boris Johnson, Martha Lane-Fox and others in EdTech, education is one of the fastest growing tech sectors in Britain and is worth £45bn globally; a number set to reach a staggering £129bn by 2020. And perhaps the EdTech diagrams in US dollars show where the UK plans to draw companies from. If you build it, they will come.

The enthusiasm of some US EdTech-type entrepreneurs I have met or listened to speak is akin to religious fervour. Such is their drive for tech, however, that they appear to forget that education is all about the child. Individual children. Not cohorts, or workforces. And even when they do remember, it can be said sincerely, but lacks substance when you examine policies in practice.

How is the DfE measuring the cost and benefit of tech and its applications in education?

Is anyone willing to say not all tech is good tech, not every application is a wise application? Because every child is unique, not every app is one size fits all?

My 7-year-old got so caught up in the game and in the mastery of the app her class was prescribed for homework in the past, that she couldn’t master the maths, and it harmed her confidence. (Imagine something like this: clicking on the two correct sheep with numbers stamped on them, that together add up to 12, for example, before they fall off and die.)

She has no problem with maths. Nor doing sums under pressure. She told me happily today she’d come joint second in a speed tables test. That particular app style simply doesn’t suit her.

I wonder if other children and parents find the same and if so, how would we know if these apps do more harm than good?

Nearly 300,000 young people in Britain have an anxiety disorder, according to the Royal College of Psychiatrists. Feeling watched all the time, on- and offline, is unlikely to make anxiety any better.

How can the public and parents know that edTech which comes into the home with their children, is behaviourally sound?

How can the public and parents know that edTech which affects their children, is ethically sound in both security and application?

Where is the measured realism in the providers’ and policy makers’ fervour, when both seek to marketise EdTech and our personal data for the good of the economy, and ‘in the public interest’?

Just because we can, does not always mean we should. Simply because data linkage is feasible, even if it brings public benefit, cannot point blank mean it will always be in our best interest.

In whose best interest is it anyway?

Right now, I’m not convinced that the digital policies at the heart of the Department for Education, the EdTech drivers, or many providers have our children’s best interests at heart at all. It’s all about the economy; when talking, if at all, about children using the technology, many talk only of ‘preparing the workforce’.

Are children and parents asked to consent at individual level to the terms and conditions of the company and told what data will be extracted from the school systems about their child? Or do schools simply sign up their children and parents en masse, seeing it as part of their homework management system?

How much ‘real’ personal data they use varies. Some use only pseudo-IDs assigned by the teacher. Others log, store and share everything children do, assigned to their ID or real email address, store performance over time, and provide personalised reports of results.

Teachers and schools have a vital role to play in understanding data ethics and privacy to get this right, and speaking to many, it doesn’t seem something they feel well equipped to do. Parents aren’t always asked. But should schools not always have to ask before giving data to a commercial third party, except in an ‘emergency’ situation?

I love tech. My children love making lego robots move with code. Or driving drones with bananas. Or animation. Technology offers opportunity for application in and outside schools for children that are fascinating, and worthy, and of benefit.

If, however, parents are to protect children’s digital identity for the future, and to be able to hand over control of and integrity over their personal data to them as adults, we must better accommodate children’s data privacy in this 2016 gold rush for EdTech.

Pupils and parents need to be assured their software is both educationally and ethically sound.  Who defines those standards?

Who is in charge of Driving, Miss Morgan?

Microsoft’s vice-president of worldwide education recently opened the BETT exhibition and praised teachers for using technology to achieve amazing things in the classroom, and urged innovators to “join hands as a global community in driving this change”.

While there is a case to say no exposure to technology in today’s teaching would be neglectful, there is a stronger duty to ensure exposure to technology is positive and inclusive, not harmful.

Who regulates that?

We are on the edge of an explosion of tech and children’s personal data ‘sharing’ with third parties in education.

Where is its oversight?

The community of parents and children are at real risk of being completely left out of these decisions, and exploited.

The upcoming “safeguarding” policies online are a joke if the DfE tells us loudly to safeguard children’s identity out front, and quietly gives their personal data away for cash round the back.

The front door to our children’s data “for government, educators, companies and investors from Britain and globally” is wide open.

Behind the scenes in pupil data privacy, it’s a bit of a mess. And these policy makers and providers forgot to ask first, if they could come in.

If we build it, would you come?

My question now is, if we could build something better on pupil data privacy AND better data use, what would it look like?

Could we build an assessment model of the collection, use and release of data in schools that could benefit pupils and parents, AND educational establishments and providers?

This could be a step towards future-proofing public trust, which will be vital for companies who want a foot in the door of EdTech. Design an ethical framework for digital decision making and a practical data model for use in education.

Educationally and ethically sound.

If together providers, policy makers, schools at group Trust level, could meet with Data Protection and Privacy civil society experts to shape a tool kit of how to assess privacy impact, to ensure safeguarding and freedoms, enable safe data flow and help design cybersecurity that works for them and protects children’s privacy that is lacking today, designing for tomorrow, would you come?

Which door will we choose?

*******

image credit: @ Ben Buschfeld Wikipedia

*added February 13th: Ofsted Chair sought from US

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (2)

“Children do not lose their human rights by virtue of passing through the school gates” (UN Committee on the Rights of the Child, General Comment on ‘The aims of education’, 2001).

The Digital Skills in Schools inquiry [1] is examining the gap in education of our children to enable them to be citizens fit for the future.

We have an “educational gap” in digital skills, and I have suggested it should not be seen only as functional or analytical: it should also address a gap in the ethical skills and framework needed to equip our young people to understand their digital rights, as well as their responsibilities.

Children must be enabled in education with opportunity to understand how they can grow “to develop physically, mentally, morally, spiritually and socially in a healthy and normal manner and in conditions of freedom and dignity”. [2]

Freedom to use the internet in privacy does not mean having to expose children to risks, but we should ask, are there ways of implementing practices which are more proportionate, and less intrusive than monitoring and logging keywords [3] for every child in the country? What problem is the DfE trying to solve and how?

Nicky Morgan’s “fantastic” GPS tracking App

The second technology tool Nicky Morgan mentioned in her BETT speech on January 22nd is an app with GPS tracking and alert creation. Her app verdict was “excellent” and “fantastic”:

“There are excellent examples at the moment such as the Family First app by Group Call. It uses GPS in mobile phones to help parents keep track of their children’s whereabouts, allowing them to check that they have arrived safely to school, alerting them if they stray from their usual schedule.” [4]

I’m not convinced tracking every child’s every move is either excellent or fantastic. Primarily because it will foster a nation of young people who feel untrusted, and I see a risk it could create a lower sense of self-reliance, self-confidence and self-responsibility.

Just as with the school software monitoring [see part one], there will be a chilling effect on children’s freedom if these technologies become the norm. If you fear misusing a word in an online search, or worry over stigma and what others think, would you not change your behaviour? Our young people need to feel both secure and trusted at school.

How we use digital in schools shapes our future society

A population that trusts one another and trusts its government and organisations and press, is vital to a well functioning society.

If we want the benefits of a global society, datasharing for example to contribute to medical advance, people must understand how their own data and digital footprint fits into a bigger picture to support it.

In schools today pupils and parents are not informed that their personal confidential data are given to commercial third parties by the Department for Education at national level [5]. Preventing public engagement, hiding current practices, and downplaying the risks of how data are misused also prevent fair and transparent discussion of its benefits and how to do it better. Better, like making data accessible only in a secure setting, not handing it out to Fleet Street.

For children this holds back public involvement in the discussion of the roles of technology in their own future. Fear of public backlash over poor practices must not hold back empowering our children’s understanding of digital skills and how their digital identity matters.

Digital skills are not shorthand for coding, but critical life skills

The skills our society will need must simultaneously manage the benefits to society and deal with the great risks that will come with advances in technology: advances in artificial intelligence, genomics, and autonomous robots, to select only three examples.

There is a glaring gap in their education: how their own confidential personal data and digital footprint fit into a globally connected society, and how they are used by commercial businesses and third parties.

There are concerns about how apps could be misused by others, too.

If we are to consider what is missing in our children’s preparations for life in which digital will no longer be a label but a way of life, then to identify the gap, we must first consider what we see as whole.

Rather than keeping children safe in education, as regards data sharing and digital privacy, the DfE seems happy to keep them ignorant. This is no way to treat our young people and develop their digital skills, just as giving their data away is not good cyber security.

What does a Dream for a great ‘digital’ Society look like?

Had Martin Luther King lived to be 87 he would have continued to inspire hope and to challenge us to fulfill his dream for society – where everyone would have an equal opportunity for “life, liberty and the pursuit of happiness.”

Moving towards that goal, supported with technology, with ethical codes of practice, my dream is we see a more inclusive, fulfilled, sustainable and happier society. We must educate our children as fully rounded digital and data savvy individuals, who trust themselves and systems they use, and are well treated by others.

Sadly, the introduction of these types of freedom-limiting technologies for our children risks instead a society in which many people do not feel comfortable, one that has lost sight of the value of privacy.

References:

[1] Digital Skills Inquiry: http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/inquiries/parliament-2015/digital-skills-inquiry-15-16/

[2] UN Convention of the Rights of the Child

[3] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (paras 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

[4] Nicky Morgan’s full speech at BETT

[5] The defenddigitalme campaign to ask the Department for Education to change practices and policy around the National Pupil Database


Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children’s individual actions; one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately for this consultation there is no supporting code of practice setting out what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools.” “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place, they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: “The Opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24 is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?


Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our schools cover a wide range of children aged 2-19, and that not every moment in school is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring would be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK. We hear little about its use in practice.

The cases of cyber bullying or sexting in schools that I hear of locally, or read about in the press, happen through smartphones. Unless the school snoops on individual devices, I wonder therefore whether the cost, implementation and impact are proportionate to the benefit.

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: What problem is it trying to solve and how will they measure if it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting and understand who they are? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country, how is cost balanced against risk?
  8. Intangible costs – has anyone asked our children’s feelings on this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we creating a solution that solves, or creates, a problem?

The private providers have no incentive to say their reports don’t work, and schools, legally required to be risk-averse, would be unlikely to say stop if there is no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention” and “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multi-purpose, and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school, what are we doing to fix the causes, not simply spy on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust of the system and of teaching staff, and on their ability to look for content to support their own sensitive concerns or development, content they may not feel safe looking for at home? What limitation will that put on their creativity?

These are all questions that should be asked if we are to thoroughly understand the consultation, and they require wide public examination.

Since key logging is already common practice (according to provider websites) and will effectively become statutory in practice, where is the public discussion? If it’s not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?
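To make concrete what that key logging and keyword matching amounts to, here is a minimal, purely illustrative sketch. The keyword list, names and matching logic below are my own invention, not any real provider’s product; provider sites describe only “keyword detection libraries” producing “a complete log of all online activity”:

```python
# Hypothetical sketch of keyword-based monitoring. Everything here is
# invented for illustration and describes no real product.
from datetime import datetime, timezone
from typing import Optional

WATCHLIST = {"bomb", "suicide", "drugs"}  # an invented "detection library"

def flag_activity(user: str, query: str) -> Optional[dict]:
    """Log an 'incident' if any watchlist word appears in a search query."""
    hits = [word for word in WATCHLIST if word in query.lower()]
    if not hits:
        return None
    return {
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "query": query,    # the pupil's full search is retained
        "matched": hits,   # matched words, with no sense of context or intent
    }

# A history essay query and a worrying one produce identical 'incidents':
print(flag_activity("pupil_042", "why were suffragette bomb plots reported?"))
print(flag_activity("pupil_042", "best revision guide for GCSE history"))
```

Even this toy version shows the problem: matching has no context, so legitimate schoolwork is flagged alongside anything genuinely concerning, and every flagged query is retained as data about a named child.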

What exactly might this part of the new guidance mean for pupils?

In part two, I look at the other part of her speech, the GPS app, and ask: if we track every child inside and outside school, are we moving closer to the digital dream, or the nightmare, in the search to close the digital skills gap?

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (paras 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that the ‘system’ to be ‘in place’ should be computer-based. How will it be applied in practice? A number of the software providers for schools already provide services that include key logging, using “keyword detection libraries” that provide “a complete log of all online activity”.

(*It’s hard to read more, as many of NEN’s links are dead.)

Access to school pupil personal data by third parties is changing

The Department for Education [DfE] has lost control of who can access our children’s identifiable school records, by giving individual and sensitive personal data out to a range of third parties since government changed policy in 2012. It looks now as if they’re panicking about how to fix it.

Applicants wanting identifiable and/or sensitive children’s personal data from the National Pupil Database must now first undergo the lowest level of criminal record check, a basic DBS check, as part of the access process.

Schools Week wrote about it and asked for comment on the change [1] (as discussed by Owen in his blog [2] and our tweets).

At first glance it sounds like a great idea, but what real difference will this make to who can receive 8 million school pupils’ data?

Yes, you did read that right.

The National Pupil Database gives away the personal data of eight million children, aged 2-19. It gives data away outside its own protection, because users get sent raw data to their own desks. [3]

It would be good to know that people receiving your child’s data had never been cautioned or convicted over something related to children, right?

Unfortunately, this DBS check won’t tell the Department for Education (DfE) that – because it’s the basic £25 DBS check [4], not the full version.

So this change seems less about keeping children’s personal data safe than about being seen to do something. Anything. Anything but the thing that needs to be done, which is to keep the data secure.

Why is this not a brilliant solution? 

Moving towards the principle of keeping the data more secure is right, but in practice the DBS check is only useful if it makes data safer, by stopping people receiving data and reducing the risks associated with data misuse. So how will this DBS check achieve that? It’s not designed for people who handle data. It’s designed for people working with children.

There is plenty of evidence, often in the news, of data used inappropriately for commercial purposes, through inappropriate storage and sharing as well as malicious breaches. I am not aware (and refer to this paper [5]) of risks realised through malicious misuse of data accessed for academic purposes in safe settings, though mistakes do happen through inappropriate processes, human error and misjudgement.

However, a background check is not necessary for its own sake. What is necessary is to know that users handle children’s data securely and appropriately, with transparent oversight. There is no suggestion at all that people at TalkTalk were abusing data, but their customers’ data were not secure, and those data held in trust are now being misused.

The risk is harm to a high number of individuals if bulk personal data are not securely managed. Measures to make them secure must be proportionate to that risk. [6]

Coming back to what this will mean for individual applicants and its purpose: Basic Disclosure contains only convictions considered unspent under The Rehabilitation of Offenders Act 1974. [7]

The absence of a criminal record does not mean data are securely stored or appropriately used by the recipient.

The absence of a criminal record does not mean data will not be forwarded to another, undisclosed recipient, without any way for the DfE to ever know it happened.

The absence of a criminal record showing up on the basic DBS check does not even prove that the person has no previous conviction related to misuse of people or of data. Anything you might consider ‘relevant’ to children, for example, may have expired.
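To illustrate that gap, here is a minimal sketch, with invented records and field names, of what Basic Disclosure logic amounts to: it filters out anything spent under the 1974 Act, so a “clear” result says nothing about spent history, let alone about data-handling competence:

```python
# Illustrative only: invented records showing why a basic check can come
# back "clear" despite a relevant, but spent, conviction.
from dataclasses import dataclass

@dataclass
class Conviction:
    offence: str
    spent: bool  # spent status depends on the sentence and time elapsed

def basic_disclosure(record: list[Conviction]) -> list[Conviction]:
    """A basic check reveals only unspent convictions, nothing more."""
    return [c for c in record if not c.spent]

applicant_record = [Conviction("unlawful disclosure of personal data", spent=True)]
print(basic_disclosure(applicant_record))  # [] -- reported "clear"
```

A full or enhanced check would reveal more, but even that tests a person’s past, not whether their desk, laptop or onward sharing is secure.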

So for these reasons, I disagree that the decision to require a basic DBS check is worthwhile. Why? Because it’s effectively meaningless and doesn’t solve the problem, which is this:

Anyone can apply for 8m children’s personal data, and as long as they meet certain purpose and application criteria, they get sent sensitive and identifiable children’s data to their own setting. And they do. [8]

“Anyone” whom the legislation, designed in 2009, defines as a prescribed person or researcher has come to include journalists, for example BBC Newsnight, or Fleet Street papers. Is it right that journalists can access my children’s data, while as pupils and parents we cannot, and we are not even informed? Clearly not.

It would be foolish to be reassured by this DBS check. The DfE is kidding itself if it thinks this is a workable or useful solution.

This step is simply a tick box, and it won’t stop the DfE regularly giving away eight million children’s individual-level and sensitive records.

What problem is this trying to solve and how will it achieve it?

Before panicking to implement a change, the DfE should first answer:

  • who will administer and store potentially sensitive records of criminal convictions, even if unrelated to data?
  • what implications does this have for other government departments handling individual personal data?
  • why are 8m children’s personal and sensitive data given away ‘into the wild’ beyond DfE oversight in the first place?

Until the DfE properly controls the individual personal data flowing out from the NPD, from multiple locations, in raw form, and its governance, it makes little material difference whether the named user is shown to have, or not to have, a previous criminal record [9], because the DfE has no idea whether they are the only person who uses the data.

The last line from the DfE in the article is interesting: “it is entirely right that we continue to make sure that those who have access to it have undergone the necessary background checks.”

Continue, from not doing it before? Tantamount to a denial of change, to avoid scrutiny of the past and the status quo? They have no idea who has “access” to our children’s data today after they have released it, except on paper and trust, as there’s no audit process. [10]

If this is an indicator of the transparency and the type of wording the DfE wants to use to communicate with schools, parents and pupils, I am concerned. Instead we need to see full transparency, an assessment of privacy impact and a public consultation on coordinated changes.

Further, if I were an applicant, I’d be concerned that a DfE which currently handles sensitive pupil data poorly now wants to collect more sensitive data of mine.

In summary: because of a change in Government policy in 2012, and the way it is carried out in practice, the Department for Education [DfE] has lost control of who can access our 8m children’s identifiable school records. Our children deserve proper control of their personal data and proper communication about who can access it and why.

Discovering through FOI [11] the sensitivity and volume of identifiable data to which journalists are being given access shocked me. Discovering that schools and parents have no idea about it did not.

This is what must change.

*********

If you have questions or concerns about the National Pupil Database, or about your own or your child’s data used in schools, please feel free to get in touch, and let’s see if we can make this better: using our data well, with informed public support and public engagement.

********

References:
[1] National Pupil Database: How to apply: https://www.gov.uk/guidance/national-pupil-database-apply-for-a-data-extract

[2] Blogpost: http://mapgubbins.tumblr.com/post/132538209345/no-more-fast-track-access-to-the-national-pupil

[3] Which third parties have received data since 2012 (Tier 1 and 2 identifiable, individual and/or sensitive): release register https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] The Basic statement content http://www.disclosurescotland.co.uk/disclosureinformation/index.htm

[5] Effective Researcher management: 2009 T. Desai (London School of Economics) and F. Ritchie (Office for National Statistics), United Kingdom http://www.unece.org/fileadmin/DAM/stats/documents/ece/ces/ge.46/2009/wp.15.e.pdf

[6] TalkTalk is not the only recent significant data breach of public trust. An online pharmacy that sold details of more than 20,000 customers to marketing companies has been fined £130,000 https://ico.org.uk/action-weve-taken/enforcement/pharmacy2u-ltd/

[7] Guidance on the Rehabilitation of Offenders Act 1974 https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/299916/rehabilitation-of-offenders-guidance.pdf

[8] the August 2014 NPD application from BBC Newsnight https://www.whatdotheyknow.com/request/293030/response/723407/attach/10/BBC%20Newsnight.pdf

[9] CPS Guidelines for offences involving children https://www.sentencingcouncil.org.uk/wp-content/uploads/Final_Sexual_Offences_Definitive_Guideline_content_web1.pdf

[10] FOI request https://www.whatdotheyknow.com/request/pupil_data_application_approvals#outgoing-482241

[11] #saveFOI – I found out exactly how many requests had been fast tracked and not scrutinised by the data panel via a Freedom of Information Request, as well as which fields journalists were getting access to. The importance of public access to this kind of information is a reason to stand up for FOI  http://www.pressgazette.co.uk/press-gazette-launches-petition-stop-charges-foi-requests-which-would-be-tax-journalism

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex. Too little, too late.

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex ed. It should be discussing prevention, and personal data protection for all our personal data, not just one company’s, after the event.

Everyone’s been talking about TalkTalk, and for all the wrong reasons: data loss and a 15-year-old suspect, combined with a reportedly reckless response to data protection, compounded by a lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the “National Cyber Security Programme” [4] (NCSP). What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way across the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly by not making preventative changes to problems apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government expects of commercial companies as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in handling the public’s personal data. Will members of government act in a responsible manner, or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where data are wanted for purposes beyond those we expect or that were explained when we submitted our data, and there needs to be a change in risky behaviour in physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their own data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply the same logic to government handling of data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others, and the rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own comparable lack of data understanding, and about what good practice is in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO, now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st-century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which data were given, as care.data, for example, plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers‘ in The Shakespeare Review, have been ignored by government.

Where our personal data are not used well by government departments themselves, they seem content, to date, to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because you pass data on to others. Practices such as giving individual-level, identifiable personal data to third parties without informing the public or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data practices, the care.data debacle is evidence that not all its MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, or on others getting access to it via government for purposes that were never intended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], I asked civil servants announcing big upcoming data plans, linking school data with more further education and employment data, how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? That is inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares the identifiable, individual and sensitive records of at least 8m children, aged 2-19. That’s twice the size the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand emphasis on prevention measures in the use of our personal data.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance in any of its many forms is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection: we don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, waiting to see whether our data are used safely in unsecured settings by the unknown partners with whom data might be onwardly shared, hoping we won’t find out, so that it won’t need to talk about it or have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk are low compared with elsewhere. As John Nicolson (East Dunbartonshire, SNP) pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing a sample of individual-level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data; the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual-level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] http://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Free School Meals: A political football and the need for research to referee

I wrote this post in July 2014, before the introduction of the universal infant free school meals programme (UIFSM) and before I put my interest in data to work. Here’s an updated version: my opinion on why I feel it is vital that public health and socio-economic research should create an evidence base that justifies or refutes policy.

I wondered last year whether our children’s health and the impact of UIFSM were simply a political football, given as a concession in the last Parliament and rushed through to get checked off, without being properly checked out first.

How is UIFSM Entitlement Measured and What Data do we Have?

I have wondered over this year how the new policy, which labels more children as entitled to free school meals, may affect public health and social research.

The Free School Meal (FSM) indicator has been commonly used as a socio-economic indicator.

In fact, there is still a practical difference within the ‘free school meals’ label.

In my county, West Sussex, those entitled to FSM beyond infants must actively register online. Although every child in Reception, Year 1 and Year 2 is automatically entitled to UIFSM, parents in receipt of qualifying state income benefits must still actively register with the county for an FSM eligibility check, so that schools receive the Pupil Premium. It is strange to have to register for ‘Free School Meals’ when others need not, given the automatic entitlement in infants; it should probably be called ‘sign up for Pupil Premium’, since it benefits the school budget and, one hopes, the child, with support or services they would not otherwise get.

Registering for a free school meal eligibility check could raise an extra grant of £1,320 per year, per child, for the child’s primary school, or £935 per child for secondary schools, to fund valuable support like extra tuition, additional teaching staff or after school activities. [source]

Researchers will need to give up the FSM indicator as an adopted socio-economic measure in age groups under 8. Over 8 (once children leave infants), only those entitled due to welfare status and actively registered will carry the FSM label. Comparative research can only use Pupil Premium status instead, but as the benefits which permit applying for it have changed too, comparison will be hard. This is an obvious and important change to remember when measuring the effects the policy change has had.
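A minimal sketch of the measurement problem (the field names and logic below are mine, for illustration only): under UIFSM every infant gets the meal, so a simple meal flag no longer separates disadvantaged infants from the rest, leaving only actively registered Pupil Premium status as a signal:

```python
# Illustration of why the FSM indicator breaks as a socio-economic proxy
# for under-8s after UIFSM. Field names and logic are invented.
def has_free_meal(year_group: int, registered_for_pp: bool) -> bool:
    infant = year_group <= 2  # Reception, Year 1, Year 2 are auto-entitled
    return infant or registered_for_pp

# Two Year 1 pupils: one low-income family registered, one never did.
print(has_free_meal(1, True))   # True
print(has_free_meal(1, False))  # True -- indistinguishable in the meal data

# The school's funding, though, depends entirely on registration
# (per-child figures as quoted above):
PUPIL_PREMIUM_PRIMARY = 1320  # £ per registered child per year
print(f"£{PUPIL_PREMIUM_PRIMARY * 25:,} lost if 25 eligible infants go unregistered")
```

So any comparative research on under-8s has to rely on the Pupil Premium registration field, which undercounts wherever sign-up is patchy.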

One year on, I’d also like to understand how research may capture the changes of children’s experience in reality.

There are challenges in this; not least getting hold of the data. Given that private providers may not all be open to provision of information, do not provide data as open data, and separately, are not subject to the Freedom of Information Act, we may not be able to find out the facts around the changes and how catering meets the needs of some of our youngest children.

If it can be hard to access information held by private providers, it can be even harder to do research in the public interest using information about them. In my local area Capita manages a local database and the meal providers are private companies (no longer staff directly employed by, and accountable to, schools as once was the case).

[updated Aug 30, HT Owen Boswarva for the link to the Guardian article of March 2015 reporting examples where this has cut Pupil Premium uptake]

Whom does it benefit most?

Quantity or Quality and Equality?

In last year’s post I considered food quality and profit for the meal providers.

I would now be interested to see research on what changes, if any, there have been in the profits and costs of school meal providers since the UIFSM introduction, and what benefits we see for them compared with those for children.

4 in 10 children are classed as living in poverty but may not meet welfare benefit criteria, according to Nick Clegg on LBC on Sept 5th 2014. That was a scandalous admission of whole social system failure on child poverty. Hats off to the nine-year-old who asked good questions last year.

The entitlement is also not applied to all primary children equally, but to infants only. So within one family, some children are now entitled and others are not.

I wonder if this has reshaped family evening meals for those who do not quite qualify for FSM, where now one child has already ‘had a hot meal today’ and the others have not.

The whole programme of child health in school is not only unequal in application to children by age, but is not made to apply to all schools equally.

Jamie Oliver did his darnedest to educate and bring in change, showing school meals needed improvement in quality across the board. What has happened to those quality improvements he championed? Abandoned, at least in free schools, which are exempt from national standards. [update: Aug 25, his recent comment]

There is clearly need when so many children are growing up in an unfairly distributed society of haves and have-nots, and the gap seems ever wider. Is Jamie right that in England eating well is a middle class concern? Is it impossible in this country to eat cheaply and eat well?

In summary, I welcome anything that will help families feed their children well. But do free school dinners necessarily mean good nutrition? The work by the Trussell Trust and others shows what desperate measures are needed to help the children who need it most, and simply ‘a free school meal’ is not necessarily a ticket to good food without rigorous application and monitoring of standards, including reviewing in schools what is offered versus what children actually eat from the offering.

Where is the analysis for people-based policy that will tackle the causes of need, and assess whether those needs are being met?

Evidence-based understanding

It appears there were pilots and trials, but we hadn’t heard much about them before September 2014. I agreed with the then MP David Laws on the closure of school kitchens, but from my own experience the UIFSM programme lacked adequate infrastructure and education before it began.

Mr. Laws MP said,

“It is going to be one of the landmark social achievements of this coalition government – good for attainment, good for health, great for British food, and good for hard working families. Ignore the critics who want to snipe from the sidelines.”

I don’t want to be a critic from the sidelines. I’d like to be an informed citizen and a parent, and to know that this programme brought in good food for good health. Good for every child; but I’d like to know it brought the necessary change for the children who really needed it. [Ignoring his comment on hard-working families, which indicates some sort of value judgement and is out of place.]

Like these people and their FOIs, I want to ask and understand. Will this have a positive effect on the nutrition children get, which may be inadequate today?

How will we measure if UIFSM is beneficial to children who need it most?

Data used well gives insights into society that researchers should use to learn from and make policy recommendations.

The data from the meal providers and the data on UIFSM indicators, as well as Pupil Premium, need to be looked at together. That won’t be easy.

What is accessible is the data held by the DfE, but that may also be “off” for true comparison, because active sign-up is reportedly patchy.

Data on individual pupils need to be used with great care, due to these measurement changes in practice as well as their sensitivity. To measure whether the policy is working needs careful study, accounting for all the different factors that changed at the same time. The NPD tracks Pupil Premium, but has patchy uptake affected the numbers too much for it to be a useful comparator?

Using this administrative data, as aggregated and open data and at other more detailed levels, for bona fide research is vital to understand whether policies work. The use of administrative data for research in the public interest has widespread public support, as long as it is done well and not for commercial use.

To make the data more usefully available, and as I posted previously, I believe the Department for Education should shape up its current practices, in its capacity as the data processor and controller of the National Pupil Database, to be fit for the 21st century, if it is to meet public expectations of how this should be done.

Pupils and parents should be encouraged to become more aware about information used about them, in the same way that the public should be encouraged to understand how that information is being used to shape policy.

At the same time as access to state-held data could be improved, we should also demand that access to information for public health and social benefit be required from private providers. Public researchers must be prepared to stand up and defend this need, especially at a time when Freedom of Information is under threat; it should in fact be expanded to cover private providers like these, not restricted further.

Put together, these data in secure settings with transparent oversight could be invaluable in the public interest. Being seen to do things well, and seeing public benefits from the data, will also future-proof the public trust that is vital to research. It could be better for everyone.

So how and when will we find out how the UIFSM policy change made a difference?

What did UIFSM ever do for us?

At a time when so many changes have taken place around child health, education, poverty and its measurement, it is vital that public health and socio-economic research creates an evidence base that justifies or refutes policy.

In some ways, neutral academic researchers play the role of referee.

There are simple practical things which UIFSM policy ignores, such as that four-year-olds starting school usually begin on packed lunch only for a half term, to get to grips with the basics of school without having to manage trays and get help cutting up food, and that the time they need for a hot meal is longer than for a packed lunch. How these things have affected starting school is intangible.

Other, tangible concerns need more attention. Many have been reported in drips of similar feedback, such as reduced school hall and gym access affecting all primary-age children (not only infants), because the space is needed for longer due to the increase in numbers eating hot meals.

Research to understand the availability of facilities and time spent on sport in schools since the introduction of UIFSM will be interesting to look at together with child obesity rates.

The child poverty measurements also moved this year. How will this influence our perception of poverty and policies that are designed to tackle it?

Have we got the data to analyse these policy changes? Have we got analysis of the policy changes to see if they benefit children?

As a parent and citizen, I’d like to understand who positions the goalposts in these important public policies and why.

And who is keeping count of the score?

****

image source: The Independent

refs: Helen Barnard, JRF. http://www.jrf.org.uk/blog/2015/06/cutting-child-benefit-increasing-free-childcare-where-poverty-test