Category Archives: privacy

Commission on Freedom of Information: submission

Since it appears that the Independent Commission on Freedom of Information [FOI] has not published all of the received submissions, I thought I’d post what I’d provided via email.

I answered two of the questions with case studies: the first on the application of the section 35 and 36 exemptions and the “safe space”; the second on the proposal for potential charges.

On the Commission website, tab 2 of the Excel spreadsheet of evidence submitted online notes that NHS England belatedly asked for its submission to be unpublished.

I wonder why.

Follow-ups to both these FOI requests are now long overdue in 2016. The first, from NHS England, concerns the care.data decision making behind the 2015 decision not to publish a record of whether parts of the board meetings were to be secret. Transparency needs to be seen in action to engender public trust. After all, they’re deciding things like how care.data and genomics will be at the “heart of the transformation of the NHS.”

The second is overdue at the Department for Education, on the legal basis for identifiable sensitive data releases from the National Pupil Database that meets Schedule 3 of the Data Protection Act 1998, so as to permit this data sharing with commercial third parties.

Both are in line with the apparently recommended use of FOI, according to Mr. Grayling, who most recently said:

“It is a legitimate and important tool for those who want to understand why and how Government is taking decisions and it is not the intention of this Government to change that”.  [Press Gazette]

We’ll look forward to seeing whether that final sentence is indeed true.

*******

Independent Commission on Freedom of Information Submission
Question 1: a) What protection should there be for information relating to the internal deliberations of public bodies? b) For how long after a decision does such information remain sensitive? c) Should different protections apply to different kinds of information that are currently protected by sections 35 and 36?

A “safe space” in which to develop and discuss policy proposals is necessary. I can demonstrate where it was [eventually] used well, in a case study of a request I made to NHS England. [1]

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. In October 2014 I asked NHS England to publish the care.data planning and decision making for the national NHS patient data extraction programme. This programme has been controversial [2]. It comes at great public expense and to date has been harmful to public and professional trust, with no public benefit. [3]

NHS England refused my request based on section 22 [intended for future publication]. [4] However, ten months later the meeting minutes had still not been published. In July 2015, after appeal, the Information Commissioner issued an Information Notice, and in August 2015 NHS England published sixty-three sets of minutes and papers.

In these released documents the section 36 exemption was applied to only a tiny handful of redacted comments. This was sufficient to protect the decisions that NHS England felt to be most sensitive, yet still enabled the release of a year’s worth of minutes.

Transparency does not mean that difficult decisions cannot be debated since only outcomes and decisions are recorded, not every part of every discussion verbatim.

The current provision for safe space using these exemptions is effective, and in this case would have been no different whether applied immediately after the meetings or one and a half years later. If anything, earlier publication may have resulted in better informed policy and decision making, through wider involvement from professionals and civil society. The secrecy in the decision making did not build trust.

When policies such as these are found to have no financial cost-benefit case, for example, I believe it is strongly in the public interest to have transparency of those facts, and to scrutinise policy governance so that early intervention is possible when it is seen to be necessary.
In the words of the Information Commissioner:

“FOIA can rightly challenge and pose awkward questions to public authorities. That is part of democracy. However, checks and balances are needed to ensure that the challenges are proportionate when viewed against all the other vital things a public authority has to do.

“The Commissioner believes that the current checks and balances in the legislation are sufficient to achieve this outcome.” [5]

Given that most public bodies, including NHS England’s Board, routinely publish their minutes, this would seem standard good practice to be expected, and I believe routine publication of meeting minutes would have raised the trustworthiness of the programme and its oversight and leadership.

The same section 36 exemption could have been applied from the start to the small redactions that were felt necessary, balanced against the public interest in open and transparent decision making.

I do not believe more restrictive applications should be made than are currently under sections 35 and 36.

_____________________________________________________________________

Question 6: Is the burden imposed on public authorities under the Act justified by the public interest in the public’s right to know? Or are controls needed to reduce the burden of FoI on public authorities?

As an individual I made 40 requests of schools and two of the Department for Education, which may now result in benefits for 8 million children and their families, as well as future citizens.

The transparency achieved through these Freedom of Information requests will, I hope, soon transform the culture at the Department for Education from one of secrecy to one of openness.

There is the suggestion that a Freedom of Information request would incur a charge to the applicant.

I believe that the benefits of the FOI Act in the public interest outweigh the cost of FOI to public authorities. In this second example [6], I would ask the Commission to consider: if I had not been able to make these Freedom of Information requests due to cost, and had therefore been unable to present evidence to the Minister, the Department and the Information Commissioner, would the panel members support the secrecy around the ongoing risk that current practices pose to children and our future citizens?

Individual, identifiable and sensitive pupil data are released to third parties from the National Pupil Database without the knowledge or consent of pupils, parents and schools. This Department for Education (DfE) FOI request aimed to obtain an understanding of any due diligence and of the release process: the privacy impact and DfE decision making, with a focus on its accountability.

This was to enable transparency and scrutiny in the public interest, to increase the understanding of how our nation’s children’s personal data are used by government, commercial third parties, and even identifiable and sensitive data given to members of the press.

Chancellor Mr. Osborne spoke on November 17 about the importance of online data protection:

“Each of these attacks damages companies, their customers, and the public’s trust in our collective ability to keep their data and privacy safe.”[…] “Imagine the cumulative impact of repeated catastrophic breaches, eroding that basic faith… needed for our online economy & social life to function.”

Free access to FOI enabled me as a member of the public to ask and take action with government and get information from schools to improve practices in the broad public interest.

If there were a cost to this process I could not have afforded to ask schools to respond. Schools are managed individually, and as such I asked each school whether it was aware of the National Pupil Database and of how the Department shares its pupils’ data onwards with third parties.

I asked a range of schools in the South and East. In order to give a fair picture of more than one county, I made requests of a range of types of school – from academy trusts to voluntary controlled schools – 20 primary and 20 secondary. Given the number and range of schools in England and Wales [7], this was a small sample.

Building even a small representative picture of pupil data privacy arrangements in the school system therefore required a separate request to each school.

I would not have been able to do this, had there been a charge imposed for each request.  This research subsequently led me to write to the Information Commissioner’s Office, with my findings.

Were access costs to make this a process that only organisations or the press could afford to enter into, the public would only be able to find out what mattered, or was felt important, to those organisations, but not what matters to individuals.

However what matters to one individual might end up making a big difference to many people.

Individuals may be interested in what are seen as minority topics, perhaps related to discrimination by gender, sexuality, age, disability, class, race or ethnicity. If individuals cannot afford to challenge government policies that matter to them, we may lose the benefit they can bring when they go on to champion the rights of many more people in the country as a whole.

Eight million children’s records, from children aged 2-19, are stored in the National Pupil Database. I hope that, as a result of the FOI requests, increased transparency and better practices will help restore data protections for individuals and also re-establish organisational trust in the Department.

Information can be used to enable or constrain citizenship. In order to achieve universal access to human rights that support participation, transparency and accountability, I ask that the Commission recognise the need for individuals to be able to tackle vested interests and unjust laws and policies.

Any additional barrier, such as cost, only serves to reduce equality and make society less just. There is, however, an immense intangible value in an engaged public, which is hard to measure. People are more likely to be supportive of public servants’ decision making if they are not excluded from it.

Women, for example, are underrepresented in Parliament and therefore in public decision making. Further, the average pay gap within the EU is 16 per cent, but pay levels throughout Europe differ hugely, and in the South East of the UK men earn 25 per cent more than their female counterparts. [8] Women and mothers like me may therefore find it more difficult to participate in public life and to make improvements on behalf of other families and children across the country.

To charge for access to information about our public decision making process could therefore be excluding and discriminatory.

I believe these two case studies show that the Act’s intended objectives, on parliamentary introduction — to ‘transform the culture of Government from one of secrecy to one of openness’; ‘raise confidence in the processes of government, and enhance the quality of decision making by Government’; and to ‘secure a balance between the right to information…and the need for any organisation, including Government, to be able to formulate its collective policies in private’ — work in practice.

If anything, they need to be strengthened to ensure accessibility.

Any action to curtail free and equal access to these kinds of information would not be in the public interest, and would be a significant threat to the equality of opportunity offered to the public in making requests. Charging would particularly restrict access to FOI for poorer individuals and communities, who are often those already excluded from full participation in public life.
___________________________________________________________________________

[1] https://www.whatdotheyknow.com/request/caredata_programme_board_minutes
[2] http://www.theguardian.com/society/2014/dec/12/nhs-patient-care-data-sharing-scheme-delayed-2015-concerns
[3] http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers
[4] https://jenpersson.com/wp-content/uploads/2015/11/caredataprogramme_FOI.pdf
[5] https://ico.org.uk/media/about-the-ico/consultation-responses/2015/1560175/ico-response-independent-commission-on-freedom-of-information.pdf
[6] https://jenpersson.com/wp-content/uploads/2015/11/NPD_FOI_submissionv3.pdf
[7] http://www.newschoolsnetwork.org/sites/default/files/Comparison%20of%20school%20types.pdf
[8] http://www.equalpayportal.co.uk/statistics/

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (2)

“Children do not lose their human rights by virtue of passing through the school gates” (UN Committee on the Rights of the Child, General Comment on ‘The aims of education’, 2001).

The Digital Skills in Schools inquiry [1] is examining the gap in our children’s education that must be closed to enable them to be citizens fit for the future.

We have an “educational gap” in digital skills, and I have suggested it should not be seen as only functional or analytical: it should also address a gap in the ethical skills and framework needed to equip our young people to understand their digital rights, as well as their responsibilities.

Children must be given the opportunity in education to understand how they can grow “to develop physically, mentally, morally, spiritually and socially in a healthy and normal manner and in conditions of freedom and dignity”. [2]

Freedom to use the internet in privacy does not mean having to expose children to risks, but we should ask: are there ways of implementing practices that are more proportionate, and less intrusive, than monitoring and logging keywords [3] for every child in the country? What problem is the DfE trying to solve, and how?

Nicky Morgan’s “fantastic” GPS tracking App

The second technology tool Nicky Morgan mentioned in her BETT speech on January 22nd is an app with GPS tracking and alert creation. Her verdict on the app was “excellent” and “fantastic”:

“There are excellent examples at the moment such as the Family First app by Group Call. It uses GPS in mobile phones to help parents keep track of their children’s whereabouts, allowing them to check that they have arrived safely to school, alerting them if they stray from their usual schedule.” [4]
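The alerting described would presumably rest on a simple geofence check: compare each GPS fix against an expected location and raise an alert when the distance exceeds a threshold. Here is a minimal sketch of that logic (the coordinates, radius and function names are hypothetical; Group Call has not published how Family First actually works):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_position(fix, expected, radius_m=250):
    """Return an alert string if a GPS fix strays outside the expected
    geofence, or None if the child is where the schedule says."""
    d = haversine_m(fix["lat"], fix["lon"], expected["lat"], expected["lon"])
    if d > radius_m:
        return f"Alert: {d:.0f} m from expected location"
    return None

# Hypothetical example: a child expected at school.
school = {"lat": 51.5010, "lon": -0.1246}
print(check_position({"lat": 51.5011, "lon": -0.1247}, school))  # None
print(check_position({"lat": 51.5100, "lon": -0.1000}, school))  # prints an alert
```

Even this toy version makes the privacy trade-off concrete: for any alert to fire, a continuous stream of timestamped location fixes for the child has to be collected and compared somewhere.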

I’m not convinced tracking every child’s every move is either excellent or fantastic. Primarily because it will foster a nation of young people who feel untrusted, and I see a risk it could create a lower sense of self-reliance, self-confidence and self-responsibility.

Just as with the school monitoring software [see part one], there will be a chilling effect on children’s freedom if these technologies become the norm. If you fear misusing a word in an online search, or worry over stigma and what others might think, would you not change your behaviour? Our young people need to feel both secure and trusted at school.

How we use digital in schools shapes our future society

A population that trusts one another, and trusts its government, organisations and press, is vital to a well-functioning society.

If we want the benefits of a global society – data sharing, for example, to contribute to medical advances – people must understand how their own data and digital footprint fit into the bigger picture that supports it.

In schools today pupils and parents are not informed that their personal confidential data are given to commercial third parties by the Department for Education at national level [5]. Preventing public engagement, hiding current practices and downplaying the risks of how data are misused also prevent fair and transparent discussion of the benefits and of how to do it better. Better, like making data accessible only in a secure setting, not handing it out to Fleet Street.

For children this holds back public involvement in the discussion of the roles of technology in their own future. Fear of public backlash over poor practices must not hold back empowering our children’s understanding of digital skills and how their digital identity matters.

Digital skills are not shorthand for coding, but critical life skills

The skills our society will need must simultaneously manage the benefits to society and the great risks that will come with advances in technology: advances in artificial intelligence, genomics and autonomous robots, to select only three examples.

There is a glaring gap in their education about how their own confidential personal data and digital footprint fit into a globally connected society, and how they are used by commercial businesses and third parties.

There are concerns about how apps could be misused by others too.

If we are to consider what is missing in our children’s preparation for a life in which digital will no longer be a label but a way of life, then to identify the gap we must first consider what we see as the whole.

Rather than keeping children safe in education, as regards data sharing and digital privacy, the DfE seems happy to keep them ignorant. This is no way to treat our young people and develop their digital skills, just as giving their data away is not good cyber security.

What does a Dream for a great ‘digital’ Society look like?

Had Martin Luther King lived to be 87 he would have continued to inspire hope and to challenge us to fulfill his dream for society – where everyone would have an equal opportunity for “life, liberty and the pursuit of happiness.”

Moving towards that goal, supported by technology and ethical codes of practice, my dream is that we see a more inclusive, fulfilled, sustainable and happier society. We must educate our children as fully rounded, digital- and data-savvy individuals, who trust themselves and the systems they use, and are well treated by others.

Sadly, the introduction of these kinds of freedom-limiting technologies for our children risks instead a society in which many people do not feel comfortable, one that has lost sight of the value of privacy.

References:

[1] Digital Skills Inquiry: http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/inquiries/parliament-2015/digital-skills-inquiry-15-16/

[2] UN Convention on the Rights of the Child

[3] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

[4] Nicky Morgan’s full speech at BETT

[5] The defenddigitalme campaign to ask the Department for Education to change practices and policy around the National Pupil Database


Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children’s individual actions: one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately for this consultation there is no supporting code of practice explaining what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools.” “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: “The Opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24  is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?

 

Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our children aged 2-19 attend a wide range of schools, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring would be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK.  We hear little about its applied use.

The cases of cyber bullying or sexting in schools that I hear of locally, or read of in the press, happen through smartphones. Unless the school snoops on individual devices, I therefore wonder whether the cost, implementation and impact are proportionate to the benefit.

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: What problem is it trying to solve and how will they measure if it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting, and how can parents understand who these providers are? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs – has anyone asked our children’s feeling on this? Where is the boundary between what is constructive and creepy? Is scope change documented if they decide to collect more data?

Are we Creating a Solution that Solves or creates a Problem?

The private providers would have no incentive to say their reports don’t work, and schools, legally required to be risk averse, would be unlikely to say stop even if there were no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention”, and note that “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multi-purpose, and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school – what are we doing to fix the causes, rather than simply spying on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out of data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust of the system, of teaching staff, and on their ability to look for content to support their own sensitive concerns or development – content that they may not feel safe looking for at home? What limitation will that put on their creativity?

These are all questions that should be asked in order to thoroughly understand the consultation, and that require wide public examination.

Since key logging is already common practice (according to provider websites) and will effectively in practice become statutory, where is the public discussion? If it’s not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?
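To make concrete what key logging with a “keyword detection library” can mean in practice, here is a deliberately crude sketch (the watch-list, function name and matching rules are invented for illustration; they are not any provider’s actual product):

```python
import re
from datetime import datetime, timezone

# Invented watch-list; real providers licence much larger,
# regularly updated keyword libraries.
WATCHLIST = {"bullying", "weapon", "suicide"}

def log_and_flag(user, text, log):
    """Append every entry to the activity log (a 'complete log of all
    online activity') and return any watch-list words found in it."""
    log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "text": text,
    })
    words = set(re.findall(r"[a-z]+", text.lower()))
    return sorted(words & WATCHLIST)

log = []
print(log_and_flag("pupil42", "homework on anti-bullying week", log))  # ['bullying']
```

Note that even this tiny example flags a pupil researching “anti-bullying week”: crude word matching cannot read intent, which is exactly why questions about oversight, false positives and chilling effects matter.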

What exactly might this part of the new guidance mean for pupils?

In part two, I cover the other part of her speech, the GPS app, and ask whether, if we track every child in and outside school, we are moving closer to the digital dream, or nightmare, in the search to close the digital skills gap.

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online.” [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that this ‘system’ to be ‘in place’ should be computer based. How will it be applied in practice? A number of the software providers for schools already provide services that include key logging, using “keyword detection libraries” that provide “a complete log of all online activity”.

(*It’s hard to read more about this, as many of NEN’s links are dead.)

Children’s private chats and personal data lost through VTech toys

If you’ve got young children who have an InnoTab console or other ed-tech apps and games from VTech, then you need to pay attention.

Your and your children’s personal data may have been stolen. The VTech security breach has exposed the private data of more than six million children worldwide, including 700,000 British customers.

The games are designed for children aged 2-9. The loss reportedly includes thousands of pictures of children and parents, as well as a year’s worth of chat logs, names and addresses.

Where from? Well, from the information that parents and children entered at set-up, or created using games like the InnoTab. The InnoTab’s app allows children to record voice and text messages, take photos and send these to the matching app on the parent’s phone. The data from both users have been lost. The link is the registered email account that connects the child’s toy and the parent’s phone via the downloaded app.

And the reason kids’ photos may be included is that during set-up a profile photo can be taken by the child, then stored and used in a similar way to profile pictures on social media sites.

VTech’s Learning Lodge app store customer database is affected, and VTech Kid Connect servers were accessed. As a precautionary step, VTech says on its website, it has temporarily suspended Learning Lodge, the Kid Connect network and a dozen websites whilst it ‘conducts a thorough security assessment’.

Reactions

One mum spoke to Good Morning Britain about how she felt when she discovered information about her child may have been stolen.

She says she hadn’t received any notification about the loss from VTech and didn’t know about it until she saw it on the six o’clock news. She then pro-actively contacted VTech customer services.

VTech’s response was confused: first telling her they had not lost data related to the KidConnect app, but in a later email saying they had.

What was disappointing in GMB’s coverage of the VTech story was its focus on how damaging this would be for the toymaker in the run-up to Christmas.

There was little information for families on what this could mean for using the toys safely in future. They went on to talk about some other web-based tools, but didn’t talk about data protection, which really should be much stronger for children’s data.

While parents must take an active role in thinking ahead for our children and how their digital identities can be compromised, we also need to be able to rely on the organisations with whom we entrust our own and our children’s personal data, and know that when they ask us for data they will look after it securely and use it in ways we expect. On the technical side, data security needs to be proportionate to the risks children would be placed in if data were breached. This is true of commercial companies and public bodies.

On the human side, public transparency and good communication are key to managing expectations, to ensure we know what users sign up to, and to know what happens if things go wrong.

Sadly VTech is continuing to downplay the loss of children’s personal data. In their first statement their focus was to tell people not to worry because credit card details were not included.

When I asked five days after the breach was announced, VTech declined to confirm to me whether avatars and profile pictures had been accessed, arguing that its internal investigation is still ongoing. That’s now two weeks ago.

Their FAQs still say this is unknown. If true, it would appear surprisingly incompetent on the part of VTech to know in detail that all the other items have been lost, but not whether profile pictures have.

That it is possible for personal details that include date of birth, address and photo to all be lost together is clearly a significant threat for identity theft. It shows one risk of having so much identifiable personal data stored in one place.

The good news is that there appears to have been no malicious motive. According to the report in Motherboard, the hacker who broke into VTech’s systems […] said that he never intended to release the data to the public.

“Frankly, it makes me sick that I was able to get all this stuff,” the hacker told [Motherboard] in an encrypted chat on Monday.

Could this be the biggest consumer breach of children’s personal data in history?

What now for the 700,000 users of the systems?

Parent account holders need to be directly and fully informed by VTech:

a) what was compromised, by platform, by website, or by customer

b) if and how they will be able to use their equipment again

c) how children’s data will be made safe in future, and what VTech will do differently from how it handled data before

Any organisation needs to demonstrate, through full transparency and through how it acts in the event of such a breach, that it is worthy of trust.

The children’s toys and systems appear to have been shut down.

They’re not cheap, with the InnoTab coming in at around £90 and its cartridge games upwards of £17 each. Toy sellers will be in the front line for public-facing questions in the shops. Anyone who had already bought these just before Christmas will be wondering what to do now: if access to the systems and the apps has been shut down, they won’t work.

And if and when they do work, will they work securely?

Did Vtech contact you and tell you about the breach?

The sensible thing is to stop using that email address and, at the very minimum, change the password: not only on the adult’s phone and the child’s game app, but anywhere else you use it.

What else do you need to know?

What do parents do when their child’s digital identity has been compromised?

More information is needed from the company, and soon.

####

If you want to get in touch, come over and visit defenddigitalme.com. You can also sign up to the Twitter group, support the campaign to get 8 million school pupils’ data made safe, or leave a comment.

####

References:

VTech website FAQs as of December 3, 2015

November 28, 2015: blog troyhunt.com by Troy Hunt, Microsoft MVP for Developer Security

December 1, 2015: Motherboard report by @josephfcox

December 1, 2015: Motherboard article by Lorenzo Franceschi-Bicchierai


Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or has never, used their rights under the Freedom of Information Act, the government consultation on changing the law, which closes today, is worth caring about. If you haven’t yet had your say, go and take action here >> now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained by cost from standing up for human rights. The press will be restrained in what they can ask.

MySociety has a brilliant summary.  Michael Sheen spoke up calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation with the Department for Education about improving their handling of the personal and sensitive data of 8 million children, held in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight. And to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of the care.data programme board’s decision making, which was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld my complaint.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that when the minutes were finally released, only a handful of genuine redactions were necessary under Section 36.

In October 2014 I simply wanted to see the meeting minutes form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and to fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others, I hoped, could use this information to ask the right questions about missing meeting minutes and transparency, and everyone could question why there was no cost-benefit business plan at all in private, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again and work is in progress on improved public engagement, to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request started a process of increased transparency which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about the costs of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”
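As a back-of-envelope sanity check, the share Steinberg describes can be reproduced from the two figures quoted above, the £38,000 FOI cost and the £1.3bn yearly budget (his quoted “0.002 per cent” appears to truncate, rather than round, the exact value):

```python
# Back-of-envelope check of the FOI cost figures quoted above.
foi_cost = 38_000              # £38,000 spent answering FOI requests in a year
total_budget = 1_300_000_000   # £1.3bn yearly operating budget

share = foi_cost / total_budget * 100  # share of the budget, as a percentage
print(f"FOI handling: {share:.4f}% of the operating budget")
# → FOI handling: 0.0029% of the operating budget
```

Either way, oversight by citizens and journalists costs a few thousandths of one per cent of the council’s spending.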

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law, which gives the public rights in our favour and transparency into how we are governed and how tax money is spent.

How will we measure the value of FOI, and of what would be lost, if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation, which closes on November 20th, go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

The National Pupil Database end of year report: an F in Fair Processing

National Pupil Database? What National Pupil Database? Why am I on it?

At the start of the school year last September, 2014, I got the usual A4 pieces of paper: each of my children’s personal details, our home address and contact details, tick boxes for the method of transport each used to get to school and the types of school meal eaten, all listed, with a privacy statement at the bottom:

“Data Protection Act 1988 [sic]: The school is registered under the Data Protection Act for holding personal data. The school has a duty to protect this information and to keep it up to date. The school is required to share some of the data with the Local Authority and with the DfE.”

There was no mention of the DfE sharing it onwards with anyone else. But they do, through the National Pupil Database [NPD], and it is enormous [1]. It’s a database which holds the personal information of every child who has ever been in state education since 2002, with some data going back to 1996. [That includes me, as both a student AND a parent.]

“Never heard of it?”

Well neither had I, from my school – which is what I pointed out to the DfE in September 2014.

School heads, governors, and every parent I have spoken with in my area and beyond, are totally unaware of the National Pupil database. All are surprised. Some are horrified at the extent of data sharing at such an identifiable and sensitive level, without school and parental knowledge.[2]

Here’s a list of what it holds: fully identifiable data at unique, individual level, tiered from 1-4, where 1 is the most sensitive. A full list of what data is available in each of the tiers and standard extracts can be found in the ‘NPD data tables’.


I’d like to think it has not been deliberately hidden from schools and parents. I hope the Department has simply been careless about its communications.

Imagine that data gathered since 1996 only for administration was then decided about at central level, and they forgot to tell the people they should have been asking. The data controllers and the subjects the data were from – the schools, parents/guardians and pupils – were forgotten. That can happen when you see data as a commodity and not as people’s personal histories.

The UK appears to have gathered admin data for years, until the coalition decided it was an asset it could further exploit. The DfE may have told others in 2002, and in 2012 when it shaped policy on how the NPD would be used, but it forgot to tell the children whose information it is, and used it without asking. In my book, that’s an abuse of power and a misuse of data.

It seems to me that current data policies in practice across all areas of government have simply drifted at national level towards ever greater access by commercial users.

And although that stinks, it has perhaps arisen from lack of public transparency and appropriate oversight, rather than some nefarious intent.

Knowingly failing to inform schools, pupils and guardians how the most basic of our personal data are used is outdated and out of touch with public feeling. Not to mention that it fails fair processing under Data Protection law.

Subject Access Request – User experience gets an ‘F’ for failing

The submission of the school census, including a set of named pupil records, is a statutory requirement on schools.

This means that children’s and parents’ data, regardless of how well or poorly informed those families may be, are extracted for administrative purposes, and are used, beyond the purposes we would expect, for various secondary reasons.

Unless the Department for Education makes schools aware of the National Pupil Database use and users, the Department fails to provide an adequate process to enable schools to meet their local data protection requirements. If schools don’t know, they can’t process data properly.

So I wrote to the Department for Education (DfE) in September 2014, including the privacy notice used in schools like ours, showing it fails to inform parents how our children’s personal data and data about us (as related parent/guardians) are stored and onwardly used by the National Pupil Database (NPD). And I asked three questions:

1. I would like to know what information is the minimum you require for an individual child from primary schools in England?

2. Is there an opt out to prevent this sharing and if so, under what process can parents register this?

3. Is there a mechanism for parents to restrict the uses of the data (i.e. opt out our family data) with third parties who get data from the National Pupil Database?

I got back some general information, but no answer to my three questions.

What data do you hold and share with third parties about my children?

In April 2015 I decided to find out exactly what data they held, so I made a subject access request [SAR], expecting to see the data they held about my children. They directed me to ask my children’s school instead and to ask for their educational record. The difficulty with that is, it’s a different dataset.

My school is not the data controller of the National Pupil Database. I am not asking for a copy of my children’s educational records held by the school, but for the information that the NPD holds about me and my children. One set of data may feed the other, but they are separately managed. The DfE, as data controller for the data the NPD holds, has responsibility for it – not the school my children attend.

Why do I care? Well for starters, I want to know if the data are accurate.  And I want to know who else has access to it and for what purposes – school can’t tell me that. They certainly couldn’t two months ago, as they had no idea the NPD existed.

I went on to ask the DfE for a copy of the publicly accessible subject access request (SAR) policy and procedures, aware that I was asking on behalf of my children. I couldn’t find any guidance, so asked for the SAR policy. They helpfully provided some advice, but I was then told:

“The department does not have a publicly accessible standard SAR policy and procedures document.”  and “there is not an expectation that NPD data be made available for release in response to a SAR.”

It seems policies are inconsistent. For this other DfE project, there is information about the database, how participants can opt out, and respecting your choice. On the DfE website a Personal Information Charter sets out “what you can expect when we ask for and hold your personal information.”

It says: “Under the terms of the Data Protection Act 1998, you’re entitled to ask us:

  • if we’re processing your personal data
  • to give you a description of the data we hold about you, the reasons why we’re holding it and any recipient we may disclose it to (eg Ofsted)
  • for a copy of your personal data and any details of its source

You’re also entitled to ask us to change the information we hold about you, if it is wrong.

To ask to see your personal data (make a ‘subject access request’), or to ask for clarification about our processing of your personal data, contact us via the question option on our contact form and select ‘other’.”

So I did. But it seems that while this applies to that project, subject access requests are not to apply to the data they hold in the NPD. And they finally rejected my request last week, stating it is exempt:

[image: the SAR rejection response]

I appealed the decision on the basis that the section 33 Data Protection Act criteria given are not met:

“the data subject was made fully aware of the use(s) of their personal data (in the form of a privacy notice)”

But it remains rejected.

It seems incomprehensible that third parties can access my children’s data and I can’t even check to see if it is correct.

While acknowledging section 7 of the Data Protection Act 1998 (DPA) – that “an individual has the right to ask an organisation to provide them with information they hold which identifies them and, in certain circumstances, a parent can make such a request on behalf of a child” – they refused, citing the Research, History and Statistics exemption (i.e. section 33(4) of the DPA).

Fair processing, another F for failure and F for attitude

The Department for Education’s response to me said that it “makes it clear what information is held, why it is held, the uses made of it by DfE and its partners and publishes a statement on its website setting this out. Schools also inform parents and pupils of how the data is used through privacy notices.”

I have told the DfE the process does not work. The DfE/NPD web instructions do not reach parents. Even if they did, the information is thoroughly inadequate, and hides the commercial third-party use of data, whether deliberately or by omission.

The Department for Education made a web update on 03/07/2015 with privacy information to be made available to parents by schools: http://t.co/PwjN1cwe6r

Despite this update this year, it is inadequate on two counts: content and communication.

To claim as they did in response to me that: “The Department makes it clear to children and their parents what information is held about pupils and how it is processed, through a statement on its website,” lacks any logic.

Updating their national web page doesn’t create a thorough communications process or engage anyone who does not know about it to start with.

Secondly, the new privacy policy is inadequate in its content and utterly confusing. What does this statement mean? Is there now some sort of opt out on offer? I doubt it, but it is unclear:

“A parent/guardian can ask that no information apart from their child’s name, address and date of birth be passed to [insert name of local authority or the provider of Youth Support Services in your area] by informing [insert name of school administrator]. This right is transferred to the child once he/she reaches the age 16. For more information about services for young people, please go to our local authority website [insert link].” [updated privacy statement, July 3, 2015]

Information that I don’t know exists, about a database I don’t know exists, which my school does not know exists – and they believe a statement on their own website meets fair processing?

Appropriate at this time of year,  I have to ask, “you cannot be serious?”

Fair processing means transparently sharing the purpose or purposes for which you intend to process the information, not hiding some of the users through careful wording.

It thereby fails to meet the first data protection principle legally, as parents are not informed at all, never mind fully, of further secondary uses.

As a parent, when I register my child for school, I of course expect that some personal details must be captured to administer their education.

There must be data shared to adequately administer, best serve, understand, and sometimes protect our children.  And bona fide research is in the public interest.

However I have been surprised in the last year to find that firstly, I can’t ask what is stored on my own children and that secondly, a wide range of sensitive data are shared through the Department of Education with third parties.

Some of these potential third parties don’t meet research criteria in my understanding of what a ‘researcher’ should be. Journalists? The MOD?

To improve, little additional time or work would be required to provide proper fair processing as a starting point. But to do so, the Department can’t just update a policy on its website and think that’s adequate. And the newly updated suggested text for pupils is only going to add confusion.

The privacy policy text needs to be carefully reworded in human, not civil service, speak.

It must not omit [as it does now] the full range of potential users.

After all the Data Protection principles state that: “If you wish to use or disclose personal data for a purpose that was not contemplated at the time of collection (and therefore not specified in a privacy notice), you have to consider whether this will be fair.”

Now that it must be obvious to the DfE that this is not the best way to carry on, why would they choose NOT to do better? Our children deserve better.

What would better look like? See part 3. The National Pupil Database end of year report: a D in transparency, C minus in security.

*****

[PS: I believe the Freedom of Information Officer tried their best and was professional and polite in our email exchanges, B+. Can’t award an A as I didn’t get any information from my requests. Thank you to them for their effort.]

*****

Updated on Sunday 19th July to include the criteria of my SAR rejection.

1. Our children’s school data: an end of year report card
2. The National Pupil Database end of year report: an F in fair processing
3. The National Pupil Database end of year report: a D in transparency, C minus in security

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under section 537A of the Education Act 1996.

[3] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[4] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[5] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[6] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[7] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

The National Pupil Database end of year report: a D in transparency, C minus in security.

Transparency and oversight of how things are administered are simple ways that the public can both understand and trust that things run as we expect.

For the National Pupil Database, parents might be surprised, as I was, about some of the current practices.

The scope of use of, and access to, the National Pupil Database was changed in 2012, and although I had three children at school at that time, I heard nothing about it, nor did I read about it in the papers. (Hah – time to read the papers?) So I absolutely agree with Owen Boswarva’s post, when he wrote:

“There appears to have been no concerted effort to bring the consultation or the NPD initiative to the attention of parents or pupils (i.e. the data subjects themselves). This is a quote from one of the parents who did respond:

“I am shocked and appalled that I wasn’t notified about this consultation through my child’s school – I read about it on Twitter of all things. A letter should have gone to every single parent explaining the proposals and how to respond to this consultation.”

(Now imagine that sentiment amplified via Mumsnet …)”
[July 2013, blog by O. Boswarva]

As Owen wrote,  imagine that sentiment amplified via Mumsnet indeed.

Here’s where third parties can apply, and here’s a list of who has been given data from the National Pupil Database. (It’s only been updated twice in 18 months, the most recent of which came after I asked about it.) The tier groups 1-4 are explained here on p.18, where 1 is the most sensitive, identifiable classification.

The consultation suggested in 2012 that the changes could be an “effective engine of economic growth, social wellbeing, political accountability and public service improvement”.

Has anyone measured whether the justification given has begun to be achieved? Research can often take a long time, and implementing any changes as a result, more time. But perhaps some measure of public benefit has already begun to accrue?

The release panel would, one hopes, have begun to track this. [update: the DfE confirmed on August 20th that they do not track benefits, nor have they ever done any audit of recipients]

And in parallel, what oversight provides checks and balances to make sure that the drive for an ‘engine of economic growth’ remembers to treat these data as knowledge about our children?

Is there that level of oversight from application to benefits measurement?

Is there adequate assessment of privacy impact and ethics in applications?

Why the National Pupil Database troubles me is not the data it contains per se, but the lack of child/guardian involvement, the lack of accountable oversight of how it is managed, and the lack of full transparency around who uses it and its processes.

Some practical steps forward

Taken now, a few steps could resolve some of these issues and avoid the risk of them becoming future concerns.

The first being thorough fair processing, as I covered in my previous post.

The submission of the school census returns, including a set of named pupil records, has been a statutory requirement on schools since the Education Act 1996. That’s almost twenty years ago in the pre-mainstream internet age.

The Department must now shape up its current governance practices in its capacity as the data processor and controller of the National Pupil Database, to be fit for the 21st century.

Ignoring current weaknesses actively accepts an ever-increasing reputational risk for the Department, schools, other data-sharing bodies, those who link to the data, and its bona fide research users. If people lose trust in data uses, they won’t share at all, and the quality of data will suffer – bad for the functional administration of the state and the individual, but also for the public good.

That concerns me also wearing my hat as a lay member on the ADRN panel, because it’s important that the public trusts that our data is looked after wisely, so that research can continue to use it for advances in health, social science and all sorts of areas of knowledge, to improve our understanding of society and make it better.

Who decides who gets my kids’ data, even if I can’t?

A Data Management Advisory Panel (DMAP) considers only some of the applications: tier 1 data requests. Those are the most sensitive, but not the only, applications for access to sensitive data.

“When you make a request for NPD data it will be considered for approval by the Education Data Division (EDD) with the exception of tier 1 data requests, which will be assessed by the department’s Data Management Advisory Panel. The EDD will inform you of the outcome of the decision.”

Where is governance transparency?

What is the make-up of the Data Management Advisory Panel and the Education Data Division (EDD)? Who sits on them and how are they selected? Do they document their conflicts of interest for each application? For how long are they appointed, and under what selection criteria?

Where is decision outcome transparency?

The outcome of each decision should be documented and published. However, the list has been updated only twice since its inception in 2012: once in December 2013, and most recently, ahem, on May 18 2015, after considerable prodding. There should be a regular timetable, with a responsible owner and a depth of insight into the decision making.

Where is transparency over decision making to approve or reject requests?

Do privacy impact assessments and ethics reviews play any role in their application and if so, how are they assessed and by whom?

How are those sensitive and confidential data stored and governed?

The weakest link in any system is often said to be human error. Users of the NPD data vary from other government departments to “Mom and Pop” small home businesses, selling schools’ business intelligence and benchmarking.

So how secure are our children’s data really, and once the data have left the Department database, how are they treated? Does lots of form filling and emailed data with a personal password ensure good practice, or simply provide barriers to slow down the legitimate applications process?

What happens to data that are no longer required for the given project? Are they properly deleted and what audits have ever been carried out to ensure that?

The National Pupil Database end of year report: a C- in security

The volume of data that can be processed now at speed is incomparable with 1996, and even 2012 when the current processes were set up. The opportunities and risks in cyber security have also moved on.

Surely the Department for Education should take its responsibility seriously, and treat our children’s personal data and sensitive records as well as the HSCIC now intends to manage health data?

Processing administrative or linked data in an environment with layered physical security (e.g. a secure perimeter, CCTV, security guarding or a locked room without remote connection such as internet access) is good practice. It reduces the risk of silly human error. Or simple theft.

Is giving out chunks of raw data by email, with reams of paperwork as its approval ‘safeguards’ really fit for the 21st century and beyond?


Twenty years on from the conception of the National Pupil Database, it is time to treat the personal data of our future adult citizens with the respect it deserves and we expect of best-in-class data management.

It should be as safe and secure as we treat other sensitive government data, and lessons could be learned from the FARR, ADRN and HSCIC safe settings.

Back to school – more securely, with public understanding and transparency

Understanding how all that works, how technology and people, data sharing and privacy, data security and trust all tie together, is fundamental to understanding the internet. When administrations take our data, they take on responsibilities for some of our participation in the dot.everyone that the state is so keen for us all to take part in. Many of our kids will live in the world which is the internet of things. Not getting that is to not understand the Internet.

And to reiterate some of why that matters, I go back to my previous post in which I quoted Martha Lane Fox recently and the late Aaron Swartz when he said: “It’s not OK not to understand the internet anymore”.

While the Department for Education has turned down my subject access request to find out what the National Pupil Database stores on my own children, this matters too much to brush the issues aside as only important for me. About 700,000 children are born each year and will be added to this database every academic year. None are ever deleted.

Parents can, and must ask that it is delivered to the highest standards of fair processing, transparency, oversight and security. I’m certainly going to.

It’s going to be Back to School in September, and those annual privacy notices, all too soon.

*****

1. The National Pupil Database end of year report card

2. The National Pupil Database end of year report: an F in fair processing

3. The National Pupil Database end of year report: a D in transparency

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[3] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[5] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[6] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

What is in the database?

The Schools Census dataset adds approximately eight million records every year (starting in 1996) and includes variables on the pupil’s home postcode, gender, age, ethnicity, special educational needs (SEN), free school meals eligibility, and schooling history. It covers pupils in state-funded primary, secondary, nursery and special schools, and pupil referral units. Schools that are entirely privately funded are not included.

Pupils can be tracked across schools and followed throughout their school careers, and the database provides a very rich set of data on school characteristics. There is further use in linking the data with other related datasets, such as those on higher education, neighbourhoods and teachers in schools.

Data stored include the full range of personal and sensitive data, from name, date of birth and address, to SEN and disability needs. (Detail of the content is here.) To see what is in it, download the Excel sheet: NPD Requests.

 

The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of school census returns, including a set of named pupil records, is a statutory requirement on schools under section 537A of the Education Act 1996.

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech to share ideas and knowledge in real time with people around the world.  The freedom to fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated, the Internet may not work well for its citizens, and its potential is restricted, as well as its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design and who will control  it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable and people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be of trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or fun experiment and how will users [especially for mental health apps where the outcomes may be less tangibly measured than say, blood glucose] know the difference?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create these barriers?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We  assume that digital designs will put at their heart the core principles in the spirit of the NHS.  But if apps are not available on prescription and are essentially a commercial product with no proven benefit, does that exploit the NHS brand trust?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person, as any other treatment could; regulation must further consider reputational risk to the NHS and to the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading benefits will not be undermined by creating artificial boundaries that restrict access to the tools by affordability, in a postcode lottery,  or in language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Even in the UK today, exclusion in maternity care is already measurable: the poorest cannot afford the smartphone data allowance that e-red book access costs, attendees were told by its founder at #kfdigital15.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that, for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area: a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use for all, like any other treatment? Or is it yet another example of NHS postcode lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that the decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and can afford apps, and are not leaving behind the needs of those who can’t afford to pay for them.

At home, it is some of the least wealthy who need the most intervention, and from whom there may be little profit to be made. There is little I can see in the 2020 plans that focuses on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However, it is a question of principle that will decide for or against excluding users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England approved apps going to account for geography and language and cross country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.


This photo, reportedly from Indonesia, is great [via Banksy on Twitter; apologies, I cannot credit the photographer]: two boys on the way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of data sharing restrictions: barrier or enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to datasharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that believe behavioural barriers to data sharing are an obstacle,  have forgotten that trust is not something to be overcome, but to be won and continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is [prescribed], used, and data exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in an in-store app does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When the recent social media experiment by Facebook only mentioned the use of data for research after the experiment, it caused outcry.

It crossed the line between what people felt acceptable and intrusive, analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen, are both a risk and cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” ask the words of T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters not only for today’s projects, but for the whole future digital is helping create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome.

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding consumer app regulation, to enable fairness of access and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers,
ii. pro-actively designing ethics and change into ongoing projects, and
iii. ensuring engagement is genuine collaboration and co-production.

The barriers do not need to be got around; solutions should be built by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to money making from our data the public is less happy. The economic value of data raises more questions on use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from individuals and corporations who by using the data make a profit, using data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how does what the public feels and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing, measured by public feeling in 2014, is a threat to data used in the public interest. So what are we doing to fix it, and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events[1] were repeatedly about commercial companies’ use, and re-use of data, third parties accessing data for unknown purposes and the resultant loss of confidentiality.

 Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership. And with that, the inability to consent to who uses it. This despite acknowledgment that patients own their data.

Significant concern centres on use of the information gleaned from data that patients consider commercial exploitation. For use segmenting the insurance markets. For consumer market research. Using data for individual targeting. And its utter lack of governance.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings between NHS England and supermarket giant Tesco throughout 2014 [2] by the Patients and Information Director, responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago, that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing and in wider fora is no longer an inconvenient ethical dilemma best left on the shelf, as it has been for the last 25 years in secondary use, dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required and is good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care during medical treatment itself, does not cause concern.

It is increasingly assumed in discussions I have heard [at CCG and other public meetings] that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and consent to those purposes cannot ethically be assumed. However, legal gateways, which permit access to that data for clearly defined secondary purposes, may make that data sharing legal.

With that legal assumption, for the majority of people, polls and dialogue show [though not for everyone, 6b], comes a degree of automatic support for bona fide research in the public interest. But it’s not a blanket for all secondary uses by any means, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether active not implied consent does or does not become a requirement for research purposes without differentiation between kinds, the public already has different expectations and trust around different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now around data sharing [6]. It’s been a high price to pay for public health research and others delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in which kinds of research, and if not happy with one kind, may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger Janaury 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

Differentiation. Telling customers apart and grouping them by similarities is what commercial data managers want.

It enables them to target customers with advertising and sales promotion most effectively. They segment the market into chunks and treat one group differently from another.

They use market research data, our loyalty card data, to get that detailed information about customers, and decide how to target each group for what purposes.

As the EU states debate how research data should be used and how individuals should be both enabled and protected through it, they might consider separating research purposes by type.

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial consumer research purposes. [ref part 1].

Separating consumer and commercial market research from the definition of research purposes for the public good by the state, could be key to rebuilding people’s trust in government data use.

Having separate purposes would permit separate consent and control procedures to govern them.

But what role will profit play in the state’s definition of ‘in the public interest’? Is it in the public interest if UK plc makes money from its citizens? And how far along any gauge of public feeling will a government be prepared to go, to push making money for UK plc at our own personal cost?

Pay-for-privacy?

In January this year, the Executive Vice President at Dunnhumby, Nishat Mehta, wrote in this article [7], about how he sees the future of data sharing between consumers and commercial traders:

“Imagine a world where data and services that are currently free had a price tag. You could choose to use Google or Facebook freely if you allowed them to monetize your expressed data through third-party advertisers […]. Alternatively, you could choose to pay a fair price for these services, but use of the data would be forbidden or limited to internal purposes.”

He too talked about health data, specifically about its value when it is accurate, expressed and consensual:

“As consumers create and own even more data from health and fitness wearables, connected devices and offline social interactions, market dynamics would set the fair price that would compel customers to share that data. The data is more accurate, and therefore valuable, because it is expressed, rather than inferred, unable to be collected any other way and comes with clear permission from the user for its use.”

What his pay-for-privacy model appears to have forgotten is that this future consensual sharing rests on the understanding that privacy has a monetary value. And that depends on understanding the status quo.

It is based on the individual realising that there is money made from their personal data by third parties today, and that there is a choice.

The extent of this commercial sharing and re-selling will be a surprise to most loyalty card holders.

“For years, market research firms and retailers have used loyalty cards to offer money back schemes or discounts in return for customer data.”

However, despite having been signed up for years, most of the public are, I believe, unaware of the implied deal. It may be in the small print. But everyone knows that few read it in the rush to sign up and save money.

Most shoppers believe the supermarket is buying our loyalty. We return to spend more cash because of the points. Points mean prizes, petrol coupons, or pounds off.

We don’t realise our personal identity and habits are being invisibly analysed to the nth degree and sold by supermarkets as part of those sweet deals.

But is pay-for-privacy discriminatory? By creating the freedom to choose privacy as a pay-for option, it excludes those who cannot afford it.

Privacy should be seen as a human right, not as a pay-only privilege.

Today we use free services online but our data is used behind the scenes to target sales and ads often with no choice and without our awareness.

Today we can choose to opt in to loyalty schemes and trade our personal data for points, and with it we accept marketing emails, flyers through the door, and unwanted calls in our private time.

The free option is never to sign up at all, but by doing so customers pay a premium: they forgo the vouchers and discounts, or trade away the convenience of online shopping.

There is a personal cost in all three cases, albeit in a rather opaque trade-off.

Does the consumer really benefit in any of these scenarios, or does the commercial company get the better deal?

In the sustainable future, only a consensual system based on understanding and trust will work well. That is assuming that by ‘well’ we mean organisations wish to avoid the kind of PR disaster and practical disruption that resulted for NHS data in the last year through care.data.

For some people the personal cost of having their privacy infringed by commercial firms is great; others care less. But once informed, there is a choice on offer even today: pay for privacy from commercial business by paying a premium for goods and staying out of loyalty schemes, or pay with our privacy instead.

In future we may see a more direct pay-for-privacy offering along the lines Nishat Mehta describes.

And if so, citizens will be asking ever more about how their data is used in all sorts of places beyond the supermarket.

So how can the state profit from the economic value of our data but not exploit citizens?

‘Every little bit of data’ may help consumer marketing companies. But gaining it or using it in ways which are unethical, or knowingly continuing bad practices, won’t win back consumers’ and citizens’ trust.

And whether it is a commercial consumer company or the state, people feel exploited when their information is used to make money without their knowledge and for purposes with which they disagree.

Consumer commercial use and use in bona fide research are separate in the average citizen’s mind, and the distinction is understood in theory.

Achieving differentiation in practice in the definition of research purposes could be key to rebuilding consumers’ trust.

And that would be valid for all their data, not only what data protection labels as ‘personal’. For the average citizen, all data about them is personal.

Separating in practice how consumer businesses use customer data for company profit, how the benefits of that trade in our privacy are shared with individuals, and how bona fide public research benefits us all, would help win continued access to our data.

Citizens need and want paths to see how our data are used, in ways which are transparent and easy to access.

Cutting away purposes which appear exploitative from purposes in the public interest could benefit commerce, industry and science.

If the private cost to individuals, the loss of control and privacy over our data, is reduced, citizens will be more willing to share.

That will create more opportunity for data to be used in the public interest, which will increase the public good; both economic and social which the government hopes to see expand.

And that could mean a happy ending for everyone.

The Economic Value of Data vs the Public Good? They need not be mutually exclusive. But if one exploits the other, it has the potential to continue to be corrosive. UK plc cannot continue to assume its subjects are willing creators and repositories of information to be used for making money. [ref 1] Doing so has lost trust in all uses, not only those in which citizens felt exploited. [6]

The economic value of data used in science and health, whether to individual app creators, big business, or the commissioning state in planning and purchasing, is clear. It is perhaps not often quantified or discussed in the public domain, but it clearly exists.

Those uses can co-exist with good practices to help people understand what they are signed up to.

By defining ‘research purposes’, by making how data are used transparent, and by giving real choice in practice to consent to differentiated data for secondary uses, both commercial and state will secure their long term access to data.

Privacy, consent and separation of purposes will be wise investments in that growth, across commercial and state sectors.

Let’s hope they are part of the coming ‘long-term economic plan’.

****

Related to this:

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

image via Tesco media

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/