Category Archives: privacy

Children’s private chats and personal data lost through VTech toys

If you’ve got young children who have an Innotab console or other ed tech apps and games from VTech, then you need to pay attention.

Your and your children’s personal data may have been stolen. The VTech security breach has exposed the private data of more than six million children worldwide, including 700,000 British customers.

The games are designed for children aged 2-9. The loss reportedly includes thousands of pictures of children and parents, as well as a year’s worth of chat logs, names and addresses.

Where from? Well, from the information that parents and children entered during set-up, or created using games like the Innotab. The Innotab, via an app, allows children to record voice and text messages, take photos and send these to the matching app on the parents’ phone. The data from both users has been lost. The link is the registered email account that connects the child’s toy and the parent’s phone via the downloaded app.

And the reason kids’ photos may be included is that during set-up, a profile photo can be taken by the child, and stored and used in a similar way to social media sites.

VTech’s Learning Lodge app store customer database was affected and its Kid Connect servers accessed. As a precautionary step, VTech says on its website, it has temporarily suspended Learning Lodge, the Kid Connect network and a dozen websites while it ‘conducts a thorough security assessment’.

Reactions

One mum spoke to Good Morning Britain about how she felt when she discovered information about her child may have been stolen.

She says she hadn’t received any notification about the loss from VTech and didn’t know about it until she saw it on the six o’clock news. She then pro-actively contacted VTech customer services.

VTech’s response was confused, first telling her they had not lost data related to the KidConnect app – but in a later email saying they had.

What’s disappointing about GMB’s coverage is that they focused on how damaging this would be for the toymaker VTech in the run-up to Christmas.

There was little information for families on what this could mean for using the toys safely in future. They went on to talk about some other web-based tools, but didn’t talk about data protection, which really should be much stronger for children’s data.

While parents must take an active role in thinking ahead for our children and how their digital identities can be compromised, we also need to be able to rely on organisations with whom we entrust our own and our children’s personal data, and know that when they ask us for data that they will look after it securely, and use it in ways we expect. On the technical side, data security systems need to be proportionate to the risks they place children in, if data are breached. This is true of commercial companies and public bodies.

On the human side, public transparency and good communication are key to managing expectations, to ensure we know what users sign up to, and to know what happens if things go wrong.

Sadly VTech is continuing to downplay the loss of children’s personal data. In their first statement their focus was to tell people not to worry because credit card details were not included.

When I asked, five days after the breach was announced, VTech declined to confirm to me whether avatars and profile pictures had been accessed, arguing that its internal investigation was still ongoing. That was two weeks ago now.

Their FAQs still say this is unknown. If this is true, it would appear surprisingly incompetent on VTech’s part to know in detail that all the other items have been lost, but not whether profile pictures have.

That it is possible for personal details that include date of birth, address and photo to all be lost together is clearly a significant threat for identity theft. It shows one risk of having so much identifiable personal data stored in one place.

The good news is that there appears to have been no malicious motive. According to the report in Motherboard: “The hacker who broke into VTech’s systems […] that he never intended to release the data to the public.”

“Frankly, it makes me sick that I was able to get all this stuff,” the hacker told [Motherboard] in an encrypted chat on Monday.

Could this be the biggest consumer breach of children’s personal data in history?

What now for the 700,000 users of the systems?

Parent accounts need to be directly and fully informed by VTech:

a) what was compromised, by platform, by website, or by customer

b) if and how they will be able to use their equipment again

c) how children’s data will be made safe in future, and what VTech is doing that will be different from how it handled data before

Any organisation needs to demonstrate, through full transparency and how it acts in the event of such a breach, that it is worthy of trust.

The children’s toys and systems appear to have been shut down.

They’re not cheap, with the Innotab coming in at around £90 and its cartridge games upwards of £17 each. Toy sellers will be in the front line for public-facing questions in the shops. Anyone who had already bought these just before Christmas will be wondering what to do now: if access to the systems and the apps has been shut down, they won’t work.

And if and when they do work, will they work securely?

Did VTech contact you and tell you about the breach?

The sensible thing is to stop using that email address and, at the very minimum, change the password: not only on the adult’s phone and the child’s game app, but also anywhere else you use it.
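The breach was indexed by security researcher Troy Hunt’s free Have I Been Pwned service, which also offers an API so you can check an address programmatically. Here is a minimal sketch; the v3 endpoint and `hibp-api-key` header are assumptions based on the service’s public documentation, and the helper names are my own:

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint of the public Have I Been Pwned v3 API (an API key is required).
HIBP_ENDPOINT = "https://haveibeenpwned.com/api/v3/breachedaccount/"

def breach_request(email: str, api_key: str) -> urllib.request.Request:
    """Build the lookup request for a single email address."""
    url = HIBP_ENDPOINT + urllib.parse.quote(email)
    return urllib.request.Request(
        url,
        headers={"hibp-api-key": api_key, "user-agent": "breach-check-example"},
    )

def breach_names(response_body: str) -> list:
    """Parse the JSON array of breaches the API returns into a list of breach names."""
    return [entry["Name"] for entry in json.loads(response_body)]

# Parsing a hypothetical response listing this breach:
print(breach_names('[{"Name": "VTech"}]'))  # ['VTech']
```

If the lookup returns any breach names for the email address registered with the toy, that is a strong prompt to change the password everywhere that address is used.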

What else do you need to know?

What do parents do when their child’s digital identity has been compromised?

More information is needed from the company, and soon.

####

If you want to get in touch, come over and visit defenddigitalme.com. You can also sign up to the Twitter group, support the campaign to get 8 million school pupils’ data made safe, or leave a comment.

####

References:

VTech website FAQs as of December 3, 2015

November 28, 2015: blog troyhunt.com by Troy Hunt, Microsoft MVP for Developer Security

December 1, 2015: Motherboard report by @josephfcox

December 1, 2015: Motherboard article by Lorenzo Franceschi-Bicchierai


Act now: Stand up and speak out for your right to find out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or has never, used their rights under the Freedom of Information Act, the government consultation on changing the law, which closes today, is worth caring about. If you haven’t yet had your say, go and take action now. If that is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained from standing up for human rights by cost.  The press will be restrained in what they can ask.

MySociety has a brilliant summary.  Michael Sheen spoke up calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley Point

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to start to improve their handling of the personal and sensitive data of our 8 million children that they hold in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils had been fast-tracked at the Department for Education without panel oversight. And to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of the care.data programme board’s decision making, which was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld it.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions, using section 36 to keep them hidden, were necessary when the minutes were finally released.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others could use this information, I hoped, to ask the right questions about missing meeting minutes and transparency, and to question why there was no cost-benefit business plan at all in private, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again, and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, now that MPs have been required to publish expenses since 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency to the public, which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about the cost of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law that gives the public rights in our favour, and transparency into how we are governed and how tax money is spent.

How will the value of what would be lost, if the changes are made, be measured?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th, go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

The National Pupil Database end of year report: an F in Fair Processing

National Pupil Database? What National Pupil Database? Why am I on it?

At the start of the school year last September, 2014, I got the usual A4 pieces of paper: each of my children’s personal details, our home address and contact details, tick boxes for the method of transport each used to get to school, the types of school meal eaten, all listed, and a privacy statement at the bottom:

“Data Protection Act 1998: The school is registered under the Data Protection Act for holding personal data. The school has a duty to protect this information and to keep it up to date. The school is required to share some of the data with the Local Authority and with the DfE.”

There was no mention of the DfE sharing it onwards with anyone else. But they do, through the National Pupil Database [NPD], and it is enormous [1]. It’s a database which holds the personal information of every child who has ever been in state education since 2002, and some data since 1996. [That includes me as both a student AND a parent.]

“Never heard of it?”

Well, neither had I heard of it from my school, which is what I pointed out to the DfE in September 2014.

School heads, governors, and every parent I have spoken with in my area and beyond are totally unaware of the National Pupil Database. All are surprised. Some are horrified at the extent of data sharing at such an identifiable and sensitive level, without school and parental knowledge. [2]

Here’s a list of what it holds: fully identifiable data at unique, individual level, tiered from 1 to 4, where 1 is the most sensitive. A full list of what data is available in each of the tiers and standard extracts can be found in the ‘NPD data tables’.


I’d like to think the database has not been deliberately hidden from schools and parents. I hope the DfE has simply been careless about its communications.

Imagine that the data, gathered only for administration since 1996, were then decided about at central level, and they forgot to tell the people whom they should have been asking. The data controllers and the subjects the data were from – the schools, parents/guardians and pupils – were forgotten. That can happen when you see data as a commodity and not as people’s personal histories.

The UK appears to have gathered admin data for years until the coalition decided it was an asset it could further exploit. The DfE may have told others in 2002 and in 2012 when it shaped policy on how the NPD would be used, but it forgot to tell the children whose information it is and used them without asking. In my book, that’s an abuse of power and misuse of data.

It seems to me that current data policies in practice across all areas of government have simply drifted at national level towards ever greater access by commercial users.

And although that stinks, it has perhaps arisen from lack of public transparency and appropriate oversight, rather than some nefarious intent.

Knowingly failing to inform schools, pupils and guardians how the most basic of our personal data are used is outdated and out of touch with public feeling. Not to mention that it fails fair processing under data protection law.

Subject Access Request – User experience gets an ‘F’ for failing

The submission of the school census, including a set of named pupil records, is a statutory requirement on schools.

This means that children’s and parents’ data, regardless of how well or poorly informed they may be, are extracted for administrative purposes, and are used for various secondary reasons in addition to those we would expect.

Unless the Department for Education makes schools aware of the National Pupil Database use and users, the Department fails to provide an adequate process to enable schools to meet their local data protection requirements. If schools don’t know, they can’t process data properly.

So I wrote to the Department for Education (DfE) in September 2014, including the privacy notice used in schools like ours, showing it fails to inform parents how our children’s personal data and data about us (as related parent/guardians) are stored and onwardly used by the National Pupil Database (NPD). And I asked three questions:

1. I would like to know what information is the minimum you require for an individual child from primary schools in England?

2. Is there an opt out to prevent this sharing and if so, under what process can parents register this?

3. Is there a mechanism for parents to restrict the uses of the data (i.e. opt out our family data) with third parties who get data from the National Pupil Database?

I got back some general information, but no answer to my three questions.

What data do you hold and share with third parties about my children?

In April 2015 I decided to find out exactly what data they held, so I made a subject access request [SAR], expecting to see the data they held about my children. They directed me to ask my children’s school instead and to ask for their educational record. The difficulty with that is, it’s a different dataset.

My school is not the data controller of the National Pupil Database. I am not asking for a copy of my children’s educational records held by the school, but for the information that the NPD holds about me and my children. One set of data may feed the other, but they are separately managed. The DfE is the data controller for the data the NPD holds and as such, I believe, has data controller responsibility for it, not the school my children attend.

Why do I care? Well, for starters, I want to know if the data are accurate. And I want to know who else has access to it and for what purposes – the school can’t tell me that. They certainly couldn’t two months ago, as they had no idea the NPD existed.

I went on to ask the DfE for a copy of the publicly accessible subject access request (SAR) policy and procedures, aware that I was asking on behalf of my children. I couldn’t find any guidance, so asked for the SAR policy. They helpfully provided some advice, but I was then told:

“The department does not have a publicly accessible standard SAR policy and procedures document.”  and “there is not an expectation that NPD data be made available for release in response to a SAR.”

It seems policies are inconsistent. For this other DfE project, there is information about the database, how participants can opt out, and about respecting your choice. On the DfE website a Personal Information Charter sets out “what you can expect when we ask for and hold your personal information.”

It says: “Under the terms of the Data Protection Act 1998, you’re entitled to ask us:

  • if we’re processing your personal data
  • to give you a description of the data we hold about you, the reasons why we’re holding it and any recipient we may disclose it to (eg Ofsted)
  • for a copy of your personal data and any details of its source

You’re also entitled to ask us to change the information we hold about you, if it is wrong.

To ask to see your personal data (make a ‘subject access request’), or to ask for clarification about our processing of your personal data, contact us via the question option on our contact form and select ‘other’.”

So I did. But it seems that while this applies to that project, subject access requests are not to apply to the data they hold in the NPD. And they finally rejected my request last week, stating it is exempt:

[Screenshot: SAR rejection letter]

I appealed the decision on the basis that the section 33 Data Protection Act criteria given are not met:

“the data subject was made fully aware of the use(s) of their personal data (in the form of a privacy notice)”

But it remains rejected.

It seems incomprehensible that third parties can access my children’s data and I can’t even check to see if it is correct.

While acknowledging that under section 7 of the Data Protection Act 1998 (DPA) “an individual has the right to ask an organisation to provide them with information they hold which identifies them and, in certain circumstances, a parent can make such a request on behalf of a child”, they refused, citing the research, history and statistics exemption (i.e. section 33(4) of the DPA).

Fair processing, another F for failure and F for attitude

The Department for Education’s response to me said that it “makes it clear what information is held, why it is held, the uses made of it by DfE and its partners and publishes a statement on its website setting this out. Schools also inform parents and pupils of how the data is used through privacy notices.”

I have told the DfE the process does not work. The DfE/NPD web instructions do not reach parents. Even if they did, the information is thoroughly inadequate, and hides, whether deliberately or by omission, the commercial third-party use of data.

The Department for Education made a web update on 03/07/2015 with privacy information to be made available to parents by schools: http://t.co/PwjN1cwe6r

Despite this update this year, it is inadequate on two counts: content and communication.

To claim, as they did in response to me, that “The Department makes it clear to children and their parents what information is held about pupils and how it is processed, through a statement on its website” lacks any logic.

Updating their national web page doesn’t create a thorough communications process or engage anyone who does not know about it to start with.

Secondly, the new privacy policy is inadequate in its content and utterly confusing. What does this statement mean? Is there now some sort of opt-out on offer? I doubt it, but it is unclear:

“A parent/guardian can ask that no information apart from their child’s name, address and date of birth be passed to [insert name of local authority or the provider of Youth Support Services in your area] by informing [insert name of school administrator]. This right is transferred to the child once he/she reaches the age 16. For more information about services for young people, please go to our local authority website [insert link].” [updated privacy statement, July 3, 2015]

Information that I don’t know exists, about a database I don’t know exists, that my school does not know exists – and they believe a statement on their own website meets fair processing?

Appropriate at this time of year,  I have to ask, “you cannot be serious?”

Fair processing means transparently sharing the purpose or purposes for which you intend to process the information, not hiding some of the users through careful wording.

It thereby fails to legally meet the first data protection principle, as parents are not informed at all, never mind fully, of further secondary uses.

As a parent, when I register my child for school, I of course expect that some personal details must be captured to administer their education.

There must be data shared to adequately administer, best serve, understand, and sometimes protect our children.  And bona fide research is in the public interest.

However, I have been surprised in the last year to find, firstly, that I can’t ask what is stored about my own children and, secondly, that a wide range of sensitive data are shared through the Department for Education with third parties.

Some of these potential third parties don’t meet research criteria in my understanding of what a ‘researcher’ should be. Journalists? The MOD?

To improve, little additional time or work would be required to provide proper fair processing as a starting point. But to do so, the department can’t only update a policy on its website and think it’s adequate. And the newly updated suggested text for pupils is only going to add confusion.

The privacy policy text needs to be carefully reworded in human, not civil service, speak.

It must not omit [as it does now] the full range of potential users.

After all the Data Protection principles state that: “If you wish to use or disclose personal data for a purpose that was not contemplated at the time of collection (and therefore not specified in a privacy notice), you have to consider whether this will be fair.”

Now that it must be obvious to DfE that it is not the best way to carry on, why would they choose NOT to do better? Our children deserve better.

What would better look like? See part 3. The National Pupil Database end of year report: a D in transparency, C minus in security.

*****

[PS: I believe the Freedom of Information Officer tried their best and was professional and polite in our email exchanges, B+. Can’t award an A as I didn’t get any information from my requests. Thank you to them for their effort.]

*****

Updated on Sunday 19th July to include the criteria of my SAR rejection.

1. Our children’s school data: an end of year report card
2. The National Pupil Database end of year report: an F in fair processing
3. The National Pupil Database end of year report: a D in transparency, C minus in security

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under section 537A of the Education Act 1996.

[3] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-and-supporting-information

[4] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[5] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[6] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[7] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

The National Pupil Database end of year report: D for transparency, C minus in security.

Transparency and oversight of how things are administered are simple ways that the public can both understand and trust that things run as we expect.

For the National Pupil Database, parents might be surprised, as I was, about some of the current practices.

The scope of use of, and who could access, the National Pupil Database was changed in 2012, and although I had three children at school at that time, I heard nothing about it, nor did I read about it in the papers. (Hah – time to read the papers?) So I absolutely agree with Owen Boswarva’s post when he wrote:

“There appears to have been no concerted effort to bring the consultation or the NPD initiative to the attention of parents or pupils (i.e. the data subjects themselves). This is a quote from one of the parents who did respond:

“I am shocked and appalled that I wasn’t notified about this consultation through my child’s school – I read about it on Twitter of all things. A letter should have gone to every single parent explaining the proposals and how to respond to this consultation.”

(Now imagine that sentiment amplified via Mumsnet …)”
[July 2013, blog by O. Boswarva]

As Owen wrote, imagine that sentiment amplified via Mumsnet indeed.

Here’s where third parties can apply, and here’s a list of who has been given data from the National Pupil Database. (It’s only been updated twice in 18 months, the most recent update coming since I asked about it.) The tier groups 1-4 are explained here on p.18, where 1 is the most sensitive, identifiable classification.

The consultation suggested in 2012 that the changes could be an “effective engine of economic growth, social wellbeing, political accountability and public service improvement”.

Has anyone measured whether the justification given has begun to be achieved? Research can often take a long time, and implementing any changes as a result, more time. But perhaps some measure of public benefit has already begun to accrue?

The release panel would, one hopes, have begun to track this. [update: the DfE confirmed on August 20th that they do not track benefits, nor have they ever done any audit of recipients]

And in parallel what oversight governs checks and balances to make sure that the drive for the ‘engine of economic growth’ remembers to treat these data as knowledge about our children?

Is there that level of oversight from application to benefits measurement?

Is there adequate assessment of privacy impact and ethics in applications?

What troubles me about the National Pupil Database is not the data it contains per se, but the lack of child/guardian involvement, the lack of accountable oversight of how it is managed, and the lack of full transparency around who uses it and its processes.

Some practical steps forward

Steps taken now could resolve some of these issues and avoid the risk of them becoming future concerns.

The first is thorough fair processing, as I covered in my previous post.

The submission of the school census returns, including a set of named pupil records, has been a statutory requirement on schools since the Education Act 1996. That’s almost twenty years ago in the pre-mainstream internet age.

The Department must now shape up its current governance practices in its capacity as the data processor and controller of the National Pupil Database, to be fit for the 21st century.

Ignoring current weaknesses actively accepts an ever-increasing reputational risk for the Department, for schools, for other data sharing bodies and those who link to the data, and for its bona fide research users. If people lose trust in how data are used, they won’t share at all and the quality of data will suffer: bad for the functional administration of the state and the individual, but also for the public good.

That concerns me also wearing my hat as a lay member on the ADRN panel, because it’s important that the public trusts our data are looked after wisely, so that research can continue to use them for advances in health, social science and all sorts of other areas of knowledge, to improve our understanding of society and make it better.

Who decides who gets my kids data, even if I can’t?

A Data Management Advisory Panel (DMAP) considers only some of the applications: tier 1 data requests. Those are the most sensitive, but not the only, applications for access to sensitive data.

“When you make a request for NPD data it will be considered for approval by the Education Data Division (EDD) with the exception of tier 1 data requests, which will be assessed by the department’s Data Management Advisory Panel. The EDD will inform you of the outcome of the decision.”

Where is governance transparency?

What is the make-up of the Data Management Advisory Panel and the Education Data Division (EDD)? Who sits on them, and how are they selected? Do they document their conflicts of interest for each application? For how long are they appointed, and under what selection criteria?

Where is decision outcome transparency?

The outcome of each decision should be documented and published. However, the list has been updated only twice since its inception in 2012: once in December 2013, and most recently on, ahem, May 18 2015, after considerable prodding. There should be a regular timetable, with a responsible owner, and a depth of insight into its decision making.

Where is transparency over decision making to approve or reject requests?

Do privacy impact assessments and ethics reviews play any role in their application and if so, how are they assessed and by whom?

How are those sensitive and confidential data stored and governed?

The weakest link in any system is often said to be human error. Users of the NPD data vary from other government departments to “Mom and Pop” small home businesses, selling schools’ business intelligence and benchmarking.

So how secure are our children’s data really, and once the data have left the Department database, how are they treated? Does lots of form filling and emailed data with a personal password ensure good practice, or simply provide barriers to slow down the legitimate applications process?

What happens to data that are no longer required for the given project? Are they properly deleted and what audits have ever been carried out to ensure that?

The National Pupil Database end of year report: a C- in security

The volume of data that can be processed now at speed is incomparable with 1996, and even 2012 when the current processes were set up. The opportunities and risks in cyber security have also moved on.

Surely the Department for Education should take its responsibility seriously and treat our children’s personal data and sensitive records as well as the HSCIC now intends to manage health data?

Processing administrative or linked data in an environment with layered physical security (e.g. a secure perimeter, CCTV, security guarding, or a locked room without remote connection such as internet access) is good practice, and reduces the risk of silly human error, or simple theft.

Is giving out chunks of raw data by email, with reams of paperwork as its approval ‘safeguards’ really fit for the 21st century and beyond?

[image: NPD data tiers]

Twenty years on from the conception of the National Pupil Database, it is time to treat the personal data of our future adult citizens with the respect it deserves and we expect of best-in-class data management.

It should be as safe and secure as we treat other sensitive government data, and lessons could be learned from the FARR, ADRN and HSCIC safe settings.

Back to school – more securely, with public understanding and transparency

Understanding how all of that works, how technology and people, data sharing and privacy, data security and trust all tie together, is fundamental to understanding the internet. When administrations take our data, they take on responsibilities for some of our participation in the dot.everyone that the state is so keen for us all to take part in. Many of our kids will live in the world which is the internet of things. Not getting that is not understanding the internet.

And to reiterate some of why that matters, I go back to my previous post in which I recently quoted Martha Lane Fox and the late Aaron Swartz, who said: “It’s not OK not to understand the internet, anymore”.

While the Department for Education has turned down my subject access request to find out what the National Pupil Database stores on my own children, the issues matter too much to be brushed aside as important only for me. About 700,000 children are born each year and are added to this database every academic year. None ever get deleted.

Parents can, and must ask that it is delivered to the highest standards of fair processing, transparency, oversight and security. I’m certainly going to.

It’s going to be Back to School in September, and those annual privacy notices, all too soon.

*****

1. The National Pupil Database end of year report card

2. The National Pupil Database end of year report: an F in fair processing

3. The National Pupil Database end of year report: a D in transparency

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[3] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[5] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[6] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

What is in the database?

The Schools Census dataset contains approximately eight million records, added incrementally every year (starting in 1996), and includes variables on the pupil’s home postcode, gender, age, ethnicity, special educational needs (SEN), free school meals eligibility, and schooling history. It covers pupils in state-funded primary, secondary, nursery and special schools, and pupil referral units. Schools that are entirely privately funded are not included.

Pupils can be tracked across schools and followed throughout their school careers. The database also provides a very rich set of data on school characteristics, and there is further use in linking it with other related datasets, such as those on higher education, neighbourhoods and teachers in schools.

Data stored include the full range of personal and sensitive data, from name, date of birth and address through SEN and disability needs. (Detail of the content is here.) To see what is in it, download the excel sheet: NPD Requests.
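To make concrete how rich a single pupil-level record is, and how a persistent reference allows a child to be followed across schools and years, here is a purely illustrative sketch. The field names and values are hypothetical, not the actual NPD schema:

```python
# Purely illustrative: field names and values are hypothetical,
# not the real NPD schema.
from dataclasses import dataclass

@dataclass
class PupilRecord:
    pupil_ref: str       # persistent identifier enabling tracking across years
    home_postcode: str
    gender: str
    age: int
    ethnicity: str
    sen_provision: str   # special educational needs
    fsm_eligible: bool   # free school meals eligibility
    school_id: str
    academic_year: str

# One census record per pupil per year; the persistent reference links them,
# so a schooling history can be reconstructed from annual snapshots.
census = [
    PupilRecord("A1B2C3", "LS1 4AB", "F", 7, "White British", "None", True, "SCH001", "2013/14"),
    PupilRecord("A1B2C3", "LS1 4AB", "F", 8, "White British", "None", True, "SCH002", "2014/15"),
]

history = [r.school_id for r in census if r.pupil_ref == "A1B2C3"]
print(history)
```

Even without a name attached, the combination of postcode, age, ethnicity and school in linked annual records illustrates why such data are classed as sensitive and identifiable.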

 

The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under Section 537A of the Education Act 1996.

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech to share ideas and knowledge in real time with people around the world.  The freedom to fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated the Internet may not work well for its citizens and its potential is restricted, as well as its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design and who will control  it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable and people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be of trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or fun experiment and how will users [especially for mental health apps where the outcomes may be less tangibly measured than say, blood glucose] know the difference?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create barriers instead?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We assume that digital designs will put the core principles of the NHS at their heart. But if apps are not available on prescription and are essentially commercial products with no proven benefit, does that exploit trust in the NHS brand?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person as any other treatment could; regulation must further consider the reputational risk to the NHS and to the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading the benefits will not be undermined by artificial boundaries that restrict access to the tools by affordability, postcode lottery, or language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Exclusion is already measurable, even in the UK today: in maternity care, the poorest are those who cannot afford the smartphone data allowance that e-red book access costs, attendees at #kfdigital15 were told by its founder.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area – a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use for all, like any other treatment? Or is it yet another example of NHS postcode lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and can afford apps, leaving behind the needs of those who can’t afford to pay for them.

At home, it is some of the least wealthy who need the most intervention, and from whom there may be little profit to be made. There is little I can see in the 2020 plans that focuses on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However, it is a question of principle that will decide for or against excluding users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England approved apps going to account for geography and language and cross country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.

[photo: two boys on their way to school]

This photo, reportedly from Indonesia, is great [via Banksy on Twitter, and apologies that I cannot credit the photographer]: two boys on the way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of  data sharing restrictions: Barrier or Enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to data sharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that believe behavioural barriers to data sharing are an obstacle,  have forgotten that trust is not something to be overcome, but to be won and continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is prescribed, used, and data exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in an in-store app does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When the recent social media experiment by Facebook only mentioned the use of data for research after the experiment, it caused outcry.

It crossed the line between what people felt was acceptable and what was intrusive, by analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen is both a risk and a cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” ask the words of T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters not only for today’s projects, but for the whole future that digital is helping create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome.

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding app consumer regulation, to enable fairness of access, and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers,
ii. pro-actively designing ethics and change into ongoing projects, and
iii. ensuring engagement is genuine collaboration and co-production.

The barriers do not need to be got around; solutions need to be built by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using them in ‘bona fide’ research, often linked with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic ones. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to making money from our data, the public is less happy. The economic value of data raises more questions about use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in the income tax paid by individuals and corporations who make a profit by using the data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how do public feeling and the range of public opinions get taken into account in the cost and benefit accounting of the public good?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing, measured by public feeling in 2014, is a threat to data use in the public interest. So what are we doing to fix it, and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events[1] were repeatedly about commercial companies’ use, and re-use of data, third parties accessing data for unknown purposes and the resultant loss of confidentiality.

 Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not happy for it to be used for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on a sense of loss of ownership, and with that, the inability to consent to who uses the data. This despite acknowledgment that patients own their data.

Significant concern centres on uses of the information gleaned from data that patients consider commercial exploitation: segmenting the insurance markets, consumer market research, targeting individuals, and the utter lack of governance around it all.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings between NHS England and the supermarket giant Tesco throughout 2014 [2], held by the Patients and Information Director responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison to maintain while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing, and in wider fora, is no longer an inconvenient ethical dilemma best left on the shelf, as it has been for the last 25 years of secondary use, only dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required and is good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care during medical treatment itself, does not cause concern.

It is increasingly assumed in discussions I have heard [at CCG and other public meetings] that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. They may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and consent for those purposes cannot ethically be assumed. However, legal gateways, which permit access to the data for clearly defined secondary purposes, may make that data sharing legal.

With that legal assumption comes, for the majority of people polls and dialogue show [though not for everyone 6b], a degree of automatic support for bona fide research in the public interest. But it is by no means a blanket for all secondary uses, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active, rather than implied, consent becomes a requirement for research purposes without differentiation between kinds, the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now around data sharing [6]. It has been a high price to pay for public health research, and for other research delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in what kinds of research, and if not happy with one kind, may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of  good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

Differentiation. Telling customers apart and grouping them by similarities is what commercial data managers want.

It enables them to target customers with advertising and sales promotion most effectively. They segment the market into chunks and treat one group differently from another.

They use market research data, our loyalty card data, to get that detailed information about customers, and decide how to target each group for what purposes.

As the EU states debate how research data should be used and how individuals should be both enabled and protected through it, they might consider separating research purposes by type.

While people are happy for the state to use their data without active consent for bona fide research, they are not happy for it to be used for commercial consumer research. [ref part 1]

Separating consumer and commercial market research from the definition of research purposes for the public good by the state, could be key to rebuilding people’s trust in government data use.

Having separate purposes would permit separate consent and control procedures to govern them.

But what role will profit play in the state’s definition of ‘in the public interest’? Is it in the public interest if UK plc makes money from its citizens? And how far along any gauge of public feeling will a government be prepared to go in making money for UK plc at our own personal cost?

Pay-for-privacy?

In January this year, the Executive Vice President at Dunnhumby, Nishat Mehta, wrote in this article [7] about how he sees the future of data sharing between consumers and commercial traders:

“Imagine a world where data and services that are currently free had a price tag. You could choose to use Google or Facebook freely if you allowed them to monetize your expressed data through third-party advertisers […]. Alternatively, you could choose to pay a fair price for these services, but use of the data would be forbidden or limited to internal purposes.”

He too talked about health data, specifically about its value when accurate, expressed and consensual:

“As consumers create and own even more data from health and fitness wearables, connected devices and offline social interactions, market dynamics would set the fair price that would compel customers to share that data. The data is more accurate, and therefore valuable, because it is expressed, rather than inferred, unable to be collected any other way and comes with clear permission from the user for its use.”

What his pay-for-privacy model appears to have forgotten, is that this future consensual sharing is based on the understanding that privacy has a monetary value. And that depends on understanding the status quo.

It is based on the individual realising that there is money made from their personal data by third parties today, and that there is a choice.

The extent of this commercial sharing and re-selling will be a surprise to most loyalty card holders.

“For years, market research firms and retailers have used loyalty cards to offer money back schemes or discounts in return for customer data.”

However, despite being signed up for years, I believe most of the public are unaware of the implied deal. It may be in the small print. But everyone knows few read it, in the rush to sign up and save money.

Most shoppers believe the supermarket is buying our loyalty. We return to spend more cash because of the points. Points mean prizes, petrol coupons, or pounds off.

We don’t realise our personal identity and habits are being invisibly analysed to the nth degree and sold by supermarkets as part of those sweet deals.

But is pay-for-privacy discriminatory? By creating the freedom to choose privacy as a pay-for option, it excludes those who cannot afford it.

Privacy should be seen as a human right, not as a pay-only privilege.

Today we use free services online but our data is used behind the scenes to target sales and ads often with no choice and without our awareness.

Today we can choose to opt in to loyalty schemes and trade our personal data for points and with it we accept marketing emails, and flyers through the door, and unwanted calls in our private time.

The free option is never to sign up at all, but by doing so customers pay a premium: they forgo the vouchers and discounts, or trade away the convenience of online shopping.

There is a personal cost in all three cases, albeit in a rather opaque trade off.

 

Does the consumer really benefit in any of these scenarios or does the commercial company get a better deal?

In the sustainable future, only a consensual system based on understanding and trust will work well. That’s assuming that by ‘well’ we mean organisations wish to prevent PR disasters and practical disruption, such as resulted for NHS data in the last year through care.data.

For some people the personal cost of the infringement of privacy by commercial firms is great. Others care less. But once informed, there is a choice on offer even today: pay for privacy from commercial business, whether by paying a premium for goods by not signing up for loyalty schemes, or pay with our privacy.

In future we may see a more direct pay-for-privacy offering along the lines Nishat Mehta describes.

And if so, citizens will be asking ever more about how their data is used in all sorts of places beyond the supermarket.

So how can the state profit from the economic value of our data but not exploit citizens?

‘Every little bit of data’ may help consumer marketing companies.  Gaining it or using it in ways which are unethical and knowingly continue bad practices won’t win back consumers and citizens’ trust.

And whether it is a commercial consumer company or the state, people feel exploited when their information is used to make money without their knowledge and for purposes with which they disagree.

Consumer commercial use and use in bona fide research are separate in the average citizen’s mind and understood in theory.

Achieving differentiation in practice in the definition of research purposes could be key to rebuilding consumers’ trust.

And that would be valid for all their data, not only what data protection labels as ‘personal’. For the average citizen, all data about them is personal.

Separating in practice how consumer businesses use data about customers for company profit, how the benefits are shared with individuals in a trade for our privacy, and how bona fide public research benefits us all, would help win continued access to our data.

Citizens need and want to be offered paths to see how our data are used in ways which are transparent and easy to access.

Cutting away purposes which appear exploitative from purposes in the public interest could benefit commerce, industry and science.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

That will create more opportunity for data to be used in the public interest, which will increase the public good; both economic and social which the government hopes to see expand.

And that could mean a happy ending for everyone.

The Economic Value of Data vs the Public Good? They need not be mutually exclusive. But if one exploits the other, it has the potential to continue to be corrosive. UK plc cannot continue to assume its subjects are willing creators and repositories of information to be used for making money. [ref 1] To do so has lost trust in all uses, not only those in which citizens felt exploited. [6]

The economic value of data used in science and health, whether to individual app creators, big business or the commissioning state in planning and purchasing, is clear. Perhaps not often quantified or discussed in the public domain, but it clearly exists.

Those uses can co-exist with good practices to help people understand what they are signed up to.

By defining ‘research purposes’, by making how data are used transparent, and by giving real choice in practice to consent to differentiated data for secondary uses, both commercial and state will secure their long term access to data.

Privacy, consent and separation of purposes will be wise investments for data’s continued use across commercial and state sectors.

Let’s hope they are part of the coming ‘long-term economic plan’.

****

Related to this:

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

image via Tesco media

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

 

Nothing to fear, nowhere to hide – a mother’s attempt to untangle UK surveillance law and programmes

“The Secret Service should start recruiting through Mumsnet to attract more women to senior posts, MPs have said.”
[SkyNews, March 5, 2015]

Whilst we may have always dreamed of being ‘M’, perhaps we can start by empowering all Mums to understand how real-life surveillance works today, in all our lives, homes and schools.

In the words of Franklin D. Roosevelt at his 1933 inaugural address:

“This is preeminently the time to speak the truth, the whole truth, frankly and boldly…

“Let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”

In the debate on the ‘war on terror’, it is hard to know what is truthful, and which fears are ‘justified’ as opposed to ‘nameless and unreasoning’.

To be reasoned, we need information to understand what is going on, and the picture can feel complex and unclear.

What concrete facts do you and I have about terrorism today, and the wider effects it has on our home life?

If you have children in school, or are a journalist, a whistleblower or a lawyer, or have thought about the effects of the news recently, these issues may affect our children, or any of us, in ways we may not expect.

It might surprise you that surveillance law was used to track a mother and her children’s movements [1] when a council wasn’t sure if her school application was for the correct catchment area. The council legally used the Regulation of Investigatory Powers Act 2000 (RIPA). [2]

Recent headlines are filled with the story of three more girls who are reported to have travelled to Syria.

As a Mum I’d be desperate for my teens, and I cannot imagine what their families must feel. There are conflicting opinions, and politics, but let’s leave that aside. These girls are each somebody’s daughter, and at risk.

As a result MPs are talking about what should be taught in schools. Do parents and citizens agree, and do we even know what is taught?

Shadow business secretary Chuka Umunna, Labour MP told Pienaar’s Politics on BBC Radio 5 Live: “I really do think this is not just an issue for the intelligence services, it’s for all of us in our schools, in our communities, in our families to tackle this.”

Justice Minister Simon Hughes told Murnaghan on Sky News it was important to ensure a counter-argument against extremism was being made in schools and also to show pupils “that’s not where excitement and success should lie”. [BBC 22 February 2015]

There are already policies in schools that touch all our children and laws which reach into our family lives that we may know little about.

I have lots of questions about what and how we are teaching our children about ‘extremism’ in schools, and how the state uses surveillance to monitor our children’s and our own lives.

This may affect all schools and places of education, not just those we hear about in the news, so it includes yours.

We all want the best for our young people and security in society, but are we protecting and promoting the right things?

Are today’s policies in practice, helping or hardening our children’s thinking?

Of course I want to see that all our kids are brought up safe. I also want to bring them up free from prejudice and see they get equal treatment and an equal start in life in a fair and friendly society.

I think we should understand the big picture better.

1. Do you feel comfortable that you know what is being taught in schools, or what is done with information recorded or shared by schools, or about the Prevent programme’s proposed expansion to pre-schools and the toddlers in them?

2. Do government communications’ surveillance programmes in reality, match up with real world evidence of need, and how is it measured to be effective?

3. Do these programmes create more problems as side-effects we don’t see or don’t measure?

4. If any of our children have information recorded about them in these programmes, how is it used, who sees it and for what purposes?

5. How much do we know about the laws brought in under the banner of ‘counter-terror’ measures, and how they are used for all citizens in everyday life?

We always think unexpected things will happen to someone else, and everything is rightfully justified in surveillance, until it isn’t.

Labels can be misleading.

One man’s terrorist may be another’s freedom fighter.

One man’s investigative journalist is another’s ‘domestic extremist.’

Who decides who is who?

Has anyone asked in Parliament: why has religious hate crime escalated by 45% in 2013/14, and what are we doing about it? (up 700 to 2,273 offences, Crime figures [19])

These aren’t easy questions, but we shouldn’t avoid asking them because it’s difficult.

I think we should ask: do we have laws which discriminate by religion, censor our young people’s education, or store information about us which is used in ways we don’t expect or know about?

Our MPs are after all, only people like us, who represent us, and who make decisions about us, which affect us. And on 7th May, they may be about to change.

As a mother, whoever wins the next General Election matters to me because it will affect the next five years or more, of what policies are made which will affect our children, and all of us as citizens.

It should be clear what these programmes are and there should be no reason why it’s not transparent.

“To counter terrorism, society needs more than labels and laws. We need trust in authority and in each other.”

We need trust in authority and in each other in our society, built on a strong and simple legal framework and founded on facts, not fears.

So I think this should be an election issue. What does each party plan on surveillance to resolve the issues outlined by journalists, lawyers and civil society? What applied programmes does each party see that will be, in practical terms: “for all of us in our schools, in our communities, in our families to tackle this.”

If you agree, then you could ask your MP, and ask your prospective parliamentary candidates. What is already done in real life and what are their future policies?

Let’s understand ‘the war on terror’ at home better, and its real impacts. These laws and programmes should be transparent, easy to understand, and not only legal, but clearly just, and proportionate.

Let’s get back to some of the basics, and respect the rights of our children.

Let’s start to untangle this spaghetti of laws; the programmes, that affect us in practice; and understand their measures of success.

Solutions to protecting our children, are neither simple or short term. But they may not always mean more surveillance.

Whether the Secret Service will start recruiting through Mumsnet or not, we could start with better education of us all.

At very least, we should understand what ‘surveillance’ means.

****

If you want to know more detail, I look at this below.

The laws applied in Real Life

Have you ever looked at case studies of how surveillance law is used?

In one case, a mother and her children’s movements [1] were watched and tracked when a council wasn’t sure if her school application was for the correct catchment area. The council legally used the Regulation of Investigatory Powers Act 2000 (RIPA). [2]

Do you think it is just or fair that a lawyer’s conversations with his client [3] were recorded and may have been used in preparing the trial, when the basis of our justice system is innocent until proven guilty?

Or is it right that journalists’ phone records could be used to identify people by the police, without telling the journalists or getting independent approval, from a judge for example?

These aren’t theoretical questions but stem from real-life uses of laws used in the ‘counter terrorism’ political arena and in practice.

Further programmes store information about everyday people which we may find surprising.

In November 2014 it was reported that six British journalists [4] had found out personal and professionally related information had been collected about them, and was stored on the ‘domestic extremist’ database by the Metropolitan Police in London.

They were not criminal nor under surveillance for any wrongdoing.

One of the journalists wrote in response in a blog post on the NUJ website [5]:

“…the police have monitored public interest investigations in my case since 1999. More importantly if the police are keeping tabs on a lightweight like myself then they are doing the same and more to others?”

Ever participated in a protest, or if not, reported on one?

‘Others’ in that ‘domestic extremist list’ might include you, or me.

Current laws may be about to change [6] (again) and perhaps for the good, but will yet more rushed legislation in this area be done right?

There are questions over the detail and what will actually change. There are multiple bills affecting security, counter-terrorism and data access in parliament, panels and reviews going on in parallel.

The background to this is the culmination of concern and pressure over a long period of time, focused on one set of legal rules: the Regulation of Investigatory Powers Act (RIPA).

The latest draft code of practice [7] for the Regulation of Investigatory Powers Act (RIPA) [8] allows the police and other authorities to continue to access journalists’ and other professionals’ communications without any independent process or oversight.

‘Nothing to hide, nothing to fear’ is a phrase we hear said of surveillance, but as these examples show, its use is widespread and often unexpected, not reserved for the extremes we are told about.

David Cameron most recently called for ever wider surveillance legislation, again in The Telegraph, Jan 12 2015, saying: [9]

“That is why in extremis it has been possible to read someone’s letter, to listen to someone’s telephone, to mobile communications.”

Laws and programmes enable and permit these kinds of activity which are not transparent to the broad public. Is that right?

The Deregulation Bill contains changes which now appear to have been amended to keep the provisions affecting journalists in PACE [10] after all, but what effects are there for other professions, and how exactly will this change interact with further new laws such as the Counter Terrorism and Security Act [p20]? [11]

It’s understandable that politicians are afraid of doing nothing; if a terrorist attack takes place, they risk looking like they failed.

But it appears that politicians may have got themselves so keen to be seen to be doing ‘something’ in the face of terror attacks, that they are doing too much, in the wrong places, and we have ended up with a legislative spaghetti of simultaneous changes, with no end in sight.

It’s certainly no way to make legal changes understandable to the British public.

Political change may come as a result of the General Election. What implications will it have for the applied ‘war-on-terror’ and average citizen’s experience of surveillance programmes in real life?

What do we know about how we are affected? The harm to some in society is real, and is clearly felt in some, if not all communities. [12]

Where is the evidence, to include in the debate, of how these laws affect us in real life and what difference they make versus their intentions?

Anti-terror programmes in practice; in schools & surgeries

In addition to these changes in law, there are a number of programmes in place at the moment.

The Prevent programme [16] I already mentioned above.

Its expansion to wider settings would include our children from age 2 and up, who would come under an additional level of scrutiny and surveillance. [Criticism of the proposal has come from across the UK.]

How might what a three-year-old says or draws be interpreted, or recorded about them, or their family? Who accesses that data?

What film material is being produced that is: ” distributed directly by these organisations, with only a small portion directly badged with government involvement” and who is shown it and why? [Review of Australia‘s Counter Terror Machinery, February 2015] [17]

What if it’s my child who has something recorded about them under ‘Prevent’? Will I be told? Who will see that information?  What do I do if I disagree with something said or stored about them?

Does surveillance benefit society or make parts of it feel alienated and how are both its intangible cost and benefit measured?

When you combine these kinds of opaque, embedded programmes in education or social care with political thinking which could appear to be based on prejudice not fact [18], the outcomes could be unexpected, and reminiscent of 1930s anti-religious laws.

Baroness Hamwee raised this concern in the Lords on the 28th January, 2015 on the Prevent Programme:

“I am told that freedom of information requests for basic statistics about Prevent are routinely denied on the basis of national security. It seems to me that we should be looking for ways of providing information that do not endanger security.

“For instance, I wondered how many individuals are in a programme because of anti-Semitic violence. Over the last day or two, I have been pondering what it would look like if one substituted “Jewish” for “Muslim” in the briefings and descriptions we have had.” Baroness Hamwee:  [28 Jan 2015 : Column 267 -11]

“It has been put to me that Prevent is regarded as a security prism through which all Muslims are seen and that Muslims are suspect until proved otherwise. The term “siege mentality” has also been used.

“We have discussed the dangers of alienation arising from the very activities that should be part of the solution, not part of the problem, and of alienation feeding violence. […]

“Transparency is a very important tool … to counter those concerns.”

Throughout history, good and bad have depended on your point of view. In 70s London, with today’s technology, would all Catholics have been swept under this extra scrutiny?

“Early education funding regulations have been amended to ensure that providers who fail to promote the fundamental British values of democracy, the rule of law, individual liberty and mutual respect and tolerance for those with different faiths and beliefs do not receive funding.” [consultation guidance Dec 2014]

The programme’s own values seem undermined by its attitudes to religion and individual liberty. On universities, the same paragraph on ‘freedom of speech’ suggests restrictive planning measures on protest meetings and IT surveillance of material accessed for ‘non-research purposes’.

School and university are a time when our young people explore all sorts of ideas, including being able to understand and criticise them. Just looking at material online should not necessarily have any implications. Do we really want to censor what our young people should and should not think about, and who decides the criteria?

For families affected by violence, nothing can justify their loss, and we may want to do anything to prevent it happening again.

But are we seeing widespread harm in society as side effects of surveillance programmes?

We may think we live in a free and modern society. History tells us how easily governments can slide into previously unthinkable directions. It would be complacent to think ‘it couldn’t happen here.’

Don’t forget, religious hate crime escalated by 45% in 2013/14. (Crime figures [19])

Writers self-censor their work.  Whistleblowers may not come forward to speak to journalists if they feel actively watched.

Terrorism is not new.

Young people with fervour to do something for a cause and going off ‘to the fight’ in a foreign country is not new.

In the 1930s the UK Government made it illegal to volunteer to fight in Spain in the civil war, but over 2,000 went anyway.

New laws are not always solutions, especially when ever stricter surveillance laws may still not mean any better accuracy of terror prevention on the ground. [As Charlie Hebdo and Copenhagen showed: in these cases the people involved were already known to police. In the case of Lee Rigby it was even more complex.]

How about improving our citizens’ education and transparency about what’s going on & why, based on fact and not fear?

If the state shouldn’t nanny us, then it must allow citizens and parents transparency and understanding of the current reality, so we can inform ourselves and our children in practical ways, and know if we are being snooped on or recorded under surveillance.

There is an important role for cyber experts and civil society in educating and challenging MPs on policy. There is also a very big gap in practical knowledge for the public, which should be addressed.

Can I trust that information I discuss with my doctor or lawyer will be kept confidential, or that it will be if I come forward as a whistleblower?

Do I know whether my email and telephone conversations, or social media interactions are being watched, actively or by algorithms?

Do we trust that we are treating all our young people equally and without prejudice and how are we measuring impact of programmes we impose on them?

To counter terrorism, society needs more than labels and laws

We need trust in authority and in each other in our society, built on a strong and simple legal framework and founded on facts, not fears.

If the Prevent programme is truly needed at this scale, tell us why and tell us all what our children are being told in these programmes.

We should ask our MPs, even though the consultation is closed: what is the evidence behind the thinking on getting Prevent into toddler settings and far more? What risks and benefits have been assessed for any of our children and families who might be affected?

Do these efforts need to be expanded to include two-year-olds?

Are all efforts to keep our kids and society safe equally effective and proportionate to potential and actual harm caused?

Alistair MacDonald QC, chairman of the Bar Council, said:

“As a caring society, we cannot simply leave surveillance issues to senior officers of the police and the security services acting purportedly under mere codes of practice.

“What is surely needed more than ever before is a rigorous statutory framework under which surveillance is authorised and conducted.”

Whether we are disabled PIP protesters outside parliament or mothers on the school run, journalists or lawyers, doctors or teachers, or anyone else, these changes in law, or the lack of them, may affect us. Baroness Hamwee clearly sees harm caused in the community.

Development of a future legislative framework should reflect public consensus, as well as the expert views of technologists, jurists, academics and civil liberty groups.

What don’t we know? and what can we do?

According to an Ipsos MORI poll for the Evening Standard in October 2014 [20], only one in five people think the police should be free to trawl through the phone records of journalists to identify their sources.

Sixty-seven per cent said the approval of a judge should be obtained before such powers are used.

No one has asked the public whether we think the Prevent programme is appropriate or proportionate, as far as I recall.

Who checks that actions taken under it are reasonable and not reactionary?

We really should be asking: what are our kids being shown, taught or informed about, and how might they be informed upon?

I’d like all of that in the public domain, for all parents and guardians. The curriculum, who is teaching and what materials are used.

It’s common sense to see that young people who feel isolated or defensive are less likely to talk to parents about their concerns.

“Nothing to hide, nothing to fear” is a well-known line in the surveillance debate. But the argument is flawed, because information can be wrong.

‘Nothing to fear, nowhere to hide’ may become an alternative meme we hear debated about surveillance soon, if the internet and all communications are routinely tracked without oversight.

To ensure proper judicial oversight in all these laws and processes – to have an independent judge give an extra layer of approval – would restore public trust in this system and the authority on which it depends.

It could pave the way for a new hope of restoring the checks and balances in many governance procedures, which a just and democratic society deserves.

As Roosevelt said: “let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror.”

 

******

[On Channel4OD: Channel 4 – Oscar winning, ‘CitizenFour’  Snowden documentary]

References:

[1] The Guardian, 2008, council spies on school applicants

[2] Wikipedia RIPA legislation

[3] UK admits unlawfully monitoring communications

[4] http://www.theguardian.com/uk-news/2014/nov/20/police-legal-action-snooping-journalists

[5] Journalist’s response

[6] SOS Campaign

[7] RIPA Consultation

[8] The RIPA documents are directly accessible here

[9] The Telegraph

[10] Deregulation Bill

[11] Counter Terrorism and Security Act 2015

[12] Baroness Hamwee comments in the House of Lords [Hansard]

[13] Consultation response by charity Children in Scotland

[14] The Telegraph, Anti-terror plan to spy on toddlers ‘is heavy-handed’

[15] GPs told to specify counter terrorism leads [Prevent]

[16] The Prevent programme, BBC / 2009 Prevent programme for schools

[17] Review of Australia’s CT Machinery

[18] Boris Johnson, March 2014

[19] Hate crime figures 2013-14

[20] Ipsos MORI poll, October 2014

 

******

 image credit: ancient history


Continue reading Nothing to fear, nowhere to hide – a mother’s attempt to untangle UK surveillance law and programmes

On Being Human – moral and material values

The long running rumours of change afoot on human rights political policy were confirmed recently, and have been in the media and on my mind since.

Has human value become not just politically acceptable, but politically valuable?

Paul Bernal in his blog addressed the subject which has been on my mind, ‘Valuing the Human’ and explored the idea, ‘Many people seem to think that there isn’t any value in the human, just in certain kinds of human.’

Indeed, in recent months there appears to have been the creation of a virtual commodity, making this concept of human value “not just politically acceptable, but politically valuable.” That commodification of human value was starkly highlighted by Lord Freud’s recent comments on human worth. How much a disabled person should earn was the focus of the remarks, but they conflated the price of labour with human value.

European Rights undermined

Given the party policy announcements, and the response by others in government or the lack of it, it is unsurprising that those familiar with human rights feel they will be undermined should the policy proposals ever take effect. As the nation gears up into full electioneering mode for May 2015, we have heard much, after party speeches, about rights and responsibilities in our dealings with European partners, and about what Europe contributes to, or takes away from, our sovereignty in terms of UK law. There has been some inevitable back-slapping, and generalisation in some quarters that everything ‘European’ is bad.

Whether or not our state remains politically within the EU may be up for debate, but our tectonic plates are not for turning. So I find it frustrating when politicians, or the media, speak of ‘pulling out of Europe’ or similar.

This conflation of language is careless, but I fear it is also dangerous at a time when the right-wing fringe is taking mainstream votes, and politicians, in by-elections. Both here in the UK and in other European countries this year, far-right groups have taken significant votes.

Loose language about what ‘Europe’ is colours our common understanding of what ‘Europe’ means; the nuances of the roles different bodies play, for example the differences between the European Court of Human Rights and the European Court of Justice, and their purposes, are lost entirely.

The values invoked in the debate are therefore misaligned with the organisations’ duties, and all things ‘European’ are tarred with the same ‘interfering’ brush and devalued.

Human rights were not, at their heart, created by ‘Europe’, nor are they merely a treaty to be opted out from [though many are enshrined in treaties and Acts which were, and are], but their values risk being conflated with the structures which support them.

“A withdrawal from the convention could jeopardise Britain’s membership of the EU, which is separate to the Council of Europe whose members are drawn from across the continent and include Russia and Ukraine. Membership of the Council of Europe is a requirement for EU member states.” [Guardian, October 3rd – in a clearly defined article]

Participation in the infrastructure of ‘Brussels’, however, is conveniently conflated with values: a loss of sovereignty, a loss of autonomy, frivolous legislation. Opting out of a convention should not mean changing our values. Yet the party attitude now on show seeks to withdraw from the convention. That would mean withdrawing the protections the structure offers. Would it also mean withdrawing rights offered to all citizens equally?

Ethical values undermined

Although it varies culturally, and with few exceptions, I think we in England do have a collective sense of what is fair, and of how we wish to treat each other as human beings. Increasingly, however, it feels as though, through loose language or the abuse of it in political debate, we may be giving ground on our ethics. We are being forced to bring the commodity of human value to the podium, and declare on which side we stand in party politics. In a time of austerity, there is a broad range of ideas of how.

Welfare has become branded ‘benefits’. Migrant workers are ‘foreigners’, over here for ‘benefit tourism’. The disabled are labelled ‘fit for work’ regardless of medical fact. Increasingly in the UK, it appears, some citizens are being measured by the economic, material value they contribute to, or take away from, ‘the system’.

I’ve been struck by the contrast, coming back after 12 years abroad, to find England a place where the emphasis is on living to work, not working to live. If we’re not careful, we come to see our output at work as the measure of our value. Are humans to be measured only in terms of output, by productivity, by our ‘doing’, or by the intrinsic value of an individual life, simply by our ‘being’? If we go along with the idea that we are here to serve some productive goal in society on an economic basis, then our ‘doing’ is valued on a material basis.

“We hear political speeches talking about ‘decent, hardworking people’ – which implies that there are some people who are not as valuable.”

I strongly agree with this point in Paul’s blog and, as he does, disagree with the value statement it implies.

Minority Rights undermined

There are minorities and segments of society whose voices are either ignored or actively quietened. Those on the outer edge of the umbrella ‘society’ offers us in our collective living are perhaps least easily afforded its protections: Travellers, those deemed to lack capacity, whether ill, old or young, single parents, or ‘foreign’ workers, to take just some examples.

I was told this week that the UK has achieved a first: reportedly, we are the first ‘first-world’ country under review by the CRPD for human rights abuses of disabled people. The UN will neither confirm nor deny this, but a recent video indicated as much.

This is appalling in 21st century Britain.

Recently on Radio 4 news I heard of thousands of ESA claimants assigned to work, although their medical records clearly state they are long-term unfit.

The group at risk highlighted on October 15th in the Lords, in the debate on changes to electoral records [col 206], is women in refuges, women who feel at risk. As yet I see nothing to assure me that measures have been taken to look after this group, here or for care.data.{*}

These are just simplified examples of groups others have flagged as at risk. I feel these groups’ basic rights are being ignored, because for minorities they can be. Are they viewed as of less value than the majority of ‘decent, hardworking people’, perhaps as having less economic worth to the state?

Politicians may say that any change will continue to offer assurances:
“We promote the values of individual human dignity, equal treatment and fairness as the foundations of a democratic society.”

But I simply don’t see it done fairly for all.

I see society being quite deliberately segmented into different population groups, weak and strong. Some groups need more looking after than others, and I am wary when I hear of groups portrayed as burdens to society, in contrast to the rest, who are economically ‘productive’.

Indeed we seem to have reached a position in which the default position undermines the rights of the vulnerable, far from offering additional responsibilities to those who should protect them.

This stance features often in media discussion and political debate on health and social care: DWP workfare, JSA and the ‘bedroom tax’, to name but a few.


How undermining Rights undermines access

So, as the NHS England five year forward plan was announced recently, I wonder how the plan for the NHS and the visions for the coming 5 year parliamentary terms will soon align?

There is a lot of talk about plans, but more important is what happens as a result not of what we say, but of what we do, or don’t do. Not only in future, but already, today.

Politically, socially and economically we do not exist in silos. So too our human rights, which overlap in these areas, should be considered together.

Recent years have seen a steady reduction of rights of access for the most vulnerable in society. Access to a lawyer or to judicial review has been made more difficult by charging for it. The Ministry of Justice is currently pushing for changes to judicial review law, but is, it seems, losing its quest in the Lords.

If you are a working-age council or housing association tenant, the council limits your housing benefit claim if it decides you have ‘spare’ bedrooms. Changes have hit the disabled and their families hardest. These segments of the population are being denied or given reduced access to health, social and legal support.

Ethical Values need Championing

Whilst it appears the state increasingly measures everything in economic value, I believe the public must not lose sight of our ethical values, and continue to challenge and champion their importance.

How we manage our ethics today is shaping our children. What do we want their future to be like? It will also be our old age. Will we by then be measured by our success in achievement, by what we ‘do’, by what we financially achieved in life, by our health, or by who we each are? Or, more intrinsically, will our value be judged even on our DNA?

Will it ever be decided by dint of our genes, what level of education we can access?

Old age brings its own challenges of care and health, and we are an aging population. Changes today are sometimes packaged as shaping our healthcare fit for the 21st century.

I’d suggest that current changes in medical research and the drivers behind parts of the NHS 5YP vision will shape society well beyond that.

What restrictions do we place on value and how are moral and material values to play out together? Are they compatible or in competition?

Because there is another human right we should remember in healthcare, that of striving to benefit from scientific improvement.

This is an area in which the rights of the vulnerable and the responsibilities to uphold them must be clearer than clear.

If Rights are undermined in research, it may impact Responsibilities for research

I would like to understand how the boundaries of science and technology are set, and who sets them, on what basis of values, in ethics committees and beyond. How do those values control or support the decision-making processes running in the background of NHS England, which have shaped this coming five-year policy?

It appears there are many decisions, on rare disease and on commissioning for example, which despite their terms of reference see limited or no public minutes, hindering the transparency of their decision-making.

The PSSAG has nothing at all. Yet they advise on strategy and hugely significant parts of the NHS budget.

Already we see fundamental changes of approach which appear to have economic rather than ethical reasons behind them. In stem-cell banking, this is a significant shift for the state away from an absolute belief in the non-commercialisation of human tissue, and yet little public debate has been encouraged.

There is a concerted effort from research bodies, and from those responsible for our phenotype data {*}, to undermine the stronger European data protection regulation due in 2015, with attempts to amend EU legislation in line with [less stringent] UK policy. That policy is questioned by data experts over its reliance on pseudonymisation, for example.
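The experts’ worry about pseudonymisation can be illustrated with a minimal sketch (the NHS number and record below are invented for illustration): if an identifier with a small, known format is replaced by an unsalted hash, anyone can enumerate candidate identifiers and link the ‘pseudonymous’ record back to a person.

```python
import hashlib

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with its SHA-256 digest (no salt or secret key)."""
    return hashlib.sha256(nhs_number.encode()).hexdigest()

# A hypothetical 'pseudonymised' record released to a third party.
# NHS numbers are 10-digit values, so the whole input space is small
# enough to enumerate on ordinary hardware.
record = {"id": pseudonymise("4857773456"), "diagnosis": "example"}

# An attacker who knows the identifier format hashes candidates until
# one matches; a short candidate list stands in for a full enumeration.
candidates = ["1234567890", "4857773456", "9999999999"]
recovered = next(c for c in candidates if pseudonymise(c) == record["id"])
print(recovered)  # prints "4857773456": the identifier is re-identified
```

Pseudonymised data of this deterministic kind is therefore still linkable, which is exactly why treating it as anonymous is contested.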

How striving to benefit from scientific improvement overlaps with the material value of ‘economic function’ is clear when we hear, often, that UK Life Sciences are the jewel in the crown of the UK economy. Less spoken of is how this function overlaps with our moral values.

“We’ve got to change the way we innovate, the way that we collaborate, and the way that we open up the NHS.” [David Cameron, 2011]

Patient questions on care.data – an open letter

Dear NHS England Patients & Information Directorate,

We’ve been very patient patients in the care.data pause. Please can we have some answers now?

I would like to call for greater transparency and openness about the promises made to the public, project processes & policies and your care.data communication plans.

In 2013, in the Health Service Journal Mr. Kelsey wrote:

“When patients are ignored, they are most at risk; that was the central conclusion of the report by Robert Francis into Stafford hospital.

Don Berwick, in his safety review, said the NHS should be “engaging, empowering and hearing patients and their carers all the time”.

“That has been my mission since I started as National Director for Patients and Information: to support health and care services transform transparency and participation.”

HSJ, 10th December 2013

It is time to walk-the-talk for care.data under this banner of transparency, participation and open government.

Response to the Listening exercises

The care.data listening phase, introduced by the pause announced on February 18th, has captured a mass of questions, the majority of which still remain unaddressed.

At one of these sessions [the one-hour Open House on June 17th, linking around 100 people at each of the locations in Basingstoke, Leicester, London and York], participants were promised that our feedback would be shared with us later in the summer, and posted online. After the NHS AGM on September 18th I was told it would happen ‘soon’. It is still not in the public domain.

At every meeting all the unanswered questions, on post-it notes, in table-group minutes or scribbled flipcharts, were gathered ‘to be answered at a later date’. When will that be?

To date, there has been no published information which addresses the unanswered event questions.

Transparency of Process, Policies and Approach

The care.data Programme Board has held meetings to plan the rollout process, policies and approach, but the minutes and materials from them have not been published. I find this astonishing when one considers that the minutes of the care.data advisory group, the NIB (new), CAG, the GPES advisory group, and even the NHS England Board itself are in the public domain. I believe the care.data Programme Board meeting materials should be too.

It was acknowledged through the Partridge Review of past use of our hospital records that this HES data is not anonymous. The extent of its sale to commercial third parties and its use by police and the Home Office was revealed. This is medical data we gave to hospitals in the course of our care. Why are we the last to hear it is being accessed by all sorts of people who are not involved in our clinical care at all?

Even for commissioning purposes, it is unclear how these data-sharing purposes are justified, when the Caldicott Review said extracting identifiable data for risk stratification or commissioning could not be assumed under some sort of ‘consent deal’.

“The Review Panel found that commissioners do not need dispensation from confidentiality, human rights and data protection law…” [The Information Governance review, ch7]

The 251 approval just got extended *again* – until 30th April 2015. If you can’t legally extract data without repeat approvals from on high, then maybe it’s time to question why?

The DoH, the NHS England Patients and Information Directorate, the HSCIC, and indeed many data recipients, all appear to have normalised an approach that for many is still a shock. The state centralised our medical records and passed them on to others without our knowledge or permission. For years. With financial exchange.

Amazingly, it continues to be released in this way today, still without our consent or fair processing or publicised way to opt out.

“To earn the public’s trust in future we must be able to show that our controls are meticulous, fool-proof and solid as a rock.”  said Sir Nick Partridge in his summary review.

Now you ask us to trust in care.data that the GP data, a degree more personal, will be used properly.

Yet you ask us to do this without significant changes in legislation: no tightly defined purposes safeguarding who can access it and why, no control over future changes made without our knowledge, and no legally guaranteed opt out.

There is no information about what social care dataset is to be included in future, so how can we know what care.data scope even is yet?

Transparency cannot be a convenient watchword which applies with caveats. Quid pro quo: if you want our data under an assumed consent process, then guarantee a genuinely informed public.

You can’t tell patients one approach now, then plan to change what is said after the pilot is complete, knowingly planning a wider scope to include musculoskeletal or social care data and more. Or, knowing you plan to broaden the users of data [like research and health intelligence, currently under discussion at IAG], communicate only a smaller version in the pilot. That is like cheating on a diet. You can’t say and do one thing in public, then have your cake and eat it later when no one is looking. It still counts.

In these processes, policies and approach, I don’t feel my trust can be won back with lack of openness and transparency. I don’t yet see a system which is, ‘meticulous, fool-proof or solid as a rock’.

‘Pathfinder’ pilots

Most recently you have announced that four CCG areas will pilot the ‘pathfinder’ stage in the rollout of phase one. But where and when remains a mystery. Pathfinder communications methods may vary from place to place, to trial what works and what fails. One commendable method will be a written letter.

However even given that individual notice intent, we cannot ignore that many remaining questions will be hard to address in a leaflet or letter. They certainly won’t fit into an SMS text.

Why pilot communications which will leave unanswered the same open questions you already know about, but have not answered?

For example, let’s get a few of the missing processes clarified up front:

  • How will you communicate with Gillick competent children, whose records may contain information about which their parents are not aware?
  • How will you manage this for elderly or vulnerable patients in care homes and with diminished awareness or responsibility?
  • What of the vulnerable at risk of domestic abuse and coercion?
  • When things change in scope or use, how will we be given the choice to change our opt out decision?

I ask you not to ignore the processes which remain open. They need to be addressed BEFORE the pilot, unless you want people to opt out on the basis of their uncertainty and confusion.

What you do now, will set the model expectations for future communications. Patient online. Personalised medicine. If NHS health and social care is to become all about the individual, will you address all individuals equally or is reaching some less important than others?

It seems there is time and effort for talking to other professionals about big data, but not to us, whose data it is. Dear Patients & Information Directorate, you need to be talking to us before talking to others about how to use us.

In March, this twelve point plan made some sensible suggestions.

Many of them remain unaddressed. You could start there. But before getting into communications tools, it must also be clear: what is it that the pathfinders are actually piloting?

You can’t pilot communications without clearly defined contents to talk about.

Questions of substance need answers, the ten below to start with.

What determines that patients understand the programme and are genuinely informed, and how will it be measured?

Is it assumed that pilots will proceed to extraction? Or will the fair processing efforts be evaluated first, and the effort versus cost taken into account in deciding whether it is worth proceeding at all?

Given the cost involved, and legal data protection requirements, surely the latter? But the pathfinder action plan conflates the two.

Citizen engagement

Let’s see this as an opportunity to get care.data right, for us, the patients. After all, you and the rest of the NHS England Board were keen to tell us at the NHS AGM on September 18th, how valuable citizen engagement is, and to affirm that the NHS belongs to us all.

How valued is our engagement in reality, if it is ignored? How will involvement continue to be promoted in NHS Citizen and other platforms, if it is seen to be ineffective? How might this negatively affect future programmes and our willingness to get involved in clinical research if we don’t trust this basic programme today?

This is too important to get wrong. It confuses people and causes concern. It puts trust and confidence in jeopardy, not just for now, but for other future projects. care.data risks polluting across data borders, even beyond health:

“The care.data story is a warning for us all. It is far better if the industry can be early on writing standards and protocols to protect privacy now rather than later on down the track,” he said. [David Willetts, on 5G]

So please, don’t keep the feedback and this information to internal departments.

We are told it is vital to the future of our NHS. It’s our personal information.  And both belong to us.

During one Health Select Committee hearing, Mr. Kelsey claimed: “If 90 per cent opt out [of care.data], we won’t have an NHS.”

The BMA ARM voted in June for an opt in model.

The ICO has ruled that an opt-in model by default at practice level, with due procedures for patient notification, will both satisfy legal requirements and protect GPs in their role as custodians of confidentiality and data controllers. Patient Concern has called for GPs to follow that local-choice opt-in model.

I want to understand what he believes the risk to the NHS is, and to examine its evidence base. It’s our NHS, and if it is going to fail without care.data and the Board let it come to this, then we must ask why. Together we can do something to fix it. There was a list of pre-conditions, stated at those meetings, which would need to be met before any launch, and which the public is yet to see met. Answering this question should be part of that.

It can’t afford to fail, but how do we measure at what cost?

I was one of many, including, much more importantly, the GPES Advisory Group, who flagged the shortcomings of the patient leaflet in October 2013, which then failed as a communications process in January. I flagged it with comms teams, my MP and the DoH.

[Sept 2013 GPES Advisory] “The Group also had major concerns about the process for making most patients aware of the contents of the leaflets before data extraction for care.data commenced”.

No one listened. No action was taken. It went ahead as planned. It cost public money, and more importantly, public trust.

In the words of Lord Darzi,

“With more adroit handling, this is a row that might have been avoided.”

Now there is still a chance to listen and to act. This programme can’t afford to pilot another mistake. I’m sure you know this, but it would appear that with the CCG announcement, the intent is to proceed to pilot soon.  Ready or not.

If the programme is so vital to the NHS future, then let’s stop and get it right. If it’s not going to get the participation levels needed, then is it worth the cost? What are the risks and benefits of pressing ahead or at what point do we call a halt? Would it be wise to focus first on improving the quality and correct procedures around the data you already have – before increasing the volume of data you think you need? Where is the added intelligence, in adding just more information?

Is there any due diligence, a cost benefit analysis for care.data?

Suggestions

Scrap the ‘soon’ timetable. But tell us how long you need.

The complete raw feedback from all these care.data events should be made public, to ensure all the questions and concerns are debated and answers found BEFORE any pilot.

The care.data Programme Board minutes, papers, and all the planning and due diligence should be published and open to scrutiny, as for any other project spending public funds.

A public plan of how the pathfinders fit into the big picture and timeline of future changes and content would remove the lingering uncertainty of the public and GPs: what is going on and when will I be affected?

The NHS 5 year forward view was quite clear; our purse strings have been pulled tight. The NHS belongs to all of us. And so we should say: care.data can’t proceed at any and all costs. It needs to be ‘meticulous, fool-proof and solid as a rock’.

We’ve been patient patients. We should now expect the respect, and response, that patience deserves.

Thank you for your consideration.

Yours sincerely.

 

Addendum: Sample of ten significant questions still outstanding

1. Scope: What is care.data? Scope content is shifting, and requests for purposes are already changing, from commissioning only to now include research and health intelligence. How will we patients know that what we sign up to today remains the purpose to which data may be put tomorrow?

2. Scope changes and fair processing: We cannot sign up to one thing today and find it has become something else entirely tomorrow, without our knowledge. How will we be notified of any changes to what is extracted, or to how what has been extracted is used in future? Where is the change notification plan?

3a. Purposes clarity: Who will use which parts of our medical data for what? Clinical care vs secondary uses:

Given the widespread confusion – demonstrated on radio and in press after the pathfinders’ announcement – between care.data, which is for ‘secondary use’ only, i.e. purposes other than the direct care of the patient, and the Summary Care Record (SCR), which is for direct care in medical settings, how will the uses be made very clear to patients, and how will they affect our existing consent settings?

3b. Purposes definition: Who will use which parts of our medical data for what? Commercial use: It is claimed the Care Act will rule out “solely commercial” purposes, but how, when what remains is a broad definition open to interpretation? Will “the promotion of health” still permit uses such as marketing? Will HSCIC give its own interpretation? It is, after all, the law within which it operates that prescribes what it should promote and permit.

3c. Purposes exclusion: Who will use which parts of our medical data for what? Commercial re-use by third parties: When will the new contracts and agreements be in place? Drafts on the HSCIC website still appear to permit commercial re-use, and make no mention of changes or of revoking licences for intermediaries.

4a. Opt out: It is said that patients who opt out will have this choice respected by the Health and Social Care Information Centre (i.e. no data will be extracted from their GP record) according to the Secretary of State for Health  [col 147] – but when will the opt out – currently no more than a spoken promise – be put on a statutory basis? There seem to be no plans whatsoever for this.

Further, wider consents: knowing what one has opted into or out of is currently almost impossible. With the Summary Care Record, Proactive Care in some local areas, different clinical GP systems, the Electronic Prescription Service, and soon Patient Online, all using different opt-in methods of asking for and maintaining data and consent, patients are unsurprisingly confused.

4b. Opt out: At what point do you determine that levels of participation are worth the investment and of value? If parts of the population are not represented, how will that be taken into account, and will it remain valuable to have only some data? What will be statistically significant?

5. Legislation around security: The Care Act 2014 is supposed to bring in new legislation for our data protection, but there are no changes to date as far as I can see. What happened to the ‘one strike and out’ penalty much discussed in Parliament? Is any change still planned? If so, how has it been finalised, with what wording, and will it be open to Parliamentary scrutiny? The Government’s claim to have added legal protection is meaningless until the new Care Act Regulations are put before Parliament and agreed.

6. What of the Governance changes discussed?

There was some additional governance and oversight promised, but to date there has been no public communication of changes to the data management groups through the HRA CAG or DAAG, and no sight of the patient involvement promised.

The Data Guardian role remains without the legal weight that the importance of the position should command. It has been said this will be granted ‘at the earliest opportunity’. Many such opportunities seem to have come and gone.

7. Data security: The planned secure data facility (‘safe setting’) at HSCIC, to hold linked GP and hospital data, is not yet built for the expanded volume of data and users expected, according to Ciaran Devane at the 6th September event. When will it be ready for the scale of care.data?

Systems and processes on this scale need security designed in, scaled to match the data and its use.

Will you proceed with a pilot which uses a different facility and procedures from the future plan? Or worse still, with extracting data into a setting you know is less secure than it should be?

8. Future content sharing: Where will NHS patients’ individual-level data go in the longer term? The current documentation says ‘in wave 1’ or phase one, which leaves a future change open; is identifiable ‘red’ data to be shared in future? “care.data will provide the longer term visions as well as […] the replacement for SUS.”

9.  Current communications:

    • How will GPs and patients in ‘pathfinder’ practices be contacted?
    • Will every patient be written to directly with a consent form?
    • What will patients who opted out earlier this year be told if things have changed since then?
    • How will NHS England contact those who have retired or moved abroad recently or temporarily, still with active GP records?
    • How will foreign pupils’ parents be informed abroad and rights respected?
    • How does opt out work for sealed envelopes?
    • All the minorities with language needs or accessibility needs – how will you cater for foreign language, dialect or disability?
    • The homeless, the nomadic, children-in-care
    • How can we separate these uses clearly from clinical care in the public’s mind to achieve a genuinely informed opinion?
    • How will genuine mistakes in records be deleted – wrong data on wrong record, especially if we only get Patient Online access second and then spot mistakes?
    • How long will data be retained for so that it is relevant and not excessive – Data Protection principle 3?
    • How will the communications cater for both GP records and HES plus other data collection and sharing?
    • If the plan is to have opt out effective for all secondary uses, communications must cater for new babies to give parents an informed choice from Day One. How and when will this begin?

No wonder you wanted first no opt out, then an assumed consent via an opt-out junk-mail leaflet. This is hard stuff to do well. Harder still: how will you measure the effectiveness of what you may have missed?

10. Pathfinder fixes: Since NHS England doesn’t know what will be effective communications tools, what principles will be followed to correct any failures in communications for any particular trial run and how will that be measured?

How will patients be asked if they heard about it, and how will any survey or follow-up ensure the segmentation does not miss the hard-to-reach groups – precisely those who may have been missed? If you only inform 10% of the population, then ask that same 10% if they heard of care.data, you would expect close to 100% yes. That does not show the whole population was well informed about the programme.
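That sampling point can be shown with a toy simulation (the numbers are illustrative only, not NHS figures): surveying only the informed group makes awareness look universal, while a random sample of the whole population reveals the true rate.

```python
import random

random.seed(0)

# Toy population of 100,000: only 10% were actually informed.
population = [True] * 10_000 + [False] * 90_000

# Biased follow-up: survey 1,000 people drawn only from the informed group.
informed_only = [p for p in population if p]
biased_rate = sum(random.sample(informed_only, 1000)) / 1000

# Unbiased follow-up: survey 1,000 people drawn at random from everyone.
true_rate = sum(random.sample(population, 1000)) / 1000

print(biased_rate)  # 1.0: everyone surveyed had heard of the programme
print(true_rate)    # roughly 0.10: the real level of awareness
```

The biased survey always reports 100% awareness however small the informed group is, which is exactly why a follow-up must sample the whole population.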

If it is shown to have been ineffective, at what point do you say fair processing failed, and that you cannot legally proceed to extraction?

> This list doesn’t yet touch on the hundreds of questions generated from public events, on post-its and minutes. But it would be a start.

*******

References for remaining questions:

17th June Open House: Q&A

17th June Open House: Unanswered public Questions

Twelve point plan [March 2014] positive suggestions by Jeremy Taylor, National Voices

6th September care.data meeting in London

image quote: Winnie The Pooh, A.A. Milne