Tag Archives: privacy

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

The speech mentioned two initiatives to log children’s individual actions: one is included in a consultation on new statutory guidance; the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is implemented in practice matters for understanding how intrusive any new use of data will be.

Unfortunately for this consultation there is no supporting code of practice explaining what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools.” “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: The “Opportunities to teach safeguarding” section (paras 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24 is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?

 

Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our children aged 2-19 attend a wide range of schools, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK, yet we hear little about its applied use.

The cases of cyber bullying or sexting in schools I hear of locally, or read about in the press, happen through smartphones. Unless the school snoops on individual devices, I wonder therefore whether the cost, implementation and impact are proportionate to the benefit?

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: What problem is it trying to solve and how will they measure if it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know who the providers are, and that they have undergone thorough vetting? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs: has anyone asked our children’s feelings on this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we creating a solution that solves, or creates, a problem?

The private providers would have no incentive to say their reports don’t work, and schools, legally required to be risk averse, would be unlikely to say stop if there is no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention” and “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multipurpose and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school, what are we doing to fix the causes, rather than simply spying on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust of the system, of teaching staff, and on their ability to look for content to support their own sensitive concerns or development, which they may not feel safe to look for at home? What limitation will that put on their creativity?

These are all questions that should be asked to thoroughly understand the consultation, and that require wide public examination.

Since key logging is already common practice (according to provider websites) and will effectively in practice become statutory, where is the public discussion? If it’s not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?

What exactly might this part of the new guidance mean for pupils?

In part two, I look at the other part of her speech, the GPS app, and ask whether, if we track every child in and outside school, we are moving closer to the digital dream, or nightmare, in the search to close the digital skills gap.

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing February 16

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that this ‘system’ to be ‘in place’ should be computer based. How will it be applied in practice? A number of the software providers for schools already offer services that include key logging, using “keyword detection libraries” that provide “a complete log of all online activity”.

(*It’s hard to read more, as many of NEN’s links are dead.)

Children’s private chats and personal data lost through VTech toys

If you’ve got young children who have an Innotab console or other ed tech apps and games from VTech, then you need to pay attention.

Your and your children’s personal data may have been stolen. The VTech security breach has exposed private data of more than six million children worldwide, including 700,000 British customers.

The games are designed for children age 2-9. The loss reportedly includes thousands of pictures of children and parents, as well as a year’s worth of chat logs, names and addresses.

Where from? Well, the information that parents and children entered at set-up, or created using the games, like the Innotab for example. Using an app, the Innotab allows children to record voice and text messages, take photos, and send these to the matching app on the parents’ phone. The data from both users has been lost. The link is the registered email account that connects the child’s toy and the parent’s phone, via the downloaded app.

And the reason kids’ photos may be included is that during set-up a profile photo can be taken by the child, then stored and used in a similar way to social media sites.

VTech’s Learning Lodge app store customer database was affected and VTech Kid Connect servers were accessed. As a precautionary step, VTech says on their website, they have suspended Learning Lodge, the Kid Connect network and a dozen websites temporarily whilst they ‘conduct a thorough security assessment’.

Reactions

One mum spoke to Good Morning Britain about how she felt when she discovered information about her child may have been stolen.

She says she hadn’t received any notification about the loss from VTech and didn’t know about it until she saw it on the six o’clock news. She then pro-actively contacted VTech customer services.

VTech’s response was confused: first telling her they had not lost data related to the KidConnect app, but in a later email saying they did.

What’s disappointing in GMB’s coverage is that they focused on how damaging this would be for the toymaker VTech in the run-up to Christmas.

There was little information for families on what this could mean for using the toys safely in future. They went on to talk about some other web-based tools, but didn’t talk about data protection, which really should be much stronger for children’s data.

While parents must take an active role in thinking ahead for our children and how their digital identities can be compromised, we also need to be able to rely on organisations with whom we entrust our own and our children’s personal data, and know that when they ask us for data that they will look after it securely, and use it in ways we expect. On the technical side, data security systems need to be proportionate to the risks they place children in, if data are breached. This is true of commercial companies and public bodies.

On the human side, public transparency and good communication are key to managing expectations, to ensure we know what users sign up to, and to know what happens if things go wrong.

Sadly VTech is continuing to downplay the loss of children’s personal data. In their first statement their focus was to tell people not to worry because credit card details were not included.

When I asked five days after the breach was announced, VTech declined to confirm to me whether avatars and profile pictures had been accessed, arguing that its internal investigation is still ongoing. That’s now two weeks ago.

Their FAQs still say this is unknown. If true, it would appear surprisingly incompetent on VTech’s part to know in detail that all the other items have been lost, but not whether profile pictures were.

That it is possible for personal details that include date of birth, address and photo to all be lost together is clearly a significant threat for identity theft. It shows one risk of having so much identifiable personal data stored in one place.

The good news is that it appears not to have any malicious motive. According to the report in Motherboard: “The hacker who broke into VTech’s systems […] that he never intended to release the data to the public.

“Frankly, it makes me sick that I was able to get all this stuff,” the hacker told [Motherboard] in an encrypted chat on Monday.

Could this be the biggest consumer breach of children’s personal data in history?

What now for the 700,000 users of the systems?

Parent accounts need to be directly and fully informed by VTech:

a) what was compromised, by platform, by website, or by customer

b) if and how they will be able to use their equipment again

c) how children’s data will be made safe in future, and what VTech is doing that will be different from how it handled data before

Any organisation needs to demonstrate, through full transparency and how it acts in the event of such a breach, that it is worthy of trust.

The children’s toys and systems appear to have been shut down.

They’re not cheap, with the Innotab coming in at around £90 and its cartridge games upwards of £17 each. Toy sellers will be in the front line for public-facing questions in the shops. Anyone who had already bought these just before Christmas will be wondering what to do now: if access to the systems and the apps has been shut down, they won’t work.

And if and when they do work, will they work securely?

Did Vtech contact you and tell you about the breach?

The sensible thing is to stop using that email address and, at the very minimum, change the password: not only on the adult’s phone and the child’s game app, but anywhere else you use it.

What else do you need to know?

What do parents do when their child’s digital identity has been compromised?

More information is needed from the company, and soon.

####

If you want to get in touch, come over and visit defenddigitalme.com You can also sign up to the Twitter group, support the campaign to get 8 million school pupils’ data made safe, or leave a comment.

####

References:

VTech website FAQs as of December 3, 2015

November 28, 2015: blog troyhunt.com by Microsoft MVP for Developer Security

December 1, 2015: Motherboard report by @josephfcox

December 1, 2015: Motherboard article by Lorenzo Franceschi-Bicchierai

 

 

 

Access to school pupil personal data by third parties is changing

The Department for Education in England and Wales [DfE] has lost control of who can access our children’s identifiable school records, by giving individual and sensitive personal data out to a range of third parties since government changed policy in 2012. It now looks like they’re panicking about how to fix it.

Applicants wanting children’s identifiable and/or sensitive personal data from the National Pupil Database now need first to apply for the lowest-level criminal record check, a basic DBS check, as part of the access process.

Schools Week wrote about it and asked for comment on the change [1] (as discussed by Owen in his blog [2] and our tweets).

At first glance, it sounds like a great idea, but what real difference will this make to who can receive 8 million school pupils’ data?

Yes, you did read that right.

The National Pupil Database gives away the personal data of eight million children, aged 2-19. Gives it away outside its own protection, because users get sent raw data to their own desks. [3]

It would be good to know that people receiving your child’s data hadn’t ever been cautioned for or convicted of something related to children in their past, right?

Unfortunately, this DBS check won’t tell the Department for Education (DfE) that – because it’s the basic £25 DBS check [4], not the full version.

So this change seems less about keeping children’s personal data safe than about being seen to do something. Anything. Anything but the thing that needs to be done. Which is to keep the data secure.

Why is this not a brilliant solution? 

Moving towards the principle of keeping the data more secure is right, but in practice the DBS check is only useful if it makes data safer, by stopping people who pose a risk of data misuse from receiving data. So how will this DBS check achieve that? It’s not designed for people who handle data. It’s designed for people working with children.

There is plenty of evidence, often in the news, of data inappropriately used for commercial purposes, often through inappropriate storage and sharing of data as well as malicious breaches. I am not aware (and refer to this paper [5]) of risks realised through malicious misuse of data for academic purposes in safe settings. Though mistakes do happen, through inappropriate processes, human error and misjudgement.

However it is not necessary to have a background check for its own sake. It is necessary to know that any users handle children’s data securely and appropriately, and with transparent oversight. There is no suggestion at all that people at TalkTalk are abusing data, but their customers’ data were not secure and those data held in trust are now being misused.

That risk is the harm that is likely to affect a high number of individuals if bulk personal data are not securely managed. Measures to make it so must be proportionate to that risk. [6]

Coming back to what this will mean for individual applicants and its purpose: Basic Disclosure contains only convictions considered unspent under The Rehabilitation of Offenders Act 1974. [7]

The absence of a criminal record does not mean data are securely stored or appropriately used by the recipient.

The absence of a criminal record does not mean data will not be forwarded to another undisclosed recipient and there be a way for the DfE to ever know it happened.

The absence of a criminal record showing up on the basic DBS check does not even prove that the person has no previous conviction related to misuse of people or of data. And any conviction you might consider ‘relevant’ to children, for example, may have expired.


So for these reasons, I disagree that the decision to have a basic DBS check is worthwhile. Why? Because it’s effectively meaningless and doesn’t solve the problem, which is this:

Anyone can apply for 8m children’s personal data, and as long as they meet some purposes and application criteria, they get sent sensitive and identifiable children’s data to their own setting. And they do. [8]

Anyone the legislation designed in 2009 has defined as a prescribed person or researcher has come to mean journalists, for example. Like BBC Newsnight, or Fleet Street papers. Is it right that journalists can access my children’s data, but as pupils and parents we cannot, and we’re not even informed? Clearly not.

It would be foolish to be reassured by this DBS check. The DfE is kidding themselves if they think this is a workable or useful solution.

This step is simply a tick box and it won’t stop the DfE regularly giving away the records of eight million children’s individual level and sensitive data.

What problem is this trying to solve and how will it achieve it?

Before panicking to implement a change DfE should first answer:

  • who will administer and store potentially sensitive records of criminal convictions, even if unrelated to data?
  • what implications does this have for other government departments handling individual personal data?
  • why are 8m children’s personal and sensitive data given away ‘into the wild’ beyond DfE oversight in the first place?

Until the DfE properly controls the individual personal data flowing out from the NPD, from multiple locations, in raw form, and its governance, it makes little material difference whether the named user is shown to have, or not have, a previous criminal record. [9] Because the DfE has no idea if they are the only person who uses it.

The last line from the DfE in the article is interesting: “it is entirely right that we continue to make sure that those who have access to it have undergone the necessary background checks.”

Continue from not doing it before? Tantamount to a denial of change, to avoid scrutiny of the past and status quo? They have no idea who has “access” to our children’s data today after they have released it, except on paper and trust, as there’s no audit process.[10]

If this is an indicator of the transparency and type of wording the DfE wants to use to communicate to schools, parents and pupils I am concerned. Instead we need to see full transparency, assessment of privacy impact and a public consultation of coordinated changes.

Further, if I were an applicant, I’d be concerned that DfE is currently handling sensitive pupil data poorly, and wants to collect more of mine.

In summary: because of change in Government policy in 2012 and the way in which it is carried out in practice, the Department for Education in England and Wales [DfE] has lost control of who can access our 8m children’s identifiable school records. Our children deserve proper control of their personal data and proper communication about who can access that and why.

Discovering through FOI [11] the sensitivity level and volume of identifiable data access journalists are being given, shocked me. Discovering that schools and parents have no idea about it, did not.

This is what must change.

 

*********

If you have questions or concerns about the National Pupil Database or your own experience, or your child’s data used in schools, please feel free to get in touch, and let’s see if we can make this better to use our data well, with informed public support and public engagement.

********

References:
[1] National Pupil Database: How to apply: https://www.gov.uk/guidance/national-pupil-database-apply-for-a-data-extract

[2]Blogpost: http://mapgubbins.tumblr.com/post/132538209345/no-more-fast-track-access-to-the-national-pupil

[3] Which third parties have received data since 2012 (Tier 1 and 2 identifiable, individual and/or sensitive): release register https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] The Basic statement content http://www.disclosurescotland.co.uk/disclosureinformation/index.htm

[5] Effective Researcher management: 2009 T. Desai (London School of Economics) and F. Ritchie (Office for National Statistics), United Kingdom http://www.unece.org/fileadmin/DAM/stats/documents/ece/ces/ge.46/2009/wp.15.e.pdf

[6] TalkTalk is not the only recent significant data breach of public trust. An online pharmacy that sold details of more than 20,000 customers to marketing companies has been fined £130,000 https://ico.org.uk/action-weve-taken/enforcement/pharmacy2u-ltd/

[7] Guidance on the Rehabilitation of Offenders Act 1974 https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/299916/rehabilitation-of-offenders-guidance.pdf

[8] the August 2014 NPD application from BBC Newsnight https://www.whatdotheyknow.com/request/293030/response/723407/attach/10/BBC%20Newsnight.pdf

[9] CPS Guidelines for offences involving children https://www.sentencingcouncil.org.uk/wp-content/uploads/Final_Sexual_Offences_Definitive_Guideline_content_web1.pdf

[10] FOI request https://www.whatdotheyknow.com/request/pupil_data_application_approvals#outgoing-482241

[11] #saveFOI – I found out exactly how many requests had been fast tracked and not scrutinised by the data panel via a Freedom of Information Request, as well as which fields journalists were getting access to. The importance of public access to this kind of information is a reason to stand up for FOI  http://www.pressgazette.co.uk/press-gazette-launches-petition-stop-charges-foi-requests-which-would-be-tax-journalism

 

Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or who has never, used their rights under the Freedom of Information Act, the government consultation on changing those laws, which closes today, is worth caring about. If you haven’t yet had your say, go and take action now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and should to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained from standing up for human rights by cost.  The press will be restrained in what they can ask.

MySociety has a brilliant summary.  Michael Sheen spoke up calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley Point

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to start to improve their handling of the personal and sensitive data of our 8 million children that they hold in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts about how many releases of identifiable personal data of school pupils had been fast-tracked at the Department for Education without panel oversight. And to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld it.

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions were necessary, using section 36 to keep them hidden, when the minutes were finally released.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others could use this information, I hoped, to ask the right questions about missing meeting minutes and transparency, and everyone could question why there was no cost-benefit business plan at all in private, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again and work is in progress on improved public engagement, to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency to the public, which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people supportive of these changes say they are concerned about the costs of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law: it gives the public rights in our favour, and transparency into how we are governed and how tax money is spent.

How will we measure the value of FOI, and what would be lost if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company, after the event.

Everyone’s been talking about TalkTalk and for all the wrong reasons. Data loss and a 15-year-old combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1] rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the “National Cyber Security Programme” (NCSP). [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask if government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and by reportedly not making preventative changes to flaws apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences is like saying it’s talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government’s expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent, where necessary, for purposes beyond those we expect or that were explained when we submitted our data; and there needs to be a change in risky behaviour in physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s  practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own lack of data understanding, and what good practice is in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines.”

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, for example as care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this, and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers‘ in the Shakespeare Review, have been ignored by government.

Where our personal data are not used well in government departments by the department themselves, they seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because they pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data practices, the care.data debacle is evidence that not all its MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, with others getting access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, individual, sensitive data from at least 8m children’s records, from age 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and public must demand emphasis on prevention measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance, in any of its many forms, is no less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like conservative parents who are not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools real, meaningful PHSE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, hoping our data are used safely in the unsecured settings of the unknown partners with whom data might be onwardly shared, hoping we won’t find out and it won’t need to talk about it? Or will it have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change, yet it wants to collect ever more personal and individual level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data ‘below the radar’.

Change: we need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech to share ideas and knowledge in real time with people around the world.  The freedom to fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated the Internet may not work well for its citizens and its potential is restricted, as well as its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design and who will control  it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable and what people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be of trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or a fun experiment, and how will users know the difference [especially for mental health apps, where the outcomes may be less tangibly measured than, say, blood glucose]?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create and enable barriers?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We  assume that digital designs will put at their heart the core principles in the spirit of the NHS.  But if apps are not available on prescription and are essentially a commercial product with no proven benefit, does that exploit the NHS brand trust?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person just as any other treatment could; regulation must further consider reputational risk to the NHS and to the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading benefits will not be undermined by creating artificial boundaries that restrict access to the tools by affordability, in a postcode lottery,  or in language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Even for the poorest in the UK today, exclusion in maternity care is already measurable between those who can and cannot afford the smartphone data allowance needed for e-Red Book access, attendees were told by its founder at #kfdigital15.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that, for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area – a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use for all, like any other treatment? Or is it yet another example of NHS postcode lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that the decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and can afford apps, and are not leaving behind the needs of those who can’t afford to pay for them.

At home, it is some of the least wealthy who need the most intervention, and from whom there may be little profit to be made. There is little in the 2020 plans that I can see focusing on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However, it is a question of principle that will decide for or against exclusion of users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England approved apps going to account for geography and language and cross country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.


This photo, reportedly from Indonesia [via Banksy on Twitter; apologies, I cannot credit the photographer], is great: two boys on their way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of  data sharing restrictions: Barrier or Enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to data sharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that see behavioural barriers to data sharing as an obstacle have forgotten that trust is not something to be overcome, but to be won, continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is [prescribed], used, and data exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in an in-store app does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When the recent social media experiment by Facebook only mentioned the use of data for research after the experiment, it caused outcry.

It crossed the line between what people felt acceptable and intrusive, analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen, are both a risk and cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” asks T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters not only for today’s projects, but for the whole future digital is helping to create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome.

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding app consumer regulation, to enable fairness of access, and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers,
ii. pro-actively designing ethics and change into ongoing projects, and,
iii. ensuring engagement is genuine collaboration and co-production.

The barriers do not need to be got around; solutions must be built by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about the NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public on how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told on the one hand by government that it is necessary to share all our individual level data in health, from all sorts of sources.

And that bulk data collection is vital in the public interest to find surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government  and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups and personalise how they get treated. Recent objection was to marketing misuse by charities.

There is little broad understanding yet of the insight organisations can now gain to track and profile, due to the power of algorithms and processing capability.

Technological progress has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous grassroots projects discussed over the two days: bringing the elderly online and connected, and work in housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communications with patients.

The NIB meeting didn’t have real public interaction, or any discussion of those projects ‘in the room’, in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience, it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to be able to understand the intended change and outcome much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

While the leadership is struggling to get the current design together, it may seem hard to invite public feedback on the future.

But it’s only made hard if what the public wants is ignored. If the issues raised at listening events were resolved in the way the public asked for, it could be quite simple.

The NIB leadership clearly felt nervous of debate, giving only 10 minutes of three hours to public involvement, yet debate is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need public debate, and dissection by the clinical professions, to see if they fit the current and future model of healthcare; because if people are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating these things has been forgotten. If the Internet serves things, it serves consumerism. AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer ways of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and the benefits hard to communicate, I’d say: if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how these projects that want their personal data work, or b) put up with being ignorant.

We should be able to fully question why it is needed and get a transparent and complete explanation. We should have fully accountable business plans, and public scrutiny of tangible and intangible benefits, before projects launch on the basis of public buy-in which may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many people could put it succinctly: what is the NHS digital forward view really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other funding stops for what works already today.

Although the volume of activity planned is vast, what it boils down to is what is visionary and achievable, not just a vision.

Digital revolution by design: building for change and people

We have the opportunity to build well now: avoiding barriers-by-design, pro-actively designing ethics and change into projects, and ensuring the work is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own development bubble as plans are shaped, within ever-changing infrastructures in which data, digital, AI and ethics will need to be discussed together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The 1789 charter of the French Revolution set out a clear, understandable, ethical and fair framework of law in which citizens trusted their rights would be respected by their fellow men.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over as the NHS England board delegated the final approval of directions to their Chair, Sir Malcolm Grant and Chief Executive, Simon Stevens, on Thursday May 28.

Directions from NHS England which will enable the HSCIC to proceed with their pathfinder pilots’ next stage of delivery.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16 month pause, as long as it takes for the gestation of a walrus, it appears the directions had flaws. Technical fixes are also needed before the plan could proceed to extract GP care.data and merge it with our hospital data at HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now no longer be eligible. Commercial data intermediaries? They can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, communications must include an opt out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not been. There is still a lack of transparency: NHS England’s communications materials, and the May to October 2014 and any 2015 programme board minutes, have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun, they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014, before the pause began. From the public listening exercise, the high level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same, and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When asked by a board member, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, the answer wasn’t very clear. [full detail at the end of this post]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas.] At least that was what the verbal background given at the board meeting explained.

If the pilots are to test the water for how national rollout will proceed, then they need to test not just for today, but for the known future of changing content scope and expanding users. Who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will that not make these complications worse under pressure? There may be external pressure ahead too: potential changes to EU data protection are expected this year, for which the pilot must be prepared, designing in advance for the expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this if we believed it was absolutely critical in the interests of patients.” [Malcolm Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to just make the best of a bad job, we’ve not even asked: are we doing the right thing at all? Is the system designed to best support doctor and patient needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If the focus is on the success of the programme and not the patient, consider this: there’s a real risk that too many people opt out, due to these unknowns and the lack of real choice in how their data get used. It could be done better, to reduce that risk.

What percentage of opt outs does the programme deem a success, to make it worthwhile?

In March 2014, at a London event, a GP told me that all her patients who were opting out were the newspaper-reading, informed, white middle class. She was worried that the data that would be included would be misleading and unrepresentative of her practice in CCG decision making.

medConfidential has written a current status for pathfinder areas, which makes great sense: focus first on fixing care.data’s big post-election question, the opt out that hasn’t been put into effect. Of course in February 2014 we had to choose between two opt outs, so how will that work for pathfinders?

In the public interest, we collectively need to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, is as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process , one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, at the advisory group last week, is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed, that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility, that they have under the law  in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

Perhaps it may help if I forward this to the board, It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] request for the release of June 2014 Open House feedback still to be published in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time which the pause in the care.data programme has taken to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: https://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed


15 Jan 2024: Image in the header replaced at the request of likely image-tracing scammers who don’t own the rights; since the image and this blog are non-commercial, its use would fall under fair use anyway, but it’s not worth the hassle. All other artwork on this site is mine.

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic ones. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to making money from our data, the public is less happy. The economic value of data raises more questions about use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from the individuals and corporations who make a profit by using the data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how do the public’s feelings, and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing, measured by public feeling in 2014, is a threat to data used in the public interest. So what are we doing to fix it, and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events[1] were repeatedly about commercial companies’ use, and re-use of data, third parties accessing data for unknown purposes and the resultant loss of confidentiality.

 Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership, and with that, the inability to consent to who uses the data. This despite acknowledgment that patients own their data.

Significant concern centres on uses of the information gleaned from data that patients consider commercial exploitation: segmenting the insurance markets, consumer market research, targeting individuals, and the utter lack of governance over it all.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings throughout 2014 between supermarket giant Tesco and NHS England’s Patients and Information Director [2], the person responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison to pursue while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was afforded to sensitive commercial meetings than to citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago, that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing, and in wider fora, is no longer an inconvenient ethical dilemma best left on the shelf, as it has been for the last 25 years of secondary use, only dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required, and is good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care, during medical treatment itself, is not what causes concerns.

An idea increasingly assumed in discussions I have heard [at CCG and other public meetings] is that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and consent to those purposes cannot ethically be assumed. However, legal gateways, which give access to that data for clearly defined secondary purposes set out in law, may make that data sharing legal.

With that legal assumption comes, for the majority of people polls and dialogue show [though not for everyone, 6b], a degree of automatic support for bona fide research in the public interest. But it’s not a blanket for all secondary uses by any means, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and a panel acts as the proxy for personal decision making, the panel must consider the public voice and the public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active, rather than implied, consent becomes a requirement for research purposes without differentiation between kinds, the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now surrounding data sharing [6]. It has been a high price to pay for public health research, and for other research delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in which kinds of research; if not happy with one kind, they may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, we risk the public choosing to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes into account profit for private companies and the state must not come at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows private companies’ profit to harm the perception of good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger Janaury 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [3] The value of public voice.

Demonstrable value of public research to the public good, while abstract, is a concept quite clearly understood.

Demonstrating the economic value of data for private consumer companies like major supermarkets is even easier to understand.

What is less obvious is the harm that the commercial misuse of data can do to the public’s perception of all research for the public good.[6]

The personal cost of consumer data exploitation, whether through loss of privacy or through paying for it, must be limited to reduce the perceived personal cost of the public good.

By reducing the personal cost, we increase the value of the perceived public benefit of sharing and overall public good.

The public good may mean many things: benefits from public health research like understanding how disease travels, or good financial planning, derived from knowing what needs communities have and what services to provide.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

It will create more opportunity for data to be used in the public interest, for both economic and social gain.

As I outlined in the previous posts in this series, consent [part 1] and privacy [part 2] would be wise investments in that growth.

So how are consumer businesses and the state taking this into account?

Where is the dialogue we need to keep expectations and practices aligned in a changing environment and legal framework?

Personalisation: the economic value of data for companies

Any projects under discussion or in progress without adequate public consultation and real involvement, that ignore the public voice, risk their own success and with it the public good they should create.

The same is true for commercial projects.  For example, back to Tesco.

Whether the clubcard data management and processing [8] is directly or indirectly connected to Tesco, its customer data are important to the supermarket chain and are valuable.

A former Tesco executive, Terry Leahy, spoke about that value in a 2013 interview:

“These are slow-growing industries,” Leahy said. “The difference was in the use of data, in the way Tesco learned about its customers. And from that, everything flowed.”[9]

Knowing who shops, and how and when, allows the company to target its sales offering to make people buy more, or differently: the so-called ‘nudge’, moving citizens in the direction the company wants.

He explained how, through the Clubcard loyalty program, the supermarket was able to transition from mass marketing to personalized marketing and that it works in other areas too:

“You can already see in some areas where customers are content to be priced as customers: risk pricing with insurance and so on.

“It makes a lot of sense in health pricing, but there will be certain social policy restriction in terms of fair access and so on.”

NHS patient data and commercial supermarket data may be coming closer in their use than we might think.

They are not only closer in their shared move towards personalisation [10], but closer for similar reasons: the desire to use all the data to know all about people as health consumers and, from that, to plan and purchase best and cheapest… “in reducing overall cost.”

In an economy driven by ideological austerity, it is worth asking how “reducing overall cost” will be achieved: by cutting services, or by reducing to whom services are offered.

What ‘nudge’ may be applied through NHS policies, to move citizens in the direction the drivers in government or civil service want to see?

What, for example, will push those who can afford it into private care, and out of the group on whom the state has to spend money, if they are prepared to spend their own?

What is the data that citizens provide through schemes like care.data designed to achieve?

“Demonstrating The Actual Economic Value of Data”

Tim Kelsey, speaking at Strata in 2013 [11] talked about: “Demonstrating The Actual Economic Value of Data”. Our NHS data are valuable in both economic and social terms.

[From 12:17] “It will help put the UK on the map in terms of genomic research. The PM has already committed to the UK developing 100K gene sequences very rapidly. But those sequences on their own will have very limited value without the reference data that lies out there in the real world of the NHS, the data we’ll start making available from next June […]. The name of the programme by the way is care dot data.”

The long-delayed care.data programme plans to provide medical records for secondary use, as reference data for the 100K genomics programme. The programme intends to “create a lasting legacy for patients, the NHS and the UK economy.”

With consent.

When the CEO of Illumina talks about winning a US $20bn market [12], perhaps it also sounds economically appealing for UK plc and the austerity-lean NHS. Illumina is, of course, the company which won the sequencing contract for the Genomics England project.

“The notion here is that it’s really a precursor to understand the health economics of why sequencing helps improve healthcare, both in quality of outcome, and in reducing overall cost. Presuming we meet the objectives of this three-year study–and it’s truly a pilot–then the program will expand substantially and sequence many more people in the U.K.” [Jay Flatley, CEO]

The idea of it being a precursor leaves me asking: a precursor to what?
“Will expand substantially” — to whom?

As more and more becomes possible in science, there will be an ever greater need to balance how and why we advance medicine with how we protect human dignity. That something becomes possible does not always mean it should be done.

Article 21 of the Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the Application of Biology and Medicine also says: “The human body and its parts shall not, as such, give rise to financial gain.”

How close is profit making from DNA sequencing getting to that line?

These are questions of ethics and of social and economic value. The social legitimacy of these programmes will depend on trust: trust based on no surprises.

Commercial market research or real research for the public good?

Meanwhile, all consenting patients can now, in theory, choose to access their own GP record online. Mr Kelsey expressed hopes in 2013 that developers would use that to help patients:

“to mash it up with other data sources to get their local retailers to tell them about their purchasing habits [16:05] so they can mash it up with their health data.”

This, despite the 67% of the public concerned about the use of health data by commercial companies.[6]

So what were the commercially sensitive projects discussed by NHS England and Tesco throughout 2014? It would be interesting to know whether loyalty cards and mashing up our data was part of it – or did they discuss market segmentation, personalisation and health pricing? Will we hear the ‘Transparency Tsar‘ tell NHS citizens their engagement is valued, but in reality find the public is not involved?

To do so would risk another care.data-style fiasco in other fields.

Who might any plans offer most value to – the customer, the company or the country plc? Will the Goliaths focus on short term profit or fair processing and future benefits?

In the long run, ignoring public voice won’t help the UK plc or the public interest.

A balanced and sustainable research future will not centre on a consumer pay-for-privacy basis, or commercial alliances, but on a robust ethical framework for the public good.

A public good which takes profit into account for private companies and the state, but not at the expense of public feeling and ethical good practice.

A public good which we can understand in terms of social, direct and indirect economic value.

While we strive for the economic and public good in scientific and medical advances we must also champion human dignity and values.

This dialogue needs to be continued.

“The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.”

[M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

The public voice, from the care.data listening events and beyond, could positively help shape the developing consent model, if given a genuine and adequate opportunity to do so in much-needed dialogue.

As they say, every little helps.

****

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/