Tag Archives: privacy

Access to school pupil personal data by third parties is changing

The Department for Education in England and Wales [DfE] has lost control of who can access our children’s identifiable school records, by giving individual and sensitive personal data out to a range of third parties since government changed policy in 2012. It now looks like they are panicking about how to fix it.

Applicants who want children’s identifiable and/or sensitive personal data from the National Pupil Database must now first apply for the lowest level of criminal record check, a basic DBS check, as part of the access process.

Schools Week wrote about it and asked for comment on the change [1] (as discussed by Owen in his blog [2] and our tweets).

At first glance, it sounds like a great idea, but what real difference will this make to who can receive 8 million school pupils’ data?

Yes, you did read that right.

The National Pupil Database gives away the personal data of eight million children, aged 2-19. Gives it away outside its own protection, because users get sent raw data, to their own desks. [3]

It would be good to know that people receiving your child’s data had never been cautioned for or convicted of anything related to children in their past, right?

Unfortunately, this DBS check won’t tell the Department for Education (DfE) that – because it’s the basic £25 DBS check [4], not the full version.

So this change seems less about keeping children’s personal data safe than about being seen to do something. Anything. Anything but the thing that needs to be done. Which is to keep the data secure.

Why is this not a brilliant solution? 

Moving towards the principle of keeping the data more secure is right, but in practice the DBS check is only useful if it actually makes data safer, by stopping unsuitable people from receiving data and reducing the risks associated with data misuse. So how will this DBS check achieve that? It’s not designed for people who handle data. It’s designed for people working with children.

There is plenty of evidence, often in the news, of data used inappropriately for commercial purposes, often through inappropriate storage and sharing of data as well as malicious breaches. I am not aware, and refer to this paper [5], of risks realised through malicious misuse of data for academic purposes in safe settings. Though mistakes do happen, through inappropriate processes and through human error and misjudgement.

However, a background check is not necessary for its own sake. What is necessary is to know that any users handle children’s data securely and appropriately, and with transparent oversight. There is no suggestion at all that people at TalkTalk were abusing data, but their customers’ data were not secure, and those data held in trust are now being misused.

That risk is the harm that is likely to affect a high number of individuals if bulk personal data are not securely managed. Measures to make it so must be proportionate to that risk. [6]

Coming back to what this will mean for individual applicants and its purpose: Basic Disclosure contains only convictions considered unspent under The Rehabilitation of Offenders Act 1974. [7]

The absence of a criminal record does not mean data are securely stored or appropriately used by the recipient.

The absence of a criminal record does not mean data will not be forwarded to another, undisclosed recipient, with no way for the DfE ever to know it happened.

The absence of a criminal record showing up on the basic DBS check does not even prove that the person has no previous conviction related to the mistreatment of people or the misuse of data. And any conviction you might consider ‘relevant’ to children, for example, may have become spent.


So for these reasons, I disagree that the decision to require a basic DBS check is worthwhile. Why? Because it’s effectively meaningless and doesn’t solve the problem, which is this:

Anyone can apply for 8m children’s personal data, and as long as they meet certain purpose and application criteria, they get sent sensitive and identifiable children’s data to their own setting. And they do. [8]

‘Prescribed persons’ or researchers, as defined in legislation designed in 2009, have come to include journalists, for example. Like BBC Newsnight, or Fleet Street papers. Is it right that journalists can access my children’s data, while as pupils and parents we cannot, and we’re not even informed? Clearly not.

It would be foolish to be reassured by this DBS check. The DfE is kidding itself if it thinks this is a workable or useful solution.

This step is simply a tick box, and it won’t stop the DfE regularly giving away eight million children’s individual level and sensitive records.

What problem is this trying to solve and how will it achieve it?

Before panicking to implement a change, the DfE should first answer:

  • who will administer and store potentially sensitive records of criminal convictions, even if unrelated to data?
  • what implications does this have for other government departments handling individual personal data?
  • why are 8m children’s personal and sensitive data given away ‘into the wild’ beyond DfE oversight in the first place?

Until the DfE properly controls the individual personal data flowing out from the NPD, from multiple locations, in raw form, and its governance, it makes little material difference whether the named user is shown to have, or not have, a previous criminal record. [9] Because the DfE has no idea if they are the only person who uses it.

The last line from DfE in the article is interesting: “it is entirely right that we continue to make sure that those who have access to it have undergone the necessary background checks.”

Continue from not doing it before? Tantamount to a denial of change, to avoid scrutiny of the past and status quo? They have no idea who has “access” to our children’s data today after they have released it, except on paper and trust, as there’s no audit process.[10]

If this is an indicator of the transparency and type of wording the DfE wants to use to communicate with schools, parents and pupils, I am concerned. Instead we need to see full transparency, a privacy impact assessment, and a public consultation on coordinated changes.

Further, if I were an applicant, I’d be concerned that DfE is currently handling sensitive pupil data poorly, and wants to collect more of mine.

In summary: because of change in Government policy in 2012 and the way in which it is carried out in practice, the Department for Education in England and Wales [DfE] has lost control of who can access our 8m children’s identifiable school records. Our children deserve proper control of their personal data and proper communication about who can access that and why.

Discovering through FOI [11] the sensitivity and volume of identifiable data journalists are being given access to shocked me. Discovering that schools and parents have no idea about it did not.

This is what must change.

 

*********

If you have questions or concerns about the National Pupil Database, your own experience, or your child’s data used in schools, please feel free to get in touch, and let’s see if we can make this better, so that our data are used well, with informed public support and public engagement.

********

References:
[1] National Pupil Database: How to apply: https://www.gov.uk/guidance/national-pupil-database-apply-for-a-data-extract

[2]Blogpost: http://mapgubbins.tumblr.com/post/132538209345/no-more-fast-track-access-to-the-national-pupil

[3] Which third parties have received data since 2012 (Tier 1 and 2 identifiable, individual and/or sensitive): release register https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] The Basic statement content http://www.disclosurescotland.co.uk/disclosureinformation/index.htm

[5] Effective Researcher management: 2009 T. Desai (London School of Economics) and F. Ritchie (Office for National Statistics), United Kingdom http://www.unece.org/fileadmin/DAM/stats/documents/ece/ces/ge.46/2009/wp.15.e.pdf

[6] TalkTalk is not the only recent significant data breach of public trust. An online pharmacy that sold details of more than 20,000 customers to marketing companies has been fined £130,000 https://ico.org.uk/action-weve-taken/enforcement/pharmacy2u-ltd/

[7] Guidance on the Rehabilitation of Offenders Act 1974 https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/299916/rehabilitation-of-offenders-guidance.pdf

[8] the August 2014 NPD application from BBC Newsnight https://www.whatdotheyknow.com/request/293030/response/723407/attach/10/BBC%20Newsnight.pdf

[9] CPS Guidelines for offences involving children https://www.sentencingcouncil.org.uk/wp-content/uploads/Final_Sexual_Offences_Definitive_Guideline_content_web1.pdf

[10] FOI request https://www.whatdotheyknow.com/request/pupil_data_application_approvals#outgoing-482241

[11] #saveFOI – I found out exactly how many requests had been fast tracked and not scrutinised by the data panel via a Freedom of Information Request, as well as which fields journalists were getting access to. The importance of public access to this kind of information is a reason to stand up for FOI  http://www.pressgazette.co.uk/press-gazette-launches-petition-stop-charges-foi-requests-which-would-be-tax-journalism

 

Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or has never, used their rights under the Freedom of Information Act, the government consultation on changing it that closes today is worth caring about. If you haven’t yet had your say, go and take action now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained by cost from standing up for human rights. The press will be restricted in what it can ask.

MySociety has a brilliant summary. Michael Sheen spoke up, calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to begin improving its handling of the 8 million children’s personal and sensitive data it holds in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts about how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight. And to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld it.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that when the minutes were finally released, only a handful of genuine redactions were necessary, using section 36 to keep them hidden.

In October 2014 I simply wanted to see the meeting minutes form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to.  When at every turn the public is told how little money the NHS can afford to spend I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed and fully understand the intention and programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others could use this information, I hoped, to ask the right questions about missing meeting minutes and transparency, and everyone could question why, in private, there was no cost-benefit business plan at all, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again, and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much this Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency to the public, which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that releasing information would not be in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about costs in answering requests but have perhaps not considered the savings in exceptional cases (like the Expenses Scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”

Why does the government want to make itself less transparent? Even the Information Commissioner’s office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law, that gives the public rights in our favour and transparency into how we are governed and tax money spent.

How will the value of FOI, and what would be lost if the changes are made, be measured?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP. Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company, after the event.

Everyone’s been talking about TalkTalk, and for all the wrong reasons. Data loss and a 15-year-old suspect, combined with a reportedly reckless response to data protection, compounded by a lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the “National Cyber Security Programme” [NCSP]. [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again. [5] An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly by not making preventative changes that were apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government’s expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where necessary, for purposes beyond those we expect or that were explained when we submitted our data. And there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their own data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously, with meaningful consequences if they do not [7]
    • Let’s Talk Education: educating the public on the use of personal data by others, and the rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: government’s own lack of data understanding, and what good practice means in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: the question of whether government can be trusted with the public data it already has, and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, for example as care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this, and make suggestions to support trust such as ‘data should never be (and currently is never) released with personal identifiers‘ in The Shakespeare Review have been ignored by government.

Where our personal data are not used well by government departments themselves, they seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because they pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data use practices, the care.data debacle is evidence that not all MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, or on others getting access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, individual and sensitive data from at least 8m children’s records, aged 2-19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand an emphasis on prevention measures in the use of our personal data.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance, in any of its many forms, is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options. But the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, hoping our data are used safely in unsecured settings by the unknown partners with whom data might be onwardly shared, hoping we won’t find out and they won’t need to talk about it, or have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion and shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved Trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] http://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech to share ideas and knowledge in real time with people around the world.  The freedom to fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated, the Internet may not work well for its citizens and its potential is restricted, as are its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design, and who will control it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable, and through what people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be of trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or fun experiment and how will users [especially for mental health apps where the outcomes may be less tangibly measured than say, blood glucose] know the difference?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create and enable barriers?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We assume that digital designs will put at their heart the core principles in the spirit of the NHS. But if apps are not available on prescription and are essentially a commercial product with no proven benefit, does that exploit the NHS brand trust?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person, as any other treatment could, and regulation must further consider reputational risk to the NHS and the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading benefits will not be undermined by creating artificial boundaries that restrict access to the tools by affordability, in a postcode lottery, or in language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Even in the UK today, exclusion is already measurable: in maternity care, attendees at #kfdigital15 were told by its founder, the poorest divide into those who can and cannot afford the smartphone data allowance that e-red book access costs.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area – a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use and for all, like any other treatment? Or is it yet another example of NHS postcode lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and can afford apps, and are not leaving behind the needs of those who can’t afford to pay for them.

At home, it is some of the least wealthy who need the most intervention and from whom there may be little profit to be made. There is little in 2020 plans I can see that focuses on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However it is a question of principle that will decide for or against exclusion of users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England approved apps going to account for geography and language and cross country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.


This photo, reportedly from Indonesia, is great [via Banksy on Twitter, and apologies I cannot credit the photographer]: two boys on the way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of data sharing restrictions: Barrier or Enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to datasharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that see behavioural barriers to data sharing as an obstacle have forgotten that trust is not something to be overcome, but something to be won, continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is [prescribed], used, and data exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in an in-store app does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When the recent social media experiment by Facebook only mentioned the use of data for research after the experiment, it caused outcry.

It crossed the line between what people felt acceptable and intrusive, analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen is both a risk and a cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” ask the words of T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters not only for today’s projects, but for the whole future digital is helping create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding app consumer regulation, to enable fairness of access and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers
ii. pro-actively designing ethics and change into ongoing projects, and
iii. ensuring engagement is genuine collaboration and co-production.

The barriers do not need to be got around; solutions need to be built in by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about the NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public on how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told by government that it is necessary to share all our individual level data in health from all sorts of sources.

And that bulk data collection is vital in the public interest to find surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups and personalise how they get treated. Recent objection was to marketing misuse by charities.

There is little broad understanding yet of the power of insight that organisations can now have to track and profile due to the power of algorithms and processing capability.

Technology progress that has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous projects going on at grassroots discussed over the two days: bringing the elderly online and connected, and working in housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communications with patients.

The NIB meeting didn’t have real public interaction or any discussion of those projects ‘in the room’ in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to ask to understand the intended change and outcome much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

Struggling to get themselves together on the current design for now, the leadership may find it hard to invite public feedback on the future.

But it’s only made hard if what the public wants is ignored. If those issues were resolved in the way the public asked for at listening events, it could be quite simple to solve.

The NIB leadership clearly felt nervous about debate, giving only 10 minutes of three hours for public involvement, yet debate is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need to be publicly debated and dissected by the clinical professions to see if they fit the current and future model of healthcare, because if people are not involved in the change, the ride will be awfully bumpy to get there.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

“As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating them has been forgotten. If the Internet serves things, it serves consumerism. AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer methods of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and the benefits hard to communicate, I’d say if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how these projects that want their personal data work, or b) put up with being ignorant.

We should be able to fully question why it is needed and get a transparent and complete explanation. We should have fully accountable business plans and scrutiny of tangible and intangible benefits in public, before projects launch based on public buy-in which may be misplaced. We should expect plans to be accessible to everyone and make documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many would be able to put succinctly: what is the NHS digital forward view really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other, funding stops for what already works today.

Although the volume of activity planned is vast, what it boils down to is what is visionary and achievable, and not just a vision.

Digital revolution by design: building for change and people

We have opportunity to build well now, avoiding barriers-by-design, pro-actively designing ethics and change into projects, and to ensure it is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own bubble of development as it shapes its plans in the ever changing infrastructures in which data, digital, AI and ethics will become important to discuss together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The charter as part of the French Revolution set out a clear, understandable, ethical and fair framework of law in which they trusted their rights would be respected by fellow citizens.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over as the NHS England board delegated the final approval of directions to their Chair, Sir Malcolm Grant and Chief Executive, Simon Stevens, on Thursday May 28.

Directions from NHS England which will enable the HSCIC to proceed with their pathfinder pilots’ next stage of delivery.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16 month pause, as long as it takes for the gestation of a walrus, it appears the directions had flaws. Technical fixes are also needed before the plan could proceed to extract GP care.data and merge it with our hospital data at HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now be no longer eligible. Commercial data intermediaries? Can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, communications must include an opt out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not been. There is still a lack of transparency: NHS England’s communications materials, the May-Oct 2014 and any 2015 programme board minutes have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun, they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014 before the pause began. From the public listening exercise, the high level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When asked by a board member, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, it wasn’t very clear. [full detail at end of post]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas.] At least that was what the verbal background given at the board meeting explained.

If the pilots should be a dip in the water of how national rollouts will proceed, then they need to test not just for today, but at least for the known future of changing content scope and expanding users – who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will it not make these complications worse under pressure? There may be external pressure ahead, as potential changes to EU data protection are expected this year as well, for which the pilot must be prepared, designing in advance for the expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this if we believed it was absolutely critical in the interests of patients.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to just fix the best of a bad job, we’ve not even asked: are we even doing the right thing? Is the system designed to best support doctor-patient needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If focus is on the success of the programme and not the patient, consider this: there’s a real risk too many opt out due to these unknowns, and the lack of real choice on how their data gets used. It could be done better to reduce that risk.

What’s the percentage of opt out that the programme deems a success to make it worthwhile?

In March 2014, at a London event, a GP told me all her patients who were opting out were the newspaper reading informed, white, middle class. She was worried that the data that would be included, would be misleading and unrepresentative of her practice in CCG decision making.

medConfidential has written a current status for pathfinder areas that makes great sense: focus first on fixing care.data’s big post-election question, the opt out that hasn’t been put into effect. Of course in February 2014 we had to choose between two opt outs – so how will that work for pathfinders?

In the public interest we need collectively to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process, one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, what came out of the advisory group last week is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed; that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: “So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility that they have under the law in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

“Perhaps it may help if I forward this to the board. It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] Request for the release of June 2014 Open House feedback, still to be published, in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time that the pause in the care.data programme took to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: http://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data for research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to money making from our data the public is less happy. The economic value of data raises more questions on use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from individuals and corporations who by using the data make a profit, using data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how does what the public feels and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing measured by public feeling in 2014 is a threat to data used in the public interest, so what are we doing to fix it and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events [1] were repeatedly about commercial companies’ use and re-use of data, third parties accessing data for unknown purposes, and the resultant loss of confidentiality.

 Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership. And with that, the inability to consent to who uses it. This despite acknowledgment that patients own their data.

Significant concern centres on use of the information gleaned from data that patients consider commercial exploitation. For use segmenting the insurance markets. For consumer market research. Using data for individual targeting. And its utter lack of governance.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings between NHS England and supermarket giant Tesco throughout 2014 [2] by the Patients and Information Director, responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than citizens’ sensitive data.

Consent package deals may be a costly mistake

Since care.data a year ago, people are much more aware that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing, and in wider fora, is no longer an inconvenient ethical dilemma best left on the shelf, as it was for the last 25 years of secondary use, until dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required, and good practice, in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care, during medical treatment itself, does not cause concern.

It is becoming increasingly assumed, in discussions I have heard [at CCG and other public meetings], that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and those purposes cannot ethically be assumed. However, legal gateways, which permit access to that data by law for clearly defined secondary purposes, may make that data sharing legal.

With that legal assumption comes, for the majority of people, polls and dialogue show [though not for everyone, 6b], a degree of automatic support for bona fide research in the public interest. But it is not a blanket for all secondary uses by any means, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active, rather than implied, consent becomes a requirement for research purposes without differentiation between kinds, the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now around data sharing [6]. It’s been a high price to pay for public health research and others delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in what kinds of research and, if not happy with one kind, may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of  good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: http://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS), a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale http://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [3] The value of public voice.

Demonstrable value of public research to the public good, while abstract, is a concept quite clearly understood.

Demonstrating the economic value of data for private consumer companies like major supermarkets is even easier to understand.

What is less obvious is the harm that the commercial misuse of data can do to the public’s perception of all research for the public good.[6]

The personal cost of consumer data exploitation, whether through the loss of, or through paid-for privacy, must be limited to reduce the perceived personal cost of the public good.

By reducing the personal cost, we increase the value of the perceived public benefit of sharing and overall public good.

The public good may mean many things: benefits from public health research like understanding how disease travels, or good financial planning, derived from knowing what needs communities have and what services to provide.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

It will create more opportunity for data to be used in the public interest, for both economic and social gain.

As I outlined in the previous linked blog posts, consent [part 1] and privacy [part 2] would be wise investments in that growth.

So how are consumer businesses and the state taking this into account?

Where is the dialogue we need to keep expectations and practices aligned in a changing environment and legal framework?

Personalisation: the economic value of data for companies

Any projects, under discussion or in progress, that lack adequate public consultation and real involvement, and that ignore public voice, risk their own success and with it the public good they should create.

The same is true for commercial projects.  For example, back to Tesco.

Whether the clubcard data management and processing [8] is directly or indirectly connected to Tesco, its customer data are important to the supermarket chain and are valuable.

Former Tesco executive Terry Leahy spoke about that value in a 2013 interview:

“These are slow-growing industries,” Leahy said. “The difference was in the use of data, in the way Tesco learned about its customers. And from that, everything flowed.”[9]

Knowing who shops, and how and when, allows the company to target its sales offering to make people buy more, or differently: the so-called ‘nudge’, moving citizens in the direction the company wants.

He explained how, through the Clubcard loyalty program, the supermarket was able to transition from mass marketing to personalized marketing and that it works in other areas too:

“You can already see in some areas where customers are content to be priced as customers: risk pricing with insurance and so on.

“It makes a lot of sense in health pricing, but there will be certain social policy restriction in terms of fair access and so on.”

NHS patient data and commercial supermarket data may be coming closer in their use than we might think.

Not only closer in their similar desire to move towards personalisation [10], but for similar reasons: the desire to use all the data to know all about people as health consumers and, from that, to plan and purchase best and cheapest, “in reducing overall cost.”

It is worth thinking about how, in an economy driven by ideological austerity, reducing overall cost will be applied: by cutting services, or by reducing to whom services are offered.

What ‘nudge’ may be applied through NHS policies, to move citizens in the direction the drivers in government or civil service want to see?

What, for example, will push those who can afford it into private care, and out of the group on whom the state has to spend money, if they are prepared to spend their own?

What is the data that citizens provide through schemes like care.data designed to achieve?

“Demonstrating The Actual Economic Value of Data”

Tim Kelsey, speaking at Strata in 2013 [11] talked about: “Demonstrating The Actual Economic Value of Data”. Our NHS data are valuable in both economic and social terms.

[From 12:17] “It will help put the UK on the map in terms of genomic research. The PM has already committed to the UK developing 100K gene sequences very rapidly. But those sequences on their own will have very limited value without the reference data that lies out there in the real world of the NHS, the data we’ll start making available from next June […]. The name of the programme by the way is care dot data.”

The long since delayed care.data programme plans to provide medical records for secondary use, as reference data for the 100K genomics programme. The programme has the intent to “create a lasting legacy for patients, the NHS and the UK economy.”

With consent.

When the CEO of Illumina talks about winning a US $20bn market [12], perhaps it also sounds economically appealing for UK plc and the austerity-lean NHS. Illumina is, of course, the company which won the Genomics England sequencing contract.

“The notion here is that it’s really a precursor to understand the health economics of why sequencing helps improve healthcare, both in quality of outcome, and in reducing overall cost. Presuming we meet the objectives of this three-year study–and it’s truly a pilot–then the program will expand substantially and sequence many more people in the U.K.” [Jay Flatley, CEO]

The idea of it being a precursor leaves me asking, to what?
“Will expand substantially” to whom?

As more and more becomes possible in science, there will be an ever greater need for understanding between how and why we should advance medicine, and how to protect human dignity. Because it becomes possible may not always mean it should be done.

Article 21 of the Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the application of biology and medicine, also says:  “The human body and its parts shall not, as such, give rise to financial gain.”

How close is profit making from DNA sequencing getting to that line?

These raise ethical questions, and questions of social and economic value. The social legitimacy of these programmes will depend on trust. Trust based on no surprises.

Commercial market research or real research for the public good?

Meanwhile all consenting patients can in theory now choose to access their own record [GP online].  Mr Kelsey expressed hopes in 2013 that developers would use that to help patients:

“to mash it up with other data sources to get their local retailers to tell them about their purchasing habits [16:05] so they can mash it up with their health data.”

This, despite 67% of the public being concerned about health data use by commercial companies.

So what were the commercially sensitive projects discussed by NHS England and Tesco throughout 2014? It would be interesting to know whether loyalty cards and mashing up our data were part of it – or did they discuss market segmentation, personalisation and health pricing? Will we hear the ‘Transparency Tsar’ tell NHS citizens their engagement is valued, but in reality find the public is not involved?

To do so would risk another care.data style fiasco in other fields.

Who might any plans offer most value to – the customer, the company or the country plc? Will the Goliaths focus on short term profit or fair processing and future benefits?

In the long run, ignoring public voice won’t help the UK plc or the public interest.

A balanced and sustainable research future will not centre on a consumer pay-for-privacy basis, or commercial alliances, but on a robust ethical framework for the public good.

A public good which takes profit into account for private companies and the state, but not at the expense of public feeling and ethical good practice.

A public good which we can understand in terms of social, direct and indirect economic value.

While we strive for the economic and public good in scientific and medical advances we must also champion human dignity and values.

This dialogue needs to be continued.

“The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.”

[M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

The public voice from care.data listening and beyond, could positively help shape the developing consensual model if given genuine adequate opportunity to do so in much needed dialogue.

As they say, every little helps.

****

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

[1] care.data listening event questions: http://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale http://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care? [#NHSWDP 3]

 

Consent to data sharing appears to be a new choice firmly available on the NHS England patient menu, if patient ownership of our own records is clearly acknowledged as ‘the operating principle legally’.

Simon Stevens, had just said in his keynote speech:

“..smartphones; […] the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond ” Simon Stevens, March 18 2015.

Tim Kelsey, Director Patients and Information, NHS England, then talked about consent in the Q&A:

“We now acknowledge the patient’s ownership of the record […] essentially, it’s always been implied, it’s still not absolutely explicit but it is the operating principle now legally for the NHS.

“So, let’s get back to consent and what it means for clinical professionals, because we are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.

“It is essentially, their data.”

How this principle has been applied in the past, is being applied now, and may change, matters, as it will affect many other areas.

Our personal health data is the business intelligence of the health industry’s future.

Some parts of that industry will say we don’t share enough data. Or don’t use it in the right way.  For wearables designed as medical devices, it will be vital to do so.

But before some launch into polemics on the rights and wrongs of blanket ‘data sharing’, we should be careful what types of data we mean, and for what purposes they are extracted. It matters when discussing consent and sharing.

We should be clear to separate consent to data sharing for direct treatment from consent for secondary purposes other than care (although Mr Kelsey hinted at a conflation of the two in a later comment). The promised opt-out from sharing for secondary uses is pending legal change. At least that’s what we’ve been told.

Given that patient data from hospital and a range of other NHS health settings are today used for secondary purposes without consent – despite the political acknowledgement that patients have an opt out – this sounded a bold new statement, and contrasted with his past stance.

Primary care data extraction for secondary uses, in the care.data programme, was not intended to be consensual. Will it become so?

Its plan so far assumes consent, an opt-out model, despite professional calls from some, such as at the BMA ARM, to move to opt-in, and the acknowledged risk of harm that it will do to patient trust.

The NHS England Privacy Assessment said: ‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

A year into the launch, in January 2014, a national communications plan should have met the need for fair processing; yet another year on, in March 2015, there is a postcode-lottery, pilot approach.

If, in principle, data sharing is to be decided by consensual active choice, as it “is the operating principle now legally for the NHS”, then why not now, for care.data, and for all?

When will the promised choice be enacted to withhold data from secondary uses and sharing with third parties beyond the HSCIC?

“we are going to move to a place where people will make those decisions as they currently do with wearable devices” [Widening digital participation, at the King’s Fund March 2015]

So when will we see this ‘move’ and what will it mean?

Why plan to continue to extract more data under the ‘old’ assumption principle, if ownership of data is now with the individual?

And who is to make the move first – NHS patients or NHS patriarchy – if patients use wearables before the NHS is geared up to them?

Looking back or forward thinking?

Last year’s programme has become outdated not only in principle but in digital best practice, if top-down dictatorship is out and the individual is now to “manage their data as they wish.”

What might happen in the next two years, in the scope of the Five Year Forward View, or indeed by 2020?

This shift in data creation, sharing and acknowledged ownership may mean epic change for expectations and access.

It will mean that people’s choices around data sharing, from patients and healthy controls, need to be considered early on in research and projects. Engagement, communication and involvement will be all about trust.

For the ‘worried well’, wearables could ‘provide digital “nudges” that will empower us to live healthier and better lives’, or perhaps not.

What understanding have we yet, of the big picture of what this may mean and where apps fit into the wider digital NHS application and beyond?

Patients right to choose

The rights to information and decision-making responsibility are shifting towards the patient in other applied areas of care.

But what data will patients truly choose to apply and what to share, manipulate or delete? Who will use wearables and who will not, and how will that affect the access to and delivery of care?

What data will citizens choose to share in future and how will it affect the decision making by their clinician, the NHS as an organisation, research, public health, the state, and the individual?

Selective deletion could change a clinical history and clinician’s view.

Selective accuracy in terms of false measurements [think diabetes], or in medication, could kill people quickly.

How are apps to be regulated? Will only NHS ‘approved’ apps be licensed for use in the NHS and made available to choose from and what happens to patients’ data who use a non-approved app?

How will any of their data be accessed and applied in primary care?

Knowledge is used to make choices and inform decisions. Individuals make choices about their own lives, clinicians make decisions for and with their patients in their service provision, organisations make choices about their business model which may include where to profit.

Our personal health data is the business intelligence of the health industry’s future.

Who holds the balance of power in that future delivery model for healthcare in England is going to be an ongoing debate of epic proportions, but it will likely change in drips rather than a flood.

It has already begun. Lobbyists and companies who want access to data are apparently asking for significant changes to be made in the access to micro data held at the ONS. EU laws are changing.

The players who hold data, will hold knowledge, will hold power.

If the NHS were a Monopoly board game, data intermediaries would be some of the wealthiest sites, but the value they create from publicly funded NHS data should belong in the community chest.

If consent is to be with the individual for all purposes other than direct care, then all data sharing bodies and users had best set their expectations accordingly. Patients will need to make wise decisions, for themselves and in the public interest.

Projects for research and sharing must design trust and security into plans from the start or risk failure through lack of participants.

It’s enormously exciting. I suspect some apps will be rather well hyped and deflate quickly if not effective. Others might be truly useful. Others may kill us.

As twitter might say, what a time to be alive.

Digital opportunities for engaging citizens, as far as apps and data sharing go, are not only about how the NHS will engage citizens, but about how citizens will engage with what the NHS offers.

Consent it seems will one day be king.

Will there or won’t there be a wearables revolution?

Will we be offered or choose digital ‘wellness tools’ or medically approved apps? Will we trust them for diagnostics and treatment? Or will few become more than a fad for the worried well?

Control for the individual over their own data and choice to make their own decisions of what to store, share or deny may rule in practice, as well as theory.

That practice will need to differentiate between purposes for direct clinical care and secondary uses as it does today, and be supported and protected in legislation, protecting patient trust.

“We are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.”

However, as ‘choice’ was the buzzword for NHS care in recent years – conflated with increasing the use of private providers – will consent be abused to mean a shift of responsibility from the state to the individual, with caveats for how it could affect care?

With that shift in responsibility for decision making, as with personalised budgets, will we also see a shift in responsibility for payment choices from state to citizen?

Will our lifestyle choices in one area exclude choice in another?

Could app data of unhealthy purchases from the supermarket, or refusal to share our health data, one day be seen as refusal of care and a reason to decline it? Mr Kelsey hinted at this last question in the meeting.

Add a population stratified by risk groups into the mix, and we have lots of legitimate questions to ask on the future vision of the NHS.

He went on to say:

“we have got some very significant challenges to explore in our minds, and we need to do, quite urgently from a legal and ethical perspective, around the advent of machine learning, and …artificial intelligence capable of handling data at a scale which we don’t currently do […] .

“I happen to be the person responsible in the NHS for the 100K genomes programme[…]. We are on the edge of a new kind of medicine, where we can also look at the interaction of all your molecules, as they bounce around your DNA. […]

“The point is, the principle is, it’s the patient’s data and they must make decisions about who uses it and what they mash it up with.”

How well that is managed will determine who citizens will choose to engage and share data with, inside and outside our future NHS.

Simon Stevens, earlier at the event, had acknowledged a fundamental power shift he sees as necessary:

“This has got to be central about what the redesign of care looks like, with a fundamental power shift actually, in the way in which services are produced and co-produced.”

That could affect everyone in the NHS, with or without a wearables revolution.

These are challenges the public is not yet discussing and we’re already late to the party.

We’re all invited. What will you be wearing?

********
[Previous: part one here #NHSWDP 1  – From the event “Digital Participation and Health Literacy: Opportunities for engaging citizens” held at the King’s Fund, London, March 18, 2015]

[Previous: part two #NHSWDP 2: Smartphones: the single most important health treatment & diagnostic tool at our disposal]

********

Apple ResearchKit: http://techcrunch.com/2015/03/09/apple-introduces-researchkit-turning-iphones-into-medical-diagnostic-devices/#lZOCiR:UwOp
Digital nudges – The Tyranny of the Should by Maneesh Juneja http://maneeshjuneja.com/blog/2015/3/2/the-tyranny-of-the-should


Nothing to fear, nowhere to hide – a mother’s attempt to untangle UK surveillance law and programmes

“The Secret Service should start recruiting through Mumsnet to attract more women to senior posts, MPs have said.”
[SkyNews, March 5, 2015]

Whilst we may have always dreamed of being ‘M’, perhaps we can start by empowering all Mums to understand how real-life surveillance works today, in all our lives, homes and schools.

In the words of Franklin D. Roosevelt at his 1933 inaugural address:

“This is preeminently the time to speak the truth, the whole truth, frankly and boldly…

“Let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”

It is hard to know in the debate in the ‘war on terror’, what is truthful and what is ‘justified’ fear as opposed to ‘nameless and unreasoning.’

To be reasoned, we need information to understand what is going on, and it can feel that the picture is complex and unclear.

What concrete facts do you and I have about terrorism today, and the wider effects it has on our home life?

If you have children in school, or are a journalist, a whistleblower or a lawyer, or have thought about the effects of the news recently, surveillance may affect our children or any of us in ways we may not expect.

It might surprise you that it was surveillance law that was used to track a mother and her children’s [1] movements when a council wasn’t sure if her school application was for the correct catchment area. [It legally used the Regulation of Investigatory Powers Act 2000 (RIPA).] [2]

Recent headlines are filled with the story of three more girls who are reported to have travelled to Syria.

As a Mum, I’d be desperate if they were my teens, and I cannot imagine what their families must feel. There are conflicting opinions, and politics, but let’s leave that aside. These girls are each somebody’s daughter, and at risk.

As a result, MPs are talking about what should be taught in schools. Do parents and citizens agree, and do we even know what is taught?

Shadow business secretary Chuka Umunna, Labour MP told Pienaar’s Politics on BBC Radio 5 Live: “I really do think this is not just an issue for the intelligence services, it’s for all of us in our schools, in our communities, in our families to tackle this.”

Justice Minister Simon Hughes told Murnaghan on Sky News it was important to ensure a counter-argument against extremism was being made in schools and also to show pupils “that’s not where excitement and success should lie”. [BBC 22 February 2015]

There are already policies in schools that touch all our children and laws which reach into our family lives that we may know little about.

I have lots of questions about what and how we are teaching our children about ‘extremism’ in schools, and how the state uses surveillance to monitor our children’s and our own lives.

This may affect all schools and places of education, not only those we hear stories about in the news, so it includes yours.

We all want the best for our young people and security in society, but are we protecting and promoting the right things?

Are today’s policies in practice, helping or hardening our children’s thinking?

Of course I want to see that all our kids are brought up safe. I also want to bring them up free from prejudice and see they get equal treatment and an equal start in life in a fair and friendly society.

I think we should understand the big picture better.

1. Do you feel comfortable that you know what is being taught in schools, or what is done with information recorded or shared by schools, or about the proposed expansion of the Prevent programme to pre-schools and toddlers?

2. Do government communications surveillance programmes in reality match up with real-world evidence of need, and how is their effectiveness measured?

3. Do these programmes create more problems as side-effects we don’t see or don’t measure?

4. If any of our children have information recorded about them in these programmes, how is it used, who sees it and for what purposes?

5. How much do we know about the laws brought in under the banner of ‘counter-terror’ measures, and how they are used for all citizens in everyday life?

We always think unexpected things will happen to someone else, and everything is rightfully justified in surveillance, until it isn’t.

Labels can be misleading.

One man’s terrorist may be another’s freedom fighter.

One man’s investigative journalist is another’s ‘domestic extremist.’

Who decides who is who?

Has anyone asked in Parliament: why has religious hate crime escalated by 45% in 2013/14, and what are we doing about it? (Up 700 to 2,273 offences; crime figures [19].)

These aren’t easy questions, but we shouldn’t avoid asking them because it’s difficult.

I think we should ask: do we have laws which discriminate by religion, censor our young people’s education, or store information about us which is used in ways we don’t expect or know about?

Our MPs are after all, only people like us, who represent us, and who make decisions about us, which affect us. And on 7th May, they may be about to change.

As a mother, whoever wins the next General Election matters to me because it will affect the next five years or more, of what policies are made which will affect our children, and all of us as citizens.

It should be clear what these programmes are, and there is no reason why they should not be transparent.

“To counter terrorism, society needs more than labels and laws. We need trust in authority and in each other.”

We need trust in authority and in each other in our society, built on a strong and simple legal framework and founded on facts, not fears.

So I think this should be an election issue. What does each party plan on surveillance to resolve the issues outlined by journalists, lawyers and civil society? What applied programmes does each party see that will be, in practical terms: “for all of us in our schools, in our communities, in our families to tackle this.”

If you agree, then you could ask your MP, and ask your prospective parliamentary candidates. What is already done in real life and what are their future policies?

Let’s understand ‘the war on terror’ at home better, and its real impacts. These laws and programmes should be transparent, easy to understand, and not only legal, but clearly just, and proportionate.

Let’s get back to some of the basics, and respect the rights of our children.

Let’s start to untangle this spaghetti of laws; the programmes, that affect us in practice; and understand their measures of success.

Solutions to protecting our children are neither simple nor short term. But they may not always mean more surveillance.

Whether the Secret Service will start recruiting through Mumsnet or not, we could start with better education of us all.

At very least, we should understand what ‘surveillance’ means.

****

If you want to know more detail, I look at this below.

The laws applied in Real Life

Have you ever looked at case studies of how surveillance law is used?

In one case, a mother and her children’s [1] movements were watched and tracked when a council wasn’t sure if her school application was for the correct catchment area. [It legally used the Regulation of Investigatory Powers Act 2000 (RIPA).] [2]

Do you think it is just or fair that a lawyer’s conversations with his client [3] were recorded and may have been used in preparing the trial – when the basis of our justice system is innocent until proven guilty?

Or is it right that journalists’ phone records could be used to identify people by the police, without telling the journalists or getting independent approval, from a judge for example?


These aren’t theoretical questions but stem from real-life uses of laws used in the ‘counter terrorism’ political arena and in practice.

Further programmes store information about everyday people, which we may find surprising.

In November 2014 it was reported that six British journalists [4] had found out personal and professionally related information had been collected about them, and was stored on the ‘domestic extremist’ database by the Metropolitan Police in London.

They were not criminal nor under surveillance for any wrongdoing.

One of the journalists wrote in response in a blog post on the NUJ website [5]:

“…the police have monitored public interest investigations in my case since 1999. More importantly if the police are keeping tabs on a lightweight like myself then they are doing the same and more to others?”

Ever participated in a protest, or if not, reported on one?

‘Others’ in that ‘domestic extremist list’ might include you, or me.

Current laws may be about to change [6] (again) and perhaps for the good, but will yet more rushed legislation in this area be done right?

There are questions over the detail and what will actually change. There are multiple bills affecting security, counter-terrorism and data access in parliament, panels and reviews going on in parallel.

The background which has led to this is the culmination of concern and pressure over a long period of time, focused on one set of legal rules: the Regulation of Investigatory Powers Act (RIPA).

The latest draft code of practice [7] for the Regulation of Investigatory Powers Act (RIPA) [8] allows the police and other authorities to continue to access journalists’ and other professionals’ communications without any independent process or oversight.

‘Nothing to hide, nothing to fear’ is a phrase we hear said of surveillance, but as these examples show, its use is widespread and often unexpected, not confined to the extremes we are often told about.

David Cameron most recently called for ever wider surveillance legislation, again in The Telegraph, 12 January 2015, saying: [9]

“That is why in extremis it has been possible to read someone’s letter, to listen to someone’s telephone, to mobile communications.”

Laws and programmes enable and permit these kinds of activity which are not transparent to the broad public. Is that right?

The Deregulation Bill contains changes which now appear to have been amended to keep the provisions affecting journalists in PACE [10] after all, but what effects are there for other professions, and how exactly will this change interact with further new laws such as the Counter Terrorism and Security Act [p20]? [11]

It’s understandable that politicians are afraid of doing nothing: if a terrorist attack takes place, they risk looking as though they failed.

But it appears that politicians may have got themselves so keen to be seen to be doing ‘something’ in the face of terror attacks, that they are doing too much, in the wrong places, and we have ended up with a legislative spaghetti of simultaneous changes, with no end in sight.

It’s certainly no way to make legal changes understandable to the British public.

Political change may come as a result of the General Election. What implications will it have for the applied ‘war-on-terror’ and average citizen’s experience of surveillance programmes in real life?

What do we know about how we are affected? The harm to some in society is real, and is clearly felt in some, if not all communities. [12]

Where is the evidence, to include in the debate, of how laws affect us in real life and what difference they make versus their intentions?

Anti-terror programmes in practice; in schools & surgeries

In addition to these changes in law, there are a number of programmes in place at the moment.

The Prevent programme [16] I already mentioned above.

Its expansion to wider settings would include our children from age 2 and up, who will be under an additional level of scrutiny and surveillance [criticism of the proposal has come from across the UK].

How might what a three-year-old says or draws be interpreted, or recorded about them, or their family? Who accesses that data?

What film material is being produced that is: “distributed directly by these organisations, with only a small portion directly badged with government involvement”, and who is shown it and why? [Review of Australia’s Counter Terror Machinery, February 2015] [17]

What if it’s my child who has something recorded about them under ‘Prevent’? Will I be told? Who will see that information?  What do I do if I disagree with something said or stored about them?

Does surveillance benefit society or make parts of it feel alienated and how are both its intangible cost and benefit measured?

When you combine these kinds of opaque, embedded programmes in education or social care with political thinking which could appear to be based on prejudice not fact [18], the outcomes could be unexpected, and reminiscent of 1930s anti-religious laws.

Baroness Hamwee raised this concern in the Lords on the 28th January, 2015 on the Prevent Programme:

“I am told that freedom of information requests for basic statistics about Prevent are routinely denied on the basis of national security. It seems to me that we should be looking for ways of providing information that do not endanger security.

“For instance, I wondered how many individuals are in a programme because of anti-Semitic violence. Over the last day or two, I have been pondering what it would look like if one substituted “Jewish” for “Muslim” in the briefings and descriptions we have had.” Baroness Hamwee:  [28 Jan 2015 : Column 267 -11]

“It has been put to me that Prevent is regarded as a security prism through which all Muslims are seen and that Muslims are suspect until proved otherwise. The term “siege mentality” has also been used.

“We have discussed the dangers of alienation arising from the very activities that should be part of the solution, not part of the problem, and of alienation feeding violence. […]

“Transparency is a very important tool … to counter those concerns.”

Throughout history, good and bad have depended on your point of view. In 1970s London, but assuming today’s technology, would all Catholics have been swept under this extra scrutiny?

“Early education funding regulations have been amended to ensure that providers who fail to promote the fundamental British values of democracy, the rule of law, individual liberty and mutual respect and tolerance for those with different faiths and beliefs do not receive funding.” [consultation guidance Dec 2014]

The programme’s own values seem undermined by its attitudes to religion and individual liberty. On universities, the same paragraph on ‘freedom of speech’ suggests restrictive planning measures on protest meetings and IT surveillance of material accessed for ‘non-research purposes’.

School and university is a time when our young people explore all sorts of ideas, including to be able to understand and to criticise them. Just looking at material online should not necessarily have any implications.  Do we really want to censor what our young people should and should not think about, and who is deciding the criteria?

For families affected by violence, nothing can make up for their loss, and we may want to do anything to prevent it happening again.

But are we seeing widespread harm in society as side effects of surveillance programmes?

We may think we live in a free and modern society. History tells us how easily governments can slide into previously unthinkable directions. It would be complacent to think, ‘it couldn’t happen here.’

Don’t forget, religious hate crime escalated by 45% in 2013/14 (crime figures [19]).

Writers self-censor their work. Whistleblowers may not come forward to speak to journalists if they feel actively watched.

Terrorism is not new.

Young people with fervour to do something for a cause and going off ‘to the fight’ in a foreign country is not new.

In the 1930s the UK Government made it illegal to volunteer to fight in Spain in the civil war, but over 2,000 went anyway.

New laws are not always solutions, especially when ever stricter surveillance laws may still not mean any better accuracy of terror prevention on the ground. [As Charlie Hebdo and Copenhagen showed: in these cases the people involved were already known to police. In the case of Lee Rigby it was even more complex.]

How about improving our citizens’ education and transparency about what’s going on & why, based on fact and not fear?

If the state shouldn’t nanny us, then it must allow citizens and parents the transparency and understanding of the current reality, to be able to inform ourselves and our children in practical ways, and to know if we are being snooped on or recorded under surveillance.

There is an important role for cyber experts and civil society in educating and challenging MPs on policy. There is also a very big gap in practical knowledge for the public, which should be addressed.

Can I trust that information I discuss with my doctor or lawyer will be kept confidential, or that it would be if I came forward as a whistleblower?

Do I know whether my email and telephone conversations, or social media interactions are being watched, actively or by algorithms?

Do we trust that we are treating all our young people equally and without prejudice and how are we measuring impact of programmes we impose on them?

To counter terrorism, society needs more than labels and laws

We need trust in authority and in each other in our society, built on a strong and simple legal framework and founded on facts, not fears.

If the Prevent programme is truly needed at this scale, tell us why and tell us all what our children are being told in these programmes.

We should ask our MPs, even though the consultation is closed: what is the evidence behind the thinking about bringing Prevent into toddler settings, and far more? What risks and benefits have been assessed for any of our children and families who might be affected?

Do these efforts need to be expanded to include two-year-olds?

Are all efforts to keep our kids and society safe equally effective and proportionate to potential and actual harm caused?

Alistair MacDonald QC, chairman of the Bar Council, said:

“As a caring society, we cannot simply leave surveillance issues to senior officers of the police and the security services acting purportedly under mere codes of practice.

“What is surely needed more than ever before is a rigorous statutory framework under which surveillance is authorised and conducted.”

Whether we are disabled PIP protesters outside parliament or mothers on the school run, journalists or lawyers, doctors or teachers, or anyone, these changes in law or lack of them, may affect us. Baroness Hamwee clearly sees harm caused in the community.

Development of a future legislative framework should reflect public consensus, as well as the expert views of technologists, jurists, academics and civil liberty groups.

What don’t we know? And what can we do?

According to an Ipsos MORI poll for the Evening Standard in October 2014 [20], only one in five people think the police should be free to trawl through the phone records of journalists to identify their sources.

Sixty-seven per cent said the approval of a judge should be obtained before such powers are used.

No one has asked the public whether we think the Prevent programme is appropriate or proportionate, as far as I recall.

Who watches that actions taken under it are reasonable and not reactionary?

We really should be asking: what are our kids being shown, taught, or informed about, and how may they be informed upon?

I’d like all of that in the public domain, for all parents and guardians. The curriculum, who is teaching and what materials are used.

It’s common sense to see that young people who feel isolated or defensive are less likely to talk to parents about their concerns.

“Nothing to hide, nothing to fear” is a well-known quote in surveillance. But this argument is flawed, because information can be wrong.

‘Nothing to fear, nowhere to hide’, may become an alternative meme we hear debated again soon, about surveillance if the internet and all communications are routinely tracked, without oversight.

To ensure proper judicial oversight in all these laws and processes – to have an independent judge give an extra layer of approval – would restore public trust in this system and the authority on which it depends.

It could pave the way for a new hope of restoring the checks and balances in many governance procedures, which a just and democratic society deserves.

As Roosevelt said: “let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror.”

 

******

[On Channel4OD: Channel 4 – Oscar winning, ‘CitizenFour’  Snowden documentary]

References:

[1] The Guardian, 2008, council spies on school applicants

[2] Wikipedia RIPA legislation

[3] UK admits unlawfully monitoring communications

[4] http://www.theguardian.com/uk-news/2014/nov/20/police-legal-action-snooping-journalists

[5] Journalist’s response

[6] SOS Campaign

[7] RIPA Consultation

[8] The RIPA documents are directly accessible here

[9] The Telegraph

[10] Deregulation Bill

[11] Counter Terrorism and Security Act 2015

[12] Baroness Hamwee comments in the House of Lords [Hansard]

[13] Consultation response by charity Children in Scotland

[14] The Telegraph, Anti-terror plan to spy on toddlers ‘is heavy-handed’

[15] GPs told to specify counter terrorism leads [Prevent]

[16] The Prevent programme, BBC / 2009 Prevent programme for schools

[17] Review of Australia’s CT Machinery

[18] Boris Johnson, March 2014

[19] Hate crime figures 2013-14

[20] Ipsos MORI poll, October 2014

 

******

 image credit: ancient history
