care.data : the economic value of data versus the public interest?

 This is a repost of my opinion piece published in StatsLife in June 2015.

The majority of the public supports the concept of using data for public benefit.[1] But the measurable damage done in 2014 to the public’s trust in data sharing,[2] and the reasons for it, remain an ongoing threat to achieving that benefit.

Rebuilding trust and the public legitimacy of government data gathering could be a task for Sisyphus, given the media atmosphere clouded by the smoke and mirrors of state surveillance. As Mark Taylor, chair of the NHS’s Confidentiality Advisory Group wrote when he considered the tribulations of care.data [3] ‘…we need a much better developed understanding of ‘the public interest’ than is currently offered by law.’

So what can we do to improve this as pilot sites move forward, and for other research? Can we consistently quantify the value of the public good and account for intangible concerns and risks alongside demonstrable benefits? Do we have a common understanding of what the public feels is in its own best interests?

And how are shifting public and professional expectations to be reflected in the continued approach to accessing citizens’ data, with the social legitimacy upon which research depends?

Listening and lessons learned

Presented as an interval to engage the public and professionals, the 18 month long pause in care.data involved a number of ‘listening’ events. I attended several of these to hear what people were saying about the use of personal health data. The three biggest areas of concern raised frequently [4] were:

  • Commercial companies’ use and re-use of data
  • Lack of transparency and control over who was accessing data for what secondary purposes, and
  • Potential resulting harms: from data inaccuracy, loss of trust and confidentiality, and fear of discrimination.

It’s not the use of data per se that the majority of the public objects to. Indeed, many people would object if health data were not used for research in the public interest. Objections were more about how this has been approached in the past and how it will be approached in the future.

There is a common understanding of what bona fide research is, how it serves the public interest, and polls confirm a widespread acceptance of ‘reasonable’ research use of data. The HSCIC audit under Sir Nick Partridge [5] acknowledged that some past uses and sharing of raw data had not always met public expectations of what was ‘reasonable’. The new secure facility should provide a safe setting for managing this better, but open questions remain on governance and transparency.

As one question from a listening event succinctly put it [6]:

‘Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.’

Using the information gleaned from data was often seen as exploitation when used in segmenting the insurance markets, consumer market research or individual targeting. There is also concern, even outright hostility, about raw health data being directly sold, re-used or exchanged as a commodity – regardless of whether this is packaged as ‘for profit’ or ‘covering administrative costs’.

Add to that, the inability to consent to, control or find out who uses individual level data and for what purpose, or to delete mistakes, and there is a widespread sense of disempowerment and loss of trust.

Quantifying the public perception of care.data’s value

While the pause was intended to explain the benefits of the care.data extraction, it seemed clear at meetings that people already understood the potential benefits. There is clear public benefit to be gained, for example, from using data as a knowledge base, often by linking with other data to broaden scientific and social insights, generating public good.

What people were asking was: what new knowledge would be gained that isn’t gathered from non-identifiable data already? Perhaps more tangible, yet less discussed at care.data events, are the economic benefits for commissioning: using data as business intelligence to inform decisions in financial planning and cost cutting.

There might be measurable economic public good from data, from outside interests who will make a profit by using data to create analytic tools. Some may even sell information back into the NHS as business insights.

Care.data is also to be an ‘accelerator’ for other projects [7]. But it is hard to find publicly available evidence to a) support the economic arguments for using primary care data in any future projects, and b) be able to compare them with the broader current and future needs of the NHS.

A useful analysis could find that potential personal benefits and the public good overlap, if the care.data business case were to be made available by NHS England in the public domain. In a time when the NHS budget is rarely out of the media it seems a no-brainer that this should be made open.

Feedback consistently shows that making money from data raises more concern over its uses. Who all future users might be remains open as the Care Act 2014 clause is broadly defined. Jamie Reed MP said in the debate [8]: ‘the new clause provides for entirely elastic definitions that, in practice, will have a limitless application.’

Unexpected uses and users of public data have created many of its historical problems. But has the potential future cost of ‘limitless’ applications been considered in the long-term public interest? And what of the confidentiality costs [9]? The NHS’s own privacy impact assessment on care.data says [10]:

‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

Who has quantified the cost of that loss of confidence, and have public and professional opinions been accounted for in any cost/benefit calculations? All these tangible and intangible factors should be measured when calculating its value in the public interest, and we should ask: ‘what does the public want?’ It is, after all, our data and our NHS.

Understanding shifting public expectations

‘The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.’ – David Carr, policy adviser at the Wellcome Trust [11]

To rebuild trust in data sharing, individuals need the imbalance of power corrected, so they can control ‘their data’. Before care.data, the public was mostly unaware that health records were being used for secondary purposes by third parties. In February 2014, the secretary of state stepped in to confirm an opt-out would be offered, as promised by the prime minister in his 2010 ‘every patient a willing research patient’ speech.

So leaving aside the arguments for and against opt-in versus opt-out (and that for now it is not technically possible to apply the 700,000 opt-outs already made) the trouble is, it’s all or nothing. By not offering any differentiation between purposes, the public may feel forced to opt-out of secondary data sharing, denying all access to all their data even if they want to permit some uses and not others.

Defining and differentiating secondary uses and types of ‘research purposes’ could be key to rebuilding trust. The HSCIC can disseminate information ‘for the purposes of the provision of health care or adult social care, or the promotion of health’. This does not exclude commercial use. Cutting away commercial purposes which appear exploitative from purposes in the public interest could benefit the government, commerce and science if, as a result, more people would be willing to share their data.

This choice is what the public has asked for at care.data events, at other research events [12] and in polls, but to date has yet to see any move towards it. I feel strongly that the government cannot continue to ignore public opinion and assume its subjects are creators of data, willing to be exploited, without expecting further backlash. Should a citizen’s privacy become a commodity with a price tag, if it is a basic human right?

One way to protect that right is to require an active opt-in to sharing. With ongoing renegotiation of public rights and data privacy at EU level, consent is no longer just a question best left ignored in the Pandora’s box of ethics, as it has been for the last 25 years of secondary use of hospital data. [13]

The public has a growing awareness, differing expectations, and different degrees of trust around data use by different users. Policy makers who ignore these expectations risk continuing to build on a shaky foundation and jeopardising the future data sharing infrastructure. Profiting at the expense of public feeling and ethical good practice is an unsustainable status quo.

Investing in the public interest for future growth

The care.data pause has revealed differences between the thinking of government, the drivers of policy, the research community, ethics panels and the citizens of the country. This is not only about what value we place on our own data, but how we value it as a public good.

Projects that ignore the public voice, that ‘listen’ but do not act, risk their own success and by implication that of others. And with it they risk the public good they should create. A state which allows profit for private companies to harm the perception of good research practice sacrifices the long term public interest for short term gain. I go back to the words of Mark Taylor [3]:

‘The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.’ 

The economic value of data, personal rights and the public interest are not opposed to one another, but have synergies and a co-dependency. The public voice from care.data listening could positively help shape a developing consensual model of data sharing if the broader lessons learned are built upon in an ongoing public dialogue. As Mark Taylor also said, ‘we need to do this better.’

*******

[1] According to various polls and opinions gathered from my own discussions and attendance at care.data events in 2014 [refs: 2, 4, 6, 12]

[2] The data trust deficit, work by the Royal Statistical Society in 2014

[3] M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1 http://script-ed.org/?p=1377

[4] Communications and Change – blogpost https://jenpersson.com/care-data-communications-change/

[5] HSCIC audit under Sir Nick Partridge https://www.gov.uk/government/publications/review-of-data-releases-made-by-the-nhs-information-centre

[6] Listening events, NHS Open Day blogpost https://jenpersson.com/care-data-communications-core-concepts-part-two/

[7] Accelerator for projects mentioned include the 100K Genomics programme https://www.youtube.com/watch?v=s8HCbXsC4z8

[8] Hansard http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140311/debtext/140311-0002.htm

[9] Confidentiality Costs; StatsLife http://www.statslife.org.uk/opinion/1723-confidentiality-costs

[10] care.data privacy impact assessment Jan 2014 [newer version has not been publicly released] http://www.england.nhs.uk/wp-content/uploads/2014/01/pia-care-data.pdf

[11] Wellcome Trust http://blog.wellcome.ac.uk/2015/04/08/sharing-research-data-to-improve-public-health/

[12]  Dialogue on Data – Exploring the public’s views on using linked administrative data for research purposes: https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx

[13] HSCIC Lessons Learned http://www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

The views expressed in this article originally published in the Opinion section of StatsLife are solely mine, the original author. These views and opinions do not necessarily represent those of The Royal Statistical Society.

The National Pupil Database end of year report: an F in Fair Processing

National Pupil Database? What National Pupil Database? Why am I on it?

At the start of the school year in September 2014, I got the usual A4 pieces of paper: each of my children’s personal details, our home address and contact details, tick boxes for the method of transport each used to get to school, the types of school meal eaten, all listed, with a privacy statement at the bottom:

“Data Protection Act 1988: The school is registered under the Data Protection Act for holding personal data. The school has a duty to protect this information and to keep it up to date. The school is required to share some of the data with the Local Authority and with the DfE.”

There was no mention of the DfE sharing it onwards with anyone else. But they do, through the National Pupil Database [NPD], and it is enormous [1]. It’s a database which holds personal information on every child who has been in state education since 2002, and some data since 1996. [That includes me as both a student AND a parent.]

“Never heard of it?”

Well, neither had I from my school, which is what I pointed out to the DfE in September 2014.

School heads, governors, and every parent I have spoken with in my area and beyond are totally unaware of the National Pupil Database. All are surprised. Some are horrified at the extent of data sharing at such an identifiable and sensitive level, without school and parental knowledge.[2]

Here’s a list of what it holds: fully identifiable data at unique, individual level, tiered from 1-4, where 1 is the most sensitive. A full list of what data is available in each of the tiers and standard extracts can be found in the ‘NPD data tables’.


I’d like to think the data has not been deliberately hidden from schools and parents. I hope the DfE has simply been careless about its communications.

Imagine that data gathered only for administration since 1996 was then decided about at central level, and they forgot to tell the people whom they should have been asking. The data controllers and the subjects the data came from – the schools, parents/guardians and pupils – were forgotten. That can happen when you see data as a commodity and not as people’s personal histories.

The UK appears to have gathered admin data for years until the coalition decided it was an asset it could further exploit. The DfE may have told others in 2002 and in 2012 when it shaped policy on how the NPD would be used, but it forgot to tell the children whose information it is and used them without asking. In my book, that’s an abuse of power and misuse of data.

It seems to me that current data policies in practice across all areas of government have simply drifted at national level towards ever greater access by commercial users.

And although that stinks, it has perhaps arisen from lack of public transparency and appropriate oversight, rather than some nefarious intent.

Knowingly failing to inform schools, pupils and guardians how the most basic of our personal data are used is outdated and out of touch with public feeling. Not to mention that it fails fair processing under data protection law.

Subject Access Request – User experience gets an ‘F’ for failing

The submission of the school census, including a set of named pupil records, is a statutory requirement on schools.

This means that children’s and parents’ data, regardless of how well or poorly informed they may be, are extracted for administrative purposes, and are used, in addition to the purposes we would expect, for various secondary reasons.

Unless the Department for Education makes schools aware of the National Pupil Database use and users, the Department fails to provide an adequate process to enable schools to meet their local data protection requirements. If schools don’t know, they can’t process data properly.

So I wrote to the Department for Education (DfE) in September 2014, including the privacy notice used in schools like ours, showing it fails to inform parents how our children’s personal data and data about us (as related parent/guardians) are stored and onwardly used by the National Pupil Database (NPD). And I asked three questions:

1. I would like to know what information is the minimum you require for an individual child from primary schools in England?

2. Is there an opt out to prevent this sharing and if so, under what process can parents register this?

3. Is there a mechanism for parents to restrict the uses of the data (i.e. opt out our family data) with third parties who get data from the National Pupil Database?

I got back some general information, but no answer to my three questions.

What data do you hold and share with third parties about my children?

In April 2015 I decided to find out exactly what data they held, so I made a subject access request [SAR], expecting to see the data they held about my children. They directed me to ask my children’s school instead and to ask for their educational record. The difficulty with that is, it’s a different dataset.

My school is not the data controller of the National Pupil Database. I am not asking for a copy of my children’s educational records held by the school, but for the information the NPD holds about me and my children. One set of data may feed the other, but they are separately managed. The department, as data controller for the data the NPD holds, has, I believe, the data controller’s responsibility for it – not the school my children attend.

Why do I care? Well for starters, I want to know if the data are accurate.  And I want to know who else has access to it and for what purposes – school can’t tell me that. They certainly couldn’t two months ago, as they had no idea the NPD existed.

I went on to ask the DfE for a copy of the publicly accessible subject access request (SAR) policy and procedures, aware that I was asking on behalf of my children. I couldn’t find any guidance, so asked for the SAR policy. They helpfully provided some advice, but I was then told:

“The department does not have a publicly accessible standard SAR policy and procedures document.”  and “there is not an expectation that NPD data be made available for release in response to a SAR.”

It seems policies are inconsistent. For this other DfE project, there is information about the database, how participants can opt out, and about respecting your choice. On the DfE website, a Personal Information Charter sets out “what you can expect when we ask for and hold your personal information.”

It says: “Under the terms of the Data Protection Act 1998, you’re entitled to ask us:

  • if we’re processing your personal data
  • to give you a description of the data we hold about you, the reasons why we’re holding it and any recipient we may disclose it to (eg Ofsted)
  • for a copy of your personal data and any details of its source

You’re also entitled to ask us to change the information we hold about you, if it is wrong.

To ask to see your personal data (make a ‘subject access request’), or to ask for clarification about our processing of your personal data, contact us via the question option on our contact form and select ‘other’.”

So I did. But it seems that while this applies to that project, subject access requests are not to apply to the data they hold in the NPD. And they finally rejected my request last week, stating it is exempt:

[image: the SAR rejection response]

I appealed the decision on the basis that the section 33 Data Protection Act criteria given are not met:

“the data subject was made fully aware of the use(s) of their personal data (in the form of a privacy notice)”

But it remains rejected.

It seems incomprehensible that third parties can access my children’s data and I can’t even check to see if it is correct.

While acknowledging that under section 7 of the Data Protection Act 1998 (DPA) “an individual has the right to ask an organisation to provide them with information they hold which identifies them and, in certain circumstances, a parent can make such a request on behalf of a child”, they refused, citing the research, history and statistics exemption (i.e. section 33(4) of the DPA).

Fair processing, another F for failure and F for attitude

The Department for Education’s response to me said that it “makes it clear what information is held, why it is held, the uses made of it by DfE and its partners and publishes a statement on its website setting this out. Schools also inform parents and pupils of how the data is used through privacy notices.”

I have told the DfE the process does not work. The DfE/NPD web instructions do not reach parents. Even if they did, the information is thoroughly inadequate and hides, whether deliberately or by omission, the commercial third-party use of data.

The Department for Education made a web update on 03/07/2015 with privacy information to be made available to parents by schools: http://t.co/PwjN1cwe6r

Despite this update this year, it is inadequate on two counts: content and communication.

To claim as they did in response to me that: “The Department makes it clear to children and their parents what information is held about pupils and how it is processed, through a statement on its website,” lacks any logic.

Updating their national web page doesn’t create a thorough communications process or engage anyone who does not know about it to start with.

Secondly, the new privacy policy is inadequate in its content and utterly confusing. What does this statement mean? Is there now some sort of opt-out on offer? I doubt it, but it is unclear:

“A parent/guardian can ask that no information apart from their child’s name, address and date of birth be passed to [insert name of local authority or the provider of Youth Support Services in your area] by informing [insert name of school administrator]. This right is transferred to the child once he/she reaches the age 16. For more information about services for young people, please go to our local authority website [insert link].” [updated privacy statement, July 3, 2015]

Information that I don’t know exists, about a database I don’t know exists, that my school does not know exists: and they believe a statement on the DfE’s own website meets fair processing?

Appropriate at this time of year,  I have to ask, “you cannot be serious?”

Fair processing means transparently sharing the purpose or purposes for which you intend to process the information, not hiding some of the users through careful wording.

It thereby fails to meet the first data protection principle, as parents are not informed at all, never mind fully, of further secondary uses.

As a parent, when I register my child for school, I of course expect that some personal details must be captured to administer their education.

There must be data shared to adequately administer, best serve, understand, and sometimes protect our children.  And bona fide research is in the public interest.

However I have been surprised in the last year to find that firstly, I can’t ask what is stored on my own children and that secondly, a wide range of sensitive data are shared through the Department of Education with third parties.

Some of these potential third parties don’t meet research criteria in my understanding of what a ‘researcher’ should be. Journalists? The MOD?

To improve, little additional time or work would be required to provide proper fair processing as a starting point. But to do so, the department can’t just update a policy on its website and think it adequate. And the newly updated suggested text for pupils is only going to add confusion.

The privacy policy text needs to be carefully reworded in human, not civil service, speak.

It must not omit [as it does now] the full range of potential users.

After all the Data Protection principles state that: “If you wish to use or disclose personal data for a purpose that was not contemplated at the time of collection (and therefore not specified in a privacy notice), you have to consider whether this will be fair.”

Now that it must be obvious to DfE that it is not the best way to carry on, why would they choose NOT to do better? Our children deserve better.

What would better look like? See part 3. The National Pupil Database end of year report: a D in transparency, C minus in security.

*****

[PS: I believe the Freedom of Information Officer tried their best and was professional and polite in our email exchanges, B+. Can’t award an A as I didn’t get any information from my requests. Thank you to them for their effort.]

*****

Updated on Sunday 19th July to include the criteria of my SAR rejection.

1. Our children’s school data: an end of year report card
2. The National Pupil Database end of year report: an F in fair processing
3. The National Pupil Database end of year report: a D in transparency, C minus in security

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under section 537A of the Education Act 1996.

[3] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[4] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[5] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[6] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[7] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

The National Pupil Database end of year report: D for transparency, C minus in security.

Transparency and oversight of how things are administered are simple ways that the public can both understand and trust that things run as we expect.

For the National Pupil Database, parents might be surprised, as I was, by some of the current practices.

The scope of use of, and who could access, the National Pupil Database was changed in 2012. Although I had three children at school at that time, I heard nothing about it, nor did I read about it in the papers. (Hah – time to read the papers?) So I absolutely agree with Owen Boswara’s post when he wrote:

“There appears to have been no concerted effort to bring the consultation or the NPD initiative to the attention of parents or pupils (i.e. the data subjects themselves). This is a quote from one of the parents who did respond:

“I am shocked and appalled that I wasn’t notified about this consultation through my child’s school – I read about it on Twitter of all things. A letter should have gone to every single parent explaining the proposals and how to respond to this consultation.”

(Now imagine that sentiment amplified via Mumsnet …)”
[July 2013, blog by O. Boswara]

As Owen wrote,  imagine that sentiment amplified via Mumsnet indeed.

Here’s where third parties can apply, and here’s a list of who has been given data from the National Pupil Database. (It’s only been updated twice in 18 months, and the most recent update came after I asked about it.) The tier groups 1-4 are explained here on p.18, where 1 is the most sensitive, identifiable classification.

The consultation suggested in 2012 that the changes could be an “effective engine of economic growth, social wellbeing, political accountability and public service improvement”.

Has anyone measured whether the justification given has begun to be achieved? Often research takes a long time, and implementing any changes as a result takes longer still. But perhaps some public benefit has already begun to accrue?

The release panel would, one hopes, have begun to track this. [update: the DfE confirmed on August 20th that they do not track benefits, nor have they ever audited recipients]

And in parallel, what oversight provides the checks and balances to make sure that the drive for the ‘engine of economic growth’ remembers to treat these data as knowledge about our children?

Is there that level of oversight from application to benefits measurement?

Is there adequate assessment of privacy impact and ethics in applications?

What troubles me about the National Pupil Database is not the data it contains per se, but the lack of child/guardian involvement, the lack of accountable oversight of how it is managed, and the lack of full transparency around who uses it and its processes.

Some practical steps forward

Steps taken now could resolve some of these issues and avoid the risk of them becoming future concerns.

The first being thorough fair processing, as I covered in my previous post.

The submission of the school census returns, including a set of named pupil records, has been a statutory requirement on schools since the Education Act 1996. That’s almost twenty years ago in the pre-mainstream internet age.

The Department must now shape up its current governance practices in its capacity as the data processor and controller of the National Pupil Database, to be fit for the 21st century.

Ignoring current weaknesses actively accepts an ever-increasing reputational risk for the Department, schools, other data sharing bodies, those who link to the data, and its bona fide research users. If people lose trust in data uses, they won’t share at all and the quality of data will suffer: bad for the functional administration of the state and for the individual, but also for the public good.

That concerns me also wearing my hat as a lay member on the ADRN panel, because it’s important that the public trusts that our data is looked after wisely, so that research can continue to use it for advances in health, social science and all sorts of areas of knowledge, to improve our understanding of society and make it better.

Who decides who gets my kids’ data, even if I can’t?

A Data Management Advisory Panel (DMAP) considers only some of the applications: tier 1 data requests. Those are the most sensitive, but not the only, applications for access to sensitive data.

“When you make a request for NPD data it will be considered for approval by the Education Data Division (EDD) with the exception of tier 1 data requests, which will be assessed by the department’s Data Management Advisory Panel. The EDD will inform you of the outcome of the decision.”

Where is governance transparency?

What is the make-up of the Data Management Advisory Panel and the Education Data Division (EDD)? Who sits on them, and how are they selected? Do they document their conflicts of interest for each application? For how long are they appointed, and under what selection criteria?

Where is decision outcome transparency?

The outcome of each decision should be documented and published. However, the list has been updated only twice since its inception in 2012: once in December 2013, and most recently, ahem, on 18 May 2015, after considerable prodding. There should be a regular timetable, with a responsible owner and a depth of insight into its decision making.

Where is transparency over decision making to approve or reject requests?

Do privacy impact assessments and ethics reviews play any role in the application process, and if so, how are they assessed and by whom?

How are those sensitive and confidential data stored and governed?

The weakest link in any system is often said to be human error. Users of NPD data range from other government departments to small "Mom and Pop" home businesses selling schools' business intelligence and benchmarking.

So how secure are our children's data really, and once they have left the Department's database, how are they treated? Does lots of form-filling, and data emailed with a personal password, ensure good practice, or simply provide barriers that slow down the legitimate applications process?

What happens to data that are no longer required for the given project? Are they properly deleted and what audits have ever been carried out to ensure that?

The National Pupil Database end of year report: a C- in security

The volume of data that can be processed now at speed is incomparable with 1996, and even 2012 when the current processes were set up. The opportunities and risks in cyber security have also moved on.

Surely the Department for Education should take its responsibility seriously and treat our children's personal data and sensitive records at least as well as the HSCIC now intends to manage health data?

Processing administrative or linked data in an environment with layered physical security (e.g. a secure perimeter, CCTV, security guarding or a locked room without remote connection such as internet access) is good practice. It also reduces the risk of silly human error, or simple theft.

Is giving out chunks of raw data by email, with reams of paperwork as its approval ‘safeguards’ really fit for the 21st century and beyond?


Twenty years on from the conception of the National Pupil Database, it is time to treat the personal data of our future adult citizens with the respect it deserves and we expect of best-in-class data management.

It should be as safe and secure as we treat other sensitive government data, and lessons could be learned from the FARR, ADRN and HSCIC safe settings.

Back to school – more securely, with public understanding and transparency

Understanding how this all works, how technology and people, data sharing and privacy, data security and trust all tie together, is fundamental to understanding the internet. When administrations take our data, they take on responsibility for part of our participation in the dot.everyone world the state is so keen for us all to join. Many of our kids will live in a world that is the internet of things. Not to get that is not to understand the internet.

And to reiterate some of why that matters, I go back to my previous post, in which I quoted Martha Lane Fox and the late Aaron Swartz, who said: "It's not OK to not understand the internet, anymore."

While the Department for Education has turned down my subject access request to find out what the National Pupil Database stores on my own children, these issues matter too much to brush aside as important only for me. About 700,000 children are born each year and will be added to this database every academic year. None ever get deleted.

Parents can, and must ask that it is delivered to the highest standards of fair processing, transparency, oversight and security. I’m certainly going to.

It’s going to be Back to School in September, and those annual privacy notices, all too soon.

*****

1. The National Pupil Database end of year report card

2. The National Pupil Database end of year report: an F in fair processing

3. The National Pupil Database end of year report: a D in transparency

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[3] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[5] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[6] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

What is in the database?

The Schools Census dataset adds approximately eight million records every year (starting in 1996) and includes variables on the pupil's home postcode, gender, age, ethnicity, special educational needs (SEN), free school meals eligibility, and schooling history. It covers pupils in state-funded primary, secondary, nursery and special schools, and pupil referral units. Schools that are entirely privately funded are not included.

Pupils can be tracked across schools and followed throughout their school careers. The database also provides a very rich set of data on school characteristics, and there is further value in linking it to related datasets, such as those on higher education, neighbourhoods and teachers in schools.

Data stored include the full range of personal and sensitive data, from name, date of birth and address through SEN and disability needs. (Detail of content is here.) To see what is in it, download the Excel sheet: NPD Requests.

 

The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under section 537A of the Education Act 1996.

Our children’s school data: an end-of-year report card

To quote the late Aaron Swartz: “It’s not OK to not understand the internet, anymore.”

Parents and guardians are trying their best. We leave work early and hurry to attend meetings on internet safety. We get told how vital it is that children not give away their name, age or address to strangers on social media. We read the magazines that come home in book bags, warning against sharing identity details with other players in interactive games. We may sign school policies to opt out of permission for sharing photos from school performances on the school website.

And yet most guardians appear unaware that our children's confidential, sensitive and basic personal data are being handed out to third parties by the Department for Education, without our knowledge or clear and accessible public accountability.

Data are extracted by the Department for Education [DfE] from schools, stored in a National Pupil Database [NPD], and onwardly shared.

Fine you may say. That makes sense, it’s the Department for Education.

But did you expect that the Ministry of Defence or schools comparison websites may request, and be given, access to our children's individual records: the data [detailed in the 'NPD data tables'] that we provide to schools for routine administration?

School heads, governors, and every parent I have spoken with in my area are totally unaware that data extracted by the Department for Education are used in this way.

All are surprised.

Some are shocked at the extent of data sharing at such an identifiable and sensitive level, without school and parental knowledge.

The DfE manages the NPD and holds responsibility for ensuring we know all about it. But it is not ensuring that pupils and parents are told, before the data extraction, who else gets access to the data and for what purposes. That fails to process data fairly, which is a requirement for making its use lawful.

There is no way to opt out, no way to check the data's accuracy, and no recourse for anything incorrect.

As our lives involve the ever more advanced connectivity of devices, systems, and services, that’s simply not good enough. It’s not a system fit for the 21st century or our children’s digital future.

While the majority of requestors seem to access data for bona fide research in the public interest, some use it for benchmarking, and others are commercial users.

Is that what pupils and parents expect their data are used for?

And what happens in future when, not if, the Department chooses to change who uses it and why?

How will we know about that? Because it has done so already.

When the school census first began, it extracted no names. That changed: every pupil's name is now recorded, along with a growing range of information.

Where it began with schools, it has now been extended to nursery schools, childminders, private nurseries and playgroups.

Where it was once used only for state administrative purposes, since 2012 it has been given to third parties.

What’s next?

Data should be used in the public interest and must be shared to adequately administer, best serve, understand, and sometimes protect our children.

I want to see our children's use of technology, and the data they create in schools, used well in research that will enable inclusive, measurable benefits in education and wellbeing.

However this can only be done with proper application of law, future-proofed security, and respectful recognition of public opinion.

The next academic year must bring these systems into the 21st century to safeguard both our children and the benefits that using data wisely can bring.

Out of sight, out of date, out of touch?

The data sharing is made possible through a so-called ‘legal gateway’, law that gives permission to the Secretary of State for Education to require data from schools.

In this case, it is founded on legislation almost twenty years old.

Law founded in the 1996 Education Act, together with later regulations changed in 2009, gives information-sharing powers to the Secretary of State and to public bodies through legislation that pre-dates wide use of the internet, social media, and today's machine learning and computer processing power.

Current law and policies have not kept pace with modern technology. 2015 is a world away even from 2009, never mind 1996, when Pluto was still a planet.

Our children's data are valuable, and give insights into society that researchers should of course use to learn from and to make policy recommendations. That has widespread public support, in the public interest. But it has to be done in an appropriate and secure way, and as soon as it is for commercial use, there are more concerns and questions to ask.

As an example of why the NPD does not do this as I feel it should: the data are still given away to users in their own offices, rather than properly and securely accessed in a safe setting, as bona fide accredited researchers do at the Office for National Statistics.

This not only leaves our children's personal data vulnerable to cybersecurity threats; it actively invites greater exposure to human error.

Remember those HMRC child benefit discs lost in the post with personal and bank data of 25 million individuals?

That is harder to do when sensitive data are accessed only in a safe setting, where you can walk out with your research results but not the raw files.

When biometrics data are already widely used in schools and are quite literally, our children’s passport to the world, poor data management approaches from government in health and education are simply not good enough anymore.

It’s not OK anymore.

Our children's personal data are too valuable to lose control of: their digital footprint will become not an add-on, but integral to everything they do in future.

Guardians do their best to bring up children as digitally responsible citizens and that must be supported, not undermined by state practices.

Children will see the divide between online and ‘real’-life activities blend ever more seamlessly.

We cannot predict how their digital identity will become used in their adult lives.

If people do not know who holds data about them, how can we be sure the data are used properly and only for the right reasons, or repair the damage when they are not?

People decide to withhold their identities or data online if they do not trust how the data will be used, and by whom.

Research, reports and decision making are flawed if data quality is poor. That is not in the public interest.

The government must at least take responsibility for current policies to ensure our children’s rights are met in practice.

People who say data privacy does not matter seem to lack any vision of its value.

Did you think that a social media site would ever try to control its users emotions and influence their decision-making based on the data they entered or read? It just did.

Did you foresee five years ago that a fingerprint could unlock your phone? It just did.

Did you believe five months ago that the same fingerprint-unlocked phone would become an accepted payment card in England? It just did.

There is often a correlation between verification of identity and payment.

Fingerprinting for payment and library management has become common in UK schools and many parents do not know that parental consent is a legal requirement.

In reality, it’s not always enacted by schools.

Guardians can find non-participation is discouraged and worry their child will be stigmatised as the exception.

Yet no one would seriously consider asking guardians to give canteens their bank card PIN.

The many points at which data about our children are created and shared mean that parents often cannot know who knows what about them.

What will that mean for them as adults, when much of their lives will be digital?

What free choice remains for people who want to be cautious with their digital identities? 

Many systems increasingly require registration, some including biometric data, sometimes from vulnerable people, and the service on offer is otherwise denied.

Is that OK anymore? Or is denial-of-service a form of coercion?

The current model of state data sharing often totally ignores that the children and young people whose personal data are held in these systems are not asked, informed or consulted about changes.

While Ministers talk about wanting our children to become digital leaders of tomorrow, policies of today promote future adults ill-educated in their own internet safety and personal data sharing practices.

But it’s not OK not to understand the internet anymore.

Where is the voice of our young people talking about who shares their information, how it is used online, and why?

When shall we stop to ask collectively, how personal is too personal?

Is analysing the exact onscreen eye movement of a child appropriate or invasive?

These deeply personal uses of our young people’s information raise ethical questions about others’ influence over their decision making.

Where do we draw the line?

Where will we say, it’s not OK anymore?

Do we trust that all uses are for bona fide reasons and not ask to find out why?

The use of our children's data across a range of practices in education seems a free-for-all to exploit commercially, with too little oversight and no visibility of decision-making processes for the public, whose personal data others profit from.

Who has oversight for the ethical use of listening software tools in classrooms, especially if used to support government initiatives like Channel in ‘Prevent’?

What corrective action is taken if our children’s data are exposed through software brought into school over which parents have no control?

The policies and tools used to manage our children’s data in and outside schools seem often out of step with current best-in-class data protection and security practices.

Pupils and parents find it hard to track who has their personal data and why.

While the Department for Education says what it expects of others, it appears less committed to meeting its own responsibilities: “We have been clear that schools are expected to ensure that sensitive pupil information is held securely. The Data Protection Act of 1998 is clear what standards schools are expected to adhere to and we provide guidance on this.” 

A post on a webpage is hardly guidance fit to future proof the data and digital identities of a whole generation.

I believe we should encourage greater use of this administrative data for bona fide research. Promoting broader use of aggregated and open data could also be beneficial. For both to happen, key things must change to make researchers less risk-averse in using the data, and to put the data at reduced risk of accidental or deliberate misuse by other third parties. Parents and pupils could then become more confident that their data are used for all the right reasons.

The frameworks of fair processing, physical data security, transparent governance and publicly accountable oversight need to be redesigned and strengthened.

Not only for data collection, but its central management, especially on a scale as large as the National Pupil Database.

“It’s not OK not to understand the internet anymore.”

In fact, it never was.

The next academic year must bring these systems into the 21st century to safeguard both our children and the benefits that using data wisely can bring.

The Department for Education “must try harder” and must start now.

********

If you have questions or concerns about the National Pupil Database or your own experience, or your child’s data used in schools, please feel free to get in touch, and let’s see if we can make this better. [Email me as listed above right.]

1. An overview: an end of year report on our Children’s School Records
2. The National Pupil Database end of year report: an F in fair processing
3. The National Pupil Database end of year report: a D in transparency, C- in security

********

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[3] The table to show who has applied for and received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[5] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[6] On 1 September 2013 sections 26 and 27 of the Protection of Freedoms Act 2012 came into force, requiring schools to seek parental consent before collecting biometric data, such as fingerprints.

The nhs.uk digital platform: a personalised gateway to a new NHS?

In recent weeks, rebranding the poverty definitions and the living wage in the UK deservedly received more attention than the rebrand of the website NHS Choices into 'nhs.uk'.

The site, which will be available only in England and Wales despite its domain name, will be the doorway to a personalised digital NHS offering.

As the plans proceed without public debate, I took some time to consider the proposal announced through the National Information Board (NIB), because it may be a gateway to a whole new world in our future NHS. And if not, will it be a big splash of cash that creates nothing more than a storm in a teacup?

In my previous post I’d addressed some barriers to digital access. Will this be another? What will it offer that isn’t on offer already today and how will the nhs.uk platform avoid the problems of its predecessor HealthSpace?

Everyone it seems is agreed, the coming cuts are going to be ruthless. So, like Alice, I’m curious. What is down the rabbit hole ahead?

What’s the move from NHS Choices to nhs.uk about?

The new web platform nhs.uk would invite users to log on using a system that requires identity verification. If compulsory, that would be another barrier to access simply from a convenience point of view, even leaving digital security risks aside.

What will nhs.uk offer to incentivise users, and what benefit will it offer as a trade-off against these risks, to make them go down the new path into the unknown and like it?

“At the heart of the domain, will be the development of nhs.uk into a new integrated health and care digital platform that will be a source of access to information, directorate, national services and locally accredited applications.”

In that, there is nothing new compared with the information, top-down governance and signposting done by NHS Choices today.

What else?

“Nhs.uk will also become the citizen’s gateway to the creation of their own personal health record, drawing on information from the electronic health records in primary and secondary care.”

nhs.uk will be an access point to patient personal confidential records

Today's Patient Online programme, we are told, offers 97% of patients access to their own GP-created records. So what will nhs.uk offer beyond what is supposed to be on offer already? Adding wearables data to the health record is already possible for some EMIS users, so that will not be new either. The plan does state it will draw on both primary and secondary care records, which means achieving some sort of interoperability to show both hospital systems' data and GP records. How will the platform do this?

Until care.data, many people did not know their hospital record was stored anywhere outside the hospital. In all the care.data debates, the public was told that HES/SUS was not a record in the sense we normally think of one. So which system will secondary care records come from? [Some places may have far to go: my local hospital pushes patients round with beige paper folders.] The answer appears to be either an unpublished known, or an unknown.

What else?

nhs.uk will be an access point to tailored ‘signposting’ of services

In addition to access to your personal medical records, in the new "pull not push" approach the nhs.uk platform will also offer information and services, in effect 'advertising' local services, to draw users to want to use it rather than forcing its use. And through the power of web-tracking tools combined with log-in, it can all be 'tailored' or 'targeted' to you, the user.

“Creating an account will let you save information, receive emails on your chosen topics and health goals and comment on our content.”

Do you want to receive emails on your chosen topics, or to comment on content today? How does this offer more than can already be had by signing up now to NHS Choices?

NHS Choices today already offers information on local services and care provision, and a symptom checker.

What else?

Future nhs.uk users will be able to “Find, Book, Apply, Pay, Order, Register, Report and Access,” according to the NIB platform headers.


“Convenient digital transactions will be offered like ordering and paying for prescriptions, registering with GPs, claiming funds for treatment abroad, registering as an organ and blood donor and reporting the side effects of drugs. This new transactional focus will complement nhs.uk’s existing role as the authoritative source of condition and treatment information, NHS services and health and care quality information.

“This will enable citizens to communicate with clinicians and practices via email, secure video links and fill out pre-consultation questionnaires. They will also be able to include data from their personal applications and wearable devices in their personal record. Personal health records will be able to be linked with care accounts to help people manage their personal budget.”

Let’s consider those future offerings more carefully.

Separating out the transactions that for most people will be one-off, extremely rare or never events (my blue) leaves the other activities, which you can already do, or will do, via the Patient Online programme (in purple).

The question is: although video and email consultations are not yet widespread, where they do work today, and would in future, would they not be done via a GP practice system rather than a centralised service? Or is the plan not that you could have an online consultation with 'your' named GP through nhs.uk, but perhaps with just 'any' GP from a centrally provided pool? Something like this?

That leaves two other things, which are both payment tools (my bold).

i. digital transactions will be offered like ordering and paying for prescriptions
ii. …linked with care accounts to help people manage their personal budget.”

Is the core of the new offering about managing money at individual and central level?

Beverly Bryant, Director of Strategic Systems and Technology at NHS England, said at the #kfdigi2015 event on 16 June that implementing these conveniences had cost-saving benefits as well: "The driver is customer service, but when you do it it actually costs less."

How are GP consultations to cost less, significantly less, once the central platform that enables them is paid for, when GP time is the most valuable part and remains unchanged: time spent on the patient consultation, paperwork and referrals, for example?

That most valuable part to the patient, may be seen as what is most costly to ‘the system’.

If the emphasis is on the service saving money, it is not clear what is in it for people to want to use it, and it risks becoming another HealthSpace: a high-cost, top-down IT rollout without a clear customer-driven need.

The stated aim is that it will personalise the user content and experience.

That gives the impression that the person using the system will get access to information and benefits unique and relevant to them.

If this is to be something patients want to use (pull), and are not to be forced to use (push), I wonder what is really at its core: what is in it for them that is truly new, and not part of the existing NHS Choices and Patient Online offering?

What kind of personalised tailoring do today’s NHS Choices Ts&Cs sign users up to?

“Any information provided, or any information the NHS.uk site may infer from it, are used to provide content and information to your account pages or, if you choose to, by email.  Users may also be invited to take part in surveys if signed up for emails.

“You will have an option to submit personal information, including postcode, age, date of birth, phone number, email address, mobile phone number. In addition you may submit information about your diet and lifestyle, including drinking or exercise habits.”

“Additionally, you may submit health information, including your height and weight, or declare your interest in one or more health goals, conditions or treatments. “

“With your permission, academic institutions may occasionally use our data in relevant studies. In these instances, we shall inform you in advance and you will have the choice to opt out of the study. The information that is used will be made anonymous and will be confidential.”

Today’s NHS Choices terms and conditions say that “we shall inform you in advance and you will have the choice to opt out of the study.”

If that happens already, and the NHS is honest in its intent to give patients the right to opt out of studies using data gathered from registered users of NHS Choices, why is it failing to honour the 700,000 objections to secondary uses of personal data via the HSCIC?

If the future system is all about personal choice, the NIB should perhaps start by enforcing the choices the public has already made.

Past lessons learned – platforms and HealthSpace

In the past, the NHS's previous personal platform, HealthSpace, came in for some fairly straightforward criticism, including that it offered too little functionality.

The remarks in 'The Devil's in the Detail' are as relevant today on what users want as they were in 2010. The report looked at the then-available Summary Care Record (prescriptions, allergies and reactions) and the web platform HealthSpace, which tried to create a way for users to access it.

Past questions from HealthSpace remain unanswered for today's care.data, and indeed for the future nhs.uk data. What happens if there is a mistake in the record and the patient wants it deleted? How will access be given to third-party carers acting on behalf of individuals who lack the capacity to consent to access to their records?

Reasons given by non-users of HealthSpace included lack of interest in managing their health in this way, a perception that health information was the realm of health professionals and lack of interest or confidence in using IT.

“In summary, these findings show that ‘self management’ is a much more complex, dynamic, and socially embedded activity than original policy documents and technical specifications appear to have assumed.”

What lessons have been learned? People today are still questioning the value of a centrally imposed system. Are they being listened to?

Digital Health reported that Maurice Smith, GP and governing body member for Liverpool CCG, speaking in a session on self-care platforms at the King's Fund event, said that driving people towards one national hub for online services was not an option he would prefer. He had no objection to a national portal, "but if you try drive everybody to a national portal and expect everybody to be happy with that I think you will be disappointed."

How will the past problems that hit HealthSpace be avoided in future?

How will the powers-that-be avoid repeating the same problems in the ongoing rollout of care.data and future projects? I have put this same question to NHS England/NIB leaders three times in the last year, and it remains unanswered.

How will you tell patients in advance of any future changes who will access their data records behind the scenes, for what purpose, to future proof any programmes that plan to use the data?

One of the Healthspace 2010 concerns was: “Efforts of local teams to find creative new uses for the SCR sat in uneasy tension with implicit or explicit allegations of ‘scope creep’.”

Any programme using records can’t ethically sign users up to one thing and change it later without informing them before the change. Who will pay for that and how will it be done? care.data pilots, I’d want that answered before starting pilot communications.

As an example of changes to the 'what', or content scope creep, future plans will see 'social care flags added' to the SCR, states p.17 of the NIB 2020 timeline. And what is the 'discovery for the use of genomic data complete' on p.11 about? Scope creep of 'who' will access records is very current: recent changes allow pharmacists to access the SCR, yet the change went by with little public discussion. Will they in future see social care flags or mental health data under their SCR access? Do I trust the chemist as I trust a GP?

Changes without adequate public consultation and communication cause surprises. Bad idea. Sir Nick Partridge said ensuring ‘no surprises’ is key to citizens’ trust after the audit of HES/SUS data uses. He is right.

At the heart of this nhs.uk plan is the need for it to be used by people, and by enough people to make the investment worthwhile. That is what HealthSpace failed to achieve.

The change the NIB wants to see does not address the needs of the user as a change issue (slide 4). This is all imposed change, not user-need-driven change.

Dear NIB: done this way, the plan seems to ignore the learning from HealthSpace. The evidence shown is self-referential, citing Dr Foster and NHS Choices. The only other two examples listed are from Wisconsin and the Netherlands, hardly comparable models of UK lifestyles or healthcare systems.

What is really behind the new front door of the nhs.uk platform?

The future nhs.uk looks very much as though it seeks to provide a central front door to data access, in effect an expanded Summary Care Record (GP and secondary care records) – all medical records for direct care – together with a way for users to add their own wider user data.

Will nhs.uk also allow individuals to share their data with digital service providers of other kinds through the nhs.uk platform and apps? Will their data be mined to offer a personalised front door of tailored information and service nudges? Will patients be profiled to know their health needs, use and costs?

If yes, then who will be doing the mining and who will be using that data for what purposes?

If not, then what value will this service offer if it is not personal?

What will drive the need to log on to another new platform, compared with using the existing services of Patient Online today to access our health records and reach GPs via video tools, and, without any log-in requirement, browsing the similar content of information and nudges towards local services offered via NHS Choices today?

If this is core to the future of our “patient experience” of the NHS, the public should be given the full and transparent facts to understand where the public benefit lies, what the business case for nhs.uk is, and what lies behind the change expected via online GP consultations.

This NIB programme is building the foundation of the NHS offering for the next ten years. What kind of NHS are the NIB and NHS England planning for our children and our retirement through their current digital designs?

If the significant difference behind the new nhs.uk platform is going to be the key change from what HealthSpace offered, and separate from what Patient Online already offers, it appears to be around managing cost and payments, not delivering any better user service.

Managing more of our payments with pharmacies and personalised budgets would reflect the talk of a push towards a patient-responsible self-management direction of travel for the NHS as a whole.

More use of personal budgets is, after all, what Simon Stevens called a “radical new option”, and we would expect to see “wider scale rollout of successful projects is envisaged from 2016-17”.

When the system has finely drawn profiles of its users, what effect will that have for individuals in our universal risk-shared system? Will a wider roll out of personalised budgets mean more choice, or could it start to mirror a private insurance system, in which a detailed user profile would determine your level of risk, and your personal budget, once reached, would mean no more service?

What I’d like to see and why

To date, there is a poor track record of transparency on central IT/change programme business plans. While saying one thing, another happens in practice. Can that be changed? Why all the effort on NHS Citizen and ‘listening’, if the public is not to be engaged in ‘grown up debate’ to understand the single biggest driver of planned service changes today: cost.

It is patronising in the extreme to prevent the public from seeing plans which spend public money.

We risk a wasteful, wearing repeat of the past top down failure of an imposed NPfIT-style HealthSpace, spending public money on a project which purports to be designed to save it.

To understand the practical future, we can look back to avoid what didn’t work and compare it with current plans. I’d suggest spelling out very clearly what the failures of Healthspace were, and why nhs.uk is different.

If the site will offer a new pathway to access services in addition to those we already have, it will cost more, not less. If genuine cost reduction is expected compared with today, where precisely will it come from?

I’d suggest you publish the detailed business plan for the nhs.uk platform and have the debate up front. Not only the headline numbers towards the end of these slides, but where and how it fits together in the big picture of Stevens’ “radical new option”.  This is public money and you *need* the public on side for it to work.

Publish the business cases for the NIB plans before the public engagement meet ups, because otherwise what facts will opinion be based on?

What discussion can be of value without them, when we are continually told by leadership that those very details are at the crux of the needed change – the affordability of the future of the UK health and care system?

Now, as with past projects, The Devil’s in the Detail.

***

NIB detail on nhs.uk and other concepts: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/437067/nib-delivering.pdf

The Devil’s in the Detail: Final report of the independent evaluation of the Summary Care Record and HealthSpace programmes 2010

Digital revolution by design: infrastructures and the world we want

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge

This is Part 4.  Infrastructures and the world we want

At a high level, physical network infrastructures enable data transfer from one place to another, and average users perceive little of it.

In the wider world of Internet infrastructure today, this week might be looked back on as, to use a horrible cliché, a game changer. A two-tier Internet traffic system could be coming to Europe, which would destroy a founding principle of equality: all traffic is created equal.

In other news, Facebook announced it will open an office in the toe of Africa, a foothold on a potential market of a billion people.

Facebook’s Internet.org initiative sees a further ‘magnificent seven’ companies working together. Two of them, Ericsson and Nokia, will between them have “an effective lock down on the U.S. market,” unless another viable network competitor emerges. And massive reach worldwide.

In Africa today there is a hodgepodge of operators, and I’ll be interested to see how much effect the boys ganging up under the protection of everybody’s big brother ‘Facebook’ will have on local markets.

And they’re not alone in wanting in on African action.

Whatever infrastructures China is building on and under the ground of the African continent, or donating as ludicrous showcase gifts, how it is doing so has not gone unnoticed. Chinese working practices and environmental standards can provoke local disquiet.

Will Facebook’s decision makers shape up to offer Africa an ethical package that could include not only a social network, but a physical one, managing content delivery in the inner workings of tubes and pipes?

In Europe the data connections within and connecting the continent are shifting, as TTIP, CETA and TISA shape how our data and knowledge will be shared or reserved or copyrighted by multinational corporations.

I hope we will ensure transparency is designed into these supra-national agreements on private ownership of public firms.

We don’t want to find commercial companies withhold information such as their cyber security planning, and infrastructure investments in the name of commercial protectionism, but at a public cost.

The public has opportunities now as these agreements are being drawn up, we may not get soon again.

Not only for the physical constructions, the often networked infrastructures, but intangible infrastructures of principles and power, co-dependencies around a physical system; the legal and ethical infrastructures of ownership, governance and accountability.

The Open Data institute has just launched a call for the promotion of understanding around our own data infrastructures:

“A strong data infrastructure will increase interoperability and collaboration, efficiency and productivity in public and private sectors, nationally and internationally.”

Sounds like something we want to get right in, and outside, the UK.

Governance of data is physically geographical, through country-specific legislation, as well as supra-national rules such as European-wide data protection.

These are in some ways outdated legal concepts in a borderless digital age, but they offer at least one avenue over which there is manageable oversight, and through which citizens should be able to call companies and the State to account.

Yet that accountability is questionable when laws seem to be bypassed under the banner of surveillance.

As a result people have somewhat lost trust in national bodies to do the right thing. We want to share data for public good but not for commercial exploitation. And we’re not sure who to trust with it.

Data governance of contractual terms is part of the infrastructure needed to prevent exploitation and enable not restrict sharing. And it needs to catch up with apps whose terms and conditions can change after a user has enrolled.

That comes back down to the individual, and some more ideas on those personal infrastructures are in the previous post.

Can we build lasting foundations fit for a digital future?

Before launching into haphazard steps towards a digital future in 2020, the NIB/NHS decision makers need to consider the wider infrastructures in which it is set, and understand what ethical compass they are steering by.

Can there be oversight to make national and supra-national infrastructures legally regulated, bindingly interoperable and provider and developer Ts and Cs easily understood?

Is it possible to regulate only that which is offered or sold through UK based companies or web providers and what access should be enabled or barriers designed in?

Whose interests should data and knowledge created from data serve?

Any state paid initiative building a part of the digital future for our citizens must decide, is it to be for public good or for private profit?

NHS England’s digital health vision includes: “clinical decision support to be auto populated with existing healthcare information, to take real time feeds of biometric data, and to consider genomics data in the future.”  [NIB plans, Nov 2014]

In that 66-page document, while it talks of data and trust and cyber security, ethics is not mentioned once. The ambition is to create ‘health-as-a-platform’ and its focus is on tech, not on principles.

‘2020’ is the goal and it’s not a far away future at all if counted as 1175 working days from now.

By 2020 we may have moved on or away in a new digital direction entirely or to other new standards of network or technology. On what can we build?

Facebook’s founder sees a futuristic role for biometric data used in communication. Will he drive it? Should we want him to?

Detail will change, but ethical principles could better define the framework for development promoting the best of innovation long term and protect citizens from commercial exploitation. We need them now.

When Tim Berners-Lee called for a Magna Carta on the world wide web he asked for help to achieve the web he wants.

I think it’s about more than the web he wants. This fight is not only for net neutrality. It’s not only challenging the internet of things to have standards, ethics and quality that shape a fair future for all.

While we shape the web we want, we shape the world we want.

That’s pretty exciting, and we’d better try to get it right.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

 

Digital revolution by design: infrastructures and the fruits of knowledge

Since the beginning of time and the story of the Garden of Eden, man has found a way to share knowledge and its power.

Modern digital tools have become the everyday way to access knowledge for many across the world, giving quick access to information and sharing power more fairly.

In this third part of my thoughts on digital revolution by design, triggered by the #kfdigi15 event on June 16-17, I’ve been considering some of the constructs we have built; those we accept and those that could be changed, given the chance, to build a better digital future.

Not only the physical constructions, the often networked infrastructures, but intangible infrastructures of principles and power, co-dependencies around a physical system; the legal and ethical infrastructures of ownership, governance and accountability.

Our personal data flow in systems behind the screens, at the end of our fingertips. Controlled in frameworks designed by providers and manufacturers, government and commercial agencies.

Increasingly in digital discussions we hear that the data subject, the citizen, will control their own data.

But if it is on the terms and conditions set by others, how much control is real and how much is the talk of a consenting citizen only a fig leaf behind which any real control is still held by the developer or organisation providing the service?

When data are used, they are turned into knowledge: business intelligence that adds value to aid informed decision making, by human or machine.

How much knowledge is too much knowledge for the Internet of Things to build about its users? As Chris Matyszczyk wrote:

“We have all agreed to this. We click on ‘I agree’ with no thought of consequences, only of our convenience.”

Is not knowing what we have agreed to our fault, or the responsibility of the provider who’d rather we didn’t know?

Citizens’ rights are undermined in unethical interactions if we are exploited by easy one-click access and exchange our wealth of data at unseen cost. Can it be regulated to promote, not stifle, innovation?

How can we get those rights back and how will ‘doing the right thing’ help shape and control the digital future we all want?

The infrastructures we live inside

As Andrew Chitty says in this HSJ article: “People live more mobile and transient lives and, as a result, expect more flexible, integrated, informed health services.”

To manage that, do we need to know how systems work and how sharing works, and to trust the functionality of what we are not told and don’t see behind the screens?

At the personal level, whether we sign up for the social network, use a platform for free email, or connect our home and ourselves in the Internet of Things, we each exchange our personal data with varying degrees of willingness. Yet there is often no alternative if we want to use the tool.

As more social and consensual ‘let the user decide’ models are being introduced, we hear it’s all about the user in control, but reality is that users still have to know what they sign up for.

In new models of platform identity sign-on, and in tools that track and mine, to the nth degree, the personal data we share with the system, the paternalistic models of the past and the new models of personal control and social sharing are merging.

Take a Fitbit as an example. It requires a named account and data sharing with the central app provider. You can choose whether or not to enable ‘social sharing’ with nominated friends whom you want to share your boasts or failures with. You can opt out of only that part.

I fear we are seeing the creation of a Leviathan-sized monster that will be impossible to control and just as scary as today’s paternalistic data mis-management. Some data are held by the provider and invisibly shared with third parties beyond our control, some we share with friends, and some are stored only on our device.

While data are shared with third parties without our active knowledge, this loss of trust in what is done behind the settings threatens to derail consumer products, as well as commercial ventures at national scale, and with them the public interest.

Society has somehow seen privacy lost as the default setting. It has become something to have to demand and defend.

“If there is one persistent concern about personal technology that nearly everybody expresses, it is privacy. In eleven of the twelve countries surveyed, with India the only exception, respondents say that technology’s effect on privacy was mostly negative.” [Microsoft survey 2015, of 12,002 internet users]

There’s one part of that I disagree with. It’s not the effect of technology itself, but the designers’ and developers’ decision making that affects privacy. People choose how to design and regulate technology; it is those choices, not the technology, that affect privacy.

Citizens have vastly differing knowledge bases of how data are used and how to best interact with technology. But if they are told they own it, then all the decision making framework should be theirs too.

By giving consumers the impression of control, the shock is going to be all the greater if a breach should ever reveal where fitness wearable users slept and with whom, at what address, and for how long they were active. Could a divorce case demand it?

Fitbit users have already found their data used by police and in the courtroom – probably not what they expected when they signed up to a better health tool. Others may see benefits, while those excluded from accessing the tool could be harmed by default.

Some at org level still seem to find this hard to understand but it is simple:
No trust = no data = no knowledge for commercial, public or personal use and it will restrict the very innovation you want to drive.

Google Gmail users have to make 10+ clicks to restrict all ads and information sharing based on their privacy and ad account settings. The default is ad tailoring and data mining. Many don’t even know it is possible to change the settings, and it’s not intuitive how to.

Firms need to consider their own reputational risk if users feel these policies are not explicit and are exploitation. Those caught ‘cheating’ users can get a very public slap on the wrist.

Let the data subjects rule, but on whose terms and conditions?

The question every citizen signing up to digital agreements should ask is: what is in the small print, and how will I know if it changes? Fair processing should offer data protection, but isn’t effective.

If you don’t have access to information, you make decisions based on a lack of information or misinformation. Decisions which may not be in your own best interest or that of others. Others can exploit that.

And realistically and fairly, organisations can’t expect citizens to read pages and pages of Ts&Cs. In addition, we don’t know what we don’t know. Information that is missing can be as vital to understand as that provided. ‘Third parties’ sharing – who exactly does that mean?

The concept of an informed citizenry is crucial to informed decision making but it must be within a framework of reasonable expectation.

How do we grow the fruits of knowledge in a digital future?

Real cash investment is needed now for a well-designed digital future, robust for cybersecurity, supporting enforceable governance and oversight. Collaboration on standards and thorough change plans. I’m sure there is much more, but this is a start.

Figurative investment is needed in educating citizens about the technology that should serve us, not imprison us in constructs we do not understand but cannot live without.

We must avoid the chaos and harm and wasted opportunity of designing massive state-run programmes in which people do not want to participate or cannot participate due to barriers of access to tools. Avoid a Babel of digital blasphemy in which the only wise solution might be to knock it down and start again.

Our legislators and regulators must take up their roles to get data use, and digital contract terms and conditions right for citizens, with simplicity and oversight. In doing so they will enable better protection against risks for commercial and non-profit orgs, while putting data subjects first.

To achieve greatness in a digital future we need: ‘people speaking the same language, then nothing they plan to do will be impossible for them’.

Ethics. It’s more than just a county east of London.

Let’s challenge decision makers to plant the best of what is human at the heart of the technology revolution: doing the right thing.

And from data, we will see the fruits of knowledge flourish.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech to share ideas and knowledge in real time with people around the world.  The freedom to fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated, the Internet may not work well for its citizens and its potential is restricted, as well as its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design, and who will control it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable and what people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or fun experiment and how will users [especially for mental health apps where the outcomes may be less tangibly measured than say, blood glucose] know the difference?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create or enable barriers?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We assume that digital designs will put at their heart the core principles in the spirit of the NHS. But if apps are not available on prescription and are essentially a commercial product with no proven benefit, does that exploit the NHS brand trust?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person as any other treatment could; regulation must further consider reputational risk to the NHS and the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading benefits will not be undermined by creating artificial boundaries that restrict access to the tools by affordability, in a postcode lottery,  or in language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Exclusion is already measurable even in the UK today: in maternity care, attendees at #kfdigital15 were told by its founder that e-Red Book access divides the poorest into those who can and cannot afford the data allowance it costs on a smartphone.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area – a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use for all, like any other treatment? Or is it yet another example of NHS postcode lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and can afford apps, leaving behind the needs of those who can’t afford to pay for them.

At home, it is some of the least wealthy who need the most intervention, and from whom there may be little profit to be made. There is little in the 2020 plans I can see that focuses on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However, it is a question of principle that will decide for or against excluding users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England approved apps going to account for geography and language and cross country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.


This photo, reportedly from Indonesia, is great [via Banksy on Twitter, and apologies I cannot credit the photographer]: two boys on the way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of  data sharing restrictions: Barrier or Enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to data sharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that believe behavioural barriers to data sharing are an obstacle have forgotten that trust is not something to be overcome, but to be won and continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is prescribed and used, and data are exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in the app store does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When the recent social media experiment by Facebook only mentioned the use of data for research after the experiment, it caused outcry.

It crossed the line between what people felt acceptable and intrusive, analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen is both a risk and a cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” asks T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters not only for today’s projects, but for the whole future digital is helping create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome.

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding app consumer regulation, to enable fairness of access, and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers
ii. pro-actively designing ethics and change into ongoing projects, and
iii. ensuring engagement is genuine collaboration and co-production.

Barriers do not need to be got around; solutions should be built in by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about the NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public on how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told on one hand by government that it is necessary to share all our individual-level health data, from all sorts of sources.

And that bulk data collection is vital in the public interest to find surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government  and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups and personalise how they are treated. Recent objections centred on marketing misuse by charities.

There is little broad understanding yet of the insight organisations can now gain, tracking and profiling through algorithms and processing capability.

Technological progress has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous projects going on at grassroots level discussed over the two days: bringing the elderly online and connected, and tackling housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communications with patients.

The NIB meeting had no real public interaction, nor any discussion of those projects ‘in the room’ in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience, it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS’s future, then we need to ask for the intended change and outcome to be explained much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

While still struggling to get the current design together, the leadership may find it hard to invite public feedback on the future.

But it is only made hard if what the public wants is ignored. If the issues raised at listening events were resolved in the way the public asked for, it could be quite simple.

The NIB leadership clearly felt nervous about debate, giving only 10 minutes of three hours to public involvement, yet debate is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS’s top-down digital plans need public debate and dissection by the clinical professions, to see if they fit the current and future model of healthcare. If people are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating the things has been forgotten. If the Internet serves things, it serves consumerism; AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer methods of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and the benefits hard to communicate, I’d say if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how the projects that want their personal data work, or b) put up with being kept ignorant.

We should be able to question fully why it is needed and get a transparent and complete explanation. We should have fully accountable business plans, and public scrutiny of tangible and intangible benefits, before projects launch on the basis of public buy-in that may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many could put it succinctly: what is the NHS digital forward view, really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other, funding stops for what already works today.

Although the volume of activity planned is vast, what it boils down to is what is visionary and achievable, not just a vision.

Digital revolution by design: building for change and people

We have opportunity to build well now, avoiding barriers-by-design, pro-actively designing ethics and change into projects, and to ensure it is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own bubble of development as it shapes its plans, in ever-changing infrastructures in which data, digital, AI and ethics will need to be discussed together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago have been overtaken by the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The 1789 Declaration, as part of the French Revolution, set out a clear, understandable, ethical and fair framework of law in which citizens trusted their rights would be respected by their fellows.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web

Reputational risk. Is NHS England playing a game of public confidence?

“By when will NHS England commit to respect the 700,000 objections [1] to secondary data sharing already logged* but not enacted?” [gathered from objections to secondary uses in the care.data rollout, Feb 2014*]

Until then, can organisations continue to use health data held by HSCIC for secondary purposes, ethically and legally, or are they placing themselves at reputational risk?

If HSCIC continues to share, what harm may it do to public confidence in data sharing in the NHS?

I should have asked this explicitly of the National Information Board (NIB) June 17th board meeting [2], that rode in for the last 3 hours of the two day Digital Health and Care Congress at the King’s Fund.

But I chose to mention it only in passing, since I assumed it is already being worked on and a public communication will follow very soon. I had lots of other constructive things I wanted to hear in the time planned for ‘public discussion’.

Since then it has been niggling at me that I should have asked more directly. It dawned on me when watching the meeting recording, and more importantly when reading the NIB papers [3], that it is not otherwise mentioned. And there was no group discussion anyway.

Mark Davies, Director at the UK Department of Health, talked in fairly jargon-free language about transparency [01:00]. I could have asked him when we will see more of it in practice.

Importantly, he said on building and sustaining public trust: “if we do not secure public trust in the way that we collect, store and use their personal confidential data, then pretty much everything we do today will not be a success.”

So why does the talk of securing trust seem at odds with the reality?

Evidence of Public Voice on Opt Out

Is the lack of action based on uncertainty over what to do?

Mark Davies also said “we have only a sense”, not “a really solid evidence base”, of what the public wants. He said “people feel slightly uncomfortable about data being used for commercial gain”, which he felt was “awkward”, as commercial companies include pharma working for the public good.

If he has not done so already – though I am sure he will have – he could read NHS England’s own care.data listening feedback. People were strongly against commercial exploitation of data; many were livid about its use [see other care.data events]. Not ‘slightly uncomfortable.’ And they were able to make a clear distinction between uses by commercial companies they felt were in the public interest, such as bona fide pharma research, and consumer market research, even by the same company. Risk stratification and commissioning do not need, and according to the Caldicott Review [8] should not have, fully identifiable individual-level data sharing.

Uses are actually not so hard to differentiate. In fact, that is exactly what people want: the choice to have their data used only for direct care, or to permit sharing between different users for, say, bona fide research. Or at minimum, the ability to exclude commercially exploitative uses and re-use. Enabling this would enable more data sharing with confidence.

I’d also suggest there is a significant evidence base in the data trust deficit work from the Royal Statistical Society, a poll on privacy for the Joseph Rowntree Foundation, and work done for the ADRN/ESRC. I’m sure he and the NIB are aware of these projects, and Mark Davies said himself that more is currently being done with the Nuffield Trust.

Work with almost 3,000 young people for the Royal Academy of Engineering confirmed what those interested in privacy know, but which is the opposite of what is often said about young people and privacy: they care, and they want control:


NHS England has itself said that it held ‘over 180’ listening events in 2014, and that feedback was consistent with public letters to papers, radio phone-ins and news reports in spring 2014.

Don’t give raw data out; exclude access for commercial companies not working in the public interest; exclude non-bona fide research use and re-use licences; define the future purposes; improve legal protection, including the opt-out; and provide the transparency needed for trust.

How much more evidence does anyone need of public understanding and feeling – or is it simply that NHS England and the DH don’t like the answers given? Listening does not equal heard.

Here are some of NHS England’s own slides [4] – points included a common demand from the public to give the opt-out legal status:


Opt out needs legal status

Paul Bate talked [56:00] [3] about missing pieces of understanding on secondary uses, for “Commissioners, researchers, all the different regulators.” He gave an update which assumed secondary use of data as the norm.

But he missed out any mention of the perceived cost of the loss of confidentiality, and the loss of confidence since the failure to respect the 9Nu4 objections made in the aborted 2014 care.data rollout. That is without even mentioning that many people did not recall getting a leaflet at all, so those 700,000 objections came from the most informed.

When the public sees their opt out is not respected they lose trust in the whole system of data sharing. Whether for direct care, for use by an NHS organisation, or by any one of the many organisations vying to manage their digital health interaction and interventions. If someone has been told data will not be shared with third parties and it is, why would they trust any other governance will be honoured?

Lessons learned from the leadership’s flawed pre-care.data thinking – ‘no one who uses a public service should be allowed to opt out of sharing their records, nor can people rely on their record being anonymised’ – and from the resulting disastrous attempt to roll out without communication, followed by a second attempt at fair processing, should inform future projects. That includes care.data mark 2. That thinking is simply daft.

“You can object and your data will not be extracted and you can make no contribution to society,” Mr. Kelsey answered a critic on Twitter in 2014, revealing that his thinking really hasn’t changed very much, even if he has been forced to make concessions. I should have said at #kfdigital15: ignoring what the public wants is not your call to make.

What legal changes will be made to back up the verbal guarantees given since February? If none are forthcoming, were the statements made to Parliament untrue?

“people should be able to opt out from having their anonymised data used for the purposes of scientific research.” [Hunt, 2014]

We are yet to see this legal change, and to date the only publicly stated choice covers identifiable data, not all data for secondary purposes including anonymous data, as offered by the Minister in February 2014 and by David Cameron in 2010.

If Mark Davies is being honest about how important he feels trust is to data sharing, implementing the objection should be a) prioritised and b) given legal footing.

Risks and benefits: the need for a new social contract on data

Simon Denegri recently wrote [5] he believes there are “probably five years to sort out a new social contract on data in the UK.”

I’d suggest less, if high-profile data-based projects or breaches irreparably damage public trust first, whether in the NHS or the consumer world. The public will choose to share less and less.

But the public cannot afford to lose the social benefits that those projects may bring to the people who need them.

Big projects, such as care.data, cannot afford for everyone’s sake to continue to repeatedly set off and crash.

Smaller projects, those planned and in progress by each organisation and attendee at the King’s Fund event, cannot afford for those national mistakes to damage the trust the public may otherwise hold in the projects at local level.

I heard care.data mentioned five times over the two-day event, in different projects, as having harmed them through lost trust or delays. We even heard of companies in Scotland going bust due to slowed data access in rollouts, and austerity.

Individuals cannot afford for their reputation to be harmed through association, or by using data in ways the public finds unreasonable and get splashed across the front page of the Telegraph.

Clarity is needed for everyone using data well, whether for direct care with implied consent or for secondary uses without it; and it is in the public interest to safeguard access to that data.

A new social contract on data would be good all round.

Reputational Risk

The June 6th story of the 700,000 unrespected opt outs has been and gone. But the issue has not.

Can organisations continue to use that data ethically and legally knowing it is explicitly without consent?

“When will those objections be implemented?” should be a question organisations across the country are asking – if reputational risk is a factor in any data-sharing decision making – in addition to the fundamental ethical question: can we continue to use data from an individual from whom we know consent was not freely given and was actively withheld?

What of projects that use HES or hospital secondary care sites’ submitted data and rely on the HSCIC POM mechanisms? How do those audits or other projects take HES secondary objections into account?

Sir Nick Partridge said in the April 2014 HSCIC HES/SUS audit there should be ‘no surprises’ in future.

That future is now. What has NHS England done since to improve?

“Consumer confidence appears to be fragile and there are concerns that future changes in how data may be collected and used (such as more passive collection via the Internet of Things) could test how far consumers are willing to continue to provide data.” [CMA Consumer report] [6]

The problem exists across both state and consumer data sharing. It is not a matter of if, but when, these surprises are revealed to the public with unpredictable degrees of surprise and revulsion, resulting in more objection to sharing for any purposes at all.

The solutions exist: meaningful transparency, excluding commercial purposes which appear exploitative, consensual choices, and no surprises. Shape communications processes by building in future change to today’s programmes, to future-proof trust.

Future-proofing does not mean making the purpose and use of data so vague as to be all-encompassing – exactly what the public said at care.data listening events they do not want and will not find sufficient to trust, and which I would argue would not meet legally adequate fair processing. It means building and budgeting for mechanisms in every plan today to inform patients of future changes to the use or users of data already gathered, and to offer them a new choice to object or consent. And they should have a way to know who used what.

The GP who asked the first of the only three questions possible in the 10-minute Q&A had taken away the same as I had: the year 2020 is far too late as a public engagement goal. There must be much stronger emphasis on it now. And it is actually very simple: do what the public has already asked for.

The overriding lesson must be, the person behind the data must come first. If they object to data being used, that must be respected.

It starts with fixing the opt outs. That must happen. And now.

Public confidence is not a game [7]. Reputational risk is not something organisations should be forced to gamble with to continue their use of data and potential benefits of data sharing.

If NHS England, the NIB or the Department of Health know how and when it will be fixed, they should say so. If they don’t, they had better have a darn good reason why, and tell us that too.

‘No surprises’, said Nick Partridge.

The question decision makers must address for data management is, do they continue to be part of the problem or offer part of the solution?

******

References:

[1] The Telegraph, June 6th 2015 http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[2] June 17th NIB meeting http://www.dh-national-information-board.public-i.tv/core/portal/webcast_interactive/180408

[3] NIB papers / workstream documentation https://www.gov.uk/government/publications/plans-to-improve-digital-services-for-the-health-and-care-sector

[4] care.data listening feedback http://www.england.nhs.uk/wp-content/uploads/2015/01/care-data-presentation.pdf

[5] Simon Denegri’s blog http://simondenegri.com/2015/06/18/is-public-involvement-in-uk-health-research-a-danger-to-itself/

[6] CMA findings on commercial use of consumer data https://www.gov.uk/government/news/cma-publishes-findings-on-the-commercial-use-of-consumer-data

[7] Data trust deficit New research finds data trust deficit with lessons for policymakers: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[8] Caldicott review: information governance in the health and care system

Thinking to some purpose