Category Archives: care.data

Destination smart-cities: design, desire and democracy (Part one)

When I drop my children at school in the morning I usually tell them three things: “Be kind. Have fun. Make good choices.”

I’ve been thinking recently about what a positive and sustainable future for them might look like. What will England be in 10 years?

The #Sprint16 snippets I read talk about how “Digital is changing how we deliver every part of government,” and “harnessing the best of digital and technology, and the best use of data to improve public services right across the board.”

From that, three things jumped out at me:

  • The first is that the “best use of data” in government’s opinion may conflict with that of the citizen.
  • The second, is how to define “public services” right across the board in a world in which boundaries between private and public in the provision of services have become increasingly blurred.
  • And the third is the power of tech to offer both opportunity and risk if used in “every part of government”, and its effects on access to, involvement in, and the long-term future of, democracy.

What’s the story so far?

In my experience so far of trying to be a digital citizen “across the board” I’ve seen a few systems come and go. I still have my little floppy paper Government Gateway card, navy blue with yellow and white stripes. I suspect it is obsolete. I was a registered Healthspace user, and used it twice. It too, obsolete. I tested my GP online service. It was a mixed experience.

These user experiences are shaping how I interact with new platforms and my expectations of organisations, and I will be interested to see what the next iteration, nhs alpha, offers.

How platforms and organisations interact with me, and my data, is however increasingly assumed without consent. This involves new data collection, not only using data from administrative or commercial settings to which I have agreed, but new scooping of personal data all around us in “smart city” applications.

Simply having these digital applications will bring no benefit in itself; all the disadvantages of surveillance for its own sake will be realised instead.

So how do we know that all these data collected are used – and by whom? How do we ensure that all the tracking actually gets turned into knowledge about pedestrian and traffic workflow to make streets and roads safer and smoother in their operation, to make street lighting more efficient, or the environment better to breathe in and enjoy? And that we don’t just gift private providers tonnes of valuable data which they simply pass on to others for profit?

Because without making things better, this Internet-of-Things will be a one-way ticket to power in the hands of providers, and to loss of control and quality of life. We’ll work around it: buying a separate SIM card for trips into London, avoiding certain parks or bridges, managing our FitBits to the nth degree under a pseudonym. But being left no choice but to opt out of places, or of the latest technology, is also tedious. If we want to buy a smart TV to access films on demand, but don’t want it to pass surveillance or tracking information back to the company, how can we easily find out which products offer that choice?

Companies have taken private information that is none of their business, and quite literally, made it their business.

The consumer technology hijack of “smart” to always mean marketing surveillance creates a divide between those who will comply for convenience and pay the price in their privacy, and those who prize privacy highly enough to take steps that are less convenient, but less compromised.

But even for those who want the latter, it can be so hard to find out how that people feel powerless and give in to the easy option on offer.

The system of governance and oversight that today manages how our personal data are processed by providers of public and private services, in both public and private space, is insufficient to meet the values most people reasonably expect: to be able to live their lives without interference.

We’re busy playing catch up with managing processing and use, when many people would like to be able to control collection.

The Best use of Data: Today

My experience of how the government wants to ‘best use data’ is that until 2013 I assumed the State was responsible with it.

I feel bitterly let down.

care.data taught me that the State thinks my personal data and privacy are something to exploit, and “the best use of my data” for them, may be quite at odds with what individuals expect. My trust in the use of my health data by government has been low ever since. Saying one thing and doing another, isn’t making it more trustworthy.

I found out in 2014 how my children’s personal data are commercially exploited and given to third parties, including the press, outside safe settings, by the Department for Education. Now my trust is at rock bottom. I tried to take a look at what the National Pupil Database stores on my own children and was refused a subject access request, while the commercial sector and Fleet Street press are given not only identifiable data, but ‘highly sensitive’ data. This seems plain wrong in terms of security, transparency and respect for the person.

The attitude that there is an entitlement of the State to individuals’ personal data has to go.

The State has pinched 20 million children’s privacy without asking. Tut tut indeed. [see Very British Problems for a translation].

And while I support the use of public administrative data in deidentified form in safe settings, it does not follow that anything goes. Yet the feeling of entitlement to access our personal data for purposes other than those for which we consented is growing, as it stretches to commercial sector data. Suggesting that public feeling measured in work with 0.0001% of the population amounts to “wide public support for the use and re-use of private sector data for social research” seems tenuous, however.

Even so, comments even in that tiny population suggested, “many participants were taken by surprise at the extent and size of data collection by the private sector” and some “felt that such data capture was frequently unwarranted.” “The principal concerns about the private sector stem from the sheer volume of data collected with and without consent from individuals and the profits being made from linking data and selling data sets.”

The Best use of Data: The Future

Young people, despite their elders often saying “they don’t care about privacy”, are leaving social media in search of greater privacy.

These things cannot be ignored if the call for digital transformation between the State and the citizen is genuine, because if it is done to us, it will fail. Change must be done with us. And ethically.

And not “ethics” as in ‘how to’, but the ethics of “should we”: qualified, transparent evaluation as done in other research areas, not an add-on but integral to every project, looking at issues such as:

  • whether participation is voluntary, opt-out or covert
  • how participants can get and give informed consent
  • accessibility to information about the collection and its use
  • small numbers, particularly of vulnerable people included
  • identifiable data collection or disclosure
  • arrangements for dealing with disclosures of harm and recourse
  • and how the population that will bear the risks of participating in the research is, or is not, likely to benefit from the knowledge derived from it.

Ethics is not about getting away with using personal data in ways that won’t get caught or hauled over the coals by civil society.

It’s balancing risk and benefit in the public interest, and not always favouring the majority, but doing what is right and fair.

We hear a lot at the moment about how the government sees lives shaped by digital skills, but too little of their vision for what living will look and feel like in the smart cities of the future.

My starting question is: how does government hope society will live there, and is it up to them to design it? If not, who is? Because these smart-city systems are not designing themselves. You’ve heard of Stepford wives. I wonder what we do if we do not want to live like Milton Keynes man?

I hope that the world my children will inherit will be more just, more inclusive, kinder than today, and with a more sustainable climate to support food and livelihoods. Will ‘smart’ help or hinder?

What is rarely discussed in technology debates is how the service should look regardless of the technology. The technology, assumed to be inevitable, becomes the centre of service delivery.

I’d like first to understand the central and local government vision for “public services” provision for people of the future. What does it mean for everyday services like schools and health, and how does it balance security and our freedoms?

Because without thinking about how and who provides those services for people, there is a hole in the discussion of “the best use of data” and their improvement “right across the board”.

The UK government has big plans for big data: sharing across all public bodies, some of it tailored for individual interventions.

While there are interesting opportunities for public benefit from at-scale systems, the public benefit is at risk not only from lack of trust in how systems gather and use data, but also because interoperability between services, and the freedom for citizens to change provider, can get lost in market competition.

Openness and transparency can be absent in public-private partnerships until things go wrong. Given the scale of smart-cities, we must have more than hope that data management and security will not be one of those things.

How will we know if new plans are designed well, or not?

When I look at my children’s future and how our current government digital decision making may affect it, I wonder if their future will be more or less kind. More or less fun.

Will they be left with the autonomy to make good choices of their own?

The hassle we feel when watched all the time, by every thing that we own, in every place we go, having to check that every check box has a reasonable privacy setting, has a cumulative cost in our time and anxieties.

Smart technology has invaded not only our public space and our private space, but has nudged into our head space.

I for one have had enough already. For my kids I want better. Technology should mean progress for people, not tyranny.

Living in smart cities, connected in the Internet-of-Things, run on their collective Big Data and paid for by commercial corporate providers, threatens not only their private lives and well-being, their individual and independent lives, but ultimately independent and democratic government as we know it.

*****

This is the start of a four part set of thoughts: Beginnings with smart technology and data triggered by the Sprint16 session (part one). I think about this more in depth in “Smart systems and Public Services” (Part two) here, and the design and development of smart technology making “The Best Use of Data” looking at today in a UK company case study (Part three) before thoughts on “The Best Use of Data” used in predictions and the Future (Part four).

Commission on Freedom of Information: submission

Since it appears that the Independent Commission on Freedom of Information [FOI] has not published all of the received submissions, I thought I’d post what I’d provided via email.

I’d answered two of the questions with two case studies. The first on application of section 35 and 36 exemptions and the safe space. The second on the proposal for potential charges.

On the Commission website, tab 2 of the excel spreadsheet of evidence submitted online notes that NHS England belatedly asked for its submission to be unpublished.

I wonder why.

Follow-ups to both these FOI requests are now long overdue in 2016. The first, from NHS England, concerns the care.data decision making behind the 2015 decision not to publish a record of whether parts of the board meetings were to be secret. Transparency needs to be seen in action to engender public trust. After all, they’re deciding things like how care.data and genomics will be at the “heart of the transformation of the NHS.”

The second is overdue at the Department for Education, on the legal basis for identifiable sensitive data releases from the National Pupil Database that meets Schedule 3 of the Data Protection Act 1998, permitting this datasharing with commercial third parties.

Both are in line with the apparently recommended use of FOI, according to Mr. Grayling, who most recently said:

“It is a legitimate and important tool for those who want to understand why and how Government is taking decisions and it is not the intention of this Government to change that”.  [Press Gazette]

We’ll look forward to seeing whether that final sentence is indeed true.

*******

Independent Commission on Freedom of Information Submission
Question 1: a) What protection should there be for information relating to the internal deliberations of public bodies? b) For how long after a decision does such information remain sensitive? c) Should different protections apply to different kinds of information that are currently protected by sections 35 and 36?

A “safe space” in which to develop and discuss policy proposals is necessary. I can demonstrate where it was [eventually] used well, in a case study of a request I made to NHS England. [1]

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. I asked in October 2014 for NHS England to publish the care.data planning and decision making for the national NHS patient data extraction programme. This programme has been controversial [2]. It will come at great public expense and to date has been harmful to public and professional trust, with no public benefit. [3]

NHS England refused my request based on Section 22 [intended for future publication]. [4] However, ten months later the meeting minutes had still not been published. In July 2015, after appeal, the Information Commissioner issued an Information Notice, and NHS England published sixty-three minutes and papers in August 2015.

In these released documents, the section 36 exemption was then applied to only a tiny handful of redacted comments. This was sufficient to protect the decisions that NHS England felt to be most sensitive, yet still enabled the release of a year’s worth of minutes.

Transparency does not mean that difficult decisions cannot be debated since only outcomes and decisions are recorded, not every part of every discussion verbatim.

The current provision for safe space using these exemptions is effective, and in this case the outcome would have been no different whether publication came immediately after the meetings or one and a half years later. If anything, earlier publication might have resulted in better informed policy and decision making, through wider involvement from professionals and civil society. The secrecy in the decision making did not build trust.

When policies such as these are found to have no financial business cost-benefit case, for example, I believe it is strongly in the public interest to have transparency of these facts, and to scrutinise the policy’s governance so that early intervention is possible when seen to be necessary.
In the words of the Information Commissioner:

“FOIA can rightly challenge and pose awkward questions to public authorities. That is part of democracy. However, checks and balances are needed to ensure that the challenges are proportionate when viewed against all the other vital things a public authority has to do.

“The Commissioner believes that the current checks and balances in the legislation are sufficient to achieve this outcome.” [5]

Given that most public bodies, including NHS England’s Board, routinely publish their minutes, this would seem standard good practice, and I believe routine publication of meeting minutes would have raised the trustworthiness of the programme and its oversight and leadership.

The same section 36 exemption could have been applied from the start to the small redactions that were felt necessary balanced against the public interest of open and transparent decision making.

I do not believe more restrictive applications should be made than are currently under sections 35 and 36.

_____________________________________________________________________

Question 6: Is the burden imposed on public authorities under the Act justified by the public interest in the public’s right to know? Or are controls needed to reduce the burden of FoI on public authorities?

As an individual I made 40 requests of schools and 2 of the Department for Education, which may now result in benefit for 8 million children and their families, as well as future citizens.

The transparency achieved through these Freedom of Information requests will, I hope, soon transform the culture at the Department for Education from one of secrecy to one of openness.

There is the suggestion that a Freedom of Information request would incur a charge to the applicant.

I believe that the benefits of the FOI Act in the public interest outweigh the cost of FOI to public authorities. In this second example [6], I would ask the Commission to consider: had I been unable to make these Freedom of Information requests due to cost, and therefore unable to present evidence to the Minister, the Department, and the Information Commissioner, would the panel members support the secrecy around the ongoing risk that current practices pose to children and our future citizens?

Individual, identifiable and sensitive pupil data are released to third parties from the National Pupil Database without telling pupils, parents and schools, or seeking their consent. This Department for Education (DfE) FOI request aimed to obtain an understanding of any due diligence and of the release process: privacy impact and DfE decision making, with a focus on its accountability.

This was to enable transparency and scrutiny in the public interest, to increase the understanding of how our nation’s children’s personal data are used by government, commercial third parties, and even identifiable and sensitive data given to members of the press.

Chancellor Mr. Osborne spoke on November 17 about the importance of online data protection:

“Each of these attacks damages companies, their customers, and the public’s trust in our collective ability to keep their data and privacy safe.”[…] “Imagine the cumulative impact of repeated catastrophic breaches, eroding that basic faith… needed for our online economy & social life to function.”

Free access to FOI enabled me as a member of the public to ask and take action with government and get information from schools to improve practices in the broad public interest.

Had there been a cost to this process, I could not have afforded to ask schools to respond. Schools are managed individually, and as such I asked each school whether it was aware of the National Pupil Database and how the Department shared its pupils’ data onwards with third parties.

I asked a range of schools in the South and East. In order to give a fair picture of more than one county, I made requests to a range of types of school, from academy trusts to voluntary controlled schools: 20 primary and 20 secondary. Given the range of schools in England and Wales [7], this was a small sample.

Building even a small representative picture of pupil data privacy arrangements in the school system therefore required a separate request to each school.

I would not have been able to do this, had there been a charge imposed for each request.  This research subsequently led me to write to the Information Commissioner’s Office, with my findings.

Were access costs to mean that only organisations or the press could afford this process, the public would only be able to find out what mattered to, or was felt important by, those organisations, but not what matters to individuals.

However what matters to one individual might end up making a big difference to many people.

Individuals may be interested in what are seen as minority topics, perhaps related to discrimination according to gender, sexuality, age, disability, class, race or ethnicity. If individuals cannot afford to challenge government policies that matter to them, we may lose the benefit they can bring when they go on to champion the rights of more people in the country as a whole.

Eight million children’s records, from children aged 2-19, are stored in the National Pupil Database. I hope that, thanks to the FOI request, increased transparency and better practices will help restore data protections for individuals and also re-establish organisational trust in the Department.

Information can be used to enable or constrain citizenship. In order to achieve universal access to human rights to support participation, transparency and accountability, I appeal that the Commission recognise the need for individuals to tackle vested interests, unjust laws and policies.

Any additional barriers such as cost, only serve to reduce equality and make society less just. There is however an immense intangible value in an engaged public which is hard to measure. People are more likely to be supportive of public servant decision making if they are not excluded from it.

Women, for example, are underrepresented in Parliament and therefore in public decision making. Further, the average pay gap within the EU is 16 per cent, but pay levels throughout the whole of Europe differ hugely, and in the South East of the UK men earn 25 per cent more than their female counterparts. [8] Women and mothers like me may therefore find it more difficult to participate in public life and to make improvements on behalf of other families and children across the country.

To charge for access to information about our public decision making process could therefore be excluding and discriminatory.

I believe these two case studies show that the Act’s intended objectives, on parliamentary introduction — to ‘transform the culture of Government from one of secrecy to one of openness’; ‘raise confidence in the processes of government, and enhance the quality of decision making by Government’; and to ‘secure a balance between the right to information…and the need for any organisation, including Government, to be able to formulate its collective policies in private’ — work in practice.

If anything, they need to be strengthened to ensure accessibility.

Any actions to curtail free and equal access to these kinds of information would not be in the public interest, and would be a significant threat to the equality of opportunity offered to the public in making requests. Charging would particularly restrict access to FOI for poorer individuals and communities, who are often those already excluded from full participation in public life.
___________________________________________________________________________

[1] https://www.whatdotheyknow.com/request/caredata_programme_board_minutes
[2] http://www.theguardian.com/society/2014/dec/12/nhs-patient-care-data-sharing-scheme-delayed-2015-concerns
[3] http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers
[4] https://jenpersson.com/wp-content/uploads/2015/11/caredataprogramme_FOI.pdf
[5] https://ico.org.uk/media/about-the-ico/consultation-responses/2015/1560175/ico-response-independent-commission-on-freedom-of-information.pdf
[6] https://jenpersson.com/wp-content/uploads/2015/11/NPD_FOI_submissionv3.pdf
[7] http://www.newschoolsnetwork.org/sites/default/files/Comparison%20of%20school%20types.pdf
[8] http://www.equalpayportal.co.uk/statistics/

Thoughts since #UKHC15. UK health datasharing.

The world you will release your technology into, is the world you are familiar with, which is already of the past. Based on old data.

How can you design tools and systems fit for the future? And for all?

For my 100th post, and the first of 2016, here is a summary of some of my thoughts prompted by #UKHC15: several grains of thought related to UK health data that have been growing for some time.

1000 words on “Hard things: identity, data sharing and consent.” The fun run version.

Do we confuse hard with complex? Hard does not have to mean difficult. Some things seem to be harder than necessary, because of politics. I’ve found this hard to write. Where to start?

The search to capture solutions has been elusive.

The starting line: Identity

Then my first thoughts on identity got taken care of by Vinay Gupta in this post, better than I could. (If you want a long read about identity, you might want to get a hot drink like I did and read and re-read. It says it’ll take an hour. It took me several, in absorption and thinking time. And worth it.)

That leaves data sharing and consent. Both of which I have written many of my other 99 posts about in the last year. So what’s new?

Why are we doing this: why aren’t we there yet?

It still feels very much that the thinking on ‘digital’ in many parts of the health service and broader government is “we need to do something”. The why is missing, and therefore achieving and measuring success is hard.

Often we start with a good idea and set about finding a solution to achieve it. But if the ‘why’ behind the idea is shaky to start with, the solution may falter as soon as it gets difficult. No one seems to know what #paperless actually means in practice.

So why try and change things? Fixing problems, rather than coming up with good ideas, is another way to think of it, as they suggested at #ukhc15. It was a meet-up for people who want to make things better, usually for others; sometimes that involves improving the systems they work with directly, or support others in.

I no longer work in systems’ introductions or enhancement processes, although I have a lay role in research and admin data. But regular readers know that most of the last two years has been all about the data. care.data.

More often than not, in #ukhc2015 discussions that focused on “the data” I would try and bring people back to thinking about what the change is trying to solve, what it wants to “make better” and why.

There’s a broad tendency to simply think more data = better. Not true, and I’ll show later a case study why. We must question why.

Why doesn’t everyone volunteer, or want to join in?

Very many people who have spoken with me over the last two years have shared their concrete concerns over the plans to share GP data and they do not get heard. They did not see a need to share their identifiable personal confidential data, or see why truly anonymous data would not be sufficient for health planning, for example.

Homeless men, and women at risk; people from the travelling community; those with disabilities; patients with stigmatising conditions; minorities; children; sexual orientation; not to mention the lawyers or agencies representing them. Or the 11 million of our adult population not online. Few of whom we spoke about. Few of whom we heard from at #UKHC15. Yet put together, these individuals not only make up a significant number of people, but account for a disproportionately high share of the highest demands on our health and social care services.

The inverse care law appears magnified in its potential when applied to digital, and should magnify the importance of thinking about access. How will care.data make things better for them, and how will the risks be mitigated? And are those costs being properly assessed if there is no assessment of the current care.data business case and seemingly, since 2012 at least, no serious effort to look at alternatives?

The finish line? We can’t see what it looks like yet.

The #ukhc2015 event was well run, and I liked the spontaneity of people braver than me who were keen to lead sessions, and did it well. As someone who is white and living in a ‘nice’ area, I am privileged. It was a privilege to spend a day at #UKHC15, packed with people who clearly think about hard things all the time. People who want to make things better. People who were welcoming to nervous first-timers at an ‘un’conference over a shared lunch.

I hope the voices of those who can’t attend these events, and outside London, are equally accounted for in all government 2016 datasharing plans.

This may be the last chance after years of similar consultations have failed to deliver workable, consensual public data sharing policies.

We have vast streams of population-wide data stored in the UK, about which the population is largely ignorant. But while some of the data may be from 25 years ago, whatever is designed today is going to need to think long term: not how do we solve what we know, but how do we design solutions that will work for what we don’t.

Transparency here will be paramount to trust if future decisions are made for us, or if those we make for ourselves are ‘influenced’ by algorithms, machine learning and ‘mindspace’ work.

As Thurgood Marshall said,

“Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds.”

Control over who we are, and who the system thinks we are, becomes a whole new level of discussion if we are being told how to make a decision, especially where the decision pushes in a direction of public policy based on political choice. If pensions are not being properly funded, not allocating taxes differently to fund them is a choice the current government has made, while the DWP seeks to influence our decision, to make us save more in private pensions.

And in data discussions, how about we make an effort to talk a little more clearly in the same terms, and stop packaging ‘sharing’ as if it were something voluntary in population-wide compulsory policy.

It’s done to us, not with us, in far too many areas of government we do not see. Perhaps this consultation might change that, but it is the ‘nth’ of many consultations, and I want to be convinced that this one intends real change. It’s only open for a few weeks, and this meet-up for discussion appears to have been organised only in London.

I hope in the new #UkDigiStrategy we’ll hear commitment to real change in support of people and the uses of our personal data by the state, not simply more blue-sky thinking and drinking the ‘datasharing’ Kool-Aid. We’ve been talking in the UK for far too long about getting this right.

Let’s see the government serious about making it happen. Not for government, but in the public interest, in a respectful and ethical partnership with people, and not find changes forced upon us.

No other foundation will be fit for a future in which care.data, the phenotype data, is to be the basis for an NHS so totally personalised.

If you want a longer read, read on below for my ten things in detail.

Comment welcome.

########

Hard things: The marathon version, below.
Continue reading Thoughts since #UKHC15. UK health datasharing.

Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or who has never, used their rights under the Freedom of Information Act, the government consultation on changing those laws, which closes today, is worth caring about. If you haven’t yet had your say, go and take action here >> now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Cost will constrain campaign groups from standing up for human rights. The press will be restrained in what they can ask.

mySociety has a brilliant summary. Michael Sheen spoke up, calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to improve their handling of the personal and sensitive data of 8 million children that they hold in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight? And to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld my complaint.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions were necessary – made under section 36 – when the minutes were finally released.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and to fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others, I hoped, could use this information to ask the right questions about missing meeting minutes and transparency, and to question why there was no cost-benefit business plan at all in private, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again, and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish their expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who support these changes say they are concerned about the cost of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”
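As a rough sanity check on the scale of those figures – a sketch only, using the £38,000 and £1.3bn numbers from the Staffordshire example quoted above:

```python
# FOI response cost as a share of Staffordshire County Council's annual budget,
# per the mySociety figures quoted above.
foi_cost = 38_000              # £ spent answering FOI requests in one year
annual_budget = 1_300_000_000  # £1.3bn yearly operating budget

share = foi_cost / annual_budget * 100
print(f"FOI oversight cost: {share:.4f}% of budget")  # ~0.003%
```

The fraction works out at roughly 0.003 per cent, the same order of magnitude as the 0.002 per cent Steinberg cites.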

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law, which gives the public rights in our favour and transparency into how we are governed and how tax money is spent.

How will we measure the value of FOI – and of what would be lost – if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th, go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company, after the event.

Everyone’s been talking about TalkTalk and for all the wrong reasons. Data loss and a 15-year-old combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1] rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work in the “National Cyber Security Programme” [NCSP]. [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly by not making preventative changes that were apparently needed a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government expects of commercial companies as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent, where necessary, for purposes beyond those we expect or that were explained when we submitted our data; and there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own lack of data understanding, and about what good practice is in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And trust, once given, would be undermined by changing the purposes or scope of use for which data were given – as, for example, care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this, and make suggestions to support trust such as ‘data should never be (and currently is never) released with personal identifiers‘ in The Shakespeare Review have been ignored by government.

Where our personal data are not used well by government departments themselves, they seem content to date to rely on public ignorance to get away with shoddy practices.

Practices such as not knowing who all your customers are, because they pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at commercial data use poor practices, the care.data debacle is evidence that not all its MPs or civil service understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on public use of our personal data, with others getting access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, individual, sensitive data from at least 8m children’s records, from ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand an emphasis on prevention in how our personal data are used.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance, in any of its many forms, is no less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems better to prevent a child successfully hacking it, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides. In commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government wait, and keep its fingers crossed each month, to see if our data are used safely in unsecured settings by the unknown partners data might be onwardly shared with – hoping we won’t find out and it won’t need to talk about it, or have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal, individual level data, for unknown purposes, from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

care.data: delayed or not delayed? The train wreck that is always on time

If you cancel a train does it still show up in the delayed trains statistics?

care.data plans are not delayed (just don’t ask Healthwatch)

Somerset CCG’s announcement [1] of the delay in their care.data plans came as no surprise, except perhaps to NHS England, who effectively denied it, reportedly saying work continues. [2] Both public statements may be true, but it would have been good professional practice to publicly recognise that a top-down delay affects others who are working hard on the ground to contribute to the effective rollout of the project. Confusion and delay are hard to work with. Change and technology projects run on timelines. Deadlines mean that different teams can each do their part and the whole gets done. Or not.

Healthwatch [3] has cancelled its planned public meetings. Given that one of the reasons stated in the care.data CCG selection process was support from local patient groups, including Healthwatch, this appears poor public relations. It almost wouldn’t matter, but in addition to the practicalities, the organisation and leadership are trying to prove they are trustworthy. [4]




Somerset’s statement is straightforward and says it applies to all pathfinders:

“Following a speech by Jeremy Hunt, the Secretary of State for Health this week (3-9-15), in which he outlined his vision for the future use of technology across NHS, NHS England has asked the four care.data pathfinder pilots areas in England (Leeds, Blackburn and Darwen, West Hampshire and Somerset) to temporarily pause their activities.” [Sept 4, Somerset statement]




Since I first read of the GPES IAG concerns [5], I have seen the care.data programme hurtle from one crisis to another. But this is now a train wreck. A very quiet train wreck. No one has cried out much. [6] And yet I think the project, professionals, and the public should be shouting from the top of the carriages that this programme needs help if it is ever to reach its destination.

care.data plans are not late against its business plan (there is none)

Where’s the business case? Why can’t it define deadlines that it can achieve?  In February 2015, I suggested the mentality that allows these unaccountable monster programmes to grow unchecked must die out.

I can’t even buy an Oyster card if I don’t know if there is money in my pocket. How can a programme which has already spent multi millions of pounds keep driving on without a budget? There is no transparency of what financial and non-financial benefits are to be expected to justify the cost. There is no accountable public measure of success checking it stays on track.

While it may be more comfortable for the organisation to deny problems, I do not believe it serves the public interest to hide information. This is supported by the very reason for being of the MPA process and its ‘challenge to Whitehall secrecy’ [7], which rated the care.data rollout red [8] in last year’s audit. This requires scrutiny to find solutions.

care.data plans do not need to use lessons learned (do they?)

I hope at least there are lessons learned here in the pathfinder on what not to do before the communications rollout to 60m people.  In the words of Richard Feynman, “For successful technology, reality must take precedence over public relations.”

NHS England is using the public interest test to withhold information: “the particular public interest in preserving confidential communications between NHS England and its sponsoring department [the DH].”  I do not believe this serves the public interest if it is used to hide issues and critical external opinion. The argument made is that there is “stronger public interest in maintaining the exemption where it allows the effective development of policy and operational matters on an ongoing basis.”  The Public Accounts Committee in 2013 called for early transparency and intervention which prevents the ongoing waste of “billions of pounds of taxpayers’ money” in their report into the NPfIT. [9] It showed that a lack of transparency and oversight contributed to public harm, not benefit, in that project, under the watch of the Department of Health. The report said:

“Parliament needs to be kept informed not only of what additional costs are being incurred, but also of exactly what has been delivered so far for the billions of pounds spent on the National Programme. The benefits flowing from the National Programme to date are extremely disappointing. The Department estimates £3.7 billion of benefits to March 2012, just half of the costs incurred. This saga [NPfIT] is one of the worst and most expensive contracting fiascos in the history of the public sector.”

And the Public Accounts Committee made a recommendation in 2013:

“If the Department is to deliver a paperless NHS, it needs to draw on the lessons from the National Programme and develop a clear plan, including estimates of costs and benefits and a realistic timetable.” [PAC 2013][9]

Can we see any lessons drawn on today in care.data? Or any in Jeremy Hunt’s speech, or his refusal to comment on costs for the paperless NHS plans, reported by the HSJ at NHSExpo15?

While history repeats itself and “estimates of costs and benefits and a realistic timetable” continue to be absent in the care.data programme, the only reason given by Somerset for delay is to fix the specific issue of opt out:

“The National Data Guardian for health and care, Dame Fiona Caldicott, will… provide advice on the wording for a new model of consents and opt-outs to be used by the care.data programme that is so vital for the future of the NHS. The work will be completed by January [2016]…”

Perhaps delay will buy NHS England some time to get itself on track and not only respect public choice on consent, but also deliver a data usage report to shore up trust, and tell us what benefits the programme will deliver that cannot already be delivered today (through existing means, like the CPRD for research [10]).

Perhaps.

care.data plans will only deliver benefits (if you don’t measure costs)

I’ve been told “the realisation of the benefits, which serve the public interest, is dependent on the care.data programme going ahead.” We should be able to see this programme’s costs AND benefits. It is we collectively, after all, who are paying for it, and for whom we are told the benefits are to be delivered. DH should release the business plan and all cost/benefit/savings plans. This is a reasonable thing to ask. What is there to hide?

The risk has been repeatedly documented in 2014-15 board meetings that “the project continues without an approved business case”.

The public and medical profession are directly affected by the lack of money, given by the Department of Health as the reason for reductions in service in health and social care. What are we missing out on, to deliver what benefit, that we do not already get elsewhere today?

On the pilot work continuing, the statement from NHS England reads: “The public interest is best served by a proper debate about the nature of a person’s right to opt out of data sharing and we will now have clarity on the wording for the next steps in the programme,” 

I’d like to see that ‘proper debate’ at public events. The NIB leadership avoids answering hard questions, even when asked in advance as requested. Questions such as mine go unanswered:

“How does NHS England plan to future proof trust and deliver a process of communications for the planned future changes in scope, users or uses?”

We’re expected to jump on for the benefits, but not ask about the cost.

care.data plans have no future costs (just as long as they’re unknown)

care.data isn’t only an IT infrastructure enhancement and the world’s first population-wide database of 60m primary care records. It’s a massive change platform through which the NHS England Commissioning Board will use individual-level business intelligence to reshape the health service. A massive change programme that commodifies patient confidentiality as a kick-starter for economic growth. This is often packaged together with improvements for patients and requirements for patient safety, so that explanations conflate the use of records in direct care with secondary uses.

“Without interoperable digital data, high quality effective local services cannot be delivered; nor can we achieve a transformation in patient access to new online services and ‘apps’; nor will the NHS maximise its opportunity to be a world centre in medical science and research.” [NHS England, September 1 2015] 

So who will this transformation benefit? Who and what are all its drivers? Change is expensive. It costs time and effort and needs investment.

Blackburn and Darwen’s Healthwatch appears to have received £10K for care.data engagement, as stated in its annual report. Somerset’s is less clear. We can only assume that Hampshire, expecting a go-live ‘later in 2015’, has also had costs. Were any of their patient-facing materials already printed for distribution, their ‘allocated-under-austerity’ budgets spent?

care.data is not a single destination but a long journey with a roadmap of plans for incremental new datasets and expansion of new users.

The programme should already know and be able to communicate the process behind informing the public of future changes to ensure future use will meet public expectations in advance of any change taking place. And we should know who is going to pay for that project lifetime process, and ongoing change management. I keep asking what that process will be and how it will be managed:

June 17 2015, NIB meeting at the King’s Fund Digital Conference on Health & Social Care:

[image: question submitted to the NIB panel, June 17]

September 2 2015, NIB Meeting at NHS Expo 15:

[image: question submitted to the NIB panel, September 2]

It goes unanswered time and time again, despite all the plans and roadmaps for change.

These projects are too costly to fail. They are too costly to justify only having transparency applied after the event, when forced to do so.

care.data plans are never late (just as long as there is no artificial deadline)

So back to my original question. If you cancel a train does it still show up in the delayed trains statistics? I suppose if the care.data programme claims there is no artificial deadline, it can never be late. If you stop setting measurable deadlines to deliver against, the programme can never be delayed. If there is no budget set, it can never be over it. The programme will only deliver benefits, if you never measure costs.

The programme can claim it is in the public interest for as long as we are prepared to pay with an open public purse and wait for it to be on track. Wait until data are ready to be extracted, which the notice said:

“…is thought to remain a long way off.”

All I can say to that is: I sure hope so. Right now, it’s not fit for purpose. Decisions on content and process must be arrived at first. But we also deserve to know what to expect of the long journey ahead.

On time, under budget, and in the public interest?

As long as NHS England is the body both applying and measuring the criteria, it fulfils them all.

*******

[1] Somerset CCG announces delay to care.data plans https://www.somersetlmc.co.uk/caredatapaused

[2] NHS England reply to Somerset announcement reported in Government Computing http://healthcare.governmentcomputing.com/news/ccg-caredata-pilot-work-continues-4668290

[3] Healthwatch bulletin: care.data meetings cancelled http://us7.campaign-archive1.com/?u=16b067dc44422096602892350&id=5dbdfc924c

[4] Building public trust: after the NIB public engagement in Bristol https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[5] GPES IAG http://www.hscic.gov.uk/media/12911/GPES-IAG-Minutes-12-September-2013/pdf/GPES_IAG_Minutes_12.09.13.pdf

[6] The Register – Right, opt out everybody! hated care.data programme paused again http://www.theregister.co.uk/2015/09/08/hated_caredata_paused_again_opt_out/

[7] Pulse Today care.data MPA rating http://www.pulsetoday.co.uk/your-practice/practice-topics/it/caredata-looks-unachievable-says-whitehall-watchdog/20010381.article#.VfMXYlbtiyM

[8] Major Projects Authority https://engage.cabinetoffice.gov.uk/major-projects-authority/

[9] The PAC 2013 ttp://www.parliament.uk/business/committees/committees-a-z/commons-select/public-accounts-committee/news/npfit-report/

[10] Clinical Practice Research Datalink (CPRD)

***

image source: http://glaconservatives.co.uk/news/london-commuters-owed-56million-in-unclaimed-refunds-by-rail-operators/

 

Building Public Trust [5]: Future solutions for health data sharing in care.data

This wraps up my series of thoughts on ‘Building Public Trust’ since the NIB Bristol meeting on July 24th.

It has looked at why organisations should stop chasing public trust and instead become trustworthy [part 1]; what behaviours make an organisation trustworthy [part 2]; why fixing the Type 2 opt out is a vital first step [part 3]; and why being blinded by ‘the benefits’ is not the answer [part 4], whereas giving balanced and fair explanations of the programme’s purposes, commissioning and research, is beneficial to communicate.

So I want to wrap up by suggesting how communications can be improved in content and delivery. Some ideas will challenge the current approach.

Here in part five, Future solutions, I suggest why aiming to “Build Public Trust” through a new communications approach may work better for the public than past attempts. For communications on care.data, I’ll propose:

  • Review content: what would ethical, accurate content look like?
  • Strengthen relationships for delivery: don’t attempt to rebuild trust where there is now none, but strengthen the channels the public already views as trustworthy
  • Rethink why you communicate and plan for when: all communications need to be delivered through a conversation, with real listening and action based upon it. Equal priority must be given to a communications plan for today and for the future. It must set out a mechanism for future change communications now, before the pathfinders begin
  • Since writing this, the Leeds area CCGs have released their ‘data sharing’ comms leaflet. I have reviewed this in detail and give my opinions as a case study.

NIB workstream 4 underpins the NHS digital future and aims to build and sustain public trust, delivering plans for consent-based information sharing and assurance of safeguards. It focuses on four areas: governance and oversight, project risks, consent and genomics:

“The work will begin in 2015 and is expected to include deliberative groups to discuss complex issues and engagement events, as well as use of existing organisations and ways to listen. There will also be a need to listen to professional audiences.”  [NIB work stream 4] [ref 1]

Today’s starting point for trust, the trust that enables two-way communication, could hardly be worse, with both professional and public audiences. Communications are packaged in mistrust:

“Relations between the doctors’ union and Health Secretary Jeremy Hunt hit a new low following his announcement in July that he was prepared to impose seven-day working on hospital doctors in England.” [BBC news, Aug 15, 2015]

There appears to be divided opinion between politicians and civil servants.

Right now, the Department of Health seems to be sabotaging its own plans for success at every turn.

What reason can there be for denying debate in the public domain of the very plans it says are the life blood of the savings central to the NHS future?

Has the Department learned nothing from the loss of public and professional trust in 2014?

And as regards the public in engagement work, Hetan Shah, executive director of the Royal Statistical Society, said in 2014: “Our research shows a ‘data trust deficit’. In this data rich world, companies and government have to earn citizens’ trust in how they manage and use data – and those that get it wrong will pay the price.” [RSS Data Trust Deficit, lessons for policymakers, 2014] [2]

Where do the NIB work stream discussions want to reach by 2020?

“The emergence of genomics requires a conversation about what kind of consent is appropriate by 2020. The work stream will investigate a strand of work to be led by an ethicist.” [NIB work stream 4]

Why is genomics here in workstream 4, when data sharing for genomics is with active consent from volunteers? Why will a strand of work be led by an ethicist for this, and not for other work strands? Is there a gap in how their consent is managed today, or in how consent is to be handled for genomics in future? It seems to me there is a gap between what is planned and what the public is being told here. It is high time for a public debate on what future today’s population-wide data sharing programme is building. Good communication must ensure there are no surprises.

The words I underlined from the work stream 4 paper highlight the importance of communication: to listen and to have a conversation. Despite all the engagement work of 2014, I feel that is still to happen. As one participant summed up later, “They seem hell bent on going ahead. I know they listened, but what did they hear?” [3]

care.data pathfinder practices are apparently ready to roll out communications materials: “Extraction is likely to take place between September and November depending on how fair processing testing communications was conducted” [Blackburn and Darwen HW]

So what will patient facing materials look like in content? How will they be rolled out?

Are pathfinder communications more robust than 2014 materials?

I hope the creatives will also think carefully about the intent of the communications to be delivered. Is it to fully and ethically inform patients about their choice whether to accept or opt out of changes in their data access, management, use and oversight? Or is the programme guidance to minimise the opt out numbers?

The participants are not signing up to a one-time, single-use marketing campaign, but to a lifetime of data use by third parties. Third parties whose roles and purposes remain loosely defined.

It is important when balancing this decision not to forget that data that is available but not used wisely could fail to mitigate risk; for example, in identifying pharmaceutical harms.

At the same time to collect all data for all purposes under that ‘patient safety and quality’ umbrella theme is simplistic, and lends itself in some ways, to lazy communications.

Patients must also feel free and able to make an informed decision without coercion, which includes not making those who opt out feel guilty.

The wording used in the past was weighted towards the organisation’s preference.  The very concept of “data sharing” is weighted positively towards the organisation. Even though in reality the default is for data to be taken by the organisation, not donated by the citizen. In other areas of life, this is recognised as an unwilling position for the citizen to be in.

At the moment I feel that the scope of purposes, both today and in future, is not clearly enough defined in communications or plans for me personally to be able to trust them. Withholding information about how digital plans will fit into the broader NHS landscape and what data sharing will mean beyond 2020 appears, rightly or wrongly, suspicious. Department of Health, what are you thinking?

What the organisation says it will do, it must do and be seen to do, to be demonstrably trustworthy.

This workstream carries two important strands of governance and oversight which now need to be seen to happen: implementing the statutory footing of the National Data Guardian, which has been talked about since October 2014 and promised ‘at the earliest opportunity’ yet has been rather long in coming, and ‘a whole system’ that respects patient choice. What will this look like, and how will it take into account the granular level of choices asked for at care.data listening events through 2014?

“By April 2016 NIB will publish, in partnership with civil society and patient leaders, a roadmap for moving to a whole-system, consent-based approach, which respects citizens’ preferences and objections about how their personal and confidential data is used, with the goal of implementing that approach by December 2020.”

‘By December 2020’ is still some time away, yet the care.data pathfinders roll on now regardless. The proof that what was said about data use is actually what happens to data, that what is communicated is trustworthy, depends on a system that can record and share consent decisions, “and can provide information on the use to which an individual’s data has been put. Over the longer term, digital solutions will be developed that automate as far as possible these processes.”

Until then what will underpin trust to show that what is communicated is done, in the short term?

Future proofing Communications must start now

Since 2013 the NHS England care.data approach appeared to want a quick data grab without long term future-proofed plans. Like the hook-up app approach to dating.

To enable the NIB 2020 plans and beyond, to safeguard research in the public interest, all communications must shape a trusted long term relationship.

To ensure public trust, communications content and delivery can only come after changes. Which is again why focusing only on communicating the benefits, without discussing the balance of risk, does not work. That’s what the 2014 patient-facing communications tried.

In 2014 there were challenges on communications that were asked but not answered: how to reach those who are digitally excluded; how to reach those for whom reading text is a challenge; and how to decide who the target audience is, considering people with delegated authority, young and old, as well as those who move in and out of GP care throughout their lives, such as some military personnel. Has that changed?

In February 2014 Health Select Committee member Sarah Wollaston, now Chair, said: “There are very serious underlying problems here that need to be addressed.”

If you change nothing, you can expect nothing to change in public and professional feeling about the programme. Communications in 2015 cannot simply revamp the layout and packaging. There must be a change in content and in the support given in its delivery. Change means that you need to stop doing some things and start doing others.

In summary for future communications to support trust, I suggest:

1. STOP: delivering content that is biased towards what the organisation wants to achieve, often with a focus on the fair processing requirement, under a coercive veil of patient safety and research

START: communicating with an entirely ethics-based approach, reconsidering all patient data held at HSCIC, and asking whether omitting ‘commercial use’, omitting the balanced risks identified in the privacy impact assessment, and stating ‘your name is not included’ is right.

2. STOP: and consider all the releases of health data held by HSCIC again, deciding for each type whether they will deliver public confidence that your organisations are trustworthy.

START: communicate publicly which commercial companies, re-users and back office functions would no longer be legally eligible to receive data, and why. Identify organisations that received data in the past but will not in future.

3. STOP: the Department of Health and NHS England undermining trust in their own leadership through public communications that voice opposition to medical professional bodies. Doctors are trusted much more than politicians.

START: strengthen the public-GP relationship that is already well trusted. Strengthen the GP position that will in turn support the organisational-trust-chain that you need to sustain public support. 

4. STOP: delaying the legislative changes needed on the Data Guardian and penalties for data misuse

START: implement them and explain them clearly in Parliament and the press

5. STOP: rushing through short-term shortcuts to get ‘some’ data while ignoring the listening exercise in which the public asked for choice.

START: design a thorough granular consent model fit for the 21st century and beyond, and explain to the public what it will offer; the buy-in for bona fide research will be much greater (be prepared to define ‘research’!)

6. STOP: saying that future practices have been changed and that security and uses are now more trustworthy than in the past. Don’t rush to extract data until you can prove you are trustworthy.

START: demonstrate to individuals who receives their data in future, through a data use report. Who future users are can only be shown through a demonstrable tool, so that your word can be seen to be relied upon in practice. This will, I am convinced, lower the opt out rate.

Point 6 is apparently work-in-progress. [p58]

7. STOP: rolling out the current communications approach without any public position on which changes will trigger notification before a new purpose or user of our data is introduced in future

START: design a thorough change communications model fit for the 21st century and beyond, and tell the public in THIS round of communications which changes of user or purpose will trigger a notification enabling them to opt out BEFORE a future change. For example, in a fictional future in which the government decided that the population-wide database should be further commercialised ‘for the purposes of health’, linked to the NHSBT blood donor registry and sold to genomic research companies, how would I as a donor be told BEFORE the event?

There are still unknowns in content and future scope that make communications difficult. If you don’t know what you’re saying, how to say it is hard. But what is certain is that future changes to the programme are planned, and how to communicate these to the public and professionals must be designed now, so that what we are signed up for today stays what we signed up for.

In delivering messages about data sharing and the broader NHS, the DH and NHS England should consider carefully their relationships and behaviours; all communication becomes relevant to trust.

Solutions cannot be thought of only in terms of tools, of what can be imposed on people, but of what can be achieved with people.

That’s people from the public and professionals and the programme working with the same understanding of the plans together, in a trusted long term relationship.

For more detail including my case study comments on the Leeds area CCGs comms leaflet, continue reading below.

Thanks for sharing in discussions of ideas in my five part post on Building public trust – a New Approach. Comments welcome.

Continue reading Building Public Trust [5]: Future solutions for health data sharing in care.data

Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

Communicating the benefits, care.data’s response to the failed communications of spring 2014, has failed to deliver public trust. Here’s why:

To focus on the benefits is a shortcut for avoiding the real issues

Talking about benefits is about telling people what the organisation wants to tell them. This fails to address what the public and professionals want to know. The result is not communication, but a PR exercise.

Talking about benefits in response to the failed communications in spring 2014 and failing to address criticism since, ignores concerns that public and professionals raised at macro and micro level.  It appears disingenuous about real engagement despite saying ‘we’re listening’ and seems uncaring.

Talking about only the benefits does not provide any solution to demonstrably outweigh the potential risk of individual and public health harm through loss of trust in the confidential GP relationship, or data inaccuracy, or loss, and by ignoring these, seems unrealistic.

Talking about short term benefits and not long term solutions [to the broken opt out, long term security, long term scope change of uses and users and how those will be communicated] does not demonstrate competency or reliability.

Talking about only the benefits of commissioning and research for the merged dataset, CES, doesn’t mention all the secondary uses to which all HSCIC patient-level health data are put [those reflected in the Type 2 opt out], including commercial re-use and the National Back Office: “2073 releases made from the National Back Office between April 2013 and December 2013. This includes 313 releases to police forces, 1531 to the Home Office and 229 to the National Crime Agency.” [HSCIC, July 2, 2014]

This use of hospital records and other secondary data by the back office, without openly telling the public, does not feel ethical or transparent.

Another example is the past patient communications that expressly said, ‘we do not collect name’, the intent of which would appear to be to assure patients of anonymity, without saying that name is already stored at HSCIC on the Personal Demographics Service, or that name is not needed to be identifiable.

We hear a lot about transparency. But is transparent the same as fully accurate, complete and honest? Honest about the intended outcomes of the programme. Honest about all the uses to which health data are put. Honest about potential future scope changes, and those already planned.

Being completely truthful in communications is fundamental to future-proofing trust in the programme.

NHS England’s care.data programme, through its focus on ‘the benefits’, lacks balance and appears disingenuous, uninterested and unrealistic, lacking in reliability, competency and honesty. Through these actions it does not demonstrate that the organisation is trustworthy. This could be changed.

care.data fundamentally got it wrong with the intention not to communicate the programme at all. It got it wrong in the tool and tone of communications in the patient leaflet. There is a chance to get it right now, if the organisation would only stop the focus on communicating the benefits.

I’m going to step through with a couple of examples why to-date, some communications on care.data and use of NHS data are not conducive to trust.

Communication designed to ‘future-proof’ an ongoing relationship and trust must be by design, not afterthought.

Communications need to start addressing the changes that are happening, how they make people feel, and the changes that create concern in the public and professionals – not the goals that the organisation has.

Sound familiar? Communications to date have been flawed in the same way that the concept of ‘building trust’ has been flawed. It has aimed to achieve the wrong thing and with the wrong audience.

Communications in care.data need to stop focussing on what the organisation wants from the public and professionals – the benefits it sees in getting data – and instead address, firstly at a macro level, why the change is necessary and why the organisation should be trusted to bring it about.

When explaining benefits there are clearly positives to be had from using primary and secondary data in the public interest. But what benefits will be delivered in care.data that are not already on offer today?

Why, if commissioning is done today with less identifiable data, can there be no alternative to care.data’s level of identifiable data extraction? Why, if the CPRD already offers research in both primary and secondary care, will care.data offer better research possibilities? And secondly, at a micro level, communications must address the questions individuals asked up and down the country in 2014.

What’s missing and possible to be done?

  1. aim to meet genuine ongoing communication needs, not just legal data protection fair processing tick-boxes
  2. adopt an organisational attitude that encourages people to ask what they each want to know at macro and micro level: why the programme at all, and what’s in it for me? What’s new, and what benefit differs from the status quo? This is only possible if you will answer what is asked.
  3. deliver robust explanations of why the macro and micro benefits demonstrably outweigh the risk of potential individual harms
  4. demonstrate reliability, honesty and competency, and that you are trustworthy
  5. agree how scope changes will trigger communication, to ‘future-proof’ an ongoing relationship and trust by design.

As the NIB work stream on Public Trust says, “This is not merely a technical exercise to counter negative media attention; substantial change and long-term work is needed to deliver the benefits of data use.”

If they’re serious about that long term work, then why continue to roll out pathfinder communications based on a model that doesn’t work, with an opt out that doesn’t work? Communications isn’t an afterthought to public trust. It’s key.

If you’re interested in details and my proposals for success in communications I’ve outlined in depth below:

  • Why Communicate Changes at all?
  • What is change in care.data about?
  • Is NHS England being honest about why this is hard?
  • Communicate the Benefits is not working
  • A mock case study in why ‘communicate the benefits’ will fail
  • Long term trust needs a long term communications solution
  • How a new model for NHS care.data Communication could deliver

Continue reading Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

Let’s assume the question of public trust is as important to those behind data sharing plans in the NHS [1] as they say it is. That the success of the care.data programme today and as a result, the very future of the NHS depends upon it.

“Without the care.data programme, the health service will not have a future,” said Tim Kelsey, national director for patients and information, NHS England. [12]

And let’s assume we accept that public trust is not about the public, but about the organisation being trustworthy.[2]

The next step is to ask, how trustworthy is the programme and organisation behind care.data? And where and how do they start to build?

The table discussion on “Building Public Trust in Data Sharing” [3] considered “what is the current situation?” and “why?”

What’s the current situation? On trust, public opinion is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are low for the state and government, but higher for GPs. It is therefore important that the medical profession themselves trust the programme in principle and practice. They are, after all, the care.data point of contact for patients.

The current status of the rollout, according to news reports, is that pathfinder practices are preparing to roll out communications [4] in the next few weeks. Engagement is reportedly being undertaken ‘over the summer months’.

Understanding both public trust and the current starting point matters as the rollout is moving forwards and as leading charity and research organisation experts said: “Above all, patients, public and healthcare professionals must understand and trust the system. Building that trust is fundamental. We believe information from patient records has huge potential to save and improve lives but privacy concerns must be taken seriously. The stakes are too high to risk any further mistakes.” [The Guardian Letters, July 27, 2015]

Here are three steps I feel could be addressed in the short term, to start to demonstrate why the public and professionals should trust both organisation and process.

What is missing?

1. Opt out: The type 2 opt out does not work. [5]  

2a. Professional voices called for answers and change: as mentioned in my previous summary, various bodies called for change, including the BMA, whose policy [6] remains that care.data should be on a patient opt-in basis.

2b. Public voices called for answers and change: care.data’s own listening event feedback [7] concluded there was much more than ‘communicate the benefits’ that needed doing. Much is missing, such as questions on confusion between the SCR and care.data, legislation and concern over controlling its future change, GPs’ concerns about their ethical stance, the Data Guardian’s statutory footing, correction of mistakes, future funding and more.
How are open questions being addressed, if at all?

3. A single clear point of ownership on data sharing and public trust communications: is this now the NIB, the NHS England Patients and Information Directorate, or the DH? Who owns care.data now? It’s hard to ask questions if you don’t know where to go, and the boards seem to have stopped any public communications. Why? The public needs clarity of organisational oversight.

What’s the Solution? 

1. Opt out: The type 2 opt out does not work. See the post graphic; the public wanted more clarity over opt out in 2014, so this needs explaining clearly. >> Solution: follows below, from a detailed conversation with Mr. Kelsey.

2. Answers to professional opinions: the Caldicott panel raised 27 questions in areas of concern in its report. [8] There has not yet been any response made available in the public domain by NHS England to address them. Ditto the APPG report, the BMA LMC vote, and others. >> Solution: publish the responses to these concerns and demonstrate what is being done to address them.

2b. Fill in the lack of transparency: there is no visibility of any care.data programme board meeting minutes or materials from 2015. In eight months, nothing has been published. The board’s 2014 proposal for transparency appears to have come to nothing. Why? The minutes from June-October 2014 are also missing entirely, and the October-December 2014 materials published were heavily redacted. There is a care.data advisory board, which seems to have had little public visibility recently either. >> Solution: the care.data programme business case must be detailed and open to debate in the public domain by professionals and public. Scrutiny of its current costs and time requirements, and its ongoing future financial implications, should be welcomed at national, regional (CCG) and local (GP) level. Proactively publishing creates demonstrable reasons why both the organisation and the plans are trustworthy. Refusing this without clear justification seems counterproductive, which is why I have challenged it in the public interest. [10]

3. Address public and professional confusion of ownership: Since data sharing and public trust are two key components of the care.data programme, it seems to come under the NIB umbrella, but there is a care.data programme board [9] of its own with a care.data Senior Responsible Owner and Programme Director. >> Solution: an overview of where all the different nationally driven NHS initiatives fit together and their owners would be helpful.

[Anyone got an interactive Gantt chart for all national level driven NHS initiatives?]

This would also help the public and professionals see how and why different initiatives have co-dependencies, and could be a tool to reduce the ‘them and us’ mentality. It would also be useful for modelling ‘what if’ scenarios and reality checks on 5YFV roadmaps: for example, if care.data pushes back six months, what else is delayed?

If the public can understand how things fit together it is more likely to invite questions, and an engaged public is more likely to be a supportive public. Criticism can be quashed if it’s incorrect. If it is justified criticism, then act on it.

Yes, these are hard decisions. Yes, to delay again would be awkward. If it were the right decision, would it be worse to ignore it and carry on regardless? Yes.

The most important of the three steps in detail: a conversation with Mr. Kelsey on Type 2 opt out. What’s the Solution?

We’re told “it’s complicated.” I’d say “it’s simple.” Here’s why.

At the table of about fifteen participants at the Bristol NIB event, Mr. Kelsey spoke very candidly and in detail about consent and the opt out.

On the differences between consent in direct care and other uses, he first explained the assumption in direct care: doctors and nurses are allowed to assume that you are happy to have your data shared, without asking you specifically. But he said, “beyond that boundary, for any other purpose, that is not a medical purpose in law, they have to ask you first.”

He went on to explain that what has changed the whole dynamic of the conversation is the fact that the current Secretary of State decided that when your data is being shared for purposes other than your direct care, you not only have the right to be asked, but if you say you don’t want it to be shared, that decision has to be respected by your clinician.

He said: “So one of the reasons we’re in this rather complex situation now, is because if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Therefore, I asked him where the public stands with that now, because at the moment there are around 700,000 people who we know said no in spring 2014.

Simply: They opted out of data used for secondary purposes, and HSCIC continues to share their data.

“Is anything more fundamentally damaging to trust, than feeling lied to?”

Mr. Kelsey told the table there is a future solution, but asked us not to tweet when. I’m not sure why; it was mid-conversation and I didn’t want to interrupt:

“we haven’t yet been able to respect that preference, because technically the Information Centre doesn’t have the digital capability to actually respect it.”

He went on to say that they have hundreds of different databases, and at the moment it takes 24 hours for a single person’s opt out to be respected across all of them: a person manually has to enter a field on each database to record that someone has opted out. He asked that the hoped-for timing not be tweeted, but explained that all the current historic objections which have been registered will be respected at a future date.

One of the other attendees expressed surprise that GP practices hadn’t been informed of that, having gathered consent choices in 2014 and suggested the dissent code could be extracted now.

The table discussion then took a different turn with other attendee questions, so I’m going to ask here what I would have asked next in response to his statement, “if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Where is the logic to proceed with pathfinder communications?

What was said has not been done and you therefore appear untrustworthy.

If there is to be a future solution, it will need to be communicated (again).

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

There needs to be demonstrable action that what the org said it would do, the org did. Respecting patient choice is not an optional extra. It is central in all current communications. It must therefore be genuine.

Knowing that what was promised was not respected might mean millions of people choose to opt out who would not do so if the process worked as communicated.

Before then, any public communications in Blackburn with Darwen, Somerset, Hampshire and Leeds surely don’t make sense.

Either the pathfinders will test, ahead of a national rollout, the same communications that are to be rolled out nationally, or they will not. Either those communications will explain the secondary uses opt out, or they will not. Either they will explain the opt out as it is today [type 2 not working], or as they hope it will be in future [working]. Not all of these can be true.

People who opt out on the basis of a process broken by a technical flaw are unlikely ever to opt back in again. If it works to start with, they might choose to stay in.

Or will the communications roll out in pathfinders with a forward-looking promise, repeating what was promised but has not yet been done? We will respect your choice (and this time we really mean it)? Would public trust survive that level of uncertainty? I don’t think so.

There also needs to be demonstrable action in future that what the org said it would do, the org did. So the use audit report, and how any future changes will be communicated, both seem basic principles to clarify for the current rollout as well.

So what’s missing and what’s the solution on opt out?

We’re told “it’s complicated.” I say “it’s simple.” The promised opt out must work before moving forward with anything else. If I’m wrong, then let’s get the communications materials out for broad review, to see how they accommodate this and the future re-communication of a second process.

There must be a budgeted and planned future change communication process.

So how trustworthy is the programme and organisation behind care.data?

Public opinion on trust levels is measurable. The Royal Statistical Society’s Data Trust Deficit research makes the starting points clear. The current position must address the opt out issue before anything else. Don’t say one thing, and do another.

To score more highly on the ‘trustworthy scale’ there must be demonstrable action, not simply more communications.

Behaviours need to change and be modelled in practice, with a focus on people, not on tools and tech solutions, which make patients feel less important to the organisations than their desire to ‘enable data sharing’.

Actions need to demonstrate they are ethical and robust for a 21st-century solution.

Policies, practical steps and behaviours all play vital roles in demonstrating that the organisations and people behind care.data are trustworthy.

These three suggestions are short term; by that I mean six months. Beyond that, further steps need to be taken to be demonstrably trustworthy in the longer term and on an ongoing basis.

Right now, do I trust that the physical security of HSCIC is robust? Yes.

Do I trust that the policies in the programme would not pass my data in future to third-party commercial pharma companies? No.
Do I believe that, to enable commissioning, my fully identifiable confidential health records should be stored indefinitely with a third party? No.
Do I trust that the programme would not potentially pass my data to non-health organisations, such as the police or Home Office? No.
Do I trust the programme to tell me if it changes the purposes from those outlined now? No.

I am open to being convinced.

*****

What is missing from communications to date, and looks unlikely to be included in the current round, and why that matters, I address in my next post, Building Public Trust [4]: why ‘Communicate the Benefits’ won’t work for care.data. Why a future change management model of consent needs to be approached now, and not after the pilot, I wrap up in [5]: Future solutions.


Building Public Trust [2]: a detailed approach to understanding Public Trust in data sharing

Enabling public trust in data sharing is not about ‘communicating benefits’. For those interested in the nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing follow on from my summary after the NIB Bristol event of 24/7/15.

Trust is an important if invisible currency used in the two-way transactions between an organisation and people.

So far there have been many interactions and listening events, but much of what professionals and the public called for remains undone, and public trust in the programme remains unchanged since 2014.

If you accept that it is not public trust that needs to be built, but the tangible trustworthiness of an organisation, then you should also ask what needs to be done by the organisation to make that demonstrable change.

What’s today’s position on Public Trust in data storage and use?

Trust in the data sharing process is layered and dependent on a number of factors. Mostly [based on polls and public event feedback from 2014] “who will access my data and what will they use it for?”

I’m going to look more closely below at planned purposes: research and commissioning.

It’s also important to remember that trust is not homogeneous. Trust is nuanced even within the single relationship between one individual and another. Trust, like consent, is stratified – you may trust the Post Office to deliver a letter or postcard, but sign up to recorded delivery for something valuable.

So for example, when it comes to secondary uses data sharing, I might trust HSCIC with storing and using my health records for anonymous statistics – for analysis of immunisation and illness patterns, say. But as long as they continue to share with the Home Office, police or other loosely defined third parties [5], do I want them to have fully identifiable data at all?

Those bodies have their own public trust issues at an all time low.

Mixing the legitimate users of health data with these back-office punitive uses will invite opt outs from some people who would otherwise not object. Some of the very groups who most need health and social care understanding, research and care will be the very groups who opt out if there is a possibility of police and Home Office access by the back door. Telling traveller communities what benefits care.data will bring them is wasted effort when they see NHS health data as a police-accessible register. I know. I’ve talked to some about it.

That position on data storage and use should be reconsidered if NHS England is serious that this is about health and for the benefit of individuals and communities’ well being.

What has HSCIC changed to demonstrate that it is trustworthy?

A new physical secure setting is being built that will enable researchers to view research data but not take raw data away.

That is something they can control, and have changed, and it demonstrates that they take the public seriously, so we reciprocate.

That is great – demonstrable change by the organisation, inviting change in the public.

That’s practical, so what can be done on policy by NHS England/DH?

What else should be done to demonstrate policy is trustworthy?

Act on what the public and professionals asked for in 2014. [8]

Right now, in public communications, it feels as though the only kind of relationship the leadership wants is a one-night stand.

It’s all about what the programme wants. Minimise the objections, get the data, and sneak out. Even when its leaders talk about some sort of ongoing consent model, the focus is still about ‘how to enable sharing data.’

This focus is the wrong one. If you want to encourage people to share, they need to know why: what’s in it for them, and why you want it. What the data collection is for is still important to explain – and, if you are doing it fairly, to explain again each time the scope changes.

Remember. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is. 

What is the policy for the future of primary care research?

The CPRD already enables primary care GP data to be linked with secondary data for research. In fact, it already links more items from GP-held data than current care.data plans propose to extract. So what benefit will care.data offer research that is not already available today?

Simply having ever more data, stored in more places will not make us wiser. Before it’s collected repeatedly, it is right to question why.

What do we have collected already? How is it used? Where are the gaps in what we want to achieve through the knowledge we could gain? It’s NOT simply about filling in gaps in the data we could gather. Understand the purposes and what will be gained, to see if it’s worth the effort. Prioritise. ‘Collect it all’ is not a solution.

I had thought that the types of data to be collected in care.data, and how they differ from direct care data, were clear. But the Bristol NIB meeting demonstrated a wide range of understanding among NHS and CCG staff, Local Authority staff, IT staff, IG professionals, data providers and other third parties. Data for secondary purposes are not to be conflated with direct care.

But that’s not what care.data sharing is about. So where to start with public trust, asked the NIB Bristol #health2020 meeting?

Do you ignore the starting point or tailor your approach to it?

“The NHS is at a crossroads and needs to change and improve as it moves forward. That was the message from NHS England’s Chief Executive Simon Stevens as a Five Year Forward View for the NHS was launched.”  [1] [NHS England, Oct 2014]

As the public is told over and over again that change is vital to the health of a sustainable NHS, a parallel public debate rages over whether the policy-making organisations behind the NHS – the commissioning body NHS England, the Department of Health and the Cabinet Office – are serious about the survival of universal health and care provision, and about supporting its clinicians.

It is against this backdrop, and under the premise that obtaining patient data for centralised secondary uses is do or die for the NHS, that the NIB #health2020 has set out [2] work stream 4: “Build and sustain public trust: Deliver roadmap to consent based information sharing and assurance of safeguards”

“Without the care.data programme, the health service will not have a future, said Tim Kelsey, national director for patients and information, NHS England.” [3]

 

Polls say [A] nearly all institutions suffer from a ‘trust in data deficit’. Trust in them to use data appropriately is lower than trust in the organisation generally.

Public trust in what the Prime Minister says on health is low.

Trust in the Secretary of State for Health is possibly at an all time low, with: “a bitter divide, a growing rift between the Secretary of State for Health and the medical profession.” [New Statesman, July 2015]

This matters. care.data needs the support of professionals and public.

ADRN research showed multiple contributing factors: “Participants were also worried about personal data being leaked, lost, shared or sold by government departments to third parties, particularly commercial companies. Low trust in government more generally seemed to be driving these views.” [Dialogue on data]

It was interesting to see all the same issues as raised by the public in care.data listening events, asked from the opposite perspective by data users.

But it was frustrating to sit at the Bristol NIB #health2020 event and discuss questions around the same data sharing issues already discussed at care.data events over the last 18 months.

Nothing substantial has changed other than HSCIC’s physical security for data storage.

It is frustrating knowing that these change and communications issues will keep coming back again and again if not addressed.

Personally, I’m starting to lose trust that there is any real intention of change, if senior leadership is unwilling to address this properly and change themselves.

To see a change in Public Trust do what the public asked to see change: On Choice

At every care.data meeting I attended in 2014, people asked for choice.

They asked for boundaries between the purposes of data uses, real choice.

Willingness for their information to be used by academic researchers in the public interest does not equate to being willing for it to be used by a pharmaceutical company for their own market research and profit.

The public understand these separations well. To say they do not, underestimates people and does not reflect public feeling. Anyone attending 2014 care.data events, has heard many people discuss this. They want a granular consent model.

This would offer a red line between how data are used for what purposes.

Of the data-sharing organisations today, some are trusted and others are not. A granular consent approach would offer a red line over who gets access to data.

This choice of selective use would encourage fewer people to opt out of all purposes, allowing more data to be available for research, for example.

To see a change in Public Trust do what the public asked to see: Explain your purposes more robustly

Primarily this data is to be used and kept indefinitely for commissioning purposes. Research was not included as a purpose in the planned care.data specifications for well over a year [until after the research community’s outcry].

Yet specifically on commissioning, the Caldicott recommendations [3] were very clear: commissioning purposes were insufficient and illegal grounds for sharing fully identifiable data – a position NHS England’s Commissioning Board opposed:

“The NHS Commissioning Board suggested that the use of personal confidential data for commissioning purposes would be legitimate because it would form part of a ‘consent deal’ between the NHS and service users. The Review Panel does not support such a proposition. There is no evidence that the public is more likely to trust commissioners to handle personal confidential data than other groups of professionals who have learned how to work within the existing law.”

NHS England seems unwilling to change this position, despite the professional bodies’ and the public’s opposition to sharing fully identifiable data for commissioning purposes [care.data listening events 2014]. Is it any wonder that they keep hitting the same barrier? More people don’t want that to happen than want it. Something’s gotta give.

See the GPES Customer Requirements specification from March 2013, v2.1, which states on page 11: “…for commissioning purposes, it is important to understand activity undertaken (or not undertaken) in all care settings. The “delta load” approach (by which only new events are uploaded) requires such data to be retained, to enable subsequent linkage.”

The public has asked for red lines to differentiate between the purposes of data uses. NHS England and Department of Health policy seems unwilling to draw them. Why?

To see a change in Public Trust do what the public asked to see: Red lines on policy of commercial use – and its impact on opt out

The public has asked for red lines outlawing commercial exploitation of their data. Though it was said this had changed, in practice the change is hard to see. Department of Health policy seems unwilling to be clear, because the Care Act 2014 purposes remained loose. Why?

As second best, the public has asked for choice not to have their data used at all for secondary purposes and were offered an opt out.

The NHS England leaflet, the Department of Health and the Secretary of State publicly promised this, but it has not been implemented, and to date no public announcement has been made on when it will be respected. Why?

Trust does not exist in a vacuum.  What you say and what you actually do, matter. Policy and practice are co-dependent. Public trust depends on your organisations being trustworthy.

Creating public trust is not the government, the DH or NIB’s task ahead. They must instead focus on improving their own competency, honesty and reliability and through those, they will demonstrate that they can be trusted.

That the secondary purposes opt out has not been respected does not demonstrate those qualities.

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

How will they do that?

Let the DH/NHS England and organisations in policy and practice address what they themselves will stop and start doing to bring about change in their own actions and behaviours.

Communications change request: start by addressing the current position, NOT what the change will bring. You must move people along the curve, not dump them with a fait accompli and wonder why the reaction is so dire.

[Image: the change curve]

Vital for this is the current opt out; what was promised and what was done.

The secondary uses opt out must be implemented with urgency.

To see a change in Public Trust you need to take action. The programme needs to do what the public asked to see changed: granular consent, commercial use, and defined purposes.

And to gather suggested actions, start asking the right questions.

Not ‘how do we rebuild public trust?’ but “how can we demonstrate that we are trustworthy to the public?”

1. How can a [data-sharing] org demonstrate it is trustworthy?
2. Why do people feel confident their trust is well placed?
3. Why do clinical professionals feel confident in any org?
4. What would harm the organisational-trust-chain in future?
5. How will the org-trust-chain be positively maintained in future?
6. What opportunities will be missed if that does not happen?
(identify value)

Yes, the concepts are close, but how it is worded defines what is done.

These apparently small differences make all the difference in how people provide you with ideas, and how you harness them into real change and improvement.

Only then can you start understanding why “communicating the benefits” has not worked, and how that should shape future communications materials.

From this you will find it much easier to target actual tasks, and short- and long-term doable solutions, than an open discussion will deliver. Doing should include thinking and attitudes as well as actions.

This will lead to communications messages that are concrete, not woolly. More about that in the next posts.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

This is Part two: a New Approach is needed to understanding Public Trust. For those interested in a detailed approach on trust: what practical and policy steps influence trust, on research and commissioning. Trust is not homogeneous; it is nuanced even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from. What behaviours influence trust, and how can we begin to see them demonstrated? Mr. Kelsey discusses consent and opt out. Fixing what has already been communicated is vital before new communications are rolled out. To tailor the content of public communications, for public trust and credibility, the programme must be clear what is missing and what needs to be filled in. #Health2020 Bristol NIB meeting.

Part four: “Communicate the Benefits” won’t work – how communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and possible fixes.

Part five: Future solutions – why a new approach may work better for future trust: not to attempt to rebuild trust where there is now none, but to strengthen what is already trusted and fix today’s flawed behaviours – honesty and reliability – that are vital to future-proofing public trust.

 

####

References:

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] Workstream 4: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/442829/Work_Stream_4.pdf

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

[11] Coin Street, care.data advisory meeting, September 6th 2014: https://storify.com/ruth_beattie/care-data-advisory-group-open-meeting-6th-septembe

[12] Public questions unanswered: https://jenpersson.com/pathfinder/