
DeepMind or DeepMined? NHS public data, engagement and regulation repackaged

A duty of confidentiality and the regulation of medical records are as old as the hills. Public engagement on attitudes to this, in the context of the NHS, has been done and published by established social science and health organisations in the last three years. So why is Google DeepMind (GDM) talking about it as if it’s something new? What might assumed consent NHS-wide mean in this new context of engagement? Given the side effects for public health and medical ethics of a step-change towards assumed consent in a commercial product environment, is this ‘don’t be evil’ shift to ‘do no harm’ good enough? Has regulation failed patients?
My view from the GDM patient and public event, September 20.

Involving public and patients

Around a hundred participants joined the Google DeepMind patient and public event in September. Paul Wicks gave his view of it afterwards in the BMJ, and rightly started with the fact that the event was held in the aftermath of some difficult questions.

Surprisingly, none were addressed in the event presentations. No one mentioned data processing failings, the hospital Trust’s duty of confidentiality, or criticisms in the press earlier this year. No one talked about the five years of past data from across the whole hospital, or the monthly extracts still being shared, which had first been extracted for GDM use without consent.

I was truly taken aback by the sense of entitlement that came across. The decision by the Trust to give away confidential patient records without consent earlier in 2015/16 was either forgotten or ignored, and until the opportunity for questions, the future model was presented unquestioningly: a model for an NHS-wide handheld gateway to your records that this week’s announcement embeds.

What matters on reflection is that the overall reaction to this ‘engagement’ is bigger than the one event, bigger than the concepts of tools they could hypothetically consider designing, or the lack of consent for the data already used.

It’s a massive question of principle, a litmus test for future commercial users of big, even national population-wide public datasets.

Who gets a say in how our public data are used? Will the autonomy of the individual be ignored as standard, assumed unless you opt out, with forgiveness asked afterwards via a hastily tacked-on opt out?

Should patients just expect any hospital can now hand over all our medical histories in a free-for-all to commercial companies and their product development without asking us first?

Public and patient questions

Where data may have been used in the algorithms of the DeepMind black box, there was a black hole in addressing patient consent.

Public engagement with those who are keen to be involved is not a replacement for individual permission from those who don’t want to be, and who expected a duty of patient-clinician confidentiality.

Tellingly, the final part of the event tried to capture our opinions on how to involve the public. Right off the bat, the first question was one of privacy. Most asked questions about issues raised to date, rather than looking to design the future. Ignoring those questions and retrofitting a one-size-fits-all model under the banner of ‘engagement’ won’t work until they address the concerns of the people whose data they have already used, and the breach of trust that now jeopardises people’s future willingness to be involved, not only in this project, but potentially in other research.

This should have been a learning event for Google, a company which is good at learning, and which uses people to do it, by both man and machine.

But from their reaction to the media after this week’s announcement, it seems not all feedback or lessons learned are welcome.

Google DeepMind executives were keen to use patient case studies and had patients themselves do most of the talking, saying how important data is to the treatment of kidney disease and eye care, which I respect greatly. But there was very little apparent link between their experience and Google DeepMind at all, or the products created to date.

Google DeepMind has the data of every patient in the hospital in recent years, not only patients affected by this condition, and not the data of the people who will be supported directly by this app.

Yet Google DeepMind says this is “direct care”, not research. It is hard for it to be direct care when you are no longer under the hospital’s care. Implied consent for the use of sensitive health data must be aligned with the purposes for which it was given. It must be fair and lawful.

If data users don’t get that, or won’t accept it, they should get out of healthcare and our public data right now. Or heed the advice of critical friends and get it right, to be trustworthy in future.

What’s the plan ahead?

Beneath the packaging, this came across as a pitch on why Google DeepMind should get access to paid-for-by-the-taxpayer NHS patient data. They have no clinical background or duty of care. They say they want people to be part of a rigorous process, including a public/patient panel, but it’s a process they clearly want to shape and control, and for a future commercial model. Can a public panel be truly independent, and ethical, if profit plays a role?

Of course it’s rightly exciting for healthcare to see innovation and drives towards better clinical care, but not only the intent but how it gets done matters. This matters because it’s not a one-off.

The anticipation of ‘if only we could access the whole NHS data cohort’ was tangible in the room, and what a gift it would be to commercial companies and product makers. Wrapped in heart-wrenching stories. Stories of real patients, with real lives, who genuinely want improvement for all. Who doesn’t want that? But hanging on the coat tails of Mr Suleyman were a range of commercial companies and third-party organisations asking for the same.

In order to deliver those benefits and avoid the risks, there is a well-established framework of regulation and oversight of UK practitioners, of the use of medical records, and of medical devices and tools: the General Medical Council, the Health and Social Care Information Centre (now called ‘NHS Digital’), the Confidentiality Advisory Group (CAG) and more, all have roles to play.

Google DeepMind and the Trusts have stepped outwith that framework and been playing catch up not only with public involvement, but also with MHRA regulatory approval.

One of the major questions is around the invisibility of data science decisions that intervene directly in people’s life and death.

The ethics of data science in which decisions are automated requires us to “guard against dangerous assumptions that algorithms are near-perfect, or more perfect than human judgement.” (The Opportunities and Ethics of Big Data. [1])

If Google DeepMind now plans to share their API widely, who will proof their tech? Who else gets to develop something similar?

Don’t be evil 2.0

Google DeepMind appropriated ‘do no harm’ as the health event motto, echoing the once favoured Google motto ‘don’t be evil’.

However, they really needed to address the fact that some patients’ fragile trust in their clinicians has already been harmed, before DeepMind has even run an algorithm on the data, simply because patient data was given away without patients’ permission.

A former Royal Free patient spoke to me at the event and said they were shocked to have first read in the papers that their confidential medical records had been given to Google without their knowledge. Another said his mother had been part of the cohort and has concerns. Why weren’t they properly informed? The public engagement work they should, to my mind, be doing is with the individual patients of the London hospital whose data they have already been using without consent: explaining why they obtained those confidential medical records without telling people, and addressing their questions and real concerns. Not at a flash public event.

I often think in the name, they just left off the ‘e’. They are Google. We are the deep mined. That may sound flippant but it’s not the intent. It’s entirely serious. Past patient data was handed over to mine, in order to think about building a potential future tool.

There was a lot of ‘if’, future ambition, sweeping generalisations and ‘high-level sketches’ of what might be one day. You need moonshots to boost discovery, but losing patient trust, even of a few people, cannot be a casualty we casually accept. For the company there is no side effect. For patients, it could last a lifetime.

If you go back to the roots of health care, you could take the since misappropriated Hippocratic Oath and quote not only “do no harm”, as Suleyman did, but the next part: “I will not play God.”

Patriarchal, top-down care.data was a disastrous model of engagement that confused communication with ‘tell the public loudly and often what we want to happen, what we think best, and then disregard public opinion.’ A model that doesn’t work.

The recent public engagement events on the National Data Guardian’s work on consent models certainly appear, from the talks, to be learning those lessons. To get it wrong in commercial use would be disastrous.

The far greater risk from this misadventure is not company reputation, which seems to be top among Google DeepMind’s concerns. The risk that Google DeepMind seems prepared to take is one that is not at its own cost, but at that of public trust in the hospitals and the NHS brand, in public health, and in its research.

Commercial misappropriation of patient data without consent could set back the restoration of public trust and the work towards a better model that has been in progress since the care.data car crash of 2013.

You might be able to abdicate responsibility if you think you’re not the driver. But where does the buck stop for contributory failure?

All this, says Google DeepMind, is nothing new. But Google isn’t other companies, and this is a massive pilot move by a corporate giant into first appropriating and then brokering access to NHS-wide data to make an as-yet opaque private profit. And being paid by the hospital trust to do so. Creating a data-sharing access infrastructure for the Royal Free is product development, and one that had no permission to use five years’ worth of patient records to do so.

The care.data catastrophe may have damaged public trust and data access for public interest research for some time, but in doing so it did commercial interests a massive favour. An assumption of ‘opt out’ rather than ‘opt in’ has become the NHS model. If the boundaries of what is assumed under that are changing, do the public still have no say in whether that is satisfactory? Because it’s not.

This example should highlight why an opt out model of NHS patient data is entirely unsatisfactory and cannot continue for these uses.

Should boundaries be in place?

So should boundaries be in place in the NHS before this spreads? Hell yes. If, as Mustafa said, it’s not just about developing technology but the process, regulatory and governance landscapes, then we should be told why their existing use of patient data for the Streams app development steamrollered through the existing legal and ethical landscapes we have today. Those frameworks exist to protect patients from quacks and skullduggery.

This then becomes about the duty of the controller and rights of the patient. It comes back to what we release, not only how it is used.

Can a panel of highly respected individuals intervene to embed good ethics if plans conflict with the purpose of making money from patients? Where are the boundaries between private and public good? Where they quash consent, where are its limitations and who decides? What boundaries do hospital trusts think they have on the duty of confidentiality?

Responsibility lies with the hospitals, as the data controllers of information received through their clinicians.

What is next for Trusts? Giving an entire hospital patient database to supermarket pharmacies, because they too might make a useful tool? Mashing up your health data with your loyalty card? All under assumed consent, because product development is “direct care”, since it’s clearly not research? Ethically it must be opt in.

App development is not using data for direct care. It is product development. Post-truth packaging won’t fly. Dressing up the donkey by simply calling it by another name won’t transform it into a unicorn, no matter how much you want to believe in it.

“In some sense I recognise that we’re an exceptional company, in other senses I think it’s important to put that in the wider context and focus on the patient benefit that we’re obviously trying to deliver.” [TechCrunch, November 22]

We’ve heard the cry to focus on the benefit before. Right before care.data failed to communicate to 50m people what it was doing with their health records. Why does Google think they’re different? They don’t. They’re just another company normalising this, they say.

The hospitals meanwhile, have been very quiet.

What do patients want?

This was what Google DeepMind wanted to hear in the final 30 minutes of the event, but didn’t get to hear, as all the questions were about what they had done so far, and why.

There is already plenty of evidence of what the public wants on the use of their medical records, from public engagement workshops and surveys around NHS health data use since 2013. Public opinion is pretty clear. Many say companies should not get NHS records for commercial exploitation without consent at all (in the ESRC public dialogues on data in 2013, the Royal Statistical Society’s data trust deficit with lessons for policy makers work with Ipsos MORI in 2014, and the Wellcome Trust one-way mirror work in 2016, as well of course as the NHS England care.data public engagement workshops in 2014).


All those surveys and workshops show the public has consistent concerns about a lack of control over who has access to their NHS data, for what purposes, and over unlimited or future scope; commercial use of their data is a red line for many people.

A red-line which this Royal Free Google DeepMind project appeared to want to wipe out as if it had never been drawn at all.

I am sceptical that Google DeepMind has not done their research into existing public opinion on health data uses and research.

Those studies in public engagement already done by leading health and social science bodies state clearly that commercial use is a red line for some.

So why did they cross it without consent? Tell me why I should trust the hospitals to get this right with this company, and trust them not to get it wrong with others. Because Google’s the good guys?

If this event, and the thinking ‘let’s get patients to front our drive towards getting more data’, sought to legitimise what they and these London hospitals are already getting wrong, I’m not sure that ‘because we’re Google’ – big, bold and famous for creative disruption – is enough. There is a different game afoot. It will be a game-changer for patient rights to privacy if this scale of commercial product exploitation of identifiable NHS data becomes the norm for local bodies to decide at will. No matter how terrific the patient benefit may be, hospitals can’t override patient rights.

If this steamrollers over consent and regulations, what next?

Regulation revolutionised, reframed or overruled

The invited speaker from Patients4Data spoke in favour of commercial exploitation as a benefit for the NHS but as Paul Wicks noted, was ‘perplexed as to why “a doctor is worried about crossing the I’s and dotting the T’s for 12 months (of regulatory approval)”.’

Appropriating public engagement is one thing. Appropriating what is seen as acceptable governance and oversight is another. If a new accepted model of regulation comes from this, we can say goodbye to the old one.  Goodbye to guaranteed patient confidentiality. Goodbye to assuming your health data are not open to commercial use.  Hello to assuming opt out of that use is good enough instead.

Trusted public regulatory and oversight frameworks exist for a reason. But they lag behind the industry and what some are doing. And if big players face no retribution for skipping around them and then being approved in hindsight, there’s not much incentive to follow the rules from the start. As TechCrunch suggested after the event, this is all “pretty standard playbook for tech firms seeking to workaround business barriers created by regulation.”

Should patients just expect any hospital can now hand over all our medical histories in a free-for-all to commercial companies without asking us first? It is for the Information Commissioner to decide whether the purposes of product design were what patients expected their data to be used for when they were treated five years ago.

The state needs to catch up fast. The next private appropriation of regulation – the oversight of AI collaboration – has just begun. Until then, I believe civil society will not be ‘pedalling’ anything, but I hope it will challenge companies cheek by jowl in any race to exploit personal confidential data and universal rights to privacy [2] by redesigning regulation on company terms.

Let’s be clear. It’s not direct care. It’s not research. It’s product development. For a product on which the commercial model is ‘I don’t know’. How many companies enter a five-year plan like that?

Benefit is great. But if you ignore the harm you are doing in real terms to real lives, and you don’t see it only because those harmed haven’t talked to you, ask yourself why that is, not why you don’t believe it matters.

There should be no competition in what is right for patient care and data science and product development. The goals should be the same. Safe uses of personal data in ways the public expect, with no surprises. That means consent comes first in commercial markets.


[1] Olivia Varley-Winter, Hetan Shah, ‘The opportunities and ethics of big data: practical priorities for a national Council of Data Ethics.’ Theme issue ‘The ethical impact of data science’ compiled and edited by Mariarosaria Taddeo and Luciano Floridi. [The Royal Society, Volume 374, issue 2083]

[2] Universal rights to privacy: upcoming data protection legislation (GDPR), already in place and enforceable from May 25, 2018, requires additional attention to fair processing, consent, the right to revoke it, the right to access one’s own data, and the right to seek redress for inaccurate data. “The term “child” is not defined by the GDPR. Controllers should therefore be prepared to address these requirements in notices directed at teenagers and young adults.”

The Rights of the Child: data policy and practice about children’s confidential data will impinge on principles set out in the United Nations Convention on the Rights of the Child – Article 12, the right to express views and be heard in decisions about them, and Article 16, a right to privacy and respect for a child’s family and home life – if these data are used without consent. Similar rights are included in the common law of confidentiality.

Article 8 of the Human Rights Act 1998, incorporating Articles 8.1 and 8.2 of the European Convention on Human Rights, provides that there shall be no interference by a public authority with the right to respect for private and family life that is not necessary and proportionate.

Judgment of the Court of Justice of the European Union in the Bara case (C‑201/14) (October 2015) reiterated the need for public bodies to legally and fairly process personal data before transferring it between themselves. Trusts need to respect this also with contractors.

The EU Charter of Fundamental Rights, Article 8, also protects the rights of individuals over their data and privacy, and Article 52 protects the essence of these freedoms.

care.data listening events and consultation: The same notes again?

If lots of things get said in a programme of events, and nothing is left around to read about it, did they happen?

The care.data programme 2014-15 listening exercise and action plan has become impossible to find online. That’s OK, you might think, the programme has been scrapped. Not quite.

You can give your views online until September 7th on the new consultation, “New data security standards and opt-out models for health and social care”  and/or attend the new listening events, September 26th in London, October 3rd in Southampton and October 10th in Leeds.

The Ministerial statement on July 6, announced that NHS England had taken the decision to close the care.data programme after the review of data security and consent by Dame Fiona Caldicott, the National Data Guardian for Health and Care.

But the same questions are being asked again around consent and the use of your medical data, from primary and secondary care. What a very long questionnaire asks is, in effect: do you want to keep your medical history private? You can answer only Q15 if you want.

Ambiguity again surrounds what constitutes “de-identified” patient information.

What is clear is that public voice seems to have been deleted or lost from the care.data programme along with the feedback and brand.

People spoke up in 2014, and acted. The opt out that 1 in 45 people chose between January and March 2014 was put into effect by the HSCIC in April 2016. Now it seems, that might be revoked.

We’ve been here before.  There is no way that primary care data can be extracted without consent without it causing further disruption and damage to public trust and public interest research.  The future plans for linkage between all primary care data and secondary data and genomics for secondary uses, is untenable without consent.

Upcoming events cost time and money and will almost certainly go over the same ground that hours and hours were spent on in 2014. However if they do achieve a meaningful response rate, then I hope the results will not be lost and will be combined with those already captured under the ‘care.data listening events’ responses.  Will they have any impact on what consent model there may be in future?

So what we gonna do? I don’t know, whatcha wanna do? Let’s do something.

Let’s have accredited access and security fixed. While there may now be more transparency and process around release, there are still problems about who gets data and what they do with it.

Let’s have clear future scope and control. There is still no plan to give the public rights to control or delete data if we change our minds about who can have it or for what purposes. And that is very uncertain. After all, they might decide to privatise or outsource the whole thing, as was planned for the CSUs.

Let’s have answers to everything already asked but unknown. The questions in the previous Caldicott review have still to be answered.

We have the possibility of seeing health data used wisely, safely, and with public trust. But we seem stuck with the same notes again. And the public seem to be the last to be invited to participate, and views, once gathered, seem to be disregarded. I hope to be proved wrong.

Might, perhaps, the consultation deliver the nuanced consent model discussed at public listening exercises that many asked for?

Will the care.data listening events feedback summary be found, and will its 2014 conclusions and the enacted opt out be ignored? Will the new listening events’ views make more difference than in 2014?

Is public engagement, engagement, if nobody hears what was said?

Datasharing, lawmaking and ethics: power, practice and public policy

“Lawmaking is the Wire, not Schoolhouse Rock. It’s about blood and war and power, not evidence and argument and policy.”

"We can't trust the regulators," they say. "We need to be able to investigate the data for ourselves." Technology seems to provide the perfect solution. Just put it all online - people can go through the data while trusting no one.  There's just one problem. If you can't trust the regulators, what makes you think you can trust the data?" 

Extracts from The Boy Who Could Change the World: The Writings of Aaron Swartz. Chapter: ‘When is Technology Useful? ‘ June 2009.

The question keeps getting asked, is the concept of ethics obsolete in Big Data?

I’ve come to some conclusions why ‘Big Data’ use keeps pushing the boundaries of what many people find acceptable, and yet the people doing the research, the regulators and lawmakers often express surprise at negative reactions. Some even express disdain for public opinion, dismissing it as ignorant, not ‘understanding the benefits’, yet to be convinced. I’ve decided why I think what is considered ‘ethical’ in data science does not meet public expectation.

It’s not about people.

Researchers using large datasets often have a foundation in data science, applied computing or maths, and don’t see data as people. It’s only data. Creating patterns, correlations, and analysis of individual-level data is not seen as research involving human subjects.

This is embodied in the nth research ethics review I have read in the last year in which the question is asked: does the research involve people? The answer given is invariably ‘no’.

And these data analysts using, let’s say, health data are not working in a subject founded on any ethical principle, contrasting with the medical world the data come from.

The public feels differently about information that is about them, which may be known only to them or to select professionals. The values that we as the public attach to our data, and our expectations of its handling, may reflect the expectations we have of how we, the people connected to it, are handled. We see our data as all about us.

The values therefore put on data, and on how it can and should be used, can be at odds with one another; the public’s perception is not reciprocated by the researchers. This may be especially true if researchers are using data which has been de-identified, although it may not be anonymous.

New legislation on the horizon, the Better Use of Data in Government, intends to fill the [loop]hole between what was legal to share in the past and what some want to exploit today, and emphasises a gap between the uses of data by public-interest academic researchers and uses by government actors. The first, by and large, incorporates privacy and anonymisation techniques by design; the second is designed for applied use of identifiable data.

Government departments and public bodies want to identify and track people who are somehow misaligned with the values of the system; either through fraud, debt, Troubled Families, or owing Student Loans. All highly sensitive subjects. But their ethical data science framework will not treat them as individuals, but only as data subjects. Or as groups who share certain characteristics.

The system again intrinsically fails to see these uses of data as being about individuals, but sees them as categories of people – “fraud” “debt” “Troubled families.” It is designed to profile people.

Services that weren’t built for people, but for government processes, result in datasets used in research that aren’t well designed for research. So we now see attempts by modern data science practitioners to shoehorn historical practices into data use, with policy that is shortsighted.

We can’t afford for these things to be so off axis, if civil service thinking is exploring “potential game-changers such as virtual reality for citizens in the autism spectrum, biometrics to reduce fraud, and data science and machine-learning to automate decisions.”

In an organisation such as DWP this must be really well designed since “the scale at which we operate is unprecedented: with 800 locations and 85,000  colleagues, we’re larger than most retail operations.”

The power to affect individual lives through poor technology is vast, and some impacts seem to be being badly ignored. The ‘real time earnings’ database that improved the accuracy of benefit payments was widely agreed to have been harmful to some individuals through the Universal Credit scheme, with delayed payments meaning families at food banks, and contributing to worse.

“We believe execution is the major job of every business leader” – perhaps not the best wording on DWP data uses.

What accountability will be built in, by design?

I’ve been thinking recently about drawing a social ecological model of personal data empowerment or control. Thinking about visualisation of wants, gaps and consent models, to show rather than tell policy makers where these gaps exist in public perception and expectations, policy and practice. If anyone knows of one on data, please shout. I think it might be helpful.

But the data *is* all about people

Regardless whether they are in front of you or numbers on a screen, big or small datasets using data about real lives are data about people. And that triggers a need to treat the data with an ethical approach as you would people involved face-to-face.

Researchers need to stop treating data about people as meaningless data because that’s not how people think about their own data being used. Not only that, but if the whole point of your big data research is to have impact, your data outcomes, will change lives.

Tosh, I know some say. But, I have argued, the reason is that the applications of the data science / research / policy findings / impact-of-immigration-in-education review / [insert purposes of the data user’s choosing] are designed to have impact on people. Often on the people about whom the research is done without their knowledge or consent. And while most people say that is OK where it’s public interest research, the possibilities are outstripping what the public has expressed as acceptable, and few seem to care.

Evidence from public engagement and ethics all says hidden pigeon-holing – profiling – is unacceptable. Data Protection law has special requirements for it, on autonomous decisions. ‘Profiling’ is now clearly defined under Article 4 of the GDPR as “any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”
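To make that definition concrete, here is a minimal, purely hypothetical sketch in Python of what such ‘automated processing to evaluate certain personal aspects’ can look like in practice. Every name, field and threshold below is invented for illustration; it is not any department’s actual system.

```python
# Hypothetical sketch of GDPR Article 4 'profiling': automated processing
# of personal data to evaluate aspects of a natural person. All names,
# fields and thresholds here are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Person:
    missed_payments: int
    benefit_claims: int

def risk_category(p: Person) -> str:
    """Evaluate a person automatically, with no human in the loop."""
    score = 2 * p.missed_payments + p.benefit_claims
    return "high risk" if score > 4 else "low risk"

# The individual is reduced to a category ('fraud', 'debt') by an algorithm;
# that automated evaluation is precisely what the Regulation calls profiling.
print(risk_category(Person(missed_payments=2, benefit_claims=3)))  # high risk
```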

Using big datasets for research that ‘isn’t interested in individuals’ may still intend to create results profiling groups for applied policing, or to discriminate, or to make knowledge available by location. The data may have been de-identified, but in application it becomes no longer anonymous.

Big Data research that results in profiling groups may be intended, by the very point of the research, for applied health policy impacts for good – with the intent of improving a particular ethnic minority’s access to services, for example.

Then look at the voting process changes in North Carolina and see how that same data, the same research knowledge might be applied to exclude, to restrict rights, and to disempower.

Is it possible to have ethical oversight that can protect good data use and protect people’s rights if they conflict with the policy purposes?

The “clear legal basis” is not enough for public trust

Data use can be legal and still be unethical, harmful and shortsighted in many ways, for both the impacts on research – in terms of people withholding data, falsifying data and avoiding the system to avoid giving data in – and the lives it will touch.

What education has to learn from health is whether it will permit uses by ‘others’ outside education to jeopardise the collection of school data intended in the best interests of children, not the system. In England it must start to analyse what is needed versus what is wanted. What is necessary and proportionate, and what justifies maintaining named data indefinitely, exposed to changing scope.

In health, the most recent Caldicott review suggests scope change by design – that is a red line for many: “For that reason the Review recommends that, in due course, the opt-out should not apply to all flows of information into the HSCIC. This requires careful consideration with the primary care community.”

The community spoke out already, and strongly, in spring and summer 2014: there must be an absolute right to confidentiality to protect patients’ trust in the system. Scope that ‘sounds’ like it might sneakily change in future will be a death knell to public interest research, because repeated trust erosion will be fatal.

Laws change to allow scope change without informing people whose data are being used for different purposes

Regulators must be seen to be trusted, if the data they regulate is to be trustworthy. Laws and regulators that plan scope for the future watering down of public protection, water down public trust from today. Unethical policy and practice, will not be saved by pseudo-data-science ethics.

Will those decisions in private political rooms be worth the public cost to research, to policy, and to the lives it will ultimately affect?

What happens when the ethical black holes in policy, lawmaking and practice collide?

At the last UK HealthCamp towards the end of the day, when we discussed the hard things, the topic inevitably moved swiftly to consent, to building big databases, public perception, and why anyone would think there is potential for abuse, when clearly the intended use is good.

The answer came back from one of the participants, “OK now it’s the time to say. Because, Nazis.” Meaning, let’s learn from history.

Given the state of UK politics, Go Home van policies, restaurant raids, the possibility of Trump getting access to UK sensitive data of all sorts from across the Atlantic, given recent policy effects on the rights of the disabled and others, I wonder if we would hear the gentle laughter in the room in answer to the same question today.

With what is reported as Whitehall’s sharp change in digital leadership today, the future of digital in government services, policy and lawmaking does indeed seem to be more “about blood and war and power” than “evidence and argument and policy”.

The concept of ethics in datasharing using public data in the UK is far from becoming obsolete. It has yet to begin.

We have ethical black holes in big data research, in big data policy, and in big data practices in England. The conflicts are there: between public interest research and government uses of population-wide datasets, between how the public perceives the use of our data and how the data are used, and in the gaps and tensions between policy and practice.

We are simply waiting for the Big Bang. Whether it will be creative, or destructive we are yet to feel.

*****

image credit: LIGO – graphical visualisation of black holes on the discovery of gravitational waves

References:

Report: Caldicott review – National Data Guardian for Health and Care Review of Data Security, Consent and Opt-Outs 2016

Report: The One-Way Mirror: Public attitudes to commercial access to health data

Royal Statistical Society Survey carried out by Ipsos MORI: The Data Trust Deficit

Ethics, standards and digital rights – time for a citizens’ charter

Central to future data sharing [1] plans is the principle of public interest, intended to be underpinned by transparency in all parts of the process, to be supported by an informed public.  Three principles that are also key in the plan for open policy.

The draft ethics proposals [2] start with user need (i.e. what government wants, researchers want, the users of the data) and public benefit.

With these principles in mind, I wonder how compatible the plans are in practice – plans that will remove some of the decision-making about information sharing from the citizen; that is, from you and me.

When talking about data sharing it is all too easy to forget we are talking about people – in this case, 62 million individual people’s personal information – especially when users of data focus on how data are released or published. The public thinks of personal data as information related to them. And the ICO says privacy and an individual’s rights are engaged at the point of collection.

The trusted handling, use and re-use of population-wide personal data, and ID assurance, are vital to innovation and digital strategy. So in order to make these data uses secure and trusted, fit for the 21st century, when will the bad bits of current government datasharing policy and practice [3] be replaced by the good parts of the ethical plans?

Current practice and Future Proofing Plans

How is policy being future-proofed at a time when changes to regulation in the new EU data protection framework are being made in parallel? Changes that clarify consent and the individual, requiring clear affirmative action by the data subject. [4] How do public bodies and departments plan to meet the current moral and legal obligation to ensure that persons whose personal data are subject to transfer and processing between two public administrative bodies are informed in advance?

How is public perception [5] being taken into account?

And how are digital identities to be protected when they are literally our passport to the world, and their integrity is vital to maintain, especially for our children in a world of big data [6] we cannot imagine today? How do we verify identity without having to reveal the data behind it, if those data are to be used in ever more government transactions? Done badly, that could mean the citizen loses sight of who knows what information and who it has been re-shared with.
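For what ‘verify without revealing’ could mean in principle, here is a deliberately toy sketch, assuming a trusted issuer that attests to a derived attribute (say, ‘over 18’) so a relying service never sees the birth date itself. The scheme, names and keys are all invented; real identity assurance uses proper standards such as signed, verifiable credentials, not this illustrative HMAC.

```python
# Toy sketch only: an issuer attests to a derived attribute so a relying
# service can check the attestation without ever seeing the underlying data.
# Keys and names are invented; real systems use verifiable credentials.
import hashlib
import hmac

ISSUER_KEY = b"issuer-secret"  # held by a hypothetical identity service

def issue_claim(attribute: str) -> tuple[str, str]:
    """Issuer signs a derived attribute, not the raw record behind it."""
    tag = hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).hexdigest()
    return attribute, tag

def verify_claim(attribute: str, tag: str) -> bool:
    """Verifier checks the attestation; the birth date is never disclosed."""
    expected = hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

claim = issue_claim("age_over_18=true")  # derived from a birth date that stays private
print(verify_claim(*claim))              # True
```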

From the 6th January workshop there are lots of open questions, and no formal policy document or draft legislation to review. It appears to be far from ready for public consultation, needing concrete input on practical aspects of what the change would mean in practice.

Changing the approach to the collection of citizens’ personal data, and removing the need for consent to wide re-use and onward sharing, will open up a massive change to the data infrastructure of the country, in terms of who is involved in administrative roles in the process and when. As a country, to date we have not treated data as part of our infrastructure. Some suggest we should. To change the construction of roads would require impact planning, mapping and a thought-out budget before beginning the project, to assess its impact. Such an assessment appears to be missing entirely from this data infrastructure change.

I’ve considered the plans in terms of case studies of policy and practice, transparency and trust, the issues of data quality and completeness and digital inclusion.

But I’m starting by sharing only my thoughts on ethics.

Ethics, standards and digital rights – time for a public charter

How do you want your own, or your children’s personal data handled?

This is not theoretical. Every one of us in the UK has our own confidential data used in a number of ways about which we are not aware today. Are you OK with that? With academic researchers? With GCHQ? [7] What about charities? Or Fleet Street press? All of these bodies have personal data from population wide datasets and that means all of us or all of our children, whether or not we are the subjects of research, subject to investigation, or just an ordinary citizen minding their own business.

On balance, where do you draw the line between your own individual rights and public good? What is fair use without consent and where would you be surprised and want to be informed?
I would like to hear more about how others feel about, and weigh, the risk-benefit trade-off in this area.

Some organisations working on debt have concerns about digital exclusion. Others worry about the compiling of single-view data in coercive relationships. Some organisations are campaigning for a digital bill of rights. I had some thoughts on this specifically for health data in the past.

A charter of digital standards and ethics could be enabling, not a barrier, and should be a tool that comes to consultation before new legislation.

Discussing datasharing that will open up every public data set “across every public body” without first having defined a clear policy is a challenge. Without defining its ethical good practice first as a reference framework, it’s dancing in the dark. This draft plan is running in parallel to, but is not part of, the datasharing discussion.
Ethical practice and principles must be the foundation of data sharing plans, not an afterthought.

Why? Because this stuff is hard. The kinds of research that use sensitive de-identified data are sometimes controversial and will become more challenging as the capabilities of what is possible increase with machine learning, genomics, and the increased personalisation and targeting of marketing and interventions.

The ADRN had spent months on its ethical framework and privacy impact assessment, before I joined the panel.

What does Ethics look like in sharing bulk datasets?

What do you think about the commercialisation of genomic data by the state – often from children whose parents are desperate for a diagnosis – to ‘kick start’ the UK genomics industry?  What do you think about data used in research on domestic violence and child protection? And in predictive policing?

Or research on religious affiliations and home schooling? Or abortion and births in teens matching school records to health data?

Will the results of the research encourage policy change or interventions with any group of people? Could these types of research have unintended consequences, or be used in ways researchers did not foresee, supporting not social benefit but a particular political or scientific objective? If so, how is that governed?

What research is done today, what is good practice, what is cautious and what would Joe Public expect? On domestic violence for example, public feedback said no.

And while there’s also a risk of not making the best use of data, there are also risks in releasing even anonymised data [8] in today’s world, in which jigsawing together the pieces of poorly anonymised data makes it identifying. Profiling or pigeonholing individuals or areas was a concern raised in public engagement work.
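As a concrete illustration of that jigsaw risk, here is a minimal sketch, with entirely invented records, of how an ‘anonymised’ health extract can be re-identified by joining its quasi-identifiers (postcode, birth date, sex) against a public, named dataset (assuming pandas is available).

```python
# Minimal 'jigsaw' linkage sketch: all records below are fictional.
import pandas as pd

health = pd.DataFrame({  # names stripped, so nominally 'anonymised'
    "postcode": ["AB1 2CD", "EF3 4GH"],
    "birth_date": ["1971-03-02", "1984-11-19"],
    "sex": ["F", "M"],
    "diagnosis": ["depression", "diabetes"],
})

register = pd.DataFrame({  # a public, identified dataset, e.g. an open register
    "name": ["Jane Doe", "John Smith"],
    "postcode": ["AB1 2CD", "EF3 4GH"],
    "birth_date": ["1971-03-02", "1984-11-19"],
    "sex": ["F", "M"],
})

# Joining on the quasi-identifiers re-attaches names to diagnoses.
reidentified = health.merge(register, on=["postcode", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```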

The Bean Report was used to draw out some of the reasoning behind the need for increased access to data: “Remove obstacles to the greater use of public sector administrative data for statistical purposes, including through changes to the associated legal framework, while ensuring appropriate ethical safeguards are in place and privacy is protected.”

The Report doesn’t outline how the appropriate ethical safeguards are put in place and privacy is protected. Or what ‘ethical’ looks like.

The public interest is not clear cut.

The boundary between public and private interest shifts with time as well as culture. While the law in the UK today says we all have the right to be treated as equals, regardless of our gender, identity or sexuality, it has not always been so.

By putting the rights of the individual below the public interest in this change, we risk jeopardising having any data at all to use. Yet data will be central to the digital future strategy through which we are told the government wants to “show the rest of the world how it’s done.”

If they’re serious – if all our future citizens must have a digital identity to use with government with any integrity – then the use of not only our current adult data, but our children’s data, really matters, and current practices must change. Here’s a case study why:

Pupil data: The Poster Child of Datasharing Bad Practice

Right now, the National Pupil Database, containing our 8 million or more children’s personal data in England, is unfortunately the poster child of what a change in legislation and policy around data sharing can mean in practice. Bad practice.

The “identity of a pupil will not be discovered using anonymised data in isolation”, says the User Guide. But when they give away named data – identifiable data in all but 11 requests since 2012 – it’s not anonymised. It is anything but the ‘anonymised data’ of the publicly announced plans presented in 2011, yet it is precisely what was permitted by the change in law to broaden the range of users in the Prescribed Persons Act 2009, and by the expansion of purposes in the amended Education (Individual Pupil Information) (Prescribed Persons) Regulations introduced in June 2013. It was opened up to:

“(d)persons who, for the purpose of promoting the education or well-being of children in England are—

(i)conducting research or analysis,

(ii)producing statistics, or

(iii)providing information, advice or guidance,

and who require individual pupil information for that purpose(5);”.

The law was changed so that individual pupil-level data and pupil names are extracted, stored, and have also been released at national level: raw data sent to commercial third parties, charities and the press, as identifiable, individual-level and often sensitive data items.

This is a world away from safe-setting, statistical analysis of de-identified data by accredited researchers, in the public interest.

Now our children’s confidential data sit on servers on Fleet Street – is this the model for all our personal administrative data in future?

If not, how do we ensure it is not? How will the new all-datasets’ datasharing legislation permit wider sharing with more people than currently have access and not end up with all our identifiable data sent ‘into the wild’ without audit as our pupil data are today?

Consultation, transparency, oversight and public involvement in ongoing data decision-making are key, as is well-written legislation.

‘The public interest’ alone is not a strong enough description to keep data safe. This same government brought in the National Pupil Database policy thinking it too was ‘in the public interest’, after all.

We need a charter of ethics and digital rights that focuses on the person, not exclusively the public interest use of data.

They are not mutually exclusive, but enhance one another.

Getting ethics in the right place

These ethical principles start in the wrong place. To me, this is not an ethical framework; it’s a ‘how-to-do-data-sharing’ guideline and an attempt to avoid repeating care.data. Ethics is not first about the public interest, or economic good, or government interest. Instead, referencing an ethics council view, you start with the person.

“The terms of any data initiative must take into account both private and public interests. Enabling those with relevant interests to have a say in how their data are used and telling them how they are, in fact, used is a way in which data initiatives can demonstrate respect for persons.”

Professor Michael Parker, Member of the Nuffield Council on Bioethics Working Party and Professor of Bioethics and Director of the Ethox Centre, University of Oxford:

“Compliance with the law is not enough to guarantee that a particular use of data is morally acceptable – clearly not everything that can be done should be done. Whilst there can be no one-size-fits-all solution, people should have say in how their data are used, by whom and for what purposes, so that the terms of any project respect the preferences and expectations of all involved.”

The partnership between members of the public and public administration must be consensual to continue to enjoy support [10]. If personal data are used for research or other uses in the public interest without explicit consent, that should be understood as a privilege by those using the data, not a right.

As such, we need to see data as being about the person, as they see it themselves, and to see data at the point of collection as information about individual people, not just statistics. Personal data are sensitive, some research uses are highly sensitive, and data used badly can do harm. Designing new patterns of datasharing must consider the private as well as the public interest, co-operating for the public good.

And we need a strong ethical framework to shape that in.

******

[1] http://datasharing.org.uk/2016/01/13/data-sharing-workshop-i-6-january-2016-meeting-note/

[2] Draft data science ethical framework: https://data.blog.gov.uk/wp-content/uploads/sites/164/2015/12/Data-science-ethics-short-for-blog-1.pdf

[3] defenddigitalme campaign to get pupil data in England made safe http://defenddigitalme.com/

[4] On the European Data Protection regulations: https://www.privacyandsecuritymatters.com/2015/12/the-general-data-protection-regulation-in-bullet-points/

[5] Public engagement work – ADRN/ESRC/Ipsos MORI 2014: https://adrn.ac.uk/media/1245/sri-dialogue-on-data-2014.pdf

[6] Written evidence submitted to the parliamentary committee on big data: http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/big-data-dilemma/written/25380.pdf

[7] http://www.bbc.co.uk/news/uk-politics-35300671 Theresa May affirmed bulk datasets use at the IP Bill committee hearing and did not deny use of bulk personal datasets, including medical records

[8] http://www.economist.com/news/science-and-technology/21660966-can-big-databases-be-kept-both-anonymous-and-useful-well-see-you-anon

[9] Nuffield Council on Bioethics http://nuffieldbioethics.org/report/collection-linking-use-data-biomedical-research-health-care/ethical-governance-of-data-initiatives/

[10] Royal Statistical Society –  the data trust deficit https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

Background: Why datasharing matters to me:

When, only very recently, I joined the data sharing discussions that have been running for almost two years, it was wearing two hats, both in a personal capacity.

The first was with an interest in how public policy and legislation may be changing and will affect de-identified datasharing for academic research, as I am one of two lay people offering public voice on the ADRN approvals panel.

Its aim is to make sure the process of granting access to sensitive, linked administrative data from population-wide datasets is fair, equitable and transparent, for de-identified use by trusted researchers, for non-commercial use, under strict controls and in safe settings. Once a research project is complete, the data are securely destroyed. It’s not doing work that “a government department or agency would carry out as part of its normal operations.”

Wearing my second hat, I am interested to see how new policy and practice will affect current practice. I coordinate the campaign efforts to stop the Department for Education giving away the identifiable, confidential and sensitive personal data of our 8m children in England to commercial third parties and the press from the National Pupil Database.

Thoughts since #UKHC15. UK health datasharing.

The world you will release your technology into, is the world you are familiar with, which is already of the past. Based on old data.

How can you design tools and systems fit for the future? And for all?

For my 100th post, and the first of 2016, here is a summary of some of my thoughts prompted by #UKHC15. Several grains of thought related to UK health data have been growing for some time.

1000 words on “Hard things: identity, data sharing and consent.” The fun run version.

Do we confuse hard with complex? Hard does not have to mean difficult. Some things seem to be harder than necessary, because of politics. I’ve found this hard to write. Where to start?

The search to capture solutions has been elusive.

The starting line: Identity

Then my first thoughts on identity got taken care of by Vinay Gupta in this post, better than I could. (If you want a long read about identity, you might want to get a hot drink like I did and read and re-read. It says it’ll take an hour. It took me several, in absorption and thinking time. And worth it.)

That leaves data sharing and consent. Both of which I have written many of my other 99 posts about in the last year. So what’s new?

Why are we doing this: why aren’t we there yet?

It still feels very much that in many parts of the health service, and in broader government thinking on ‘digital’, the attitude is ‘we need to do something’. The why is missing, and therefore achieving and measuring success is hard.

Often we start with a good idea and set about finding a solution for how to achieve it. But if the ‘why’ behind the idea is shaky to start with, the solution may falter as soon as it gets difficult. No one seems to know what #paperless actually means in practice.

So why try and change things? Fixing problems, rather than coming up with good ideas, is another way to think of it, as they suggested at #ukhc15: it was a meet-up for people who want to make things better, usually for others, and sometimes that involves improving the systems they work with directly, or support others in.

I no longer work in systems introductions or enhancement processes, although I have a lay role in research and admin data. But regular readers know most of the last two years has been all about the data: care.data.

More often than not, in #ukhc2015 discussions that focused on “the data” I would try and bring people back to thinking about what the change is trying to solve, what it wants to “make better” and why.

There’s a broad tendency to simply think more data = better. Not true, and I’ll show later a case study why. We must question why.

Why doesn’t everyone volunteer, or want to join in?

Very many people who have spoken with me over the last two years have shared their concrete concerns over the plans to share GP data and they do not get heard. They did not see a need to share their identifiable personal confidential data, or see why truly anonymous data would not be sufficient for health planning, for example.

Homeless men, women at risk, people from the travelling community, those with disabilities, patients with stigmatising conditions, minorities, children, people concerned about sexual orientation – not to mention the lawyers and agencies representing them. Or the 11 million of our adult population not online. Few of whom we spoke about. Few of whom we heard from at #ukhc15. Yet put together, these individuals not only make up a significant number of people, but account for a disproportionately high share of the highest demands on our health and social care services.

The inverse care law appears magnified in its potential when applied to digital, and should magnify the importance of thinking about access. How will care.data make things better for them, and how will the risks be mitigated? And are those costs being properly assessed if there is no assessment of the current care.data business case and seemingly, since 2012 at least, no serious effort to look at alternatives?

The finish line? We can’t see what it looks like yet.

The #ukhc2015 event was well run, and I liked the spontaneity of people braver than me who were keen to lead sessions and did it well. As someone who is white, living in a ‘nice’ area, I am privileged. It was a privilege to spend a day at #UKHC15, packed with people who clearly think about hard things all the time. People who want to make things better. People who were welcoming to nervous first-timers at an ‘un’conference over a shared lunch.

I hope the voices of those who can’t attend these events, and outside London, are equally accounted for in all government 2016 datasharing plans.

This may be the last chance after years of similar consultations have failed to deliver workable, consensual public data sharing policies.

We have vast streams of population-wide data stored in the UK, about which, the population is largely ignorant. But while the data may be from 25 years ago, whatever is designed today is going to need to think long term, not how do we solve what we know, but how do we design solutions that will work for what we don’t.

Transparency here will be paramount to trust, if future decisions are made for us, or those we make for ourselves are ‘influenced’, by algorithms, machine learning and ‘mindspace’ work.

As Thurgood Marshall said,

“Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds.”

Control over who we are, and who the system thinks we are, becomes a whole new level of discussion if we are being told how to make a decision, especially where the decision is toward a direction of public policy based on political choice. If pensions are not being properly funded, to not allocate taxes differently and fund them is a choice the current government has made, while the DWP seeks to influence our decisions, to make us save more in private pensions.

And how about, in data discussions, making an effort to start talking a little more clearly in the same terms – and stop packaging ‘sharing’ as if it were something voluntary in population-wide compulsory policy.

It’s done to us, not with us, in far too many areas of government we do not see. Perhaps this consultation might change that, but it’s the ‘nth’ consultation of its kind, and I want to be convinced this one is intent on real change. It’s only open for a few weeks, and this meet-up for discussion appeared to be something only organised in London.

I hope we’ll hear commitment to real change in support of people and the uses of our personal data by the state in the new #UkDigiStrategy, not simply more blue-sky thinking and drinking the ‘datasharing’ kool-aid. We’ve been talking in the UK for far too long about getting this right.

Let’s see the government serious about making it happen. Not for government, but in the public interest, in a respectful and ethical partnership with people, and not find changes forced upon us.

No other foundation will be fit for a future in which care.data, the phenotype data, is to be the basis for an NHS so totally personalised.

If you want a longer read, read on below for my ten things in detail.

Comment welcome.

########

Hard things: The marathon version, below.

Act now: Stand up and speak out for your right to find out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public, whether you have ever used your rights under the Freedom of Information Act or never have, the government consultation on changing them that closes today is worth caring about. If you haven’t yet had your say, go and take action now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Cost will constrain campaign groups from standing up for human rights. The press will be restrained in what they can ask.

MySociety has a brilliant summary.  Michael Sheen spoke up calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to start to improve their handling of the 8 million children’s personal and sensitive data they hold in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight, and to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for change on behalf of the 8m children whose records they hold, and of parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld it.

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions, using Section 36, were necessary when the minutes were finally released.

In October 2014 I simply wanted to see the meeting minutes form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed and to fully understand the intention and programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others could use this information, I hoped, to ask the right questions about missing meeting minutes and transparency, and everyone could question why there was no cost-benefit business plan at all, even in private, while the public kept being told of the benefits. And it shows that data collection is further set to expand, without public debate.

Since then the programme has been postponed again and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes and what appeared to the public as silly spending on sundries were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency to the public, which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps people make better decisions, makes spending more transparent, and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

Where it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent its disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.
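To make the shape of those grounds concrete, here is a toy decision-flow sketch in code. It is my own simplification of the list above, with invented field names, and certainly not legal advice:

    from dataclasses import dataclass

    @dataclass
    class FOIRequest:
        already_accessible: bool             # available by other means, e.g. the internet
        personal_information: bool
        in_confidence: bool                  # provided in confidence (legally enforceable)
        legal_bar: bool                      # a legal reason not to disclose
        qualified_ground: bool               # national security, policy formulation, etc.
        disclosure_in_public_interest: bool  # the test applied to the qualified grounds
        over_cost_limit: bool                # more than two-and-a-half days to provide
        vexatious: bool

    def may_withhold(r: FOIRequest) -> bool:
        # Grounds that apply without a public interest test
        if r.already_accessible or r.personal_information or r.in_confidence or r.legal_bar:
            return True
        # Qualified grounds must be weighed against the public interest in disclosure
        if r.qualified_ground and not r.disclosure_in_public_interest:
            return True
        # Practical grounds: the cost limit, or a vexatious request
        return r.over_cost_limit or r.vexatious

Even in this simplified form the point is visible: the Act already offers plenty of routes to refuse a request.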

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about costs in answering requests but have perhaps not considered the savings in exceptional cases (like the Expenses Scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”
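That percentage is easy to verify; a two-line check of my own on the reported figures:

    foi_cost = 38_000             # £ spent answering FOI requests in the year
    total_budget = 1_300_000_000  # £1.3bn yearly budget
    print(f"{foi_cost / total_budget:.4%}")  # prints 0.0029%, the same order as the 0.002% quoted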

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law that gives the public rights in our favour, and transparency into how we are governed and how tax money is spent.

And how will the value of what would be lost be measured, if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about TalkTalk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about TalkTalk and Big Data, like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company’s, after the event.

Everyone’s been talking about TalkTalk, and for all the wrong reasons: data loss and a 15-year-old, combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work in the “National Cyber Security Programme” (NCSP). [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask if government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and by reportedly not making preventative changes that were apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government expects of commercial companies as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in handling the public’s personal data. Will members of government act in a responsible manner, or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where necessary, for purposes beyond those we expect or have had explained to us when we submit our data; and there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s  practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own lack of data understanding, and what good practice looks like in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO, now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices, and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, for example as care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers‘ in The Shakespeare Review, have been ignored by government.

Where our personal data are not used well in government, the departments themselves seem content, to date, to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because you pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data use practices, the care.data debacle is evidence that not all MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, with others getting access to it via government for purposes that were never intended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with more further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, sensitive, individual level data from at least 8m children’s records, covering ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand emphasis on prevention measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering, surveillance in any of its many forms, is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, waiting to see whether our data are used safely in unsecured settings by the unknown partners with whom data might be onwardly shared, hoping we won’t find out, so that it won’t need to talk about it or have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk: https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of some individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of the ‘intelligent grown up debate’ requested by HSCIC’s Kingsley Manning, in a speech in which he admitted that lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data ‘below the radar’.

Change: We need change. The old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

care.data: delayed or not delayed? The train wreck that is always on time

If you cancel a train does it still show up in the delayed trains statistics?

care.data plans are not delayed (just don’t ask Healthwatch)

Somerset CCG’s announcement [1] of the delay in their care.data plans came as no surprise, except perhaps to NHS England who effectively denied it, reportedly saying work continues. [2] Both public statements may be true but it would have been good professional practice to publicly recognise that a top down delay affects others who are working hard on the ground to contribute to the effective rollout of the project. Causing confusion and delay is hard to work with. Change and technology projects run on timelines. Deadlines mean that different teams can each do their part and the whole gets done. Or not.

Healthwatch [3] has cancelled their planned public meetings. Given that one of the reasons stated in the care.data CCG selection process was support from local patient groups including Healthwatch, this appears to be poor public relations. It almost wouldn’t matter, but in addition to the practicalities, the organisation and leadership are trying to prove they are trustworthy. [4]


[Image: Healthwatch notice cancelling its planned care.data public meetings]


Somerset’s statement is straightforward and says it applies to all pathfinders:

“Following a speech by Jeremy Hunt, the Secretary of State for Health this week (3-9-15), in which he outlined his vision for the future use of technology across NHS, NHS England has asked the four care.data pathfinder pilots areas in England (Leeds, Blackburn and Derwent, West Hampshire and Somerset) to temporarily pause their activities.” [Sept 4, Somerset statement]


[Image: Somerset CCG statement announcing the pause]


From when I first read of the GPES IAG concerns [5], I have watched the care.data programme hurtle from one crisis to another. But this is now a train wreck. A very quiet train wreck. No one has cried out much. [6] And yet I think the project, professionals, and the public should be shouting from the top of the carriages that this programme needs help if it is ever to reach its destination.

care.data plans are not late against its business plan (there is none)

Where’s the business case? Why can’t it define deadlines that it can achieve?  In February 2015, I suggested the mentality that allows these unaccountable monster programmes to grow unchecked must die out.

I can’t even buy an Oyster card if I don’t know if there is money in my pocket. How can a programme which has already spent multi millions of pounds keep driving on without a budget? There is no transparency of what financial and non-financial benefits are to be expected to justify the cost. There is no accountable public measure of success checking it stays on track.

While it may be more comfortable for the organisation to deny problems, I do not believe it serves the public interest to hide information. This is supported by the very reason for being of the MPA process and its ‘challenge to Whitehall secrecy‘ [7], which rated the care.data rollout red [8] in last year’s audit. This requires scrutiny to find solutions.

care.data plans do not need to use lessons learned (do they?)

I hope at least there are lessons learned here in the pathfinder on what not to do before the communications rollout to 60m people.  In the words of Richard Feynman, “For successful technology, reality must take precedence over public relations.”

NHS England is using the public interest test to withhold information: “the particular public interest in preserving confidential communications between NHS England and its sponsoring department [the DH].”  I do not believe this serves the public interest if it is used to hide issues and critical external opinion. The argument made is that there is “stronger public interest in maintaining the exemption where it allows the effective development of policy and operational matters on an ongoing basis.”  The Public Accounts Committee in 2013 called for early transparency and intervention which prevents the ongoing waste of “billions of pounds of taxpayers’ money” in their report into the NPfIT. [9] It showed that a lack of transparency and oversight contributed to public harm, not benefit, in that project, under the watch of the Department of Health. The report said:

“Parliament needs to be kept informed not only of what additional costs are being incurred, but also of exactly what has been delivered so far for the billions of pounds spent on the National Programme. The benefits flowing from the National Programme to date are extremely disappointing. The Department estimates £3.7 billion of benefits to March 2012, just half of the costs incurred. This saga [NPfIT] is one of the worst and most expensive contracting fiascos in the history of the public sector.”

And the Public Accounts Committee made a recommendation in 2013:

“If the Department is to deliver a paperless NHS, it needs to draw on the lessons from the National Programme and develop a clear plan, including estimates of costs and benefits and a realistic timetable.” [PAC 2013][9]

Can we see any lessons drawn on today in care.data? Or any in Jeremy Hunt’s speech, or his refusal to comment on costs for the paperless NHS plans, reported by the HSJ at NHSExpo15?

While history repeats itself and “estimates of costs and benefits and a realistic timetable” continue to be absent in the care.data programme, the only reason given by Somerset for delay is to fix the specific issue of opt out:

“The National Data Guardian for health and care, Dame Fiona Caldicott, will… provide advice on the wording for a new model of consents and opt-outs to be used by the care.data programme that is so vital for the future of the NHS. The work will be completed by January [2016]…”

Perhaps delay will buy NHS England some time to get itself on track and not only respect public choice on consent, but also deliver a data usage report to shore up trust, and tell us what benefits the programme will deliver that cannot already be delivered today (through existing means, like the CPRD for research [10]).

Perhaps.

care.data plans will only deliver benefits (if you don’t measure costs)

I’ve been told “the realisation of the benefits, which serve the public interest, is dependent on the care.data programme going ahead.” We should be able to see this programme’s costs AND benefits. It is we collectively after all who are paying for it, and for whom we are told the benefits are to be delivered. DH should release the business plan and all cost/benefit/savings  plans. This is a reasonable thing to ask. What is there to hide?

The risk has been repeatedly documented in 2014-15 board meetings that “the project continues without an approved business case”.

The public and medical profession are directly affected by the lack of money, which the Department of Health gives as the reason for reductions in health and social care services. What are we missing out on, to deliver what benefit, that we do not already get elsewhere today?

On the pilot work continuing, the statement from NHS England reads: “The public interest is best served by a proper debate about the nature of a person’s right to opt out of data sharing and we will now have clarity on the wording for the next steps in the programme,” 

I’d like to see that ‘proper debate’ at public events. The NIB leadership avoids answering hard questions, even when they are asked in advance, as requested. Questions such as mine go unanswered:

“How does NHS England plan to future proof trust and deliver a process of communications for the planned future changes in scope, users or uses?”

We’re expected to jump on for the benefits, but not ask about the cost.

care.data plans have no future costs (just as long as they’re unknown)

care.data isn’t only an IT infrastructure enhancement and the world’s first population-wide database of 60m primary care records. It’s a massive change platform through which the NHS England Commissioning Board will use individual level business intelligence to reshape the health service. A massive change programme that commodifies patient confidentiality as a kick-starter for economic growth. This is often packaged together with improvements for patients and requirements for patient safety, so that explanations conflate the use of records in direct care with secondary uses.

“Without interoperable digital data, high quality effective local services cannot be delivered; nor can we achieve a transformation in patient access to new online services and ‘apps’; nor will the NHS maximise its opportunity to be a world centre in medical science and research.” [NHS England, September 1 2015] 

So who will this transformation benefit? Who and what are all its drivers? Change is expensive. It costs time and effort and needs investment.

Blackburn with Darwen’s Healthwatch appear to have received £10K for care.data engagement, as stated in their annual report. Somerset’s costs are less clear. We can only assume that Hampshire, expecting a go-live ‘later in 2015’, has also had costs. Were any of their patient facing materials already printed for distribution, their ‘allocated-under-austerity’ budgets spent?

care.data is not a single destination but a long journey with a roadmap of plans for incremental new datasets and expansion of new users.

The programme should already know and be able to communicate the process behind informing the public of future changes to ensure future use will meet public expectations in advance of any change taking place. And we should know who is going to pay for that project lifetime process, and ongoing change management. I keep asking what that process will be and how it will be managed:

June 17 2015, NIB meeting at the King’s Fund Digital Conference on Health & Social Care:

[Image: my question to the NIB, 17 June 2015]

September 2 2015, NIB Meeting at NHS Expo 15:

[Image: my question to the NIB, 2 September 2015]

It goes unanswered time and time again, despite all the roadmaps and plans for change.

These projects are too costly to fail. They are too costly to justify only having transparency applied after the event, when forced to do so.

care.data plans are never late (just as long as there is no artificial deadline)

So back to my original question. If you cancel a train does it still show up in the delayed trains statistics? I suppose if the care.data programme claims there is no artificial deadline, it can never be late. If you stop setting measurable deadlines to deliver against, the programme can never be delayed. If there is no budget set, it can never be over it. The programme will only deliver benefits, if you never measure costs.

The programme can claim it is in the public interest for as long as we are prepared to pay with an open public purse and wait for it to be on track.  Wait until data are ready to be extracted, which the notice said:

…” is thought to remain a long way off.” 

All I can say to that, is I sure hope so. Right now, it’s not fit for purpose. There must be decisions on content and process arrived at first. But we also deserve to know what we are expecting of the long journey ahead.

On time, under budget, and in the public interest?

As long as NHS England is the body both applying and measuring the criteria, it fulfils them all.

*******

[1] Somerset CCG announces delay to care.data plans https://www.somersetlmc.co.uk/caredatapaused

[2] NHS England reply to Somerset announcement reported in Government Computing http://healthcare.governmentcomputing.com/news/ccg-caredata-pilot-work-continues-4668290

[3] Healthwatch bulletin: care.data meetings cancelled http://us7.campaign-archive1.com/?u=16b067dc44422096602892350&id=5dbdfc924c

[4] Building public trust: after the NIB public engagement in Bristol https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[5] GPES IAG http://www.hscic.gov.uk/media/12911/GPES-IAG-Minutes-12-September-2013/pdf/GPES_IAG_Minutes_12.09.13.pdf

[6] The Register – Right, opt out everybody! hated care.data programme paused again http://www.theregister.co.uk/2015/09/08/hated_caredata_paused_again_opt_out/

[7] Pulse Today care.data MPA rating http://www.pulsetoday.co.uk/your-practice/practice-topics/it/caredata-looks-unachievable-says-whitehall-watchdog/20010381.article#.VfMXYlbtiyM

[8] Major Projects Authority https://engage.cabinetoffice.gov.uk/major-projects-authority/

[9] The PAC 2013 http://www.parliament.uk/business/committees/committees-a-z/commons-select/public-accounts-committee/news/npfit-report/

[10] Clinical Practice Research Datalink (CPRD)

***

image source: http://glaconservatives.co.uk/news/london-commuters-owed-56million-in-unclaimed-refunds-by-rail-operators/

 

Building Public Trust [5]: Future solutions for health data sharing in care.data

This wraps up my series of thoughts on ‘Building Public Trust’ since the NIB Bristol meeting on July 24th.

It has looked at how to stop chasing public trust and instead become organisations that can be trustworthy [part 1]; what behaviours make an organisation trustworthy [part 2]; why fixing the Type 2 opt out is a vital first step [part 3]; and why being blinded by ‘the benefits’ is not the answer [part 4], while giving balanced and fair explanations of programme purposes, commissioning and research, is beneficial to communicate.

So I want to wrap up by suggesting how communications can be improved in content and delivery. Some ideas will challenge the current approach.

Here in part five, Future solutions, I suggest why aiming to “Build Public Trust” through a new communications approach may work better for the public than the past approach did. I’ll propose that communications on care.data:

  • Review content: what would ethical, accurate content look like?
  • Strengthen relationships for delivery: don’t attempt to rebuild trust where there is now none, but strengthen the channels the public already view as trustworthy
  • Rethink why you communicate, and the plan for when: all communications need to be delivered through a conversation, with real listening and action based upon it. Equal priority must be given to a communications plan for today and one for the future. A mechanism for future change communications must be set out now, before the pathfinders begin
  • Since writing this, the Leeds area CCGs have released their ‘data sharing’ comms leaflet. I have reviewed this in detail and give my opinions as a case study.

NIB workstream 4 underpins the NHS digital future and aims to build and sustain public trust, delivering plans for consent based information sharing and assurance of safeguards. It focuses on four areas: governance and oversight, project risks, consent and genomics:

“The work will begin in 2015 and is expected to include deliberative groups to discuss complex issues and engagement events, as well as use of existing organisations and ways to listen. There will also be a need to listen to professional audiences.”  [NIB work stream 4] [ref 1]

Today’s starting point in trust, the trust that enables two-way communication, could hardly be worse, with both professional and public audiences. Communications are packaged in mistrust:

“Relations between the doctors’ union and Health Secretary Jeremy Hunt hit a new low following his announcement in July that he was prepared to impose seven-day working on hospital doctors in England.” [BBC news, Aug 15, 2015]

There appears to be divided opinion between politicians and civil servants.

Right now, the Department of Health seems to be sabotaging its own plans for success at every turn.

What reason can there be for denying debate in the public domain of the very plans it says are the life blood of the savings central to the NHS future?

Has the Department learned nothing from the loss of public and professional trust in 2014?

And as regards the public in engagement work, Hetan Shah, executive director of the Royal Statistical Society said in 2014, “Our research shows a “data trust deficit”. In this data rich world, companies and government have to earn citizens’ trust in how they manage and use data – and those that get it wrong will pay the price.’ [RSS Data Trust Deficit, lessons for policymakers, 2014] [2]

Where do the NIB work stream discussions want to reach by 2020?

“The emergence of genomics requires a conversation about what kind of consent is appropriate by 2020. The work stream will investigate a strand of work to be led by an ethicist.” [NIB work stream 4]

Why is genomics here in workstream 4, when datasharing for genomics is with active consent from volunteers? Why will a strand of work be led by an ethicist for this, and not other work strands? Is there a gap in how their consent is managed today or in how consent is to be handled for genomics for the future? It seems to me there is a gap in what is planned and what the public is being told here. It is high time for an overdue public debate on what future today’s population-wide data sharing programme is building. Good communication must ensure there are no surprises.

The words I underlined from the work stream 4 paper highlight the importance of communication: to listen and to have a conversation. Despite all the engagement work of 2014, I feel that is still to happen. As one participant summed up later, “They seem hell bent on going ahead. I know they listened, but what did they hear?” [3]

care.data pathfinder practices are apparently ready to roll out communications materials: “Extraction is likely to take place between September and November depending on how fair processing testing communications was conducted” [Blackburn with Darwen HW]

So what will patient facing materials look like in content? How will they be rolled out?

Are pathfinder communications more robust than 2014 materials?

I hope the creatives will also think carefully about the intent of the communications to be delivered. Is it to fully and ethically inform patients about their choice whether to accept, or opt out from, changes in their data access, management, use and oversight? Or is the programme guidance to minimise the opt out numbers?

The participants are not signing up to a one-time, single use marketing campaign, but to a lifetime of data use by third parties. Third parties whose roles and purposes remain loosely defined.

It is important when balancing this decision not to forget that data that is available but not used wisely could fail to mitigate risk; for example, in identifying pharmaceutical harms.

At the same time, to collect all data for all purposes under that ‘patient safety and quality’ umbrella theme is simplistic, and lends itself in some ways to lazy communications.

Patients must also feel free and able to make an informed decision without coercion; that includes not making them feel guilty for opting out.

The wording used in the past was weighted towards the organisation’s preference. The very concept of “data sharing” is weighted positively towards the organisation, even though in reality the default is for data to be taken by the organisation, not donated by the citizen. In other areas of life, this is recognised as an unwilling position for the citizen to be in.

At the moment I feel that the scope of purposes, both today and future, is not clearly defined enough in communications or plans for me personally to be able to trust them. Withholding information about how digital plans will fit into the broader NHS landscape and what data sharing will mean beyond 2020 appears, rightly or wrongly, suspicious. Department of Health, what are you thinking?

What the organisation says it will do, it must do and be seen to do, to be demonstrably trustworthy.

This workstream carries two important strands of governance and oversight which now need to be seen to happen: implementing the statutory footing of the National Data Guardian, talked about since October 2014 and promised ‘at the earliest opportunity’, yet rather long in coming; and ‘a whole system’ that respects patient choice. What will this look like, and how will it take into account the granular level of choices asked for at care.data listening events through 2014?

“By April 2016 NIB will publish, in partnership with civil society and patient leaders, a roadmap for moving to a whole-system, consent-based approach, which respects citizens’ preferences and objections about how their personal and confidential data is used, with the goal of implementing that approach by December 2020.”

‘By December 2020’ is still some time away, yet the care.data pathfinders roll on now regardless. Proof that what was said about data use is what actually happens to data, and that what is communicated is trustworthy, depends on a system that records and shares consent decisions, “and can provide information on the use to which an individual’s data has been put. Over the longer term, digital solutions will be developed that automate as far as possible these processes.”

Until then what will underpin trust to show that what is communicated is done, in the short term?

Future proofing Communications must start now

Since 2013 the NHS England care.data approach appeared to want a quick data grab without long term future-proofed plans. Like the hook-up app approach to dating.

To enable the NIB 2020 plans and beyond, to safeguard research in the public interest, all communications must shape a trusted long term relationship.

To ensure public trust, communications content and delivery can only come after changes. Which is again why focusing only on communicating the benefits, without discussing the balance of risk, does not work. That’s what 2014 patient facing communications tried.

In 2014 there were challenges on communications that were raised but not answered: on reaching those who are digitally excluded, on reaching those for whom reading text is a challenge, and on deciding who the target audience will be, considering people with delegated authority, young and old, as well as those who go in and out of GP care throughout their lives, such as some military personnel. Has that changed?

In February 2014 Health Select Committee member Sarah Wollaston, now Chair, said: “There are very serious underlying problems here that need to be addressed.”

If you change nothing, you can expect nothing to change in public and professional feeling about the programme. Communications cannot in 2015 simply revamp the layout and packaging. There must be a change in content and in the support given in its delivery. Change means that you need to stop doing some things and start doing others.

In summary for future communications to support trust, I suggest:

1. STOP: delivering content that is biased towards what the organisation wants to achieve, often focused on the fair processing requirement, under a coercive veil of patient safety and research

START: communicating with an entirely ethics-based approach, reconsidering all patient data held at HSCIC, and asking whether it is right to omit ‘commercial use’, to omit the balanced risks identified in the privacy impact assessment, and to state ‘your name is not included’.

2. STOP: consider all the releases of health data held by HSCIC again, and decide for each type whether they are going to deliver public confidence that your organisations are trustworthy.

START: communicate publicly which commercial companies, re-users and back-office users would no longer be legally eligible to receive data, and why. Demonstrate which organisations received data in the past that will not in future.

3. STOP: the Department of Health and NHS England undermining trust in their own leadership, through public communications that voice opposition to medical professional bodies. Doctors are trusted much more than politicians.

START: strengthen the public-GP relationship that is already well trusted. Strengthen the GP position that will in turn support the organisational-trust-chain that you need to sustain public support. 

4. STOP: delaying the legislative changes needed on the National Data Guardian and on penalties for data misuse

START: implement them, and clearly explain them in Parliament and the press

5. STOP: rushing through short term short-cuts to get ‘some’ data while ignoring the listening done with the public, who asked for choice

START: design a thorough granular consent model fit for the 21stC and beyond, and explain to the public what it will offer; the buy-in for bona fide research will be much greater (be prepared to define ‘research’!). A toy illustration of what such a model might record is sketched after this list.

6. STOP: saying that future practices have been changed and that security and uses are now more trustworthy than in the past. Don’t rush to extract data until you can prove you are trustworthy.

START: demonstrate to individuals who receives their data, through a data use report. Who future users are in practice can only be shown through a demonstrable tool, so that people can see your word can be relied upon in practice. This will, I am convinced, lower the opt out rate.

 Point 6 is apparently work-in-progress. [p58]
[Image: NIB 2015 plans, p58]

7. STOP: rolling out the current communications approach without any public position on which changes will mean we are notified, before any new purpose for or user of our data in future

START: design a thorough change communications model fit for the 21stC and beyond, and tell the public in THIS round of communications which changes of user or purpose will trigger a notification, enabling them to opt out BEFORE a future change. I.e. in a fictional future: if the government decided that the population-wide database should be further commercialised ‘for the purposes of health’, linked to the NHSBT blood donor registry and sold to genomic research companies, how would I as a donor be told, BEFORE the event?
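As flagged in point 5, here is a minimal sketch of how granular consent, pre-change notification and a data use report could fit together. Every name in it is my own invention for illustration, assuming purpose-based opt-outs; it is not any real NHS system or proposal:

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List, Set

    @dataclass
    class ConsentRecord:
        patient_id: str
        permitted_purposes: Set[str]                   # e.g. {"direct_care", "bona_fide_research"}
        uses: List[str] = field(default_factory=list)  # audit trail, shown back to the patient

    def release_data(record: ConsentRecord, recipient: str, purpose: str) -> bool:
        """Release only happens if this purpose was consented to, and it is always logged."""
        if purpose not in record.permitted_purposes:
            return False
        record.uses.append(f"{date.today().isoformat()}: {recipient} ({purpose})")
        return True

    def notify(patient_id: str, message: str) -> None:
        print(f"To {patient_id}: {message}")  # stand-in for a letter, SMS or app message

    def propose_new_purpose(records: List[ConsentRecord], purpose: str) -> None:
        """A change of purpose or user triggers notification FIRST; nothing is extracted
        under the new purpose until each person has had the chance to opt out."""
        for record in records:
            notify(record.patient_id,
                   f"Proposed new use of your data: {purpose}. You can opt out before it begins.")

    def data_use_report(record: ConsentRecord) -> str:
        """The report an individual could check, to see that the programme's word was kept."""
        return "\n".join(record.uses) or "No uses of your data to date."

The ordering is the whole point: notification and the chance to opt out come BEFORE any extraction under a new purpose, and every release is logged where the individual can see it.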

There are still unknowns in content and future scope that make communications difficult. If you don’t know what you’re saying, how to say it is hard. But what is certain is that future changes in the programme are planned, and how to communicate them with the public and professionals must be designed now, so that what we are signed up for today stays what we signed up for.

In delivering messages about data sharing and the broader NHS, the DH and NHS England should consider their relationships and behaviours carefully; all communication becomes relevant to trust.

Solutions cannot be thought of only in terms of tools, nor of what can be imposed on people, but of what can be achieved with people.

That’s people from the public and professionals and the programme working with the same understanding of the plans together, in a trusted long term relationship.

For more detail including my case study comments on the Leeds area CCGs comms leaflet, continue reading below.

Thanks for sharing in discussions of ideas in my five part post on Building public trust – a New Approach. Comments welcome.
