
Ethically problematic

Five years ago, researchers at the University of Manchester's School of Social Sciences wrote, "It will no longer be possible to assume that secondary data use is ethically unproblematic."

Five years on, other people's use of the language of data ethics puts social science at risk. Event after event, we are witnessing the gradual dissolution of the value and meaning of 'ethics' into little more than a buzzword.

Companies and organisations are using the language of 'ethical' behaviour, blended with 'corporate responsibility' modelled after their own values, as a way to claim competitive advantage.

Ethics is becoming shorthand for 'we're the good guys'. It is being subverted by personal data users' self-interest: not to address concerns over the effects of data processing on individuals or communities, but to justify doing it anyway.

An ethics race

There’s certainly a race on for who gets to define what data ethics will mean. We have at least three new UK institutes competing for a voice in the space. Digital Catapult has formed an AI ethics committee. Data charities abound. Even Google has developed an ethical AI strategy of its own, in the wake of their Project Maven.

Lessons learned in public data policy should be clear by now. There should be no surprises in how administrative data about us are used by others. We should expect fairness. Yet these basics still seem hard for some to accept.

The NHS Royal Free Hospital was rightly criticised over its 2015 deal, because it tried "to commercialise personal confidentiality without personal consent," as reported in Wired recently.

"The shortcomings we found were avoidable," wrote Elizabeth Denham in 2017, when the ICO found six ways the Google DeepMind–Royal Free deal did not comply with the Data Protection Act. The price of innovation, she said, didn't need to be the erosion of fundamental privacy rights underpinned by the law.

If the Centre for Data Ethics and Innovation is put on a statutory footing, where does that leave the ICO when their views differ?

It's why the idea of DeepMind funding work in Ethics and Society seems incongruous to me. I wait to be proven wrong. In their own words, "technologists must take responsibility for the ethical and social impact of their work". Breaking the law, however, is conspicuous by its absence, and the Centre must not be used by companies to generate pseudo-lawful or pseudo-ethical acceptability.

Do we need new digital ethics?

Admittedly, not all laws are good laws. But if recognising and acting under the authority of the rule of law is now an optional extra, it will undermine the ICO, sink public trust, and destroy any hope of achieving the research ambitions of UK social science.

I am not convinced there is any such thing as digital ethics. The claimed gap in our ability to get things right in this complex area too often appears only after people get caught doing something wrong. Technologists abdicate accountability saying "we're just developers," and sociologists say, "we're not tech people."

These shrugs of the shoulders by third parties should not be rewarded with more data access, or new contracts. Get it wrong, get out of our data.

This lack of acceptance of responsibility creates a sense of helplessness. We can’t make it work, so let’s make the technology do more. But even the most transparent algorithms will never be accountable. People can be accountable, and it must be possible to hold leaders to account for the outcomes of their decisions.

But it shouldn't be surprising that no one wants to be held to account. The consequences of some of these data uses are catastrophic.

Accountability is the number one problem to be solved right now. It includes openness about data errors, uses, outcomes, and policy. Are commercial companies with public sector contracts checking that data are accurate, and corrected by the people the data are about, before applying them in predictive tools?

Unethical practice

As Tim Harford in the FT once asked about Big Data uses in general: “Who cares about causation or sampling bias, though, when there is money to be made?”

Problem area number two, whether researchers are working towards a profit model or chasing grant funding, is this:

How can data users make unbiased decisions about whether they should use the data? The same bodies that decide on data access also oversee its governance. A conflict of interest is built in by default, and the allure of new data territory is tempting.

But perhaps the UK's key public data ethics problem is that policy is currently too often about the system goal, not about improving the experience of the people using the systems. It is not using technology as a tool, as if people mattered. Harmful policy can generate harmful data.

Secondary uses of data are intrinsically dependent on the ethics of the data's operational purpose at collection. Damage-by-design is evident right now across a range of UK commercial and administrative systems. Metrics of policy success, and the associated data, may simply be wrong.

Some of the damage is done by collecting data for one purpose and using it operationally for another in secret. Until these modi operandi change, no one should think that "data ethics will save us".

Some of the most ethical research aims try to reveal these problems. But we also need to recognise that not all research would be welcomed by the people the research is about, and few researchers want to talk about it. Among hundreds of already-approved university research ethics board applications I've read, some were desperately lacking. An organisation is no more ethical than the people who make decisions in its name. People disagree on what is morally right. People can game data input and outcomes, and fail reproducibility. Markets and monopolies of power bias aims. Trying to support the next cohort of PhDs, and impact for the REF, shapes priorities and values.

"Individuals turn into data, and data become regnant." Data are often lacking in quality and completeness, and are given authority they do not deserve.

It is still rare to find informed discussion among the brightest and best of our leading data institutions about the extensive everyday real-world secondary data use across public authorities, including where that use may be unlawful and unethical, like buying from data brokers. Research users are pushing those boundaries for more and more, without public debate. Who says what's too far?

The only way is ethics? Where next?

The latest academic-commercial mash-ups on why we need new data ethics, in a new regulatory landscape where the established is seen as past it, are a dangerous catch-all 'get out of jail free' card.

Ethical barriers are out of step with some of today's data politics. The law is being sidestepped, regulation is diminished by a lack of enforcement against gratuitous data grabs from the Internet of Things, and social media data are seen as a free-for-all. Data access barriers are unwanted. What is left to prevent harm?

I'm certain that we first need to take a step back if we are to move forward. Ethical values are founded on human rights that existed before data protection law: fundamental human decency; rights to privacy and to freedom from interference; common law confidentiality; tort; and professional codes of conduct on conflict of interest and confidentiality.

Data protection law emphasises data use. But too often its first principles of necessity and proportionality are ignored. Ethical practice would ask more often: should we collect the data at all?

Although the GDPR requires new safeguards to ensure that technical and organisational measures are in place to control and process data, and there is a clearly defined Right to Object, I have yet to see a single event giving this any thought.

Let’s not pretend secondary use of data is unproblematic, while uses are decided in secret. Calls for a new infrastructure actually seek workarounds of regulation. And human rights are dismissed.

Building a social license between data subjects and data users is unavoidable if use of data about people hopes to be ethical.

The lasting solutions are underpinned by law, and ethics. Accountability for risk and harm. Put the person first in all things.

We need more than hopes and dreams and talk of ethics.

We need realism if we are to get a future UK data strategy that enables human flourishing, with public support.

Notes of desperation or exasperation are increasingly evident in discourse on data policy, and start to sound little better than ‘we want more data at all costs’. If so, the true costs would be lasting.

Perhaps then it is unsurprising that there are calls for a new infrastructure to make it happen, in the form of Data Trusts. Some thoughts on that follow too.


Part 1. Ethically problematic

Ethics is dissolving into little more than a buzzword. Can we find solutions underpinned by law, and ethics, and put the person first?

Part 2. Can Data Trusts be trustworthy?

As long as data users ignore data subjects' rights, Data Trusts have no social license.


Elliot, M., Purdam, K., Mackey, E. (2013). Data Horizons: New Forms of Data for Social Research. School of Social Sciences, The University of Manchester. CCSR Report 2013-3, 12/6/2013.

care.data listening events and consultation: The same notes again?

If lots of things get said in a programme of events, and nothing is left around to read about it, did they happen?

The care.data programme 2014-15 listening exercise and action plan has become impossible to find online. That’s OK, you might think, the programme has been scrapped. Not quite.

You can give your views online until September 7th on the new consultation, "New data security standards and opt-out models for health and social care", and/or attend the new listening events: September 26th in London, October 3rd in Southampton, and October 10th in Leeds.

The Ministerial statement on July 6 announced that NHS England had taken the decision to close the care.data programme, after the review of data security and consent by Dame Fiona Caldicott, the National Data Guardian for Health and Care.

But the same questions are being asked again around consent and the use of your medical data from primary and secondary care. What the very long questionnaire asks, in effect, is: do you want to keep your medical history private? You can answer only Q15 if you want.

Ambiguity again surrounds what constitutes “de-identified” patient information.

What is clear is that public voice seems to have been deleted or lost from the care.data programme along with the feedback and brand.

People spoke up in 2014, and acted. The opt-out that 1 in 45 people chose between January and March 2014 was put into effect by the HSCIC in April 2016. Now, it seems, that might be revoked.

We've been here before. There is no way that primary care data can be extracted without consent without causing further disruption and damage to public trust and public interest research. The future plan for linkage between all primary care data, secondary data and genomics for secondary uses is untenable without consent.

Upcoming events cost time and money, and will almost certainly go over the same ground that hours and hours were spent on in 2014. However, if they do achieve a meaningful response rate, then I hope the results will not be lost, and will be combined with those already captured in the 'care.data listening events' responses. Will they have any impact on what consent model there may be in future?

So what we gonna do? I don’t know, whatcha wanna do? Let’s do something.

Let's have accredited access and security fixed. While there may now be more transparency and process around release, there are still problems about who gets data and what they do with it.

Let's have clear future scope and control. There is still no plan to give the public rights to control or delete data if we change our minds about who can have it, or for what purposes. And that is very uncertain. After all, they might decide to privatise or outsource the whole thing, as was planned for the CSUs.

Let's have answers to everything already asked but still unanswered. The questions in the previous Caldicott review have yet to be answered.

We have the possibility to see health data used wisely, safely, and with public trust. But we seem stuck with the same notes again. And the public seem to be the last to be invited to participate, and their views, once gathered, seem to be disregarded. I hope to be proved wrong.

Might, perhaps, the consultation deliver the nuanced consent model discussed at public listening exercises that many asked for?

Will the care.data listening events feedback summary be found, and will its 2014 conclusions and the enacted opt-out be ignored? Will the new listening events' views make more difference than in 2014?

Is public engagement, engagement, if nobody hears what was said?

Datasharing, lawmaking and ethics: power, practice and public policy

“Lawmaking is the Wire, not Schoolhouse Rock. It’s about blood and war and power, not evidence and argument and policy.”

"We can't trust the regulators," they say. "We need to be able to investigate the data for ourselves." Technology seems to provide the perfect solution. Just put it all online - people can go through the data while trusting no one.  There's just one problem. If you can't trust the regulators, what makes you think you can trust the data?" 

Extracts from The Boy Who Could Change the World: The Writings of Aaron Swartz. Chapter: 'When is Technology Useful?', June 2009.

The question keeps getting asked: is the concept of ethics obsolete in Big Data?

I've come to some conclusions about why 'Big Data' use keeps pushing the boundaries of what many people find acceptable, and yet the people doing the research, the regulators and lawmakers often express surprise at negative reactions. Some even express disdain for public opinion, dismissing it as ignorant, as not 'understanding the benefits', as yet to be convinced. I've decided why I think what is considered 'ethical' in data science does not meet public expectation.

It’s not about people.

Researchers using large datasets often have a foundation in data science, applied computing or maths, and don't see data as people. It's only data. Creating patterns, correlations, and analyses of individual-level data is not seen as research involving human subjects.

This is embodied in the many research ethics reviews I have read in the last year in which the question is asked: does the research involve people? The answer given is invariably 'no'.

And these data analysts, using, let's say, health data, are not working in a discipline founded on ethical principles, in contrast with the medical world the data come from.

The public feel differently about information that is about them, and may be known only to them or to select professionals. The values that we as the public attach to our data, and our expectations of its handling, may reflect the expectations we have of how we, the people connected to it, are handled. We see our data as all about us.

The values that are therefore put on data, and on how they can and should be used, can be at odds with one another; the public's perception is not reciprocated by the researchers. This may be especially true if researchers are using data which have been de-identified, although they may not be anonymous.
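
To make the linkage risk concrete, here is a minimal sketch, with entirely invented records and column names, of how a 'de-identified' research extract can be re-identified simply by joining it to an identified dataset on a few quasi-identifiers:

```python
# Minimal re-identification-by-linkage sketch.
# All records and column names are invented for illustration.
import pandas as pd

# A "de-identified" health extract: no names, but quasi-identifiers remain.
deidentified = pd.DataFrame({
    "postcode":   ["TA1 1AA", "LS2 9JT", "SO14 2AA"],
    "birth_year": [1972, 1985, 1990],
    "sex":        ["F", "M", "F"],
    "diagnosis":  ["diabetes", "depression", "asthma"],  # the sensitive field
})

# An identified dataset, e.g. an edited electoral roll or a marketing list.
register = pd.DataFrame({
    "name":       ["A. Smith", "B. Jones", "C. Patel"],
    "postcode":   ["TA1 1AA", "LS2 9JT", "SO14 2AA"],
    "birth_year": [1972, 1985, 1990],
    "sex":        ["F", "M", "F"],
})

# A plain join on the quasi-identifiers re-attaches names to diagnoses.
linked = deidentified.merge(register, on=["postcode", "birth_year", "sex"])
print(linked[["name", "diagnosis"]])
```

No attack or special access is needed; a second dataset and a join are enough. 'De-identified' is not the same as 'anonymous'.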

New legislation on the horizon, the Better Use of Data in Government measures, intends to fill the [loop]hole between what was legal to share in the past and what some want to exploit today, and it highlights a gap between the uses of data by public interest, academic researchers and the uses by government actors. The first, by and large, incorporate privacy and anonymisation techniques by design; the second are designed for applied use of identifiable data.

Government departments and public bodies want to identify and track people who are somehow misaligned with the values of the system, whether through fraud, debt, Troubled Families, or owed Student Loans. All highly sensitive subjects. But their ethical data science framework will not treat them as individuals, but only as data subjects. Or as groups who share certain characteristics.

The system again intrinsically fails to see these uses of data as being about individuals, but sees them as categories of people – “fraud” “debt” “Troubled families.” It is designed to profile people.

Services that weren't built for people but for government processes result in datasets, used in research, that aren't well designed for research. So we now see attempts by modern data science practitioners to shoehorn historical practices into data use, with policy that is shortsighted.

We can’t afford for these things to be so off axis, if civil service thinking is exploring “potential game-changers such as virtual reality for citizens in the autism spectrum, biometrics to reduce fraud, and data science and machine-learning to automate decisions.”

In an organisation such as the DWP this must be really well designed, since "the scale at which we operate is unprecedented: with 800 locations and 85,000 colleagues, we're larger than most retail operations."

The power to affect individual lives through poor technology is vast, and some impacts seem to be being badly ignored. The 'real time earnings' database, which improved the accuracy of benefit payments, was widely agreed to have been harmful to some individuals through the Universal Credit scheme, with delayed payments meaning families at foodbanks, and contributing to worse.

"We believe execution is the major job of every business leader" – perhaps not the best wording on DWP data uses.

What accountability will be built-by design?

I've been thinking recently about drawing a social ecological model of personal data empowerment or control: a visualisation of wants, gaps and consent models, to show rather than tell policy makers where these gaps exist in public perception and expectations, policy and practice. If anyone knows of one on data, please shout. I think it might be helpful.
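
As a starting point, and purely as my own sketch of what such a model might contain, the layers could look something like this (the labels and examples are my assumptions, not an established framework):

```python
# A hypothetical sketch of a social ecological model of personal data control.
# Layers and examples are illustrative assumptions only.
data_control_model = {
    "individual":     ["awareness of data held", "consent choices", "opt-out rights"],
    "interpersonal":  ["GP-patient trust", "family members' linked records"],
    "organisational": ["collection practices", "release registers", "audit"],
    "community":      ["local Healthwatch", "patient groups", "public engagement"],
    "policy":         ["data protection law", "scope of statutory gateways"],
}

# Print the model from the person outwards, to show where gaps might sit.
for layer, examples in data_control_model.items():
    print(f"{layer:>14}: {', '.join(examples)}")
```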

But the data *is* all about people

Regardless of whether they are in front of you or numbers on a screen, big or small datasets using data about real lives are data about people. And that triggers a need to treat the data with an ethical approach, as you would people involved face-to-face.

Researchers need to stop treating data about people as meaningless data, because that's not how people think about their own data being used. Not only that, but if the whole point of your big data research is to have impact, your data outcomes will change lives.

Tosh, I know some say. But, I have argued, the reason is that the applications of the data science / research / policy findings / impact of immigration in education review / [insert purposes of the data user's choosing] are designed to have impact on people. Often the people about whom the research is done, without their knowledge or consent. And while most people say that is OK where it's public interest research, the possibilities are outstripping what the public has expressed as acceptable, and few seem to care.

Evidence from public engagement and ethics all says that hidden pigeon-holing, profiling, is unacceptable. Data protection law has special requirements for it, on automated decisions. 'Profiling' is now clearly defined under Article 4 of the GDPR as "any form of automated processing of personal data consisting of using those data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements."
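
To see how low the bar is, here is a toy sketch of the kind of processing that definition captures; the features, weights and threshold are invented for illustration, not taken from any real system:

```python
# Toy illustration of GDPR-style 'profiling': automated processing of
# personal data to evaluate or predict aspects of a person's life.
# Features, weights and threshold are invented for illustration only.

def risk_score(person: dict) -> float:
    """A weighted sum over personal attributes - a minimal 'profile'."""
    weights = {"missed_payments": 0.6, "benefit_claims": 0.3, "address_changes": 0.1}
    return sum(weights[k] * person.get(k, 0) for k in weights)

applicant = {"missed_payments": 2, "benefit_claims": 1, "address_changes": 4}
score = risk_score(applicant)

# An automated decision taken on the score, with no human in the loop,
# is exactly the kind of step data protection law places special requirements on.
flagged_for_investigation = score > 1.0
print(score, flagged_for_investigation)
```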

Using big datasets for research that 'isn't interested in individuals' may still intend to create results profiling groups for applied policing, or to discriminate, by making knowledge available by location. The data may have been de-identified, but in application they become no longer anonymous.

Big Data research that results in profiling groups may be intended for good, by the very point of the research: applied health policy impact with the intent of improving a particular ethnic minority's access to services, for example.

Then look at the voting process changes in North Carolina, and see how that same data, the same research knowledge, might be applied to exclude, to restrict rights, and to disempower.

Is it possible to have ethical oversight that can protect good data use and protect people’s rights if they conflict with the policy purposes?

The "clear legal basis" is not enough for public trust

Data use can be legal and still be unethical, harmful and shortsighted in many ways: for the impacts on research – in terms of people withholding data, falsifying data, and avoiding the system so as not to give data at all – and for the lives it will touch.

What education has to learn from health is whether it will permit uses by 'others' outside education to jeopardise the collection of school data intended for the best interests of children, not the system. In England it must start to analyse what is needed versus what is wanted; what is necessary and proportionate, and what justifies maintaining named data indefinitely, exposed to changing scope.

In health, the most recent Caldicott review suggests scope change by design – that is a red line for many: “For that reason the Review recommends that, in due course, the opt-out should not apply to all flows of information into the HSCIC. This requires careful consideration with the primary care community.”

The community already spoke out, strongly, in spring and summer 2014: there must be an absolute right to confidentiality to protect patients' trust in the system. Scope that 'sounds' like it might sneakily change in future will be a death knell to public interest research, because repeated trust erosion will be fatal.

Laws change to allow scope change without informing people whose data are being used for different purposes

Regulators must be seen to be trusted, if the data they regulate is to be trustworthy. Laws and regulators that plan scope for the future watering down of public protection, water down public trust from today. Unethical policy and practice, will not be saved by pseudo-data-science ethics.

Will those decisions in private political rooms be worth the public cost to research, to policy, and to the lives it will ultimately affect?

What happens when the ethical black holes in policy, lawmaking and practice collide?

Towards the end of the day at the last UK HealthCamp, when we discussed the hard things, the topic inevitably moved swiftly to consent, to building big databases, to public perception, and to why anyone would think there is potential for abuse when clearly the intended use is good.

The answer came back from one of the participants, “OK now it’s the time to say. Because, Nazis.” Meaning, let’s learn from history.

Given the state of UK politics, Go Home van policies, restaurant raids, the possibility of Trump getting access to UK sensitive data of all sorts from across the Atlantic, given recent policy effects on the rights of the disabled and others, I wonder if we would hear the gentle laughter in the room in answer to the same question today.

With what is reported as a sharp change in Whitehall's digital leadership today, the future of digital in government services, policy and lawmaking does indeed seem to be more "about blood and war and power" than "evidence and argument and policy".

The concept of ethics in datasharing using public data in the UK is far from becoming obsolete. It has yet to begin.

We have ethical black holes in big data research, in big data policy, and in big data practices in England. The conflicts are there: between public interest research and government uses of population-wide datasets; between how the public perceive the use of our data and how they are actually used; in the gaps and tensions between policy and practice.

We are simply waiting for the Big Bang. Whether it will be creative, or destructive we are yet to feel.

*****

image credit: LIGO – graphical visualisation of black holes on the discovery of gravitational waves

References:

Report: Caldicott review – National Data Guardian for Health and Care Review of Data Security, Consent and Opt-Outs 2016

Report: The One-Way Mirror: Public attitudes to commercial access to health data

Royal Statistical Society Survey carried out by Ipsos MORI: The Data Trust Deficit

Thoughts since #UKHC15. UK health datasharing.

The world you will release your technology into is the world you are familiar with, which is already of the past. Based on old data.

How can you design tools and systems fit for the future? And for all?

For my 100th post, and the first of 2016, here is a summary of some of my thoughts prompted by #UKHC15: several grains of thought related to UK health data that have been growing for some time.

1000 words on “Hard things: identity, data sharing and consent.” The fun run version.

Do we confuse hard with complex? Hard does not have to mean difficult. Some things seem to be harder than necessary, because of politics. I’ve found this hard to write. Where to start?

The search to capture solutions has been elusive.

The starting line: Identity

Then my first thoughts on identity got taken care of by Vinay Gupta in this post, better than I could. (If you want a long read about identity, you might want to get a hot drink like I did and read and re-read. It says it’ll take an hour. It took me several, in absorption and thinking time. And worth it.)

That leaves data sharing and consent. Both of which I have written many of my other 99 posts about in the last year. So what’s new?

Why are we doing this: why aren’t we there yet?

It still feels very much as if, in many parts of the health service and in broader government thinking on 'digital', the attitude is: we need to do something. The why is missing, and therefore achieving and measuring success is hard.

Often we start with a good idea and set about finding a solution for how to achieve it. But if the 'why' behind the idea is shaky to start with, the solution may falter as soon as it gets difficult. No one seems to know what #paperless actually means in practice.

So why try to change things? Fixing problems, rather than coming up with good ideas, is another way to think of it, as was suggested at #ukhc15. It was a meet-up for people who want to make things better, usually for others; sometimes that involves improving the systems they work with directly, or support others in.

I no longer work in systems introductions or enhancement processes, although I have a lay role in research and admin data. But regular readers know most of the last two years has been all about the data. care.data.

More often than not, in #ukhc2015 discussions that focused on "the data", I would try to bring people back to thinking about what the change is trying to solve, what it wants to "make better", and why.

There's a broad tendency to simply think more data = better. Not true, and I'll show a case study later as to why. We must question why.

Why doesn't everyone volunteer, or want to join in?

Very many people who have spoken with me over the last two years have shared their concrete concerns over the plans to share GP data, and they do not get heard. They did not see a need to share their identifiable personal confidential data, or see why truly anonymous data would not be sufficient for health planning, for example.

Homeless men and women at risk, people from the travelling community, those with disabilities, patients with stigmatising conditions, minorities, children, questions of sexual orientation – not to mention lawyers or agencies representing them. Or the 11 million of our adult population not online. Few of whom we spoke about. Few of whom we heard from at #ukhc15. Yet put together, these individuals not only make up a significant number of people, but account for a disproportionately high share of the highest demands on our health and social care services.

The inverse care law appears magnified in its potential when applied to digital, and should magnify the importance of thinking about access. How will care.data make things better for them, and how will the risks be mitigated? And are those costs being properly assessed if there is no assessment of the current care.data business case and seemingly, since 2012 at least, no serious effort to look at alternatives?

The finish line? We can’t see what it looks like yet.

The #ukhc2015 event was well run, and I liked the spontaneity of people braver than me who were keen to lead sessions and did it well. As someone who is white, living in a 'nice' area, I am privileged. It was a privilege to spend a day at #UKHC15, packed with people who clearly think about hard things all the time. People who want to make things better. People who were welcoming to nervous first-timers at an 'un'conference, over a shared lunch.

I hope the voices of those who can't attend these events, and of those outside London, are equally accounted for in all government 2016 datasharing plans.

This may be the last chance after years of similar consultations have failed to deliver workable, consensual public data sharing policies.

We have vast streams of population-wide data stored in the UK, about which the population is largely ignorant. But while the data may be from 25 years ago, whatever is designed today is going to need to think long term: not how do we solve what we know, but how do we design solutions that will work for what we don't.

Transparency here will be paramount to trust, if future decisions are made for us, or those we make for ourselves are 'influenced', by algorithms, machine learning and 'mindspace' work.

As Thurgood Marshall said,

“Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds.”

Control over who we are, and who the system thinks we are, becomes a whole new level of discussion if we are being told how to make a decision, especially where the decision is toward a direction of public policy based on political choice. If pensions are not being properly funded, not allocating taxes differently to fund them is a choice the current government has made, while the DWP seeks to influence our decisions, to make us save more in private pensions.

And how about, in data discussions, we make an effort to start talking a little more clearly in the same terms – and stop packaging 'sharing' as if it were something voluntary, in population-wide compulsory policy.

It's done to us, not with us, in far too many areas of government we do not see. Perhaps this consultation might change that, but it's the 'nth' of many consultations, and I want to be convinced this one is intent on real change. It's only open for a few weeks, and this meet-up for discussion appeared to be something only organised in London.

I hope we'll hear commitment to real change in support of people and the uses of our personal data by the state in the new #UkDigiStrategy, not simply more blue-sky thinking and drinking the 'datasharing' Kool-Aid. We've been talking in the UK for far too long about getting this right.

Let’s see the government serious about making it happen. Not for government, but in the public interest, in a respectful and ethical partnership with people, and not find changes forced upon us.

No other foundation will be fit for a future in which care.data, the phenotype data, is to be the basis for an NHS so totally personalised.

If you want a longer read, read on below for my ten things in detail.

Comments welcome.

########


Parliament's talking about TalkTalk and Big Data like some parents talk about sex. Too little, too late.

Parliament's talking about TalkTalk and Big Data like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company, after the event.

Everyone's been talking about TalkTalk, and for all the wrong reasons. Data loss and a 15-year-old, combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament, Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge "that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?" [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work in the "National Cyber Security Programme" [NCSP] [4]. What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask if government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly not making preventative changes for problems apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences is like saying they're talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government's expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where necessary, for purposes beyond those we expect or that were explained when we submitted our data. And there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk's practices mean it has broken its contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public's trust and reasonable expectations.

Mr Vaizey should apply the same logic to government data handling as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let's Talk Consequences: organisations taking risk seriously, and meaningful consequences if they do not [7]
    • Let's Talk Education: the education of the public on personal data use by others, and the rights and responsibilities we have [8]
    • Let's Talk Parliament's Policies and Practices: about government's own comparable lack of data understanding, and what good practice is in physical storage, good governance and transparent oversight
    • Let's Talk Public Trust: and the question whether government can be trusted with the public data it already has, and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO, now in his own department: "The Government take the UK's cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks."

"I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines."

So what about consequences when data are used in ways the public would consider a loss – not through an attack or a breach, but through government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices, and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, for example as care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this, and make suggestions to support trust, such as 'data should never be (and currently is never) released with personal identifiers' in The Shakespeare Review, have been ignored by government.

Where our personal data are not used well in government departments, by the departments themselves, they seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because they pass data on to others. Practices such as giving individual-level, identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others' personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data use practices, the care.data debacle is evidence that not all its MPs or civil service understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, or how others get access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], I asked civil servants talking about the big upcoming data plans they had announced, linking school data with more further education and employment data, how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, sensitive, individual-level data in at least 8m children's records, from ages 2 to 19. That's twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand emphasis on prevention measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance in any of its many forms is no less real just because you don't see it. Children's health, schools, increases in the volume of tax data collection. We don't discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don't know, it doesn't mean they're not doing it. [15] It just means you don't want to know, because if you find out they're not doing it safely, you'll have to do something about it.

And it might be awkward. (Meanwhile in schools real, meaningful PHSE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options. But the principle should be simple. Our personal data need not only to be protected, but not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can't manage their systems well enough to prevent a child successfully hacking them, then it's not enough to point at criminal behaviour. There is fault to learn from on all sides. In commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government wait, and keep its fingers crossed each month, to see if our data are used safely in unsecured settings by some of the unknown partners data might be onwardly shared with – hoping we won't find out, and it won't need to talk about it or have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let's talk Consequences: the consequences of current failures to meet customers' reasonable expectations of acceptable risk are low compared with elsewhere. As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, "In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers."

[8] Let’s talk education: FOI request revealing a samples of some individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let's Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual-level data, for unknown purposes, from the majority of 60m people, with an unprecedented PR campaign. When I heard the words 'we want a mature debate' it was reminiscent of HSCIC's 'intelligent grown up debate' requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data 'below the radar'.

Change: We need change; the old way, after all, didn't work, according to Minister Matt Hancock: "The old model of government has failed, so we will build a new one." I'd like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

care.data: delayed or not delayed? The train wreck that is always on time

If you cancel a train does it still show up in the delayed trains statistics?

care.data plans are not delayed (just don’t ask Healthwatch)

Somerset CCG's announcement [1] of the delay in its care.data plans came as no surprise, except perhaps to NHS England, which effectively denied it, reportedly saying work continues. [2] Both public statements may be true, but it would have been good professional practice to publicly recognise that a top-down delay affects others who are working hard on the ground to contribute to the effective rollout of the project. Causing confusion and delay is hard to work with. Change and technology projects run on timelines. Deadlines mean that different teams can each do their part and the whole gets done. Or not.

Healthwatch [3] has cancelled its planned public meetings. Given that one of the reasons stated in the care.data CCG selection process was support from local patient groups including Healthwatch, this appears poor public relations. It almost wouldn't matter, but in addition to the practicalities, the organisation and leadership are trying to prove they are trustworthy. [4]




Somerset's statement is straightforward and says it applies to all pathfinders:

“Following a speech by Jeremy Hunt, the Secretary of State for Health this week (3-9-15), in which he outlined his vision for the future use of technology across NHS, NHS England has asked the four care.data pathfinder pilots areas in England (Leeds, Blackburn and Derwent, West Hampshire and Somerset) to temporarily pause their activities.” [Sept 4, Somerset statement]




From when I first read of the GPES IAG concerns [5], I have seen the care.data programme hurtle from one crisis to another. But this is now a train wreck. A very quiet train wreck. No one has cried out much. [6] And yet I think the project, professionals, and the public should be shouting from the top of the carriages that this programme needs help if it is ever to reach its destination.

care.data plans are not late against its business plan (there is none)

Where's the business case? Why can't it define deadlines that it can achieve? In February 2015, I suggested the mentality that allows these unaccountable monster programmes to grow unchecked must die out.

I can't even buy an Oyster card if I don't know if there is money in my pocket. How can a programme which has already spent many millions of pounds keep driving on without a budget? There is no transparency of what financial and non-financial benefits are to be expected to justify the cost. There is no accountable public measure of success checking it stays on track.

While it may be more comfortable for the organisation to deny problems, I do not believe it serves the public interest to hide information. This is supported by the very reason for being of the MPA process, with its 'challenge to Whitehall secrecy' [7], which rated the care.data rollout red [8] in last year's audit. This requires scrutiny to find solutions.

care.data plans do not need to use lessons learned (do they?)

I hope at least there are lessons learned here in the pathfinder on what not to do before the communications rollout to 60m people. In the words of Richard Feynman, "For successful technology, reality must take precedence over public relations."

NHS England is using the public interest test to withhold information: "the particular public interest in preserving confidential communications between NHS England and its sponsoring department [the DH]." I do not believe this serves the public interest if it is used to hide issues and critical external opinion. The argument made is that there is "stronger public interest in maintaining the exemption where it allows the effective development of policy and operational matters on an ongoing basis." The Public Accounts Committee in 2013 called for early transparency and intervention to prevent the ongoing waste of "billions of pounds of taxpayers' money" in its report into the NPfIT. [9] It showed that a lack of transparency and oversight contributed to public harm, not benefit, in that project, under the watch of the Department of Health. The report said:

“Parliament needs to be kept informed not only of what additional costs are being incurred, but also of exactly what has been delivered so far for the billions of pounds spent on the National Programme. The benefits flowing from the National Programme to date are extremely disappointing. The Department estimates £3.7 billion of benefits to March 2012, just half of the costs incurred. This saga [NPfIT] is one of the worst and most expensive contracting fiascos in the history of the public sector.”

And the Public Accounts Committee made a recommendation in 2013:

“If the Department is to deliver a paperless NHS, it needs to draw on the lessons from the National Programme and develop a clear plan, including estimates of costs and benefits and a realistic timetable.” [PAC 2013][9]

Can we see any lessons drawn on today in care.data? Or any in Jeremy Hunt's speech, or his refusal to comment on costs for the paperless NHS plans, reported by the HSJ at NHSExpo15?

While history repeats itself and "estimates of costs and benefits and a realistic timetable" continue to be absent from the care.data programme, the only reason given by Somerset for the delay is to fix the specific issue of the opt-out:

“The National Data Guardian for health and care, Dame Fiona Caldicott, will… provide advice on the wording for a new model of consents and opt-outs to be used by the care.data programme that is so vital for the future of the NHS. The work will be completed by January [2016]…”

Perhaps delay will buy NHS England some time to get itself on track and not only respect public choice on consent, but also deliver a data usage report to shore up trust, and tell us what benefits the programme will deliver that cannot already be delivered today (through existing means, like the CPRD for research [10]).

Perhaps.

care.data plans will only deliver benefits (if you don’t measure costs)

I've been told "the realisation of the benefits, which serve the public interest, is dependent on the care.data programme going ahead." We should be able to see this programme's costs AND benefits. It is we, collectively, after all, who are paying for it, and for whom we are told the benefits are to be delivered. The DH should release the business plan and all cost/benefit/savings plans. This is a reasonable thing to ask. What is there to hide?

The risk has been repeatedly documented in 2014-15 board meetings that “the project continues without an approved business case”.

The public and medical profession are directly affected by the lack of money, given by the Department of Health as the reason for the reductions in service in health and social care. What are we missing out on, to deliver what benefit, that we do not already get elsewhere today?

On the pilot work continuing, the statement from NHS England reads: "The public interest is best served by a proper debate about the nature of a person's right to opt out of data sharing and we will now have clarity on the wording for the next steps in the programme."

I'd like to see that 'proper debate' at public events. The NIB leadership avoids answering hard questions, even if asked in advance as requested. Questions such as mine go unanswered:

“How does NHS England plan to future proof trust and deliver a process of communications for the planned future changes in scope, users or uses?”

We're expected to jump on board for the benefits, but not ask about the cost.

care.data plans have no future costs (just as long as they’re unknown)

care.data isn't only an IT infrastructure enhancement and the world's first population-wide database of 60m primary care records. It's a massive change platform through which the NHS England Commissioning Board will use individual-level business intelligence to reshape the health service. A massive change programme that commodifies patient confidentiality as a kick-starter for economic growth. This is often packaged together with improvements for patients and requirements for patient safety, meaning explanations often conflate use of records in direct care with secondary uses.

“Without interoperable digital data, high quality effective local services cannot be delivered; nor can we achieve a transformation in patient access to new online services and ‘apps’; nor will the NHS maximise its opportunity to be a world centre in medical science and research.” [NHS England, September 1 2015] 

So who will this transformation benefit? Who and what are all its drivers? Change is expensive. It costs time and effort and needs investment.

Blackburn with Darwen's Healthwatch appears to have received £10K for care.data engagement, as stated in its annual report. Somerset's is less clear. We can only assume that Hampshire, expecting a go-live 'later in 2015', has also had costs. Were any of their patient-facing materials already printed for distribution, their 'allocated-under-austerity' budgets spent?

care.data is not a single destination but a long journey with a roadmap of plans for incremental new datasets and expansion of new users.

The programme should already know and be able to communicate the process behind informing the public of future changes to ensure future use will meet public expectations in advance of any change taking place. And we should know who is going to pay for that project lifetime process, and ongoing change management. I keep asking what that process will be and how it will be managed:

June 17 2015, NIB meeting at the King’s Fund Digital Conference on Health & Social Care:


September 2 2015, NIB Meeting at NHS Expo 15:


It goes unanswered time and time again, despite all the roadmaps and plans for change.

These projects are too costly to fail. They are too costly to justify transparency applied only after the event, when forced.

care.data plans are never late (just as long as there is no artificial deadline)

So back to my original question. If you cancel a train does it still show up in the delayed trains statistics? I suppose if the care.data programme claims there is no artificial deadline, it can never be late. If you stop setting measurable deadlines to deliver against, the programme can never be delayed. If there is no budget set, it can never be over it. The programme will only deliver benefits, if you never measure costs.

The programme can claim it is in the public interest for as long as we are prepared to pay from an open public purse and wait for it to get on track. Wait until data are ready to be extracted, which the notice said:

“…is thought to remain a long way off.”

All I can say to that is: I sure hope so. Right now, it’s not fit for purpose. Decisions on content and process must come first. But we also deserve to know what to expect of the long journey ahead.

On time, under budget, and in the public interest?

As long as NHS England is the body both applying and measuring the criteria, it fulfils them all.

*******

[1] Somerset CCG announces delay to care.data plans https://www.somersetlmc.co.uk/caredatapaused

[2] NHS England reply to Somerset announcement reported in Government Computing http://healthcare.governmentcomputing.com/news/ccg-caredata-pilot-work-continues-4668290

[3] Healthwatch bulletin: care.data meetings cancelled http://us7.campaign-archive1.com/?u=16b067dc44422096602892350&id=5dbdfc924c

[4] Building public trust: after the NIB public engagement in Bristol https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[5] GPES IAG http://www.hscic.gov.uk/media/12911/GPES-IAG-Minutes-12-September-2013/pdf/GPES_IAG_Minutes_12.09.13.pdf

[6] The Register – Right, opt out everybody! hated care.data programme paused again http://www.theregister.co.uk/2015/09/08/hated_caredata_paused_again_opt_out/

[7] Pulse Today care.data MPA rating http://www.pulsetoday.co.uk/your-practice/practice-topics/it/caredata-looks-unachievable-says-whitehall-watchdog/20010381.article#.VfMXYlbtiyM

[8] Major Projects Authority https://engage.cabinetoffice.gov.uk/major-projects-authority/

[9] The PAC 2013 http://www.parliament.uk/business/committees/committees-a-z/commons-select/public-accounts-committee/news/npfit-report/

[10] Clinical Practice Research Datalink (CPRD)

***

image source: http://glaconservatives.co.uk/news/london-commuters-owed-56million-in-unclaimed-refunds-by-rail-operators/

 

Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

‘Communicating the benefits’ was care.data’s response to the failed communications of spring 2014. It has failed to deliver public trust. Here’s why:

To focus on the benefits is a shortcut for avoiding the real issues

Talking about benefits is about telling people what the organisation wants to tell them. This fails to address what the public and professionals want to know. The result is not communication, but a PR exercise.

Talking about benefits in response to the failed communications of spring 2014, while failing to address criticism since, ignores the concerns the public and professionals raised at macro and micro level. It appears disingenuous about real engagement, despite the claim that ‘we’re listening’, and seems uncaring.

Talking only about the benefits offers nothing to demonstrably outweigh the potential harms to individual and public health: loss of trust in the confidential GP relationship, data inaccuracy, or data loss. Ignoring these seems unrealistic.

Talking about short-term benefits and not long-term solutions [to the broken opt out, long-term security, and long-term changes in the scope of uses and users and how those will be communicated] does not demonstrate competency or reliability.

Talking only about the benefits of commissioning, and of research on the merged dataset CES, fails to mention all the secondary uses to which all HSCIC patient-level health data are put [those reflected in the Type 2 opt out], including commercial re-use and the National Back Office: “2073 releases made from the National Back Office between April 2013 and December 2013. This includes 313 releases to police forces, 1531 to the Home Office and 229 to the National Crime Agency.” [HSCIC, July 2, 2014]

This use of hospital records and other secondary data by the back office, without openly telling the public, does not feel ethical or transparent.

Another example is past patient communications that expressly said ‘we do not collect name’, apparently intended to assure patients of anonymity, without saying that name is already stored at the HSCIC on the Personal Demographics Service, or that a name is not needed for data to be identifiable.

We hear a lot about transparency. But is transparent the same as fully accurate, complete and honest? Honest about the intended outcomes of the programme. Honest about all the uses to which health data are put. Honest about potential future scope changes, including those already planned.

Being completely truthful in communications is fundamental to future-proofing trust in the programme.

NHS England’s care.data programme, through its focus on ‘the benefits’, lacks balance and appears disingenuous, uninterested and unrealistic, and lacking in reliability, competency and honesty. Through these actions it does not demonstrate that the organisation is trustworthy. This could be changed.

care.data fundamentally got it wrong with the intention not to communicate the programme at all. It got it wrong in the tool and tone of communications in the patient leaflet. There is a chance to get it right now, if the organisation would only stop focusing on communicating the benefits.

I’m going to step through, with a couple of examples, why some communications to date on care.data and the use of NHS data are not conducive to trust.

Communication designed to ‘future-proof’ an ongoing relationship and trust must be by design, not afterthought.

Communications need to start addressing the changes that are happening, how they make people feel, and the changes that create concern – in the public and in professionals – not the goals the organisation has.

Sound familiar? Communications to date have been flawed in the same way that the concept of ‘building trust’ has been flawed. It has aimed to achieve the wrong thing and with the wrong audience.

Communications in care.data need to stop focusing on what the organisation wants from the public and professionals – the benefits it sees in getting data – and instead address, firstly at a macro level, why the change is necessary and why the organisation should be trusted to bring it about.

When explaining benefits, there are clearly positives to be had from using primary and secondary data in the public interest. But what benefits will care.data deliver that are not already on offer today?

If commissioning is done today with less identifiable data, why can there be no alternative to care.data’s level of identifiable data extraction? If the CPRD already offers research across both primary and secondary care, why will care.data offer better research possibilities? And secondly, at a micro level, communications must address the questions individuals asked up and down the country in 2014.

What’s missing, and what could be done?

  1. aim to meet genuine ongoing communication needs, not just legal data protection fair-processing tick-boxes
  2. change the organisational attitude, to encourage people to ask what they each want to know at macro and micro level – why the programme at all, and what’s in it for me? What’s new, and a benefit that differs from the status quo? This is only possible if you answer what is asked.
  3. deliver robust explanations of why the macro and micro benefits demonstrably outweigh the risk of potential individual harms
  4. demonstrate reliability, honesty and competency – that you are trustworthy
  5. agree how scope changes will trigger communication, to ‘future-proof’ an ongoing relationship and trust by design.

As the NIB work stream on Public Trust says, “This is not merely a technical exercise to counter negative media attention; substantial change and long-term work is needed to deliver the benefits of data use.”

If they’re serious about that long term work, then why continue to roll out pathfinder communications based on a model that doesn’t work, with an opt out that doesn’t work? Communications isn’t an afterthought to public trust. It’s key.

If you’re interested in details and my proposals for success in communications I’ve outlined in depth below:

  • Why Communicate Changes at all?
  • What is change in care.data about?
  • Is NHS England being honest about why this is hard?
  • Communicate the Benefits is not working
  • A mock case study in why ‘communicate the benefits’ will fail
  • Long term trust needs a long term communications solution
  • How a new model for NHS care.data Communication could deliver


Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

Let’s assume the question of public trust is as important to those behind data sharing plans in the NHS [1] as they say it is. That the success of the care.data programme today and, as a result, the very future of the NHS depend upon it.

“Without the care.data programme, the health service will not have a future, said Tim Kelsey, national director for patients and information, NHS England.” [12]

And let’s assume we accept that public trust is not about the public, but about the organisation being trustworthy.[2]

The next step is to ask, how trustworthy is the programme and organisation behind care.data? And where and how do they start to build?

The table discussion on “Building Public Trust in Data Sharing” [3] considered “what is the current situation?” and “why?”

What’s the current situation? On trust, public opinion is measurable. The Royal Statistical Society’s data trust deficit shows that the starting points are low for the state and government, but higher for GPs. It is therefore important that the medical profession themselves trust the programme in principle and in practice. They are, after all, the care.data point of contact for patients.

The current status of the rollout, according to news reports, is that pathfinder practices are preparing to roll out communications [4] in the next few weeks. Engagement is reportedly being undertaken ‘over the summer months’.

Understanding both public trust and the current starting point matters as the rollout is moving forwards and as leading charity and research organisation experts said: “Above all, patients, public and healthcare professionals must understand and trust the system. Building that trust is fundamental. We believe information from patient records has huge potential to save and improve lives but privacy concerns must be taken seriously. The stakes are too high to risk any further mistakes.” [The Guardian Letters, July 27, 2015]

Here are three steps I feel could be addressed in the short term, to start to demonstrate why the public and professionals should trust both organisation and process.

What is missing?

1. Opt out: The type 2 opt out does not work. [5]  

2a. Professional voices called for answers and change: As mentioned in my previous summary, various bodies called for change, including the BMA, whose policy [6] remains that care.data should be on a patient opt-in basis.

2b. Public voices called for answers and change: care.data’s own listening event feedback [7] concluded there was much more to do than ‘communicate the benefits’. Much is missing: questions on the confusion of SCR with care.data, legislation and concern over controlling its future change, GPs’ concerns about their ethical stance, the Data Guardian’s statutory footing, correction of mistakes, future funding and more.
How are these open questions being addressed, if at all?

3. A single clear point of ownership for data sharing and public trust communications: is it now the NIB, the NHS England Patients and Information Directorate, or the DH that owns care.data? It’s hard to ask questions if you don’t know where to go, and the boards seem to have stopped all public communications. Why? The public needs clarity of organisational oversight.

What’s the Solution? 

1. Opt out: The type 2 opt out does not work. See the post graphic: the public wanted more clarity over opt out in 2014, so this needs explaining clearly. >> Solution: follows below, from a detailed conversation with Mr. Kelsey.

2. Answers to professional opinions: The Caldicott panel raised 27 questions in areas of concern in their report. [8] There has not yet been any response made available in the public domain by NHS England to address them. Ditto the APPG report, the BMA LMC vote, and others. >> Solution: publish the responses to these concerns and demonstrate what is being done to address them.

2b. Fill in the lack of transparency: There is no visibility of any care.data programme board meeting minutes or materials from 2015. In eight months, nothing has been published. The board’s 2014 proposal for transparency appears to have come to nothing. Why? The minutes from June-October 2014 are missing entirely, and the October-December 2014 materials that were published were heavily redacted. There is a care.data advisory board, which seems to have had little public visibility recently either. >> Solution: the care.data programme business case must be detailed and open to debate in the public domain by professionals and public. Scrutiny of its current costs and time requirements, and of its ongoing financial implications, should be welcomed at national, regional (CCG) and local (GP) level. Proactive publication creates demonstrable reasons why both the organisation and the plans are trustworthy. Refusing this without clear justification seems counterproductive, which is why I have challenged it in the public interest. [10]

3. Address public and professional confusion over ownership: Since data sharing and public trust are two key components of the care.data programme, it seems to come under the NIB umbrella, but there is also a care.data programme board [9] of its own, with a care.data Senior Responsible Owner and Programme Director. >> Solution: an overview of where all the different nationally driven NHS initiatives fit together, and who owns each, would be helpful.

[Anyone got an interactive Gantt chart for all national level driven NHS initiatives?]

This would also help public and professionals see how and why different initiatives have co-dependencies, and could be a tool to reduce the ‘them and us’ mentality. It would also be useful for modelling what-if scenarios and reality-checking 5YFV roadmaps: if care.data pushes back six months, for example, what else is delayed? (A sketch of the idea follows.)
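
To make that concrete, here’s a minimal what-if sketch in Python. Every initiative name and dependency in it is hypothetical – my illustration, not actual NHS plans:

```python
# A what-if sketch with HYPOTHETICAL initiatives and dependencies,
# not real NHS plans: pick a programme that slips, and list what
# slips downstream of it.

# initiative -> initiatives that depend on it (all names illustrative)
dependents = {
    "care.data pathfinders": ["care.data national rollout"],
    "care.data national rollout": ["5YFV analytics milestone"],
    "5YFV analytics milestone": [],
}

def delayed_by(initiative, graph):
    """Return every initiative delayed if `initiative` slips."""
    delayed, stack = set(), [initiative]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in delayed:
                delayed.add(child)
                stack.append(child)
    return delayed

print(delayed_by("care.data pathfinders", dependents))
# {'care.data national rollout', '5YFV analytics milestone'}
```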

If the public can understand how things fit together it is more likely to invite questions, and an engaged public is more likely to be a supportive public. Criticism can be quashed if it’s incorrect. If it is justified criticism, then act on it.

Yes, these are hard decisions. Yes, to delay again would be awkward. If it were the right decision, would it be worse to ignore it and carry on regardless? Yes.

The most important of the three steps in detail: a conversation with Mr. Kelsey on Type 2 opt out. What’s the Solution?

We’re told “it’s complicated.” I’d say “it’s simple.” Here’s why.

At the table of about fifteen participants at the Bristol NIB event, Mr. Kelsey spoke very candidly and in detail about consent and the opt out.

On the differences between consent in direct care and other uses he first explained the assumption in direct care. Doctors and nurses are allowed to assume that you are happy to have your data shared, without asking you specifically. But he said, “beyond that boundary, for any other purpose, that is not a medical purpose in law, they have to ask you first.”

He went on to explain that what has changed the whole dynamic of the conversation is that the current Secretary of State decided that when your data is shared for purposes other than your direct care, you not only have the right to be asked, but that if you say you don’t want it shared, that decision has to be respected by your clinician.

He said: “So one of the reasons we’re in this rather complex situation now, is because if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Therefore, I asked him where the public stands with that now, because at the moment there are ca. 700,000 people who we know said no in spring 2014.

Simply: they opted out of their data being used for secondary purposes, and HSCIC continues to share their data.

“Is anything more fundamentally damaging to trust, than feeling lied to?”

Mr. Kelsey told the table there is a future solution, but asked us not to tweet when. I’m not sure why; it was mid-conversation and I didn’t want to interrupt:

“we haven’t yet been able to respect that preference, because technically the Information Centre doesn’t have the digital capability to actually respect it.”

He went on to say that they have hundreds of different databases, and at the moment it takes 24 hours for a single person’s opt out to be respected across all of them. A person has to manually enter a field on each database to say someone has opted out. He asked that the hoped-for timing not be tweeted, but explained that all the historic objections already registered will be respected at a future date.
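
For what it’s worth, the alternative implied here is not technically exotic. A minimal sketch of my own – illustrative only, not HSCIC’s actual architecture, field names or NHS numbers – would hold one central register of objections and consult it at the point of every release, rather than keying a flag into hundreds of databases by hand:

```python
# A minimal sketch of respecting the type 2 opt out at the point of release.
# Illustrative design only, NOT HSCIC's actual systems: one central register
# of objections, checked before any record leaves for secondary uses.

opt_outs = {"9990001111", "9990002222"}  # hypothetical NHS numbers with a registered objection

def release_for_secondary_use(records):
    """Drop records for anyone who has registered a type 2 objection."""
    return [r for r in records if r["nhs_number"] not in opt_outs]

extract = [
    {"nhs_number": "9990001111", "event": "..."},
    {"nhs_number": "9990003333", "event": "..."},
]
print(release_for_secondary_use(extract))  # only 9990003333 is released
```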

One of the other attendees expressed surprise that GP practices hadn’t been informed of that, having gathered consent choices in 2014, and suggested the dissent code could be extracted now.

The table discussion then took a different turn with other attendee questions, so I’m going to ask here what I would have asked next in response to his statement, “if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Where is the logic to proceed with pathfinder communications?

What was said has not been done and you therefore appear untrustworthy.

If there is to be a future solution, it will need communicating (again).

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

There needs to be demonstrable action that what the org said it would do, the org did. Respecting patient choice is not an optional extra. It is central in all current communications. It must therefore be genuine.

Knowing that what was promised was not respected might mean millions of people choose to opt out who would not do so if the process worked when communicated.

Until then, public communications in Blackburn and Darwen, Somerset, Hampshire and Leeds surely don’t make sense.

Either the pathfinders will test the same communications that are to be rolled out nationally, or they will not. Either those communications will explain the secondary uses opt out, or they will not. Either they will explain the opt out as it is [type 2, not working], or as they hope it will be in future [working]. Not all of these can be true.

People who opt out on the basis of a process broken by a technical flaw are unlikely ever to opt back in again. If it works to start with, they might choose to stay in.

Or will the communications roll out in the pathfinders with a forward-looking promise, repeating what was promised but has not yet been done? We will respect your choice (and this time we really mean it)? Would public trust survive that level of uncertainty? I don’t think so.

There needs to be demonstrable action in future as well: that what the org said it would do, the org did. So the data-use audit report, and how any future changes will be communicated, both seem basic principles to clarify for the current rollout too.

So what’s missing and what’s the solution on opt out?

We’re told “it’s complicated.” I say “it’s simple.” The promised opt out must work before anything else moves forward. If I’m wrong, then let’s get the communications materials out for broad review, to see how they accommodate this and the future re-communication of a second process.

There must be a budgeted and planned future change communication process.

So how trustworthy is the programme and organisation behind care.data?

Public opinion on trust levels is measurable. The Royal Statistical Society’s data trust deficit shows that the starting points are clear. The current position must address the opt out issue before anything else. Don’t say one thing and do another.

To score more highly on the ‘trustworthy scale’ there must be demonstrable action, not simply more communications.

Behaviours need to change and be modelled in practice: focus on people, not on tools and tech solutions that make patients feel less important to the organisations than their desire to ‘enable data sharing’.

Actions need to demonstrate they are ethical and robust as a 21st-century solution.

Policies, practical steps and behaviours all play vital roles in demonstrating that the organisations and people behind care.data are trustworthy.

These three suggestions are short term – by that I mean six months. Beyond that, further steps need to be taken to be demonstrably trustworthy in the longer term and on an ongoing basis.

Right now, do I trust that the physical security of HSCIC is robust? Yes.

Do I trust that the programme’s policies would not pass my data in future to third-party commercial pharma companies? No.
Do I believe that to enable commissioning, my fully identifiable confidential health records should be stored indefinitely with a third party? No.
Do I trust that the programme would not pass my data to non-health organisations, such as the police or Home Office? No.
Do I trust the programme to tell me if the purposes change from those outlined now? No.

I am open to being convinced.

*****

What is missing from communications to date, what looks unlikely to be included in the current round, and why that matters, I address in my next post, Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data. Why a future change-management model of consent needs to be approached now, and not after the pilot, I wrap up in [5]: Future solutions.


Building Public Trust [2]: a detailed approach to understanding Public Trust in data sharing

Enabling public trust in data sharing is not about ‘communicating benefits’. For those interested in the nitty gritty, some practical suggestions for progress in building public trust in data sharing follow on from my summary after the NIB Bristol event of 24/7/15.

Trust is an important if invisible currency used in the two-way transactions between an organisation and people.

So far, there have been many interactions and listening events, but much of what professionals and the public called for remains undone, and public trust in the programme remains unchanged since 2014.

If you accept that it is not public trust that needs to be built, but the tangible trustworthiness of the organisation, then you should also ask: what does the organisation need to do to make that demonstrable change?

What’s today’s position on Public Trust in data storage and use?

Trust in the data sharing process is layered and depends on a number of factors – mostly [based on polls and public event feedback from 2014] “who will access my data and what will they use it for?”

I’m going to look more closely below at planned purposes: research and commissioning.

It’s also important to remember that trust is not homogeneous. Trust is nuanced even within the single relationship between one individual and another. Trust, like consent, is stratified – you may trust the Post Office to deliver a letter or postcard, but sign up to recorded delivery for something valuable.

So for example, when it comes to secondary uses data sharing, I might trust HSCIC with storing and using my health records for anonymous statistics – for analysis of immunisation and illness patterns, say. But as long as they continue to share with the Home Office, police or other loosely defined third parties [5], do I want them to have fully identifiable data at all?

Those bodies have their own public trust issues at an all time low.

Mixing the legitimate users of health data with these Back Office punitive uses will invite opt outs from some people who would otherwise not. Some of the very groups who most need health and social care understanding, research and care will be the very groups who opt out if there is any possibility of police and Home Office access by the back door. Telling traveller communities what benefits care.data will bring them is wasted effort when they see NHS health data as a police-accessible register. I know. I’ve talked to some about it.

That position on data storage and use should be reconsidered if NHS England is serious that this is about health, and for the benefit of individuals’ and communities’ well-being.

What has HSCIC changed to demonstrate that it is trustworthy?

A new physical secure setting is being built that will enable researchers to view research data but not take raw data away.

That is something they can control, and have changed, and it demonstrates they take the public seriously and we reciprocate.

That is great – demonstrable change by the organisation, inviting change in the public.

That’s practical, so what can be done on policy by NHS England/DH?

What else should be done to demonstrate policy is trustworthy?

Act on what the public and professionals asked for in 2014. [8]

Right now, public communications make it feel as though the only kind of relationship the leadership wants is a one-night stand.

It’s all about what the programme wants. Minimise the objections, get the data, and sneak out. Even when its leaders talk about some sort of ongoing consent model, the focus is still on ‘how to enable data sharing’.

This focus is the wrong one. If you want to encourage people to share, they need to know why: what’s in it for them, and why do you want it? What the data collection is for still needs explaining – and specifically each time the scope changes, if you are doing it fairly.

Remember. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is. 

What is the policy for the future of primary care research?

The CPRD already enables primary care GP data to be linked with secondary data for research. In fact it already links more items from GP-held data than current care.data plans propose to extract. So what benefit will care.data offer research that is not already available today?

Simply having ever more data, stored in more places will not make us wiser. Before it’s collected repeatedly, it is right to question why.

What do we have collected already? How is it used? Where are the gaps in what we want to achieve through the knowledge we could gain? It is NOT simply about filling gaps in the data we could gather. Understand the purposes, and what will be gained, to see if it’s worth the effort. Prioritise. ‘Collect it all’ is not a solution.

I had thought the types of data to be collected in care.data, and how it differs from direct care, were clear. But the Bristol NIB meeting demonstrated a wide range of understanding among NHS and CCG staff, Local Authority staff, IT staff, IG professionals, data providers and other third parties. Data for secondary purposes are not to be conflated with direct care.

But that’s not what care.data sharing is about. So where to start with public trust, asked the NIB Bristol #health2020 meeting?

Do you ignore the starting point or tailor your approach to it?

“The NHS is at a crossroads and needs to change and improve as it moves forward. That was the message from NHS England’s Chief Executive Simon Stevens as a Five Year Forward View for the NHS was launched.”  [1] [NHS England, Oct 2014]

As the public is told over and over again that change is vital to the health of a sustainable NHS, a parallel public debate rages, whether the policy-making organisations behind the NHS – the commissioning body NHS England, the Department of Health and Cabinet Office – are serious about the survival of universal health and care provision, and about supporting its clinicians.

It is against this backdrop, and under the premise that obtaining patient data for centralised secondary uses is do or die for the NHS, that the NIB #health2020 has set out [2] work stream 4: “Build and sustain public trust: Deliver roadmap to consent based information sharing and assurance of safeguards”

“Without the care.data programme, the health service will not have a future, said Tim Kelsey, national director for patients and information, NHS England.” [10]

 

Polls say [A] nearly all institutions suffer from a ‘trust in data deficit’: trust in them to use data appropriately is lower than trust in the organisation generally.

Public trust in what the Prime Minister says on health is low.

Trust in the Secretary of State for Health is possibly at an all time low, with: “a bitter divide, a growing rift between the Secretary of State for Health and the medical profession.” [New Statesman, July 2015]

This matters. care.data needs the support of professionals and public.

ADRN research showed multiple contributing factors: “Participants were also worried about personal data being leaked, lost, shared or sold by government departments to third parties, particularly commercial companies. Low trust in government more generally seemed to be driving these views.” [Dialogue on data]

It was interesting to see all the same issues reflected by the public in care.data listening events, asked from the opposite perspective by data users.

But it was frustrating to sit at the Bristol NIB #health2020 event and discuss questions around the same data sharing issues already discussed at care.data events over the last 18 months.

Nothing substantial has changed other than HSCIC’s physical security for data storage.

It is frustrating knowing that these change and communications issues will keep coming back again and again if not addressed.

Personally, I’m starting to lose trust that there is any real intention to change, if senior leadership is unwilling to address this properly and change themselves.

To see a change in Public Trust do what the public asked to see change: On Choice

At every care.data meeting I attended in 2014, people asked for choice.

They asked for boundaries between the purposes of data uses – real choice.

Willingness for their information to be used by academic researchers in the public interest does not equate to being willing for it to be used by a pharmaceutical company for their own market research and profit.

The public understand these separations well. To say they do not underestimates people and does not reflect public feeling. Anyone attending 2014 care.data events has heard many people discuss this. They want a granular consent model.

This would draw a red line between how data are used, and for which purposes.

Of the data-sharing organisations today, some are trusted and others are not. A granular consent approach would offer the choice of a red line over who gets access to data.

This choice of selective use would encourage fewer people to opt out of all purposes, leaving more data available for research, for example.
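
In data terms, granular consent is not complicated; the hard part is policy. A minimal sketch – the purpose categories are my illustration, not an agreed NHS taxonomy – with refusal as the default where no explicit choice is recorded:

```python
# A sketch of a granular consent record: one flag per purpose, checked
# before each use. Purpose categories are illustrative only; the point
# is the red line between purposes, with refusal as the default.

consent = {
    "direct_care": True,
    "academic_research": True,
    "commissioning": False,
    "commercial_reuse": False,
}

def may_use(record_consent, purpose):
    """Permit a use only where consent for that purpose is explicit."""
    return record_consent.get(purpose, False)

print(may_use(consent, "academic_research"))  # True
print(may_use(consent, "commercial_reuse"))   # False
```

The red line the public asked for is that explicit ‘False’: refusing one purpose without having to refuse them all.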

To see a change in Public Trust do what the public asked to see: Explain your purposes more robustly

Primarily, these data are to be used and kept indefinitely for commissioning purposes. Research wasn’t included among the purposes for care.data gathering in the planned specifications for well over a year [it was added after a research outcry].

Yet on commissioning specifically, the Caldicott recommendations [3] were very clear: commissioning purposes were insufficient and unlawful grounds for sharing fully identifiable data – a position NHS England’s Commissioning Board opposed:

“The NHS Commissioning Board suggested that the use of personal confidential data for commissioning purposes would be legitimate because it would form part of a ‘consent deal’ between the NHS and service users. The Review Panel does not support such a proposition. There is no evidence that the public is more likely to trust commissioners to handle personal confidential data than other groups of professionals who have learned how to work within the existing law.”

NHS England seems unwilling to change this position, despite professional bodies’ and the public’s opposition to sharing fully identifiable data for commissioning purposes [care.data listening events, 2014]. Is it any wonder they keep hitting the same barrier? More people don’t want this to happen than want it. Something’s gotta give.

See the GPES Customer Requirements specification of March 2013 (v2.1), which states on page 11: “…for commissioning purposes, it is important to understand activity undertaken (or not undertaken) in all care settings. The ‘delta load’ approach (by which only new events are uploaded) requires such data to be retained, to enable subsequent linkage.”
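
That ‘delta load’ point is worth unpacking, because it shows why indefinite retention follows from the design. A minimal sketch, with invented field names: if each upload sends only new events, the centre must keep every earlier delta in order to link new activity to old:

```python
# Why a "delta load" design implies retention (field names invented).
# Each upload sends only new events; linking them to earlier activity
# is only possible if every previous delta is kept, so the central
# store can only grow.

central_store = []  # everything ever uploaded, retained for linkage

def delta_upload(new_events):
    central_store.extend(new_events)  # earlier deltas are never discarded

delta_upload([{"patient": "A", "event": "referral", "month": "2015-01"}])
delta_upload([{"patient": "A", "event": "admission", "month": "2015-03"}])

# Linking the admission to the referral relies on the retained 2015-01 delta:
print([e for e in central_store if e["patient"] == "A"])
```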

The public has asked for red lines to differentiate between the purposes of data uses. NHS England and Department of Health policy seem unwilling to draw them. Why?

To see a change in Public Trust do what the public asked to see: Red lines on policy of commercial use – and its impact on opt out

The public has asked for red lines outlawing commercial exploitation of their data. Though it was said this had changed, in practice the change is hard to see. Department of Health policy seems unwilling to be clear, because the Care Act 2014 purposes remained loose. Why?

As second best, the public has asked for choice not to have their data used at all for secondary purposes and were offered an opt out.

The NHS England leaflet, the Department of Health and the Secretary of State publicly promised this, but it has not been implemented, and to date no public announcement has been made on when it will be respected. Why?

Trust does not exist in a vacuum. What you say and what you actually do matter. Policy and practice are co-dependent. Public trust depends on your organisations being trustworthy.

Creating public trust is not the task ahead for the government, the DH or the NIB. They must instead focus on improving their own competency, honesty and reliability, and through those they will demonstrate that they can be trusted.

That the secondary purposes opt out has not been respected does not demonstrate those qualities.

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

How will they do that?

Let the DH/NHS England and organisations in policy and practice address what they themselves will stop and start doing to bring about change in their own actions and behaviours.

Communications change request: Start by addressing the current position, NOT what the change will bring. You must move people along the curve, not dump them with a fait accompli and wonder why the reaction is so dire.

[image: the change curve]

Vital for this is the current opt out: what was promised, and what was done.

The secondary uses opt out must be implemented with urgency.

To see a change in public trust you need to take action. The programme needs to do what the public asked to see change: on granular consent, on commercial use, and on defined purposes.

And to gather suggested actions, start asking the right questions.

Not ‘how do we rebuild public trust?’ but “how can we demonstrate that we are trustworthy to the public?”

1. How can a [data-sharing] org demonstrate it is trustworthy?
2. Identify: why do people feel confident their trust is well placed?
3. Why do clinical professionals feel confident in any org?
4. What would harm the organisational-trust-chain in future?
5. How will the org-trust-chain be positively maintained in future?
6. What opportunities will be missed if that does not happen? (identify value)

Yes, the concepts are close, but how it is worded defines what is done.

These apparently small differences make all the difference in how people offer you ideas, and in how you harness them into real change and improvement.

Only then can you start understanding why “communicating the benefits” has not worked, and how that should shape future communications materials.

From this you will find it much easier to target actual tasks, and short- and long-term doable solutions, than an open discussion will deliver. ‘Doing’ should include thinking and attitudes as well as actions.

This will lead to communications messages that are concrete, not woolly. More about that in the next posts.

####

To follow, for those interested in the nitty gritty, some practical suggestions for progress in building public trust in data sharing:

Part one: A seven-step top-line summary – what I’d like to see change to address public trust in health data sharing for secondary purposes.

This is Part two: a detailed approach to understanding public trust – what practical and policy steps influence trust, on research and commissioning. Trust is not homogeneous; it is nuanced even within the single relationship between one individual and another, and it doesn’t exist in a vacuum.

Part three: Know where you’re starting from – what behaviours influence trust, and how we can begin to see them demonstrated. Mr. Kelsey discusses consent and opt out. Fixing what has already been communicated is vital before new communications are rolled out. To tailor the content of public communications for public trust and credibility, the programme must be clear about what is missing and what needs filling in. #Health2020 Bristol NIB meeting.

Part four: “Communicate the Benefits” won’t work – how communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and fixes.

Part five: Future solutions – why a new approach may work better for future trust: not attempting to rebuild trust where there is now none, but strengthening what is already trusted and fixing today’s flawed behaviours – honesty and reliability – that are vital to future-proofing public trust.

 

####

References:

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] Workstream 4: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/442829/Work_Stream_4.pdf

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

[11] Coin Street, care.data advisory meeting, September 6th 2014: https://storify.com/ruth_beattie/care-data-advisory-group-open-meeting-6th-septembe

[12] Public questions unanswered: https://jenpersson.com/pathfinder/