Tag Archives: NHS England

A vanquished ghost returns as details of distress required in NHS opt out

It seems the ugly ghosts of care.data past were alive and well at NHS Digital this Christmas.

Old-style thinking, the top-down, patriarchal ‘no one who uses a public service should be allowed to opt out of sharing their records. Nor can people rely on their record being anonymised,’ that you thought was vanquished, has returned with a vengeance.

The Secretary of State for Health, Jeremy Hunt, has reportedly done a U-turn on opt out of the transfer of our medical records to third parties without consent.

That backtracks on what he said in Parliament on January 25th, 2014 on opt out of anonymous data transfers, despite the right to object in the NHS constitution [1].

So what’s the solution? If the new opt out methods aren’t working, then back to the old ones and making Section 10 requests? But it seems the Information Centre isn’t keen on making that work either.

All the data the HSCIC holds is sensitive and as such, its release risks causing patients significant harm or distress [2], so it shouldn’t be difficult to tell them to cease and desist when it comes to data about you.

But how is NHS Digital responding to people who make the effort to write directly?

Someone who “got a very unhelpful reply” is being made to jump through hoops.

If anyone asks that their hospital data should not be used in any format and passed to third parties, that’s surely for them to decide.

Let’s take the case study of a woman who spoke to me during the whole care.data debacle, who had been let down by the records system after rape. Her subsequent NHS records about her mental health care were inaccurate, and had led to her being denied private health insurance at a new job.

Would she have to detail why selling her medical records would cause her distress? What level of detail is fair, and who decides? The whole point is, you want to keep the information confidential.

Should you have to state what you fear? “I fear future distress from what you might do to me”? Once you lose control of data, it’s gone. Based on past planning secrecy and ideas for the future, like mashing up health data with retail loyalty cards as suggested at Strata in November 2013 [from 16:00] [2], no wonder people are sceptical.

Given the long list of commercial companies, charities, think tanks and others to whom our sensitive data has been passed out, putting patients at risk, and given the Information Centre’s past record, HSCIC might be grateful they have only opt out requests to deal with, and not millions of medical ethics court summonses. So far.

HSCIC / NHS Digital has extracted our identifiable records and has given them away, including for commercial product use, and continues to give them away, without informing us. We’ve accepted Ministers’ statements that a solution would be found. Two years on, patience wears thin.

“Without that external trust, we risk losing our public mandate and then cannot offer the vital insights that quality healthcare requires.”

— Sir Nick Partridge on publication of the audit report of 10% of 3,059 releases by the HSCIC between 2005-13

— Andy Williams said, “We want people to be certain their choices will be followed.”

Jeremy Hunt said everyone should be able to opt out of having their anonymised data used. David Cameron did too when the plan was announced in 2012.

In 2014 the public was told there should be no more surprises. This latest response is not only a surprise but enormously disrespectful.

When you’re trying to rebuild trust, assuming we accept that is the aim, you can’t say one thing and do another. Perhaps the Department of Health doesn’t like the public’s answer on what the public wants from opt out, but that doesn’t make the DH view right.

Perhaps NHS Digital doesn’t want to deal with lots of individual opt out requests, but that doesn’t make their refusal right.

Kingsley Manning recognised in July 2014 that the Information Centre “had made big mistakes over the last 10 years.” And there was “a once-in-a-generation chance to get it right.”

I didn’t think I’d have to move into the next one before they fix it.

The recent round of 2016 public feedback was the same as for care.data 1.0. Respect nuanced opt outs and you will have all the identifiable public interest research data you want. Solutions must be better for other uses, opt out requests must be respected without distressing patients further in the process, and anonymous must mean anonymous.

Pseudonymised data requests that go through the DARS process, so that a Data Sharing Framework Contract and Data Sharing Agreement are in place, are considered compliant with the ICO code of practice – fine, but they are not anonymous. If DARS is still giving my family’s data to Experian, Harvey Walsh, and co, despite opt out, I’ll be furious.
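To illustrate why pseudonymised is not anonymous, here is a minimal, hypothetical sketch (the salt, the hashing scheme and the NHS number are all invented for illustration; this is not how DARS releases actually work): a stable pseudonym still lets any recipient holding two releases re-link the same person across them.

```python
import hashlib

def pseudonymise(nhs_number: str, salt: str = "release-2016") -> str:
    """Replace a direct identifier with a stable pseudonym (toy scheme)."""
    return hashlib.sha256((salt + nhs_number).encode()).hexdigest()[:12]

# Two separate "releases" of records about the same (fictional) patient.
hospital_release = {pseudonymise("9434765919"): {"admission": "2015-03-01"}}
prescribing_release = {pseudonymise("9434765919"): {"drug": "citalopram"}}

# A recipient holding both datasets can re-link the individual's history,
# because the pseudonym is stable across releases.
shared = hospital_release.keys() & prescribing_release.keys()
print(len(shared))  # → 1: the same pseudonym appears in both releases
```

That re-linkage, combined with any auxiliary data a recipient already holds, is exactly why pseudonymised releases still fall within data protection law while truly anonymous ones do not.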

The [Caldicott 2] Review Panel found “that commissioners do not need dispensation from confidentiality, human rights &amp; data protection law.”

Neither do our politicians, their policies or ALBs.


[1] https://www.england.nhs.uk/ourwork/tsd/ig/ig-fair-process/further-info-gps/

“A patient can object to their confidential personal information from being disclosed out of the GP Practice and/or from being shared onwards by the HSCIC for non-direct care purposes (secondary purposes).”

[2] Minimum Mandatory Measures http://www.nationalarchives.gov.uk/documents/information-management/cross-govt-actions.pdf p7

Breaking up is hard to do. Restructuring education in England.

This Valentine’s I was thinking about the restructuring of education in England and its wide ranging effects. It’s all about the break up.

The US EdTech market is very keen to break into the UK, and our front door is open.

We have adopted the model of Teach First, partnered with Teach America, while some worry we do not ask “What is education for?”

Now we hear the next chair of Ofsted is to be sought from the US, someone who is renowned as “the scourge of the unions.”

Should we wonder how long until the management of schools themselves is US-sourced?

The education system in England has been broken up in recent years into manageable parcels – for private organisations, schools within schools, charity arms of commercial companies, and multi-school chains to take over – in effect, recent governments have made reforms that have dismantled state education as I knew it.

Just as the future vision of education outlined in the 2005 Direct Democracy, co-authored by Michael Gove, said: “The first thing to do is to make existing state schools genuinely independent of the state.”

Free schools, touted as giving parents the ultimate in choice, are in effect another way to nod approval to the outsourcing of the state into private hands, and into big chains. Despite seeing the model fail spectacularly abroad, the government seems set on the same here.

Academies, the route that finagles private corporations into running public education, are the preferred model, says Mr Cameron. While there are no plans to force schools to become academies, the legislation currently in ping-pong under the theme of coasting schools enables just that. The Secretary of State can impose academisation, albeit only on schools Ofsted labels ‘failing’.

What ‘fails’ sometimes appears to be a school that staff and parents see as nothing less than good, but small. While small can be what parents want, small pupil-teacher ratios mean higher per-pupil costs. But the direction of growth is towards ‘big is better’.

“There are now 87 primary schools with more than 800 pupils, up from 77 in 2014 and 58 in 2013. The number of infants in classes above the limit of 30 pupils has increased again – with 100,800 pupils in these over-sized classes, an increase of 8% compared with 2014.” [BBC]

All this restructuring creates costs about which the Department wants to be less than transparent, and of which it has lost track.

If only we could see that these new structures raised standards. But “while some chains have clearly raised attainment, others achieve worse outcomes, creating huge disparities within the academy sector.”

If not delivering better results for children, then what is the goal?

A Valentine’s view of Public Service Delivery: the Big Break up

Breaking up the State system, once perhaps unthinkable, is made possible through the creation of ‘acceptable’ public-private partnerships (as opposed to outright privatisation per se). Schools become academies through a range of providers and different pathways, at least to start with; as some fail, the most successful become the market leaders in an oligopoly. Ultimately perhaps, this could become a near monopoly. Delivering ‘better’. Perhaps a new model, a new beginning, a new provider offering salvation from the flood of ‘failing’ schools, coming to the State’s rescue.

In order to achieve this entry to the market by outsiders, you must first remove conditions seen as restrictive, giving more ‘freedom’ to providers: to cut corners and make efficiency savings on things like food standards, required curriculum, and numbers of staff, or their pay.

And what if, as a result, staff leave, or are hard to recruit?

Convincing people that “tech” and “digital” will deliver cash savings and teach required skills through educational machine learning is key if staff costs are to be reduced, which in times of austerity, and if all else has been cut, is the only budget left to slash.

Providers of self-teaching systems are convincing in their arguments that tech is the solution.

Sadly I remember when a similar thing was tried on paper. My first year of GCSE maths, aged 13-14, was ‘taught’ at our secondary comp by working through booklets in a series that we self-selected from the workbench in the classroom. Then we picked up the master marking-copy once done. Many of the boys didn’t need long to work out that the first step was an unnecessary waste of time. The teacher had no role in the classroom. We were bored to bits. By the final week at the end of the year they sellotaped the teacher to his chair.

I kid you not.

Teachers are so much more than knowledge transfer tools, and yet some today seem to consider them replaceable by technology.

The US is ahead of us in this model, which has grown hand-in-hand with commercialism in schools. Many parents are unhappy.

So is the DfE setting us up for future heartbreak if it wants us to go down the US route of more MOOCs, more tech, and less funding and fewer staff? Where’s the cost benefit risk analysis and transparency?

We risk losing the best of what is human from the classroom if we remove the values they model and inspire. Unions, teachers and educationalists are, I am sure, more than aware of all these cumulative changes. However, the wider public seems little engaged.

For anyone ‘in education’ these changes will all be self-evident, and their balance of risks and benefits a matter of experience and political persuasion. As a parent, I’ve only come to understand these changes through researching how our pupils’ personal and school data have been commercialised and given away from the National Pupil Database without our consent since legislation changed in 2013; and how the Higher Education student and staff data have been sold.

Will more legislative change be needed to keep our private data accessible in public services operating in an increasingly privately-run delivery model? And who will oversee that?

The Education Market is sometimes referred to as ‘The Wild West’. Is it getting a sheriff?

The news that the next chair of Ofsted is to be sought from the US did set alarm bells ringing for some in the press, who fear US standards and US-led organisations in British schools.

“The scourge of the unions” means not supportive of staff-based power, and in health our junior doctors have clocked exactly what breaking their ‘union’ bargaining power is all about. So who is driving all this change in education today?

Some education providers might be seen as individuals profiting from the State break up. Some were accused of ‘questionable practices‘. Others said oversight has been lacking. Margaret Hodge in 2014 was reported to have said: “It is just wrong to hand money to a company in which you have a financial interest if you are a trustee.”

I wonder if she has an opinion on a lead non-executive board member at the Department for Education also being the director of one of the biggest school chains? Or the ex-Minister now employed by the same chain? Or that his campaign was funded by the same director? Why this register of interests is not transparent is a wonder.

It could appear to an outsider that the private-public revolving door is well oiled with sweetheart deals.

Are the reforms begun by Mr Gove simply being executed to their end goal, whatever that may be, through Nicky Morgan, or is she driving her own new policies?

If Ofsted were to become US-experience led, will the Wild West be tamed, or will US providers be invited to join the action, reshaping a new frontier? What is the end game?

Breaking up is not hard to do, but in whose best interest is it?

We need only look to health to see the similar pattern.

The structures are freed up, and boundaries opened up (if you meet the other criteria) in the name of ‘choice’. The organisational barriers to break up are removed in the name of ‘direct accountability’. And enabling plans through more ‘business intelligence’ gathered from data sharing, well, those plans abound.

Done well, new efficient systems and structures might bring public benefits, and the right technology can certainly bring great things. But have we first understood what made the old system less efficient, if indeed it was, and where are the baselines to look back on?

Where is the transparency of the end goal and what’s the price the Department is prepared to pay in order to reach it?

Is reform in education transparent in its ideology, and how is its success being measured, if not by improved attainment?

The results of change can also be damaging. In health we see failing systems and staff shortages, and their knock-on effects on patient care. In schools, these failures damage children’s start in life; it’s not just a ‘system’.

Can we assess if and how these reforms are changing the right things for the right reasons? Where is the transparency of what problems we are trying to solve, to assess what solutions work?

How is change impact for good and bad being measured, with what values embedded, with what oversight, and with whose best interests at its heart?

2005’s Direct Democracy could be read as a blueprint for co-author Mr Gove’s education reforms less than a decade later.

Debate over the restructuring of education and its marketisation seems to have bypassed most of us in the public, in a way health has not.

Underperformance as measured by new and often hard to discern criteria, means takeover at unprecedented pace.

And what does this mean for our most vulnerable children? Academies are not required to offer places to SEN children. The 2005 plans co-authored by Mr Gove also included “killing the government’s inclusion policy stone dead,” without an alternative.

Is this the direction of travel our teachers and society support?

What happens when breakups happen and relationship goals fail?

Who picks up the pieces? I fear the state is paying heavily for the break up deals, investing heavily in new relationships, and yet will pay again for failure. And so will our teaching staff, and children.

While Mr Hunt is taking all the heat right now for his part in writing Direct Democracy and its proposals to privatise health – set against the current health reforms and restructuring of junior doctors’ contracts – we should perhaps also look to his co-author Mr Gove, and ask to better understand the current impact of his recent education reforms, compare them with what he proposed in 2005, and prepare for the expected outcomes of change before it happens (see p74).

One outcome was that failure was to be encouraged in this new system, and Sweden held up as an exemplary model:

“Liberating state schools would also allow the all-important freedom to fail.”

As Anita Kettunen, principal of JB Akersberga in Sweden reportedly said when the free schools chain funded by a private equity firm failed:

“if you’re going to have a system where you have a market, you have to be ready for this.”

Breaking up can be hard to do. Failure hurts. Are we ready for this?
******


Abbreviated on Feb 18th.


The front door to our children’s personal data in schools

“EdTech UK will be a pro-active organisation building and accelerating a vibrant education and learning technology sector and leading new developments with our founding partners. It will also be a front door to government, educators, companies and investors from Britain and globally.”

Ian Fordham, CEO, EdTech UK

This front door is a gateway to access our children’s personal data and through it some companies are coming into our schools and homes and taking our data without asking.  And with that, our children lose control over their safeguarded digital identity. Forever.

Companies are all “committed to customer privacy” in those privacy policies, where they exist at all. However, typically this means they also share your information with ‘our affiliates, our licensors, our agents, our distributors and our suppliers’, and their circles are wide and often in perpetuity. Many simply don’t have a published policy.

Where do they store any data produced in the web session? Who may access it and use it for what purposes? Or how may they use the personal data associated with staff signing up with payment details?

According to research from London &amp; Partners, championed by Boris Johnson, Martha Lane-Fox and others in EdTech, education is one of the fastest growing tech sectors in Britain and is worth £45bn globally; a number set to reach a staggering £129bn by 2020. And perhaps the EdTech diagrams in US dollars show where the UK plans to draw companies from. If you build it, they will come.

The enthusiasm of some US EdTech-style entrepreneurs I have met, or listened to speak, is akin to religious fervour. Such is their drive for tech, however, that they appear to forget that education is all about the child. Individual children. Not cohorts, or workforces. And even when they do remember, it can be sincerely said, but lacks substance when you examine their policies in practice.

How is the DfE measuring the cost and benefit of tech and its applications in education?

Is anyone willing to say that not all tech is good tech, and not every application is a wise application? Because every child is unique, not every app is one-size-fits-all.

My 7-year-old got so caught up in the game and in the mastery of the app her class was prescribed for homework in the past, that she couldn’t master the maths, and it harmed her confidence. (Imagine something like this: clicking on the two correct sheep with numbers stamped on them that together add up to 12, for example, before they fall off and die.)

She has no problem with maths. Nor doing sums under pressure. She told me happily today she’d come joint second in a speed tables test. That particular app style simply doesn’t suit her.

I wonder if other children and parents find the same and if so, how would we know if these apps do more harm than good?

Nearly 300,000 young people in Britain have an anxiety disorder, according to the Royal College of Psychiatrists. Feeling watched all the time, on- and offline, is unlikely to make anxiety any better.

How can the public and parents know that EdTech which comes into the home with their children is behaviourally sound?

How can the public and parents know that EdTech which affects their children is ethically sound in both security and application?

Where is the measured realism in the providers’ and policy makers’ fervour, when both seek to marketise EdTech and our personal data for the good of the economy, and ‘in the public interest’?

Just because we can does not always mean we should. That data linkage is feasible, even if it brings public benefit, cannot point-blank mean it will always be in our best interest.

In whose best Interest is it anyway?

Right now, I’m not convinced that the digital policies at the heart of the Department for Education, the EdTech drivers, or many providers have our children’s best interests at heart at all. It’s all about the economy; when they talk at all about children using the technology, many talk only of ‘preparing the workforce’.

Are children and parents asked to consent at individual level to the terms and conditions of the company and told what data will be extracted from the school systems about their child? Or do schools simply sign up their children and parents en masse, seeing it as part of their homework management system?

How much ‘real’ personal data they use varies. Some use only pseudo-IDs assigned by the teacher. Others log, store and share everything pupils do, tied to their ID or real email address, store performance over time, and provide personalised reports of results.

Teachers and schools have a vital role to play in understanding data ethics and privacy to get this right, and speaking to many, it doesn’t seem something they feel well equipped to do. Parents aren’t always asked. But should schools not always have to ask before giving data to a commercial third party, except in an ‘emergency’ situation?

I love tech. My children love making lego robots move with code. Or driving drones with bananas. Or animation. Technology offers opportunity for application in and outside schools for children that are fascinating, and worthy, and of benefit.

If, however, parents are to protect children’s digital identity for the future, and to be able to hand control and integrity over their personal data back to them as adults, we must better accommodate children’s data privacy in this 2016 gold rush for EdTech.

Pupils and parents need to be assured their software is both educationally and ethically sound. Who defines those standards?

Who is in charge of Driving, Miss Morgan?

Microsoft’s vice-president of worldwide education recently opened the BETT exhibition, praised teachers for using technology to achieve amazing things in the classroom, and urged innovators to “join hands as a global community in driving this change”.

While there is a case to say no exposure to technology in today’s teaching would be neglectful, there is a stronger duty to ensure exposure to technology is positive and inclusive, not harmful.

Who regulates that?

We are on the edge of an explosion of tech and children’s personal data ‘sharing’ with third parties in education.

Where is its oversight?

The community of parents and children is at real risk of being completely left out of these decisions, and exploited.

The upcoming “safeguarding” policies online are a joke if the DfE tells us loudly to safeguard children’s identity out front, and quietly gives their personal data away for cash round the back.

The front door to our children’s data “for government, educators, companies and investors from Britain and globally” is wide open.

Behind the scenes in pupil data privacy, it’s a bit of a mess. And these policy makers and providers forgot to ask first if they could come in.

If we build it, would you come?

My question now is, if we could build something better on pupil data privacy AND better data use, what would it look like?

Could we build an assessment model of the collection, use and release of data in schools that could benefit pupils and parents, AND educational establishments and providers?

This could be a step towards future-proofing public trust, which will be vital for companies who want a foot in the door of EdTech. Design an ethical framework for digital decision making and a practical data model for use in education.

Educationally and ethically sound.

If providers, policy makers and schools at group Trust level could meet with Data Protection and Privacy civil society experts to shape a toolkit for assessing privacy impact, ensuring safeguarding and freedoms, enabling safe data flow, and helping design the cybersecurity that works for them and protects children’s privacy, lacking today, designed for tomorrow, would you come?

Which door will we choose?

*******

image credit: @ Ben Buschfeld Wikipedia

*added February 13th: Ofsted Chair sought from US

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children’s individual actions; one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately for this consultation there is no supporting code of practice explaining what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools.” “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: The “Opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24 is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?


Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our schools cover a wide range of children aged 2-19, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring would be applied to all types of school?

Provider software of this kind is reportedly being used in nearly 1,400 secondary schools in the UK. We hear little about its applied use.

The cases of cyber bullying or sexting in schools I hear of locally, or read in the press, are through smartphones. Unless the school snoops on individual devices I wonder therefore if the cost, implementation and impact is proportionate to the benefit?

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: What problem is it trying to solve and how will they measure if it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting and understand who they are? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs – has anyone asked our children’s feelings on this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we Creating a Solution that Solves or creates a Problem?

The private providers would have no incentive to say their reports don’t work, and schools, legally required to be risk averse, would be unlikely to say stop if there is no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention” and “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multipurpose and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school – what are we doing to fix the causes, rather than simply spying on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust of the system, of teaching staff, and on their ability to look for content to support their own sensitive concerns or development, which they may not feel safe to look for at home? What limitation will that put on their creativity?

These are all questions that should be asked in order to thoroughly understand the consultation, which requires wide public examination.

Since key logging is already common practice (according to provider websites) and will effectively in practice become statutory, where is the public discussion? If it’s not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?

What exactly might this part of the new guidance mean for pupils?

In part two, I include the other part of her speech, the GPS app, and ask whether, if we track every child in and outside school, we are moving closer to the digital dream, or nightmare, in the search to close the digital skills gap.

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (paras 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online.” [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that this ‘system’ to be ‘in place’ should be computer based. How will it be applied in practice? A number of the software providers for schools already offer services that include key logging, using “keyword detection libraries” that “provides a complete log of all online activity”.

(*It’s hard to read more about this, as many of NEN’s links are dead.)

care.data: delayed or not delayed? The train wreck that is always on time

If you cancel a train does it still show up in the delayed trains statistics?

care.data plans are not delayed (just don’t ask Healthwatch)

Somerset CCG’s announcement [1] of the delay in their care.data plans came as no surprise, except perhaps to NHS England who effectively denied it, reportedly saying work continues. [2] Both public statements may be true but it would have been good professional practice to publicly recognise that a top down delay affects others who are working hard on the ground to contribute to the effective rollout of the project. Causing confusion and delay is hard to work with. Change and technology projects run on timelines. Deadlines mean that different teams can each do their part and the whole gets done. Or not.

Healthwatch [3] has cancelled its planned public meetings. Given that one of the reasons stated in the care.data CCG selection process was support from local patient groups, including Healthwatch, this appears to be poor public relations. It would almost not matter, except that beyond the practicalities, the organisation and its leadership are trying to prove they are trustworthy. [4]


[image: HW_cancels]


Somerset’s statement is straightforward and says it applies to all pathfinders:

“Following a speech by Jeremy Hunt, the Secretary of State for Health this week (3-9-15), in which he outlined his vision for the future use of technology across NHS, NHS England has asked the four care.data pathfinder pilots areas in England (Leeds, Blackburn and Derwent, West Hampshire and Somerset) to temporarily pause their activities.” [Sept 4, Somerset statement]


[image: somerset]


From when I first read of the GPES IAG concerns [5], I have seen the care.data programme hurtle from one crisis to another. But this is now a train wreck. A very quiet train wreck. No one has cried out much. [6] And yet I think the project, professionals, and the public should be shouting from the top of the carriages that this programme needs help if it is ever to reach its destination.

care.data plans are not late against its business plan (there is none)

Where’s the business case? Why can’t it define deadlines that it can achieve?  In February 2015, I suggested the mentality that allows these unaccountable monster programmes to grow unchecked must die out.

I can’t even buy an Oyster card if I don’t know whether there is money in my pocket. How can a programme which has already spent multi-millions of pounds keep driving on without a budget? There is no transparency about what financial and non-financial benefits are expected to justify the cost. There is no accountable public measure of success checking that it stays on track.

While it may be more comfortable for the organisation to deny problems, I do not believe it serves the public interest to hide information. This is supported by the very reason for being of the MPA process and its ‘challenge to Whitehall secrecy’ [7], which rated the care.data rollout red [8] in last year’s audit. This requires scrutiny to find solutions.

care.data plans do not need to use lessons learned (do they?)

I hope at least there are lessons learned here in the pathfinder on what not to do before the communications rollout to 60m people.  In the words of Richard Feynman, “For successful technology, reality must take precedence over public relations.”

NHS England is using the public interest test to withhold information: “the particular public interest in preserving confidential communications between NHS England and its sponsoring department [the DH].”  I do not believe this serves the public interest if it is used to hide issues and critical external opinion. The argument made is that there is “stronger public interest in maintaining the exemption where it allows the effective development of policy and operational matters on an ongoing basis.”  The Public Accounts Committee in 2013 called for early transparency and intervention which prevents the ongoing waste of “billions of pounds of taxpayers’ money” in their report into the NPfIT. [9] It showed that a lack of transparency and oversight contributed to public harm, not benefit, in that project, under the watch of the Department of Health. The report said:

“Parliament needs to be kept informed not only of what additional costs are being incurred, but also of exactly what has been delivered so far for the billions of pounds spent on the National Programme. The benefits flowing from the National Programme to date are extremely disappointing. The Department estimates £3.7 billion of benefits to March 2012, just half of the costs incurred. This saga [NPfIT] is one of the worst and most expensive contracting fiascos in the history of the public sector.”

And the Public Accounts Committee made a recommendation in 2013:

“If the Department is to deliver a paperless NHS, it needs to draw on the lessons from the National Programme and develop a clear plan, including estimates of costs and benefits and a realistic timetable.” [PAC 2013][9]

Can we see any lessons drawn on today in care.data? Or any in Jeremy Hunt’s speech or his refusal to comment on costs for the paperless NHS plans reported by HSJ journal at NHSExpo15?

While history repeats itself and “estimates of costs and benefits and a realistic timetable” continue to be absent in the care.data programme, the only reason given by Somerset for delay is to fix the specific issue of opt out:

“The National Data Guardian for health and care, Dame Fiona Caldicott, will… provide advice on the wording for a new model of consents and opt-outs to be used by the care.data programme that is so vital for the future of the NHS. The work will be completed by January [2016]…”

Perhaps delay will buy NHS England some time to get itself on track and not only respect public choice on consent, but also deliver a data usage report to shore up trust, and tell us what benefits the programme will deliver that cannot already be delivered today (through existing means, like the CPRD for research [10]).

Perhaps.

care.data plans will only deliver benefits (if you don’t measure costs)

I’ve been told “the realisation of the benefits, which serve the public interest, is dependent on the care.data programme going ahead.” We should be able to see this programme’s costs AND benefits. It is we, collectively, after all who are paying for it, and for whom we are told the benefits are to be delivered. DH should release the business plan and all cost/benefit/savings plans. This is a reasonable thing to ask. What is there to hide?

The risk has been repeatedly documented in 2014-15 board meetings that “the project continues without an approved business case”.

The public and medical profession are directly affected by the lack of money, given by the Department of Health as the reason for reductions in service in health and social care. What are we missing out on, in order to deliver a benefit we do not already get elsewhere today?

On the pilot work continuing, the statement from NHS England reads: “The public interest is best served by a proper debate about the nature of a person’s right to opt out of data sharing and we will now have clarity on the wording for the next steps in the programme.”

I’d like to see that ‘proper debate’ at public events. The NIB leadership avoids answering hard questions, even when asked in advance as requested. Questions such as mine go unanswered:

“How does NHS England plan to future proof trust and deliver a process of communications for the planned future changes in scope, users or uses?”

We’re expected to jump on for the benefits, but not ask about the cost.

care.data plans have no future costs (just as long as they’re unknown)

care.data isn’t only an IT infrastructure enhancement and the world’s first population-wide database of 60m primary care records. It’s a massive change platform through which the NHS England Commissioning Board will use individual-level business intelligence to reshape the health service. A massive change programme that commodifies patient confidentiality as a kick-starter for economic growth. This is often packaged together with improvements for patients and requirements for patient safety, meaning explanations often conflate the use of records in direct care with secondary uses.

“Without interoperable digital data, high quality effective local services cannot be delivered; nor can we achieve a transformation in patient access to new online services and ‘apps’; nor will the NHS maximise its opportunity to be a world centre in medical science and research.” [NHS England, September 1 2015] 

So who will this transformation benefit? Who and what are all its drivers? Change is expensive. It costs time and effort and needs investment.

Blackburn and Darwen’s Healthwatch appears to have received £10K for care.data engagement, as stated in its annual report. Somerset’s position is less clear. We can only assume that Hampshire, expecting a go-live ‘later in 2015’, has also had costs. Were any of their patient-facing materials already printed for distribution, their ‘allocated-under-austerity’ budgets spent?

care.data is not a single destination but a long journey with a roadmap of plans for incremental new datasets and expansion of new users.

The programme should already know and be able to communicate the process behind informing the public of future changes to ensure future use will meet public expectations in advance of any change taking place. And we should know who is going to pay for that project lifetime process, and ongoing change management. I keep asking what that process will be and how it will be managed:

June 17 2015, NIB meeting at the King’s Fund Digital Conference on Health & Social Care:

[image: june17]

September 2 2015, NIB Meeting at NHS Expo 15:

[image: NIBQ_Sept]

It goes unanswered time and time again, despite all the roadmaps and plans for change.

These projects are too costly to fail. They are too costly to justify only having transparency applied after the event, when forced to do so.

care.data plans are never late (just as long as there is no artificial deadline)

So back to my original question. If you cancel a train does it still show up in the delayed trains statistics? I suppose if the care.data programme claims there is no artificial deadline, it can never be late. If you stop setting measurable deadlines to deliver against, the programme can never be delayed. If there is no budget set, it can never be over it. The programme will only deliver benefits, if you never measure costs.

The programme can claim it is in the public interest for as long as we are prepared to pay with an open public purse and wait for it to be on track.  Wait until data are ready to be extracted, which the notice said:

…” is thought to remain a long way off.” 

All I can say to that, is I sure hope so. Right now, it’s not fit for purpose. There must be decisions on content and process arrived at first. But we also deserve to know what we are expecting of the long journey ahead.

On time, under budget, and in the public interest?

As long as NHS England is the body both applying and measuring the criteria, it fulfils them all.

*******

[1] Somerset CCG announces delay to care.data plans https://www.somersetlmc.co.uk/caredatapaused

[2] NHS England reply to Somerset announcement reported in Government Computing http://healthcare.governmentcomputing.com/news/ccg-caredata-pilot-work-continues-4668290

[3] Healthwatch bulletin: care.data meetings cancelled http://us7.campaign-archive1.com/?u=16b067dc44422096602892350&id=5dbdfc924c

[4] Building public trust: after the NIB public engagement in Bristol https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[5] GPES IAG http://www.hscic.gov.uk/media/12911/GPES-IAG-Minutes-12-September-2013/pdf/GPES_IAG_Minutes_12.09.13.pdf

[6] The Register – Right, opt out everybody! hated care.data programme paused again http://www.theregister.co.uk/2015/09/08/hated_caredata_paused_again_opt_out/

[7] Pulse Today care.data MPA rating http://www.pulsetoday.co.uk/your-practice/practice-topics/it/caredata-looks-unachievable-says-whitehall-watchdog/20010381.article#.VfMXYlbtiyM

[8] Major Projects Authority https://engage.cabinetoffice.gov.uk/major-projects-authority/

[9] The PAC 2013 http://www.parliament.uk/business/committees/committees-a-z/commons-select/public-accounts-committee/news/npfit-report/

[10] Clinical Practice Research Datalink (CPRD)

***

image source: http://glaconservatives.co.uk/news/london-commuters-owed-56million-in-unclaimed-refunds-by-rail-operators/

 

Building Public Trust [5]: Future solutions for health data sharing in care.data

This wraps up my series of thoughts on ‘Building Public Trust’ since the NIB Bristol meeting on July 24th.

It has looked at how to stop chasing public trust and instead become organisations that are trustworthy [part 1]; what behaviours make an organisation trustworthy [part 2]; why fixing the Type 2 opt out is a vital first step [part 3]; and why being blinded by ‘the benefits’ is not the answer [part 4] – what is beneficial is to communicate balanced and fair explanations of the programme’s purposes, commissioning and research.

So I want to wrap up by suggesting how communications can be improved in content and delivery. Some ideas will challenge the current approach.

Here in part five: Future solutions, I suggest why aiming to “Build Public Trust” through a new communications approach may work better for the public than the past approach. I’ll propose that communications on care.data:

  • Review content: what would ethical, accurate content look like?
  • Strengthen relationships for delivery: don’t attempt to rebuild trust where there is now none, but strengthen the channels that the public already views as trustworthy
  • Rethink why you communicate and plan for when: all communications need to be delivered through a conversation, with real listening and action based upon it. Equal priority must be given to a communications plan for today and one for the future. A mechanism for future change communications must be set out now, before the pathfinders begin
  • Since writing this, the Leeds area CCGs have released their ‘data sharing’ comms leaflet. I have reviewed this in detail and give my opinions as a case study.

NIB workstream 4 underpins the NHS digital future and aims to build and sustain public trust, delivering plans for consent-based information sharing and assurance of safeguards. It focuses on four areas: governance and oversight, project risks, consent and genomics:

“The work will begin in 2015 and is expected to include deliberative groups to discuss complex issues and engagement events, as well as use of existing organisations and ways to listen. There will also be a need to listen to professional audiences.”  [NIB work stream 4] [ref 1]

Today’s starting point in trust – the trust that enables two-way communication with professional and public audiences – could hardly be worse. Communications are packaged in mistrust:

“Relations between the doctors’ union and Health Secretary Jeremy Hunt hit a new low following his announcement in July that he was prepared to impose seven-day working on hospital doctors in England.” [BBC news, Aug 15, 2015]

There appears to be divided opinion between politicians and civil servants.

Right now, the Department of Health seems to be sabotaging its own plans for success at every turn.

What reason can there be for denying debate in the public domain of the very plans it says are the life blood of the savings central to the NHS future?

Has the Department learned nothing from the loss of public and professional trust in 2014?

And as regards the public in engagement work, Hetan Shah, executive director of the Royal Statistical Society said in 2014, “Our research shows a “data trust deficit”. In this data rich world, companies and government have to earn citizens’ trust in how they manage and use data – and those that get it wrong will pay the price.’ [RSS Data Trust Deficit, lessons for policymakers, 2014] [2]

Where do the NIB work stream discussions want to reach by 2020?

“The emergence of genomics requires a conversation about what kind of consent is appropriate by 2020. The work stream will investigate a strand of work to be led by an ethicist.” [NIB work stream 4]

Why is genomics here in workstream 4, when data sharing for genomics is with the active consent of volunteers? Why will a strand of work be led by an ethicist for this, and not for other work strands? Is there a gap in how their consent is managed today, or in how consent is to be handled for genomics in the future? It seems to me there is a gap between what is planned and what the public is being told here. It is high time for a public debate on what future today’s population-wide data sharing programme is building. Good communication must ensure there are no surprises.

The words I underlined from the workstream 4 paper highlight the importance of communication: to listen and to have a conversation. Despite all the engagement work of 2014, I feel that is still to happen. As one participant summed up later, “They seem hell bent on going ahead. I know they listened, but what did they hear?” [3]

care.data pathfinder practices are apparently ready to roll out communications materials: “Extraction is likely to take place between September and November depending on how fair processing testing communications was conducted” [Blackburn and Darwen HW]

So what will patient facing materials look like in content? How will they be rolled out?

Are pathfinder communications more robust than 2014 materials?

I hope the creatives will also think carefully about the intent of the communications to be delivered. Is it to fully and ethically inform patients about their choice whether to accept, or opt out from, changes in their data access, management, use and oversight? Or is the programme guidance to minimise the opt out numbers?

The participants are not signing up to a one-time, single-use marketing campaign, but to a lifetime of data use by third parties. Third parties whose roles and purposes remain loosely defined.

It is important when balancing this decision not to forget that data that is available but not used wisely could fail to mitigate risk; for example, in identifying pharmaceutical harms.

At the same time, to collect all data for all purposes under that ‘patient safety and quality’ umbrella theme is simplistic, and lends itself, in some ways, to lazy communications.

Patients must also feel free and able to make an informed decision without coercion; that includes not making them feel guilty for opting out.

The wording used in the past was weighted towards the organisation’s preference. The very concept of “data sharing” is weighted positively towards the organisation, even though in reality the default is for data to be taken by the organisation, not donated by the citizen. In other areas of life, this is recognised as an unwilling position for the citizen to be in.

At the moment I feel that the scope of purposes, both today and in future, is not defined clearly enough in communications or plans for me personally to be able to trust them. Withholding information about how digital plans will fit into the broader NHS landscape and what data sharing will mean beyond 2020 appears, rightly or wrongly, suspicious. Department of Health, what are you thinking?

What the organisation says it will do, it must do and be seen to do, to be demonstrably trustworthy.

This workstream carries two important strands of governance and oversight which now need to be seen to happen: implementing the statutory footing of the National Data Guardian, which has been talked about since October 2014 and promised ‘at the earliest opportunity’ yet seems rather long in coming, and ‘a whole system’ that respects patient choice. What will this look like, and how will it take into account the granular level of choices asked for at care.data listening events throughout 2014?

“By April 2016 NIB will publish, in partnership with civil society and patient leaders, a roadmap for moving to a whole-system, consent-based approach, which respects citizens’ preferences and objections about how their personal and confidential data is used, with the goal of implementing that approach by December 2020.”

‘By December 2020’ is still some time away, yet the care.data pathfinders roll on now regardless. The proof that what was said about data use is what actually happens to data – that what is communicated is trustworthy – depends on a system that can record and share consent decisions, “and can provide information on the use to which an individual’s data has been put. Over the longer term, digital solutions will be developed that automate as far as possible these processes.”

Until then what will underpin trust to show that what is communicated is done, in the short term?

Future proofing Communications must start now

Since 2013 the NHS England care.data approach appeared to want a quick data grab without long term future-proofed plans. Like the hook-up app approach to dating.

To enable the NIB 2020 plans and beyond, to safeguard research in the public interest, all communications must shape a trusted long term relationship.

To ensure public trust, communications content and delivery can only come after changes. Which is again why focusing only on communicating the benefits, without discussing the balance of risk, does not work. That’s what the 2014 patient-facing communications tried.

In 2014 there were challenges on communications that were asked but not answered: on reaching those who are digitally excluded, on reaching those for whom reading text is a challenge, and on deciding who the target audience will be, considering people with delegated authority, young and old, as well as those who move in and out of GP care throughout their lives, such as some military. Has that changed?

In February 2014 Health Select Committee member Sarah Wollaston, now Chair, said: “There are very serious underlying problems here that need to be addressed.”

If you change nothing, you can expect nothing to change in public and professional feeling about the programme. Communications in 2015 cannot simply revamp the layout and packaging. There must be a change in content and in the support given to its delivery. Change means that you need to stop doing some things and start doing others.

In summary for future communications to support trust, I suggest:

1. STOP: delivering content that is biased towards what the organisation wants to achieve, often with a focus on the fair processing requirement, under a coercive veil of patient safety and research

START: communicating with an entirely ethics-based approach, reconsidering all patient data held at HSCIC, and asking whether omitting ‘commercial use’ and the balanced risks identified in the privacy impact assessment, while stating ‘your name is not included’, is right.

2. STOP: consider again all the releases of health data held by HSCIC, and decide for each type whether it will deliver public confidence that your organisations are trustworthy.

START: communicate publicly which commercial companies, re-users and back office users would no longer be legally eligible to receive data, and why. Name organisations that received data in the past which will not in future.

3. STOP: the Department of Health and NHS England must stop undermining trust in their own leadership through public communications that voice opposition to medical professional bodies. Doctors are trusted much more than politicians.

START: strengthen the public-GP relationship that is already well trusted. Strengthen the GP position that will in turn support the organisational-trust-chain that you need to sustain public support. 

4. STOP: stop delaying the legislative changes needed on Data Guardian and penalties for data misuse 

START: implement them and clearly explain them in Parliament and press

5. STOP: don’t rush through short-term short-cuts to get ‘some’ data while ignoring what the listening exercises heard: the public asked for choice.

START: design a thorough granular consent model fit for the 21stC and beyond, and explain to the public what it will offer; the buy-in for bona fide research will be much greater (be prepared to define ‘research’!)

6. STOP: saying that future practices have been changed and that security and uses are now more trustworthy than in the past. Don’t rush to extract data until you can prove you are trustworthy.

START: demonstrate to individuals, through a data use report, who receives their data in future. Who future users are in practice can only be shown through a demonstrable tool, so that people can see your word can be relied upon in practice. This will, I am convinced, lower the opt out rate.

Point 6 is apparently work-in-progress. [p58]
[image: NIB2015]

7. STOP: rolling out the current communications approach without any public position on the changes that will mean we are notified before a new purpose for, or user of, our data arises in future

START: design a thorough change communications model fit for the 21stC and beyond, and tell the public in THIS round of communications what changes of user or purpose will trigger a notification enabling them to opt out BEFORE a future change. For example, in a fictional future: if the government decided that the population-wide database should be further commercialised ‘for the purposes of health’, linked to the NHSBT blood donor registry and sold to genomic research companies, how would I as a donor be told, BEFORE the event?

There are still unknowns in content and future scope that make communications difficult. If you don’t know what you’re saying, how to say it is hard. But what is certain is that future changes in the programme are planned, and how to communicate them with the public and professionals must be designed for now, so that what we are signed up for today stays what we signed up for.

In delivering messages about data sharing and the broader NHS, the DH/NHS England should consider carefully their relationships and behaviours; all communication becomes relevant to trust.

Solutions cannot be thought of only in terms of tools, of what can be imposed on people, but of what can be achieved with people.

That’s people from the public and professionals and the programme working with the same understanding of the plans together, in a trusted long term relationship.

For more detail including my case study comments on the Leeds area CCGs comms leaflet, continue reading below.

Thanks for sharing in discussions of ideas in my five part post on Building public trust – a New Approach. Comments welcome.

Continue reading Building Public Trust [5]: Future solutions for health data sharing in care.data

Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

‘Communicating the benefits’, care.data’s response to the failed communications in spring 2014, has failed to deliver public trust. Here’s why:

To focus on the benefits is a shortcut for avoiding the real issues

Talking about benefits is about telling people what the organisation wants to tell them. This fails to address what the public and professionals want to know. The result is not communication, but a PR exercise.

Talking about benefits in response to the failed communications in spring 2014, while failing to address criticism since, ignores the concerns that the public and professionals raised at macro and micro level. It appears disingenuous about real engagement, despite saying ‘we’re listening’, and seems uncaring.

Talking only about the benefits does not provide any solution that demonstrably outweighs the potential risk of individual and public health harm through loss of trust in the confidential GP relationship, or through data inaccuracy or loss; by ignoring these, it seems unrealistic.

Talking about short term benefits and not long term solutions [to the broken opt out, long term security, long term scope change of uses and users and how those will be communicated] does not demonstrate competency or reliability.

Talking only about the benefits of commissioning and research for the merged dataset CES doesn’t mention all the secondary uses to which all HSCIC patient-level health data are put [those reflected in the Type 2 opt out], including commercial re-use and the National Back Office: “2073 releases made from the National Back Office between April 2013 and December 2013. This includes 313 releases to police forces, 1531 to the Home Office and 229 to the National Crime Agency.” [HSCIC, July 2, 2014]

This use of hospital records and other secondary data by the back office, without openly telling the public, does not feel ethical or transparent.

Another example is past patient communications that expressly said, ‘we do not collect name’, the intent of which would appear to be to assure patients of anonymity, without saying that name is already stored at HSCIC on the Personal Demographics Service, or that name is not needed to be identifiable.

We hear a lot about transparency. But is transparent the same as fully accurate, complete and honest? Honest about the intended outcomes of the programme. Honest about all the uses to which health data are put. Honest about potential future scope changes, and those already planned.

Being completely truthful in communications is fundamental to future-proofing trust in the programme.

NHS England’s care.data programme, through its focus on ‘the benefits’, lacks balance and appears disingenuous, uninterested, unrealistic and lacking in reliability, competency and honesty. Through these actions it does not demonstrate that the organisation is trustworthy. This could be changed.

care.data fundamentally got it wrong with the intention not to communicate the programme at all. It got it wrong in the tool and tone of communications in the patient leaflet. There is a chance to get it right now, if the organisation would only stop focusing on communicating the benefits.

I’m going to step through with a couple of examples why to-date, some communications on care.data and use of NHS data are not conducive to trust.

Communication designed to ‘future-proof’ an ongoing relationship and trust must be by design, not afterthought.

Communications need to start addressing the changes that are happening and how they make people feel, and to address the changes that create concern – among the public and professionals – not the goals that the organisation has.

Sound familiar? Communications to date have been flawed in the same way that the concept of ‘building trust’ has been flawed. It has aimed to achieve the wrong thing and with the wrong audience.

Communications in care.data need to stop focussing on what the organisation wants from the public and professionals – the benefits it sees in getting data – and instead address, firstly at a macro level, why the change is necessary and why the organisation should be trusted to bring it about.

When explaining benefits there are clearly positives to be had from using primary and secondary data in the public interest. But what benefits will be delivered in care.data that are not already on offer today?

Why, if commissioning is done today with less identifiable data, can there be no alternative to the care.data level of identifiable data extraction? Why, if the CPRD offers research in both primary and secondary care today, will care.data offer better research possibilities? And secondly, at a micro level, communications must address the questions individuals asked up and down the country in 2014.

What’s missing and possible to be done?

  1. aim to meet genuine ongoing communication needs, not just legal data protection fair processing tick-boxes
  2. change the organisational attitude, to encourage people to ask what they each want to know at macro and micro level – why the programme at all, and what’s in it for me? What’s new and a benefit that differs from the status quo? This is only possible if you will answer what is asked.
  3. deliver robust explanations of why the macro and micro benefits demonstrably outweigh the risk of individual potential harms
  4. demonstrate reliability, honesty and competency – that you are trustworthy
  5. agree how scope changes will trigger communication, to ‘future-proof’ an ongoing relationship and trust by design.

As the NIB work stream on Public Trust says, “This is not merely a technical exercise to counter negative media attention; substantial change and long-term work is needed to deliver the benefits of data use.”

If they’re serious about that long-term work, then why continue to roll out pathfinder communications based on a model that doesn’t work, with an opt out that doesn’t work? Communication isn’t an afterthought to public trust. It’s key.

If you’re interested in details and my proposals for success in communications I’ve outlined in depth below:

  • Why Communicate Changes at all?
  • What is change in care.data about?
  • Is NHS England being honest about why this is hard?
  • Communicate the Benefits is not working
  • A mock case study in why ‘communicate the benefits’ will fail
  • Long term trust needs a long term communications solution
  • How a new model for NHS care.data Communication could deliver


Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

Let’s assume the question of public trust is as important to those behind data sharing plans in the NHS [1] as they say it is. That the success of the care.data programme today and as a result, the very future of the NHS depends upon it.

“Without the care.data programme, the health service will not have a future,” said Tim Kelsey, national director for patients and information, NHS England. [12]

And let’s assume we accept that public trust is not about the public, but about the organisation being trustworthy.[2]

The next step is to ask, how trustworthy is the programme and organisation behind care.data? And where and how do they start to build?

The table discussion on [3] “Building Public Trust in Data Sharing” considered “what is the current situation?” and “why?”

What’s the current situation? On trust, public opinion is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are low with the state and government, but higher for GPs. It is therefore important that the medical profession themselves trust the programme in principle and practice. They are, after all, the care.data point of contact for patients.

The current status of the rollout, according to news reports, is that pathfinder practices are preparing to roll out [4] communications in the next few weeks. Engagement is reportedly being undertaken ‘over the summer months’.

Understanding both public trust and the current starting point matters as the rollout is moving forwards and as leading charity and research organisation experts said: “Above all, patients, public and healthcare professionals must understand and trust the system. Building that trust is fundamental. We believe information from patient records has huge potential to save and improve lives but privacy concerns must be taken seriously. The stakes are too high to risk any further mistakes.” [The Guardian Letters, July 27, 2015]

Here are three steps I feel could be addressed in the short term, to start to demonstrate why the public and professionals should trust both organisation and process.

What is missing?

1. Opt out: The type 2 opt out does not work. [5]  

2a. Professional voices called for answers and change: As mentioned in my previous summary, various bodies called for change, including the BMA, whose policy [6] remains that care.data should be on a patient opt-in basis.

2b. Public voices called for answers and change: care.data’s own listening event feedback [7] concluded there was much more than ‘communicate the benefits’ that needed done. Much is still missing: questions on confusing the SCR and care.data, legislation and concern over controlling its future change, GPs’ concerns over their ethical stance, the Data Guardian’s statutory footing, correction of mistakes, future funding and more.
How are these open questions being addressed, if at all?

3. A single clear point of ownership on data sharing and public trust communications: Is this now the NIB, the NHS England Patients and Information Directorate, or the DH – who owns care.data now? It’s hard to ask questions if you don’t know where to go, and the boards seem to have stopped any public communications. Why? The public needs clarity of organisational oversight.

What’s the Solution? 

1. Opt out: The type 2 opt out does not work. See the post graphic: the public wanted more clarity over opt out in 2014, so this needs explained clearly. >> Solution: follows below, from a detailed conversation with Mr. Kelsey.

2. Answers to professional opinions: The Caldicott panel raised 27 questions in areas of concern in their report. [8] No response to address them has yet been made available in the public domain by NHS England. Ditto the APPG report, the BMA LMC vote, and others. >> Solution: publish the responses to these concerns and demonstrate what is being done to address them.

2b. Fill in the lack of transparency: There is no visibility of any care.data programme board meeting minutes or materials from 2015. In eight months, nothing has been published. Their 2014 proposal for transparency appears to have come to nothing. Why? The minutes from June–October 2014 are also missing entirely, and the October–December 2014 materials published were heavily redacted. There is a care.data advisory board, which seems to have had little public visibility recently either. >> Solution: the care.data programme business case must be detailed and open to debate in the public domain by professionals and public. Scrutiny of its current costs and time requirements, and its ongoing future financial implications, should be welcomed at national, regional (CCG) and local provider (GP) level. Proactively publishing creates demonstrable reasons why both the organisation and the plans are trustworthy. Refusing this without clear justification seems counterproductive, which is why I have challenged it in the public interest. [10]

3. Address public and professional confusion over ownership: Since data sharing and public trust are two key components of the care.data programme, it seems to come under the NIB umbrella, but there is a care.data programme board [9] of its own, with a care.data Senior Responsible Owner and Programme Director. >> Solution: an overview of where all the different nationally driven NHS initiatives fit together, and their owners, would be helpful.

[Anyone got an interactive Gantt chart for all national level driven NHS initiatives?]

This would also help public and professionals see how and why different initiatives have co-dependencies, and could be a tool to reduce the ‘them and us’ mentality. It would also be useful for modelling what-if scenarios and reality checks on 5YFV roadmaps: for example, if care.data pushes back six months, what else is delayed?

If the public can understand how things fit together, they are more likely to ask questions, and an engaged public is more likely to be a supportive public. Criticism can be quashed if it’s incorrect. If it is justified criticism, then act on it.

Yes, these are hard decisions. Yes, to delay again would be awkward. But if delay were the right decision, would it be worse to ignore that and carry on regardless? Yes.

The most important of the three steps in detail: a conversation with Mr. Kelsey on Type 2 opt out. What’s the Solution?

We’re told “it’s complicated.” I’d say “it’s simple.” Here’s why.

At the table of about fifteen participants at the Bristol NIB event, Mr. Kelsey spoke very candidly and in detail about consent and the opt out.

On the differences between consent in direct care and other uses, he first explained the assumption in direct care: doctors and nurses are allowed to assume that you are happy to have your data shared, without asking you specifically. But he said, “beyond that boundary, for any other purpose, that is not a medical purpose in law, they have to ask you first.”

He went on to explain that what has changed the whole dynamic of the conversation is the fact that the current Secretary of State decided that when your data are shared for purposes other than your direct care, you not only have the right to be asked, but if you say you don’t want them shared, that decision has to be respected by your clinician.

He said: “So one of the reasons we’re in this rather complex situation now, is because if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Therefore, I asked him where the public stands with that now, because at the moment there are around 700,000 people who we know said no in spring 2014.

Simply: they opted out of data being used for secondary purposes, and the HSCIC continues to share their data.

“Is anything more fundamentally damaging to trust, than feeling lied to?”

Mr. Kelsey told the table there is a future solution, but asked us not to tweet when. I’m not sure why; it was mid-conversation and I didn’t want to interrupt:

“we haven’t yet been able to respect that preference, because technically the Information Centre doesn’t have the digital capability to actually respect it.”

He went on to say that they have hundreds of different databases, and at the moment it takes 24 hours for a single person’s opt out to be respected across all of them: a person manually has to enter a field on each database to say someone has opted out. He asked that the hoped-for timing not be tweeted, but explained that all the current historic objections which have been registered will be respected at a future date.

One of the other attendees expressed surprise that GP practices hadn’t been informed of that, having gathered consent choices in 2014 and suggested the dissent code could be extracted now.

The table discussion then took a different turn with other attendee questions, so I’m going to ask here what I would have asked next in response to his statement, “if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Where is the logic to proceed with pathfinder communications?

What was said has not been done and you therefore appear untrustworthy.

If there is to be a future solution, it will need communicated (again).

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

There needs to be demonstrable action that what the org said it would do, the org did. Respecting patient choice is not an optional extra. It is central in all current communications. It must therefore be genuine.

Knowing that what was promised was not respected might mean millions of people choose to opt out who would not do so if the process worked when it was communicated.

Until then, any public communications in Blackburn and Darwen, Somerset, Hampshire and Leeds surely don’t make sense.

Either the pathfinders will test the same communications that are to be rolled out nationally, or they will not. Either those communications will explain the secondary uses opt out, or they will not. Either they will explain the opt out as it is [type 2 not working], or as they hope it will be in future [working]. Not all of these can be true.

People who opt out on the basis of a process broken by a technical flaw are unlikely ever to opt back in again. If it works to start with, they might choose to stay in.

Or will the communications roll out in the pathfinders with a forward-looking promise, repeating what was promised but has not yet been done? We will respect your choice (and this time we really mean it)? Would public trust survive that level of uncertainty? I don’t think so.

There needs to be demonstrable action in future as well, that what the org said it would do, the org did. So the usage audit report, and how any future changes will be communicated, both seem basic principles to clarify for the current rollout as well.

So what’s missing and what’s the solution on opt out?

We’re told “it’s complicated.” I say “it’s simple.” The promised opt out must work before anything else moves forward. If I’m wrong, then let’s get the communications materials out for broad review, to see how they accommodate this and the future re-communication of a second process.

There must be a budgeted and planned future change communication process.

So how trustworthy is the programme and organisation behind care.data?

Public opinion on trust levels is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are clear. The current position must address the opt out issue before anything else. Don’t say one thing, and do another.

To score more highly on the ‘trustworthy scale’ there must be demonstrable action, not simply more communications.

Behaviours need to change, and be modelled in practice, with a focus on people, not tools and tech solutions, which make patients feel they are less important to the organisations than the desire to ‘enable data sharing’.

Actions need to demonstrate they are ethical and robust as a 21st century solution.

Policies, practical steps and behaviours all play vital roles in demonstrating that the organisations and people behind care.data are trustworthy.

These three suggestions are short term – by that I mean six months. Beyond that, further steps need to be taken to be demonstrably trustworthy in the longer term and on an ongoing basis.

Right now, do I trust that the physical security of HSCIC is robust? Yes.

Do I trust that the policies in the programme would not pass my data in future to third-party commercial pharma companies? No.
Do I believe that, to enable commissioning, my fully identifiable confidential health records should be stored indefinitely with a third party? No.
Do I trust that the programme would not potentially pass my data to non-health organisations, such as the police or Home Office? No.
Do I trust the programme to tell me if the purposes change from those outlined now? No.

I am open to being convinced.

*****

What is missing from any communications to date, what looks unlikely to be included in the current round, and why that matters, I address in my next post, Building Public Trust [4]: Communicate the Benefits won’t work for care.data. Then, why a future change management model of consent needs approached now, and not after the pilot, I wrap up in [5]: Future solutions.


Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

Here’s my opinion after taking part in the NIB #health2020 Bristol event on 24/7/2015, and the presentation of plans at the King’s Fund hosted event in June. Data sharing includes plans for extraction and uses of primary care data by third parties, charging ahead under the care.data banner.

Wearing my hat from a previous role in change management and communications, I share my thoughts in the hope the current approach can adapt and benefit from outside perspectives.

The aim of “Rebuilding and sustaining Public trust” [1] needs refocused to treat the cause, not only the symptoms, of the damage done in 2014. Here’s why:

A Seven Step Top Line Summary

1. Abstract ‘public trust’ is not vital to the future of data sharing. Being demonstrably worthy of public trust is.

2. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is.

3. A timed target to ‘get the public’s data’, is not what is needed. Having a stable, long term future-proofed and governable model is.

4. Tech solutions do not create trust. Enabling a positive human response to what the org wants from people does – enabling their confident ‘yes’ to data-sharing. [It might be supported by technology-based tools.]

5. Communications that tell the public ‘we know best, trust us’ fail. While professional bodies [BMA [2], GPES advisory group, the APPG report calling for a public benefits plan, ICO, and expert advice such as Caldicott] are ignored or remain to be acted upon, it is challenging for the public to see how the programme’s needs, motives and methods are trustworthy. The [Caldicott 2] Review Panel found that “commissioners do not need dispensation from confidentiality, human rights & data protection law.” [3] Something’s gotta give. What will it be?

6. care.data consistency. Relationships must be reliable and have integrity.
“Trust us – see the benefits” [But we won’t share the business cost/benefit plan.]
“Trust us – we’re transparent” [But there is nothing published in 2015 at all from the programme board minutes] [4]
“Trust us – we’ll only use your data wisely, with the patient in control” [Ignore that we didn’t before [5], that we still share your data for secondary uses even if you opted out [6], and that, no, we can’t tell you when it will be fixed…]

7. Voices do not exist in a vacuum. Being trustworthy on care.data does not stand alone, but is part of the NHS ‘big picture’.
Department of Health to GPs: “Trust us about data sharing.” [And ignore that we haven’t respected many of your judgements or opinions.]
NHS England to GPs: “Trust us about data sharing.” [And ignore our lack of general GP support: MPIG withdrawal, misrepresentation in CQC reports.]
NHS England and Department of Health to professionals and public: “The NHS is safe in our hands.”
Everyone: “We see no evidence that plans for cost savings, 7 day working, closures and the 5YFV integration will bring the promised benefits. Let us ‘see the holes’, so that we can trust you based on evidence.”

See the differences?

Target the cause not Symptom:

The focus in the first half – the language used by NHS England/NIB/DH – sets out their expectations of the public: “You must trust us and how you give us your data.”

The focus should instead be on the second half: a shift to the organisation – NHS England/NIB/DH – setting out expectations from the public point of view. “Enable the public to trust the organisation. Enable individual citizens to trust what is said by individual leaders. This will enable citizens to be consensual sharers in the activity your organisation imposes – the demand for care.data through a statutory gateway, obliging GPs to disclose patient data.”

The fact that trust is broken – and, specifically to data-sharing, that there is a deficit [A] between how much the public trusts the organisation and how the organisation handles data – is not the fault of the public, or “1.4M NHS staff”, or the media, or patient groups’ pressure. It’s based on proven experience.

It’s based on how organisations have handled data in the past [5] – specifically, on the decisions made by the DH, the Information Centre and leaders in between: those who chose to sell patient data without asking the public.

The fact that trust is broken is also based on how leadership individuals in those organisations have responded, often taking no responsibility for the loss.

No matter how often we hear “commissioners will get a better joined up picture of care needs and benefit you”, it does not compensate for past failings.

Only demonstrable actions to show why it will not happen in future can start that healing process.

Target the timing to the solution, not a shipping deadline

“Building trust to enable data sharing” aims at quick fixes, when what is needed is a healing process and ongoing relationship maintenance.

Timing has to be tailored to what needs done, not an ‘artificial deadline’. That has been said, but it doesn’t seem to match reality.

Addressing the Symptoms and not the Cause, will not find a Cure

What needs done?

Lack of public trust and the data trust deficit [A] are symptoms in the public to be understood. But it is the causes in the organisations that must be treated.

So far, many NHS England staff I have met in relation to care.data appear to have a “them and us” mentality. It’s almost tangible in the language used at these meetings, or in defensive derision of public concerns: “tin foil hat wearers”, “Luddites” [7] and my personal favourite, ‘consent fetishists’. [8] It’s counterproductive, and seems born of either a lack of understanding, or frustration.

The NIB/DH/NHS England/ P&I Directorate must accept they cannot force any consensual change in an emotion-based belief based on past experiences, held by the public.

Those people each have different starting points of knowledge and beliefs.  As one attendee said, “There is no single patient replicated 60 million times.”

The NIB/DH/NHS England/ P&I Directorate can only change what they themselves can control. They have to model and be seen to model change that is trustworthy.

How can an organisation demonstrate it is trustworthy?

This means shifting the focus of the responsibility for change from public and professionals, to leadership organisation.

There is a start in this work stream, but there is little new that is concrete.

The National Data Guardian (NDG) role has been going to be put on a legal footing “at the earliest opportunity” since November 2014. [9] Nine months.

Updated information governance guidance is on the way.

Then there are two really strong new items that would underpin public trust, to be planned in a ‘roadmap’: first, a system that can record and share consent decisions; second, providing information on the use to which an individual’s data has been put.

How and when those two keystones of public trust will actually be offered appears unknown. They would encourage public trust by enabling choice and control of our data. So I would ask: if we’re not there yet on the roadmap, how can consent options be explained to the public in care.data communications, if there is as yet no mechanism to record and effect them? More on that later.

Secondly, when will a usage report be available? That will be the proof that what was offered was honoured. It is one of the few tools the organisation(s) can offer to demonstrate they are trustworthy: you said, we did. So again, why jeopardise public trust by rolling out data extractions into the existing, less trustworthy environment?

How well this is done will determine whether the programme can realise its hoped-for benefits. How the driving leadership influences that outcome will depend on the organisational approach to opt out, decisions on care.data content, the ways and channels in which they are communicated, accepting what has not worked to date, and planning long-term approaches to communicating change before the pathfinders start. [Detailed steps on this follow.]

Considering the importance we have been told the programme has, it’s vital to get this right. [10]

I believe changing the approach – from explaining benefits and focusing on public trust, to demonstrating why the public should trust demonstrable changes made – will make all the difference.

So before rolling out the next data sharing steps, think hard about what the possible benefits and risks will be, versus waiting for a better environment to do it in.

Conclusion: Trust is not about the public. Public trust is about the organisation being trustworthy. Over to you, orgs.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

This is Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

Part two: a New Approach is needed to understanding Public Trust. For those interested in a detailed approach on trust: what practical and policy steps influence trust, and on research and commissioning. Trust is not homogeneous. Trust is nuanced, even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from. What behaviours influence trust. Fixing what has already been communicated is vital before new communications are rolled out – vital to the content of your communications, and vital for public trust and credibility.

Part four: Communicate the Benefits won’t work – How communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and fixes.

Part five: Future solutions – why a new approach may work better for future trust: not attempting to rebuild trust where there is now none, but strengthening what is already trusted and fixing today’s flawed behaviours; honesty and reliability are vital to future-proofing trust.

####

Background References:

I’m passionate about people using technology to make their jobs and lives better and simpler, and about living well. So much so, that this became over 5,000 words. To solve that, I’ve assumed a baseline knowledge, and will follow up with separate posts on why a new approach is needed to understanding “Public Trust”, on “Communicating the benefits”, and on “Being trustworthy and other future solutions”.

If this is all new, welcome; I suggest you look over some of the posts from the past 18 months, which include public voice captured from eight care.data events in 2014. care.data is about data sharing for secondary purposes, not direct care.

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] BMA LMC Vote 2014 http://bma.org.uk/news-views-analysis/news/2014/june/patients-medical-data-sacrosanct-declares–bma

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

Polls of public feeling:

[A] Royal Statistical Society Data Trust Deficit http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[B] Dialogue on data – work carried out through the ADRN

care.data : the economic value of data versus the public interest?

 This is a repost of my opinion piece published in StatsLife in June 2015.

The majority of the public supports the concept of using data for public benefit. [1] But the measurable damage done in 2014 to public trust in data sharing [2], and the reasons for it, are an ongoing threat to its achievement.

Rebuilding trust and the public legitimacy of government data gathering could be a task for Sisyphus, given the media atmosphere clouded by the smoke and mirrors of state surveillance. As Mark Taylor, chair of the NHS’s Confidentiality Advisory Group wrote when he considered the tribulations of care.data [3] ‘…we need a much better developed understanding of ‘the public interest’ than is currently offered by law.’

So what can we do to improve this, as pilot sites move forward, and for other research? Can we consistently quantify the value of the public good, and account for intangible concerns and risks alongside demonstrable benefits? Do we have a common understanding of how the public feels about what is in its own best interests?

And how are shifting public and professional expectations to be reflected in the continued approach to accessing citizens’ data, with the social legitimacy upon which research depends?

Listening and lessons learned

Presented as an interval to engage the public and professionals, the 18 month long pause in care.data involved a number of ‘listening’ events. I attended several of these to hear what people were saying about the use of personal health data. The three biggest areas of concern raised frequently [4] were:

  • Commercial companies’ use and re-use of data
  • Lack of transparency and control over who was accessing data for what secondary purposes, and
  • Potential resulting harms: from data inaccuracy, loss of trust and confidentiality, and fear of discrimination.

It’s not the use of data per se that the majority of the public raises objection to. Indeed many people would object if health data were not used for research in the public interest. Objections were more about the approach to this in the past and in the future.

There is a common understanding of what bona fide research is and how it serves the public interest, and polls confirm a widespread acceptance of ‘reasonable’ research use of data. The HSCIC audit under Sir Nick Partridge [5] acknowledged that some past uses of raw data had not always met public expectations of what was ‘reasonable’. The new secure facility should provide a safe setting for managing this better, but open questions remain on governance and transparency.

As one question from a listening event succinctly put it [6]:

‘Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.’

Using the information gleaned from data was often seen as exploitation when used in segmenting insurance markets, consumer market research or individual targeting. There is also concern, even outright hostility, towards raw health data being directly sold, re-used or exchanged as a commodity – regardless of whether this is packaged as ‘for profit’ or ‘covering administrative costs’.

Add to that the inability to consent to, control, or find out who uses individual-level data and for what purpose, or to delete mistakes, and there is a widespread sense of disempowerment and loss of trust.

Quantifying the public perception of care.data’s value

While the pause was intended to explain the benefits of the care.data extraction, it seemed clear at meetings that people already understood the potential benefits. There is clear public benefit to be gained, for example, from using data as a knowledge base, often by linking it with other data to broaden scientific and social insights, generating public good.

What people were asking was what new knowledge would be gained that is not already gathered from non-identifiable data. Perhaps more tangible, yet less discussed at care.data events, are the economic benefits of commissioning use: data as business intelligence to inform decisions in financial planning and cost cutting.

There might be measurable economic public good from data, from outside interests who will make a profit by using data to create analytic tools. Some may even sell information back into the NHS as business insights.

Care.data is also to be an ‘accelerator’ for other projects [7]. But it is hard to find publicly available evidence that would a) support the economic arguments for using primary care data in any future projects, and b) allow them to be compared with the broader current and future needs of the NHS.

A useful analysis could find that potential personal benefits and the public good overlap, if the care.data business case were to be made available by NHS England in the public domain. In a time when the NHS budget is rarely out of the media it seems a no-brainer that this should be made open.

Feedback consistently shows that making money from data raises more concern over its uses. Who all future users might be remains open as the Care Act 2014 clause is broadly defined. Jamie Reed MP said in the debate [8]: ‘the new clause provides for entirely elastic definitions that, in practice, will have a limitless application.’

Unexpected uses and users of public data have created many of its historical problems. But has the potential future cost of ‘limitless’ applications been considered in the long term public interest? And what of the confidentiality costs [9]? The NHS’s own Privacy Impact Assessment on care.data says [10]:

‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

Who has quantified the cost of that loss of confidence, and have public and professional opinions been accounted for in any cost/benefit calculations? All these tangible and intangible factors should be weighed in calculating care.data’s value in the public interest, asking ‘what does the public want?’ It is, after all, our data and our NHS.

Understanding shifting public expectations

‘The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.’ – David Carr, policy adviser at the Wellcome Trust [11]

To rebuild trust in data sharing, individuals need the imbalance of power corrected, so they can control ‘their data’. Before care.data, the public was mostly unaware that health records were being used for secondary purposes by third parties. In February 2014, the Secretary of State stepped in to confirm that an opt-out would be offered, as promised by the Prime Minister in his 2010 ‘every patient a willing research patient’ speech.

So leaving aside the arguments for and against opt-in versus opt-out (and that for now it is not technically possible to apply the 700,000 opt-outs already made) the trouble is, it’s all or nothing. By not offering any differentiation between purposes, the public may feel forced to opt-out of secondary data sharing, denying all access to all their data even if they want to permit some uses and not others.

Defining and differentiating secondary uses and types of ‘research purposes’ could be key to rebuilding trust. The HSCIC can disseminate information ‘for the purposes of the provision of health care or adult social care, or the promotion of health’. This does not exclude commercial use. Separating commercial purposes which appear exploitative from those in the public interest could benefit the government, commerce and science if, as a result, more people were willing to share their data.

This choice is what the public has asked for at care.data events, at other research events [12] and in polls, but has yet to see any move towards it. I feel strongly that the government cannot continue to ignore public opinion and assume its subjects are creators of data willing to be exploited, without expecting further backlash. Should a citizen’s privacy become a commodity with a price tag, when it is a basic human right?

One way to protect that right is to require an active opt-in to sharing. With the ongoing renegotiation of public rights and data privacy at EU level, consent is no longer just a question best left ignored in the Pandora’s box of ethics, as it has been for the last 25 years of hospital data secondary use. [13]

The public has a growing awareness, differing expectations, and different degrees of trust around data use by different users. Policy makers who ignore these expectations risk continuing to build on a shaky foundation and jeopardising the future data sharing infrastructure. Profiting at the expense of public feeling and ethical good practice is an unsustainable status quo.

Investing in the public interest for future growth

The care.data pause has revealed differences between the thinking of government, the drivers of policy, the research community, ethics panels and the citizens of the country. This is not only about what value we place on our own data, but how we value it as a public good.

Projects that ignore the public voice, that ‘listen’ but do not act, risk their own success and by implication that of others. And with it they risk the public good they should create. A state which allows profit for private companies to harm the perception of good research practice sacrifices the long term public interest for short term gain. I go back to the words of Mark Taylor [3]:

‘The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.’ 

The economic value of data, personal rights and the public interest are not opposed to one another, but have synergies and a co-dependency. The public voice from care.data listening could positively help shape a developing consensual model of data sharing if the broader lessons learned are built upon in an ongoing public dialogue. As Mark Taylor also said, ‘we need to do this better.’

*******

[1] According to various polls and opinions gathered from my own discussions and attendance at care.data events in 2014 [refs: 2, 4, 6, 12]

[2] The data trust deficit, work by the Royal Statistical Society in 2014

[3] M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1 http://script-ed.org/?p=1377

[4] Communications and Change – blogpost https://jenpersson.com/care-data-communications-change/

[5] HSCIC audit under Sir Nick Partridge https://www.gov.uk/government/publications/review-of-data-releases-made-by-the-nhs-information-centre

[6] Listening events, NHS Open Day blogpost https://jenpersson.com/care-data-communications-core-concepts-part-two/

[7] Accelerator for projects mentioned include the 100K Genomics programme https://www.youtube.com/watch?v=s8HCbXsC4z8

[8] Hansard http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140311/debtext/140311-0002.htm

[9] Confidentiality Costs; StatsLife http://www.statslife.org.uk/opinion/1723-confidentiality-costs

[10] care.data privacy impact assessment Jan 2014 [newer version has not been publicly released] http://www.england.nhs.uk/wp-content/uploads/2014/01/pia-care-data.pdf

[11] Wellcome Trust http://blog.wellcome.ac.uk/2015/04/08/sharing-research-data-to-improve-public-health/

[12] Dialogue on Data – Exploring the public’s views on using linked administrative data for research purposes: https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx

[13] HSCIC Lessons Learned http://www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

The views expressed in this article originally published in the Opinion section of StatsLife are solely mine, the original author. These views and opinions do not necessarily represent those of The Royal Statistical Society.