Category Archives: research

The power of imagination in public policy

“A new, a vast, and a powerful language is developed for the future use of analysis, in which to wield its truths so that these may become of more speedy and accurate practical application for the purposes of mankind than the means hitherto in our possession have rendered possible.” [on Ada Lovelace, “The First Tech Visionary”, New Yorker, 2013]

What would Ada Lovelace have argued for in today’s AI debates? I think she might have used her voice not only to call for the good use of data analysis, but also to draw on her second strength: the power of her imagination.

James Ball recently wrote in The European [1]:

“It is becoming increasingly clear that the modern political war isn’t one against poverty, or against crime, or drugs, or even the tech giants – our modern political era is dominated by a war against reality.”

My overriding takeaway from three days spent at the Conservative Party Conference this week was similar. It reaffirmed the title of a school debate I lost at age 15: ‘We only believe what we want to believe.’

James writes that it is “easy to deny something that’s a few years in the future”, and that Conservatives, “especially pro-Brexit Conservatives – are sticking to that tried-and-tested formula: denying the facts, telling a story of the world as you’d like it to be, and waiting for the votes and applause to roll in.”

These positions are not confined to one party’s politics, or speeches of future hopes, but define perception of current reality.

I spent a lot of time listening to MPs. To Ministers, to Councillors, and to party members. At fringe events, in coffee queues, on the exhibition floor. I had conversations pressed against corridor walls as small, press-illuminated swarms of people passed by with Johnson or Rees-Mogg at their centre.

In one panel I heard a primary school teacher deny that child poverty really exists, or affects learning in the classroom.

In another, in passing, a digital Minister suggested that Pupil Referral Units (PRUs) are where most of society’s ills start. But as a Birmingham head wrote this week, “They’ll blame the housing crisis on PRUs soon!” and, “for the record, there aren’t gang recruiters outside our gates.”

This is no tirade on the failings of public policymakers, however. While it is easy to suspect malicious intent when you are at, or feel, the sharp end of policies which do harm, success is subjective.

It is clear that those responsible hold an overwhelming self-belief in the intent of any given policy to do good.

Where policies include technology, this is underpinned by a self-reaffirming belief in its power: power waiting to be harnessed by government and the public sector. It is even more appealing where it is sold as a cost-saving tool to cash-strapped councils. Many councils that have cut away human staff are now trying to use machine power to make decisions. Some of the unintended consequences of taking humans out of the process are catastrophic for human rights.

Sweeping human assumptions about social issues and their causes are becoming hard-coded into algorithmic solutions that try to identify young people in danger of becoming involved in crime, using “risk factors” such as truancy, school exclusion, domestic violence and gang membership.
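To see how such assumptions get hard-coded, here is a minimal, entirely hypothetical sketch of a rule-based risk scorer. The factor names, weights and threshold are invented for illustration, not taken from any real system; the point is that each number encodes a human judgement, yet silently determines who is singled out.

```python
# Hypothetical sketch only: factors, weights and threshold are invented,
# to show how policy assumptions become fixed inside an algorithm.

RISK_WEIGHTS = {
    "truancy": 2,
    "school_exclusion": 3,
    "domestic_violence_at_home": 2,
    "gang_membership": 4,
}

def risk_score(child_record: dict) -> int:
    """Sum the weights of every flagged 'risk factor' in a child's record."""
    return sum(w for factor, w in RISK_WEIGHTS.items() if child_record.get(factor))

def flag_for_intervention(child_record: dict, threshold: int = 4) -> bool:
    # The threshold, like the weights, is a human judgement call,
    # yet it decides who is singled out for intervention.
    return risk_score(child_record) >= threshold

child = {"truancy": True, "school_exclusion": True}
print(flag_for_intervention(child))  # → True: flagged on school data alone
```

Nothing in such a system can represent the assumptions it leaves out; it simply applies them at scale.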

The disconnect between the perception of risk, the reality of risk, and the real harm perceived or felt from these policies as applied in real life, is not so much ‘easy to deny something that’s a few years in the future’, as Ball writes, as a denial of the reality now.

Concerningly, there is a lack of imagination of what real harms look like. There is no discussion of the cases where these predictive policies have no positive effect, or even a negative one, and make things worse.

I’m deeply concerned that there is an unwillingness to recognise any failures in current data processing in the public sector, particularly at scale, and given the well-known poor quality of administrative data. Or to be accountable for those failures.

Harms, existing harms to individuals, are perceived as outliers. Any broad sweep of harms across a policy like Universal Credit seems to be perceived as political criticism, which makes the measurable failures less meaningful, less real, and less necessary to change.

There is a worrying, growing trend of finger-pointing exclusively at others’ tech failures instead, in particular at social media companies.

Imagination and mistaken ideas are reinforced where an idea is plausible, and shared. One oft-heard, self-affirming belief was repeated in many fora among policymakers, media and NGOs regarding children’s online safety: “There is no regulation online”. In fact, much that applies offline applies online; the Crown Prosecution Service Social Media Guidelines are a good place to start. [2] But no one discusses where children’s lives may be put at risk, or made less safe, through the use of state information about them.

Policymakers want data to give us certainty. But many uses of big data and new tools appear to do little more than quantify moral fears, and yet still guide interventions in real lives.

Child abuse prediction, and school exclusion interventions should not be test-beds for technology the public cannot scrutinise or understand.

In one trial attempting to predict exclusion, a recent UK research project (2013–16) linked the school records of 800 children in 40 London schools with Metropolitan Police arrest records for all the participants. It found the interventions created no benefit, and may have caused harm. [3]

“Anecdotal evidence from the EiE-L core workers indicated that in some instances schools informed students that they were enrolled on the intervention because they were the “worst kids”.”

“Keeping students in education, by providing them with an inclusive school environment, which would facilitate school bonds in the context of supportive student–teacher relationships, should be seen as a key goal for educators and policy makers in this area,” the researchers suggested.

But policymakers seem intent on using systems that tick boxes and create triggers to single people out, with quantifiable impact.

Some of these systems are known to be poor, or harmful.

When it comes to predicting and preventing child abuse, there is concern about the harms already seen in US programmes ahead of us, such as Pittsburgh’s, and Chicago’s, which has been scrapped.

The Illinois Department of Children and Family Services ended a high-profile program that used computer data mining to identify children at risk for serious injury or death after the agency’s top official called the technology unreliable, and children still died.

“We are not doing the predictive analytics because it didn’t seem to be predicting much,” DCFS Director Beverly “B.J.” Walker told the Tribune.

Many professionals in the UK share these concerns. How long will they be ignored and children be guinea pigs without transparent error rates, or recognition of the potential harmful effects?

Helen Margetts, Director of the Oxford Internet Institute and Programme Director for Public Policy at the Alan Turing Institute, suggested at the IGF event this week that stopping the use of this AI in the public sector is impossible. We could not decide that “we’re not doing this until we’ve decided how it’s going to be”; it can’t work like that. [45:30]

Why on earth not? At least for these high risk projects.

How long should children be the test subjects of machine learning tools at scale, without transparent error rates, audit, or scrutiny of their systems and understanding of unintended consequences?

Is harm to any child a price you’re willing to pay to keep using these systems to perhaps identify others, while we don’t know?

Is there an acceptable positive versus negative outcome rate?

The evidence so far of AI in child abuse prediction is not clearly showing that more children are helped than harmed.

Surely it’s time to stop thinking, and demand action on this.

It doesn’t take much imagination to see the harms. Safe technology, and safe use of data, do not prevent imagination or innovation employed for good.

If we continue to ignore views from Patrick Brown, Ruth Gilbert, Rachel Pearson and Gene Feder, Charmaine Fletcher, Mike Stein, Tina Shaw and John Simmonds I want to know why.

Where you are willing to sacrifice certainty of human safety for the machine decision, I want someone to be accountable for why.

 


References

[1] James Ball, The European, Those waging war against reality are doomed to failure, October 4, 2018.

[2] Thanks to Graham Smith for the link. “Social Media – Guidelines on prosecuting cases involving communications sent via social media. The Crown Prosecution Service (CPS) , August 2018.”

[3] Obsuth, I., Sutherland, A., Cope, A. et al., “London Education and Inclusion Project (LEIP): Results from a Cluster-Randomized Controlled Trial of an Intervention to Reduce School Exclusion and Antisocial Behavior” (March 2016), J Youth Adolescence (2017) 46: 538. https://doi.org/10.1007/s10964-016-0468-4

When is a profile no longer personal data?

This is bothering me in current and future data protection.

When is a profile no longer personal?

I’m thinking of a class, or even a year group of school children.

If you strip off enough identifiers or aggregate data so that individuals are no longer recognisable and could not be identified from other data — directly or indirectly — that is in your control or may come into your control, you no longer process personal data.
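One rough, illustrative way to test whether “individuals are no longer recognisable” is k-anonymity: checking that every combination of remaining quasi-identifiers is shared by at least k records, so no one sits in a group of fewer than k. This is not the GDPR’s legal test, and the field names below are invented; it is a minimal sketch of the idea.

```python
from collections import Counter

# Illustrative only: k-anonymity is one rough proxy for "not identifiable",
# not a legal standard. The quasi-identifier fields here are invented.

def is_k_anonymous(records: list[dict], quasi_identifiers: list[str], k: int) -> bool:
    """True if every combination of quasi-identifier values appears in at
    least k records, so no individual stands out in a group smaller than k."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

year_group = [
    {"birth_year": 2004, "postcode_area": "E1", "language": "Bengali"},
    {"birth_year": 2004, "postcode_area": "E1", "language": "Bengali"},
    {"birth_year": 2004, "postcode_area": "N1", "language": "Polish"},
]
# The N1/Polish pupil is a group of one: stripped of name, still re-identifiable.
print(is_k_anonymous(year_group, ["birth_year", "postcode_area", "language"], k=2))  # → False
```

Even where a dataset passes such a check, the question below remains whether the profile it yields is still, in effect, personal.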

How far does Article 4(1) go, at the boundary of what is identifiable by economic, cultural or social identity?

There is a growing number of research projects using public sector data (including education data, often linked with various other datasets) in which personal data are used to profile and identify sets of characteristics, with a view to intervention.

Let’s take a case study.

Exclusions and absence, poverty, ethnicity, language, SEN and health, attainment indicators, birth year and postcode area are all profiled on individual-level data, across a dataset of 100 London schools, to identify the characteristics of children more likely than others to drop out.

It’s a research project, with a view to shaping a NEET (Not in Education, Employment or Training) intervention programme in early education. No consent is sought for using the education, health, probation, Police National Computer, and HMRC data like this, because it’s research, and enjoys an exemption.

Among the data collected, pupils of BAME ethnicity and non-English first language in certain home postcodes are more prevalent. The pupils’ names, dates of birth and school addresses have been removed.

In what is in effect a training dataset, teaching the researchers “what does a potential NEET look like?”, pupils with characteristics like those of Mohammed Jones are more prevalent than others.

The dataset does not permit the identification of the data subject as himself, but it knows exactly what a pupil like MJ looks like.

Armed with these profiles of what potential NEETs look like, researchers now work with the 100 London schools, to give the resulting knowledge, to help teachers identify their children at risk of potential drop out, or exclusion, or of becoming a NEET.
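The two stages of this case study can be sketched in a few lines. Everything here is hypothetical: the profile fields, the pupil names and the matching rule are invented to illustrate how a profile built on de-identified “research” data singles out named individuals the moment it is handed back to a school.

```python
# Hypothetical sketch of the case study; all names and fields are invented.

# Stage 1: "research" on de-identified records yields a profile of
# characteristics over-represented among pupils who dropped out.
neet_profile = {"persistent_absence": True, "fsm_eligible": True, "ethnicity": "BAME"}

def matches_profile(pupil: dict, profile: dict) -> bool:
    """A pupil 'looks like' the profile if every profiled characteristic matches."""
    return all(pupil.get(key) == value for key, value in profile.items())

# Stage 2: back in a named school roll, the profile singles out individuals.
school_roll = [
    {"name": "Mohammed Jones", "persistent_absence": True,
     "fsm_eligible": True, "ethnicity": "BAME"},
    {"name": "Amy Smith", "persistent_absence": False,
     "fsm_eligible": False, "ethnicity": "White"},
]
flagged = [p["name"] for p in school_roll if matches_profile(p, neet_profile)]
print(flagged)  # → ['Mohammed Jones']
```

No identifier ever crosses between the two stages, yet the outcome for MJ is the same as if one had.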

In one London school, MJ is a perfect match for the profile. The teacher is relieved of any active judgement about who should join the programme; MJ is exactly what to look for. He is asked to attend a special intervention group, to identify and work on his risk factors.

The data are accurate. His profile does match. But would he have gone on to become NEET?

Is this research, or was it a targeted intervention?

Are the tests for research exemptions met?

Is this profiling and automated decision-making?

If the teacher is asked to “OK” the list, but will not in practice edit it, does that make it exempt from the profiling restriction for children?

The GDPR also sets out the rules (at Article 6(4)) on factors a controller must take into account to assess whether a new processing purpose is compatible with the purpose for which the data were initially collected.

But if the processing is done only after the identifiers are removed that could identify MJ, not just someone like him, does it apply?

In a world that talks about ever greater personalisation, we are in fact treated less and less as individuals. Instead we are constantly assessed by comparison and probability: how we measure up against profiles of other people, built up from historical data.

Those profiles are then used to predict what someone similar would do, and therefore, by inference, what we as individuals would do.

What is the difference, in reality, between having given the researchers all the education, health, probation, Police National Computer, and HMRC data as they had it, and then giving them the identifying school datasets with pupils’ named data and saying “match them up”?

I worry that in risk prediction we are at great risk of using the word ‘research’ to mean something other than what we think it means.

And our children are unprotected from bias and unexpected consequences as a result.

The illusion that might cheat us: ethical data science vision and practice

This blog post is also available as an audio file on soundcloud.


Anaïs Nin wrote in her 1946 diary of the dangers she saw in the growth of technology: that it could expand our potential for connectivity through machines, but diminish our genuine connectedness as people. She could hardly be more contemporary today:

“This is the illusion that might cheat us of being in touch deeply with the one breathing next to us. The dangerous time when mechanical voices, radios, telephone, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision.”
[Extract from volume IV 1944-1947]

Echoes from over 70 years ago can be heard in the more recent comments of entrepreneur Elon Musk. Both are concerned with simulation, with a lack of connection between the perceived and the real, and with the jeopardy this presents for humanity. But both also have a dream: a dream based on society’s positive potential.

How will we use our potential?

Data is the connection between us as humans and what machines, and their masters, know about us. The values with which those masters underpin their machine design will determine the effect that the machines, and the knowledge they deliver, have on society.

In seeking ever greater personalisation, a wider data dragnet is putting together ever more detailed pieces of information about individual people. At the same time, data science is becoming ever more impersonal in how it treats people as individuals. We risk losing sight of how we respect and treat the very people whom the work should benefit.

Nin grasped the risk that a wider reach can mean more superficial depth. Facebook might be today’s model for the large circle of friends you might gather, against how few you trust with confidences, with personal knowledge about your own life, and the privilege it is when someone chooses to entrust that knowledge to you. Machine data mining increasingly tries to reach that depth, and may also add new layers of meaning through profiling, comparing our characteristics with others’ in risk stratification.

Data science, research using data, is often talked about as if it were something separate from using information from individual people. Yet it is all about exploiting those confidences.

Today as the reach has grown in what is possible for a few people in institutions to gather about most people in the public, whether in scientific research, or in surveillance of different kinds, we hear experts repeatedly talk of the risk of losing the valuable part, the knowledge, the insights that benefit us as society if we can act upon them.

We might know more, but do we know any better? To use a well known quote from her contemporary, T S Eliot, ‘Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?’

What can humans achieve? We don’t yet know our own limits. What don’t we yet know?  We have future priorities we aren’t yet aware of.

To explore the best of what Nin saw as ‘human vision’ and Musk sees in technology, the benefits we gain from our connectivity, our collaboration and shared learning, need to be driven with an element of humility: accepting values that shape the boundaries of what we should do, while constantly evolving what we could do.

The essence of this applied risk is that technology could harm you, more than it helps you. How do we avoid this and develop instead the best of what human vision makes possible? Can we also exceed our own expectations of today, to advance in moral progress?


OkCupid and Google DeepMind: Happily ever after? Purposes and ethics in datasharing

This blog post is also available as an audio file on soundcloud.


What constitutes the public interest must be set in a universally fair and transparent ethics framework if the benefits of research, whether in social science, health, education and more, are to be realised. That framework will provide a strategy for getting the prerequisite success factors right, ensuring research in the public interest is not only fit for the future, but thrives. There has been a climate change in consent. We need to stop talking about the barriers that prevent datasharing, and start talking about the boundaries within which we can share.

What is the purpose for which I provide my personal data?

‘We use math to get you dates’, says OkCupid’s tagline.

That’s the purpose of the site. It’s the reason people log in and create a profile, enter their personal data and post it online for others who are looking for dates to see. The purpose, is to get a date.

When over 68,000 OkCupid users registered for the site to find dates, they didn’t sign up to have their identifiable data used and published in ‘a very large dataset’, and onwardly re-used by anyone with unregistered access. The users’ data were extracted “without the express prior consent of the user […].”

Whether the registration consent purposes are compatible with the purposes to which the researcher put the data should be a simple enough question. Are the research purposes what the person signed up to, or would they be surprised to find their data used like this?

These are questions the “OkCupid data snatcher”, now a self-confessed ‘non-academic’ researcher, thought unimportant to consider.

But it appears in the last month, he has been in good company.

Google DeepMind and the Royal Free, big players who should know how to handle data and consent well, paid too little attention to the very same question of purposes.

The boundaries of how the users of OkCupid had chosen to reveal information and to whom, have not been respected in this project.

Nor were these boundaries respected by the Royal Free London trust that gave out patient data for use by Google DeepMind with changing explanations, without clear purposes or permission.

The legal boundaries in these recent stories appear unclear or to have been ignored. The privacy boundaries deemed irrelevant. Regulatory oversight lacking.

The respectful ethical boundaries of consent to purposes, and of autonomy, have indisputably broken down, whether by a commercial organisation, a public body, or a lone ‘researcher’.

Research purposes

The crux of data access decisions is purposes. What question is the research to address – what is the purpose for which the data will be used? The intent by Kirkegaard was to test:

“the relationship of cognitive ability to religious beliefs and political interest/participation…”

In this case the question appears intended as a test of the data, rather than the data being opened up to answer a question. While methodological studies matter, given the care and attention [or self-stated lack thereof] given to the extraction, and any attempt to be representative and fair, that does not appear to be the point of this study either.

The data doesn’t include profiles identified as heterosexual male, because ‘the scraper was’. It is also unknown how many users hide their profiles, “so the 99.7% figure [identifying as binary male or female] should be cautiously interpreted.”

“Furthermore, due to the way we sampled the data from the site, it is not even representative of the users on the site, because users who answered more questions are overrepresented.” [sic]

The paper goes on to say photos were not gathered because they would have taken up a lot of storage space and could be done in a future scraping, and

“other data were not collected because we forgot to include them in the scraper.”

The data are knowingly of poor quality, inaccurate and incomplete. The project cannot be repeated, as ‘the scraping tool no longer works’. The ethical and peer review process is unclear, and the research purpose is at best vague. We can give the researcher the benefit of the doubt and say the intent appears to have been benevolent rather than malevolent; but it is not clear what the intent was, and I think it was clearly misplaced and foolish.

The trouble is, it’s not enough to say, “don’t be evil.” These actions have consequences.

When the researcher asserts in his paper that, “the lack of data sharing probably slows down the progress of science immensely because other researchers would use the data if they could,”  in part he is right.

Google and the Royal Free have tried more eloquently to say the same thing. It’s not research, it’s direct care, in effect, ignore that people are no longer our patients and we’re using historical data without re-consent. We know what we’re doing, we’re the good guys.

However the principles are the same, whether it’s a lone project or global giant. And they’re both wildly wrong as well. More people must take this on board. It’s the reason the public interest needs the Dame Fiona Caldicott review published sooner rather than later.

Just because there is a boundary to data sharing in place, does not mean it is a barrier to be ignored or overcome. Like the registration step to the OkCupid site, consent and the right to opt out of medical research in England and Wales is there for a reason.

We’re desperate to build public trust in UK research right now. So asserting that the lack of data sharing probably slows down the progress of science is misplaced, when it is getting ‘sharing’ wrong that caused the lack of trust in the first place, and that harms research.

A climate change in consent

There has been a climate change in public attitudes to consent since care.data, clouded by the smoke and mirrors of state surveillance. It cannot be ignored. The EU GDPR supports it. Researchers may not like change, but there needs to be a corresponding adjustment in expectations and practice.

Without change, there will be no change. Public trust is low. As technology advances and if we continue to see commercial companies get this wrong, we will continue to see public trust falter unless broken things get fixed. Change is possible for the better. But it has to come from companies, institutions, and people within them.

Like climate change, you may deny it if you choose to. But some things are inevitable and unavoidably true.

There is strong support for public interest research but that is not to be taken for granted. Public bodies should defend research from being sunk by commercial misappropriation if they want to future-proof public interest research.

The purposes for which people gave consent are the boundaries within which you have permission to use their data; they give you freedom, within their limits, to use the data. Purposes and consent are not barriers to be overcome.

If research is to win back public trust, developing a future-proofed, robust ethical framework for data science must be a priority today.

Commercial companies must overcome the low levels of public trust they have generated to date if they ask us to ‘trust us, because we’re not evil’. If you can’t rule out the use of data for other purposes, it’s not helping. If you delay independent oversight, it’s not helping.

This case study, and indeed by contrast the recent Google DeepMind episode, demonstrate the urgency of working out common expectations and oversight of applied ethics in research, of deciding who gets to decide what is ‘in the public interest’, and of data science public engagement. These must be made a priority, in the UK and beyond.

Boundaries in the best interest of the subject and the user

Society needs research in the public interest. We need good decisions made on what will be funded and what will not be, on what will influence public policy, and on where attention is needed for change.

To do this ethically, we all need to agree what is fair use of personal data, when is it closed and when is it open, what is direct and what are secondary uses, and how advances in technology are used when they present both opportunities for benefit or risks to harm to individuals, to society and to research as a whole.

The potential benefits of research are potentially being compromised for the sake of arrogance, greed, or misjudgement, no matter intent. Those benefits cannot come at any cost, or disregard public concern, or the price will be trust in all research itself.

In discussing this with social science and medical researchers, I realise not everyone agrees. For some, using de-identified data in trusted third-party settings poses such a low privacy risk that they feel the public should have no say in whether their data are used in research, as long as it’s ‘in the public interest’.

The DeepMind researchers and the Royal Free were confident that, even using identifiable data, this was the “right” thing to do, without consent.

For the Cabinet Office datasharing consultation (the parts that will open up national registries, and share identifiable data more widely and with commercial companies), they are convinced it is all the “right” thing to do, without consent.

How can researchers, society and government understand what is good ethics of data science, as technology permits ever more invasive or covert data mining and the current approach is desperately outdated?

Who decides where those boundaries lie?

“It’s research Jim, but not as we know it.” This is one aspect of data use that ethical reviewers will need to deal with as we advance the debate on data science in the UK, whether for independents or commercial organisations. Google said their work was not research. Is ‘OkCupid’ research?

If this research and data publication proves anything at all, and can offer lessons to learn from, it is perhaps these three things:

Who is accredited as a researcher or ‘prescribed person’ matters. This counts if we are considering new datasharing legislation, and, for example, who the UK government is granting access to millions of children’s personal data today. Your idea of a ‘prescribed person’ may not be the same as the rest of the public’s.

Researchers and ethics committees need to adjust to the climate change of public consent. Purposes must be respected in research particularly when sharing sensitive, identifiable data, and there should be no assumptions made that differ from the original purposes when users give consent.

Data ethics and laws are desperately behind data science technology. Governments, institutions, civil society, and society as a whole need to reach a common vision, and leadership on how to manage these challenges. Who defines the boundaries that matter?

How do we move forward towards better use of data?

Our data and technology are taking on a life of their own, in space which is another frontier, and in time, as data gathered in the past might be used for quite different purposes today.

The public are being left behind in the game-changing decisions made by those who deem they know best about the world we want to live in. We need a say in what shape society wants that to take, particularly for our children as it is their future we are deciding now.

How about an ethical framework for datasharing that supports a transparent public interest, which tries to build a little kinder, less discriminating, more just world, where hope is stronger than fear?

Working with people, with consent, with public support and transparent oversight shouldn’t be too much to ask. Perhaps it is naive, but I believe that with an independent ethical driver behind good decision-making, we could get closer to datasharing like that.

That would bring Better use of data in government.

Purposes and consent are not barriers to be overcome. Within these, shaped by a strong ethical framework, good data sharing practices can tackle some of the real challenges that hinder ‘good use of data’: training, understanding data protection law, communications, accountability and intra-organisational trust. More data sharing alone won’t fix these structural weaknesses in current UK datasharing which are our really tough barriers to good practice.

How our public data will be used in the public interest will not be a destination, or have a well-defined happy ending; it is a long-term process which needs to be consensual, and there needs to be a clear path to setting out together and achieving collaborative solutions.

While we are all different, I believe that society shares for the most part, commonalities in what we accept as good, and fair, and what we believe is important. The family sitting next to me have just counted out their money and bought an ice cream to share, and the staff gave them two. The little girl is beaming. It seems that even when things are difficult, there is always hope things can be better. And there is always love.

Even if some might give it a bad name.

********

img credit: flickr/sofi01/ Beauty and The Beast  under creative commons

Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care? [#NHSWDP 3]

 

Consent to data sharing appears to be a new choice firmly available on the NHS England patient menu, if patient ownership of our own records is clearly acknowledged as ‘the operating principle legally’.

Simon Stevens had just said in his keynote speech:

“..smartphones; […] the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond ” Simon Stevens, March 18 2015.

Tim Kelsey, Director Patients and Information, NHS England, then talked about consent in the Q&A:

“We now acknowledge the patient’s ownership of the record […] essentially, it’s always been implied, it’s still not absolutely explicit but it is the operating principle now legally for the NHS.

“So, let’s get back to consent and what it means for clinical professionals, because we are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.

“It is essentially, their data.”

How this principle has been applied in the past, is being now, and how it may change matters, as it will affect many other areas.

Our personal health data is the business intelligence of the health industry’s future.

Some parts of that industry will say we don’t share enough data. Or don’t use it in the right way.  For wearables designed as medical devices, it will be vital to do so.

But before anyone launches into polemics on the rights and wrongs of blanket ‘data sharing’, we should be careful about what types of data we mean, and for what purposes they are extracted. It matters when discussing consent and sharing.

We should be clear to separate consent to data sharing for direct treatment from consent for secondary purposes other than care (although Mr Kelsey hinted at a conflation of the two in a later comment). The promised opt-out from sharing for secondary uses is pending legal change. At least that’s what we’ve been told.

Given that patient data from hospital and a range of other NHS health settings are today used for secondary purposes without consent – despite the political acknowledgement that patients have an opt-out – this sounded like a bold new statement, and contrasted with his past stance.

Primary care data extraction for secondary uses, in the care.data programme, was not intended to be consensual. Will it become so?

Its plan so far has an assumed-consent (opt-out) model, despite professional calls from some, such as at the BMA ARM, to move to an opt-in model, and the acknowledged risk of harm that it will do to patient trust.

The NHS England Privacy Assessment said: ‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

A year into the launch, in January 2014, a national communications plan should have solved the need for fair processing, but another year on, in March 2015, there is a postcode-lottery, pilot approach.

If, in principle, data sharing is to be decided by consensual active choice, as it “is the operating principle now legally for the NHS”, then why not now, for care.data, and for all?

When will the promised choice be enacted to withhold data from secondary uses and sharing with third parties beyond the HSCIC?

“we are going to move to a place where people will make those decisions as they currently do with wearable devices” [Widening digital participation, at the King’s Fund March 2015]

So when will we see this ‘move’ and what will it mean?

Why plan to continue to extract more data under the ‘old’ assumption principle, if ownership of data is now with the individual?

And who is to make the move first – NHS patients or NHS patriarchy – if patients use wearables before the NHS is geared up to them?

Looking back or forward thinking?

Last year’s programme has become outdated not only in principle, but in digital best practice, if top-down dictatorship is out and the individual is now to “manage their data as they wish.”

What might happen in the next two years, in the scope of the Five Year Forward Plan or indeed by 2020?

This shift in data creation, sharing and acknowledged ownership may mean epic change for expectations and access.

It will mean that people’s choices around data sharing – from patients and healthy controls – need to be considered early on in research and projects. Engagement, communication and involvement will be all about trust.

For the ‘worried well’, wearables could ‘provide digital “nudges” that will empower us to live healthier and better lives‘ or perhaps not.

What understanding have we yet, of the big picture of what this may mean and where apps fit into the wider digital NHS application and beyond?

Patients’ right to choose

The right to information and decision-making responsibility is shifting towards the patient in other applied areas of care.

But what data will patients truly choose to apply and what to share, manipulate or delete? Who will use wearables and who will not, and how will that affect the access to and delivery of care?

What data will citizens choose to share in future and how will it affect the decision making by their clinician, the NHS as an organisation, research, public health, the state, and the individual?

Selective deletion could change a clinical history and clinician’s view.

Selective accuracy in terms of false measurements [think diabetes], or in medication, could kill people quickly.

How are apps to be regulated? Will only NHS ‘approved’ apps be licensed for use in the NHS and made available to choose from, and what happens to the data of patients who use a non-approved app?

How will any of their data be accessed and applied in primary care?

Knowledge is used to make choices and inform decisions. Individuals make choices about their own lives, clinicians make decisions for and with their patients in their service provision, organisations make choices about their business model which may include where to profit.

Our personal health data is the business intelligence of the health industry’s future.

Who holds the balance of power in that future delivery model for healthcare in England is going to be an ongoing debate of epic proportions, but it will likely change in drips rather than a flood.

It has already begun. Lobbyists and companies who want access to data are apparently asking for significant changes to be made in the access to micro data held at the ONS. EU laws are changing.

The players who hold data, will hold knowledge, will hold power.

If the NHS were a Monopoly board, data intermediaries would be some of the wealthiest sites, but the value they create from publicly funded NHS data should belong in the community chest.

If consent is to be with the individual for all purposes other than direct care, then all data sharing bodies and users had best set their expectations accordingly. Patients will need to make wise decisions, for themselves and in the public interest.

Projects for research and sharing must design trust and security into plans from the start or risk failure through lack of participants.

It’s enormously exciting.  I suspect some apps will be rather well hyped and deflate quickly if not effective. Others might be truly useful. Others may kill us.

As twitter might say, what a time to be alive.

Digital opportunities for engaging citizens, as far as apps and data sharing go, are not only about how the NHS will engage citizens, but about how citizens will engage with what the NHS offers.

Consent it seems will one day be king.

Will there or won’t there be a wearables revolution?

Will we be offered or choose digital ‘wellness tools’ or medically approved apps? Will we trust them for diagnostics and treatment? Or will few become more than a fad for the worried well?

Control for the individual over their own data, and the choice to decide what to store, share or deny, may come to rule in practice, as well as in theory.

That practice will need to differentiate between purposes for direct clinical care and secondary uses, as it does today, and be supported and protected in legislation, protecting patient trust.

“We are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.”

However, as ‘choice’ was the buzzword for NHS care in recent years – conflated with increasing the use of private providers – will consent be abused to mean a shift of responsibility from the state to the individual, with caveats for how it could affect care?

With that shift in responsibility for decision making, as with personalised budgets, will we also see a shift in responsibility for payment choices from state to citizen?

Will our lifestyle choices in one area exclude choice in another?

Could app data of unhealthy purchases from the supermarket, or refusal to share our health data, one day be seen as refusal of care and a reason to decline it? Mr Kelsey hinted at this last question in the meeting.

Add a population stratified by risk groups into the mix, and we have lots of legitimate questions to ask about the future vision of the NHS.

He went on to say:

“we have got some very significant challenges to explore in our minds, and we need to do, quite urgently from a legal and ethical perspective, around the advent of machine learning, and …artificial intelligence capable of handling data at a scale which we don’t currently do […] .

“I happen to be the person responsible in the NHS for the 100K genomes programme[…]. We are on the edge of a new kind of medicine, where we can also look at the interaction of all your molecules, as they bounce around your DNA. […]

“The point is, the principle is, it’s the patient’s data and they must make decisions about who uses it and what they mash it up with.”

How well that is managed will determine who citizens will choose to engage and share data with, inside and outside our future NHS.

Simon Stevens, earlier at the event, had acknowledged a fundamental power shift he sees as necessary:

“This has got to be central about what the redesign of care looks like, with a fundamental power shift actually, in the way in which services are produced and co-produced.”

That could affect everyone in the NHS, with or without a wearables revolution.

These are challenges the public is not yet discussing and we’re already late to the party.

We’re all invited. What will you be wearing?

********
[Previous: part one here #NHSWDP 1  – From the event “Digital Participation and Health Literacy: Opportunities for engaging citizens” held at the King’s Fund, London, March 18, 2015]

[Previous: part two #NHSWDP 2: Smartphones: the single most important health treatment & diagnostic tool at our disposal]

********

Apple ResearchKit: http://techcrunch.com/2015/03/09/apple-introduces-researchkit-turning-iphones-into-medical-diagnostic-devices/#lZOCiR:UwOp
Digital nudges – the Tyranny of the Should by Maneesh Juneja http://maneeshjuneja.com/blog/2015/3/2/the-tyranny-of-the-should


The care.data engagement – is it going to jilt citizens after all? A six month summary in twenty-five posts.

[Note update Sept 19th: after the NHS England AGM in the evening of Sept 18th – after this care.data engagement post published 18hrs earlier – I managed to ask Mr.Kelsey, National Director for Patients and Information, in person what was happening with all the engagement feedback and asked why it had not been made publicly available.

He said that the events’ feedback will be published before the pathfinder rollout begins, so that all questions and concerns can be responded to and that they will be taken into account before the pathfinders launch.

When might that be, I asked? ‘Soon’.

Good news? I look forward to seeing that happen. My open questions on commercial uses and more, and those of many others I have heard, have been captured in previous posts, in particular the most recent at the end of this post. – end of update.]

Medical data has huge power to do good, but it presents risks too. When leaked, it cannot be unleaked. When lost, public trust cannot be easily regained. That’s what broken-hearted Ben Goldacre wrote about care.data on February 28th of this year, ten days after the pause was announced on February 18th [The Guardian].

Fears and opinions, facts and analysis, with lots and lots of open questions. That’s what I’ve written up in the following posts related to care.data since then, including my own point-of-view and feedback from other citizens, events and discussions. All my care.data posts are listed here below, in one post, to give an overview of the whole story, and any progress in the six months ‘listening’ and ‘engagement’.

So what of that engagement? If there really have been all these events and listening, why has not one jot of public feedback been published? This, from September 2014, I find terrifyingly empty of anything but discussion of changing the communications of a status quo programme.

I was at that workshop the article mentions, hosted by Mencap, on communicating with vulnerable and excluded groups. It was carefully managed, with little open room discussion to share opinions across groups (as the Senior Policy Adviser at Signature pointed out). Whilst we got the NHS England compilation of the group feedback afterwards, it was not published. Maybe I should do that, and ask how each concern will be addressed? I didn’t want to tread on the toes of NHS England national comms, assuming publication would follow, but you know what? If the raw feedback from all these meetings says: these are our concerns and we want these changes, and none are forthcoming, then the public should justifiably question the whole engagement process.

It’s public money, and the public’s data. How both are used and why, is not to be hidden away in some civil service spreadsheet. Publish the business case. Publish the concerns. Publish how they are to be addressed.

From that meeting and the others I have been to, many intelligent questions from the public remain unanswered. The most recent care.data advisory workshop summarised many from the last year, and brought out some minority voices as well.

 

On the day of NHS Citizen, the new flagship of public involvement, people like me who attended the NHS England Open Day on June 17th, or care.data listening events, may be understandably frustrated that there is no publicly available feedback or plan of any next steps.

care.data didn’t make it onto the NHS Citizen agenda for discussion on the 18th. [Many other equally worthy subjects did; check them out here if not attending, or watch it online.] So from where will we get any answers? Almost all the comment, question and feedback I have heard at events has been constructively critical, and worthy of response. None is forthcoming.

 

Instead, the article above, this reported speech by Mr Kelsey and its arguments, make me think engagement is going nowhere. No concerns are addressed. PR is repeated. More facts and figures, conflating data used for clinical treatment with all sorts of other uses, are presented as an argument for gathering more data.

Citizens do not need to be told of the benefits. We need concrete steps taken in policy, process and practice, to demonstrate why we can now trust the new system.

Only then is it worthwhile to come back to communications.

How valued is patient engagement in reality, if it is ignored?

How will involvement continue to be promoted in NHS Citizen and other platforms, if it is seen to be ineffective?

How might this affect future programmes and our willingness to get involved in clinical research?

I sincerely hope to see the raw feedback NHS England has gathered in its listening events published very soon. How that will be incorporated into any programme changes, as well as communications, will go a long way towards assuring both the quantity and quality of cross-population participation.

The current care.data status is in limbo, as we wait to see if and when any ‘pathfinder’ CCGs will be announced to guinea-pig the patient records from GP practices in a trial rollout, in whatever form that may take. The latest official statements from Mr Kelsey have mentioned 100-500 practices, but without any indicator of where or when. He suggests ‘shortly’.

What next for care.data? I’ll keep asking the questions and hope we hear some answers from the NHS England Patients and Information Directorate. Otherwise, what was the [&88!@xY!] point of a six month pause and all these efforts and listening?

Publish the business case. Publish the concerns. Publish how they are to be addressed.

What is there to hide?

After this six-month engagement, will there be a happy ending? I feel patients are about to be left jilted at the eleventh hour.
******

You’ll find my more recent posts [last] have more depth and linked document articles if you are looking for more detailed information.

******

March 31st: A mother’s journey – intro

March 31st: Transparency

April 3rd: Communication & Choice

April 4th: Fears & Facts

April 7th: What is care.data? Defined Scope is vital for Trust

April 10th: Raw Highlights from the Health Select Committee

April 12th: care.data Transparency & Truth, Remit & Responsibility

April 15th: No Security Blanket : why consent packages fail our kids

April 18th: care.data : Getting the Ducks in a Row

April 23rd: an Ode to care.data (on Shakespeare’s anniversary)

May 3rd: care.data, riding the curve: Change Management

May 15th: care.data the 4th circle: Empowerment

May 24th: Flagship care.data – commercial uses in theory [1]

June 6th: Reality must take Precedence over Public Relations

June 14th: Flagship care.data – commercial use with brokers [2]

June 20th: The Impact of the Partridge Review on care.data

June 24th: On Trying Again – Project Lessons Learned

July 1st: Communications & Core Concepts [1] Ten Things Learned at the Open House on care.data and part two: Communications and Core Concepts [2] – Open House 17th June Others’ Questions

July 12th: Flagship care.data – commercial use in Practice [3]

July 25th: care.data should be like playing Chopin – review after the HSCIC Data Sharing review ‘Driving Positive Change’ meeting

July 25th: Care.data should be like playing Chopin – but will it be all the right notes, in the wrong order? Looking forwards.

August 9th: care.data and genomics : launching lifeboats [Part One] the press, public reaction and genomics & care.data interaction

August 9th: care.data and genomics : launching lifeboats [Part Two] Where is the Engagement?

September 3rd: care.data – a Six Month Pause, Anniversary round up [Part one] Open questions: What and Who?

September 3rd: care.data – a Six Month Pause, Anniversary round up [Part two] Open questions: How, Why, When?

September 16th: care.data cutouts – Listening to Minority Voices Includes questions from those groups.

September 16th: care.data – “Anticipating Things to Come” means Confidence by Design

October 30th: patient questions on care.data – an open letter

November 19th: questions remain unanswered: what do patients do now?

December 9th: Rebuilding trust in care.data

December 24th: A care.data wish list for 2015

2015 (updated after this post was published, throughout the year)

January 5th 2015: care.data news you may have missed

January 21st 2015: care.data communications – all change or the end of the line?

February 25th 2015: care.data – one of our Business Cases is Missing.

March 14th 2015: The future of care.data in recent discussions

March 26th 2015: Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care? [#NHSWDP 3]

May 10th 2015: The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

The Economic Value of Data vs the Public Good? [3] The value of public voice.

May 14th 2015: Public data in private hands – should we know who manages our data?

June 20th 2015: Reputational risk. Is NHS England playing a game of public confidence?

June 25th 2015: Digital revolution by design: building for change and people (1)

July 13th 2015: The nhs.uk digital platform: a personalised gateway to a new NHS?

July 27th 2015: care.data : the economic value of data versus the public interest? (First published in StatsLife)

August 4th 2015: Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

August 5th, 2015: Building Public Trust [2]: a detailed approach to understanding Public Trust in data sharing

August 6th 2015: Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

August 12th 2015: Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

August 17th 2015: Building Public Trust [5]: Future solutions for health data sharing in care.data

September 12th 2015: care.data: delayed or not delayed? The train wreck that is always on time

****

Questions, ideas, info & other opinions continue to be all welcome. I’ll do my best to provide answers, or point to source sites.

For your reference and to their credit, I’ve found the following three websites useful and kept up to date with news and information:

Dr. Bhatia, GP in Hampshire’s care.data info site

HSCIC’s care.data site

medConfidential – campaign for confidentiality and consent in health and social care – seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent


Launching genomics, lifeboats, & care.data [part 2]

“On Friday 1st August the media reported the next giant leap in the genomics programme in England, suggesting the 100K Genomics Project news was akin to Kennedy launching the Space Race. [1] [from 2:46.30].”

[Part one of this post is in this link, and includes thinking about care.data & genomics interaction].

Part two:

What is the expectation beyond 2017?

The investment to date may seem vast if, like me, you are unfamiliar with the amounts of money spent in research [an £800M announcement in 2011, and £90M in Oxford last summer, as just two examples], and Friday revealed yet more money, a new £300M research package. How it all adds up is complex, and the sourcing is mixed. But the stated aim of the investment is relatively simple: the whole genomes of 75,000 people [40K patients and 35K healthy relatives] are to be mapped by 2017.

Where the boundary lies between participation for clinical care and for research is less clear in the media presentation. If indeed participants’ results will be fed back into their NHS care pathway,  then both aims seem to be the intent of the current wave of participants.

It therefore remains unclear how this new offering interacts with the existing NHS genetic services for direct clinical care, or with other research projects such as UK Biobank, particularly when aims appear to overlap:

“The ultimate aim is to make genomic testing a routine part of clinical practice – but only if patients and clinicians want it.” [Genomics England, how we work]

The equipment infrastructure required to keep these sequencers running 24/7, as was indicated in the TV coverage, is enormous. I’m no maths whizz, but it appears to me they’re building the Titanic at Genomics England, and the number of people actually planned to take part (75K) would fit on the lifeboats. So with what, and from whom, are they expecting to fill the sequencing labs after 2017? At Genomics England events it has been stated that the infrastructure will then be embedded in the NHS. How is unclear, if commercial funding has been used to establish it. But at its most basic, there will be no point building the infrastructure and finding no volunteers want to take part. You don’t build the ship and sail without passengers. What happens if the English don’t volunteer in the desired numbers?

What research has been done to demonstrate the need or want for this new WGS project going forwards at scale, compared with a) present direct care or b) existing research facilities?

I cannot help but think of the line in the film, Field of Dreams. If you build it they will come. So who will come to be tested? Who will come to exploit the research uses for public good? Who will come in vast numbers in our aging population to exploit the resulting knowledge for their personal benefit vs companies who seek commercial profit? How will the commercial and charity investors, make it worth their while? Is the cost/benefit to society worth it?

All the various investors in addition to the taxpayer; Wellcome Trust, the MRC, Illumina, and others, will want to guarantee they are not left with an empty shell. There is huge existing and promised investment. Wellcome for example, has already “invested more than £1 billion in genomic research and has agreed to spend £27 million on a world class sequencing hub at its Genome Campus near Cambridge. This will house Genomics England’s operations alongside those of the internationally respected Sanger Institute.”

Whilst the commercial exploitation by third parties is explicit, there may also be another possibility to consider: would the Government want:

a) some cost participation by the participants? and

b) to sell the incidental findings’ results to the participants?

[ref: http://www.phgfoundation.org/file/10363 ref. #13]

“Regier et al. 345 have estimated the willingness-to-pay (WTP) for a diagnostic test to find the genetic cause of idiopathic developmental disability from families with an affected child. They used a discrete choice experiment to obtain WTP values and found that these families were willing to pay CDN$1118 (95% CI CDN$498-1788) for the expected benefit of twice as many diagnoses using aCGH and a reduction in waiting time of 1 week when compared to conventional cytogenetic analysis.”

“Moreover, it is advisable to minimise incidental findings where possible; health care professionals should not have an obligation to feedback findings that do not relate to the clinical question, except in cases where they are unavoidably discovered and have high predictive value. It follows that the NHS does not have an obligation to provide patients with their raw genome sequence data for further analysis outside of the NHS. We make no judgement here about whether the individual should be able to purchase and analyse their genome sequence independently; however, if this course of action is pursued, the NHS should provide follow-up advice and care only when additional findings are considered to be of significant clinical relevance in that individual…” [13]

How much is that cost, per person to be mapped? What is the expected return on the investment?

What are the questions which are not being asked of this huge state investment, particularly at a time when we are told the NHS is in such dire financial straits?

Are we measuring the costs and benefits?

Patient and medical staff support is fundamental to the programme, not an optional extra. It should not be forgotten that the NHS is a National Service owned by all of us. We should know how it runs. We should know what it spends. Ultimately, it is we who pay for it.

So let’s see on paper, what are the actual costs vs benefits? Where is the overall and long term cost benefit business case covering the multi-year investment, both of tangible and intangible benefits? In my personal research, I’m yet to find one. There is however, some discussion in this document:

“The problem for NGS is that very little ‘real’ information is available on the actual costs for NGS from the NHS perspective and the NHS Department of Health Reference Costs Database and PSSRU, where standard NHS costings are listed, are generally not helpful.” [13 – PHG, 2011]

Where are the questions being asked about whether this is really what we should be doing for the public good and for the future of the NHS?

Research under good ethics and bona fide transparent purposes is a public asset. This rollout has the potential to become a liability.

To me, yet again it seems, politics has the potential to wreck serious research aims and the public good.

Perhaps more importantly, the unrestrained media hype carries the very real risk of creating unfounded hope for an immediate diagnosis or treatment, for vulnerable individuals and families who in reality will see no personal benefit. This is not to undermine what may be possible in future. It is simply a plea to rein in hype to reality.

Politicians and civil servants in NHS England appear to invoke both research and the notion of the ‘public good’ broadly in speeches, to appear to be doing ‘the right thing to do’, but without measurable substance. Without a clear cost-benefit analysis, I admit, I am sceptical. I would like to see more information in the public domain.

Has the balance of patient/public good against the expected “major contribution to make to wealth creation and economic growth in this country” been examined and documented?

Is society prepared for this?

I question whether the propositions of the initiative have been grasped by Parliament and society as a whole, although I understand this is not a ‘new’ subject as such. This execution, however, does appear massive in its practical implications, not least for GPs, if it is to become mainstream as quickly as plans predict. It raises a huge number of ethical questions, not least of which will be around incidental findings, as the Radio 4 interview raised.

The first consideration I have is of pre-natal testing plans:

“Aside from WGS of individuals, other applications using NGS could potentially be more successful in the DTC market. For example, the use of NGS for non-invasive prenatal testing would doubtless be very popular if it became available DTC prior to being offered by the NHS, particularly for relatively common conditions such as Down syndrome…” [

and then the whole question of consent, particularly from children:

“…it may be almost impossible to mitigate the risk that individuals may have their genome sequenced without their consent. Some genome scan companies (e.g. 23andMe) have argued that the risks of covert testing are reduced by their sample collection method, which requires 2ml of saliva; in addition, individuals are asked to sign to confirm that the sample belongs to them (or that they have gained consent from the individual to whom it belongs). However, neither of these methods will have any effect on the possibility of sequencing DNA from children, which is a particularly contentious issue within DTC genomics.” [13]

“two issues have emerged as being particularly pressing: first is the paradox that individuals cannot be asked to consent to the discovery of risks the importance of which is impossible to assess. Thus from a legal perspective, there is no ‘meeting of minds’ and contractually the contract between researcher and participant might be void. It is also unclear whether informed consent is sufficient to deal with the feedback of incidental findings which are not pertinent to the initial research or clinical question but that may have either clinical or personal significance…” [PHG page 94]

And thirdly, we should not forget the elderly. In February 2014 the Department of Health proposed that a patient’s economic value should be taken into account when deciding on healthcare. Sir Andrew Dillon, head of the National Institute for Health and Care Excellence (NICE, which sets national healthcare priorities), disagreed, saying:

“What we don’t want to say is those 10 years you have between 70 and 80, although clearly you are not going to be working, are not going to be valuable to somebody.

Clearly they are. You might be doing all sorts of very useful things for your family or local society. That’s what we are worried about and that’s the problem with the Department of Health’s calculation.

There are lots of people who adopt the fair-innings approach; ‘you’ve had 70 years of life you’ve got to accept that society is going to bias its investments in younger people.”

[14 – see Channel 4] Yet our population is ageing and we need to find a balance of where roles, rules and expectations meet. And we must question: how do we measure human value, should we, and on what basis are we making cost-based care decisions?

The Department of Health proposed that a patient’s economic value should be taken into account when deciding on healthcare. What is their thinking on genomics for the care of the elderly?

Clinical environment changes make engagement and understanding harder to achieve

All this, is sitting on shifting, fundamental questions on how decision making and accountability will be set, in a world of ever fragmenting NHS structure:

“More problematic will be the use of specific genomic technologies such as NGS in patient pathways for inherited disorders that are delivered outside the clinical genetics services (such as services for FH, haemophilia and sickle cell disease) and NGS that is used for non-inherited disease conditions. These will be commissioned by GP consortia within established care pathways. Such commissioning of companion diagnostics would, in theory be evaluated first by NICE. However, it is not clear what capacity NICE will have across a broad range of uses. In practice it seems likely that GP consortia may make a variety of different decisions influenced by local experts and pressure, funding and different priorities. Particular questions for NGS will include: How will commissioners be provided with the necessary evidence for decision-making and can this be developed and coordinated at a national level? How will commissioners prioritise particularly when it may be necessary to invest early in order to achieve savings later? What (if any) influence may commissioners be able to exert over the configuration of test providers (for example the rationalisation of laboratories or the use of private testing companies)? [13]
Today (August 8th) the public row between Roche and the Government, through NICE, became apparent on cancer treatment. And again I found myself asking, what are we not funding whilst we spend on genomics? If you did not hear Sir Andrew Dillon and the discussion, you can listen again on BBC Radio 2 iPlayer here. [It’s in the middle of the programme, and begins at 01:09.06.]

Questions, in search of an answer
Where has the population indicated that this is the direction of travel we wish our National Health Service to take?

What preparation has been made for the significant changes in society it will bring?

When was Parliament asked before this next step in policy and huge public spend were signed off, and where is the periodic check against progress, and public sign-off of the next step?

Who is preparing the people and processes for this explosive change, announced with sparklers, at arm’s length and a long taper?

Are the challenges being shared honestly between policy, politicians and scientists, and being shared with patients and public, as discussed at the stakeholder meeting at St. Barts London, 3rd October 2013 (a key panel presentation: 45 minute video with slides)? When will that be shared with the public and NHS staff in full?

Why does NHS England feel this is so fundamental to the future of the NHS? Must we abandon a scuppered and sinking NHS for personalised medicine on personal budgets and expectations of increased use of private health insurance?

Is genomics really the lifeboat to which the NHS is inextricably bound?

Neither the Patients and Information Directorate nor the wider NHS England Board discusses these questions in public. At the July 3rd 2014 Board Meeting, in the discussion of the genomics programme, I understood the discussion as starting to address the inevitable future loss of equity of access because of genomic stratification, dividing the population into risk pool classifications [10.42]. To my mind, that is the end of the free-to-all NHS as we know it, and if so, through planned policy. More people paying for their own care under ‘personalisation’ is in line with ISCG expectations set out earlier in 2014: “there will be increasing numbers of people funding their own care and caring for others.”

Not everyone may have understood it that way, but if not, I’d like to know what was meant.

I would like to understand what is meant when Genomics England spokespeople say the future holds:

“Increasingly to select most appropriate treatment strategy. In the longer term, potential shift to prevention based on risk-based information.”
or
“Review the role of sequencing in antenatal and adult screening.”

I would welcome the opportunity to fully understand what was suggested at that Board meeting as a result of our shared risk pool, and readers should view it and make up their own mind. Even better, a frank public and/or press board meeting with Q&A could be rewarding.

The ethical questions that this throws up seem so far to have had little public media attention.

Not least, incidental findings: if by sequencing someone’s DNA you establish there is something they ought to be doing soon for their health, will you go to that patient and say, look, you should be doing this? Incidental findings may be quite unexpected and separate from the original illness under investigation in, say, a family member, and may also only suggest risk indicators, not clear facts.

If this is expected to be mainstream by 2018, what training plans are in place to meet the stated “requirement for professionals across the NHS to be trained in genetics and its implications”? [presentation by Mark Bale, DoH, July 2014]

When will we get answers to these questions, and more?

Because there is so much people like me don’t know, but should, if our future NHS is to undergo change as fundamental as is hyped.

Because even the most esteemed in our land can get things wrong. One speaker at the St. Bart’s event quoted one of my favourite myths, wrongly attributed to Goethe: “Whatever you can do or dream you can, begin it. Boldness has genius, power and magic in it.” You see, we just hear something which sounds plausible, from someone who seems to know what they are talking about. It isn’t always right.

Because patients of rare disease in search of clinical care answers should be entitled to have expectations set appropriately, and participants in research should know to what they, and possibly indirectly their family members, are committed.

Because if the NHS belongs to all of us, we should be able to ask questions and expect answers about its planning, how we choose to spend its budget, and how it will look in future.

These are all questions we should be asking as a society.

Fundamentally, in what kind of society will my children grow up?

With the questions of pre-natal intervention, how will we shape our attitudes towards those who are disabled, sick, vulnerable or elderly? Are we moving towards the research vision Mr. Hunt, Mr. Cameron and Mr. Freeman appear to share, only for good, or are we indeed to look further ahead to a Gattacan vision of perfection?

As we become the first country in the world to permit so called ‘three parent children’ how far will we go down the path of ‘fixing’ pre-natal genetic changes, here or in PGD?

How may this look in a society where ‘some cornflakes get to the top‘ and genetic advantage is seen as a natural right over those without that ability? In a state where genetics could be considered as part of education planning? [16]

For those with lifelong conditions, how may genetic screening affect their life insurance when the Moratorium expires* in 2017 (*any shift in date TBC pending discussion)? How will it affect their health care, if the NHS England Board sees a potential effect on equity of access? How will it affect those of us who choose not to have screening – will we be penalised for that?

And whilst risk factors may include genomic factors, some argue lifestyle factors are even more important, and these change over time. How would those who have had genetic screening in the past be affected by future requirements?

After the August 1st announcement [11], The Wellcome Trust‘s reporting was much more balanced and sensible than the political championing had been. It grasps the challenges ahead:

“Genomics England has ambitious plans to sequence 100,000 genomes from 75,000 people, some of whom will also have cancer cells sequenced. The sheer scale of the plans is pretty daunting. The genetic information arising from this project will be immense and a huge challenge for computational analysis as well as clinical interpretation. It will also raise a number of issues regarding privacy of patient data. Ensuring that these genetic data can be used maximally for patient benefit whilst protecting the rights of the individual participant must be at the heart of this project.

At the beginning of the Human Genome Project, scientists and funders like the Wellcome Trust knew they were on a journey that would be fraught with difficulties and challenges, but the long-term vision was clear. And so it is with the plans for Genomics England, it will most certainly not be easy…”

Managing change

Reality is that yet again, Change Management and Communications have been relegated to the bottom of the boarding priorities list.

This is not only a research technology or health programme. Bigger than all of that is the change it may bring: not only in NHS practice, should the everyday vision of black boxes in GP surgeries become reality, but for the whole of society, for the shape of society, in age and diversity. Indeed, if we are to be world leaders, we have the potential to sling the world onto a dangerous orbit if the edges of scope are ill defined. Discussing only with interested parties, those who have specific personal or business interests in genomic research and data sharing, whilst not clearly discussing at Board meetings the potential effects of risk stratification and personalisation on a free-at-the-point-of-delivery health service, is in my opinion not transparent, and requires more public discussion.

After all, there are patients who are desperate for answers, who are part of the NHS and need our fair treatment and equity of access for rare disease. There is the majority who may not have those needs but know someone who does. And we all fund and support the structure and staff in our world-class service, which we know and love. We want this to work well.

Future research participation depends on current experience and expectations. It is the latter I fear are being currently mishandled in public and the media.

Less than a month ago, at the NHS England Board Meeting on July 3rd, Lord Adebowale very sensibly asked, “how do we lead people from where we are, and how do we take the public with us? We need to be a world leader in engaging all the public.”

Engagement is not rocket science. But don’t forget the ethics.

If this project is meant to be, according to MP George Freeman [George 2], akin to Kennedy launching the Space Race, then, by Feynman [12], why can they not get their public involvement at big launches sorted out?

Is it because there are such large gaps and unknowns that questioning will not stand up to scrutiny? Is it because suggesting a programme will end the NHS as we know it would be fatal for any politician or party who supports that programme in the coming year? Or do the leading organisations perhaps paternalistically believe the public is too dim, or uninterested, or simply working too hard to make ends meet to care [perhaps part of the 42% of the population who expected to struggle as a result of universal welfare changes; one in three main claimants (34 per cent) said in 2012 they ‘run out of money before the end of the week/month always or most of the time’]? But why bother with the big press splash, if it should not make waves?

In the words of Richard Feynman after the Challenger launch disaster in 1986:

“Let us make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them. They must live in reality in comparing the costs and utility of the Shuttle to other methods of entering space. And they must be realistic in making contracts, in estimating costs, and the difficulty of the projects.

Only realistic flight schedules should be proposed, schedules that have a reasonable chance of being met.

If in this way the government would not support them, then so be it. NASA owes it to the citizens from whom it asks support to be frank, honest, and informative, so that these citizens can make the wisest decisions for the use of their limited resources. For a successful technology, reality must take precedence over public relations…” [June 6th 1986. Six months after the disaster, the Report to the Presidential Commission (Appendix F)]

Just like the Rosetta spacecraft is getting ever closer to actually landing on the comet, its goal, [15 – BBC Newsround has an excellent little summary] after over ten years, so too is genomics close to the goal of many. It is within grasp that the long-planned mainstreaming of genomic intervention, will touch down in the NHS. My hope is that in its ever closer passes, we get hard factual evidence and understand exactly where we have come from, and where we intend going. What will who do with the information once collected?

The key is not the landing, it’s understanding why we launched in the first place.

Space may not be the most significant final frontier out there in the coming months that we should be looking at up close. Both in health and science.  Our focus in England must surely be to examine these plans with a microscope, and ask what frontiers have we reached in genomics, health data sharing and ethics in the NHS?

******  image source: ESA via Nature

[1] “It’s a hugely ambitious project, it’s on a par with the space race how Kennedy launched 40 years ago.” [from 2:46.30 BBC Radio 4 Int. Sarah Montague w/ George Freeman]

[2] Downing Street Press Release 1st August – genomics https://www.gov.uk/government/news/human-genome-uk-to-become-world-numb

[3] 6th December “Transcript of a speech given by Prime Minister at the FT Global Pharmaceutical and Biotechnology Conference” [https://www.gov.uk/government/speeches/pm-speech-on-life-sciences-and-opening-up-the-nhs]

[4] 10th December 2012 DNA Database concerns Channel 4 http://www.channel4.com/news/dna-cancer-database-plan-prompts-major-concerns

[5] Wellcome Trust- comment by Jeremy Farrar http://news.sky.com/story/1311189/pm-hails-300m-project-to-unlock-power-of-dna

[6] Strategic Priorities in Rare Diseases June 2013 http://www.genomicsengland.co.uk/wp-content/uploads/2013/06/GenomicsEngland_ScienceWorkingGroup_App2rarediseases.pdf

[7] NHS England Board paper presentation July 2013 http://www.england.nhs.uk/wp-content/uploads/2013/07/180713-item16.pdf

[8] ICO and HSCIC on anonymous and pseudonymous data in Computing Magazine http://www.computing.co.uk/ctg/news/2337679/ico-says-anonymous-data-not-covered-by-data-protection-act-until-its-de-anonymised

[9] HSCIC Pseudonymisation Review August 2014 http://www.hscic.gov.uk/article/4896/Data-pseudonymisation-review

[10] November 2013 ISCG – political pressure on genomics schedule http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-001-ISCG-Meeting-Minutes-and-Actions-26-November-2013-v1.1.pdf

[11] Wellcome Trust August 1st 2014 The Genetic Building Blocks of Future Healthcare

[12] Feynman – For successful technology reality must take precedence over PR https://jenpersson.com/successful-technology-reality-precedence-public-relations/

[13] Next Steps in the Sequence – the implications for whole genome sequencing in the UK – PHG Foundation, funded by the PHG Foundation, with additional financial support from Illumina. The second expert workshop for the project was supported by the University of Cambridge Centre for Science and Policy (CSaP) and the Wellcome Trust http://www.phgfoundation.org/file/10363

[14] Anti-elderly drugs proposals rejected by NICE: Channel 4 http://www.channel4.com/news/nice-assessment-elderly-health-drugs-rejected-contribution

[15] BBC Newsround: Rosetta spacecraft and the comet chasing

[16] Education committee, December 4th 2013 including Prof. Plomin From 11.09:30 education and social planning  http://www.parliamentlive.tv/Main/Player.aspx?meetingId=14379

*****

For avoidance of confusion [especially for foreign readership and considering one position is so new], there are two different Ministers mentioned here, both called George:

One. George Osborne [George 1] MP for Tatton, Cheshire and the Chancellor

Two. George Freeman [George 2] MP – The UK’s first-ever Minister for Life Sciences, appointed to this role July 15th 2014 [https://www.gov.uk/government/ministers/parliamentary-under-secretary-of-state–42]

 

*****

care.data should be like playing Chopin – or will it be all the right notes, but in the wrong order? [Part one]

Five months after the most recent delay to the care.data launch, I’ve come to the conclusion that we must seek long-term excellence in its performance, not content ourselves with a second-rate dress rehearsal.

“Sharing our medical records is like playing Chopin. Done well, it has the potential to demonstrate brilliance. It separates the good, the bad and the ugly from the world-class players. But will we get it right, or will we look back at repeat dire performances and say, we knew all the right notes, but got them all in the wrong order?”

Around 100 interested individuals filled a conference room at the King’s Fund on Cavendish Square in London last Monday, July 21st, where the Health and Social Care Information Centre (HSCIC) [1] held a meeting to publicly discuss the Partridge Review [2] and HSCIC data sharing policies, practices and stakeholder expectations going forward: Driving Positive Change. [3]

The vast majority were from organisations which are data users, some names familiar from the care.data press coverage in spring, [Beacon Consulting, Harvey Walsh] plus many university and charity driven researchers.

Sir Kingsley Manning, Sir Nick Partridge and Andy Williams [the CEO since April 2014], all representing HSCIC, spoke about the outcomes of the PwC audit, which sampled 10% of the releases of identifiable or pseudonymous data sharing agreements for closer review, and what is termed ‘Back Office’ access (by the police, Home Office, court orders) in the eight years as the NHS IC prior to the HSCIC rebrand and changes on April 1st, 2013.

“The standard PwC methodology was adopted for sample testing data releases with the prevailing governance arrangements. Samples were selected for each of the functional areas under review. Of the total number of data releases identified (3,059); approximately a 10% sample was tested in total.” (Report, Data Release Review June 2014)

I believe it is of value to understand how we got here, as well as the direction in which the HSCIC is moving. This is what the meeting sought to do: to first look back, and then look forward. They are Data Controller and Processor of our health records and personal identifiable data. As care.data pathfinder pilots approach at a pace, set for ‘autumn’, the changes in the current processes and procedures for data handling will not only affect records which are already held, from our hospital care and other health settings, but will have a direct effect on how our medical records extracted from GP practices will be treated, for care [dot] data in the future.

Data Management thus far has failed to meet the standards of world class delivery; in collection, governance and release

After the event, walking back to the train home, I passed the house from which Chopin left, to play his last concert. [4]

It made me think that sharing our medical records is like playing Chopin. Done well, it has potential for brilliance. It separates the good, the bad and the ugly from the world-class players; even more so when played as part of a suite, where standards are understood and interoperable. Data sharing demands technical precision, experience and discipline. Equally, gone wrong, we can look back at past performances and say, we had world-class potential and knew all the right notes, but got them all in the wrong order. Where did we fail? Will we learn, or let it repeat?

The 2.5-hour event focused more on the attendees’ main interest: how they will be affected by any changes in the release process. Some had last received data before the care.data debacle in February put a temporary halt on releases.

As a result of planned changes, will some current data customers find, that they have already received data for the last time, I wonder?

After the initial review of the critical findings in the Partridge report, the discussion centred on listening to suggestions of what may be done in England to prevent future failures. But in fact, I think we should be going further. We should be looking at what we are doing in England to be the world-class player that the Prime Minister said he wants.[5]

We are focused on making the best of a bad job, when we could be looking at how to be brilliant.

To me, the meeting missed a fundamental point. Before they decide the finer points of release, they need to ensure there will be data to collect. There was not one mention of the public’s surprise that our data was collected, and had been sold or shared with each of them, until last spring. Now that the public in part knows about it, the recipients should consider that we are watching them closely.

Data users are being judged as one, by their group performance

What the data recipients may or may not be conscious of, is that they too each are helping to shape the orchestra and will determine the overall sound that is heard outside.

They may not realise that we citizens, the data providers, will see and hear their actions as data recipients, and respond to them all collectively, in terms of what impact it may have on our opt-in/out decision.

I heard on Monday one or two shriller voices from global data intermediaries claiming that others had been receiving data whilst their own requests had been overlooked. As of last Friday, HSCIC said 627 requests were on standby, waiting for review and to know whether or not they would receive data. Currently HSCIC is getting 70 new requests a month. Bearing in mind the attendees were mostly data users, they can be forgiven that they were mostly concerned about data release and use, but they did in part also raise the importance of correct communication, governance and consent of extraction. They realise without future public trust, there is no future data store.

One consultancy, however, seemed to want to blame all the other players for their own past mistakes, though there was no talk of blame in any other discussion. They asked: what about the approvals process for SUS (Secondary Uses Service) data, how are those being audited and approved, is it like HES? How about HSCIC getting their act together on opt-out, putting power back in the hands of patients? What about the National Cancer Registries, the ONS (Office of National Statistics), all the data which is not HES; will there be one entrance point to access all these data stores for all requests? And as for insurance concerns by patients, the same voice said people were foolish to be concerned. Why, “if they don’t get our health data then all the premiums will go up.”

My my, it did feel a little like a Diva having a tantrum at the rest of the performers for messing up her part. And she would darn well pull the rest of them into the pit with her if she was going to get cancelled. In true diva style, I’m sure that company didn’t even realise it.

But all those data recipients are in the same show now – if one of them screws up badly, the critics will slam them all. And with it, their providers of data, we patients, will not share our data. Consent and confidentiality are golden tickets and will not be given up lightly. If  all the data-using players perform well, abide by the expected standards, and treat both critics, audience and each other with proper etiquette, then they will get their pay, and get to stay in the show. But it won’t be a one time deal. They will need to learn continuously, do whatever the show conductor asks, and listen and learn from the critics as they perform in future, not slacking off or getting complacent.

Whilst the meeting discussed past failings in the NHS IC, I hope the organisations will consider that what has truly shocked the public is some of the uses to which data has been put; how the recipients used it. They need to examine their own practices as much as HSCIC’s.

The majority of the attendees were playing from the same score, asking future questions which I will address in detail in part two.

The vast majority asked, how will the data lab work? Other research users asked many similar and related questions. [This from medConfidential [6], whilst on the similar environment for accredited safe havens, goes some way to explaining the principle of a health research remote data lab (HRRDL).]

Governance questions were raised. Penalties were an oft-recurring theme, and local patient representative groups and charity representatives asked how the new DAAG lay person appointments process would work and be transparent.

Other questions on past data use were concerned with the volume of Back Office data uses: the volume of police tracing, for example, or person tracing by the border agency, particularly with reference to HIV and migrant health, which may reveal data to border agencies that would not normally be shared by the patients’ doctors. “If people are going to have confidence in HSCIC, this was a matter of policy which needed looking at in detail.” The HSCIC panel noted that they also understood there were serious concerns over the quantity of intra-government department sharing, the HMRC, Home and Cabinet Offices getting mentions. “There was debate to be had,” he said.

And what do you think of the show so far? [7]

They’re collectively recovering from unexpected and catastrophic criticism at the start of the year. It is still having a critical effect on many organisations because they don’t have access to the data exactly as they used to, with a backlog built up after a temporary stop on the flow which was restarted after a couple of months. HSCIC has reviewed themselves, in part, and any smart attendees on Monday will know how each of their organisations have fared. The audit has found some of their weaknesses and sought to address them. There is a huge number of changes, definitions and open considerations under discussion and not yet ready to introduce. They realise there is a great amount of work still to be done, to bring the theory into practice, test it out, edit and get to a point where they are truly ready for a new public performance.

But none of the truly dodgy-sounding instruments have been kicked out yet. I would suggest there are simply organisations which are not themselves of the standards of ethics and physical best practice which deserve to manage our data. They will bring down the whole, and need to be rejected: the commercial re-use licences of commercial intermediaries. And the playing habits of the data intermediaries need some careful attention, drawing the line between their clinical support work and their purely commercial purposes. The pace may have slowed down, but data is still flowing out, and there was no recognition that this may be without data protection permission or best practice, if individuals aren’t aware of their data being used in this way. The panel conducted a well-organised and orderly discussion, but there were by far more open questions than answers ready to be given.

What we do now, sets the future stage of all data sharing, in the UK and beyond – to be brilliant, will take time to get right

How HSCIC puts into action and implements the safeguards, processes and their verbal plans to manage data in the short and medium term, will determine much for the future of data governance in England, and the wider world. Not only in terms of the storage and release of data – its technical capability and process governance, but in the approach to data extraction, fair processing, consent, communication and ongoing management.

This is all too important to rush, and I hope that the feedback and suggestions captured on the day will be incorporated into the production. To do so well, will need time and there is no point in some half-ready dress rehearsal when so much is yet to be done.

The next Big Thing – care.data

When it came to care.data, Andy Williams said it had been a serious failing not to recognise that patients view their GP records quite differently from the records held at a hospital, and from sharing their HES data.

“And it is their data, at the end of the day,” he recognised.

So to conclude looking back: I believe data sharing has come leaps and bounds from where it was six months ago. The Partridge Review recognises there are problems and makes nine recommendations. There is lots more the workshop suggested for consideration. If HSCIC wants to achieve brilliance, it needs to practise before going out on a public stage again. The excellence of Chopin’s music does not happen by chance, or through passion alone. To achieve brilliance we cannot follow some romantic notion that ‘it will all be alright on the night’. Hard-edged technical experience knows world-class delivery demands more.

So rolling out care.data as a pathfinder model in autumn, before so much good preparation can possibly be done, is in my opinion utterly pointless. In fact, it would be damaging. It will be like pushing a grade 5 schoolboy who’s not ready into the limelight, and just wishing him luck while you wait, whistling, in the wings. But what will those in charge say?

Will our health data sharing be a virtuoso performance [8]? Or will we end up with a second rate show, where we will look back and say, we had all the right notes, but played them all in the wrong order [9]?

{Update August 6th, official meeting notes courtesy of HSCIC}

I look forward to the future and address this more, as we did in the second part of the meeting, in my post Part Two. [10]

*****

[1] The Health and Social Care Information Centre – HSCIC

[2] The Partridge Review – links to blog post and all report files

[3] HSCIC Driving Positive Change http://www.hscic.gov.uk/article/4824/Driving-positive-change

[4] Chopin’s Last concert in London http://www.chopin-society.org.uk/articles/chopin-last-concert.htm

[5] What are we doing in England to be the world-class player that the Prime Minister said he wants? https://www.gov.uk/government/news/record-800-million-for-groundbreaking-research-to-benefit-patients

[6] A Health Research Remote Data Lab (HRRDL) concept for the ASH consultation – https://medconfidential.org/2014/hrrdls-for-commissioning/

[7] “What do you think of the show so far?” A classic Waldorf and Statler line from the Muppet Show. https://www.youtube.com/watch?v=jJNxj1FdKuo&list=PL1BCB0B838EBE07C6&index=12

[8] Chopin Rubinstein Piano Concerto no.2 with Andre Previn https://www.youtube.com/watch?v=T_GecdMywPw&index=1&list=RDT_GecdMywPw

[9] Classic comedy Morecambe & Wise, with Andre Previn – all the right notes, but not necessarily in the right order https://www.youtube.com/watch?v=-zHBN45fbo8

[10] Blog post part two: care.data is like playing Chopin – or will it be all the right notes, but in the wrong order? [Part two – future]

**** In case care.data is news for you, here is a simple guide via Wired  and a website from GP and Caldicott Guardian Dr. Bhatia > the official NHS England page is here   ****

####

Fun facts: From The Telegraph, 2010: Prince of The Romantics by Adam Zamoyski

“That November farewell, given in aid of a Polish charity, came at the end of a difficult six-month British sojourn, which had included concerts in Manchester (one of the largest audiences he ever faced), Glasgow and Edinburgh, where the non-religious Chopin had unwillingly endured Bible readings by a pious patroness anxious to convert him to the Church of Scotland. Finally back in London, the composer-pianist spent three weeks preparing for what turned out to be his final recital by sitting wrapped in his coat in front of the fire at St James’s Place, attended by London’s leading homeopath and the Royal Physician, a specialist in tuberculosis. A week after the concert, he was on his way home to Parisian exile and death the following year.”

Born in Zelazowa Wola, Poland, of a French emigrant father and Polish mother, he left Poland aged 20, never to return. He was well known, and to some controversial, for his long romantic liaison with the novelist George Sand (Aurore Dudevant); after they separated his health failed, and in 1848 he paid a long visit to Britain where he gave his last public performance at the Guildhall. He died in Paris.

What is Care.data? Defined scope is vital for trust.

It seems impossible, to date, to get an official simple line drawn around ‘what is care.data’. Scope creep is therefore inevitable, and fair processing almost impossible. There is much misunderstanding: seeing it as exclusively this one-time GP load to merge with HES, or even confusion with the Summary Care Record and its overlap, and whether it will be used in read-only environments such as Proactive Care and Out-of-Hours, or by 111 and A&E services. The best unofficial summary is here, from a Hampshire GP, Dr. Bhatia.

Care.data is an umbrella initiative, which is planned over many years.

Care.data seems to be a vision. An ethereal concept of how all Secondary Uses (ref. p28) health and social care data will be extracted and made available to share in the cloud for all manner of customers. A global standard allowing extract, query and reporting for top-down control by the men behind the curtains, with intangible benefits for England's inhabitants whose data it is. Each data set puts another brick in the path towards a perfect, all-knowing care.data dream. And the data sets continue to be added to, with plans made for ever more future flows. (Community services make up 10 per cent of the NHS budget, and the standards that will mandate the national submission of the revised CIDS data are now not due until 2015.)

Whilst offering insight opportunities for top-down cost control, planning and 'quality' measures, right down to the low-level basics of invoice validation, it will not offer clinicians on the ground access to use data between hospitals for direct care. HES data is too clunky, too detailed with the wrong kinds of data, or too incomplete and inaccurate to benefit patients in the care of their individual consultants. At the Westminster Health Forum on 1st April, Prof Jonathan Kay told hospitals to do their own thing: go away and make local hospital IT systems work. That is totally at odds with the 'interoperability' mantra of NHS England's Beverley Bryant earlier the same day. An audience question asked how we can ensure patients transfer successfully between hospitals without a set of standards. It is impossible to see good value for patients here.

Without a controlled scope, I do not wish to release my children's personal data for research purposes. But at the moment we have no choice. Our data is used in pseudonymous format, and we have no known, publicly communicated way to restrict that use. The patient leaflet, 'Better data means better care', certainly gives no indication that pseudonymous data use is obligatory, nor states clearly that only the identifiable data would be restricted if one objected.

Data extracted now offers no possibility of time-limiting its use. I hope my children will have a long and happy lifetime, and can choose for themselves whether they are 'a willing research patient', as David Cameron stated in 2010 he would change the NHS Constitution to assume. We just don't know to what purposes their data will be put in their lifetime.

Data taken under an assumed opt-in should surely be expected to be used only for our care and nothing else, unless there is a proven patient need and benefit otherwise. All other secondary uses cannot be assumed without some sort of fair processing, yet they already are.

The general public can now see for the first time the scope of how the HSCIC quango and its predecessors have been giving away our hospital records at arm's length, with commercial re-use licences.

The scope of sharing and its security clearly depend on whether data is fully identifiable (red), truly anonymous and aggregated (green, Open Data), or so-called amber. This pseudonymous data is re-identifiable if you know what you are doing, according to anyone who knows about these things, and re-identification is easy when the data is paired with other data sets. It's illegal? Well, so was phone hacking, and we know that didn't happen either, of course. Knowledge once leaked is lost. The bigger the data, the bigger the possible loss, as Target will testify. So for those who fear it falling into the wrong hands, it is a risk we just have to trust is well secured. This scope of what can be legitimately shared, and for what purposes, must be reined in.
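To see why 'amber' pseudonymous data is weaker than it sounds, here is a minimal sketch of a linkage attack. The field names and records are invented for illustration and are not the care.data schema: the direct identifier (NHS number) is replaced by a salted hash, but quasi-identifiers such as postcode district, birth year and sex remain, and can be joined against a side data set the attacker already holds, such as an electoral roll or marketing list.

```python
# Illustrative sketch only (hypothetical fields, not the real care.data schema):
# pseudonymisation removes the NHS number, but a join on remaining
# quasi-identifiers can still re-identify individuals.
import hashlib

def pseudonymise(record):
    """Replace the direct identifier with a salted hash; keep everything else."""
    pseudo = dict(record)
    nhs_number = pseudo.pop("nhs_number")
    pseudo["pseudo_id"] = hashlib.sha256(
        ("secret-salt" + nhs_number).encode()
    ).hexdigest()[:12]
    return pseudo

# An 'amber' release: pseudonymised hospital episodes (invented examples)
hospital = [pseudonymise(r) for r in [
    {"nhs_number": "943 476 5919", "postcode": "SO14",
     "birth_year": 1972, "sex": "F", "diagnosis": "asthma"},
    {"nhs_number": "943 476 5870", "postcode": "SO15",
     "birth_year": 1985, "sex": "M", "diagnosis": "diabetes"},
]]

# Side data an attacker might hold, with names attached
side_data = [
    {"name": "Jane Doe", "postcode": "SO14", "birth_year": 1972, "sex": "F"},
]

def reidentify(hospital_rows, known_people):
    """Join the two data sets on quasi-identifiers; no NHS number needed."""
    keys = ("postcode", "birth_year", "sex")
    return [
        (person["name"], row["diagnosis"])
        for row in hospital_rows
        for person in known_people
        if all(row[k] == person[k] for k in keys)
    ]

print(reidentify(hospital, side_data))  # [('Jane Doe', 'asthma')]
```

The point of the sketch is that the hash protects nothing once the combination of remaining attributes is unique in the population, which for a postcode district, birth year and sex it very often is.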

Otherwise, how can we possibly consent to something which may be put to entirely different purposes down the line?

If we need different data for real uses of commissioning, various aspects of research and the commercial ‘health purposes,’ why then are they conflated in the one cauldron? The Caldicott 2 review questioned many of these uses of identifiable data, notably for invoice validation and risk stratification.

Parents should be able to support research without that meaning our kids’ health data is given freely for every kind of research, for eternity, and to commercial intermediaries or other government departments. Whilst I have no qualms about Public Health research, I do about pushing today’s boundaries of predictive medicine. Our NHS belongs to us all, free-at-the-point-of-service for all, not as some sort of patient-care trade deal.

Where is the clear definition of scope and purposes for either the existing HES data or future care.data? Data extractions demand fair processing.

Data is not just a set of statistics. It is the knowledge of our bodies, minds and lifestyle choices. Sometimes it will give others knowledge about us that we do not yet have ourselves.

Who am I to assume today a choice which determines that my children have none for evermore? Why does the Government make that choice on our behalf, having originally decided not to tell us at all? It is very uncomfortable feeling like it is Mother vs Big Brother on this, but that is how it feels. You have taken my children's hospital health records and are using them without my permission, for purposes I cannot control. That is not fair processing. It was not in the past, and it continues not to be now. You want to do the same with their GP records, and planned not to ask us. And you still have not explained why many of us had no communications leaflet. Where is my trust now?

We need to be very careful to ensure that all the right steps are put in place to safeguard patient data for the vital purposes which need it: public health, ethical and approved research, and the planning and delivery of care. NHS England must surely step up publicly soon and explain what is going on. And, ideally, confirm that they will take as long as necessary to get all the right steps in the right order. Autumn is awfully close, if nothing has yet changed.

The longer trust is eroded, the greater chance there is long term damage to data quality and its flawed use by those who need it. But it would be fatal to rush and fail again.

If we set the right framework now, we can build in a mechanism ensuring that every future change to scope triggers communication and fresh fair processing.

We need to be told transparently what purposes our data is being used for today, so we can trust those who want to use it tomorrow. Each time purposes change, the chance to revoke consent should be offered again, and not just going forward, but across all use of our records, historic and future.

How have we got here? The Secondary Uses Service (SUS) is the big data cloud of which Hospital Episode Statistics (HES) is a subset. HES was originally extracted and managed as an admin tool. From the early days of the Open Exeter system, GP patient data was used for our clinical care and its management. When did that change? Scope seems not so much to have crept as skipped along a path: to sharing the data, linked on demand even with Personal Demographics or QOF data, with pharma, all manner of research institutions and third-party commercial intermediaries, while no one thought to tell the public. Oops, says the ICO.

Without scope definition, there can be no fair processing. We don’t know who will access which data for what purposes. Future trust can only be built if we know what we have been signed up to, stays what we were signed up to, across all purposes, across all classes of data. Scope creep must be addressed for all patient data handling and will be vital if we are to trust care.data extraction.

***