Tag Archives: data sharing

The Queen’s Speech, Information Society Services and GDPR

The Queen’s Speech promised new laws to ensure that the United Kingdom retains its world-class regime protecting personal data. And the government proposes a new digital charter to make the United Kingdom the safest place to be online for children.

Improving online safety for children should mean one thing: children should be able to use online services without being used by them, or by the people and organisations behind them. It should mean that their rights to be heard are prioritised in decisions about them.

As Sir Tim Berners-Lee is reported as saying, there is a need to work with companies to put “a fair level of data control back in the hands of people”. He rightly points out that today terms and conditions are “all or nothing”.

There is a gap in these discussions that we fail to address when we think of consent to terms and conditions, or “handing over data”: the assumption that these are, and can always be, conscious acts.

For children, the question of whether accepting Ts&Cs gives them control, and whether that control is meaningful, is even more moot. What are they agreeing to? Younger children cannot give free and informed consent. After all, most privacy policies standardly include phrases such as, “If we sell all or a portion of our business, we may transfer all of your information, including personal information, to the successor organization,” which means that “accepting” a privacy policy today is effectively a blank cheque for anything tomorrow.

The GDPR requires terms and conditions to be laid out in policies that a child can understand.

The current approach to legislation around children and the Internet is heavily weighted towards protection from seen threats. The threats we need to give more attention to, are those unseen.

By 2024 more than 50% of home Internet traffic will be used by appliances and devices, rather than just for communication and entertainment…The IoT raises huge questions on privacy and security, that have to be addressed by government, corporations and consumers. (WEF, 2017)

Our lives as measured in our behaviours and opinions, purchases and likes, are connected by trillions of sensors. My parents may have described using the Internet as going online. Today’s online world no longer means our time is spent ‘on the computer’, but being online, all day every day. Instead of going to a desk and booting up through a long phone cable, we have wireless computers in our pockets and in our homes, with functionality built-in to enable us to do other things; make a phonecall, make toast, and play. In a smart city surrounded by sensors under pavements, in buildings, cameras and tracking everywhere we go, we are living ever more inside an overarching network of cloud computers that store our data. And from all that data decisions are made, which adverts to show us, on which network sites, what we get offered and do not, and our behaviours and our conscious decision-making may be nudged quite invisibly.

Data about us, whether uniquely identifiable or not, is all too often collected passively: IP address, linked sign-ins that extract friends lists. Some services decide we can either accept that or not use the thing at all. It’s part of the deal. We get the service; they get to trade our identity, like Top Trumps, behind the scenes. But we often don’t see it, and under the GDPR consent must not be made a contractual requirement: “agree or don’t get the service” is not supposed to be an option.

From May 25, 2018 there will be special “conditions applicable to child’s consent in relation to information society services,” in Data Protection law which are applicable to the collection of data.

As yet, we have not had a debate in the UK about what that means in concrete terms, and if we do not have one soon, we risk it becoming an afterthought that harms more than it helps children’s privacy, and therefore their digital identity.

I think of five things needed by policy shapers to tackle it:

  • An in-depth understanding of what ‘online’ and the Internet mean
  • A consistent understanding of the threat models and risks connected to personal data, which today are underestimated
  • A grasp of why data privacy training is vital to safeguarding
  • A willingness to confront the idea that user regulation as a stand-alone step will create a better online experience, when we know that the perceived problems are created by providers or other site users
  • An end to siloed thinking that fails to be forward thinking, or to join the dots of tactics across Departments into a cohesive, inclusive strategy

If the government’s “major new drive on internet safety” is to involve the world’s largest technology companies in order to make the UK the “safest place in the world for young people to go online,” then these strategies and papers must join things up. Above all, a technical knowledge of how the Internet works needs to connect the risks with the benefits, to form a strategy that will actually make children safe and skilled, and able to see into their future.

When it comes to children, there is a further question over consent and parental spyware. Various walk-to-school apps, lauded by the former Secretary of State two years running, use spyware and can be used without a child’s consent. Guardian Gallery, which could be used to scan for nudity in photos on anyone’s phone that the ‘parent’ phone holder has access to install it on, can be made invisible on the ‘child’ phone. Imagine this in coercive relationships.

If these technologies and the online environment are not correctly assessed against “online safety” threat models for all parts of our population, then they fail to address the risk for the most vulnerable, who most need protection.

What will the GDPR really mean for online safety improvement? What will it define as online services for remuneration in the IoT? And who will be considered as children, “targeted at” or “offered to”?

An active decision is required in the UK. Will 16 remain the default age needed for consent to access Information Society Services, or will we adopt 13, which requires a legal change?

As banal as these questions sound, they need close attention and clarity between now and May 25, 2018 if the UK is to be GDPR-ready, so that providers of online services know whom they serve and how they should treat Internet access, participation, and age [parental] verification.

How will the “controller” make “reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child”, “taking into consideration available technology”?
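To make the derogation question concrete, here is a minimal sketch of the decision the GDPR’s Article 8 forces every “information society service” to make: whether consent can come from the child or must be verified from the holder of parental responsibility, at a threshold each member state may set between 13 and 16. The function name, the per-state values and the flow are all hypothetical illustrations, not any provider’s actual implementation.

```python
# Hypothetical sketch of a GDPR Article 8 style age gate.
# The thresholds below are assumptions for illustration only: the UK
# had not yet decided between 13 and 16 at the time of writing.
CONSENT_AGE_BY_STATE = {"UK": 13, "DE": 16, "IE": 16}
DEFAULT_CONSENT_AGE = 16  # GDPR default where no derogation applies

def consent_required_from(age: int, member_state: str) -> str:
    """Return who must give consent for processing this child's data."""
    threshold = CONSENT_AGE_BY_STATE.get(member_state, DEFAULT_CONSENT_AGE)
    if age >= threshold:
        return "child"
    # Below the threshold, the controller must make "reasonable efforts"
    # to verify that the holder of parental responsibility consents.
    return "parent"

print(consent_required_from(14, "UK"))  # child
print(consent_required_from(14, "DE"))  # parent
```

The sketch makes the policy point visible: the same 14-year-old is treated differently depending on a member state’s derogation choice, which is exactly why the UK’s decision between 13 and 16 matters to providers.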

These are fundamental questions of what the Internet is and means to people today. And if the current government approach to security is anything to go by, safety will not mean what we think it will mean.

It will matter how these plans join up. Age verification was not being considered in UK law in relation to how we would derogate from the GDPR even as late as October 2016, despite age verification requirements already being in the Digital Economy Bill. It shows a lack of joined-up digital thinking across our government, and needs to be addressed with urgency to get into the next Parliamentary round.

In recent draft legislation I have yet to see the UK government address Internet rights and safety for young people as anything other than a protection issue, treating the online space in the same way as offline, in real life, focused on stranger danger and sexting.

The UK Digital Strategy commits to the implementation of the General Data Protection Regulation by May 2018, and frames it as a business issue, labelling data as “a global commodity”. As such, its handling is framed solely as a requirement to ensure “that our businesses can continue to compete and communicate effectively around the world” and that adoption “will ensure a shared and higher standard of protection for consumers and their data.”

The Digital Economy Bill, despite being a perfect vehicle for this, has failed to take on children’s rights, and in particular the consent requirements of the GDPR, at all. It was clear that for any future digital transactions we need to level up to the GDPR, not drop to the lowest common denominator between it and existing laws.

It was utterly ignored. So were children’s rights to have their own views heard in the consultation on the GDPR derogations for children, with little chance for involvement from young people’s organisations, and less than a month to respond.

We must now get this right in any new Digital Strategy and bill in the coming parliament.

Gotta know it all? Pokémon GO, privacy and behavioural research

I caught my first Pokémon and I liked it. Well, OK, someone else handed me a phone and insisted I have a go. Turns out my curve ball is pretty good. Pokémon GO is enabling all sorts of new discoveries.

Discoveries reportedly including a dead man, robbery, picking up new friends, and scrapes and bruises. While players are out hunting anime in augmented reality, enjoying the novelty, and discovering interesting fun facts about their vicinity, Pokémon GO is gathering a lot of data. It’s influencing human activity in ways that other games can only envy, taking in-game interaction to a whole new level.

And it’s popular.

But what is it learning about us as we do it?

This week questions have been asked about the depth of interaction that the app gets by accessing users’ log in credentials.

What I would like to know is what access goes in the other direction?

Google, heavily invested in AI and machine intelligence research, notes that with “learning systems placed at the core of interactive services in a fast changing and sometimes adversarial environment, combinations of techniques including deep learning and statistical models need to be combined with ideas from control and game theory.”

The app, which is free to download, has raised concerns over suggestions the app could access a user’s entire Google account, including email and passwords. Then it seemed it couldn’t. But Niantic is reported to have made changes to permissions to limit access to basic profile information anyway.

If Niantic gets access to data owned by Google through its use of Google log-in credentials, does Niantic’s investor, Google’s Alphabet, get the reverse: user data from the Google log-in interaction with the app? And if so, what does Google learn through the interaction?

Who gets access to what data and why?

Brian Crecente writes that Apple, Google and Niantic are likely making more on Pokémon GO than Nintendo, with 30 percent of revenue from in-app purchases on their online stores.

The next step is to make money from marketing deals between Niantic and the offline stores used as in-game focal points, gyms and more, according to Bryan Menegus at Gizmodo, who reported that Redditors had discovered decompiled code in the Android and iOS versions of Pokémon GO earlier this week “that indicated a potential sponsorship deal with global burger chain McDonald’s.”

The logical progression of this is that the offline store partners, i.e. McDonald’s and friends, will be making money from players: the people who get led to their shops, restaurants and cafes, where players will hang out longer than at the Pokéstop, because human interaction with other humans, the battles between your collected creatures, and teamwork are at the heart of the game. Since you can’t visit gyms until you are level 5 and have chosen a team, players are building up profiles over time and getting social in real life, generating location data that may build up patterns about them.

This evening the two players that I spoke to were already real-life friends on their way home from work (that now takes at least an hour longer every evening) and they’re finding the real-life location facts quite fun, including that thing they pass on the bus every day, and umm, the Scientology centre. Well, more about that later**.

Every player I spotted looking at the phone with that finger-flick action gave themselves away with a shared wry smile. All thirty-something men. There is possibly something of a legacy in this, they said, since the initial Pokémon game released 20 years ago is drawing players who were tweens then.

Since the app is online and open to all, children can play too. What this might mean for them in the offline world is something the NSPCC picked up on here before the UK launch. Its focus of concern is the physical safety of young players, citing the risk of in-game ‘lures’ being misused. I am not sure how much of an increased risk this is compared with existing scenarios, or whether children will be increasingly unsupervised or not. It’s not a totally new concept. Players of all ages must be mindful of where they are playing**. People getting together in the small hours of the night has generated some stories, which for now are mostly fun. (Go Red Team.) Others are worried about hacking. And it raises all sorts of questions if private and public space has become a Pokéstop.

While the NSPCC includes considerations on the approach to privacy in a recent, more general review of apps, it hasn’t yet mentioned the less obvious considerations of privacy and ethics in Pokémon GO: encouraging anyone, but particularly children, out of their homes or protected environments and into commercial settings with the explicit aim of targeting their spending. This is big business.

Privacy in Pokémon GO

I think we are yet to see a really transparent discussion of the broader privacy implications of the game because the combination of multiple privacy policies involved is less than transparent. They are long, they seem complete, but are they meaningful?

We can’t see how they interact.

Google has crowdsourced the collection of real-time traffic data via mobile phones. Geolocation data from Google Maps using GPS, as well as network provider data, seem necessary to display the street data to players. Apparently you can download and use the maps offline, since Pokémon GO uses the Google Maps API. Google goes to “great lengths to make sure that imagery is useful, and reflects the world our users explore.” In building a Google virtual-reality copy of the real world, how data are also collected, and will be used, about all of us who live in it is a little woolly to the public.

U.S. Senator Al Franken is apparently already asking Niantic these questions. He points out that Pokémon GO has indicated it shares de-identified and aggregate data with other third parties for a multitude of purposes, but does not describe the purposes for which Pokémon GO would share or sell those data.

It’s widely recognised that anonymisation in many cases fails, so passing only anonymised data may be reassuring in theory but fail in reality. Stripping out what are considered individual personal identifiers in data protection terms can still leave individuals with unique characteristics, or people profiled as groups.
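A toy sketch of why stripping direct identifiers is not enough: a handful of quasi-identifiers (here a hypothetical postcode district, birth year and gender) can still make a record unique, so it re-links to a public list. The data, field names and function are all invented for illustration.

```python
# Toy illustration (invented data): records with names removed can still
# be unique on a few quasi-identifiers, so they re-link to a public list.
deidentified = [
    {"postcode": "N1", "birth_year": 1984, "gender": "F", "visits": 42},
    {"postcode": "N1", "birth_year": 1984, "gender": "M", "visits": 7},
    {"postcode": "SW3", "birth_year": 1990, "gender": "F", "visits": 13},
]

public_register = [
    {"name": "Alice", "postcode": "N1", "birth_year": 1984, "gender": "F"},
    {"name": "Bob", "postcode": "SW3", "birth_year": 1990, "gender": "M"},
]

def reidentify(records, register):
    """Link 'anonymised' rows back to names via quasi-identifiers."""
    keys = ("postcode", "birth_year", "gender")
    index = {tuple(p[k] for k in keys): p["name"] for p in register}
    return {
        index[tuple(r[k] for k in keys)]: r["visits"]
        for r in records
        if tuple(r[k] for k in keys) in index
    }

print(reidentify(deidentified, public_register))  # {'Alice': 42}
```

Three attributes are enough to single Alice out here, which is the well-documented weakness of de-identification that Latanya Sweeney’s work on zip code, birth date and gender made famous.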

Opt-out, he feels, is an inadequate consent model for the personal and geolocation data that the app is collecting and passing to others in the U.S.

While the app provider would, I’m sure, argue that the UK privacy model respects the European opt-in requirement, I would be surprised if many players have read the policy. Privacy policies fail.

Poor practices must be challenged if we are to preserve the integrity of controlling the use of our data and knowledge about ourselves. Being aware of who we have ceded control of marketing to us, or influencing how we might be interacting with our environment, is at least a step towards not blindly giving up control of free choice.

The Pokémon GO permissions, “for the purpose of performing services on our behalf”, “third party service providers to work with us to administer and provide the Services” and “also use location information to improve and personalize our Services for you (or your authorized child)”, are so broad that they could mean almost anything. They can also be changed without any notice period. That makes them pretty meaningless. But it’s the third parties’ connection, the data collection in passing, that is completely hidden from players.

If we are ever to use privacy policies as meaningful tools to enable consent, then they must be transparent to show how a chain of permissions between companies connect their services.

Otherwise they are no more than get out of jail free cards for the companies that trade our data behind the scenes, if we were ever to claim for its misuse.  Data collectors must improve transparency.

Behavioural tracking and trust

Covert data collection and interaction is not conducive to user trust, whether through a failure to communicate by design or not.

By combining location data and behavioural data, measuring footfall is described as “the holy grail for retailers and landlords alike”, and it is valuable. “Pavement opportunity” data may be sent anonymously, but if its analysis and storage provide ways to pitch to people, even without knowing who they are individually, or to groups of people, it is discriminatory and potentially invisibly predatory. The pedestrian, or the player, Jo Public, is a commercial opportunity.

Pokémon GO has potential to connect the opportunity for profit makers with our pockets like never before. But they’re not alone.

Who else is getting our location data that we didn’t sign up to share “in 81 towns and cities across Great Britain”?

Whether it’s footfall outside the shops, or packaged as a game that gets us inside them, public interest researchers and commercial companies alike risk losing our trust if we feel used as pieces in a game that we didn’t knowingly sign up to. It’s creepy.

For children the ethical implications are even greater.

There are obligations to meet higher legal and ethical standards when processing children’s data and presenting marketing to them. Parental consent requirements fail children for a range of reasons.

So far, the UK has said it will implement the EU GDPR. Clear and affirmative consent is needed. Parental consent will be required for the processing of personal data of children under age 16. EU Member States may lower the age requiring parental consent to 13, so what that will mean for children here in the UK is unknown.

The ethics of product placement and marketing rules to children of all ages go out the window however, when the whole game or programme is one long animated advert. On children’s television and YouTube, content producers have turned brand product placement into programmes: My Little Pony, Barbie, Playmobil and many more.

Alice Webb, Director of BBC Children’s and BBC North, looked at some of these challenges, as the BBC considers how to deliver content for children whilst adapting to technological advances, in this LSE blog, alongside the publication of a new policy brief about families and ‘screen time’ by Alicia Blum-Ross and Sonia Livingstone.

So is this augmented reality any different from other platforms?

Yes, because you can’t play the game without accepting the use of the maps and, by default, some sacrifice of your privacy settings.

Yes, because the ethics and implications of putting kids not simply in front of a screen that pitches products to them, but physically into the place where they can consume products (if the McDonald’s story is correct and a taster of what will follow) are huge.

Boundaries between platforms and people

Blum-Ross says, “To young people, the boundaries and distinctions that have traditionally been established between genres, platforms and devices mean nothing; ditto the reasoning behind the watershed system with its roots in decisions about suitability of content.”

She’s right. And if those boundaries and distinctions mean nothing to providers, then we must have that honest conversation with urgency. With our contrived consent, walking and running and driving without coercion, we are being packaged up and delivered right to the door of for-profit firms, paying for the game with our privacy. Smart cities are exploiting street sensors to do the same.

Free will is at the very heart of who we are: “The ability to choose between different possible courses of action. It is closely linked to the concepts of responsibility, praise, guilt, sin, and other judgments which apply only to actions that are freely chosen.” Free choice of where we shop, what we buy and who we interact with is open to influence. Influence that is not entirely transparent presents an opportunity for hidden manipulation. While the NSPCC might be worried about the risk of rare physical threats, the potential for influencing all children’s behaviour, both positively and negatively, reaches everyone.

Some stories of how behaviour is affected are heartbreakingly positive. And I met and chatted with complete strangers who shared the joy of something new and a mutual curiosity about the game. Pokémon GO is clearly a lot of fun. It’s also unclear on much more.

I would like to explicitly understand if Pokémon GO is gift packaging behavioural research by piggybacking on the Google platforms that underpin it, and providing linked data to Google or third parties.

Fishing for frequent Pokémon encourages players to ‘check in’ and keep that behaviour tracking live. 4pm caught a Krabby in the closet at work. 6pm another Krabby. Yup, still at work. 6.32pm Pidgey on the street outside ThatGreenCoffeeShop. Monday to Friday.

The Google privacy policies, changed in the last year, require ten clicks to opt out and, in part, the download of an add-on. Google has our contacts, calendar events, web searches and health data, and has invested in our genetics: all the things that make you “you”. They have our history, and are collecting our present. Machine intelligence work on prediction is the future. For now, perhaps that will be pinging you with a ‘buy one get one free’ voucher at 6.20, or LCD adverts shifting as you drive back home.

Pokémon GO doesn’t have to include what data Google collects in its privacy policy; it’s in Google’s privacy policy. And who really read that when it came out months ago, or knows what it means in combination with the new apps and games we connect it to today? Tracking and linking data on geolocation, behavioural patterns, footfall, whose other phones are close by, who we contact, and potentially even our spend from Google Wallet.

Have Google and friends of Niantic gotta know it all?

The illusion that might cheat us: ethical data science vision and practice

This blog post is also available as an audio file on soundcloud.


Anaïs Nin wrote in her 1946 diary of the dangers she saw in the growth of technology: expanding our potential for connectivity through machines, but diminishing our genuine connectedness as people. She could hardly have been more contemporary for today:

“This is the illusion that might cheat us of being in touch deeply with the one breathing next to us. The dangerous time when mechanical voices, radios, telephone, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision.”
[Extract from volume IV 1944-1947]

Echoes from over 70 years ago, can be heard in the more recent comments of entrepreneur Elon Musk. Both are concerned with simulation, a lack of connection between the perceived, and reality, and the jeopardy this presents for humanity. But both also have a dream. A dream based on the positive potential society has.

How will we use our potential?

Data is the connection between us as humans and what machines and their masters know about us. The values with which those masters underpin their machine design will determine the effect the machines, and the knowledge they deliver, have on society.

In seeking ever greater personalisation, a wider dragnet of data is putting together ever more detailed pieces of information about an individual person. At the same time data science is becoming ever more impersonal in how we treat people as individuals. We risk losing sight of how we respect and treat the very people whom the work should benefit.

Nin grasped the risk that a wider reach can mean more superficial depth. Facebook might be a model today: a large circle of friends you can gather, against how few you trust with confidences, with knowledge about your personal life, and the privilege it is when someone chooses to entrust that knowledge to you. Machine data mining increasingly tries to get an understanding of depth, and may also add new layers of meaning through profiling, comparing our characteristics with others in risk stratification.

Data science, research using data, is often talked about as if it were something separate from using information from individual people. Yet it is all about exploiting those confidences.

Today, as the reach of what a few people in institutions can gather about most of the public has grown, whether in scientific research or in surveillance of different kinds, we hear experts repeatedly talk of the risk of losing the valuable part: the knowledge, the insights that benefit society if we can act upon them.

We might know more, but do we know any better? To use a well known quote from her contemporary, T S Eliot, ‘Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?’

What can humans achieve? We don’t yet know our own limits. What don’t we yet know?  We have future priorities we aren’t yet aware of.

To be able to explore the best of what Nin saw as ‘human vision’ and Musk sees in technology, the benefits we have from our connectivity, our collaboration and shared learning, need to be driven with an element of humility: accepting values that shape the boundaries of what we should do, while constantly evolving with what we could do.

The essence of this applied risk is that technology could harm you, more than it helps you. How do we avoid this and develop instead the best of what human vision makes possible? Can we also exceed our own expectations of today, to advance in moral progress?


OkCupid and Google DeepMind: Happily ever after? Purposes and ethics in datasharing

This blog post is also available as an audio file on soundcloud.


What constitutes the public interest must be set in a universally fair and transparent ethics framework if the benefits of research, whether in social science, health, education and more, are to be realised. That framework will provide a strategy for getting the prerequisite success factors right, ensuring research in the public interest is not only fit for the future, but thrives. There has been a climate change in consent. We need to stop talking about barriers that prevent data sharing and start talking about the boundaries within which we can share.

What is the purpose for which I provide my personal data?

‘We use math to get you dates’, says OkCupid’s tagline.

That’s the purpose of the site. It’s the reason people log in and create a profile, enter their personal data and post it online for others who are looking for dates to see. The purpose, is to get a date.

When over 68,000 OkCupid users registered for the site to find dates, they didn’t sign up to have their identifiable data used and published in ‘a very large dataset’ and onwardly re-used by anyone with unregistered access. The users’ data were extracted “without the express prior consent of the user […].”

Whether the registration consent purposes are compatible with the purposes to which the researcher put the data should be a simple enough question. Are the research purposes what the person signed up to, or would they be surprised to find their data were used like this?

Questions the “OkCupid data snatcher”, now self-confessed ‘non-academic’ researcher, thought unimportant to consider.

But it appears in the last month, he has been in good company.

Google DeepMind and the Royal Free, big players who ought to know how to handle data and consent well, paid too little attention to the very same question of purposes.

The boundaries of how the users of OkCupid had chosen to reveal information and to whom, have not been respected in this project.

Nor were these boundaries respected by the Royal Free London trust that gave out patient data for use by Google DeepMind with changing explanations, without clear purposes or permission.

The legal boundaries in these recent stories appear unclear or to have been ignored. The privacy boundaries deemed irrelevant. Regulatory oversight lacking.

The respectful ethical boundaries of consent to purposes, and of autonomy, have indisputably broken down, whether by commercial organisation, public body, or lone ‘researcher’.

Research purposes

The crux of data access decisions is purposes. What question is the research to address – what is the purpose for which the data will be used? The intent by Kirkegaard was to test:

“the relationship of cognitive ability to religious beliefs and political interest/participation…”

In this case the question appears intended rather as a test of the data, than the data being opened up to answer the question. While methodological studies matter, given the care and attention [or self-stated lack thereof] given to the extraction, and any attempt to be representative and fair, it would appear this is not the point of the study either.

The data doesn’t include profiles identified as heterosexual male, because ‘the scraper was’. It is also unknown how many users hide their profiles, “so the 99.7% figure [identifying as binary male or female] should be cautiously interpreted.”

“Furthermore, due to the way we sampled the data from the site, it is not even representative of the users on the site, because users who answered more questions are overrepresented.” [sic]

The paper goes on to say photos were not gathered because they would have taken up a lot of storage space and could be done in a future scraping, and

“other data were not collected because we forgot to include them in the scraper.”

The data are knowingly of poor quality, inaccurate and incomplete. The project cannot be repeated, as ‘the scraping tool no longer works’. There is no clear ethical or peer review process, and the research purpose is at best unclear. We can certainly give someone the benefit of the doubt and assume the intent was entirely benevolent; it is simply not stated. I think it was clearly misplaced and foolish, but not malevolent.

The trouble is, it’s not enough to say, “don’t be evil.” These actions have consequences.

When the researcher asserts in his paper that, “the lack of data sharing probably slows down the progress of science immensely because other researchers would use the data if they could,”  in part he is right.

Google and the Royal Free have tried more eloquently to say the same thing. It’s not research, it’s direct care; in effect: ignore that people are no longer our patients and that we’re using historical data without re-consent. We know what we’re doing; we’re the good guys.

However the principles are the same, whether it’s a lone project or global giant. And they’re both wildly wrong as well. More people must take this on board. It’s the reason the public interest needs the Dame Fiona Caldicott review published sooner rather than later.

Just because there is a boundary to data sharing in place, does not mean it is a barrier to be ignored or overcome. Like the registration step to the OkCupid site, consent and the right to opt out of medical research in England and Wales is there for a reason.

We’re desperate to build public trust in UK research right now. So to assert that the lack of data sharing probably slows down the progress of science is misplaced, when it is getting ‘sharing’ wrong that caused the lack of trust in the first place, and that harms research.

A climate change in consent

There has been a climate change in public attitude to consent since care.data, clouded by the smoke and mirrors of state surveillance. It cannot be ignored. The EU GDPR supports it. Researchers may not like change, but there needs to be a corresponding adjustment in expectations and practice.

Without action, there will be no change. Public trust is low. As technology advances, and if we continue to see commercial companies get this wrong, public trust will continue to falter unless broken things get fixed. Change for the better is possible. But it has to come from companies, institutions, and the people within them.

Like climate change, you may deny it if you choose to. But some things are inevitable and unavoidably true.

There is strong support for public interest research but that is not to be taken for granted. Public bodies should defend research from being sunk by commercial misappropriation if they want to future-proof public interest research.

The purposes for which people gave consent are the boundaries within which you have permission to use data; within those limits, they give you the freedom to use it. Purposes and consent are not barriers to be overcome.

If research is to win back public trust, developing a future-proofed, robust ethical framework for data science must be a priority today.

Commercial companies must overcome the low levels of public trust they have generated to date if they ask us to ‘trust us because we’re not evil‘. If you can’t rule out the use of data for other purposes, it’s not helping. If you delay independent oversight, it’s not helping.

This case study, and by contrast the recent Google DeepMind episode, demonstrate the urgency of working out common expectations and oversight of applied ethics in research, deciding who gets to decide what is ‘in the public interest’, and making data science public engagement a priority, in the UK and beyond.

Boundaries in the best interest of the subject and the user

Society needs research in the public interest. We need good decisions made on what will be funded and what will not be, on what will influence public policy, and on where change is needed.

To do this ethically, we all need to agree what is fair use of personal data, when it is closed and when it is open, what are direct and what are secondary uses, and how advances in technology should be used when they present both opportunities for benefit and risks of harm to individuals, to society and to research as a whole.

The benefits of research are potentially being compromised for the sake of arrogance, greed, or misjudgement, no matter the intent. Those benefits cannot come at any cost, or disregard public concern, or the price will be trust in all research itself.

In discussing this with social science and medical researchers, I realise not everyone agrees. For some, using de-identified data in trusted third-party settings poses such a low privacy risk that they feel the public should have no say in whether their data are used in research, as long as it’s ‘in the public interest’.

The DeepMind researchers and the Royal Free were confident that even using identifiable data, without consent, was the “right” thing to do.

In the Cabinet Office datasharing consultation, for the parts that will open up national registries and share identifiable data more widely, including with commercial companies, the authors are convinced it is all the “right” thing to do, without consent.

How can researchers, society and government understand what good ethics of data science look like, as technology permits ever more invasive or covert data mining while the current approach is desperately outdated?

Who decides where those boundaries lie?

“It’s research Jim, but not as we know it.” This is one aspect of data use that ethical reviewers will need to deal with as we advance the debate on data science in the UK, whether for independents or commercial organisations. Google said their work was not research. Is ‘OkCupid’ research?

If this research and data publication proves anything at all, and can offer lessons to learn from, it is perhaps these three things:

Who is accredited as a researcher or ‘prescribed person’ matters, especially if we are considering new datasharing legislation and, for example, to whom the UK government is granting access to millions of children’s personal data today. Your idea of a ‘prescribed person’ may not be the same as the rest of the public’s.

Researchers and ethics committees need to adjust to the climate change of public consent. Purposes must be respected in research particularly when sharing sensitive, identifiable data, and there should be no assumptions made that differ from the original purposes when users give consent.

Data ethics and laws are desperately behind data science technology. Governments, institutions, and civil society need to reach a common vision, and show leadership, on how to manage these challenges. Who defines these boundaries that matter?

How do we move forward towards better use of data?

Our data and technology are taking on a life of their own, in space which is another frontier, and in time, as data gathered in the past might be used for quite different purposes today.

The public are being left behind in the game-changing decisions made by those who deem they know best about the world we want to live in. We need a say in what shape society wants that to take, particularly for our children as it is their future we are deciding now.

How about an ethical framework for datasharing that supports a transparent public interest, which tries to build a little kinder, less discriminating, more just world, where hope is stronger than fear?

Working with people, with consent, with public support and transparent oversight shouldn’t be too much to ask. Perhaps it is naive, but I believe that with an independent ethical driver behind good decision-making, we could get closer to datasharing like that.

That would bring Better use of data in government.

Purposes and consent are not barriers to be overcome. Within them, shaped by a strong ethical framework, good data sharing practices can tackle some of the real challenges that hinder ‘good use of data’: training, understanding data protection law, communications, accountability and intra-organisational trust. More data sharing alone won’t fix these structural weaknesses in current UK datasharing, which are the really tough barriers to good practice.

How our public data will be used in the public interest will not be a destination or have a well-defined happy ending; it is a long-term process which needs to be consensual, with a clear path to setting out together and achieving collaborative solutions.

While we are all different, I believe that society shares for the most part, commonalities in what we accept as good, and fair, and what we believe is important. The family sitting next to me have just counted out their money and bought an ice cream to share, and the staff gave them two. The little girl is beaming. It seems that even when things are difficult, there is always hope things can be better. And there is always love.

Even if some might give it a bad name.

********

img credit: flickr/sofi01/ Beauty and The Beast  under creative commons

Thoughts on Digital Participation and Health Literacy: Opportunities for engaging citizens in the NHS [#NHSWDP 1]

“..smartphones […] the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond “

That’s what Simon Stevens said at a meeting on “digital participation and health literacy: opportunities for engaging citizens” in the National Health Service this week, at the King’s Fund in London.

It seemed a passing comment, but its enormity, coming from the Chief Executive of the commissioning body for the NHS, made me catch my breath.

Other than inspiration from the brilliance of Helen Milner, Chief Executive of the Tinder Foundation – the only speaker who touched on the importance of language around digital participation – what did I take away from the meeting?

The full text of Simon Stevens’ speech is at the end of this post, but he didn’t elaborate further on this comment.

Where to start?

The first thing I took away to think about, was the impact of the statement. 

“the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond “

So I thought about that more in a separate post, part two.

The second, was on consent.

This tied into the statement by Tim Kelsey, Director of Patients and Information at NHS England. It seems that the era when consent will be king is fast approaching, and I thought about this more in part three.

The third key learning of the day, which almost everyone I met voiced to me, was that the “best bit of these events is the learnings outside the sessions, from each other. From other people you meet.”

That included Roger, who we met via video, and GP Dr Ollie Hart. All the tweeps I’ve now met in real life and, as Roz said, they didn’t disappoint. People with experience and expertise in their fields. All motivated to make things better and make things work, around digital, for people.

It’s really important when thinking about ‘digital’ to remember it doesn’t necessarily mean remote, or reducing the people-time involved.

Change happens through people. Not necessarily seen as ‘clients’ or ‘consumers’ or even ‘customers’. How human interaction is supported by or may be replaced by digital contact fascinates me.

My fourth learning was about how to think about data collection and use in a personalised digital world.

Something which will be useful in my new lay role on the ADRN approvals panel (which I’m delighted to take on and pretty excited about).

Data collection is undergoing a slow but long term sea change, in content, access, expectations, security & use.

Where, for whom, and from whom data is collected varies enormously. It’s going to vary even more in future if some have free access to apps and wifi, while others are digitally excluded.

For now, the overall effect is perhaps only ripples on the surface (like interruptions to long-term research projects due to HSCIC data stops after the care.data outcry), but research direction, and currents of thought, may shift fundamentally if how we collect data changes radically for even small pockets of society, or the ‘worried well’.

My fifth learning, was less a learning and more the triggering of lots of questions on wearables about which I want to learn more.

#digitalinclusion is clearly less about a narrow focus on apps than applied skills and online access.

But I came away wondering how apps will affect research and the NHS in the UK, and much more.

[Next: part two #NHSWDP 2: Smartphones: the single most important health treatment & diagnostic tool at our disposal – on wearables]

[And: part three #NHSWDP 3: Wearables & Consent: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care?]

*****

Full text of the speech given by Simon Stevens, Keynote speaker:

“The reality is we all can see that we’ve got to change […] as part of that we have got to have more integrated services, between primary and specialist services, between physical and mental health services, and between health and social care services.

“And the guiding principle of that integration has got to be care that is personal, and coordinated around individuals, with leadership of communities and patient groups.

“There is no way that can happen without a strong, technological underpinning using the information revolution which is sweeping just about every other part of the economy.

“We are not unusual in this country in having a health sector which has been a little slower, in some respects, than many other parts of national life to take full advantage of that.

“We are not unusual, because that is the experience of health services in every industrialised country.

“We obviously have a huge opportunity, and have a comparative advantage in the way that the NHS is organised, to put that right.

“We know that 8 out of 10 adults are now online, we know that two thirds of people in this country have got smartphones which is going to be the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond.

“But we know we have got 6.4m people who are not.

“And so when you of course then get serious about who are those six and a half million people, many of them are our highest users of services with the greatest needs.

“So this is not an optional extra. This has got to be central about what the redesign of care looks like, with a fundamental power shift actually, in the way in which services are produced and co-produced.

“This agenda goes to the heart of what we’ve got to get right, not just on inequalities but around co-production of services and the welcome steps that have been taken by the organisations involved, I think that the point is obviously we have now got to scale this in a much more fundamental fashion, but when you look at the impact of what has already been achieved, and some of the work that has already been done by the Tinder Foundation, you take some of the examples here, with the Sikh community in  Leicester around diabetes, and parenting in other parts of the country, you can see that this is an agenda which can potentially get real quite quickly and can have quite a big impact.

“The early evaluation anyway indicates that about half of people involved say they are leading healthier lives on the back of it, 48% in healthy eating, a third do more physical activity, 72% say they have saved money or time.

“Given that we are often talking about resource poor, time poor communities, that is hugely impactful as well.

“So my role here today, I think is simply to underline the weight that we place on this, as NHS England nationally, to thank all of you for the engagement that you have been having with us, and to learn from the discussion we are about to have as what you see where you see key priorities and what you need from us.”

[March 18, 2015 at the event “Digital Participation and Health Literacy: Opportunities for engaging citizens” held at the King’s Fund, London]

 

care.data – one of our business cases is missing

“The government takes the view that transparency is vital to healthy public services. It has created a new Statistics Commission to improve the quality of information collected (and to end arguments about “fiddling” figures).” [Tim Kelsey, New Statesman, 2001] [1]

In a time of continuing cuts to budgets across the public sector, members of the public have every right, and good sense, to question how public money is spent and what its justification is. [#NHS2billion]

For the flagship data extraction care.data programme, it is therefore all the more surprising, that for the short and long term there is [2]:

a) no public proof of how much the programme is costing,
b) little around measurable tangible and intangible benefits,
c) or how the risks have been evaluated.

The Woolly Mammoth in the Room

The care.data programme has been running under its ‘toxic’ [3] brand in a similar form now, for two years.

When asked directly on costs at the Health Select Committee last month, the answer was, at best, woolly.

“Q655   Rosie Cooper: While I appreciate that, can you give us any rough figures? What would a CCG be contributing to this?

Tim Kelsey: I cannot answer that question, but we will very rapidly come back to you with the CCGs’ own estimates of the costs of the programme and how much of that cost is being met by the programme.” [Hansard January 2015][4]

The department appears very unwilling to make public and transparent its plans, risks and costs. I’ve been asking for them since October 2014, in a freedom of information request. [5]

They are still not open. Waiting very much longer will look decidedly shady.

A few limited and heavily redacted parts were released [2] in poor quality .pdf files in Jan 2015, and don’t meet my request as there’s nothing from April-October 2014, and many missing files:

Transparent?

As I followed the minutes and materials released over the last 18 months, this was a monstrous gap [7], so I have asked for it before. [8]

I had imagined there was reticence in making it public.
I had imagined, the numbers may be vague.
I hadn’t imagined it just didn’t exist at all.

For the programme whose watchword is transparency, this is more than a little surprising. A plan had to be drafted to drive transparency after the FOI was received [which I believe fails the section 22 refusal criteria, as the decision to publish was made after the FOI]

– here’s the plan [9] – where are the outcomes?

Is the claim that without care.data the NHS will fail, [10] no more than a myth?

 

Why does the business case and cost/risk analysis matter? What is the future of our data ownership?

 

Because history has a habit of repeating itself and there is a terrible track record in NHS IT which the public cannot afford [22] to allow to repeat, ever again.

The mentality that allows these unaccountable monster programmes to grow unchecked must die out.

Of the NPfIT, Mr Bacon MP said: “This saga is one of the worst and most expensive contracting fiascos in the history of the public sector.”

Last autumn, a new case history [23] examined its rollout, including why local IT systems fail to deliver joined-up digital patient records.

Yet, even today, as we hear that IT is critical to the digital delivery of NHS care and we must all be able to access our own health records, we read that tech funds are being cut.

Where are the common sense and cohesion in their business planning?

These Big Data programmes do not stand alone, but interact with all sorts of other programmes, policies, and ideas on what will be done and what is possible in future for long term data purposes.

The public is not privy to that, so cannot scrutinise, criticise and positively contribute to plans. That seems short-sighted.

And what of previous data-based ventures? Take as a case study the Dr. Foster IC Joint Venture [NAO, February 2007] [24]

“The Information Centre spent £2.5 million on legal and consultancy advice in developing the joint venture, and setting up the Information Centre. The Information Centre contends that £855,000 of the money paid to KPMG was associated with costs for setting up the Information Centre which included business planning.

However, they could not provide an explicit breakdown of these costs […] We therefore calculate that the total cost to the taxpayer of a 50 per cent share is between £15.4 million and £16.3 million.”

“The Information Centre paid £12 million in cash for a 50 per cent share of the joint venture (see Figure 2 overleaf).”

UK plc made a sizeable investment here. The UK state invested UK taxes in this firm – so what’s the current business case for using data? How transparent are our current state assets and risks?

As a shareholder in one half, it is fair to ask with whom we are now sharing the investment risk, or whether this part was sold on soon after. [25] Was that investment a long-term one, or always meant to be short term, and are there any implications for the future of HSCIC?

According to this 2011 report [26], another investment group, Bamboo Holdings [related to other investor companies], wanted but did not succeed in selling its Dr. Foster stock at an acceptable price, due, in the words of the portfolio introduction, to ‘poor performance’. [Annual investor review from 2013, p.5]

So what risks does the market see as a whole which are not made available to the public which affect how data is used and shared?

What of the other parts of Dr. Foster Research, and so on, that we, the state, went on to buy or sell later? It appears complex.

Is the commercial benefit to be made by private companies seen as part of the big-picture benefit to UK plc, or where do state investment and the expectation of economic growth fit in?

What assessment has been made of the app market in the NHS and how patient data is expected in future to be held by the individual, released by personal choice to providers through phones?

Is a state infrastructure being built which, in the surprisingly short term, may see few healthy people storing their data in it? Or will we see a bias that excludes those with the money and technology to opt out, who prefer to keep their health data in a handheld device?

What is the government’s plan for the future of the HSCIC and the data it manages for us? The provider Northgate was just bought by European private equity firm Cinven, which now manages a huge swathe of the UK’s data [32], while HSCIC brought other services in-house. [33]

“Its software and services are used by over 400 UK local authorities, all UK police forces, social housing providers in the UK and internationally, and NHS hospitals. Its IT projects support the sharing of information for criminal intelligence and investigations across UK police forces and the management of health screening records in the UK and in Ireland.”

All the easier to manage – or to manage to sell off?

Is the business plan future-proofed to survive the new age of health data management?

One of the problems with business cases for programmes which drag on and get swamped down in delays, is they become obsolete.

The one-year mark has now passed since the care.data pause, announced on February 18th 2014.

The letter from Mr. Kelsey on April 14th 2014 said they would use the six months to listen and act on the views of patients, the public, GPs and stakeholders.

Many of the open questions remain without any reply at all, never mind public answers or solutions to open issues.

The spine proposal by medConfidential [30] is one of the best and clearest proposals I have found, with practical solutions, for example to the failed 9Nu4 opt-out.

Will these be addressed, or will NHS England answer the Data Guardian report and 27 questions [31] from December?

Is care.data arthritic, or going quietly extinct? The last public information made available is that it is rolling on in the background towards the pathfinders.

“By when will NHS England commit to respect the 700,000 objections to secondary data sharing already logged but not enacted?” [updated ref June 6th 2015]

How is the business plan kept up to date as the market moves on?

Is Big Data in the NHS too big to survive or has the programme learned to adapt and changed?

As Peter Mills asked a year ago, “Is the Government going to take this, as a live issue, into the next general election? Or will it (like the National Programme for IT) continue piecemeal, albeit without the toxic ‘care.data’ banner? “

The care.data programme board transparency agenda in Nov 2014: “The care.data programme has yet to routinely publish agendas, minutes, highlight reports and finalised papers which arise from the care.data Programme Board.

“This may lead to external stakeholders and members of the public having a lack of confidence in the transparency of the programme.”

We all recognise the problem, but where’s the solution?

Where’s the cost, benefit and risk analysis?

Dear NHS England. One of your business cases is missing.
Why has the public not seen it?
Why are you making it hard to hunt down?
Why has transparency been gagged?

Like Dippy, the care.data business case belongs in the public domain, not hidden in a back room.

Like the NHS, the care.data full risk & planning files belong to us all.

Or is the truth that, like Nessie, despite wild claims, they may not actually exist?

***

more detail:

[1] New Statesman article, Tim Kelsey, 2001

[2] http://www.england.nhs.uk/ourwork/tsd/care-data/prog-board/ care.data programme board webpage

[3] http://www.infosecurity-magazine.com/news/nhs-caredata-pr-fiasco-continues/

[4] http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/health-committee/handling-of-nhs-patient-data/oral/17740.html

[5] https://www.whatdotheyknow.com/request/caredata_programme_board_minutes?nocache=incoming-621173#incoming-621173

[6] http://www.england.nhs.uk/wp-content/uploads/2015/02/cd-prog-brd-highlt-rep-15-12-14.pdf

[7] http://www.telegraph.co.uk/news/science/science-news/11377168/Natural-History-Museums-star-Dippy-the-dinosaur-to-retire.html

[8] http://jenpersson.com/care-data-postings-summary/

[9] http://www.england.nhs.uk/wp-content/uploads/2015/02/propsl-transpncy-pub-cd-papers.pdf

[10] http://www.computerweekly.com/news/2240215074/NHS-England-admits-failure-to-explain-benefits-of-caredata

[11] http://nuffieldbioethics.org/blog/2014/care-data-whats-in-a-dot-and-whats/

[12] http://www.theinformationdaily.com/2014/03/26/business-scents-boom-in-personal-information-economy

[13] http://www.hscic.gov.uk/article/3887/HSCIC-publishes-strategy-for-2013-2015

[14] http://jenpersson.com/flagship-care-data-2-commercial-practice/

[15] http://www.publications.parliament.uk/pa/ld201415/ldhansrd/text/141015-0001.htm

[16] http://www.publications.parliament.uk/pa/ld201415/ldhansrd/text/141015-0001.htm

[17] http://www.legislation.gov.uk/ukpga/2014/23/pdfs/ukpga_20140023_en.pdf

[18] http://jenpersson.com/hear-evil-evil-speak-evil/

[19] https://www.whatdotheyknow.com/request/nhs_patient_data_sharing_with_us

[20] http://www.hscic.gov.uk/hesdatadictionary

[21] http://www.bbc.co.uk/news/uk-politics-24130684

[22]  http://www.nao.org.uk/wp-content/uploads/2007/02/0607151.pdf

[23] http://www.cl.cam.ac.uk/~rja14/Papers/npfit-mpp-2014-case-history.pdf

[24] http://www.nao.org.uk/wp-content/uploads/2007/02/0607151.pdf

[25] http://www.healthpolicyinsight.com/?q=node/688

[26] http://www.albion-ventures.co.uk/ourfunds/pdf%20bamboo/Bamboo%20IOM%20signed%20interims%2030611.pdf

[27] http://www.v3.co.uk/v3-uk/news/2370877/nhs-needs-patients-digital-data-to-survive-warns-health-chief

[28] http://uk.emc.com/campaign/global/NHS-Healthcare-Report-2014/index.htm

[29] http://uk.emc.com/campaign/global/NHS-Healthcare-Report-2014/index.htm

[30] https://medconfidential.org/wp-content/uploads/2015/01/2015-01-29-A-short-proposal.pdf

[31] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/389219/IIGOP_care.data.pdf

[32] http://www.privateequitywire.co.uk/2014/12/23/215235/cinven-acquire-northgate-public-services

[33] http://www.ehi.co.uk/news/EHI/9886/hscic-starts-sus-and-care-id-transfer

 

Patient questions on care.data – an open letter

Dear NHS England Patients & Information Directorate,

We’ve been very patient patients in the care.data pause. Please can we have some answers now?

I would like to call for greater transparency and openness about the promises made to the public, project processes & policies and your care.data communication plans.

In 2013, in the Health Service Journal Mr. Kelsey wrote:

“When patients are ignored, they are most at risk; that was the central conclusion of the report by Robert Francis into Stafford hospital.

Don Berwick, in his safety review, said the NHS should be “engaging, empowering and hearing patients and their carers all the time”.

“That has been my mission since I started as National Director for Patients and Information: to support health and care services transform transparency and participation.

HSJ, 10th December 2013

It is time to walk the talk for care.data, under this banner of transparency, participation and open government.

Response to the Listening exercises

The care.data listening phase, introduced by the pause announced on February 18th, has captured a mass of questions, the majority of which remain unaddressed.

At one of these sessions, [the 1-hr session on June 17th Open House, linking ca. 100 people at each of the locations in Basingstoke, Leicester, London, and York] participants were promised that our feedback would be shared with us later in the summer, and posted online. After the NHS AGM on Sept 18th I was told it would happen ‘soon’. It is still not in the public domain.

At every meeting all the unanswered questions, on post-it notes, in table-group minutes or scribbled flipcharts, were gathered ‘to be answered at a later date’. When will that be?

To date, there has been no published information which addresses the unanswered event questions.

Transparency of Process, Policies and Approach

The care.data Programme Board has held meetings to plan the rollout process, policies and approach, but the minutes and materials from these have not been published. I find this astonishing when one considers that the minutes of the care.data advisory group, NIB (new), CAG, GPES advisory group, or even the NHS England Board itself are in the public domain. I believe the care.data Programme Board meeting materials should be too.

It was acknowledged through the Partridge Review of past uses of our hospital records that this HES data is not anonymous. The extent of its sale to commercial third parties and its use by police and the Home Office was revealed. This is our medical data, which we gave to hospitals for our care and wider medical use. Why are we the last to hear it’s being accessed by all sorts of people who are not at all involved in our clinical care?

Even for commissioning purposes, it is unclear how these datasharing reasons are justified, when the Caldicott Review said extracting identifiable data for risk stratification or commissioning could not be assumed under some sort of ‘consent deal’.

“The Review Panel found that commissioners do not need dispensation from confidentiality, human rights and data protection law…” [The Information Governance review, ch7]

The 251 approval just got extended *again* – until 30th April 2015. If you can’t legally extract data without repeat approvals from on high, then maybe it’s time to question why?

The DoH, NHS England Patients and Information Directorate, HSCIC, and indeed many data recipients, all appear to have normalised an approach that for many is still a shock. The state centralised our medical records and passed them on to others without our knowledge or permission. For years. With financial exchange.

Amazingly, it continues to be released in this way today, still without our consent, fair processing, or a publicised way to opt out.

“To earn the public’s trust in future we must be able to show that our controls are meticulous, fool-proof and solid as a rock.”  said Sir Nick Partridge in his summary review.

Now you ask us to trust that in care.data the GP data, a degree more personal, will be used properly.

Yet you ask us to do this without significant changes in legislation to safeguard tightly defined purposes, who can access the data and why, how we control what future changes may be made without our knowledge, and without a legally guaranteed opt-out.

There is no information about what social care dataset is to be included in future, so how can we know what care.data’s scope even is yet?

Transparency cannot be a convenient watchword which applies with caveats. Quid pro quo: if you want our data under an assumed consent process, then guarantee a genuinely informed public.

You can’t tell patients one approach now, then plan to change what will be said after the pilot is complete, knowingly planning a wider scope to include musculoskeletal or social care data and more. Or knowing you plan to broaden the users of data [like research and health intelligence, currently under discussion at IAG] but only communicate a smaller version in the pilot. That is like cheating on a diet. You can’t say and do one thing in public, then have your cake and eat it later when no one is looking. It still counts.

In these processes, policies and approach, I don’t feel my trust can be won back with a lack of openness and transparency. I don’t yet see a system which is ‘meticulous, fool-proof or solid as a rock’.

‘Pathfinder’ pilots

Most recently you have announced that four CCG areas will pilot the ‘pathfinder’ stage in the rollout of phase one. But where and when remains a mystery. Pathfinder communications methods may vary from place to place, to trial what works and what fails. One commendable method will be a written letter.

However even given that individual notice intent, we cannot ignore that many remaining questions will be hard to address in a leaflet or letter. They certainly won’t fit into an SMS text.

Why pilot communications at all, if they will leave unanswered the same open questions you already know about, but have not answered?

For example, let’s get a few of the missing processes clarified up front:

  • How will you communicate with Gillick competent children, whose records may contain information about which their parents are not aware?
  • How will you manage this for elderly or vulnerable patients in care homes and with diminished awareness or responsibility?
  • What of the vulnerable at risk of domestic abuse and coercion?
  • When things change in scope or use, how will we be given the choice to change our opt out decision?

I ask you not to ignore the processes which remain open. They need to be addressed BEFORE the pilot, unless you want people to opt out on the basis of their uncertainty and confusion.

What you do now will set the model expectations for future communications. Patient Online. Personalised medicine. If NHS health and social care is to become all about the individual, will you address all individuals equally, or is reaching some less important than others?

It seems there is time and effort for talking to other professionals about big data, but not to us, whose data it is. Dear Patients & Information Directorate: you need to be talking to us before you talk to others about how to use our data.

In March, this twelve point plan made some sensible suggestions.

Many of them remain unaddressed. You could start there. But in addition, before getting into communications tools, it must be clear what the pathfinders are actually piloting.

You can’t pilot communications without clearly defined contents to talk about.

Questions of substance need answers, the ten below to start with.

What determines that patients understand the programme and are genuinely informed, and how will it be measured?

Is it assumed that pilots will proceed to extraction? Or will the fair processing efforts be evaluated first, with the effort versus cost taken into account in deciding whether it is worth proceeding at all?

Given the cost involved, and legal data protection requirements, surely the latter? But the pathfinder action plan conflates the two.

Citizen engagement

Let’s see this as an opportunity to get care.data right, for us, the patients. After all, you and the rest of the NHS England Board were keen to tell us at the NHS AGM on September 18th, how valuable citizen engagement is, and to affirm that the NHS belongs to us all.

How valued is our engagement in reality, if it is ignored? How will involvement continue to be promoted in NHS Citizen and other platforms, if it is seen to be ineffective? How might this negatively affect future programmes and our willingness to get involved in clinical research if we don’t trust this basic programme today?

This is too important to get wrong. It confuses people and causes concern. It puts trust and confidence in jeopardy. Not just for now, but for other future projects. care.data risks polluting across data borders, even beyond health:

“The care.data story is a warning for us all. It is far better if the industry can be early on writing standards and protocols to protect privacy now rather than later on down the track,” he said. [David Willetts, on 5G]

So please, don’t keep the feedback and this information to internal departments.

We are told it is vital to the future of our NHS. It’s our personal information.  And both belong to us.

During one Health Select Committee hearing, Mr. Kelsey claimed: “If 90 per cent opt out [of care.data], we won’t have an NHS.”

The BMA ARM voted in June for an opt in model.

ICO has ruled that an opt in model by default at practice level with due procedures for patient notification will satisfy both legal requirements and protect GPs in their role as custodians of confidentiality and data controllers. Patient Concern has called for GPs to follow that local choice opt in model.

I want to understand what he believes the risk to the NHS is, and to examine its evidence base. It’s our NHS, and if it is going to fail without care.data and the Board let it come to this, then we must ask why. And together we can do something to fix it. There was a list of pre-conditions he stated at those meetings would be needed before any launch, which the public is yet to see met. Answering this question should be part of that.

It can’t afford to fail, but how do we measure at what cost?

I was one of many, including much more importantly the GPES Advisory Group, who flagged the shortcomings of the patient leaflet in October 2013, which failed to be a worthwhile communications process in January. I flagged it with comms teams, my MP, the DoH.

[Sept 2013 GPES Advisory] “The Group also had major concerns about the process for making most patients aware of the contents of the leaflets before data extraction for care.data commenced”.

No one listened. No action was taken. It went ahead as planned. It cost public money, and more importantly, public trust.

In the words of Lord Darzi,

“With more adroit handling, this is a row that might have been avoided.”

Now there is still a chance to listen and to act. This programme can’t afford to pilot another mistake. I’m sure you know this, but it would appear that with the CCG announcement, the intent is to proceed to pilot soon. Ready or not.

If the programme is so vital to the NHS future, then let’s stop and get it right. If it’s not going to get the participation levels needed, then is it worth the cost? What are the risks and benefits of pressing ahead or at what point do we call a halt? Would it be wise to focus first on improving the quality and correct procedures around the data you already have – before increasing the volume of data you think you need? Where is the added intelligence, in adding just more information?

Is there any due diligence, a cost benefit analysis for care.data?

Suggestions

Scrap the ‘soon’ timetable. But tell us how long you need.

The complete raw feedback from all these care.data events should be made public, to ensure all the questions and concerns are debated and answers found BEFORE any pilot.

The care.data programme board minutes, papers, and all the planning and due diligence should be published and open to scrutiny, as for any other project spending public funds.

A public plan of how the pathfinders fit into the big picture and timeline of future changes and content would remove the lingering uncertainty of the public and GPs: what is going on and when will I be affected?

The NHS 5 year forward view was quite clear; our purse strings have been pulled tight. The NHS belongs to all of us. And so we should say: care.data can’t proceed at any and all costs. It needs to be ‘meticulous, fool-proof and solid as a rock’.

We’ve been patient patients. We should now expect the respect and response that deserves.

Thank you for your consideration.

Yours sincerely.

 

Addendum: Sample of ten significant questions still outstanding

1. Scope: What is care.data? Scope content is shifting, and requests for scope purposes are changing already, from commissioning only to now include research and health intelligence. How will we patients know that the purposes we sign up to today stay the purposes to which data may be put tomorrow?

2. Scope changes fair processing: We cannot sign up to one thing today, and find it has become something else entirely tomorrow without our knowledge. How will we be notified of any changes in what is to be extracted or change in how what has been extracted is to be used in future – a change notification plan?

3. Purposes clarity: Who will use which parts of our medical data for what? a) Clinical care vs secondary uses:

Given the widespread confusion – demonstrated on radio and in press after the pathfinders’ announcement – between care.data, which is for ‘secondary use’ only, i.e. purposes other than the direct care of the patient, and the Summary Care Record (SCR) for direct care in medical settings, how will uses be made very clear to patients, and how will they affect our existing consent settings?

3. Purposes definition: Who will use which parts of our medical data for what? b) Commercial use: It is claimed the Care Act will rule out “solely commercial” purposes, but how, when what remains is a broad definition open to interpretation? Will “the promotion of health” still permit uses such as marketing? Will HSCIC give its own interpretation? It operates within the law, after all, and it is the law which prescribes what it should promote and permit.

3. Purposes exclusion: Who will use which parts of our medical data for what?  c) Commercial re-use by third parties: When will the new contracts and agreements be in place? Drafts on the HSCIC website still appear to permit commercial re-use and make no mention of changes or revoking licenses for intermediaries.

4a. Opt out: It is said that patients who opt out will have this choice respected by the Health and Social Care Information Centre (i.e. no data will be extracted from their GP record) according to the Secretary of State for Health  [col 147] – but when will the opt out – currently no more than a spoken promise – be put on a statutory basis? There seem to be no plans whatsoever for this.

Further wider consents: knowing what they have opted into or out of is currently almost impossible for patients. We have the Summary Care Record, Proactive Care in some local areas, different clinical GP systems, the Electronic Prescription Service and the soon-to-be Patient Online, all using different opt in methods of asking for and maintaining data and consent. Patients are unsurprisingly confused.

4b. Opt out: At what point do you determine that levels of participation are worth the investment and of value? If parts of the population are not represented, how will it be taken into account and remain valuable to have some data? What will be statistically significant?

5. Legislation around security: The Care Act 2014 is supposed to bring in new legislation for our data protection. But there are no changes to date as far as I can see – what happened to the ‘one strike and out’ sanction much discussed in Parliament? Is any change still planned? If so, how has this been finalised, with what wording, and will it be open to Parliamentary scrutiny? The Government’s claim to have added legal protection is meaningless until the new Care Act Regulations are put in front of Parliament and agreed.

6. What of the Governance changes discussed?

There was some additional governance and oversight promised, but to date no public communication of changes to the data management groups through the HRA CAG or DAAG and no sight of the patient involvement promised.

The Data Guardian role remains without the legal weight that the importance of its position should command. It has been said this will be granted ‘at the earliest opportunity.’ Many seem to have come and gone.

7. Data security: The planned secure data facility (‘safe setting’) at HSCIC to hold linked GP and hospital data is not yet built for the expanded volume of data and users expected, according to Ciaran Devane at the 6th September event. When will it be ready for the scale of care.data?

Systems and processes on this scale need security designed in – security that scales up to match the data and its use.

Will you proceed with a pilot which uses a different facility and procedures from the future plan? Or worse still, with extracting data into a setting you know is less secure than it should be?

8. Future content sharing: Where will NHS patients’ individual-level data go in the longer term? The current documentation says ‘in wave 1’ or phase one, which indicates that future change is left open, and that identifiable ‘red’ data is to be shared in future: “care.data will provide the longer term visions as well as […] the replacement for SUS.”

9.  Current communications:

    • How will GPs and patients in ‘pathfinder’ practices be contacted?
    • Will every patient be written to directly with a consent form?
    • What will patients who opted out earlier this year be told if things have changed since then?
    • How will NHS England contact those who have retired or moved abroad recently or temporarily, still with active GP records?
    • How will foreign pupils’ parents be informed abroad and rights respected?
    • How does opt out work for sealed envelopes?
    • All the minorities with language needs or accessibility needs – how will you cater for foreign language, dialect or disability?
    • The homeless, the nomadic, children-in-care
    • How can we separate these uses clearly from clinical care in the public’s mind to achieve a genuinely informed opinion?
    • How will genuine mistakes in records be deleted – wrong data on wrong record, especially if we only get Patient Online access second and then spot mistakes?
    • How long will data be retained for so that it is relevant and not excessive – Data Protection principle 3?
    • How will the communications cater for both GP records and HES plus other data collection and sharing?
    • If the plan is to have opt out effective for all secondary uses, communications must cater for new babies to give parents an informed choice from Day One. How and when will this begin?

No wonder you wanted first no opt out, then an assumed consent via opt out junk mail leaflet. This is hard stuff to do well. Harder still, how will you measure effectiveness of what you may have missed?

10. Pathfinder fixes: Since NHS England doesn’t know what will be effective communications tools, what principles will be followed to correct any failures in communications for any particular trial run and how will that be measured?

How will patients be asked if they heard about it, and how will any survey or follow up ensure the right segmentation does not miss measuring the hard to reach groups – precisely those who may have been missed? i.e. If you only inform 10% of the population, then ask that same 10% if they heard of care.data, you would expect close to 100% to say yes. That does not show that the whole population was well informed about the programme.
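The sampling-bias arithmetic above can be sketched in a few lines. This is purely an illustrative toy simulation – the population size, reach and awareness rates are invented for the example, not taken from any real survey:

```python
import random

random.seed(42)
POPULATION = 100_000
REACHED = POPULATION // 10  # the campaign reaches only 10% of people

# Toy model: people the campaign reached have heard of the programme
# with high probability; everyone else almost never has.
reached = [True] * REACHED + [False] * (POPULATION - REACHED)
heard = [random.random() < (0.95 if r else 0.02) for r in reached]

# Biased survey: ask only the people the campaign reached.
biased_sample = [h for r, h in zip(reached, heard) if r]
# Representative survey: ask a random cross-section of everyone.
random_sample = random.sample(heard, 2_000)

print(f"Biased estimate of awareness: {sum(biased_sample) / len(biased_sample):.0%}")
print(f"Random-sample estimate:       {sum(random_sample) / len(random_sample):.0%}")
```

The biased survey reports near-universal awareness, while the random cross-section reveals that roughly nine in ten people have never heard of the programme – exactly the gap described above.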

If it is shown to have been ineffective, at what point do you say Fair Processing failed and you cannot legally proceed to extraction?

> This list doesn’t yet touch on the hundreds of questions generated from public events, on post-its and minutes. But it would be a start.

*******

References for remaining questions:

17th June Open House: Q&A

17th June Open House: Unanswered public Questions

Twelve point plan [March 2014] positive suggestions by Jeremy Taylor, National Voices

6th September care.data meeting in London

image quote: Winnie The Pooh, A.A. Milne

care.data should be like playing Chopin – or will it be all the right notes, but in the wrong order? [Part two]

How our data sharing performance will be judged matters not just today, or in this electoral term, but for posterity. The current work-in-progress is not a dress rehearsal for a care.data quick talent show, but preparation for a lifetime performance at world standard.

How have we arrived where we are now, at a Grand Pause in the care.data performance? In [part one here], the first half of this post, I looked at the past as reviewed through the Partridge Review, from attending the HSCIC ‘Driving Positive Change’ meeting on July 21st. (Official minutes are online via HSCIC >> here.)

Looking forward, how do we want our data sharing to be? I believe we must not lose sight of classical values in the rush to be centre stage in the Brave New World of medical technology. [updated link August 3rd]* Our medical data sharing must be above and beyond the best model standards to be acceptable technically, legally and ethically, worldwide. Exercised with discipline, training and precision, care.data should be the musical equivalent of Chopin.

Not only does HSCIC have a pivotal role to play in the symphony that the Government wishes research to play in the ‘health & wealth’ future of our economy, but they are currently alone on the world stage. Nowhere in the world has a comparable health data set over such a length of time as we do, and none has ever brought all its primary care records into a central repository to merge and link, as is planned with care.data. Sir Kingsley Manning said in the current July/August Pharma Times article that data sharing now has to manage its reputation, just like Big Pharma.

Pharma Times – July/Aug 2014 http://www.pharmatimes.com/DigitalOnlineArea/digitaleditionlogin.aspx

Countries around the world will be watching HSCIC and the companies and organisations involved in the management and use of our data. They will be assessing the involvement and reaction of England’s population to HSCIC’s performance. This performance will help shape what is acceptable and what works well, and its failings will be learned from by other countries who will want to do the same in future.

Can we rise to the Challenge to be a world leader in Data Sharing?

If the UK Government wants England to be the world leader in research, we need, not only to be exemplary in how we govern the holding, management and release of data, but also exemplary in our ethics model and expectations of each other in the data sharing process.

How can we expect China [1] with whom the British Government recently agreed £14 billion in trade deals, [2] India, the country to which our GP support services are potentially poised to be outsourced through Steria [3] or any other organi…

care.data communications and core concepts [Part one]

“My concerns about care.data are heightened, not allayed by the NHS England apparently relentless roll-out and focus on communications. Whilst they say it will take as long as it needs, there is doublespeak talk of Oct-Nov. pilots. It is still all about finding the right communications, not fixing flaws in core concepts.”

Today at the Health Select Committee Mr. Tim Kelsey, on behalf of NHS England, said that care.data pilots will be in October/ November and in the meantime they are listening to the “constructive challenge to NHS England how to build trust in the [care.data] programme.”

Here’s my real experience of that listening, why it may not help, and what still needs to be done. (And in under 4 months, if it is to be of any use for the pathfinder pilots, which are only of use to the whole if done properly.)

[Part one]  care.data communications and core concepts – Ten takeaways from the Open House event.

The NHS England led Open House Day [1] on June 17th was a listening opportunity according to the draft agenda for:

“patients and the public to influence the work of NHS England at national and regional level.”

Here are some of the things I learned:

1. Public Awareness

Mr. Kelsey asked the room (he was in London; other locations took part by live link) how many have:

a) heard of care (dot) data and

b) how many think they understand what it is?

We couldn’t see his room, but he said ‘about half’ understood it. Our room’s show of hands was similar.

My reaction: One would expect everyone attending to have heard of it; the event, after all, was billed as being in part about care.data. The level of understanding should be higher than the average among the public, since many (in Basingstoke at least) were NHS England staff or more involved than the average citizen.

Feedback overall was consistent with the latest Ipsos MORI poll [2] commissioned by the Joseph Rowntree Reform Trust, in which only a minority know it well and over 50% say they have never heard of it. That’s a long way to go to reach people, inform them adequately to meet legal Data Protection minimums, and let them enact their patient choice.


2. Communications Message & Scope

A consistent, frequent communications message is that “there are FAQs and materials, we have the answers, we just need to communicate them better.”

My response: communication is failing because the core scope of what care.data is, is fluid. Without something concrete and limited, it cannot be explained neatly. As one NHS England communications member of staff said to me this week, ‘we haven’t got an elevator pitch.’ So it’s not about the materials or the methods; it’s the substance that is flawed. When you’re talking about extracting, storing, sharing and selling some of our most intimate information, a vague notion of pooled experience is not good enough to trust. People want to know exactly what information is being shared, for what purpose, with whom, and where. And how long will they keep it for? NHS England simply do not have the answers to that, so, that elevator pitch? It’s never going to get off the ground in a meaningful way. And anything less than the answers to those questions doesn’t meet the Fair Processing requirement of Data Protection Law.

Today at the Health Select Committee Mr. Kelsey was asked: will patients be able to trace in future where their data went? There was a rare and stunning silence. And after a benefits statement, there was still no answer given to the question. [update: Hansard now available, Q525/526]

Scope cannot be fluid and changing – the use of our personal information that we sign up to today, must stay what we agreed to tomorrow.

Data Protection requires that the minimum data is extracted, so this ever-increasing scope creep and only *one* chance at opt out are at odds with each other. What plans are in place to meet Data Protection fair processing EVERY time new things are added and more data could be extracted? It’s a legal necessity. An ongoing change communications process MUST be in place.

3. Timing

Mr. Kelsey said, on rollout timing, that NHS England would take it ‘as slowly as we need to.’

My response: This reiterates the ‘no artificial deadlines’ line, but appears to be doublethink in contrast with the statement confirming ‘autumn 2014’ extraction for the pathfinder (pilot) 100-500 practices. How will the pathfinder (pilot) locations be ready to test a communications process which as yet does not exist? How will it pilot a consent process for young people, the vulnerable, those with complex health system needs, the at risk, those outside ‘the system’ with GP records? If others make a decision on their behalf, a consent process must by its nature apply to any opt in or opt out choice, yet from the meetings’ discussion, consideration of their informed consent appears not even to have begun. Or how will solutions to past Data Protection Law failings be found from thin air, when data has been breached in the past, continues to be shared in the present, and there is no solution to resolving those failings for the future?

4. Language simplification

There is a tendency to oversimplify the language of the Care Act into ‘care.data will not be used for any purpose other than health benefit’ – whereas benefit is not mentioned in the wording:

Care Act 2014

My response is to question why this is. Does ‘benefit’ sound better than ‘promotion’, perhaps? Again, words should be used accurately.

5. Users’ simplification of the Care Act wording

The actual wording is ‘the promotion of health’.

NHS England are similarly very keen to point out explicitly that care.data cannot possibly be used for insurance or marketing purposes, such as junk mail.

My response: Yet again, the wording of the Care Act does not state this explicitly. In fact it leaves pharmaceutical marketing, for example, quite open: ‘for the promotion of health’. And there is no legal barrier in the Care Act per se to firms which receive data for one purpose, such as BUPA the hospital provider in London, using it for another, such as refining premiums. BUPA Health Dialog received individual level patient data in the past. How do those patients know what it was then used for or shared with? Perhaps Data Sharing Agreements can specify this, but the Care Act does not.

Claims to rule out “solely commercial” uses can’t be backed up by the wording of the Act. Will “the promotion of health” still permit uses such as marketing by pharmacies, or ‘healthy eating’ campaigns from big food chains? There is no obvious definition, which leaves it wide open to interpretation.

When Sir Kingsley Manning spoke at the Health Select Committee he (rightly) said HSCIC can only restrict and determine what they do ‘within the law’. The law needs to be tight if the purposes are to be tight. Loose law, loose uses.

6. Use by Data Intermediaries to continue

It was confirmed in the panel Q&A that care.data will continue to be on offer to third party Data Intermediaries.

My response: some third party intermediaries in part perform outsourced data services for the NHS. But do they also use the data within their own business to inform their business intelligence markets? They sell knowledge gleaned from raw data onwards, or have commercial re-use licenses for raw data over which we in the public have no visibility or transparency. We cannot see within these businesses how they build their own ‘Chinese walls’, self-imposed restrictions to ensure security between different parts of the same umbrella organisation. Allowing third parties to re-sell data means control over its use, ownership and management is lost forever. Not secure, transparent or trustworthy. I explore their uses with commercial brokers more here in a previous post. [3] Considering I was told that my personal confidential data will not be shared with third parties, in a letter signed by the Secretary of State for Health, I am most unhappy about this. I will find it hard to trust new statements of best intent, without legislation to govern them.

7. Data Lab – restricting user access

Mr. Kelsey indicated that going forward, the default access to our health data will be on the premises of HSCIC, the so called “Fume cupboard” or “Data Lab.” However, he noted, this would not be for all; it would be the ‘default’.

”The default will be access it on the premises of the IC. That won’t be universal for all organisations….”

My questions: Whilst a big improvement on giving away chunks of raw data via CD or to remote users, these processes need to be documented and publicly communicated for us to trust they will work. When will it be built and operational? How will we know who all the end users are, if the same rules do not apply to all? How will those exceptions be granted? Documented? Audited? Will raw data extraction still be permitted? It’s the exceptions which cause issues, and in future the processes, and how they are seen to be governed, must be whiter than white. For those with direct access, users of the HDIS or HES, will a transparent list of users be published? At least for now they do not show up on extraction audits, so the public cannot see what those users access or why. So, a good step, but it can’t stand alone.

Until this secure data lab is physically built, any data extracted cannot go into it. That won’t happen by October/November, I should think. So will NHS England be prepared to extract data anyway, into a setting they *know* is LESS secure and NOT yet a safe setting?

8. Governance

We were informed that an Independent Information Governance Oversight Panel (IIGOP), chaired by Dame Fiona Caldicott, has agreed to advise the care.data Programme Board and to evaluate the first phase, the pathfinder (pilot) stage.

My feedback: I find this interesting not least because the Information Governance Review [4] under her direction in March 2013 decided that commissioning purposes were insufficient reason to extract identifiable data. Personal confidential data should only be disclosed with consent or under statute and “while the public interest can also provide a legal basis for disclosure it should not be relied upon for routine data flows. [footnote, p.63]”

What value is Independent Governance if it has no legislative teeth and can only advise? At the Health Select Committee today, Mr. Kelsey said she would be able to offer a view, and a number of parties will be able to express views and be ‘in agreement’. But I wonder who owns the ultimate, final go/no-go decision on whether the pilot should progress to full roll-out?

9. Anonymous Sounds Safer

Feedback on the handout: The care.data notes need not only to be accurate but transparently truthful.

In my opinion, words are again misused to indicate that data is anonymous. Whilst the intention of the merged CES output (GP records combined with HES files) may be that some users will see only pseudonymous data, the extracted and stored data is identifiable unless opted out. Name is held in the Personal Demographics Service. [5] This is one of the key communications messages I have taken up with HSCIC and NHS England, and raised to the DH through my MP. To reassure the public by saying name is not stored is deliberately deceptive, unless it states simultaneously that it may already be held in the PDS and/or linked on demand.[6]


The Partridge Review [7] has dispensed with the notion that data is anonymous once and for all. Now it must be managed accordingly as identifiable data within Data Protection law and communications must stop misusing the anonymous concept to reassure the public.
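The distinction matters because pseudonymised data is not anonymous. A toy sketch makes the point; everything below – the hashing scheme, the salt, the NHS number and the name – is hypothetical and invented for illustration, not HSCIC’s actual pseudonymisation method. It shows only the general principle: anyone who holds the demographics register and the same pseudonymisation key can link a ‘de-identified’ record straight back to a person.

```python
import hashlib

def pseudonym(nhs_number: str, salt: str = "shared-salt") -> str:
    """Derive a pseudonym by salted hashing (illustrative only)."""
    return hashlib.sha256((salt + nhs_number).encode()).hexdigest()[:12]

# A "pseudonymised" health record: no name, no NHS number in clear.
record = {"id": pseudonym("943 476 5919"), "diagnosis": "asthma"}

# But anyone holding the demographics register and the same salt can
# rebuild the mapping and re-identify the record.
demographics = {"943 476 5919": "Jane Example"}
lookup = {pseudonym(n): name for n, name in demographics.items()}

print(lookup[record["id"]])  # prints: Jane Example
```

This is why pseudonymised data must still be handled as identifiable data under Data Protection law: the identifiers are removed from view, not from reach.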

“It’s a beautiful thing, the destruction of words.” (George Orwell, 1984)

10. My own experience of engagement

The most interesting part of the day for me personally, however, was the unstructured discussion, when we were free to talk amongst ourselves. Unfortunately, there was very little of that. The structure (at least in Basingstoke, and it appeared similar on screens elsewhere) was based around tables of about 10, which included at least two NHS England staff at each.

At the end of the morning session, before lunch, as the other participants had left the table, a Communications person and I got into conversation on the differences between care.data, the Summary Care Record (SCR), and where Patient Online was to fit in our understanding of which data is used for which purpose.

We discussed that since care.data consists only of monthly retrospective extracts, not real-time record access, it would not be a suitable basis for Patient Online access – care.data is for secondary uses. So we moved on to the challenges of SCR access at local level, and how it will be possible to offer everyone Patient Online when so many have opted out of the Summary Care Record. We began to talk stats of SCR availability and actual use in hospitals.[8]

Sadly, the table facilitator appeared to decide at that point that our discussion needed guidance, and rushed to fetch a senior member of staff from Strategic Systems. And rather than engaging me in what had been a very positive, pleasant two-way conversation, with the Comms person asking me questions and our exchanging views, the Strategic Head took over the conversation with her NHSE team member, effectively restricting further discussion, even with her body positioning and language. Being informed is OK, as long as it’s the ‘right’ information?

I don’t think that’s what patient engagement is about. The subject needs real, hard discussion, not just managed exchange using pre-designed template cards of topics that we are told we ‘should’ discuss. Perhaps ignorance is strength, but in my opinion, keeping Communications staff informed only ‘on message’ and not of the wider facts and concerns is shortsighted and does them, and patients, a disservice, but then again:

“If you want to keep a secret, you must also hide it from yourself.” (George Orwell, 1984)

For [Part two] care.data communications and core concepts – Questions, Communications and Actions : link here >>

*****

[1] The NHS England Open House recording June 17th http://www.nhsengland-openhouse.public-i.tv/core/portal/NHSopenhouse

[2] IPSOS Mori poll conducted for the Joseph Rowntree Foundation: http://www.ipsos-mori.com/Assets/Docs/Polls/jrrt-privacy-topline-nhs-2014.pdf

[3] My post on uses of our records with commercial Data Brokers – http://jenpersson.com/flagship-care-data-2-commercial-practice/

[4] The Information Governance Review ‘Caldicott 2‘ https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[5] The Personal demographics Service at HSCIC (including name) http://systems.hscic.gov.uk/demographics/pds/contents

[6] The Data Linkage Service at HSCIC http://www.hscic.gov.uk/dles

[7] The Partridge review: http://www.hscic.gov.uk/datareview

[8] Summary Care Record use statistics https://www.whatdotheyknow.com/request/scr_care_settings_with_viewing_c#incoming-446569

***

Fun fact: George Orwell's Nineteen Eighty-Four is currently number 5 in Amazon's UK Classic Fiction ranking, and 86th in fiction overall. Sales are up over 5,000% in the US since the Snowden revelations a year ago.

MORE BACKGROUND ON THE EVENT:

Within the other programmes of Patient Online and Patient Participation, care.data was a one hour session. It included the short 'blue plasticine people' animation, a speech by Mr. Kelsey, a 15 minute table discussion on one pre-given theme from a range of four, a reading aloud of each table's summary of that discussion within the room, and one question per venue raised outside the room to the panel via video link in London, with their answers. Our discussion topics were brief, controlled and relatively superficial. It could have been a productive day's workshop on that subject alone.

The Open House took place simultaneously in four venues across England – Basingstoke, Leicester, York and London – connected through a live video link at a number of points throughout the day. The recording can, in part, be viewed here.

I attended the Basingstoke event, particularly keen to learn about national programmes such as care.data and hear about any updated plans for its rollout, to learn about patient online, and to meet the NHS England team in the South as well as other interested people like me. I hoped for some real public discussion and to hear others get their questions aired, shared and on the table for resolution.

I met one other 'only' patient – and whilst an active PPG organiser kindly told me that I should never refer to myself as 'only' a patient, you know what I mean. I've applied as a lay rep for an opening on our local CCG next year; until then, I'm learning as much as I can from others. The other attendees I met were already more closely involved with NHS England in some way: NHS England staff, facilitators, representatives from Clinical Commissioning Groups, Patient Leaders and PPG leaders.

Flagship care.data – precious cargo [1] & commercial uses in theory

“The challenge is that if many users of data are intermediaries with re-use licences, and even the HSCIC doesn't know who all the end users are, how on earth can anyone judge how they will be used for purposes of 'improving NHS care'?”

Commercial and third party use is one of the most damaging aspects of the rollout which is wrecking the care.data programme.

I've cut my opinion on this care.data topic into two parts, theory and practice, to address the outcomes of yesterday's LMC Conference from a patient point of view. From my lay perspective, the result of the debate and votes was partly due to the failure to shore up the policy theory around commercial uses to make any perceivable improvement to trust for the future, and partly based on proven failures in practice to protect our data in the past. Failures around commercial use of care.data, in theory and in practice.

The theme of making money is a recurring one for women in literature, and graced, or should I say grubbied, our screens in recent weeks in the adaptation of Dame Daphne du Maurier's Jamaica Inn.

Mary Yellan, orphaned and without means, seeks out the only family she has and lands among the smugglers and muddy marsh of the Cornish moors. The story is set against a backdrop not only of smuggling, but of wrecking. The heroine is torn between moral conflict and practical necessity: whether to join in their activities, against her ethical principles. She gets used to it, but ultimately can't live with it.

Given that the real inn is in the middle of a very bleak moor, with no outlook except the rough shorn grass, you need to really see unmet potential to want to be its new owner. For that, you need to see strong commercial opportunities or be a committed hard core Du Maurier fan. Or both.

So it can appear, from a patient point of view, with care.data. Either the driving parties promoting the release of patient data see unmet potential [1] which needs commercial harnessing [1b], have direct commercial interests [1c], or have another personal interest in its extraction and access. Or perhaps they are just hard core fans of data sharing, to the point that we should support mashing our health data up with commercial retail loyalty cards, as Mr. Tim Kelsey suggested at Strata in November 2013 [from 16:00] [2].

Are the same people and organisations driving the programme and calling for 'data for patients' not also those who will benefit most from having access to the data? The measurable benefits to us patients remain unclear, at best. The cost – our confidentiality and trust in GPs – is, however, clearly non-refundable. Consent, the age-old pillar of medical ethics, is to be waved aside. The LMC Conference obviously sees value in protecting confidentiality at source if it cannot be guaranteed by others, whether the HSCIC or the data users.

Who will all the end users of our data be? They remain somewhat undefined, because the care.data addendum including think tanks, commercial companies and information intermediaries was not approved [3], and because future users are undefined in social care, for example. Future scope will entail additional future users. Perhaps it should not surprise us that NHS England and the HSCIC expect us to acquiesce to this fair processing failure although we don't yet know all the future end users, because Sir Kingsley Manning admitted at the Health Select Committee hearing (Q272) [4a] that HSCIC does not know who all the current end users are either. So were the GPs at the LMC Conference just expected to trust 'on spec' whom their approval of care.data would entitle to its sharing?

Information intermediaries in particular seem to still be on the key stakeholders list [5] in January 2014. But only a year ago, in April 2013, the 'Health and Social Care Transparency Panel' discussion on sharing patient data with information intermediaries clearly stated there was no legitimate or statutory basis to share at least ONS data with them. [6]

“The issues of finding a legitimate basis for sharing ONS death data with information intermediaries for commercial purposes had been a long running problem. A number of possible approaches had been considered but advice from the relevant Government legal teams was that there did not appear to be a statutory basis for doing so. The panel identified this as a significant barrier to developing a vibrant market of information intermediaries (IIs). It also limited the ability of IIs to support NHS organisations with business intelligence to evaluate and benchmark the quality of their services.

It was agreed that this issue needed to be resolved, and if necessary changes to the relevant legislation should be considered.”

I would love to know whether the law changed in the last year, how the issue was resolved, or whether HSCIC, and we, have simply through use acknowledged that this sharing with intermediaries is acceptable and legal. The meeting later in July should have given clarity, but I can't see minutes beyond April. They are no doubt somewhere, and I expect someone cleverer than me can help find them and clarify how the decision was reached. I did find notes in the recent HSCIC audit of past data releases [4b] that ONS data was granted under existing law after all:

“The ONS data are supplied under the Statistics and Registration Service Act 2007 section 42(4) as amended by s287 of the Health and Social Care Act 2012, for the purpose of assisting the Secretary of State for Health, or the Welsh Ministers, in the performance of his, or their functions in relation to the health service.”

Since the Health and Social Care Act revoked the Secretary of State's duty of care to provide a national health service, I wonder what functions it relates to as pertains to third party intermediaries? The ONS application form is detailed, but no more enlightening on commercial intermediary use. I can't help feeling we're seeking justifications, rather than good cause, as the starting point for widening data releases. That we are starting to accept that our hospital records have been shared without our consent, and sold. (Let's give up the 'recouping costs' word play and call a spade a spade: data and cash change hands.) 'What can we do about it anyway?' we may well ask. As time has gone on in the care.data debacle, and in the three months since the delay, it appears from the comments of NHS England's leadership, from Mr. Kelsey in Pulse, that we're not to worry: “now we are working to make care.data safe.” [free registration required] Still, no one has said: we made a mistake in its handling in the past.

This acknowledgement that work needs to be done to make the data safe, however, underlines exactly what so many saw months ago, including the GPES advisory group, which raised concerns [17] in September 2013 on commercial uses and their communication, governance and patient trust. Care.data was launched regardless. Now it's grounded. What has improved since then? What remains to be fixed?

How well exactly did HES storage and sharing work so far, with breaches identified, as well as a basic failure of legal fair processing to inform us of its extraction? What has been done to prevent it happening again? I have seen no concrete steps which give me faith that the past flaws have been fixed enough to trust it now, in future.

In February, before the pause, Jeremy Taylor of National Voices wrote a very sound 12 point plan of what needed to change. Since then, as far as I can see, what has actually changed [7] is only the introduction of a delay, and that his words were listened to, that there should be no artificial deadline:

“the timescale for launching Care.Data was entirely artificial, as is the six month 'pause'.”

Three months into the delay, nothing of substance, other than agreeing there is no artificial deadline, appears to have changed.

The most significant past let-downs have all been commercial or third party uses: OmegaSolver, Beacon Dodsworth, PA Consulting, Earthware.

The Care Bill amendment, touted as a change in the legal protection of our care.data, does not block commercial third party intermediaries' uses of our data, only stating that it should be used 'for the promotion of health', which is open to all sorts of interpretation. Not least, I imagine, those similar to 'fight against obesity' campaigns by the marketing masters of commercialism.

So, with little transparent change on policy since we became aware of data breaches, misuse and patient anger about commercial use, it should come as no surprise that the BMA Local Medical Committees (LMCs) yesterday voted to state a preference for opt in, not opt out, and for pseudonymisation or anonymisation at source, and insisted that care.data should only be used for its stated purpose of improving health care delivery, and not be sold for profit.

Simply: the public don’t trust that our identifiable data is protected and we object to all our data being traded commercially.

This is in direct conflict with HSCIC's stated purpose in the HSCIC 2013-15 roadmap [8]:

“Help stimulate the market through dynamic relationships with commercial organisations, especially those who expect to use its data and outputs to design new information-based services.”

And in statements by both Sir Kingsley Manning at the Health Select Committee and Dr. Geraint Lewis [9]:

…”we think it would be wrong to exclude private companies simply on ideological grounds; instead, the test should be how the company wants to use the data to improve NHS care. And, as Polly Toynbee put it, if “it aids economic growth too, that’s to the good.”

The challenge is that if many users of data are intermediaries with re-use licences, and we don't even know who all the end users are, how on earth can the HSCIC judge how they will benefit 'improving NHS care'?

As regards economic growth, if the aim is to give away data for free, as Mr. Kelsey told the September 13th NHS England board (from 26:10) [10], how is the NHS to make a profit from it? It's not. Commercial companies are to buy at prices intended only to help HSCIC recoup costs [11], so that is not technically opposed, in wording, to 'not making a profit'. Citizens, GPs and others can be aligned with that on paper, but not in spirit. For now, commercial companies profit from our state funded records, paid for by NHS and DoH money. Intermediaries with re-use licences profit, beyond which we have no visibility or control of where our data goes or why. And the fact that ATOS, one of the wider third parties profiting from the whole scheme, paid zero tax in the UK in 2012 [12], really grates. How does the cash given to ATOS benefit economic growth in the country?

Therefore, for the LMCs to have voted any differently now would have required them to be soothsayers, knowing that the care.data work-in-progress and any future changes would clearly define both the future scope of purposes and the future users, in order to fulfil their duty as data controllers, ensuring patients have a reasonable expectation of how their data will be used. It asks GPs to betray the age-old fundamental principle of medicine, to betray patient confidentiality, for commissioning. They are being told to betray the good ethics of consent. They are being asked to betray patients' trust, and even to use that trust to 'sell' an idea in which they may not believe.

And care.data current processes betray the best practices of data collection – seek to collect the minimum data required, for a specific purpose and delete it when that is completed.

“Personal data processed for any purpose or purposes shall not be kept for longer than is necessary for that purpose or those purposes” – consistent with Data Protection Act principle 5. [13]

Instead, HSCIC's remit over the coming years of care.data is to fill in all the remaining gaps with any health and social care information not already collected [14], and keep it linkable from cradle to grave – or even from “germ to worm” – for everyone with an NHS number in England. Purposes are non-specific and unlimited because they'll change over time, and the end users are not all defined, for it is planned to be opened up increasingly widely for use in social care, and we don't know what else.

[Image: care.data timeline]

In my lay view, the BMA LMCs had no choice, in the interests of their patients, but to call for a rejection of assumed consent and commercial uses. The two do not go together. Opt out for uses of our data purely for NHS care and its planning would be much more palatable. But add in commercial uses, which have been both the main source of patient objection and of data breaches, and it's a deal breaker.

They can't stake their support and reputation on a best guess of what might be. They can only base their judgement on what they know now. And no one supports care.data exactly as she is right now, which is why the programme is postponed and a work in progress. Shore up trust and governance, and axe these commercial uses, and perhaps an assumed consent would seem more palatable. For example, cross-border governance needs to be documented when the application form gives non-UK options. Scope and users need to be defined to ensure proper fair processing that meets the ICO's Data Protection Act requirements [16]. But so far, nothing has visibly changed.

It's no different from when Ben Goldacre was telling us public trust cannot be easily regained, and that it broke his heart [15]. I know why: there are expected benefits, to public research amongst others, of access to primary care data beyond what already exists in CPRD, or the pseudonymous data in QResearch and others. But we need to act based on today's approved uses for care.data, not on what might be in an undefined future. Right now, we've seen no changes of substance since the delay was announced.

NHS England can’t therefore genuinely expect to see a shift in trust in citizens or GPs based on nothing more than lines in the sand.

I believe GPs at the LMC Conference took the best decisions they could with the programme in its current form, with knowledge of past problems and a lack of future clarity over scope and users.

They voted for how they feel best protects, respects and empowers their patients.

If our current data controllers and guardians of confidentiality don't stand up for patients to get the build of the infrastructure right before they agree to release our data to fill it, who will? The question is whether the Secretary of State and NHS England will force their legal right of extraction through regardless, or will respect the medical profession's representatives and the rights of the citizens they care for.

There is an opportunity to fix things. The LMC Conference votes, after all, have no legal effect; they stated an opinion and a stance which commands respect and attention. Flagship care.data is not washed up, yet. But it can't sail without addressing governance and professional support. Commercial exploitation and assumed opt in are not going to work comfortably together. Transparency over who has access to what data, for what purposes, and how it is released needs to be sharpened up. And regardless of whether opt in ever comes onto the table or not, if care.data keeps her strongly commercial heading, many, many more will jump ship and opt out. The damage of bias will be done, either way.

She needs some new directions, helmsmanship that we trust and sound repairs.

********

If you have missed the background to this saga, I’d recommend the Julia Powles article in WIRED – what to save when the care.data ship goes down.

I'm going to look at some more of the commercial uses of care.data in practice another time, and clarify the communication of the opt out codes, and why 'research purposes' is a misnomer in the GP patient record sharing part of care.data purposes – it is not (yet, at least) an approved use.

********

[1] MOU between AstraZeneca and the HSCIC, December 2012

[1b]  ABPI Vision for harnessing Real World Data 2011

[1c] Hansard, Nov 2010 George Freeman ‘I know from my own experience that we are sitting on billions of pounds-worth of patient data. Let us think about how we can unlock the value of those data around the world.’

[2] Strata November 2013, Tim Kelsey keynote ‘mash it up with other data sources to get their local retailers to tell them about their purchasing habits so they can mash that up with their health data’

[3] care.data addendum Sept 2013

[4a] Written transcript of the Health Select Committee hearing, 8th April

[4b] The HSCIC data release register issued on April 3rd 2013

[5] Oversight panel with input from Dame Fiona Caldicott, January 2014, with stakeholders’ list

[6] Health and Social Care Transparency Overview Panel April 2013

[7] National Voices – Jeremy Taylor, an excellent overview of 12 points which needed fixing, from February 2014

[8] HSCIC 2013-15 Roadmap

[9] NHS England comments by Dr.Lewis on commercial principle

[10] September 13th 2013, care.data directions approved by the NHS England Board – care.data from 25:40 to 39:00. Note that identifiable, not anonymous, data is extracted and stored with the DLES at HSCIC, and GP objections to date on the care.data opt-in seem not to have been respected, in contrast to the claim 'GPs make a decision' from 31:00. There is, to date, no communicated way to prevent HES data extraction and its sharing in pseudonymous form.

[11] The HSCIC Data Linkage price list

[12] The Independent, November 2013 Atos & G4 pay no corporation tax in 2012, National Audit Office stats via Adam Withnall, The Independent

[13] Data Protection Standards – retention, principle 5

[14] care.data programme overview April 2013

[15] the Guardian, 28th February 2014 – care.data is in chaos – Ben Goldacre

[16] Blog from the Information Commissioner’s Office on care.data Data Protection and Fair processing

[17] The GPES Advisory Group meeting minutes, September 12th 2013

{updated 28th May – looks like past uses of our health data are now also under scrutiny by ICO which is investigating claims that insurers have accessed full medical records using subject access requests.}

By theamateurbookblogger@googlemail.com