Tag Archives: commercial uses

Gotta know it all? Pokémon GO, privacy and behavioural research

I caught my first Pokémon and I liked it. Well, OK, someone else handed me a phone and insisted I have a go. Turns out my curve ball is pretty good. Pokémon GO is enabling all sorts of new discoveries.

Discoveries reportedly including a dead man, robbery, picking up new friends, and scrapes and bruises. While players are out hunting animated creatures in augmented reality, enjoying the novelty, and discovering interesting fun facts about their vicinity, Pokémon GO is gathering a lot of data. It’s influencing human activity in ways that other games can only envy, taking in-game interaction to a whole new level.

And it’s popular.

But what is it learning about us as we do it?

This week questions have been asked about the depth of access that the app gets through users’ login credentials.

What I would like to know is what access goes in the other direction?

Google, heavily invested in AI and machine intelligence research, notes that with “learning systems placed at the core of interactive services in a fast changing and sometimes adversarial environment, combinations of techniques including deep learning and statistical models need to be combined with ideas from control and game theory.”

The free-to-download app has raised concerns over suggestions that it could access a user’s entire Google account, including email and passwords. Then it seemed it couldn’t. But Niantic is reported to have made changes to permissions to limit access to basic profile information anyway.

If Niantic gets access to data owned by Google through its use of Google login credentials, does Niantic’s investor, Google’s Alphabet, get the reverse: user data from the Google login interaction with the app? And if so, what does Google learn through the interaction?
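The difference between “entire account” and “basic profile” access comes down to which OAuth scopes a login grant covers. The scope URLs below are real Google OAuth scopes, but the classification function and the example grant are illustrative only; the exact scopes Pokémon GO requested were never published.

```python
# Sketch: flagging OAuth scopes that go beyond basic profile information.
# Scope strings are real Google OAuth scopes; the example grant is invented.

BASIC_PROFILE_SCOPES = {
    "https://www.googleapis.com/auth/userinfo.email",
    "https://www.googleapis.com/auth/userinfo.profile",
}

def broad_scopes(granted):
    """Return any granted scopes beyond basic profile information, sorted."""
    return sorted(set(granted) - BASIC_PROFILE_SCOPES)

# A broad grant exposes far more than a profile lookup would:
granted = [
    "https://www.googleapis.com/auth/userinfo.email",
    "https://mail.google.com/",  # full Gmail access
]
print(broad_scopes(granted))  # ['https://mail.google.com/']
```

A user reviewing an app’s permissions is effectively doing this check by eye; the point of the sketch is that a single extra scope string is all that separates “who you are” from “everything in your inbox”.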

Who gets access to what data and why?

Brian Crecente writes that Apple, Google and Niantic are likely making more on Pokémon GO than Nintendo, with 30 percent of revenue from in-app purchases on their online stores.

The next stop is to make money from marketing deals between Niantic and the offline stores used as in-game focal points, gyms and more, according to Bryan Menegus at Gizmodo, who reported that Redditors had discovered decompiled code in the Android and iOS versions of Pokémon GO earlier this week “that indicated a potential sponsorship deal with global burger chain McDonald’s.”

The logical progression of this is that the offline store partners, i.e. McDonald’s and friends, will be making money from players: the people who get led to their shops, restaurants and cafes, where players will hang out longer than at a Pokéstop, because human interaction with other humans, the battles between your collected creatures, and teamwork are at the heart of the game. Since you can’t visit gyms until you are level 5 and have chosen a team, players are building up profiles over time and getting social in real life, generating location data that may build up patterns about the players.

This evening the two players that I spoke to were already real-life friends on their way home from work (that now takes at least an hour longer every evening) and they’re finding the real-life location facts quite fun, including that thing they pass on the bus every day, and umm, the Scientology centre. Well, more about that later**.

Every player I spotted looking at the phone with that finger-flick action gave themselves away with shared wry smiles. All thirty-something men. There is possibly something of a legacy in this, they said, since the initial Pokémon game released 20 years ago is drawing players who were tweens then.

Since the app is online and open to all, children can play too. What this might mean for them in the offline world is something the NSPCC picked up on here before the UK launch. Its focus of concern is the physical safety of young players, citing the risk of misuse of in-game lures. I am not sure how much of an increased risk this is compared with existing scenarios, or whether children will be increasingly unsupervised or not. It’s not a totally new concept. Players of all ages must be mindful of where they are playing**. People getting together in the small hours of the night has generated some stories which for now are mostly fun. (Go Red Team.) Others are worried about hacking. And it raises all sorts of questions if private and public space has become a Pokéstop.

While the NSPCC includes considerations on the approach to privacy in a recent more general review of apps, it hasn’t yet mentioned the less obvious considerations of privacy and ethics in Pokémon GO: encouraging anyone, but particularly children, out of their home or protected environments and into commercial settings with the explicit aim of targeting their spending. This is big business.

Privacy in Pokémon GO

I think we are yet to see a really transparent discussion of the broader privacy implications of the game because the combination of multiple privacy policies involved is less than transparent. They are long, they seem complete, but are they meaningful?

We can’t see how they interact.

Google has crowdsourced the collection of real-time traffic data via mobile phones. Geolocation data from Google Maps using GPS, as well as network provider data, seem necessary to display the street data to players. Apparently you can download and use the maps offline since Pokémon GO uses the Google Maps API. Google goes to “great lengths to make sure that imagery is useful, and reflects the world our users explore.” In building a Google virtual-reality copy of the real world, how data are also collected, and will be used, about all of us who live in it is a little woolly to the public.

U.S. Senator Al Franken is apparently already asking Niantic these questions. He points out that Pokémon GO has indicated it shares de-identified and aggregate data with other third parties for a multitude of purposes, but does not describe the purposes for which Pokémon GO would share or sell those data.

It’s widely recognised that anonymisation in many cases fails, so passing only anonymised data may be reassuring in theory but fail in practice. Stripping out what are considered individual personal identifiers, in data protection terms, can still leave individuals with unique characteristics, or people profiled as groups.
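A minimal sketch of why stripping direct identifiers is not enough: combinations of apparently harmless attributes (age band, postcode district, gender) often pin down a single person. The records below are invented purely for illustration.

```python
# Sketch: how many rows in an 'anonymised' dataset are still unique
# on a combination of quasi-identifiers? Records are invented.

from collections import Counter

records = [
    {"age": "30-35", "postcode": "SW1A", "gender": "M"},
    {"age": "30-35", "postcode": "SW1A", "gender": "F"},
    {"age": "30-35", "postcode": "N1",   "gender": "M"},
    {"age": "60-65", "postcode": "N1",   "gender": "F"},
]

def unique_fraction(rows, keys):
    """Fraction of rows whose quasi-identifier combination is unique."""
    combos = Counter(tuple(r[k] for k in keys) for r in rows)
    return sum(1 for r in rows if combos[tuple(r[k] for k in keys)] == 1) / len(rows)

print(unique_fraction(records, ["age"]))                        # 0.25
print(unique_fraction(records, ["age", "postcode", "gender"]))  # 1.0
```

One attribute alone identifies almost no one; three together identify everyone in this toy set, which is the re-identification risk the text describes.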

Opt-out, he feels, is inadequate as a consent model for the personal and geolocational data that the app is collecting and passing to others in the U.S.

While the app provider would, I’m sure, argue that the UK privacy model respects the European opt-in requirement, I would be surprised if many have read it. Privacy policies fail.

Poor practices must be challenged if we are to preserve the integrity of controlling the use of our data and knowledge about ourselves. Being aware of who we have ceded control of marketing to us, or influencing how we might be interacting with our environment, is at least a step towards not blindly giving up control of free choice.

The Pokémon GO permissions “for the purpose of performing services on our behalf”, “third party service providers to work with us to administer and provide the Services” and “also use location information to improve and personalize our Services for you (or your authorized child)” are so broad that they could mean almost anything. They can also be changed without any notice period. That makes them pretty meaningless. But it’s the third parties’ connection, data collection in passing, that is completely hidden from players.

If we are ever to use privacy policies as meaningful tools to enable consent, then they must be transparent to show how a chain of permissions between companies connect their services.

Otherwise they are no more than get out of jail free cards for the companies that trade our data behind the scenes, if we were ever to claim for its misuse.  Data collectors must improve transparency.

Behavioural tracking and trust

Covert data collection and interaction is not conducive to user trust, whether through a failure to communicate by design or not.

By combining location data and behavioural data, measuring footfall is described as “the holy grail for retailers and landlords alike” and it is valuable.  “Pavement Opportunity” data may be sent anonymously, but if its analysis and storage provides ways to pitch to people, even if not knowing who they are individually, or to groups of people, it is discriminatory and potentially invisibly predatory. The pedestrian, or the player, Jo Public, is a commercial opportunity.

Pokémon GO has potential to connect the opportunity for profit makers with our pockets like never before. But they’re not alone.

Who else is getting our location data that we don’t sign up for sharing “in 81 towns and cities across Great Britain”?

Whether footfall outside the shops or packaged as a game that gets us inside them, public interest researchers and commercial companies alike both risk losing our trust if we feel used as pieces in a game that we didn’t knowingly sign up to. It’s creepy.

For children the ethical implications are even greater.

There are obligations to meet higher legal and ethical standards when processing children’s data and presenting them marketing. Parental consent requirements fail children for a range of reasons.

So far, the UK has said it will implement the EU GDPR. Clear and affirmative consent is needed. Parental consent will be required for the processing of personal data of children under age 16. EU Member States may lower the age requiring parental consent to 13, so what that will mean for children here in the UK is unknown.

The ethics of product placement and marketing rules to children of all ages go out the window however, when the whole game or programme is one long animated advert. On children’s television and YouTube, content producers have turned brand product placement into programmes: My Little Pony, Barbie, Playmobil and many more.

Alice Webb, Director of BBC Children’s and BBC North, looked at some of these challenges, as the BBC considers how to deliver content for children whilst adapting to technological advances, in this LSE blog, alongside the publication of a new policy brief about families and ‘screen time’ by Alicia Blum-Ross and Sonia Livingstone.

So is this augmented reality any different from other platforms?

Yes because you can’t play the game without accepting the use of the maps and by default some sacrifice of your privacy settings.

Yes, because the ethics and implications of putting kids not simply in front of a screen that pitches products to them, but physically into the place where they can consume products (if the McDonald’s story is correct and a taster of what will follow) are huge.

Boundaries between platforms and people

Blum-Ross says, “To young people, the boundaries and distinctions that have traditionally been established between genres, platforms and devices mean nothing; ditto the reasoning behind the watershed system with its roots in decisions about suitability of content. “

She’s right. And if those boundaries and distinctions mean nothing to providers, then we must have that honest conversation with urgency. With our contrived consent, walking and running and driving without coercion, we are being packaged up and delivered right to the door of for-profit firms, paying for the game with our privacy. Smart cities are exploiting street sensors to do the same.

Free will is at the very heart of who we are: “the ability to choose between different possible courses of action. It is closely linked to the concepts of responsibility, praise, guilt, sin, and other judgments which apply only to actions that are freely chosen.” Free choice of where we shop, what we buy and who we interact with is open to influence. Influence that is not entirely transparent presents opportunity for hidden manipulation. While the NSPCC might be worried about the risk of rare physical threat, the potential for influencing all children’s behaviour, both positive and negative, reaches everyone.

Some stories of how behaviour is affected are heartbreakingly positive. And I met and chatted with complete strangers who shared the joy of something new and a mutual curiosity about the game. Pokémon GO is clearly a lot of fun. It’s also unclear on much more.

I would like to explicitly understand if Pokémon GO is gift packaging behavioural research by piggybacking on the Google platforms that underpin it, and providing linked data to Google or third parties.

Fishing for frequent Pokémon encourages players to ‘check in’ and keep that behaviour tracking live. 4pm caught a Krabby in the closet at work. 6pm another Krabby. Yup, still at work. 6.32pm Pidgey on the street outside ThatGreenCoffeeShop. Monday to Friday.
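The check-in pattern described above is exactly what makes a catch log valuable as behavioural data. A minimal sketch, using invented events that mirror the example in the text, of how repeated timestamped sightings reveal routine:

```python
# Sketch: inferring routine from a stream of (day, hour, place) catch events.
# Data are invented to mirror the Krabby-at-work example in the text.

from collections import Counter

pings = [
    ("Mon", 16, "office"), ("Mon", 18, "office"), ("Mon", 18, "coffee_shop"),
    ("Tue", 16, "office"), ("Tue", 18, "office"),
    ("Wed", 16, "office"), ("Wed", 18, "coffee_shop"),
]

def likely_location(events, hour):
    """Most common place seen at a given hour: a crude 'workplace' guess."""
    places = Counter(place for _, h, place in events if h == hour)
    return places.most_common(1)[0][0]

print(likely_location(pings, 16))  # 'office'
```

Nothing here needs a name or an account; a handful of timestamped pings per day is enough to guess where someone works, which is the profiling concern the post raises.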

The Google privacy policies, changed in the last year, require ten clicks to opt out and, in part, the download of an add-on. Google has our contacts, calendar events, web searches and health data, has invested in our genetics, and all the ‘things that make you “you”’. They have our history, and are collecting our present. Machine intelligence work on prediction is the future. For now, perhaps that will be pinging you with a ‘buy one get one free’ voucher at 6.20, or LCD adverts shifting as you drive back home.

Pokémon GO doesn’t have to include what data Google collects in its privacy policy. It’s in Google’s privacy policy. And who really read that when it came out months ago, or knows what it means in combination with the new apps and games we connect it with today? Tracking and linking data on geolocation, behavioural patterns, footfall, whose other phones are close by, who we contact, and potentially even our spend from Google Wallet.

Have Google and friends of Niantic gotta know it all?

The illusion that might cheat us: ethical data science vision and practice

This blog post is also available as an audio file on soundcloud.


Anaïs Nin wrote in her 1946 diary of the dangers she saw in the growth of technology: expanding our potential for connectivity through machines, but diminishing our genuine connectedness as people. She could hardly have been more contemporary for today:

“This is the illusion that might cheat us of being in touch deeply with the one breathing next to us. The dangerous time when mechanical voices, radios, telephone, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision.”
[Extract from volume IV 1944-1947]

Echoes from over 70 years ago, can be heard in the more recent comments of entrepreneur Elon Musk. Both are concerned with simulation, a lack of connection between the perceived, and reality, and the jeopardy this presents for humanity. But both also have a dream. A dream based on the positive potential society has.

How will we use our potential?

Data is the connection we all have between us as humans and what machines and their masters know about us. The values with which those masters underpin their machine design will determine the effect that the machines, and the knowledge they deliver, have on society.

In seeking ever greater personalisation, a wider dragnet of data is putting together ever more detailed pieces of information about an individual person. At the same time data science is becoming ever more impersonal in how we treat people as individuals. We risk losing sight of how we respect and treat the very people whom the work should benefit.

Nin grasped the risk that a wider reach can mean more superficial depth. Facebook might be a model today for the large circle of friends you might gather, but how few you trust with confidences, with personal knowledge about your own personal life, and the privilege it is when someone chooses to entrust that knowledge to you. Machine data mining increasingly tries to get an understanding of depth, and may also add new layers of meaning through profiling, comparing our characteristics with others in risk stratification.
Data science, research using data, is often talked about as if it is something separate from using information from individual people. Yet it is all about exploiting those confidences.

Today as the reach has grown in what is possible for a few people in institutions to gather about most people in the public, whether in scientific research, or in surveillance of different kinds, we hear experts repeatedly talk of the risk of losing the valuable part, the knowledge, the insights that benefit us as society if we can act upon them.

We might know more, but do we know any better? To use a well known quote from her contemporary, T S Eliot, ‘Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?’

What can humans achieve? We don’t yet know our own limits. What don’t we yet know?  We have future priorities we aren’t yet aware of.

To be able to explore the best of what Nin saw as ‘human vision’ and Musk sees in technology, the benefits we have from our connectivity (our collaboration, our shared learning) need to be driven with an element of humility: accepting values that shape the boundaries of what we should do, while constantly evolving with what we could do.

The essence of this applied risk is that technology could harm you, more than it helps you. How do we avoid this and develop instead the best of what human vision makes possible? Can we also exceed our own expectations of today, to advance in moral progress?

Continue reading The illusion that might cheat us: ethical data science vision and practice

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

I’ve been struck by stories I’ve heard on the datasharing consultation, on data science, and on data infrastructures as part of ‘government as a platform’ (#GaaPFuture) in recent weeks. The audio recorded by the Royal Statistical Society on March 17th is excellent, and there were some good questions asked.

There were even questions from insurance-backed panels about opening up more data for commercial users, calls for journalists to be seen as accredited researchers, and calls to include health data sharing. These are three things that some stakeholders, all users of data, feel are missing from the consultation, and possibly some of those with the most widespread public concern and lowest levels of public trust. [1]

What I feel is missing in consultation discussions are:

  1. a representative range of independent public voice
  2. a compelling story of needs – why tailored public services benefit citizens from whom data is taken, not only data users
  3. the impacts we expect to see in local government
  4. any cost/risk/benefit assessment of those impacts, or for citizens
  5. how the changes will be independently evaluated – as some are to be reviewed

The Royal Statistical Society and ODI have good summaries here of their thoughts, more geared towards the statistical and research aspects of data,  infrastructure and the consultation.

I focus on the other strands that use identifiable data for targeted interventions. Tailored public services, Debt, Fraud, Energy Companies’ use. I think we talk too little of people, and real needs.

Why the State wants more datasharing is not yet a compelling story and public need and benefit seem weak.

So far the creation of new data intermediaries, giving copies of our personal data to other public bodies  – and let’s be clear that this often means through commercial representatives like G4S, Atos, Management consultancies and more –  is yet to convince me of true public needs for the people, versus wants from parts of the State.

What the consultation hopes to achieve is new powers of law, giving increased data sharing greater legal authority. However, this alone will not bring about the social legitimacy of datasharing that the consultation appears to seek through ‘open policy making’.

Legitimacy is badly needed if there is to be public and professional support for change and increased use of our personal data as held by the State, which is missing today,  as care.data starkly exposed. [2]

The gap between Social Legitimacy and the Law

Almost 8 months ago now, before I knew about the datasharing consultation work-in-progress, I suggested to BIS that there was an opportunity for the UK to drive excellence in public involvement in the use of public data by getting real engagement, through pro-active consent.

The carrot for this is achieving the goal that government wants: greater legal clarity, and the use of a significant number of consented people’s personal data for a complex range of secondary uses, as a secondary benefit.

It was ignored.

If some feel entitled to the right to infringe on citizens’ privacy through a new legal gateway because they believe the public benefit outweighs private rights, then they must also take on the increased balance of risk of doing so, and a responsibility to do so safely. It is in principle a slippery slope. Any new safeguards and ethics for how this will be done are, however, unclear in those data strands which are for targeted individual interventions. Especially if predictive.

Upcoming discussions on codes of practice [which have still to be shared] should demonstrate how this is to happen in practice, but codes are not sufficient. Enabling laws will be pushed to the borderline of what is legal, and beyond that of what is ethical.

In England who would have thought that the 2013 changes that permitted individual children’s data to be given to third parties [3] for educational purposes, would mean giving highly sensitive, identifiable data to journalists without pupils’ or parental consent? The wording allows it. It is legal. However it fails the Data Protection Act’s legal requirement of fair processing. Above all, it lacks social legitimacy and common sense.

In Scotland, there is current anger over the intrusive ‘named person’ laws which lack both professional and public support and intrude on privacy. Concerns raised should be lessons to learn from in England.

Common sense says laws must take into account social legitimacy.

We have been told at the open policy meetings that this change will not remove the need for informed consent. To be informed, means creating the opportunity for proper communications, and also knowing how you can use the service without coercion, i.e. not having to consent to secondary data uses in order to get the service, and knowing to withdraw consent at any later date. How will that be offered with ways of achieving the removal of data after sharing?

The stick for change is the legal duty to fair processing, reiterated by the recent 2015 CJEU ruling [4]. Not just a nice-to-have, but State bodies’ responsibility to inform citizens when their personal data are used for purposes other than those for which those data had initially been consented and given. New legislation will not remove this legal duty.

How will it be achieved without public engagement?

Engagement is not PR

Failure to act on what you hear from listening to the public is costly.

Engagement is not done *to* people; don’t think that ‘explaining why we need the data and its public benefit’ will work. Policy makers must engage with fears and not seek to dismiss or diminish them, but acknowledge and mitigate them by designing technically acceptable solutions. Solutions that enable data sharing in a strong framework of privacy and ethics, not ones that see these concepts as barriers. Solutions that have social legitimacy because people support them.

Mr Hunt’s promised February 2014 opt out of anonymised data being used in health research, has yet to be put in place and has had immeasurable costs for delayed public research, and public trust.

How long before people consider suing the DH as data controller for misuse? From where does the arrogance stem that decides to ignore legal rights, moral rights and public opinion of more people than those who voted for the Minister responsible for its delay?

 

This attitude is what fails care.data and the harm is ongoing to public trust and to confidence for researchers’ continued access to data.

The same failure was pointed out by the public members of the tiny Genomics England public engagement meeting two years ago, in March 2014, called to respond to concerns over the lack of engagement and potential harm for existing research. The comms lead suggested that the new model of the commercialisation of the human genome in England, to be embedded in the NHS by 2017 as standard clinical practice, was like steam trains in Victorian England opening up the country to new commercial markets. The analogy was felt by the lay attendees to be, and I quote, ‘ridiculous.’

Exploiting confidential personal data for public good must have support and good two-way engagement if it is to get that support, and what is said and agreed must be acted on to be trustworthy.

Policy makers must take into account broad public opinion, and that is unlikely to be submitted to a Parliamentary consultation. (Personally, I first knew such processes existed only when care.data was brought before the Select Committee in 2014.) We already know what many in the public think about sharing their confidential data from the work with care.data and objections to third-party access and lack of consent. Just because some policy makers don’t like what was said doesn’t make that public opinion any less valid.

We must bring to the table the public voice from past but recent public engagement work on administrative datasharing [5], the voice of the non-research community, and the voice of those who are not the stakeholders who will use the data but the ‘data subjects’: the public whose data are to be used.

Policy Making must be built on Public Trust

Open policy making is not open just because it says it is. Who has been invited, participated, and how their views actually make a difference on content and implementation is what matters.

Adding controversial ideas at the last minute is terrible engagement; it makes the process less trustworthy and diminishes its legitimacy.

This last minute change suggests some datasharing will be dictated despite critical views in the policy making and without any public engagement. If so, we should ask policy makers on what mandate?

Democracy depends on social legitimacy. Once you lose public trust, it is not easy to restore.

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

In my next post I’ll look at some of the public engagement work done on datasharing to date, and think about ethics in how data are applied.

*************

References:

[1] The Royal Statistical Society data trust deficit

[2] “The social licence for research: why care.data ran into trouble,” by Carter et al.

[3] FAQs: Campaign for safe and ethical National Pupil Data

[4] CJEU Bara 2015 Ruling – fair processing between public bodies

[5] Public Dialogues using Administrative data (ESRC / ADRN)

img credit: flickr.com/photos/internetarchivebookimages/

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company, after the event.

Everyone’s been talking about TalkTalk and for all the wrong reasons. Data loss and a 15-year-old combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1] rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work in the [4] “National Cyber Security Programme” [NCSP]. What is the measurable outcome, particularly for TalkTalk customers and public confidence, from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask if government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong.  Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly not making preventative changes, apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government’s expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent, where necessary, for purposes beyond those we expect or that were explained when we submitted our data, and there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s  practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own lack of data understanding, and what good practice is in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing and management [11] of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, for example as care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers’ in The Shakespeare Review, have been ignored by government.

Where our personal data are not used well in government departments, the departments themselves seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because they pass data on to others. Practices such as giving individual-level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data use practices, the care.data debacle is evidence that not all its MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, with others getting access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, sensitive individual data from at least 8m children’s records, ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not just protection, is what we should champion. Rather than protection after the event, MPs and public must demand emphasis on prevention measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering, surveillance in any of its many forms, is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like conservative parents, not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, waiting to see whether our data are used safely in unsecured settings by some of the unknown partners data might be onwardly shared with, hoping we won’t find out and it won’t need to talk about it, or will it have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk: https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of individual-level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal, individual-level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population-wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Building Public Trust [5]: Future solutions for health data sharing in care.data

This wraps up my series of thoughts on ‘Building Public Trust’ since the NIB Bristol meeting on July 24th.

It has looked at how to stop chasing public trust and instead become organisations that can be trustworthy [part 1]. What behaviours make an organisation trustworthy [part 2]. Why fixing the Type 2 opt out is a vital first step [part 3], and why being blinded by ‘the benefits’ is not the answer [part 4], while giving balanced and fair explanations of programme purposes, commissioning and research is beneficial to communicate.

So I want to wrap up by suggesting how communications can be improved in content and delivery. Some ideas will challenge the current approach.

Here in part five: Future solutions, I suggest why aiming to “Build Public Trust” through a new communications approach may work better for the public than the past approach did. I’ll propose communications on care.data:

  • Review content: what would ethical, accurate content look like
  • Strengthen relationships for delivery: don’t attempt to rebuild trust where there is now none, but strengthen the channels that are already viewed by the public as trustworthy
  • Rethink why you communicate, and plan for when: all communications need to be delivered through a conversation, with real listening and action based upon it. Equal priority must be given to a communications plan for today and one for the future. It must set out a mechanism for future change communications now, before the pathfinders begin
  • Since writing this, the Leeds area CCGs have released their ‘data sharing’ comms leaflet. I have reviewed this in detail and give my opinions as a case study.

NIB workstream 4 underpins the NHS digital future, and aims to build and sustain public trust, delivering plans for consent-based information sharing and assurance of safeguards. It focuses on four areas: governance and oversight, project risks, consent and genomics:

“The work will begin in 2015 and is expected to include deliberative groups to discuss complex issues and engagement events, as well as use of existing organisations and ways to listen. There will also be a need to listen to professional audiences.”  [NIB work stream 4] [ref 1]

Today’s starting point for trust, the trust that enables two-way communication, could hardly be worse with both professional and public audiences. Communications are packaged in mistrust:

“Relations between the doctors’ union and Health Secretary Jeremy Hunt hit a new low following his announcement in July that he was prepared to impose seven-day working on hospital doctors in England.” [BBC news, Aug 15, 2015]

There appears to be divided opinion between politicians and civil servants.

Right now, the Department of Health seems to be sabotaging its own plans for success at every turn.

What reason can there be for denying debate in the public domain of the very plans it says are the life blood of the savings central to the NHS future?

Has the Department learned nothing from the loss of public and professional trust in 2014?

And as regards the public in engagement work, Hetan Shah, executive director of the Royal Statistical Society, said in 2014: “Our research shows a ‘data trust deficit’. In this data rich world, companies and government have to earn citizens’ trust in how they manage and use data – and those that get it wrong will pay the price.” [RSS Data Trust Deficit, lessons for policymakers, 2014] [2]

Where do the NIB work stream discussions want to reach by 2020?

“The emergence of genomics requires a conversation about what kind of consent is appropriate by 2020. The work stream will investigate a strand of work to be led by an ethicist.” [NIB work stream 4]

Why is genomics here in workstream 4, when data sharing for genomics is with active consent from volunteers? Why will a strand of work be led by an ethicist for this, and not for other work strands? Is there a gap in how their consent is managed today, or in how consent is to be handled for genomics in the future? It seems to me there is a gap between what is planned and what the public is being told here. It is high time for the overdue public debate on what future today’s population-wide data sharing programme is building. Good communication must ensure there are no surprises.

The words I underlined from the work stream 4 paper, highlight the importance of communication; to listen and to have a conversation. Despite all the engagement work of 2014 I feel that is still to happen. As one participant summed up later, “They seem hell bent on going ahead. I know they listened, but what did they hear?” [3]

care.data pathfinder practices are apparently ready to roll out communications materials: “Extraction is likely to take place between September and November depending on how fair processing testing communications was conducted” [Blackburn and Darwen HW]

So what will patient facing materials look like in content? How will they be rolled out?

Are pathfinder communications more robust than 2014 materials?

I hope the creatives will also think carefully about the intent of the communications to be delivered. Is it to fully and ethically inform patients about their choice whether to accept or opt out from changes in their data access, management, use and oversight? Or is the programme guidance to minimise the opt out numbers?

The participants are not signing up to a one-time, single-use marketing campaign, but to a lifetime of data use by third parties. Third parties whose roles and purposes remain loosely defined.

It is important when balancing this decision not to forget that data that are available but not used wisely could fail to mitigate risk; for example in identifying pharmaceutical harms.

At the same time to collect all data for all purposes under that ‘patient safety and quality’ umbrella theme is simplistic, and lends itself in some ways, to lazy communications.

Patients must also feel free and able to make an informed decision without coercion, which includes not making them feel guilty for opting out.

The wording used in the past was weighted towards the organisation’s preference.  The very concept of “data sharing” is weighted positively towards the organisation. Even though in reality the default is for data to be taken by the organisation, not donated by the citizen. In other areas of life, this is recognised as an unwilling position for the citizen to be in.

At the moment I feel that the scope of purposes, both today and in future, is not clearly enough defined in communications or plans for me personally to be able to trust them. Withholding information about how digital plans will fit into the broader NHS landscape and what data sharing will mean beyond 2020 appears, rightly or wrongly, suspicious. Department of Health, what are you thinking?

What the organisation says it will do, it must do and be seen to do, to be demonstrably trustworthy.

This workstream carries two important strands of governance and oversight which now need to be seen to happen: implementing the statutory footing of the National Data Guardian, talked about since October 2014, for which ‘at the earliest opportunity’ seems to have been rather long in coming; and ‘a whole system’ that respects patient choice. What will this look like, and how will it take into account the granular level of choices asked for at care.data listening events through 2014?

“By April 2016 NIB will publish, in partnership with civil society and patient leaders, a roadmap for moving to a whole-system, consent-based approach, which respects citizens’ preferences and objections about how their personal and confidential data is used, with the goal of implementing that approach by December 2020.”

‘By December 2020’ is still some time away, yet the care.data pathfinders roll on now regardless. The proof that what was said about data use is what actually happens to the data, and that what is communicated is trustworthy, lies in a system that can record and share consent decisions, “and can provide information on the use to which an individual’s data has been put. Over the longer term, digital solutions will be developed that automate as far as possible these processes.”
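A “data use report” of the kind this quote describes could, at its simplest, be an append-only audit log that an individual can query. This is a purely illustrative sketch under my own assumptions, not the programme’s actual design; all names are hypothetical.

```python
# Hypothetical sketch of a per-individual data use report: every release
# is recorded, so a person can later see who received their data and why.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class Release:
    recipient: str
    purpose: str
    released_on: date


class DataUseLog:
    def __init__(self) -> None:
        self._log: dict[str, list[Release]] = {}

    def record(self, patient_id: str, release: Release) -> None:
        # Append-only: releases are never silently removed from the record.
        self._log.setdefault(patient_id, []).append(release)

    def report_for(self, patient_id: str) -> list[Release]:
        # The individual's own view of every use of their data.
        return list(self._log.get(patient_id, []))


log = DataUseLog()
log.record("pseudonym-123",
           Release("university research group", "epidemiology study",
                   date(2015, 9, 1)))
```

The point of the sketch is the append-only property: trust depends on the record of uses being complete, not on promises about future behaviour.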

Until then what will underpin trust to show that what is communicated is done, in the short term?

Future proofing Communications must start now

Since 2013 the NHS England care.data approach appeared to want a quick data grab without long term future-proofed plans. Like the hook-up app approach to dating.

To enable the NIB 2020 plans and beyond, to safeguard research in the public interest, all communications must shape a trusted long term relationship.

To ensure public trust, communications content and delivery can only come after changes. Which is again why focusing only on communicating the benefits, without discussing the balance of risk, does not work. That’s what 2014 patient-facing communications tried.

In 2014 there were challenges on communications that were asked but not answered: how to reach those who are digitally excluded; how to reach those for whom reading text is a challenge; and how to decide who the target audience will be, considering people with delegated authority, young and old, as well as those who move in and out of GP care throughout their lives, such as some military. Has that changed?

In February 2014 Health Select Committee member Sarah Wollaston, now Chair, said: “There are very serious underlying problems here that need to be addressed.”

If you change nothing, you can expect nothing to change in public and professional feeling about the programme. Communications cannot in 2015 simply revamp the layout and packaging. There must be a change in content and in the support given in its delivery. Change means that you need to stop doing some things and start doing others.

In summary for future communications to support trust, I suggest:

1. STOP: delivering content that is biased towards what the organisation wants to achieve, often with a focus on the fair processing requirement, under a coercive veil of patient safety and research

START: communicating with an entirely ethics-based approach, reconsidering all patient data held at HSCIC and whether omitting ‘commercial use’, the balanced risks identified in the privacy impact assessment, and the statement ‘your name is not included’ is right.

2. STOP: releasing all types of health data held by HSCIC without deciding, for each type, whether the releases will deliver public confidence that your organisations are trustworthy.

START: communicate publicly which commercial companies, re-users and back office users would no longer be legally eligible to receive data, and why. Show which organisations that received data in the past will not in future.

3. STOP: the Department of Health and NHS England must stop undermining trust in its own leadership, through public communications that voice opposition to medical professional bodies. Doctors are trusted much more than politicians.

START: strengthen the public-GP relationship that is already well trusted. Strengthen the GP position that will in turn support the organisational-trust-chain that you need to sustain public support. 

4. STOP: delaying the legislative changes needed on the Data Guardian and penalties for data misuse

START: implement them and clearly explain them in Parliament and press

5. STOP: rushing through short-term short-cuts to get ‘some’ data while ignoring the listening exercises in which the public asked for choice.

START: design a thorough granular consent model fit for the 21stC and beyond, and explain to the public what it will offer; the buy-in for bona fide research will be much greater. (Be prepared to define ‘research’!)

6. STOP: saying that future practices have been changed and that security and uses are now more trustworthy than in the past. Don’t rush to extract data until you can prove you are trustworthy.

START: demonstrate to individuals who receives their data, through a data use report. Who future users are in practice can only be shown through a demonstrable tool, so that your word can be seen to be relied upon in practice. This will, I am convinced, lower the opt out rate.

Point 6 is apparently work-in-progress. [NIB2015, p58]

7. STOP: rolling out the current communications approach without any public position on which changes will mean we are notified before a new purpose or user of our data in future

START: design a thorough change communications model fit for the 21stC and beyond, and tell the public in THIS round of communications which changes of user or purpose will trigger a notification, to enable them to opt out BEFORE a future change. I.e. in a fictional future, if the government decided that the population-wide database should be further commercialised ‘for the purposes of health’, linked to the NHSBT blood donor registry and sold to genomic research companies, how would I as a donor be told, BEFORE the event?
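The notification trigger asked for in point 7 could, hypothetically, be sketched as a purpose registry that blocks release for any purpose added after sign-up until data subjects have been told. This is my own illustrative assumption, not any actual NHS mechanism; the names are invented.

```python
# Illustrative sketch: a new purpose or user never takes effect silently.
# Release is blocked until subjects are notified, BEFORE the event.
from dataclasses import dataclass, field


@dataclass
class PurposeRegistry:
    approved: set = field(default_factory=set)   # purposes communicated at sign-up
    pending: set = field(default_factory=set)    # new purposes awaiting notification

    def add_purpose(self, purpose: str) -> None:
        # Adding a purpose only queues it for notification.
        self.pending.add(purpose)

    def confirm_notified(self, purpose: str) -> None:
        # Only once subjects have been told, and given the chance to opt
        # out, does the purpose become eligible for release.
        self.pending.discard(purpose)
        self.approved.add(purpose)

    def may_release(self, purpose: str) -> bool:
        return purpose in self.approved


reg = PurposeRegistry(approved={"commissioning"})
reg.add_purpose("genomic research linkage")
# Blocked until the notification step has happened:
blocked_before_notice = not reg.may_release("genomic research linkage")
reg.confirm_notified("genomic research linkage")
```

The design choice being illustrated is simply that notification is a gate on release, not an optional courtesy after the fact.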

There are still unknowns in content and future scope that make communications difficult. If you don’t know what you’re saying, knowing how to say it is hard. But what is certain is that there are future changes planned in the programme, and how to communicate them with the public and professionals must be designed for now, so that what we are signed up for today stays what we signed up for.

In delivering messages about data sharing and the broader NHS, DH/NHS England should consider their relationships and behaviours carefully; all communication becomes relevant to trust.

Solutions cannot be thought of only in terms of tools, of what can be imposed on people, but of what can be achieved with people.

That’s people from the public and professionals and the programme working with the same understanding of the plans together, in a trusted long term relationship.

For more detail including my case study comments on the Leeds area CCGs comms leaflet, continue reading below.

Thanks for sharing in discussions of ideas in my five part post on Building public trust – a New Approach. Comments welcome.

Continue reading Building Public Trust [5]: Future solutions for health data sharing in care.data

Building Public Trust [2]: a detailed approach to understanding Public Trust in data sharing

Enabling public trust in data sharing is not about ‘communicating benefits’. For those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing follows on from my summary after the NIB Bristol event 24/7/15.

Trust is an important if invisible currency used in the two-way transactions between an organisation and people.

So far, there have been many interactions and listening events but much of what professionals and the public called for, remains undone and public trust in the programme remains unchanged since 2014.

If you accept that it is not public trust that needs to be built, but the tangible trustworthiness of an organisation, then you should also ask what the organisation needs to do to make that demonstrable change.

What’s today’s position on Public Trust of data storage and use?

Trust in the data sharing process is layered and dependent on a number of factors. Mostly [based on polls and public event feedback from 2014] “who will access my data and what will they use it for?”

I’m going to look more closely below at planned purposes: research and commissioning.

It’s also important to remember that trust is not homogeneous. Trust  is nuanced even within the single relationship between one individual and another. Trust, like consent, is stratified – you may trust the Post Office to deliver a letter or postcard, but sign up to recorded delivery for something valuable.

So for example when it comes to secondary uses data sharing, I might trust HSCIC with storing and using my health records for anonymous statistics, for analysis of immunisation and illness patterns for example. But as long as they continue to share with the Home Office, police or other loosely defined third parties [5], do I want them to have fully  identifiable data at all?

Those bodies have their own public trust issues at an all time low.

Mixing the legitimate users of health data with these back office punitive uses will invite opt outs from some people who would otherwise not. Some of the very groups who need the most health and social care understanding, research and care will be the groups who opt out if there is a possibility of police and Home Office access by the back door. Telling traveller communities what benefits care.data will bring them is wasted effort when they see NHS health data as a police-accessible register. I know. I’ve talked to some about it.

That position on data storage and use should be reconsidered if NHS England is serious that this is about health and for the benefit of individuals and communities’ well being.

What has HSCIC changed to demonstrate that  it is trustworthy?

A new physical secure setting is being built that will enable researchers to view research data but not take raw data away.

That is something they can control, and have changed, and it demonstrates they take the public seriously and we reciprocate.

That is great – demonstrable change by the organisation, inviting change in the public.

That’s practical, so what can be done on policy by NHS England/DH?

What else should be done to demonstrate policy is trustworthy?

Act on what the public and professionals asked for in 2014. [8]

Right now it feels as though in public communications that the only kind of relationship that is wanted on the part of the leadership is a one night stand.

It’s all about what the programme wants. Minimise the objections, get the data, and sneak out. Even when its leaders talk about some sort of ongoing consent model, the focus is still about ‘how to enable sharing data.’

This focus is the wrong one. If you want to encourage people to share, they need to know why: what’s in it for them, and why do you want it? What collecting the data is about is still important to explain, and specifically each time the scope changes, if you are doing it fairly.

Remember. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is. 

What is the policy for the future of primary care research?

The CPRD already enables primary care GP data to be linked with secondary data for research. In fact it already links more items from GP-held data than current care.data plans to extract. So what benefit will care.data offer to research that is not already available today?

Simply having ever more data, stored in more places will not make us wiser. Before it’s collected repeatedly, it is right to question why.

What do we have collected already? How is it used? Where are the gaps in what we want to achieve through the knowledge we could gain? It’s NOT simply about filling in what gaps exist in what data we could gather. Understand the purposes and what will be gained, to see if it’s worth the effort. Prioritise. ‘Collect it all’ is not a solution.

I had thought that the types of data to be collected in care.data were clear, and how it differs from direct care was clear. But the Bristol NIB meeting demonstrated a wide range of understanding in NHS and CCG staff, Local Authority staff, IT staff, IG professionals, data providers and other third parties.  Data for secondary purposes are not to be conflated with direct care.

But that’s not what care.data sharing is about. So where to start with public trust, asked the NIB Bristol #health2020 meeting?

Do you ignore the starting point or tailor your approach to it?

“The NHS is at a crossroads and needs to change and improve as it moves forward. That was the message from NHS England’s Chief Executive Simon Stevens as a Five Year Forward View for the NHS was launched.”  [1] [NHS England, Oct 2014]

As the public is told over and over again that change is vital to the health of a sustainable NHS, a parallel public debate rages, whether the policy-making organisations behind the NHS – the commissioning body NHS England, the Department of Health and Cabinet Office – are serious about the survival of universal health and care provision, and about supporting its clinicians.

It is against this backdrop, and under the premise that obtaining patient data for centralised secondary uses is do or die for the NHS, that the NIB #health2020 has set out [2] work stream 4: “Build and sustain public trust: Deliver roadmap to consent based information sharing and assurance of safeguards”

“Without the care.data programme, the health service will not have a future, said Tim Kelsey, national director for patients and information, NHS England.” [3]

 

Polls say [A] nearly all institutions suffer from a ‘trust in data deficit’: trust in them to use data appropriately is lower than trust in the organisation generally.

Public trust in what the Prime Minister says on health is low.

Trust in the Secretary of State for Health is possibly at an all time low, with: “a bitter divide, a growing rift between the Secretary of State for Health and the medical profession.” [New Statesman, July 2015]

This matters. care.data needs the support of professionals and public.

ADRN research showed multiple contributing factors: “Participants were also worried about personal data being leaked, lost, shared or sold by government departments to third parties, particularly commercial companies. Low trust in government more generally seemed to be driving these views.” [Dialogue on data]

It was interesting to see all the same issues as reflected by the public in care.data listening events, asked from the opposite perspective from data users.

But it was frustrating to sit at the Bristol NIB #health2020 event and discuss questions around the same issues on data sharing already discussed at care.data events over the last 18 months.

Nothing substantial has changed other than HSCIC’s physical security for data storage.

It is frustrating knowing that these change and communications issues will keep coming back again and again if not addressed.

Personally, I’m starting to lose trust that there is any real intention for change, if senior leadership is unwilling to address this properly and change themselves.

To see a change in Public Trust, do what the public asked to see change: On Choice

At every care.data meeting I attended in 2014, people asked for choice.

They asked for boundaries between the purposes of data uses, real choice.

Willingness for their information to be used by academic researchers in the public interest does not equate to being willing for it to be used by a pharmaceutical company for their own market research and profit.

The public understands these separations well. To say they do not underestimates people and does not reflect public feeling. Anyone attending 2014 care.data events has heard many people discuss this. They want a granular consent model.

This would offer a red line between how data are used for what purposes.

Of the data-sharing organisations today, some are trusted and others are not. A granular consent approach would give people a choice: a red line between who gets access to their data.

This choice of selective use would mean fewer people opt out from all purposes, leaving more data available, for research for example.
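A granular consent model of the kind the public asked for could be sketched roughly as follows. This is purely an illustration: the purpose categories, field names and check function are hypothetical, not taken from any NHS specification.

```python
from dataclasses import dataclass, field

# Hypothetical purpose categories reflecting the separations people drew
# at care.data listening events (illustrative names only).
PURPOSES = {"direct_care", "public_interest_research", "commissioning", "commercial"}

@dataclass
class ConsentPreferences:
    """One patient's granular data-sharing choices."""
    allowed: set = field(default_factory=set)

    def permits(self, purpose: str) -> bool:
        # Any purpose not explicitly permitted is a red line: no access.
        return purpose in self.allowed

# A patient happy to support research in the public interest,
# but not commercial re-use of their data:
prefs = ConsentPreferences(allowed={"direct_care", "public_interest_research"})
assert prefs.permits("public_interest_research")
assert not prefs.permits("commercial")
```

The point of the sketch is the default: rather than an all-or-nothing opt out, each purpose is a separate switch, so refusing one use need not deny all the others.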

To see a change in Public Trust, do what the public asked to see: Explain your purposes more robustly

Primarily this data is to be used, and kept indefinitely, for commissioning purposes. Research wasn’t included as a purpose for the care.data gathering in the planned specifications for well over a year. [After research outcry]

Yet specific to commissioning, the Caldicott recommendations [3] were very clear: commissioning purposes were insufficient and unlawful grounds for sharing fully identifiable data, a position opposed by NHS England’s Commissioning Board:

“The NHS Commissioning Board suggested that the use of personal confidential data for commissioning purposes would be legitimate because it would form part of a ‘consent deal’ between the NHS and service users. The Review Panel does not support such a proposition. There is no evidence that the public is more likely to trust commissioners to handle personal confidential data than other groups of professionals who have learned how to work within the existing law.”

NHS England seems unwilling to change this position, despite the professional bodies’ and the public’s opposition to sharing fully identifiable data for commissioning purposes [care.data listening events 2014]. Is it any wonder that they keep hitting the same barrier? More people don’t want this to happen than want it. Something’s gotta give.

See the GPES Customer Requirements specification from March 2013, v2.1, which states on page 11: “…for commissioning purposes, it is important to understand activity undertaken (or not undertaken) in all care settings. The “delta load” approach (by which only new events are uploaded) requires such data to be retained, to enable subsequent linkage.”
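The “delta load” point can be made concrete with a small sketch: if each upload sends only new events, earlier events must already be held centrally for linkage across uploads to work, which is why indefinite retention follows from the design. The identifiers and event strings here are invented for illustration; they do not reflect any actual GPES data format.

```python
# Central store keyed by a persistent patient identifier (illustrative only).
central_store = {}

def delta_load(new_events):
    """Upload only new events; linkage relies on previously retained events."""
    for patient_id, event in new_events:
        central_store.setdefault(patient_id, []).append(event)

# First upload: a GP event for one (hypothetical) patient.
delta_load([(123, "2013-03-01 GP visit")])

# A later delta upload sends only the new hospital event...
delta_load([(123, "2013-06-10 hospital admission")])

# ...but linking the two episodes is only possible because the
# earlier event was kept, not discarded after processing.
assert central_store[123] == ["2013-03-01 GP visit",
                              "2013-06-10 hospital admission"]
```

The sketch shows the trade-off the specification implies: the efficiency of uploading only deltas is bought with permanent central retention of the history.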

The public has asked for red lines to differentiate between the purposes of data uses. NHS England and Department of Health policy seem unwilling to do so. Why?

To see a change in Public Trust, do what the public asked to see: Red lines on policy of commercial use – and its impact on opt out

The public has asked for red lines outlawing commercial exploitation of their data. Though it was said this had changed, in practice the change is hard to see. Department of Health policy seems unwilling to be clear, because the Care Act 2014 purposes remained loose. Why?

As second best, the public asked for the choice not to have their data used at all for secondary purposes, and was offered an opt out.

The NHS England leaflet and the Department of Health’s Secretary of State publicly promised this, but it has not been implemented, and to date no public announcement has been made on when it will be respected. Why?

Trust does not exist in a vacuum. What you say and what you actually do matter. Policy and practice are co-dependent. Public trust depends on your organisations being trustworthy.

Creating public trust is not the government, the DH or NIB’s task ahead. They must instead focus on improving their own competency, honesty and reliability and through those, they will demonstrate that they can be trusted.

That the secondary purposes opt out has not been respected does not demonstrate those qualities.

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

How will they do that?

Let the DH/NHS England and organisations in policy and practice address what they themselves will stop and start doing to bring about change in their own actions and behaviours.

Communications change request: Start by addressing the current position, NOT what the change will bring. You must move people along the curve, not dump them with a fait accompli and wonder why the reaction is so dire.


Vital for this is the current opt out; what was promised and what was done.

The secondary uses opt out must be implemented with urgency.

To see a change in Public Trust, you need to take action. The programme needs to do what the public asked to see change: on granular consent, on commercial use and on defined purposes.

And to gather suggested actions, start asking the right questions.

Not ‘how do we rebuild public trust?’ but ‘how can we demonstrate that we are trustworthy to the public?’

1. How can a [data-sharing] org demonstrate it is trustworthy?
2. Identify why people feel confident their trust is well placed.
3. Why do clinical professionals feel confident in any org?
4. What would harm the organisational-trust-chain in future?
5. How will the org-trust-chain be positively maintained in future?
6. What opportunities will be missed if that does not happen?
(identify value)

Yes, the concepts are close, but how it is worded defines what is done.

These apparently small differences make all the difference in how people provide you with ideas, and how you harness them into real change and improvement.

Only then can you start understanding why “communicating the benefits” has not worked, and how that should shape future communications materials.

From this you will find it much easier to target actual tasks, and short- and long-term do-able solutions, than an open discussion will deliver. Doing should include thinking/attitudes as well as actions.

This will lead to communications messages that are concrete, not woolly. More about that in the next posts.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

This is Part two: a New Approach is needed to understanding Public Trust. For those interested in a detailed approach on trust: what practical and policy steps influence trust, on research and commissioning. Trust is not homogeneous. Trust is nuanced even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from. What behaviours influence trust and how can we begin to see them demonstrated. Mr Kelsey discusses consent and opt out. Fixing what has already been communicated is vital before new communications get rolled out. To tailor the content of public communications for public trust and credibility, the programme must be clear what is missing and what needs to be filled in. #Health2020 Bristol NIB meeting.

Part four: “Communicate the Benefits” won’t work – How communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and fixes.

Part five: Future solutions – why a new approach may work better for future trust. Not to attempt to rebuild trust where there is now none, but to strengthen what is already trusted and fix today’s flawed behaviours – the honesty and reliability that are vital to future-proofing public trust.

 

####

References:

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] Workstream 4: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/442829/Work_Stream_4.pdf

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

[11] Coin Street, care.data advisory meeting, September 6th 2014: https://storify.com/ruth_beattie/care-data-advisory-group-open-meeting-6th-septembe

[12] Public questions unanswered: https://jenpersson.com/pathfinder/

care.data: the economic value of data versus the public interest?

 This is a repost of my opinion piece published in StatsLife in June 2015.

The majority of the public supports the concept of using data for public benefit.[1] But the measurable damage done in 2014 to the public’s trust in data sharing [2], and the reasons for it, are an ongoing threat to its achievement.

Rebuilding trust and the public legitimacy of government data gathering could be a task for Sisyphus, given the media atmosphere clouded by the smoke and mirrors of state surveillance. As Mark Taylor, chair of the NHS’s Confidentiality Advisory Group wrote when he considered the tribulations of care.data [3] ‘…we need a much better developed understanding of ‘the public interest’ than is currently offered by law.’

So what can we do to improve this as pilot sites move forward, and for other research? Can we consistently quantify the value of the public good and account for intangible concerns and risks alongside demonstrable benefits? Do we have a common understanding of what the public feels is in its own best interests?

And how are shifting public and professional expectations to be reflected in the continued approach to accessing citizens’ data, with the social legitimacy upon which research depends?

Listening and lessons learned

Presented as an interval to engage the public and professionals, the 18-month-long pause in care.data involved a number of ‘listening’ events. I attended several of these to hear what people were saying about the use of personal health data. The three biggest areas of concern raised frequently [4] were:

  • Commercial companies’ use and re-use of data
  • Lack of transparency and control over who was accessing data for what secondary purposes, and
  • Potential resulting harms: from data inaccuracy, loss of trust and confidentiality, and fear of discrimination.

It’s not the use of data per se that the majority of the public raises objection to. Indeed many people would object if health data were not used for research in the public interest. Objections were more about the approach to this in the past and in the future.

There is a common understanding of what bona fide research is, how it serves the public interest, and polls confirm a widespread acceptance of ‘reasonable’ research use of data. The HSCIC audit under Sir Nick Partridge [5] acknowledged that some past uses of shared raw data had not always met public expectations of what was ‘reasonable’. The new secure facility should provide a safe setting for managing this better, but open questions remain on governance and transparency.

As one question from a listening event succinctly put it [6]:

‘Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.’

Using the information gleaned from data was often seen as exploitation when used in segmenting the insurance markets, consumer market research or individual targeting. There is also concern, even outright hostility, towards raw health data being directly sold, re-used or exchanged as a commodity – regardless of whether this is packaged as ‘for profit’ or ‘covering administrative costs’.

Add to that, the inability to consent to, control or find out who uses individual level data and for what purpose, or to delete mistakes, and there is a widespread sense of disempowerment and loss of trust.

Quantifying the public perception of care.data’s value

While the pause was to explain the benefits of the care.data extraction, it actually seemed clear at meetings that people already understood the potential benefits. There is clear public benefit to be gained for example, from using data as a knowledge base, often by linking with other data to broaden scientific and social insights, generating public good.

What people were asking was what new knowledge would be gained that isn’t gathered from non-identifiable data already. Perhaps more tangible, yet less discussed at care.data events, are the economic benefits for commissioning use: using data as business intelligence to inform decisions in financial planning and cost cutting.

There might be measurable economic public good from data, from outside interests who will make a profit by using data to create analytic tools. Some may even sell information back into the NHS as business insights.

Care.data is also to be an ‘accelerator’ for other projects [7]. But it is hard to find publicly available evidence to a) support the economic arguments for using primary care data in any future projects, and b) be able to compare them with the broader current and future needs of the NHS.

A useful analysis could find that potential personal benefits and the public good overlap, if the care.data business case were made available by NHS England in the public domain. At a time when the NHS budget is rarely out of the media, it seems a no-brainer that this should be made open.

Feedback consistently shows that making money from data raises more concern over its uses. Who all future users might be remains open as the Care Act 2014 clause is broadly defined. Jamie Reed MP said in the debate [8]: ‘the new clause provides for entirely elastic definitions that, in practice, will have a limitless application.’

Unexpected uses and users of public data have created many of its historical problems. But has the potential future cost of ‘limitless’ applications been considered in the long term public interest? And what of the confidentiality costs [9]? The NHS’s own Privacy Impact Assessment on care.data says [10]:

‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

Who has quantified the cost of that loss of confidence, and have public and professional opinions been accounted for in any cost/benefit calculations? All these tangible and intangible factors should be measured in calculating its value in the public interest, asking: ‘what does the public want?’ It is, after all, our data and our NHS.

Understanding shifting public expectations

‘The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.’ – David Carr, policy adviser at the Wellcome Trust [11]

To rebuild trust in data sharing, individuals need the imbalance of power corrected, so they can control ‘their data’. The public was mostly unaware health records were being used for secondary purposes by third parties, before care.data. In February 2014, the secretary of state stepped in to confirm an opt-out will be offered, as promised by the prime minister in his 2010 ‘every patient a willing research patient’ speech.

So leaving aside the arguments for and against opt-in versus opt-out (and that, for now, it is not technically possible to apply the 700,000 opt-outs already made), the trouble is, it’s all or nothing. By not offering any differentiation between purposes, the public may feel forced to opt out of secondary data sharing altogether, denying all access to all their data even if they want to permit some uses and not others.

Defining and differentiating secondary uses and types of ‘research purposes’ could be key to rebuilding trust. The HSCIC can disseminate information ‘for the purposes of the provision of health care or adult social care, or the promotion of health’. This does not exclude commercial use. Cutting away commercial purposes which appear exploitative from purposes in the public interest could benefit the government, commerce and science if, as a result, more people would be willing to share their data.

This choice is what the public has asked for at care.data events, at other research events [12] and in polls, but to date it has yet to see any move towards it. I feel strongly that the government cannot continue to ignore public opinion and assume its subjects are creators of data, willing to be exploited, without expecting further backlash. Should a citizen’s privacy become a commodity with a price tag on it, if it is a basic human right?

One way to protect that right is to require an active opt-in to sharing. With the ongoing renegotiation of public rights and data privacy at EU level, consent is no longer just a question best left ignored in the Pandora’s box of ethics, as it has been for the last 25 years in the secondary use of hospital data. [13]

The public has a growing awareness, differing expectations, and different degrees of trust around data use by different users. Policy makers who ignore these expectations risk continuing to build on a shaky foundation and jeopardising the future data sharing infrastructure. Profiting at the expense of public feeling and ethical good practice is an unsustainable status quo.

Investing in the public interest for future growth

The care.data pause has revealed differences between the thinking of government, the drivers of policy, the research community, ethics panels and the citizens of the country. This is not only about what value we place on our own data, but how we value it as a public good.

Projects that ignore the public voice, that ‘listen’ but do not act, risk their own success and by implication that of others. And with it they risk the public good they should create. A state which allows profit for private companies to harm the perception of good research practice sacrifices the long term public interest for short term gain. I go back to the words of Mark Taylor [3]:

‘The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.’ 

The economic value of data, personal rights and the public interest are not opposed to one another, but have synergies and a co-dependency. The public voice from care.data listening could positively help shape a developing consensual model of data sharing if the broader lessons learned are built upon in an ongoing public dialogue. As Mark Taylor also said, ‘we need to do this better.’

*******

[1] According to various polls and opinions gathered from my own discussions and attendance at care.data events in 2014 [refs: 2, 4, 6, 12]

[2] The data trust deficit, work by the Royal Statistical Society in 2014

[3] M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1 http://script-ed.org/?p=1377

[4] Communications and Change – blogpost https://jenpersson.com/care-data-communications-change/

[5] HSCIC audit under Sir Nick Partridge https://www.gov.uk/government/publications/review-of-data-releases-made-by-the-nhs-information-centre

[6] Listening events, NHS Open Day blogpost https://jenpersson.com/care-data-communications-core-concepts-part-two/

[7] Accelerator for projects mentioned include the 100K Genomics programme https://www.youtube.com/watch?v=s8HCbXsC4z8

[8] Hansard http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140311/debtext/140311-0002.htm

[9] Confidentiality Costs; StatsLife http://www.statslife.org.uk/opinion/1723-confidentiality-costs

[10] care.data privacy impact assessment Jan 2014 [newer version has not been publicly released] http://www.england.nhs.uk/wp-content/uploads/2014/01/pia-care-data.pdf

[11] Wellcome Trust http://blog.wellcome.ac.uk/2015/04/08/sharing-research-data-to-improve-public-health/

[12]  Dialogue on Data – Exploring the public’s views on using linked administrative data for research purposes: https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx

[13] HSCIC Lessons Learned http://www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

The views expressed in this article originally published in the Opinion section of StatsLife are solely mine, the original author. These views and opinions do not necessarily represent those of The Royal Statistical Society.

The nhs.uk digital platform: a personalised gateway to a new NHS?

In recent weeks, rebranding the poverty definitions and the living wage in the UK deservedly received more attention than the rebrand of the website NHS Choices into ‘nhs.uk’.

The site, which will be available only in England and Wales despite its domain name, will be the doorway into a personalised digital NHS offering.

As the plans proceed without public debate, I took some time to consider the proposal announced through the National Information Board (NIB), because it may be a gateway to a whole new world in our future NHS. And if not, will it be a big splash of cash that creates nothing more than a storm in a teacup?

In my previous post I’d addressed some barriers to digital access. Will this be another? What will it offer that isn’t on offer already today and how will the nhs.uk platform avoid the problems of its predecessor HealthSpace?

Everyone it seems is agreed, the coming cuts are going to be ruthless. So, like Alice, I’m curious. What is down the rabbit hole ahead?

What’s the move from NHS Choices to nhs.uk about?

The new web platform nhs.uk would invite users to log on using a system that requires identity verification; if compulsory, this would be another barrier to access simply from a convenience point of view, even leaving digital security risks aside.

What will nhs.uk offer to incentivise users, and what benefit will it offer as a trade-off against these risks, for them to go down the new path into the unknown and like it?

“At the heart of the domain, will be the development of nhs.uk into a new integrated health and care digital platform that will be a source of access to information, directorate, national services and locally accredited applications.”

In that there is nothing new compared with the information, top-down governance and signposting done by NHS Choices today.

What else?

“Nhs.uk will also become the citizen’s gateway to the creation of their own personal health record, drawing on information from the electronic health records in primary and secondary care.”

nhs.uk will be an access point to patient personal confidential records

Today’s Patient Online programme, we are told, offers 97% of patients access to their own GP-created records. So what will nhs.uk offer beyond what is supposed to be on offer already today? Adding wearables data into the health record is already possible for some EMIS users, so again, that won’t be new. It does state it will draw on both primary and secondary records, which means achieving some sort of interoperability to show both hospital systems data and GP records. How will the platform do this?

Until care.data, many people didn’t know their hospital record was stored anywhere outside the hospital. In all the care.data debates the public was told that HES/SUS was not like a normal record in the sense we think of it. So what system will secondary care records come from? [Some places may have far to go. My local hospital pushes patients round with beige paper folders.] The answer appears to be an unpublished known, or an unknown.

What else?

nhs.uk will be an access point to tailored ‘signposting’ of services

In addition to access to your personal medical records, in the new “pull not push” process the nhs.uk platform will also offer information and services, in effect ‘advertising’ local services, to draw users to want to use it rather than forcing its use. And through the power of web tracking tools combined with log-in, it can all be ‘tailored’ or ‘targeted’ to you, the user.

“Creating an account will let you save information, receive emails on your chosen topics and health goals and comment on our content.”

Do you want to receive emails on your chosen topics or comment on content today? How does it offer more than can already be done by signing up now to NHS Choices?

NHS Choices today already offers information on local services, on care provision, and a symptom checker.

What else?

Future nhs.uk users will be able to “Find, Book, Apply, Pay, Order, Register, Report and Access,” according to the NIB platform headers.


“Convenient digital transactions will be offered like ordering and paying for prescriptions, registering with GPs, claiming funds for treatment abroad, registering as an organ and blood donor and reporting the side effects of drugs. This new transactional focus will complement nhs.uk’s existing role as the authoritative source of condition and treatment information, NHS services and health and care quality information.

“This will enable citizens to communicate with clinicians and practices via email, secure video links and fill out pre-consultation questionnaires. They will also be able to include data from their personal applications and wearable devices in their personal record. Personal health records will be able to be linked with care accounts to help people manage their personal budget.”

Let’s consider those future offerings more carefully.

Separating out the transactions that for most people will be one-off, extremely rare or never events (my blue) leaves other activities which you can already do, or will be able to do, via the patient online programme (in purple).

The question is: although video and email are not yet widespread, where they do work today and would in future, would they not be done via a GP practice system, not a centralised service? Or is the plan not that you could have an online consultation with ‘your’ named GP through nhs.uk, but perhaps just with ‘any’ GP from a centrally provided GP pool? Something like this?

That leaves two other things, which are both payment tools (my bold).

i. digital transactions will be offered like ordering and paying for prescriptions
ii. …linked with care accounts to help people manage their personal budget.”

Is the core of the new offering about managing money at individual and central level?

Beverly Bryant, Director of Strategic Systems and Technology at NHS England, said at the #kfdigi2015 June 16th event that implementing these conveniences had cost-saving benefits as well: “The driver is customer service, but when you do it it actually costs less.”

How are GP consultations to cost less – significantly less, enough to be really cost-effective compared with the central platform that enables them – when the GP’s time is the most valuable part, and the time spent on the patient consultation, paperwork and referral, for example, remains unchanged?

That most valuable part to the patient may be seen as what is most costly to ‘the system’.

If the emphasis is on the service saving money, it’s not clear what is in it for people to want to use it, and it risks becoming another HealthSpace: a high-cost, top-down IT rollout without a clear customer-driven need.

The stated aim is that it will personalise the user content and experience.

That gives the impression that the person using the system will get access to information and benefits unique and relevant to them.

If this is to be something patients want to use (pull) and are not to be forced to use (push) I wonder what’s really at its core, what’s in it for them, that is truly new and not part of the existing NHS Choices and Patient online offering?

What kind of personalised tailoring do today’s NHS Choices Ts&Cs sign users up to?

“Any information provided, or any information the NHS.uk site may infer from it, are used to provide content and information to your account pages or, if you choose to, by email.  Users may also be invited to take part in surveys if signed up for emails.

“You will have an option to submit personal information, including postcode, age, date of birth, phone number, email address, mobile phone number. In addition you may submit information about your diet and lifestyle, including drinking or exercise habits.”

“Additionally, you may submit health information, including your height and weight, or declare your interest in one or more health goals, conditions or treatments. “

“With your permission, academic institutions may occasionally use our data in relevant studies. In these instances, we shall inform you in advance and you will have the choice to opt out of the study. The information that is used will be made anonymous and will be confidential.”

Today’s NHS Choices terms and conditions say that “we shall inform you in advance and you will have the choice to opt out of the study.”

If that happens already, and the NHS is honest about its intent to give patients that opt-out right over whether to take part in studies using data gathered from registered users of NHS Choices, why is it failing to do so for the 700,000 objections to secondary use of personal data via HSCIC?

If the future system is all about personal choice, the NIB should perhaps start by enforcing action over the choice the public has already made in the past.

Past lessons learned – platforms and HealthSpace

In the past, the previous NHS personal platform, HealthSpace, came in for some fairly straightforward criticism, including that it offered too little functionality.

The Devil’s in the Detail remarks are as relevant today on what users want as they were in 2010. It looked at the then-available Summary Care Record (prescriptions, allergies and reactions) and the web platform HealthSpace, which tried to create a way for users to access it.

Past questions from HealthSpace remain unanswered for today’s care.data, or indeed the future nhs.uk data: What happens if there is a mistake in the record and the patient wants it deleted? How will access be given to third-party carers/users on behalf of individuals without the capacity to consent to access to their records?

Reasons given by non-users of HealthSpace included lack of interest in managing their health in this way, a perception that health information was the realm of health professionals and lack of interest or confidence in using IT.

“In summary, these findings show that ‘self management’ is a much more complex, dynamic, and socially embedded activity than original policy documents and technical specifications appear to have assumed.”

What lessons have been learned? People today are still questioning the value of a centrally imposed system. Are they being listened to?

Digital Health reported that Maurice Smith, GP and governing body member for Liverpool CCG, speaking in a session on self-care platforms at the King’s Fund event, said that driving people towards one national hub for online services was not an option he would prefer, and that while he had no objection to a national portal, “if you try drive everybody to a national portal and expect everybody to be happy with that I think you will be disappointed.”

How will the past problems that hit Healthspace be avoided for the future?

How will the powers-that-be avoid repeating the same problems in the ongoing rollout of care.data and future projects? I have put this same question to NHS England/NIB leaders three times in the last year and it remains unanswered.

How will you tell patients in advance of any future changes who will access their data records behind the scenes, for what purpose, to future proof any programmes that plan to use the data?

One of the Healthspace 2010 concerns was: “Efforts of local teams to find creative new uses for the SCR sat in uneasy tension with implicit or explicit allegations of ‘scope creep’.”

Any programme using records can’t ethically sign users up to one thing and change it later without informing them before the change. Who will pay for that, and how will it be done? For the care.data pilots, I’d want that answered before pilot communications start.

As an example of scope creep in ‘what’ is shared, future plans will see ‘social care flags added’ to the SCR, states p.17 of the NIB 2020 timeline. And what is the ‘discovery for the use of genomic data complete’ on p.11 about? Scope creep in ‘who’ will access records is very current: recent changes allow pharmacists to access the SCR, yet the change went by with little public discussion. Will they in future see social care flags or mental health data under their SCR access? Do I trust the chemist as I trust a GP?

Changes without adequate public consultation and communication cause surprises. Bad idea. Sir Nick Partridge said ensuring ‘no surprises’ is key to citizens’ trust after the audit of HES/SUS data uses. He is right.

At the heart of this nhs.uk plan is the need for it to be used by people, and by enough people to make the investment worthwhile against its cost. That is what HealthSpace failed to achieve.

The ‘change you want to see’ doesn’t address the needs of the user as a change issue (slide 4). This is all imposed change, not user-need-driven change.

Dear NIB, doing it this way seems to ignore the learning from HealthSpace. The evidence shown is self-referring, to Dr Foster and NHS Choices. The only other two examples listed are from Wisconsin and the Netherlands, hardly comparable models of UK lifestyle or healthcare systems.

What is really behind the new front door of the nhs.uk platform?

The future nhs.uk looks very much as though it seeks to provide a central front door to data access, in effect an expanded Summary Care Record (GP and secondary care records) – all medical records for direct care – together with a way for users to add their own wider user data.

Will nhs.uk also allow individuals to share their data with digital service providers of other kinds through the nhs.uk platform and apps? Will their data be mined to offer a personalised front door of tailored information and service nudges? Will patients be profiled to know their health needs, use and costs?

If yes, then who will be doing the mining and who will be using that data for what purposes?

If not, then what value will this service offer if it is not personal?

What will drive the need to log on to yet another new platform, compared with using today’s existing services: patient online to access our health records, video tools to reach GPs, and, without any log-in requirement, NHS Choices to browse similar content of information and nudges towards local services?

If this is core to the future of our “patient experience” of the NHS, the public should be given the full and transparent facts: where is the public benefit, what is the business case for nhs.uk, and what lies behind the change expected via online GP consultations?

This NIB programme is building the foundation of the NHS offering for the next ten years. What kind of NHS are the NIB and NHS England planning for our children and our retirement through their current digital designs?

If there is a significant difference behind the new nhs.uk platform, a key change from what HealthSpace offered and separate from what patient online already offers, it appears to be around managing cost and payments, not delivering any better user service.

Managing more of our payments, with pharmacies and personalised budgets, would reflect the talk of a push towards patient-responsible self-management as the direction of travel for the NHS as a whole.

More use of personal budgets is, after all, what Simon Stevens called a “radical new option”, and we would expect to see “wider scale rollout of successful projects is envisaged from 2016-17”.

When the system has finely drawn profiles of its users, what effect will that have on individuals in our universal, risk-shared system? Will a wider rollout of personalised budgets mean more choice, or could it start to mirror a private insurance system, in which a detailed user profile would determine your level of risk, and a personal budget, once used up, would mean no more service?

What I’d like to see and why

To date, there is a poor track record of transparency on sharing central IT/change programme business plans. While one thing is said, another happens in practice. Can that be changed? Why all the effort on NHS Citizen and ‘listening’, if the public is not to be engaged in ‘grown up debate‘ to understand the single biggest driver of planned service changes today: cost?

It is patronising in the extreme to prevent the public from seeing plans which spend public money.

We risk a wasteful, wearing repeat of the past top down failure of an imposed NPfIT-style HealthSpace, spending public money on a project which purports to be designed to save it.

To understand the practical future, we can look back to avoid what didn’t work, and compare it with current plans. I’d suggest the NIB spell out very clearly what the failures of HealthSpace were, and why nhs.uk is different.

If the site will offer an additional new pathway to access services beyond those we already have, it will cost more, not less. If a genuine cost reduction compared with today is expected, where precisely will it come from?

I’d suggest you publish the detailed business plan for the nhs.uk platform and have the debate up front. Not only the headline numbers towards the end of these slides, but where and how it fits together in the big picture of Stevens’ “radical new option”. This is public money and you *need* the public on side for it to work.

Publish the business cases for the NIB plans before the public engagement meet ups, because otherwise what facts will opinion be based on?

What discussion can be of value without them, when leadership continually tells us those very details are at the crux of the needed change: the affordability of the future of the UK health and care system?

Now, as with past projects, The Devil’s in the Detail.

***

NIB detail on nhs.uk and other concepts: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/437067/nib-delivering.pdf

The Devil’s in the Detail: Final report of the independent evaluation of the Summary Care Record and HealthSpace programmes 2010

Digital revolution by design: infrastructures and the world we want

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge

This is Part 4.  Infrastructures and the world we want

At a high level, physical network infrastructures enable data transfer from one place to another, and average users perceive little of it.

In the wider world of Internet infrastructure, this week might be looked back on as, to use a horrible cliché, a game changer. A two-tier Internet traffic system could be coming to Europe, which would destroy a founding principle of equality: all traffic is created equal.

In other news, Facebook announced it will open an office in the toe of Africa, a foothold on a potential market of a billion people.

Facebook’s Internet.org initiative sees a further ‘magnificent seven’ companies working together. Two of them, Ericsson and Nokia, will between them have “an effective lock down on the U.S. market,” unless another viable network competitor emerges. And massive reach worldwide.

In Africa today there is a hodgepodge of operators, and I’ll be interested to see how much effect the boys ganging up under the protection of everybody’s big brother, Facebook, will have on local markets.

And they’re not alone in wanting in on African action.

Whatever infrastructures China is building on and under the ground of the African continent, and whatever ludicrous showcase gifts it donates, how it goes about it has not gone unnoticed. Chinese working practices and environmental standards can provoke local disquiet.

Will Facebook’s decision makers shape up to offer Africa an ethical package that could include not only a social network, but a physical one, managing content delivery in the inner workings of tubes and pipes?

In Europe the data connections within and connecting the continent are shifting, as TTIP, CETA and TISA shape how our data and knowledge will be shared or reserved or copyrighted by multinational corporations.

I hope we will ensure transparency is designed into these supra-national agreements on private ownership of public firms.

We don’t want to find commercial companies withholding information, such as their cyber security planning and infrastructure investments, in the name of commercial protectionism but at a public cost.

The public has opportunities now, as these agreements are being drawn up, that we may not get again soon.

Not only for the physical constructions, the often networked infrastructures, but for the intangible infrastructures of principles and power, the co-dependencies around a physical system: the legal and ethical infrastructures of ownership, governance and accountability.

The Open Data Institute has just launched a call for the promotion of understanding around our own data infrastructures:

“A strong data infrastructure will increase interoperability and collaboration, efficiency and productivity in public and private sectors, nationally and internationally.”

Sounds like something we want to get right in, and outside, the UK.

Governance of data is physically geographical, through country-specific legislation, as well as supra-national, such as European-wide data protection.

These are in some ways outdated legal concepts in a borderless digital age, but they are at least one area over which there is manageable oversight, and through which citizens should be able to call companies and State to account.

Yet that accountability is questionable when laws seem to be bypassed under the banner of surveillance.

As a result people have somewhat lost trust in national bodies to do the right thing. We want to share data for public good but not for commercial exploitation. And we’re not sure who to trust with it.

Data governance of contractual terms is part of the infrastructure needed to prevent exploitation and to enable, not restrict, sharing. And it needs to catch up with apps whose terms and conditions can change after a user has enrolled.

That comes back down to the individual, and some more ideas on those personal infrastructures are in the previous post.

Can we build lasting foundations fit for a digital future?

Before launching into haphazard steps towards a digital future in 2020, the NIB/NHS decision makers need to consider the wider infrastructures in which it is set, and understand the ethical compass by which they are steering.

Can there be oversight to make national and supra-national infrastructures legally regulated, bindingly interoperable and provider and developer Ts and Cs easily understood?

Is it possible to regulate only that which is offered or sold through UK based companies or web providers and what access should be enabled or barriers designed in?

Whose interests should data and knowledge created from data serve?

Any state paid initiative building a part of the digital future for our citizens must decide, is it to be for public good or for private profit?

NHS England’s digital health vision includes: “clinical decision support to be auto populated with existing healthcare information, to take real time feeds of biometric data, and to consider genomics data in the future.”  [NIB plans, Nov 2014]

In that 66-page document, while it talks of data and trust and cyber security, ethics is not mentioned once. The ambition is to create ‘health-as-a-platform’ and its focus is on tech, not on principles.

‘2020’ is the goal, and it’s not a far-away future at all if counted as 1175 working days from now.
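That working-day count is easy to sanity-check. A minimal sketch, assuming a hypothetical start date of 1 July 2015 (around when this was written), counting Monday to Friday only and ignoring public holidays:

```python
from datetime import date, timedelta

def working_days(start: date, end: date) -> int:
    """Count weekdays (Mon-Fri) from start inclusive to end exclusive."""
    count = 0
    d = start
    while d < end:
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            count += 1
        d += timedelta(days=1)
    return count

# From the assumed start of 1 July 2015 to 1 January 2020:
print(working_days(date(2015, 7, 1), date(2020, 1, 1)))  # 1175
```

Shift the start date or subtract bank holidays and the figure moves a little either way, but the order of magnitude stands: roughly four and a half working years.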

By 2020 we may have moved on or away in a new digital direction entirely or to other new standards of network or technology. On what can we build?

Facebook’s founder sees a futuristic role for biometric data used in communication. Will he drive it? Should we want him to?

Detail will change, but ethical principles could better define the framework for development promoting the best of innovation long term and protect citizens from commercial exploitation. We need them now.

When Tim Berners-Lee called for a Magna Carta on the world wide web he asked for help to achieve the web he wants.

I think it’s about more than the web he wants. This fight is not only for net neutrality. It’s not only challenging the internet of things to have standards, ethics and quality that shape a fair future for all.

While we shape the web we want, we shape the world we want.

That’s pretty exciting, and we’d better try to get it right.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

 

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data for research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic ones. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to money making from our data the public is less happy. The economic value of data raises more questions on use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from individuals and corporations who make a profit by using the data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how does what the public feels and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing measured by public feeling in 2014 is a threat to data used in the public interest, so what are we doing to fix it and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events [1] were repeatedly about commercial companies’ use and re-use of data, third parties accessing data for unknown purposes, and the resultant loss of confidentiality.

Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership. And with that, the inability to consent to who uses it. This despite acknowledgment that patients own their data.

Significant concern centres on uses of the information gleaned from data that patients consider commercial exploitation: segmenting the insurance markets, consumer market research, targeting individuals, and the utter lack of governance around it all.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings throughout 2014 between supermarket giant Tesco and NHS England’s Patients and Information Director, responsible for care.data [2], is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison to keep up while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than to citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago, that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing, and in wider fora, is no longer an inconvenient ethical dilemma best left on the shelf, as it was for the last 25 years of secondary use, only to be dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle, consent has always been required and good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care, during medical treatment itself, causes no concern.

An idea increasingly assumed in discussions I have heard [at CCG and other public meetings] is that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and consent to those purposes cannot ethically be assumed. However, legal gateways, which give access to that data for clearly defined secondary purposes set out in law, may make that data sharing legal.

With that legal assumption, polls and dialogue show, comes for the majority of people [though not for everyone, 6b] a degree of automatic support for bona fide research in the public interest. But it is by no means a blanket for all secondary uses, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active, rather than implied, consent becomes a requirement for research purposes without differentiation between kinds, the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now around data sharing [6]. It has been a high price to pay for public health research, and for the other research delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in what kinds of research; if not happy with one kind, they may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****


 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/