
Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about the NHS use of patient data has proven, we lack a consensual agreement between the state, organisations and the public on how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and of the Citizen, as part of the French Revolution, set out a charter for citizens: an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told on the one hand by government that it is necessary to share all our individual-level health data, from all sorts of sources.

And that bulk data collection is vital in the public interest, to provide the surveillance knowledge that government agencies want.

At the same time, other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups, personalising how they are treated. Recent objection was to marketing misuse by charities.

There is little broad understanding yet of the insight that organisations can now gain, tracking and profiling individuals and groups through powerful algorithms and processing capability.

Technological progress has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous grassroots projects discussed over the two days: bringing the elderly online and connected, and work on housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communication with patients.

The NIB meeting didn’t have real public interaction or any discussion of those projects ‘in the room’ in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience, it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to be able to ask about and understand the intended change and outcomes much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

Struggling to get the current design together for now, the leadership may find it hard to invite public feedback on the future.

But it is only made hard if what the public wants is ignored. If the issues raised at listening events were resolved in the way the public asked for, it could be quite simple.

The NIB leadership clearly felt nervous of debate, giving only 10 minutes of three hours for public involvement, yet debate is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need public debate and dissection by the clinical professions, to see whether they fit the current and future model of healthcare, because if those affected are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If we are unhappy about today’s data use, then we, the general public, have to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

“As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating these things has been forgotten. If the Internet serves things, it serves consumerism. If the things are designed to serve people, then we would hope they offer ways of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and their benefits hard to communicate, I’d say that if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how these projects that want their personal data actually work, or b) put up with being left ignorant.

We should be able to fully question why it is needed and get a transparent and complete explanation. We should have fully accountable business plans and public scrutiny of tangible and intangible benefits before projects launch on the basis of public buy-in which may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many would be able to put it succinctly: what is the NHS digital forward view really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other, funding stops for what already works today.

Although the volume of activity planned is vast, what it boils down to is what is visionary and achievable, not just a vision.

Digital revolution by design: building for change and people

We have the opportunity to build well now: avoiding barriers-by-design, proactively designing ethics and change into projects, and ensuring the work is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own development bubble as plans are shaped within the ever-changing infrastructures in which data, digital, AI and ethics will become important to discuss together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee, when he called for a Magna Carta for the web, asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The charter as part of the French Revolution set out a clear, understandable, ethical and fair framework of law in which they trusted their rights would be respected by fellow citizens.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web

Reputational risk. Is NHS England playing a game of public confidence?

“By when will NHS England commit to respect the 700,000 objections [1] to secondary data sharing already logged* but not enacted?” [*gathered from objections to secondary uses in the care.data rollout, Feb 2014]

Until then, can organisations continue to use health data held by HSCIC for secondary purposes, ethically and legally, or are they placing themselves at reputational risk?

If HSCIC continues to share, what harm may it do to public confidence in data sharing in the NHS?

I should have asked this explicitly of the National Information Board (NIB) June 17th board meeting [2], that rode in for the last 3 hours of the two day Digital Health and Care Congress at the King’s Fund.

But I chose to mention it only in passing, since I assumed it is already being worked on and a public communication will follow very soon. I had lots of other constructive things I wanted to hear in the time planned for ‘public discussion’.

Since then it’s been niggling at me that I should have asked more directly, as it dawned on me, watching the meeting recording and more importantly reading the NIB papers [3], that it’s not otherwise mentioned. And there was no group discussion anyway.

Mark Davies, Director at the UK Department of Health, talked in fairly jargon-free language about transparency. [01:00] I could have asked him when we will see more of it in practice.

Importantly, he said on building and sustaining public trust, “if we do not secure public trust in the way that we collect store and use their personal confidential data, then pretty much everything we do today will not be a success.”

So why does the talk of securing trust seem at odds with the reality?

Evidence of Public Voice on Opt Out

Is the lack of action based on uncertainty over what to do?

Mark Davies also said “we have only a sense” and we don’t have “a really solid evidence base” of what the public want. He said, “people feel slightly uncomfortable about data being used for commercial gain.” Which he felt was “awkward” as commercial companies included pharma working for public good.

If he has not done so already, though I am sure he will have, he could read NHS England’s own care.data listening feedback. People were strongly against commercial exploitation of data. Many were livid about its use. [see other care.data events] Not ‘slightly uncomfortable.’ And they were able to make a clear distinction between uses by commercial companies they felt were in the public interest, such as bona fide pharma research, and consumer market research, even if by the same company. Risk stratification and commissioning does not need, and should not have according to the Caldicott Review [8], fully identifiable individual-level data sharing.

Uses are actually not so hard to differentiate. In fact, that is exactly what people want: the choice to have their data used only for direct care, or to permit sharing with different users for, say, bona fide research. Or at minimum, to be able to exclude commercially exploitative uses and re-use. Enabling this would enable more data sharing with confidence.

I’d also suggest there is a significant evidence base gathered in the data trust deficit work from the Royal Statistical Society, a poll on privacy for the Joseph Rowntree Foundation, and work done for the ADRN/ESRC. I’m sure he and the NIB are aware of these projects, and Mark Davies said himself more is currently being done with the Nuffield Trust.

Work with almost 3,000 young people for the Royal Academy of Engineering confirmed what those interested in privacy know, but which is the opposite of what is often said about young people and privacy – they care and want control:

[image: youngpeople_privacy]

NHS England has itself further said it has held ‘over 180’ listening events in 2014 and feedback was consistent with public letters to papers, radio phone-ins and news reports in spring 2014.

Don’t give raw data out, exclude access for commercial companies not working in the public interest, exclude non-bona fide research use and re-use licences, define the future purposes, improve legal protection including the opt out, and provide transparency in order to build trust.

How much more evidence does anyone need to have of public understanding and feeling, or is it simply that NHS England and the DH don’t like the answers given? Listening does not equal heard.

Here are some of NHS England’s own slides [4] – points included a common demand from the public to give the opt out legal status:

[slide image: legal]

Opt out needs legal status

Paul Bate talked [56:00] [3] about missing pieces of understanding on secondary uses, for “Commissioners, researchers, all the different regulators.” He gave an update which assumed secondary use of data as the norm.

But he missed out any mention of the perceived cost of the loss of confidentiality, and of the loss of confidence since the failure to respect the 9Nu4 objections made in the 2014 aborted care.data rollout. That’s not even mentioning that so many did not even recall getting a leaflet, so those 700,000 objections came from the most informed.

When the public sees their opt out is not respected they lose trust in the whole system of data sharing. Whether for direct care, for use by an NHS organisation, or by any one of the many organisations vying to manage their digital health interaction and interventions. If someone has been told data will not be shared with third parties and it is, why would they trust any other governance will be honoured?

Looking back on the leadership’s flawed pre-care.data thinking – ‘no one who uses a public service should be allowed to opt out of sharing their records, nor can people rely on their record being anonymised’ – and the resulting disastrous attempt to roll out without communication, then a second attempt at fair processing, lessons learned should inform future projects. That includes care.data mark 2. That thinking is simply daft.

You can object and your data will not be extracted, and you can make no contribution to society: Mr Kelsey’s answer to a critic on Twitter in 2014 revealed that his thinking really hasn’t changed very much, even if he has been forced to make concessions. I should have said at #kfdigital15 that ignoring what the public wants is not your call to make.

What legal changes will be made that back up the verbal guarantees given since February? If none are forthcoming, then were the statements made to Parliament untrue? 

“people should be able to opt out from having their anonymised data used for the purposes of scientific research.” [Hunt, 2014]

We have yet to see this legal change and, to date, the only publicly stated choice covers identifiable data alone, not all data for secondary purposes including anonymised data, as offered by the Minister in February 2014 and by David Cameron in 2010.

If Mark Davies is being honest about how important he feels trust is to data sharing, implementing the objection should be a) prioritised and b) given legal footing.

[slide image: optout_ppt]

Risks and benefits: the need for a new social contract on data

Simon Denegri recently wrote [5] he believes there are “probably five years to sort out a new social contract on data in the UK.”

I’d suggest less, if high-profile data-based projects or breaches irreparably damage public trust first, whether in the NHS or the consumer world. The public will choose to share less and less.

But the public cannot afford to lose the social benefits that those projects may bring to the people who need them.

Big projects, such as care.data, cannot afford for everyone’s sake to continue to repeatedly set off and crash.

Smaller projects, those planned and in progress by each organisation and attendee at the King’s Fund event, cannot afford for those national mistakes to damage the trust the public may otherwise hold in the projects at local level.

I heard care.data mentioned five different times over the two-day event, in different projects, as having harmed them through lost trust or delays. We even heard examples of companies in Scotland going bust due to slowed data access during rollouts, and austerity.

Individuals cannot afford for their reputation to be harmed through association, or by using data in ways the public finds unreasonable and getting splashed across the front page of the Telegraph.

Clarity is needed for everyone using data well, whether for direct care with implied consent or for secondary uses without it, and it is in the public interest to safeguard access to that data.

A new social contract on data would be good all round.

Reputational Risk

The June 6th story of the 700,000 unrespected opt outs has been and gone. But the issue has not.

Can organisations continue to use that data ethically and legally knowing it is explicitly without consent?

“When will those objections be implemented?” should be a question that organisations across the country are asking – if reputational risk is a factor in any data-sharing decision making – in addition to the fundamental ethical principle: can we continue to use the data of an individual from whom we know consent was not freely given and was actively withheld?

What of projects that use HES or hospital secondary care sites’ submitted data and rely on the HSCIC POM mechanisms? How do those audits or other projects take HES secondary objections into account?

Sir Nick Partridge said in the April 2014 HSCIC HES/SUS audit there should be ‘no surprises’ in future.

That future is now. What has NHS England done since to improve?

“Consumer confidence appears to be fragile and there are concerns that future changes in how data may be collected and used (such as more passive collection via the Internet of Things) could test how far consumers are willing to continue to provide data.” [CMA Consumer report] [6]

The problem exists across both state and consumer data sharing. It is not a matter of if, but when, these surprises are revealed to the public with unpredictable degrees of surprise and revulsion, resulting in more objection to sharing for any purposes at all.

The solutions exist: meaningful transparency, excluding commercial purposes which appear exploitative, consensual choices, and no surprises. Shape communications processes by building future change into today’s programmes, to future-proof trust.

Future-proofing does not mean making the purpose and use of data so vague as to be all-encompassing; that is exactly what the public said at care.data listening events they do not want and will not find sufficient to trust, nor, I would argue, would it meet legally adequate fair processing. It means building and budgeting for mechanisms in every plan today to inform patients of future changes to the use or users of data already gathered, and to offer them a new choice to object or consent. And they should have a way to know who used what.

The GP who asked the first of the only three questions that were possible in the 10 minutes of Q&A from the room had taken away the same message as I had: the year 2020 is far too late as a public engagement goal. There must be much stronger emphasis on it now. And it is actually very simple. Do what the public has already asked for.

The overriding lesson must be, the person behind the data must come first. If they object to data being used, that must be respected.

It starts with fixing the opt outs. That must happen. And now.

Public confidence is not a game [7]. Reputational risk is not something organisations should be forced to gamble with to continue their use of data and potential benefits of data sharing.

If NHS England, the NIB or Department of Health know how and when it will be fixed they should say so. If they don’t, they better have a darn good reason why and tell us that too.

‘No surprises’, said Nick Partridge.

The question decision makers must address for data management is, do they continue to be part of the problem or offer part of the solution?

******

References:

[1]The Telegraph, June 6th 2015 http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[2]  June 17th NIB meeting http://www.dh-national-information-board.public-i.tv/core/portal/webcast_interactive/180408

[3] NIB papers / workstream documentation https://www.gov.uk/government/publications/plans-to-improve-digital-services-for-the-health-and-care-sector

[4] care.data listening feedback http://www.england.nhs.uk/wp-content/uploads/2015/01/care-data-presentation.pdf

[5] Simon Denegri’s blog http://simondenegri.com/2015/06/18/is-public-involvement-in-uk-health-research-a-danger-to-itself/

[6] CMA findings on commercial use of consumer data https://www.gov.uk/government/news/cma-publishes-findings-on-the-commercial-use-of-consumer-data

[7] Data trust deficit – new research finds data trust deficit with lessons for policymakers: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[8] Caldicott review: information governance in the health and care system

Off the record – a case study in NHS patient data access

Patient online medical records’ access in England was promised by April 2015.

[image: HSCIC_stats]

Just last month headlines abounded: “GPs ensure 97% of patients can access summary record online”. Speeches carried the same statistics. So what did that actually mean? The HSCIC figures released in May 2015 showed that while around 57 million patients can potentially access something of their care record, only 2.5 million, or 4.5% of patients, had actively signed up for the service.

In that gap lies a gulf of difference. You cannot access the patient record unless you have signed up for it, so to give the impression that 97% of patients can access a summary record online is untrue. Only 4.5% can, and have done so. Yes, the claim acknowledges that patients must request access, but the impression given is misrepresentative.

Here’s my look at what signing up involved and, once signed up, what ‘access your medical records’ may actually mean in practice.

The process of getting access

First I wrote a note to the practice manager about a month ago, and received a phone call a few days later to pop in any time. A week later, I called to check before I would ‘pop in’ and found that the practice manager was not in every day, and it would actually have to be when she was.

I offered to call back and arrange a suitable date and time. On the next call we usefully agreed the potential date I could go in, but I’d have to wait to be sure that the paper records had first been retrieved from the external store (since another practice closed down, ours had become busier than ever and run out of space). I was asked whether I had already received permission from the practice manager, and to confirm that I knew there would be a £10 charge.

So, one letter, four phone calls and ten pounds in hard cash later, I signed a disclosure form this morning to say I was me and had asked to see my records, and sat in a corner of the lovely practice manager’s office with a small, thinly stuffed Lloyd George envelope, a few photocopied or printed-out A4 pages (so I didn’t get to actually look at the on-screen record the GP uses), and a receipt.

What did my paper records look like?

My oldest notes on paper went back as far as 1998 and were for the most part handwritten. Having lived abroad since, there was then a ten-year gap until my new registration, when the notes moved onto paper prints of electronic notes.

These included referrals for secondary care, and correspondence between consultants and my GP and/or to and from me.

The practice manager was very supportive and tolerant of me taking up a corner of her office for half an hour. Clutching a page with my new log-in for the EMIS web for patient records access, I put the papers back, said my thank yous and set off home.

Next step: online

I logged on at home to the patient access system. Having first had access in 2009 when I registered, I hadn’t used the system since, as it had very limited functionality and I had been in good health. Now I took the opportunity to try it again.

By asking at the GP practice reception, I had been assigned a PIN and given the Practice ID, an Access ID and confirmation of my NHS number, all of which needed entry in Step 1:

[image: emis1]

Step 2: On the second screen, I was asked for my name and DOB, and to create a password.

[image: emis2]

Step 3: the system generated a long numeric user ID, which I noted down.

Step 4: I looked for the data sharing and privacy policy. I didn’t spot with whom any data entered would be shared, for what purposes, or any retention periods or restrictions on purposes. I’d like to see that added.

[image: emis3]

Success:

Logged on with my new long user ID and password, I could see an overview page with personal contact details, which were all accurate. There were sections for current meds, allergies, appointments, medical record, personal health record and repeat prescriptions. There was also space for an overview of height, BMI and basic lifestyle (alcohol and smoking).

[image: emis4c]

A note from 2010 read: “refused consent to upload national. sharing. electronic record.” Appropriately, some may perhaps think, this was recorded in the “problems” section, which was otherwise empty.

Drilling down to view the medication record, the only data held was the single most recent top-line prescription, without any history.

[image: emis4b]

And the only other section to view was allergies, similarly and correctly empty:

[image: emis5]

The only error I noted was a line to say I was due an MMR immunisation in June 2015. [I will follow up to check whether one of my children is due for it, rather than me.]

What else was possible?

Order repeat prescription: If your practice offers this service there is a link called Make a request in the Repeat Prescriptions section of the home page after you have signed in. This was possible. Our practice already does it direct with the pharmacy.

Book an appointment: with your own GP from dates in a drop down.

Apple Health app integration: The most interesting part of the online access was the suggestion that it could upload a patient’s Apple Health app data which, with active patient consent, would be shared with the GP.

[image: emis6]

It claims: “You can consent to the following health data types being shared to Patient Access and added to your Personal Health Record (PHR):”

  • Height
  • Weight
  • BMI
  • Blood Glucose
  • Blood Pressure (Diastolic & Systolic)
  • Distance (walked per day)
  • Forced expired volume
  • Forced Vital capacity
  • Heart Rate
  • Oxygen Saturation
  • Peak Expiratory Flow
  • Respiratory rate
  • Steps (taken per day)

“This new feature is only available to users of IOS8 who are using the Apple Health app and the Patient Access app.”

 

With the important caveat for some: iOS 8.1 has removed the ability to manually enter Blood Glucose data via the Health app. Health will continue to support Blood Glucose measurements added via third-party apps such as MySugr and iHealth.

Patient Access will still be able to collect any data entered and we recommend entering Blood Glucose data via one of those free apps until Apple reinstate the capability within Health.
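
For the technically curious, here is a minimal sketch of what the consent step above looks like on the phone side. It is not the Patient Access app’s actual code (which is not public), and it is written against the current Swift HealthKit API rather than the 2015-era one; it simply illustrates how any app would request read-only access to the listed data types, with iOS presenting the per-type consent sheet to the patient.

```swift
import HealthKit

// Illustrative only: a hypothetical app requesting read-only access to the
// data types listed above. iOS shows the user a per-type consent sheet, and
// the app never sees values for any type the user declines.
// Note: a real app must also declare NSHealthShareUsageDescription in Info.plist.
let healthStore = HKHealthStore()

let identifiers: [HKQuantityTypeIdentifier] = [
    .height, .bodyMass, .bodyMassIndex,
    .bloodGlucose, .bloodPressureDiastolic, .bloodPressureSystolic,
    .distanceWalkingRunning, .forcedExpiratoryVolume1, .forcedVitalCapacity,
    .heartRate, .oxygenSaturation, .peakExpiratoryFlowRate,
    .respiratoryRate, .stepCount
]

let readTypes: Set<HKObjectType> = Set(
    identifiers.compactMap { HKObjectType.quantityType(forIdentifier: $0) as HKObjectType? }
)

func requestHealthKitReadAccess() {
    guard HKHealthStore.isHealthDataAvailable() else { return }
    // Passing nil for toShare: the app asks only to read, not to write.
    healthStore.requestAuthorization(toShare: nil, read: readTypes) { success, error in
        if let error = error {
            print("HealthKit authorisation request failed: \(error)")
        } else {
            // 'success' means the request was processed, not that every type was granted.
            print("HealthKit authorisation request completed: \(success)")
        }
    }
}
```

Whatever the app’s internals actually look like, the per-type consent sheet is a HealthKit platform feature, so that granular choice sits on the patient’s phone, separate from any consent recorded in the GP system.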

What was not possible:

To update contact details: The practice configures which details you are allowed to change. It may be their policy to restrict access to change some details only in person at the practice.

Viewing my primary care record: other than the current medication, there was nothing of my current records in the online record. Things like test results were not in my online record at all, only on paper. Pulse noted sensible concerns about this area in 2013.

Make a correction: clearly the MMR jab note is wrong, but I’ll need to ask for help to remove it.

“Currently the Patient Access app only supports the addition of new information; however, we envisage quickly extending this functionality to delete information via the Patient Access service.” How this will ensure accuracy and avoid self-editing, I am unsure.

Questions: Who can access this data?

While the system stated that “the information is stored securely in our accredited data centre that deals solely with clinical data”, there is no indication of where that is, who manages it, who may access it, or why.

In 2014 it was announced that pharmacies would begin to have access to the summary care record.

“A total of 100 pharmacies across Somerset, Northampton, North Derbyshire, Sheffield and West Yorkshire will be able to view a patient’s summary care record (SCR), which contains information such as a patient’s current medications and allergies.”

Yet clearly, in the Summary Care Record consent process recorded in my notes in 2010, pharmacists were not mentioned.

Does the patient online access also use the Summary Care Record or not? If so, did I, by asking for online access, just create an SCR without asking for one? Or is it a different subset of data? If they are different, which is the definitive record?

Overall:

From stories we read, it could appear that there are broad discrepancies between what is possible in one area of the country and another, and between one practice and another.

Clearly, to give the impression that 97% of patients can access summary records online is untrue to date, if only 4.5% can actually get onto an electronic system and see any part of their records on demand today.

How much value is added for patients and practitioners within that 4.5% may vary enormously, depending upon what functionality has been enabled at different locations.

For me as a rare user of the practice, there is no obvious benefit right now. I can book appointments during the day by telephone and meds are ordered through the chemist. It contained no other information.

I don’t know what evidence base came from patients to decide that Patient Online should be a priority.

How many really want and need real-time, online access to their records? Would patients not far rather that, in these times of austerity, the cash, time and IT expertise be focused on IT in direct care, visible to their medics? So that when they visit hospital, their records would be available to the different departments within the hospital?

I know which I would rather have.

What would be good to see?

I’d like to see a much clearer distinction between the purposes for which we share data, direct and indirect, and on what legal basis each rests.

Not least because it needs to be understandable within the context of data protection legislation. There is often confusion in discussions of what consent can be implied for direct care and where to draw its limit.

The consultation launched in June 2014 closed in August 2014 but its outcome is still to be published, and it too blurred the lines between direct care and secondary purposes (https://www.gov.uk/government/consultations/protecting-personal-health-and-care-data).

Secondly, if patients start to generate potentially huge quantities of data through the Apple link and upload it to GP electronic records, we need to get this approach correct from the start. Will that data be shared onwards by GPs through care.data, for example?

But first, let’s start with tighter use of language in communications. Not only for the sake of increased accuracy, but so that, as a result, expectations are properly set for policy makers, practitioners and patients making future decisions.

There are many impressive visions and great ideas how data are to be used for the benefit of individuals and the public good.

We need an established, easy-to-understand, legal and ethical framework for our data sharing in the NHS to build on, to turn benefits into an achievable reality.

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over as the NHS England board delegated the final approval of directions to their Chair, Sir Malcolm Grant and Chief Executive, Simon Stevens, on Thursday May 28.

Directions from NHS England which will enable the HSCIC to proceed with their pathfinder pilots’ next stage of delivery.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16-month pause, as long as the gestation of a walrus, it appears the directions had flaws. Technical fixes are also needed before the plan can proceed to extract GP care.data and merge it with our hospital data at the HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now no longer be eligible. Commercial data intermediaries? They can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, communications must include an opt out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not been. There is still a lack of transparency. NHS England’s communications materials, and the May-Oct 2014 and any 2015 programme board minutes, have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although the problems were foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun; they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014, before the pause began. From the public listening exercise, the high-level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When asked by a board member, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, the answer wasn’t very clear. [full detail at the end of this post]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas]. At least, that was what the verbal background given at the board meeting explained.

If the pilots are to be a dip in the water of how national rollouts will proceed, then they need to test not just for today, but at least for the known future of changing content scope and expanding users – who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will it not make these complications worse under pressure? There may be external pressure ahead too, as potential changes to EU data protection are expected this year, for which the pilots must be prepared, designing in advance for the expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this  if we believed it was absolutely critical in the interests of patients.” [Malcom Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to just make the best of a bad job, we’ve not even asked: are we doing the right thing at all? Is the system designed to best support doctors’ and patients’ needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If the focus is on the success of the programme and not the patient, consider this: there’s a real risk that too many people opt out because of these unknowns, and because of the lack of real choice in how their data gets used. It could be done better, to reduce that risk.

What percentage of opt outs would the programme deem a success, for it still to be worthwhile?

In March 2014, at a London event, a GP told me all her patients who were opting out were the newspaper reading informed, white, middle class. She was worried that the data that would be included, would be misleading and unrepresentative of her practice in CCG decision making.

medConfidential has written a current status for pathfinder areas; it makes great sense to focus first on fixing care.data’s big post-election question: the opt out that hasn’t been put into effect. Of course, in February 2014 we had to choose between two opt outs – so how will that work for pathfinders?

In the public interest we need collectively to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, is as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process , one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, at the advisory group last week, is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed, that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility, that they have under the law  in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

Perhaps it may help if I forward this to the board, It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] Request for the release of the June 2014 Open House feedback, still to be published, in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time which the pause in the care.data programme has taken to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: https://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed



The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read research request submissions for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data for research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data seen as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic ones. And often a no-brainer, as long as confidentiality and personal control are not disregarded.

When it comes to money making from our data the public is less happy. The economic value of data raises more questions on use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from individuals and corporations who, by using the data, make a profit, using data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how does what the public feels and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing measured by public feeling in 2014 is a threat to data used in the public interest, so what are we doing to fix it and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events [1] were repeatedly about commercial companies’ use and re-use of data, third parties accessing data for unknown purposes, and the resultant loss of confidentiality.

 Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership. And with that, the inability to consent to who uses it. This despite acknowledgment that patients own their data.

Significant concern centres on use of the information gleaned from data that patients consider commercial exploitation. For use segmenting the insurance markets. For consumer market research. Using data for individual targeting. And its utter lack of governance.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings between NHS England and supermarket giant Tesco throughout 2014 [2] by the Patients and Information Director, responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago, that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing and in wider fora is no longer an inconvenient ethical dilemma best left on the shelf, as it has been for the last 25 years in secondary use, dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required as good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Assuming that information is shared when looking after you, for direct care during medical treatment itself, is not what causes concern.

It is increasingly assumed in discussions I have heard [at CCG and other public meetings] that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and those purposes cannot ethically be assumed. However, legal gateways, which permit access to that data for clearly defined secondary purposes set out in law, may make that data sharing legal.

With that legal basis comes, for the majority of people polls and dialogue show [though not for everyone, 6b], a degree of automatic support for bona fide research in the public interest. But it is not a blanket for all secondary uses by any means, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active rather than implied consent becomes a requirement for research purposes, without differentiation between kinds of research the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now around data sharing [6]. It has been a high price to pay for public health research, and for other research delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in which kinds of research and, if not happy with one kind, may refuse permission for the other.

By denying any differentiation between the direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of  good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes.

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

Differentiation. Telling customers apart and grouping them by similarities is what commercial data managers want.

It enables them to target customers with advertising and sales promotion most effectively. They segment the market into chunks and treat one group differently from another.

They use market research data, including our loyalty card data, to get that detailed information about customers, and to decide how to target each group and for what purposes.

As the EU states debate how research data should be used and how individuals should be both enabled and protected through it, they might consider separating research purposes by type.

While people are happy for the state to use their data without active consent for bona fide research, they are not happy for it to be used for commercial consumer research purposes. [ref part 1]

Separating consumer and commercial market research from the definition of research purposes for the public good by the state, could be key to rebuilding people’s trust in government data use.

Having separate purposes would permit separate consent and control procedures to govern them.
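To make that concrete, here is a minimal sketch in Python of what recording consent separately per purpose could look like. The purpose categories, field names and pseudonym are entirely hypothetical illustrations of the principle, not any real NHS, HSCIC or EU mechanism.

```python
# A minimal sketch of differentiated consent per purpose.
# Purpose categories and field names are hypothetical, for illustration only.
from dataclasses import dataclass, field
from enum import Enum


class Purpose(Enum):
    DIRECT_CARE = "direct care"
    PUBLIC_INTEREST_RESEARCH = "bona fide research in the public good"
    COMMERCIAL_MARKET_RESEARCH = "commercial consumer / market research"


@dataclass
class ConsentRecord:
    """One person's choices, recorded separately for each purpose."""
    patient_pseudonym: str
    permissions: dict = field(default_factory=dict)  # Purpose -> bool

    def allows(self, purpose: Purpose) -> bool:
        # Default to no sharing unless a choice has been actively recorded.
        return self.permissions.get(purpose, False)


# Example: happy to support public-interest research, not commercial reuse.
consent = ConsentRecord("ABC123", {
    Purpose.DIRECT_CARE: True,
    Purpose.PUBLIC_INTEREST_RESEARCH: True,
    Purpose.COMMERCIAL_MARKET_RESEARCH: False,
})

assert consent.allows(Purpose.PUBLIC_INTEREST_RESEARCH)
assert not consent.allows(Purpose.COMMERCIAL_MARKET_RESEARCH)
```

The point of the structure is simply that refusing one purpose need not mean refusing them all.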

But what role will profit play in the state’s definition of ‘in the public interest’? Is it in the public interest if the UK plc makes money from its citizens? And how far along any gauge of public feeling will a government be prepared to go to push making money for the UK plc at our own personal cost?

Pay-for-privacy?

In January this year, the Executive Vice President at Dunnhumby, Nishat Mehta, wrote in this article [7] about how he sees the future of data sharing between consumers and commercial traders:

“Imagine a world where data and services that are currently free had a price tag. You could choose to use Google or Facebook freely if you allowed them to monetize your expressed data through third-party advertisers […]. Alternatively, you could choose to pay a fair price for these services, but use of the data would be forbidden or limited to internal purposes.”

He too talked about health data, specifically about its value when accurate, expressed and consensual:

“As consumers create and own even more data from health and fitness wearables, connected devices and offline social interactions, market dynamics would set the fair price that would compel customers to share that data. The data is more accurate, and therefore valuable, because it is expressed, rather than inferred, unable to be collected any other way and comes with clear permission from the user for its use.”

What his pay-for-privacy model appears to have forgotten is that this future consensual sharing is based on the understanding that privacy has a monetary value. And that depends on understanding the status quo.

It is based on the individual realising that there is money made from their personal data by third parties today, and that there is a choice.

The extent of this commercial sharing and re-selling will be a surprise to most loyalty card holders.

“For years, market research firms and retailers have used loyalty cards to offer money back schemes or discounts in return for customer data.”

However, despite being signed up for years, I believe most of the public are unaware of the implied deal. It may be in the small print. But everyone knows that few read it, in the rush to sign up to save money.

Most shoppers believe the supermarket is buying our loyalty. We return to spend more cash because of the points. Points mean prizes, petrol coupons, or pounds off.

We don’t realise our personal identity and habits are being invisibly analysed to the nth degree and sold by supermarkets as part of those sweet deals.

But is pay-for-privacy discriminatory? By creating the freedom to choose privacy as a pay-for option, it excludes those who cannot afford it.

Privacy should be seen as a human right, not as a pay-only privilege.

Today we use free services online but our data is used behind the scenes to target sales and ads often with no choice and without our awareness.

Today we can choose to opt in to loyalty schemes and trade our personal data for points and with it we accept marketing emails, and flyers through the door, and unwanted calls in our private time.

The free option is never to sign up at all, but by doing so customers pay a premium by missing out on the vouchers and discounts, or by forgoing the convenience of online shopping.

There is a personal cost in all three cases, albeit in a rather opaque trade-off.

 

Does the consumer really benefit in any of these scenarios or does the commercial company get a better deal?

In the sustainable future, only a consensual system based on understanding and trust will work well. That’s assuming that by ‘well’ we mean organisations wish to prevent PR disasters and the practical disruption that resulted, for example, for NHS data in the last year through care.data.

For some people the personal cost of the infringement of privacy by commercial firms is great; others care less. But once informed, there is a choice on offer even today to pay for privacy from commercial business, whether by paying a premium for goods when not signed up to loyalty schemes, or by paying with our privacy.

In future we may see a more direct pay-for-privacy offering along the lines Nishat Mehta describes.

And if so, citizens will be asking ever more about how their data is used in all sorts of places beyond the supermarket.

So how can the state profit from the economic value of our data but not exploit citizens?

‘Every little bit of data’ may help consumer marketing companies. Gaining it or using it in unethical ways, or knowingly continuing bad practices, won’t win back consumers’ and citizens’ trust.

And whether it is a commercial consumer company or the state, people feel exploited when their information is used to make money without their knowledge and for purposes with which they disagree.

Consumer commercial use and use in bona fide research are separate in the average citizen’s mind and understood in theory.

Achieving differentiation in practice in the definition of research purposes could be key to rebuilding consumers’ trust.

And that would be valid for all their data, not only what data protection labels as ‘personal’. For the average citizen, all data about them is personal.

Separating in practice how consumer businesses use customer data for company profit, how those benefits are shared with individuals in the trade for our privacy, and how bona fide public research benefits us all, would help win continued access to our data.

Citizens need and want to be offered paths to see how our data are used in ways which are transparent and easy to access.

Cutting away purposes which appear exploitative from purposes in the public interest could benefit commerce, industry and science.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

That will create more opportunity for data to be used in the public interest, which will increase the public good, both economic and social, which the government hopes to see expand.

And that could mean a happy ending for everyone.

The Economic Value of Data vs the Public Good? They need not be mutually exclusive. But if one exploits the other, it has the potential to continue to be corrosive. The UK plc cannot continue to assume its subjects are willing creators and repositories of information to be used for making money. [ref 1] Doing so has lost trust in all uses, not only those in which citizens felt exploited.[6]

The economic value of data used in science and health, whether to individual app creators, big business or the commissioning state in planning and purchasing, is clear. It is perhaps not quantified or often discussed in the public domain, but it clearly exists.

Those uses can co-exist with good practices to help people understand what they are signed up to.

By defining ‘research purposes’, by making how data are used transparent, and by giving real choice in practice to consent to differentiated secondary uses of data, both commercial and state actors will secure their long-term access to data.

Privacy, consent and separation of purposes will be wise investments in that growth, across both commercial and state sectors.

Let’s hope they are part of the coming ‘long-term economic plan’.

****

Related to this:

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

image via Tesco media

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

 

The Economic Value of Data vs the Public Good? [3] The value of public voice.

Demonstrable value of public research to the public good, while abstract, is a concept quite clearly understood.

Demonstrating the economic value of data for private consumer companies like major supermarkets is even easier to understand.

What is less obvious is the harm that the commercial misuse of data can do to the public’s perception of all research for the public good.[6]

The personal cost of consumer data exploitation, whether through the loss of privacy or through paid-for privacy, must be limited to reduce the perceived personal cost of the public good.

By reducing the personal cost, we increase the value of the perceived public benefit of sharing and overall public good.

The public good may mean many things: benefits from public health research like understanding how disease travels, or good financial planning, derived from knowing what needs communities have and what services to provide.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

It will create more opportunity for data to be used in the public interest, for both economic and social gain.

As I outlined in the previous linked blog posts, consent [part 1] and privacy [part 2] would be wise investments in that growth.

So how are consumer businesses and the state taking this into account?

Where is the dialogue we need to keep expectations and practices aligned in a changing environment and legal framework?

Personalisation: the economic value of data for companies

Any projects under discussion or in progress without adequate public consultation and real involvement, that ignore the public voice, risk their own success and with it the public good they should create.

The same is true for commercial projects. For example, back to Tesco.

Whether the clubcard data management and processing [8] is directly or indirectly connected to Tesco, its customer data are important to the supermarket chain and are valuable.

Former Tesco executive Terry Leahy spoke about that value in a 2013 interview:

“These are slow-growing industries,” Leahy said. “The difference was in the use of data, in the way Tesco learned about its customers. And from that, everything flowed.”[9]

By knowing who, how and when citizens shop, it allows them to target the sales offering to make people buy more or differently. The so-called ‘nudge’ moving citizens in the direction the company wants.

He explained how, through the Clubcard loyalty program, the supermarket was able to transition from mass marketing to personalized marketing and that it works in other areas too:

“You can already see in some areas where customers are content to be priced as customers: risk pricing with insurance and so on.

“It makes a lot of sense in health pricing, but there will be certain social policy restriction in terms of fair access and so on.”

NHS patient data and commercial supermarket data may be coming closer in their use than we might think.

Not only closer in their shared move towards personalisation [10], but for similar reasons: the desire to use all the data to know all about people as health consumers and, from that, to plan and purchase best and cheapest… “in reducing overall cost.”

It is worth thinking about how, in an economy driven by ideological austerity, reducing overall cost will be applied: by cutting services, or by reducing to whom services are offered.

What ‘nudge’ may be applied through NHS policies, to move citizens in the direction the drivers in government or civil service want to see?

What, for example, will push those who can afford it into private care, and out of the group the state has to spend money on, if they are prepared to spend their own?

What is the data that citizens provide through schemes like care.data designed to achieve?

“Demonstrating The Actual Economic Value of Data”

Tim Kelsey, speaking at Strata in 2013 [11] talked about: “Demonstrating The Actual Economic Value of Data”. Our NHS data are valuable in both economic and social terms.

[From 12:17] “It will help put the UK on the map in terms of genomic research. The PM has already committed to the UK developing 100K gene sequences very rapidly. But those sequences on their own will have very limited value without the reference data that lies out there in the real world of the NHS, the data we’ll start making available from next June […]. The name of the programme by the way is care dot data.”

The long-delayed care.data programme plans to provide medical records for secondary use, as reference data for the 100K genomics programme. The programme has the intent to “create a lasting legacy for patients, the NHS and the UK economy.”

With consent.

When the CEO of Illumina talks about winning a US $20bn market [12], perhaps it also sounds economically appealing for the UK plc and the austerity-lean NHS. Illumina is, of course, the company which won the sequencing contract for the Genomics England project.

“The notion here is that it’s really a precursor to understand the health economics of why sequencing helps improve healthcare, both in quality of outcome, and in reducing overall cost. Presuming we meet the objectives of this three-year study–and it’s truly a pilot–then the program will expand substantially and sequence many more people in the U.K.” [Jay Flatley, CEO]

The idea of it being a precursor leaves me asking, to what?
“Will expand substantially” to whom?

As more and more becomes possible in science, there will be an ever greater need to balance how and why we should advance medicine with how to protect human dignity. Because something becomes possible does not always mean it should be done.

Article 21 of the Convention for the Protection of Human Rights and Dignity of the Human Being with regard to the application of biology and medicine, also says:  “The human body and its parts shall not, as such, give rise to financial gain.”

How close is profit making from DNA sequencing getting to that line?

These are questions of ethics and of social and economic value. The social legitimacy of these programmes will depend on trust. Trust based on no surprises.

Commercial market research or real research for the public good?

Meanwhile all consenting patients can in theory now choose to access their own record [GP online].  Mr Kelsey expressed hopes in 2013 that developers would use that to help patients:

“to mash it up with other data sources to get their local retailers to tell them about their purchasing habits [16:05] so they can mash it up with their health data.”

This despite 67% of the public being concerned about health data use by commercial companies.

So what were the commercially sensitive projects discussed by NHS England and Tesco throughout 2014? It would be interesting to know whether loyalty cards and mashing up our data were part of it – or did they discuss market segmentation, personalisation and health pricing? Will we hear the ‘Transparency Tsar‘ tell NHS citizens their engagement is valued, but in reality find the public is not involved?

To do so would risk another care.data style fiasco in other fields.

Who might any plans offer most value to – the customer, the company or the country plc? Will the Goliaths focus on short term profit or fair processing and future benefits?

In the long run, ignoring public voice won’t help the UK plc or the public interest.

A balanced and sustainable research future will not centre on a consumer pay-for-privacy basis, or commercial alliances, but on a robust ethical framework for the public good.

A public good which takes profit into account for private companies and the state, but not at the expense of public feeling and ethical good practice.

A public good which we can understand in terms of social, direct and indirect economic value.

While we strive for the economic and public good in scientific and medical advances we must also champion human dignity and values.

This dialogue needs to be continued.

“The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.”

[M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

The public voice from the care.data listening events and beyond could positively help shape the developing consensual model, if given a genuine and adequate opportunity to do so in much-needed dialogue.

As they say, every little helps.

****

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

Public data in private hands – should we know who manages our data?

When Tesco reportedly planned to sell off its data arm Dunnhumby [1] in January this year, it was a big deal.

Clubcard and the data which deliver customer insights – telling the company who we are, what we buy and how and when we shop using ‘billions of lines of code’ – will clearly continue to play a vital role in the supermarket customer relations strategy, whether its further processing and analysis is in-house or outsourced.

Assuming the business is sold, Clubcard shoppers might wonder who will then own their personal data, if not the shoppers themselves? Who is the data controller and processor? Who will inform customers of any change in its management?

“Dunnhumby has functioned as a standalone outfit in the past few years, offering customer information services to other retailers around the world, and could operate in a similar way for Tesco post-acquisition.”

I haven’t seen in the same media that the Dunnhumby speculation turned into a sale. At least not yet.

In contrast to the commercial company managing customer data for those who choose to take part, the company which manages the public’s data for many state-owned services was sold in December.

For an undisclosed value, Northgate Public Services [2], part of NIS, was sold in December 2014 to Cinven, a European private equity firm.

What value, I wondered, does the company have in itself, and what value is viewed as intrinsic to the data it works with – health screening, the National Joint Registry and more? It formerly managed HES data. What was part of the deal? Are the data part of the package?

Does the public have transparency of who manages our data?

Northgate has, according to their website, worked with public data, national and local government administrative data since 1969, including the development and management of the NADC, “the mission critical solution providing continuous surveillance of the UK’s road network. The NADC is integrated with other databases, including the Police National Computer, and supports more than 3 million reads a day across the country.”

Northgate manages welfare support payments for many local authorities and the Welsh Assembly Government.

Data are entrusted to these third parties by the commercial or public body, largely without informing the public.

One could argue that a ‘named owner and processor’ is irrelevant to the public, which is probably true when things are done well.

But when things go wrong or are changed, should ‘the supplier’ of the data, or rather the public whose data it is, not be told?

If so, citizens would be informed and would know who now accesses or even owns the public data that Northgate held in the past. Different firms will have different levels of experience, security measures and oversight of their practices. Understanding how this works could be an opportunity for transparency to create trust.

Trust which is badly needed to ensure consensual data sharing continues.

So what will the future hold for these systems now owned by a private equity firm?

The buyer of Northgate Public Services, Cinven, has experience making a profit in healthcare.

Few details of plans are available in the public domain about the NHS vision for data management and its future in public research.

We generally hear even less about the current management of the public’s data unless it is in a crisis, as front-page stories over the last year testify. care.data has been in good company in generating anger, alongside HMRC, the electoral register and other stories of legal but unexpected uses of citizens’ data.

As a result we don’t know which of our public data are held by whom.

The latest news, reported by the Daily Mail [3], will not be popular either, given that two thirds of people asked in research into public trust over the governance of data [4] have concerns about public data in the hands of private firms:

“Controversial plans to give private companies such as Google responsibility for storing people’s private personal health data could be revived, a minister has suggested.”

Could there ever be privatisation plans afoot for HSCIC?

It’s going to be interesting to see what happens next, whoever is making these decisions on our behalf after May 7th.

Certainly the roadmap, business plan, SIAM goals, and framework agreement [5] have given me cause to consider this before. The framework agreement specifically says change to its core functions or duties “would require further primary legislation.”
[HSCIC DH framework agreement]


 

Changes to the HSCIC core remit, such as privatising the service, would require a change in legislation which would by default inform parliament.

Should there not be the same onus to inform the public whose data they are? Especially with “protection of patients being paramount”.  One could say protections should apply to our consumer data too.

Regardless of whether data are managed in-house or by another third party, by the state or a commercial enterprise, if third parties can be outsourced or even sold, should consumers not always know who owns our data, and be told of any changes in that guardianship?

Taking into account the public mistrust of commercial companies’ data management I would like to think so.

Further privatising the workings of our state data without involving the public in the process would certainly be a roadmap to driving public confidence on data sharing into the ground.

So too, when it comes to public trust, we might find that when the commercial sale of consumer Clubcard data goes ahead, every little does not help.

****

Refs:

[1] Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[2] Northgate sale to Cinven http://www.northgate-is.com/press-release-nps.html / http://www.northgatepublicservices.co.uk/

[3]  On the future of data handling http://www.dailymail.co.uk/news/article-3066758/Could-Google-look-NHS-data-Controversial-plans-revived-minister-says-technology-firms-best-placed-look-information-securely.html

[4] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[5] HSCIC DH Framework agreement http://www.hscic.gov.uk/media/13866/Framework-Agreement-between-the-Department-of-Health-and-the-HSCIC/pdf/Framework_Agreement_between_the_Department_of_Health_and_the_Health_and_Social_Care_Information_Cent.pdf

Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care? [#NHSWDP 3]

 

Consent to data sharing appears to be a new choice firmly available on the NHS England patient menu, if patient ownership of our own records is clearly acknowledged as ‘the operating principle legally’.

Simon Stevens had just said in his keynote speech:

“..smartphones; […] the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond ” Simon Stevens, March 18 2015.

Tim Kelsey, Director Patients and Information, NHS England, then talked about consent in the Q&A:

“We now acknowledge the patient’s ownership of the record […] essentially, it’s always been implied, it’s still not absolutely explicit but it is the operating principle now legally for the NHS.

“So, let’s get back to consent and what it means for clinical professionals, because we are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.

“It is essentially, their data.”

How this principle has been applied in the past, is being now, and how it may change matters, as it will affect many other areas.

Our personal health data is the business intelligence of the health industry’s future.

Some parts of that industry will say we don’t share enough data. Or don’t use it in the right way.  For wearables designed as medical devices, it will be vital to do so.

But before some launch into polemics on the rights and wrongs of blanket ‘data sharing’ we should be careful what types of data we mean, and for what purposes they are extracted. It matters when discussing consent and sharing.

We should be clear to separate consent to data sharing for direct treatment from consent for secondary purposes other than care (although Mr Kelsey hinted at a conflation of the two in a later comment). The promised opt-out from sharing for secondary uses is pending legal change. At least that’s what we’ve been told.

Given that patient data from hospitals and a range of NHS health settings today are used for secondary purposes without consent – despite the political acknowledgement that patients have an opt-out – this sounded a bold new statement, and contrasted with his past stance.

Primary care data extraction for secondary uses, in the care.data programme, was not intended to be consensual. Will it become so?

Its plan so far assumes patients are included unless they object, despite professional calls from some, such as at the BMA ARM, to move to an opt-in model, and despite the acknowledged risk of harm it will do to patient trust.

The NHS England Privacy Assessment said: ‘The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.’

A year into the launch, in January 2014, a national communications plan should have met the need for fair processing; another year on, in March 2015, there is a postcode-lottery, pilot approach.

If, in principle, data sharing is to be decided by consensual active choice, as it “is the operating principle now legally for the NHS”, then why not now, for care.data, and for all?

When will the promised choice be enacted to withhold data from secondary uses and sharing with third parties beyond the HSCIC?

“we are going to move to a place where people will make those decisions as they currently do with wearable devices” [Widening digital participation, at the King’s Fund March 2015]

So when will we see this ‘move’ and what will it mean?

Why plan to continue to extract more data under the ‘old’ assumption principle, if ownership of data is now with the individual?

And who is to make the move first – NHS patients or NHS patriarchy – if patients use wearables before the NHS is geared up to them?

Looking back or forward thinking?

Last year’s programme has become outdated not only in principle but in digital best practice, if top-down dictatorship is out and the individual is now to “manage their data as they wish.”

What might happen in the next two years, in the scope of the Five Year Forward Plan or indeed by 2020?

This shift in data creation, sharing and acknowledged ownership may mean epic change for expectations and access.

It will mean that people’s choices around data sharing, from patients and healthy controls alike, need to be considered early on in research and projects. Engagement, communication and involvement will be all about trust.

For the ‘worried well’, wearables could ‘provide digital “nudges” that will empower us to live healthier and better lives‘ or perhaps not.

What understanding do we yet have of the big picture: what this may mean, and where apps fit into the wider digital NHS landscape and beyond?

Patients’ right to choose

The rights to information and to decision-making responsibility are shifting towards the patient in other applied areas of care.

But what data will patients truly choose to apply and what to share, manipulate or delete? Who will use wearables and who will not, and how will that affect the access to and delivery of care?

What data will citizens choose to share in future and how will it affect the decision making by their clinician, the NHS as an organisation, research, public health, the state, and the individual?

Selective deletion could change a clinical history and clinician’s view.

Selective accuracy in terms of false measurements [think diabetes], or in medication, could kill people quickly.

How are apps to be regulated? Will only NHS ‘approved’ apps be licensed for use in the NHS and made available to choose from, and what happens to the data of patients who use a non-approved app?

How will any of their data be accessed and applied in primary care?
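None of these questions have public answers yet. By way of illustration only, the sketch below, in Python and with made-up measurement names and ranges rather than clinical thresholds, shows the kind of plausibility screening any system accepting app-generated or self-reported readings would arguably need, given the risk of false measurements raised above.

```python
# A minimal sketch of plausibility screening for patient-supplied readings.
# Measurement names and ranges are illustrative assumptions, not clinical values.

PLAUSIBLE_RANGES = {
    "blood_glucose_mmol_l": (1.0, 30.0),
    "heart_rate_bpm": (25, 250),
}


def screen_reading(kind: str, value: float) -> str:
    """Return 'accept', 'flag for review', or 'reject' for one reading."""
    if kind not in PLAUSIBLE_RANGES:
        return "flag for review"          # unknown measurement type
    low, high = PLAUSIBLE_RANGES[kind]
    if value < low or value > high:
        return "reject"                   # physiologically implausible
    return "accept"


print(screen_reading("blood_glucose_mmol_l", 5.6))   # accept
print(screen_reading("blood_glucose_mmol_l", 150))   # reject (likely a unit error)
```

Real clinical validation would of course need regulated device standards and clinician oversight, which is exactly the regulatory gap the questions above point to.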

Knowledge is used to make choices and inform decisions. Individuals make choices about their own lives, clinicians make decisions for and with their patients in their service provision, organisations make choices about their business model which may include where to profit.

Our personal health data is the business intelligence of the health industry’s future.

Who holds the balance of power in that future delivery model for healthcare in England, is going to be an ongoing debate of epic proportions but it will likely change in drips rather than a flood.

It has already begun. Lobbyists and companies who want access to data are apparently asking for significant changes to be made in the access to micro data held at the ONS. EU laws are changing.

The players who hold data, will hold knowledge, will hold power.

If the NHS were a monopoly board game, data intermediaries would be some of the wealthiest sites, but the value they create from publicly funded NHS data, should belong in the community chest.

If consent is to be with the individual for all purposes other than direct care, then all data sharing bodies and users had best set their expectations accordingly. Patients will need to make wise decisions, for themselves and in the public interest.

Projects for research and sharing must design trust and security into plans from the start or risk failure through lack of participants.

It’s enormously exciting.  I suspect some apps will be rather well hyped and deflate quickly if not effective. Others might be truly useful. Others may kill us.

As twitter might say, what a time to be alive.

Digital opportunities for engaging citizens, as far as apps and data sharing go, are not only about how the NHS will engage citizens, but about how citizens will engage with what the NHS offers.

Consent, it seems, will one day be king.

Will there or won’t there be a wearables revolution?

Will we be offered or choose digital ‘wellness tools’ or medically approved apps? Will we trust them for diagnostics and treatment? Or will few become more than a fad for the worried well?

Control for the individual over their own data, and the choice to make their own decisions about what to store, share or deny, may rule in practice as well as in theory.

That practice will need to differentiate between purposes for direct clinical care and secondary uses, as it does today, and be supported and protected in legislation, protecting patient trust.

“We are going to move to a place where people will make those decisions as they currently do with wearable devices, and other kinds of mobile, and we need to get to a point where people can plug their wearable device into their medical record, and essentially manage their data as they wish.”

However, as ‘choice’ was the buzzword for NHS care in recent years – conflated with increasing the use of private providers – will consent be abused to mean a shift of responsibility from the state to the individual, with caveats for how it could affect care?

With that shift in responsibility for decision making, as with personalised budgets, will we also see a shift in responsibility for payment choices from state to citizen?

Will our lifestyle choices in one area exclude choice in another?

Could app data of unhealthy purchases from the supermarket, or refusal to share our health data, one day be seen as refusal of care and a reason to decline it? Mr Kelsey hinted at this last question in the meeting.

Add a population stratified by risk groups into the mix, and we have lots of legitimate questions to ask about the future vision of the NHS.

He went on to say:

“we have got some very significant challenges to explore in our minds, and we need to do, quite urgently from a legal and ethical perspective, around the advent of machine learning, and …artificial intelligence capable of handling data at a scale which we don’t currently do […] .

“I happen to be the person responsible in the NHS for the 100K genomes programme[…]. We are on the edge of a new kind of medicine, where we can also look at the interaction of all your molecules, as they bounce around your DNA. […]

“The point is, the principle is, it’s the patient’s data and they must make decisions about who uses it and what they mash it up with.”

How well that is managed will determine who citizens will choose to engage and share data with, inside and outside our future NHS.

Simon Stevens, earlier at the event, had acknowledged a fundamental power shift he sees as necessary:

“This has got to be central about what the redesign of care looks like, with a fundamental power shift actually, in the way in which services are produced and co-produced.”

That could affect everyone in the NHS, with or without a wearables revolution.

These are challenges the public is not yet discussing and we’re already late to the party.

We’re all invited. What will you be wearing?

********
[Previous: part one here #NHSWDP 1  – From the event “Digital Participation and Health Literacy: Opportunities for engaging citizens” held at the King’s Fund, London, March 18, 2015]

[Previous: part two #NHSWDP 2: Smartphones: the single most important health treatment & diagnostic tool at our disposal]

********

Apple ResearchKit: http://techcrunch.com/2015/03/09/apple-introduces-researchkit-turning-iphones-into-medical-diagnostic-devices/#lZOCiR:UwOp
Digital nudges – the Tyranny of the Should by Maneesh Juneja http://maneeshjuneja.com/blog/2015/3/2/the-tyranny-of-the-should


smartphones: the single most important health treatment & diagnostic tool at our disposal [#NHSWDP 2]

After Simon Stevens’ big statement on smartphones at the #NHSWDP event, I’d asked what sort of assessment the NHS had done on how wearables’ data would affect research.

#digitalinclusion is clearly less about a narrow focus on apps than applied skills and online access.

But I came away wondering how apps will work in practice, affect research and our care in the NHS in the UK, and much more.

What about their practical applications and management?

NHS England announced a raft of regulated apps for mental health this week, though they are not the first to be approved.

This one doesn’t appear to have worked too well.

The question needs an answer before many more are launched: how will these be catalogued, indexed and stored? Will it be just a simple webpage? I’m sure we can do better to make this page user-friendly and intuitive.

This British NHS military mental health app is on iTunes. Will iTunes carry a complete NHS approved library and if so, where are the others?

We don’t have a robust regulation model for digital technology, it was said at a recent WHF event, and while medical apps are sold as wellness or fitness or just for fun, patients could be at risk.

In fact, I’m convinced that while medical apps are being used by consumers as medical devices, for example as tests, or tools which make recommendations, and they are not thoroughly regulated, we *are* at risk.

If Simon Stevens sees smartphones as: “going to be the single most important health treatment and diagnostic tool at our disposal over the coming decade and beyond,” then we’d best demand the tools that work on them, work safely. [speech in full]

And if his statement on their importance is true, then when will our care providers be geared up to accepting extracts of data held on a personal device into the local health record at a provider – how will interoperability, testing and security work?
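No answers were offered. By way of illustration only, the sketch below in Python, using invented field names and a hypothetical device, shows the sort of provenance, consent and completeness metadata a personal-device extract might need to carry before a local record could responsibly accept it. Any real NHS integration would need agreed, open standards rather than an ad hoc format like this.

```python
# A minimal sketch of a device extract with provenance and sharing choices.
# All field names and the device name are hypothetical, for illustration only.
import json
from datetime import datetime, timezone

device_extract = {
    "source_device": "example-wearable-model-x",   # hypothetical device
    "measurement": "heart_rate_bpm",
    "value": 72,
    "recorded_at": datetime(2015, 3, 18, 9, 30, tzinfo=timezone.utc).isoformat(),
    "patient_verified": True,                       # did the patient confirm the reading?
    "share_for_direct_care": True,
    "share_for_secondary_uses": False,
}


def accept_into_record(extract: dict) -> bool:
    """Only import readings that are complete, attributed and consented."""
    required = {"source_device", "measurement", "value", "recorded_at"}
    if not required.issubset(extract):
        return False
    return extract.get("share_for_direct_care", False)


print(accept_into_record(device_extract))           # True
print(json.dumps(device_extract, indent=2))
```

Even a toy format like this makes the interoperability, testing and security questions concrete: who defines the fields, who verifies the device, and who checks the consent flags before the data reaches a clinician.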

And who’s paying for them? Those in the library right now have price tags. The public should be getting lots of answers to lots of questions.

“Over the coming decade”  has already started.

What about Research?: I know the Apple ResearchKit had a big reaction, and I’m sure there’s plenty of work already done on expectations of how data sharing in wearables affect research participation. (I just haven’t read it yet, but am interested to do so,  feel free to point any my way).

I was interested in the last line in this article: “ResearchKit is a valiant effort by Apple, and if its a hit with scientists, it could make mass medical research easier than ever.”

How do we define ‘easier’? Has Apple hit on a mainstream research app? How is ‘mass medical research’ in public health for example, done today and how may it change?

Will more people be able to participate in remote trials?

Will more people choose to share their well-being data and share ‘control’ phenotype data more in depth than in the past?

Are some groups under- or not-at-all represented?

How will we separate control of datasharing for direct care and for other secondary uses like research?

Quality: Will all data be good data or do we risk research projects drowning in a data tsunami of quantity not quality? Or will apps be able to target very specific trial data better than before?

How: One size will not fit all. How will data stored in wearables affect research in the UK? Will those effects differ between the UK and the US, and will app designs need different approaches given the NHS’s long history, taking into account single standards and being open? How will research take historical data into account if apps are all ‘now’? How will research based on that data be peer reviewed?

Where: And as we seek to close the digital divide here at home, what gulf may be opening up in the research done in public health, the hard to reach, and even between ‘the west’ and ‘developing’ countries?

In the UK will the digital postcode lottery affect care? Even with a wish for wifi in every part of the NHS estate, the digital differences are vast. Take a look at Salford – whose digital plans are worlds apart from my own Trust which has barely got rid of Lloyd George folders on trolleys.

Who: Or will in fact the divide not be by geography, but by accessibility based on wealth?  While NHS England talks about digital exclusion, you would hope they would be doing all they can to reduce it. However, the mental health apps announced just this week each have a price tag if ‘not available’ to you on the NHS.

Why: on what basis will decisions be made on who gets them prescribed and who pays for them, where apps are to be made available for which area of diagnosis or treatment, or at all, if the instructions are “to find out if it’s available in your area email xxx or call 020 xxx. Or you could ask your GP or healthcare professional.”

The highest intensity users of the NHS provision, are unlikely to be the greatest users of growing digital trends.

Rather, the “worried well” would seem the ideal group: encouraged to stay away from professionals and to self-care with self-paid support from high-street pharmacies. How much could or will this measurably benefit the NHS and the individual, and make lives better? As the population is increasingly risk-stratified and grouped into manageable portions, will some be denied care based on data?

Or will the app providers be encouraged to promote their own products, make profits, benefit the UK plc regardless of actual cost and measurable benefits to patients?

In 2013, IMS Health reported that more than 43,000 health-related apps were available for download from the Apple iTunes app store. Of those, the IMS Institute found that only 16,275 apps are directly related to patient health and treatment, and there was much to be done to move health apps from novelty to mainstream.

Reactionary or Realistic – and where’s the Risk Assessment before NHS England launches even more approved apps?

As exciting as it is, this tempting smörgåsbord of shiny new apps comes with a set of new risks which cannot responsibly be ignored: in patient safety, in cyber security, and in what and who will be left out.

Given that in some places basic data cannot yet be shared between GP and hospital for direct care, due to a local lack of technology, and the goal is another five years away, how real is the hyped, enormous impact of wearables going to be for the majority, or at scale?

On digital participation projects: “Some of the work that has already been done by the Tinder Foundation, you take some of the examples here, with the Sikh community in  Leicester around diabetes, and parenting in other parts of the country, you can see that this is an agenda which can potentially get real quite quickly and can have quite a big impact.”
(Simon Stevens)

These statements, each on a different aspect of digital inclusion – Simon Stevens on smartphones and scale, Tim Kelsey on consent – are fundamentally bound together.

What will wearables mean for diagnostics, treatment and research in the NHS? For those who have and those who have not?

How will sharing data be managed for direct care and for other purposes?

What control will the patriarchy of the NHS reasonably expect to have over patients’ choice of app from any provider? Do most patients know at all what effect their choice may have on their NHS care?

How will funding be divided into digital and non-digital, and be fair?

How will we maintain the principles and practice of a ‘free at the point of access’ digital service available to all in the NHS?

Will there really be a wearables revolution? Or has the NHS leadership just jumped on a bandwagon as yet without any direction?

****

[Next: part three  – on consent – #NHSWDP 3: Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care?] 

[Previous: part one – #NHSWDP 1: Thoughts on Digital Participation and Health Literacy: Opportunities for engaging citizens in the NHS – including Simon Stevens full keynote speech]