
Act now: Stand up and speak out for your right to find out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public, whether you have ever used your rights under the Freedom of Information Act or not, the government consultation on changing the law, which closes today, is worth caring about. If you haven’t yet had your say, go and take action now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together a plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained from standing up for human rights by cost.  The press will be restrained in what they can ask.

mySociety has a brilliant summary.  Michael Sheen spoke up, calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access under Freedom of Information law:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it has enabled me to start a conversation with the Department for Education about improving their handling of the personal and sensitive data of our 8 million children that they hold in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts about how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight, and to see the panel’s terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the board minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld the appeal.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that when the minutes were finally released, only a handful of genuine redactions were necessary under Section 36.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to.  When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others, I hoped, could use this information to ask the right questions about missing meeting minutes and transparency, and to question why there was no cost-benefit business plan at all in private, while the public kept being told of the benefits. It also shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again and work is in progress on improved public engagement, to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes and what appeared to the public as silly spending on sundries were revealed. Mr. Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency to the public, which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, makes spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

Where it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about the cost of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”

Why does the government want to make itself less transparent? Even the Information Commissioner’s Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law that gives the public rights in our favour and transparency into how we are governed and how tax money is spent.

How will the value of what would be lost be measured, if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex. Too little, too late.

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company after the event.

Everyone’s been talking about TalkTalk, and for all the wrong reasons: data loss and a 15-year-old hacker, combined with a reportedly reckless response to data protection, compounded by a lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament, Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?”  [Hansard 2]

MPs were concerned about the 4 million* customers’ loss of names, dates of birth, email addresses and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the [4] “National Cyber Security Programme” [NCSP]. What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget?  If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested that British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong.  Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and reportedly by not making preventative changes for flaws apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government’s expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own handling of the public’s personal data. Will members of government act in a responsible manner, or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where data are used for purposes beyond those we expect or have had explained when we submit our data, and there needs to be a change in risky behaviour in terms of physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean they have broken their contract, along with consumer trust. Government departments should also be asking whether their own data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply the same logic to government data handling as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own complementary lack of data understanding, and what good practice looks like in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question of whether government can be trusted with the public data it already has, and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines. “

So what about consequences when data are used in ways the public would consider a loss, not through an attack or a breach, but through government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which data were given, as care.data, for example, plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers‘ in the Shakespeare Review, have been ignored by government.

Where our personal data are not used well by government departments themselves, they seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because you pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data practices, the care.data debacle is evidence that not all its MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, with others getting access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants talked about big upcoming data plans they had announced, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, individual, sensitive data from at least 8m children’s records, from ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not protection, is what we should champion. Rather than protection after the event, MPs and the public must demand emphasis on prevention measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance, in any of its many forms, is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. Government has many options for how it manages data. But the principle should be simple. Our personal data need not only to be protected, but not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems better to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government wait and keep its fingers crossed each month, to see whether our data are used safely in unsecured settings by some of the unknown partners the data might be onwardly shared with, hoping we won’t find out and they won’t need to talk about it? Or will it have a grown-up public debate, based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have a meaningful, grown-up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk: https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing samples of some individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign.  When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted that lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

care.data’s ‘communicate the benefits’ response to the failed communications in spring 2014 has failed to deliver public trust. Here’s why:

To focus on the benefits is a shortcut for avoiding the real issues

Talking about benefits is about telling people what the organisation wants to tell them. This fails to address what the public and professionals want to know. The result is not communication, but a PR exercise.

Talking about benefits in response to the failed communications in spring 2014, while failing to address criticism since, ignores concerns that the public and professionals raised at macro and micro level. It appears disingenuous about real engagement despite saying ‘we’re listening’, and seems uncaring.

Talking about only the benefits does not provide any solution that demonstrably outweighs the potential risk of individual and public health harm through loss of trust in the confidential GP relationship, or through data inaccuracy or loss, and by ignoring these it seems unrealistic.

Talking about short term benefits and not long term solutions [to the broken opt out, long term security, long term scope change of uses and users and how those will be communicated] does not demonstrate competency or reliability.

Talking about only the benefits of commissioning and research for the merged dataset CES doesn’t mention all the secondary uses to which all HSCIC patient level health data are put [those reflected in the Type 2 opt out], including commercial re-use and the National Back Office: “2073 releases made from the National Back Office between April 2013 and December 2013. This includes 313 releases to police forces, 1531 to the Home Office and 229 to the National Crime Agency.” [HSCIC, July 2, 2014]

This use of hospital records and other secondary data by the back office, without openly telling the public, does not feel ethical or transparent.

Another example is the past patient communications that expressly said ‘we do not collect name’, the intent of which would appear to be to assure patients of anonymity, without saying that name is already stored at the HSCIC on the Personal Demographics Service, or that name is not needed for data to be identifiable.

We hear a lot about transparency. But is transparent the same as fully accurate, complete and honest? Honest about the intended outcomes of the programme. Honest about all the uses to which health data are put. Honest about potential future scope changes and those already planned.

Being completely truthful in communications is fundamental to future-proofing trust in the programme.

NHS England’s care.data programme, through its focus on ‘the benefits’, lacks balance and appears disingenuous, disinterested, unrealistic and lacking in reliability, competency and honesty. Through these actions it does not demonstrate that the organisation is trustworthy. This could be changed.

care.data fundamentally got it wrong with the intention not to communicate the programme at all.  It got it wrong in the tool and tone of communications in the patient leaflet.  There is a chance to get it right now, if the organisation would only stop focusing on communicating the benefits.

I’m going to step through, with a couple of examples, why to date some communications on care.data and the use of NHS data have not been conducive to trust.

Communication designed to ‘future-proof’ an ongoing relationship and trust must be by design, not afterthought.

Communications need to start addressing the changes that are happening, how they make people feel, and the changes that create concern – among the public and professionals – not the goals that the organisation has.

Sound familiar? Communications to date have been flawed in the same way that the concept of ‘building trust’ has been flawed. It has aimed to achieve the wrong thing and with the wrong audience.

Communications in care.data need to stop focussing on what the organisation wants from the public and professionals – the benefits it sees in getting data – and instead address, firstly at a macro level, why the change is necessary and why the organisation should be trusted to bring it about.

When explaining benefits there are clearly positives to be had from using primary and secondary data in the public interest. But what benefits will be delivered in care.data that are not already on offer today?

Why, if commissioning is done today with less identifiable data, can there be no alternative to the care.data level of identifiable data extraction? Why, if the CPRD offers research in both primary and secondary care today, will care.data offer better research possibilities? And secondly, at a micro level, communications must address the questions individuals asked up and down the country in 2014.

What’s missing and possible to be done?

  1. aim to meet genuine ongoing communication needs not just legal data protection fair processing tick-boxes
  2. change the organisational attitude to one that encourages people to ask what they each want to know at macro and micro level – why the programme at all, and what’s in it for me? What’s new, and a benefit that differs from the status quo? This is only possible if you will answer what is asked.
  3. deliver robust explanations of the reason why the macro and micro benefits demonstrably outweigh the risk of individual potential harms
  4. demonstrate reliability, honesty and competency, and that you are trustworthy
  5. agree how scope changes will trigger communication to ‘future-proof’ an ongoing relationship and trust by design.

As the NIB work stream on Public Trust says, “This is not merely a technical exercise to counter negative media attention; substantial change and long-term work is needed to deliver the benefits of data use.”

If they’re serious about that long term work, then why continue to roll out pathfinder communications based on a model that doesn’t work, with an opt out that doesn’t work? Communications isn’t an afterthought to public trust. It’s key.

If you’re interested in details and my proposals for success in communications I’ve outlined in depth below:

  • Why Communicate Changes at all?
  • What is change in care.data about?
  • Is NHS England being honest about why this is hard?
  • Communicate the Benefits is not working
  • A mock case study in why ‘communicate the benefits’ will fail
  • Long term trust needs a long term communications solution
  • How a new model for NHS care.data Communication could deliver


Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

Let’s assume the question of public trust is as important to those behind data sharing plans in the NHS [1] as they say it is. That the success of the care.data programme today, and as a result the very future of the NHS, depends upon it.

“Without the care.data programme, the health service will not have a future, said Tim Kelsey, national director for patients and information, NHS England.” [12]

And let’s assume we accept that public trust is not about the public, but about the organisation being trustworthy.[2]

The next step is to ask, how trustworthy is the programme and organisation behind care.data? And where and how do they start to build?

The table discussion on  [3] “Building Public Trust in Data Sharing”  considered  “what is the current situation?” and “why?”

What’s the current situation? On trust, public opinion is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are low for the state and government, but higher for GPs. It is therefore important that the medical profession themselves trust the programme in principle and practice. They are, after all, the care.data point of contact for patients.

The current status of the rollout, according to news reports, is that pathfinder practices are preparing to roll out [4] communications in the next few weeks. Engagement is reportedly being undertaken ‘over the summer months’.

Understanding both public trust and the current starting point matters as the rollout is moving forwards and as leading charity and research organisation experts said: “Above all, patients, public and healthcare professionals must understand and trust the system. Building that trust is fundamental. We believe information from patient records has huge potential to save and improve lives but privacy concerns must be taken seriously. The stakes are too high to risk any further mistakes.” [The Guardian Letters, July 27, 2015]

Here are three steps I feel could be addressed in the short term, to start to demonstrate why the public and professionals should trust both organisation and process.

What is missing?

1. Opt out: The type 2 opt out does not work. [5]  

2a. Professional voices called for answers and change: as mentioned in my previous summary, various bodies called for change, including the BMA, whose policy [6] remains that care.data should be on a patient opt-in basis.

2b. Public voices called for answers and change: care.data’s own listening event feedback [7] concluded there was much more than ‘communicate the benefits’ that needed done. Much is missing, such as questions on confusing the SCR and care.data, legislation and concern over controlling its future change, GPs’ concerns over their ethical stance, the Data Guardian’s statutory footing, correction of mistakes, future funding and more.
How are these open questions being addressed, if at all?

3. A single clear point of ownership on data sharing and public trust communications: is this now the NIB, the NHS England Patients and Information Directorate, or the DH – who owns care.data now? It’s hard to ask questions if you don’t know where to go, and the boards seem to have stopped any public communications. Why? The public needs clarity of organisational oversight.

What’s the Solution? 

1. Opt out: The type 2 opt out does not work. See the post graphic: the public wanted more clarity over opt out in 2014, so this needs explained clearly. >> Solution: follows below, from a detailed conversation with Mr. Kelsey.

2. Answers to professional opinions: The Caldicott panel raised 27 questions in areas of concern in their report. [8] There has not yet been any response to address them made available in the public domain by NHS England. Ditto the APPG report, the BMA LMC vote, and others. >> Solution: publish the responses to these concerns and demonstrate what is being done to address them.

2b. Fill in the lack of transparency: There is no visibility of any care.data programme board meeting minutes or materials from 2015. In eight months, nothing has been published. Their 2014 proposal for transparency appears to have come to nothing. Why?  The minutes from June-October 2014 are also missing entirely, and the October-December 2014 materials published were heavily redacted. There is a care.data advisory board, which seems to have had little public visibility recently either. >> Solution: the care.data programme business case must be detailed and open to debate in the public domain by professionals and public. Scrutiny of its associated current costs and time requirements, and its ongoing future financial implications at all levels, should be welcomed by national, regional (CCG) and local level providers (GPs). Proactively publishing creates demonstrable reasons why both the organisation and the plans are trustworthy. Refusing this without clear justification seems counterproductive, which is why I have challenged it in the public interest. [10]

3. Address public and professional confusion of ownership: Since data sharing and public trust are two key components of the care.data programme, it seems to come under the NIB umbrella, but there is a care.data programme board [9] of its own with a care.data Senior Responsible Owner and Programme Director. >> Solution: an overview of where all the different nationally driven NHS initiatives fit together and their owners would be helpful.

[Anyone got an interactive Gantt chart for all national level driven NHS initiatives?]

This would also help the public and professionals see how and why different initiatives have co-dependencies. It could also be a tool to reduce the ‘them and us’ mentality, and would be useful for modelling ‘what if’ scenarios and reality checks on 5YFV roadmaps: for example, if care.data pushes back six months, what else is delayed?

If the public can understand how things fit together it is more likely to invite questions, and an engaged public is more likely to be a supportive public. Criticism can be quashed if it’s incorrect. If it is justified criticism, then act on it.

Yes, these are hard decisions. Yes, to delay again would be awkward. If it were the right decision, would it be worse to ignore it and carry on regardless? Yes.

The most important of the three steps in detail: a conversation with Mr. Kelsey on Type 2 opt out. What’s the Solution?

We’re told “it’s complicated.” I’d say “it’s simple.” Here’s why.

At the table of about fifteen participants at the Bristol NIB event, Mr. Kelsey spoke very candidly and in detail about consent and the opt out.

On the differences between consent in direct care and other uses he first explained the assumption in direct care. Doctors and nurses are allowed to assume that you are happy to have your data shared, without asking you specifically. But he said, “beyond that boundary, for any other purpose, that is not a medical purpose in law, they have to ask you first.”

He went on to explain that what has changed the whole dynamic of the conversation is the fact that the current Secretary of State decided that when your data are being shared for purposes other than your direct care, you not only have the right to be asked, but actually, if you said you didn’t want it to be shared, that decision has to be respected by your clinician.

He said: “So one of the reasons we’re in this rather complex situation now, is because if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Therefore, I asked him where the public stands with that now, because at the moment there are around 700,000 people who we know said no in spring 2014.

Simply: They opted out of data used for secondary purposes, and HSCIC continues to share their data.

“Is anything more fundamentally damaging to trust, than feeling lied to?”

Mr. Kelsey told the table there is a future solution, but asked us not to tweet when. I’m not sure why; it was mid-conversation and I didn’t want to interrupt to ask:

“we haven’t yet been able to respect that preference, because technically the Information Centre doesn’t have the digital capability to actually respect it.”

He went on to say that they have hundreds of different databases and that, at the moment, it takes 24 hours for a single person’s opt out to be respected across all those hundreds of databases. He explained that someone manually has to enter a field on each database to say a person has opted out. He asked that the hoped-for timing not be tweeted, but explained that all the current historic objections which have been registered will be respected at a future date.
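As a thought experiment only: propagating a single dissent flag across many separate data stores is, in principle, a small piece of automation rather than a manual data entry task. Below is a minimal illustrative sketch of what that might look like; the database paths, table and column names are entirely hypothetical and are not based on HSCIC’s actual systems.

```python
# A purely illustrative sketch of applying one person's opt-out across many
# separate databases, instead of someone keying it in by hand on each one.
# All paths, table and column names are hypothetical.
import sqlite3
from typing import Iterable


def register_opt_out(nhs_number: str, database_paths: Iterable[str]) -> int:
    """Set a dissent flag for one patient in every listed database.

    Returns the number of databases successfully updated.
    """
    updated = 0
    for path in database_paths:
        try:
            with sqlite3.connect(path) as conn:
                # Hypothetical schema: a patients table with an opt_out column.
                conn.execute(
                    "UPDATE patients SET opt_out = 1 WHERE nhs_number = ?",
                    (nhs_number,),
                )
                updated += 1
        except sqlite3.Error as err:
            # A failed update would need logging and retrying, otherwise the
            # objection is silently not respected in that particular store.
            print(f"could not update {path}: {err}")
    return updated


# Example: one objection, applied across every known data store at once.
# databases = ["hes.db", "mortality.db", "prescribing.db"]
# register_opt_out("9434765919", databases)
```

The point is not the code itself, but that respecting a registered objection is, technically, a tractable problem rather than an open-ended one.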

One of the other attendees expressed surprise that GP practices hadn’t been informed of that, having gathered consent choices in 2014, and suggested the dissent code could be extracted now.

The table discussion then took a different turn with other attendee questions, so I’m going to ask here what I would have asked next in response to his statement, “if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Where is the logic to proceed with pathfinder communications?

What was said has not been done and you therefore appear untrustworthy.

If there is to be a future solution, it will need communicated (again).

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

There needs to be demonstrable action that what the org said it would do, the org did. Respecting patient choice is not an optional extra. It is central in all current communications. It must therefore be genuine.

Knowing that what was promised was not respected might mean millions of people choose to opt out who would not otherwise do so, if the process worked when you communicate it.

Before then, any public communications in Blackburn with Darwen, Somerset, Hampshire and Leeds surely don’t make sense.

Either the pathfinders will test the same communications that are to be rolled out nationally, or they will not. Either those communications will explain the secondary uses opt out, or they will not. Either they will explain the opt out as it is [type 2 not working] or as they hope it might be in future [working]. Not all of these can be true.

People who opt out on the basis of a broken process, simply due to a technical flaw, are unlikely ever to opt back in again. If it works to start with, they might choose to stay in.

Or will the communications roll out in pathfinders with a forward-looking promise, repeating what was promised but has not yet been done? We will respect your choice (and this time we really mean it)? Would public trust survive that level of uncertainty? In my opinion, no.

There needs to be demonstrable action in future as well, that what the org said it would do, the org did. So the data use audit report, and how any future changes will be communicated, both seem basic principles to clarify for the current rollout as well.

So what’s missing and what’s the solution on opt out?

We’re told “it’s complicated.” I say “it’s simple.” The promised opt out must work before moving forward with anything else. If I’m wrong, then let’s get the communications materials out for broad review, to see how they accommodate this and the future re-communication of a second process.

There must be a budgeted and planned future change communication process.

So how trustworthy is the programme and organisation behind care.data?

Public opinion on trust levels is measurable. The Royal Statistical Society Data Trust Deficit shows that the starting points are clear. The current position must address the opt out issue before anything else. Don’t say one thing, and do another.

To score more highly on the ‘trustworthy scale’ there must be demonstrable action, not simply more communications.

Behaviours need to change and be modelled in practice, with a focus on people, not on tools and tech solutions, which make patients feel as if they are less important to the organisations than their desire to ‘enable data sharing’.

Actions need to demonstrate that they are ethical and robust as a 21st century solution.

Policies, practical steps and behaviours all play vital roles in demonstrating that the organisations and people behind care.data are trustworthy.

These three suggestions are short term, by that I mean six months. Beyond that further steps need to be taken to be demonstrably trustworthy in the longer term and on an ongoing basis.

Right now, do I trust that the physical security of HSCIC is robust? Yes.

Do I trust that the policies in the programme would not pass my data in future to third party commercial pharma companies? No.
Do I believe that for enabling commissioning my fully identifiable confidential health records should be stored indefinitely with a third party? No.
Do I trust that the programme would not potentially pass my data to non-health organisations, such as police or Home Office? No.
Do I trust the programme to tell me if they change the purposes from those which they outline now? No.

I am open to being convinced.

*****

In my next post, Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data, I address what is missing from communications to date, what looks unlikely to be included in the current round, and why that matters. Why a future change management model of consent needs approached now, and not after the pilot, I wrap up in [5]: Future solutions.


Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

Here’s my opinion after taking part in the NIB #health2020 Bristol event on 24/7/2015, and the presentation of plans at the King’s Fund hosted event in June. Data sharing includes plans for extraction and use of primary care data by third parties, charging ahead under the care.data banner.

Wearing my hat from a previous role in change management and communications, I share my thoughts in the hope the current approach can adapt and benefit from outside perspectives.

The aim of “Rebuilding and sustaining Public trust” [1] needs refocused to treat the cause, not only the symptoms of the damage done in 2014.  Here’s why:

A Seven Step Top Line Summary

1. Abstract ‘public trust’ is not vital to the future of data sharing. Being demonstrably worthy of public trust is.

2. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is.

3. A timed target to ‘get the public’s data’, is not what is needed. Having a stable, long term future-proofed and governable model is.

4. Tech solutions do not create trust. Enabling the positive human response to what the org wants from people does – their confident ‘yes’ to data-sharing. [It might be supported by technology-based tools.]

5. Communications that tell the public ‘we know best, trust us’ fail.  While professional bodies [the BMA [2], the GPES advisory group, the APPG report calling for a public benefits plan, the ICO, and expert advice such as Caldicott] are ignored or remain to be acted upon, it remains challenging for the public to see how the programme’s needs, motives and methods are trustworthy. The [Caldicott 2] Review Panel found that “commissioners do not need dispensation from confidentiality, human rights & data protection law.” [3] Something’s gotta give. What will it be?

6. care.data consistency. Relationships must be reliable and have integrity.
“Trust us – see the benefits” [But we won’t share the business cost/benefit plan.]
“Trust us – we’re transparent” [But there is nothing published in 2015 at all from the programme board minutes] [4]
“Trust us – we’ll only use your data wisely, with the patient in control” [Ignore that we didn’t before [5] and that we still share your data for secondary uses even if you opted out [6] and no, we can’t tell you when it will be fixed…]

7. Voices do not exist in a vacuum. Being trustworthy on care.data  does not stand alone but is part of the NHS ‘big picture’.
Department of Health to GPs: “Trust us about data sharing.” [And ignore that we haven’t respected many of your judgements or opinions.]
NHS England to GPs: “Trust us about data sharing.” [And ignore our lack of general GP support: MPIG withdrawal, misrepresentation in CQC reports.]
NHS England and Department of Health to professionals and public: “The NHS is safe in our hands.”
Everyone: “We see no evidence that plans for cost savings, 7 day working, closures and the 5YFV integration will bring the promised benefits. Let us ‘see the holes’, so that we can trust you based on evidence.”

See the differences?

Target the cause, not the symptom:

The focus in the first half, the language used by NHS England/NIB/DH, sets out their expectations of the public: “You must trust us and how you give us your data.”

The focus should instead be on the second half, a shift to the organisation, NHS England/NIB/DH, and set out expectations from the public point of view: enable the public to trust the organisation. Enable individual citizens to trust what is said by individual leaders. This will enable citizens to be consensual sharers in the activity your organisation imposes – the demand for care.data through a statutory gateway, obliging GPs to disclose patient data.

The fact that trust is broken – and, specifically on data-sharing, that there is a deficit [A] between how much the public trusts the organisation and how the organisation handles data – is not the fault of the public, or “1.4M NHS staff”, or the media, or patient groups’ pressure. It’s based on proven experience.

It’s based on how organisations have handled data in the past. [5] Specifically, on the decisions made by the DH, the Information Centre and leaders in between; those who chose to sell patient data without asking the public.

The fact that trust is broken is based on how leadership individuals in those organisations have responded to that. Often taking no responsibility for loss.

No matter how often we hear “commissioners will get a better joined up picture of care needs and benefit you”, it does not compensate for past failings.

Only demonstrable actions to show why it will not happen in future can start that healing process.

Target the timing to the solution, not a shipping deadline

“Building trust to enable data sharing” aims at quick fixes, when what is needed is a healing process and ongoing relationship maintenance.

Timing has to be tailored to what needs done, not to an ‘artificial deadline’. Despite that having been said, it doesn’t seem to match reality.

Addressing the Symptoms and not the Cause will not find a Cure

What needs done?

Lack of public trust and the data trust deficit [A] are symptoms in the public to be understood. But it is the causes in the organisations that must be treated.

So far many NHS England staff I have met in relation to care.data appear to have a “them and us” mentality. It’s almost tangibly wrapped up in the language used at these meetings, or in defensive derision of public concerns: “tin foil hat wearers”, “Luddites” [7] and my personal favourite, ‘consent fetishists.’ [8] It’s counterproductive and seems born of either a lack of understanding or frustration.

The NIB/DH/NHS England/P&I Directorate must accept that they cannot force consensual change in an emotion-based belief, grounded in past experiences, held by the public.

Those people each have different starting points of knowledge and beliefs.  As one attendee said, “There is no single patient replicated 60 million times.”

The NIB/DH/NHS England/P&I Directorate can only change what they themselves can control. They have to model, and be seen to model, change that is trustworthy.

How can an organisation demonstrate it is trustworthy?

This means shifting the focus of the responsibility for change from public and professionals, to leadership organisation.

There is a start in this work stream, but there is little new that is concrete.

The National Data Guardian (NDG) role has been going to be put on a legal footing “at the earliest opportunity” since November 2014. [9] Nine months.

Updated information governance guidance is on the way.

Then there are two really strong new items that would underpin public trust, to be planned in a ‘roadmap’: the first, a system that can record and share consent decisions; and the second, a means to provide information on the use to which an individual’s data has been put.

How and when those two keystones of public trust will actually be offered appears unknown. They would encourage public trust by enabling choice and control over our data. So I would ask: if we’re not there yet on the roadmap, how can consent options be explained to the public in care.data communications, if there is as yet no mechanism to record and effect them? More on that later.
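To make the two keystones concrete, here is a minimal sketch of the kind of structures they imply: a register of consent decisions that can be consulted before any release, and a log of data uses that can be played back to an individual as a usage report. All class and field names are hypothetical illustrations of the idea, not any NIB or HSCIC specification.

```python
# Illustrative only: a consent register and a "you said, we did" usage log.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional


@dataclass
class ConsentDecision:
    patient_id: str          # pseudonymous identifier
    scope: str               # e.g. "secondary-uses"
    opted_out: bool
    recorded_at: datetime


@dataclass
class DataUse:
    patient_id: str
    recipient: str           # who received the data
    purpose: str             # why it was released
    released_at: datetime


@dataclass
class ConsentRegister:
    decisions: List[ConsentDecision] = field(default_factory=list)
    uses: List[DataUse] = field(default_factory=list)

    def latest_decision(self, patient_id: str, scope: str) -> Optional[ConsentDecision]:
        """Most recent decision for this patient and scope, or None if never recorded."""
        relevant = [d for d in self.decisions
                    if d.patient_id == patient_id and d.scope == scope]
        return max(relevant, key=lambda d: d.recorded_at, default=None)

    def usage_report(self, patient_id: str) -> List[DataUse]:
        """Every recorded use of this person's data: the basis of a usage report."""
        return [u for u in self.uses if u.patient_id == patient_id]
```

The design point is simply that a release should be checked against the latest recorded decision, and every release should leave an entry that the individual can later see.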

Secondly, when will a usage report be available? That will be the proof to demonstrate that what was offered was honoured. It is one of the few tools the organisation(s) can offer to demonstrate they are trustworthy: you said, we did. So again, why jeopardise public trust by rolling out data extractions into the existing, less trustworthy environment?

How well this is done will determine whether it can realise its hoped-for benefits. How the driving leadership influences that outcome will come down to the organisational approach to opt out, communicating care.data content decisions, the way and the channels in which they are communicated, accepting what has not worked to date, and planning long-term approaches to communicating change before the pathfinders start. [Detailed steps on this follow.]

Considering the importance we have been told the programme has, it’s vital to get this right. [10]

I believe that changing the approach, from explaining benefits and focusing on public trust, to making demonstrable changes that show why the public should trust, will make all the difference.

So before rolling out the next data sharing steps, think hard about what the possible benefits and risks will be, versus waiting for a better environment to do it in.

Conclusion: Trust is not about the public. Public trust is about the organisation being trustworthy. Over to you, orgs.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

This is Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

Part two: a New Approach is needed to understanding Public Trust – for those interested in a detailed approach on trust: what practical and policy steps influence trust, and on research and commissioning. Trust is not homogeneous. Trust is nuanced even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from – what behaviours influence trust. Fixing what has already been communicated is vital before new communications get rolled out: vital to the content of your communications, and vital for public trust and credibility.

Part four: Communicate the Benefits won’t work – How Communications influence trust. For those interested in more in-depth reasons, I outline in part two why the communications approach is not working, why the focus on ‘benefits’ is wrong, and fixes.

Part five: Future solutions – why a new approach may work better for future trust: not to attempt to rebuild trust where there is now none, but to strengthen what is already trusted and fix today’s flawed behaviours, honesty and reliability, which are vital to future-proofing trust.

####

Background References:

I’m passionate about people using technology to make their jobs and lives better and simpler, and about living well. So much so that this became over 5000 words. To solve that, I’ve assumed a baseline knowledge, and I will follow up with separate posts on why a new approach is needed to understanding “Public Trust”, on “Communicating the benefits” and on “Being trustworthy and other future solutions”.

If this is all new, welcome, and I suggest you look over some of the posts from the past 18 months, which include public voice captured from eight care.data events in 2014. care.data is about data sharing for secondary purposes, not direct care.

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] BMA LMC Vote 2014 http://bma.org.uk/news-views-analysis/news/2014/june/patients-medical-data-sacrosanct-declares–bma

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

Polls of public feeling:

[A] Royal Statistical Society Data Trust Deficit http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[B] Dialogue on data – work carried out through the ADRN

 

 

The National Pupil Database end of year report: D for transparency, C minus in security.

Transparency and oversight of how things are administered are simple ways that the public can both understand and trust that things run as we expect.

For the National Pupil Database, parents might be surprised, as I was, by some of the current practices.

The scope of use of the National Pupil Database, and who could access it, was changed in 2012. Although I had three children at school at that time, I heard nothing about it, nor did I read about it in the papers. (Hah – time to read the papers?)  So I absolutely agree with Owen Boswarva’s post when he wrote:

“There appears to have been no concerted effort to bring the consultation or the NPD initiative to the attention of parents or pupils (i.e. the data subjects themselves). This is a quote from one of the parents who did respond:

“I am shocked and appalled that I wasn’t notified about this consultation through my child’s school – I read about it on Twitter of all things. A letter should have gone to every single parent explaining the proposals and how to respond to this consultation.”

(Now imagine that sentiment amplified via Mumsnet …)”
[July 2013, blog by O. Boswarva]

As Owen wrote, imagine that sentiment amplified via Mumsnet indeed.

Here’s where third parties can apply, and here’s a list of who has been given data from the National Pupil Database. (It’s only been updated twice in 18 months; the most recent update came after I asked about it, in May 2015.) The tier groups 1-4 are explained here on p.18, where tier 1 is the most sensitive, identifiable classification.

The consultation suggested in 2012 that the changes could be an “effective engine of economic growth, social wellbeing, political accountability and public service improvement.”

Has anyone measured whether the justification given has begun to be achieved? Research can take a long time, and implementing any changes as a result, more time. But perhaps some measure of public benefit has already begun to accrue?

The release panel, one would hope, has begun to track this. [Update: the DfE confirmed on August 20th that they do not track benefits, nor have they ever audited recipients.]

And in parallel, what oversight provides the checks and balances to make sure that the drive for an ‘engine of economic growth’ remembers to treat these data as knowledge about our children?

Is there that level of oversight from application to benefits measurement?

Is there adequate assessment of privacy impact and ethics in applications?

What troubles me about the National Pupil Database is not the data it contains per se, but the lack of child and guardian involvement, the lack of accountable oversight of how it is managed, and the lack of full transparency around who uses it and under what processes.

Some practical steps forward

Steps taken now could resolve some of these issues and avoid the risk of them becoming future concerns.

The first is thorough fair processing, as I covered in my previous post.

The submission of the school census returns, including a set of named pupil records, has been a statutory requirement on schools since the Education Act 1996. That’s almost twenty years ago in the pre-mainstream internet age.

The Department must now shape up its current governance practices in its capacity as the data processor and controller of the National Pupil Database, to be fit for the 21st century.

Ignoring current weaknesses actively accepts an ever-increasing reputational risk for the Department, schools, other data sharing bodies, those who link to the data, and its bona fide research users. If people lose trust in how data are used, they won’t share at all and data quality will suffer: bad for the functional administration of the state and for the individual, but also for the public good.

That concerns me also wearing my hat as a lay member of the ADRN panel, because it’s important that the public trusts our data is looked after wisely, so that research can continue to use it, for advances in health, social science and all sorts of other areas of knowledge, to improve our understanding of society and make it better.

Who decides who gets my kids’ data, even if I can’t?

A Data Management Advisory Panel (DMAP) considers only some of the applications: tier 1 data requests. Those are the most sensitive requests, but not the only applications for access to sensitive data.

“When you make a request for NPD data it will be considered for approval by the Education Data Division (EDD) with the exception of tier 1 data requests, which will be assessed by the department’s Data Management Advisory Panel. The EDD will inform you of the outcome of the decision.”

Where is governance transparency?

What is the make-up of the Data Management Advisory Panel and the Education Data Division (EDD)? Who sits on them, and how are they selected? Do they document their conflicts of interest for each application? For how long are they appointed, and under what selection criteria?

Where is decision outcome transparency?

The outcome of each decision should be documented and published. However, the list has been updated only twice since its inception in 2012: once in December 2013, and most recently, ahem, on May 18 2015, after considerable prodding. There should be a regular timetable, with a responsible owner and a depth of insight into its decision making.

Where is transparency over decision making to approve or reject requests?

Do privacy impact assessments and ethics reviews play any role in applications and, if so, how are they assessed and by whom?

How are those sensitive and confidential data stored and governed?

The weakest link in any system is often said to be human error. Users of the NPD data vary from other government departments to “Mom and Pop” small home businesses, selling schools’ business intelligence and benchmarking.

So how secure are our children’s data really, and once the data have left the Department’s database, how are they treated? Does lots of form filling and emailing data protected by a personal password ensure good practice, or simply provide barriers that slow down the legitimate application process?

What happens to data that are no longer required for the given project? Are they properly deleted and what audits have ever been carried out to ensure that?

The National Pupil Database end of year report: a C- in security

The volume of data that can be processed now at speed is incomparable with 1996, and even 2012 when the current processes were set up. The opportunities and risks in cyber security have also moved on.

Surely the Department for Education should take its responsibility seriously and treat our children’s personal data and sensitive records as well as the HSCIC now intends to manage health data?

Processing administrative or linked data in an environment with layered physical security (e.g. a secure perimeter, CCTV, security guarding, or a locked room without remote connection such as internet access) is good practice, and reduces the risk of silly human error, or simple theft.

Is giving out chunks of raw data by email, with reams of paperwork as its approval ‘safeguards’ really fit for the 21st century and beyond?

[Image: the NPD data request tiers]

Twenty years on from the conception of the National Pupil Database, it is time to treat the personal data of our future adult citizens with the respect it deserves and we expect of best-in-class data management.

It should be as safe and secure as we treat other sensitive government data, and lessons could be learned from the FARR, ADRN and HSCIC safe settings.

Back to school – more securely, with public understanding and transparency

Understanding how that all works, how technology and people, data sharing and privacy, data security and trust all tie together, is fundamental to understanding the internet. When administrations take our data, they take on responsibilities for some of our participation in the dot.everyone that the state is so keen for us all to take part in. Many of our kids will live in a world which is the internet of things. Not getting that is to not understand the Internet.

And to reiterate some of why that matters, I go back to my previous post, in which I quoted Martha Lane Fox recently, and the late Aaron Swartz when he said: “It’s not OK not to understand the internet, anymore”.

While the Department for Education has turned down my subject access request to find out what the National Pupil Database stores on my own children, it matters too much to brush the issues aside as only important for me. About 700,000 children are born each year and will be added to this database each academic year. None ever get deleted.

Parents can, and must ask that it is delivered to the highest standards of fair processing, transparency, oversight and security. I’m certainly going to.

It’s going to be Back to School in September, and those annual privacy notices, all too soon.

*****

1. The National Pupil Database end of year report card

2. The National Pupil Database end of year report: an F in fair processing

3. The National Pupil Database end of year report: a D in transparency

References:

[1] The National Pupil Database user guide: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/261189/NPD_User_Guide.pdf

[2] Data tables to see the individual level data items stored and shared (by tabs on the bottom of the file) https://www.gov.uk/government/publications/national-pupil-database-user-guide-andsupporting-information

[3] The table to show who has bought or received data and for what purpose https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] Data Trust Deficit – from the RSS: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[5] Talk by Phil Booth and Terri Dowty: http://www.infiniteideasmachine.com/2013/04/terris-and-my-talk-on-the-national-pupil-database-at-the-open-data-institute/

[6] Presentation given by Paul Sinclair of the Department for Education at the Workshop on Evaluating the Impact of Youth Programmes, 3rd June 2013

What is in the database?

The Schools Census dataset contains approximately eight million records, collected incrementally every year since 1996, and includes variables on the pupil’s home postcode, gender, age, ethnicity, special educational needs (SEN), free school meals eligibility, and schooling history. It covers pupils in state-funded primary, secondary, nursery and special schools and pupil referral units. Schools that are entirely privately funded are not included.

Pupils can be tracked across schools and can now be followed throughout their school careers. The dataset also provides a very rich set of data on school characteristics, and there is further use in linking it to other related datasets, such as those on higher education, neighbourhoods and teachers in schools.

Data stored include the full range of personal and sensitive data, from name, date of birth and address, through SEN and disability needs. (Detail of the content is here.) To see what is in it, download the Excel sheet: NPD Requests.
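As a purely illustrative sketch (the field names here are my own, not the Department’s actual schema), a single pupil-level record of the kind described above might look something like this in Python:

# Purely illustrative: a hypothetical pupil-level record of the kind the
# Schools Census collects. Field names are mine, not the DfE's schema.
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class PupilRecord:
    name: str                     # identifiable
    date_of_birth: date           # identifiable
    home_postcode: str            # identifiable
    gender: str
    ethnicity: str
    sen_provision: str            # special educational needs
    free_school_meals: bool       # eligibility flag
    schooling_history: List[str]  # schools attended, enabling tracking across schools

example = PupilRecord("Jane Doe", date(2007, 9, 1), "AB1 2CD", "F",
                      "White British", "None", False, ["School A", "School B"])

Even this toy example shows why the combination of fields matters: postcode, date of birth and schooling history together are effectively identifying, whatever any single field looks like on its own.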

 

The Department for Education has specific legal powers to collect pupil, child and workforce data held by schools, local authorities and awarding bodies under section 114 of the Education Act 2005, section 537A of the Education Act 1996, and section 83 of the Children Act 1989. The submission of the school census returns, including a set of named pupil records, is a statutory requirement on schools under Section 537A of the Education Act 1996.

The nhs.uk digital platform: a personalised gateway to a new NHS?

In recent weeks the rebranding of the poverty definitions and the living wage in the UK deservedly received more attention than the rebrand of the website NHS Choices into ‘nhs.uk’.

The site, which will be available only in England and Wales despite its domain name, will be the doorway into a personalised digital NHS offering.

As the plans proceed without public debate, I took some time to consider the proposal announced through the National Information Board (NIB), because it may be a gateway to a whole new world in our future NHS. And if not, will it be a big splash of cash that creates nothing more than a storm in a teacup?

In my previous post I’d addressed some barriers to digital access. Will this be another? What will it offer that isn’t on offer already today and how will the nhs.uk platform avoid the problems of its predecessor HealthSpace?

Everyone it seems is agreed, the coming cuts are going to be ruthless. So, like Alice, I’m curious. What is down the rabbit hole ahead?

What’s the move from NHS Choices to nhs.uk about?

The new nhs.uk web platform would invite users to log on, using a system that requires identity and, if compulsory, would be another example of a barrier to access simply from a convenience point of view, even leaving digital security risks aside.

What will nhs.uk offer to incentivise users, and what benefit will it offer as a trade-off against these risks, so that they go down the new path into the unknown and like it?

“At the heart of the domain, will be the development of nhs.uk into a new integrated health and care digital platform that will be a source of access to information, directorate, national services and locally accredited applications.”

In that there is nothing new compared with information, top down governance and signposting done by NHS Choices today.  

What else?

“Nhs.uk will also become the citizen’s gateway to the creation of their own personal health record, drawing on information from the electronic health records in primary and secondary care.”

nhs.uk will be an access point to patient personal confidential records

Today’s Patient Online programme, we are told, offers 97% of patients access to their own GP-created records. So what will nhs.uk offer beyond what is supposed to be on offer already today? Adding wearables data into the health record is already possible for some EMIS users, so again, that won’t be new. It does state it will draw on both primary and secondary care records, which means getting some sort of interoperability to show both hospital systems data and GP records. How will the platform do this?

Until care.data many people didn’t know their hospital record was stored anywhere outside the hospital. In all the care.data debates the public was told that HES/SUS was not like a normal record in the sense we think of it. So what system will secondary care records come from? [Some places may have far to go. My local hospital pushes patients round with beige paper folders.] The answer appears to be an unpublished known or an unknown.

What else?

nhs.uk will be an access point to tailored ‘signposting’ of services

In addition to access to your personal medical records, in the new “pull not push” approach the nhs.uk platform will also offer information and services, in effect ‘advertising’ local services, to draw users to want to use it rather than forcing its use. And through the power of web tracking tools combined with log-in, it can all be ‘tailored’ or ‘targeted’ to you, the user.
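To illustrate the mechanism (a minimal sketch of my own, not NHS England’s design or code; all names and data below are invented), a log-in identity combined with tracked page views is all it takes to ‘tailor’ what a user is shown:

# Minimal sketch of how log-in plus web tracking enables 'tailored' signposting.
# My own illustration, not NHS England's design; names and data are invented.
from collections import Counter

page_views = {  # pages a logged-in user has viewed, as a tracking tool might record
    "user_123": ["diabetes", "diabetes", "weight-loss", "local-gp-services"],
}

local_services = {  # local services that could be 'signposted', keyed by topic
    "diabetes": "Diabetes support group, Anytown Health Centre",
    "weight-loss": "NHS weight management programme, Anytown",
}

def tailored_signposts(user_id):
    # Rank service suggestions by how often the user viewed each topic.
    topics = Counter(page_views.get(user_id, []))
    return [local_services[t] for t, _ in topics.most_common() if t in local_services]

print(tailored_signposts("user_123"))

The point is how little it takes: once browsing is tied to an identified account, a profile of interests, and by implication of health concerns, falls out almost for free.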

“Creating an account will let you save information, receive emails on your chosen topics and health goals and comment on our content.”

Do you want to receive emails on your chosen topics or comment on content today? How does it offer more than can already be done by signing up now to NHS Choices?

NHS Choices today already offers information on local services, on care provision, and a symptom checker.

What else?

Future nhs.uk users will be able to “Find, Book, Apply, Pay, Order, Register, Report and Access,” according to the NIB platform headers.

[Image: the NIB nhs.uk platform headers]

“Convenient digital transactions will be offered like ordering and paying for prescriptions, registering with GPs, claiming funds for treatment abroad, registering as an organ and blood donor and reporting the side effects of drugs. This new transactional focus will complement nhs.uk’s existing role as the authoritative source of condition and treatment information, NHS services and health and care quality information.

“This will enable citizens to communicate with clinicians and practices via email, secure video links and fill out pre-consultation questionnaires. They will also be able to include data from their personal applications and wearable devices in their personal record. Personal health records will be able to be linked with care accounts to help people manage their personal budget.”

Let’s consider those future offerings more carefully.

Separating out the transactions that for most people will be one-off, extremely rare or never events (my blue) leaves other activities which you can already do, or will do, via the Patient Online programme (in purple).

The question is: although video and email are not yet widespread, where they do work today and would in future, would they not be done via a GP practice system rather than a centralised service? Or is the plan not that you could have an online consultation with ‘your’ named GP through nhs.uk, but perhaps with just ‘any’ GP from a centrally provided GP pool? Something like this?

That leaves two other things, which are both payment tools (my bold).

i. digital transactions will be offered like ordering and paying for prescriptions
ii. …linked with care accounts to help people manage their personal budget.”

Is the core of the new offering about managing money at individual and central level?

Beverly Bryant, Director of Strategic Systems and Technology at NHS England, said at the #kfdigi2015 event on June 16th that implementing these conveniences had cost-saving benefits as well: “The driver is customer service, but when you do it it actually costs less.”

How are GP consultations to cost less, significantly less, to be really cost-effective compared with the cost of the central platform that enables them, when GP time is the most valuable part and remains unchanged, spent on the patient consultation, paperwork and referrals, for example?

That most valuable part to the patient, may be seen as what is most costly to ‘the system’.

If the emphasis is on the service saving money, it’s not clear what is in it for people to want to use it, and it risks becoming another HealthSpace: a high-cost, top-down IT rollout without a clear customer-driven need.

The stated aim is that it will personalise the user content and experience.

That gives the impression that the person using the system will get access to information and benefits unique and relevant to them.

If this is to be something patients want to use (pull) and are not to be forced to use (push) I wonder what’s really at its core, what’s in it for them, that is truly new and not part of the existing NHS Choices and Patient online offering?

What kind of personalised tailoring do today’s NHS Choices Ts&Cs sign users up to?

“Any information provided, or any information the NHS.uk site may infer from it, are used to provide content and information to your account pages or, if you choose to, by email.  Users may also be invited to take part in surveys if signed up for emails.

“You will have an option to submit personal information, including postcode, age, date of birth, phone number, email address, mobile phone number. In addition you may submit information about your diet and lifestyle, including drinking or exercise habits.”

“Additionally, you may submit health information, including your height and weight, or declare your interest in one or more health goals, conditions or treatments. “

“With your permission, academic institutions may occasionally use our data in relevant studies. In these instances, we shall inform you in advance and you will have the choice to opt out of the study. The information that is used will be made anonymous and will be confidential.”

Today’s NHS Choices terms and conditions say that “we shall inform you in advance and you will have the choice to opt out of the study.”

If that happens already, and the NHS is honest about its intent to give patients the right to opt out of studies using data gathered from registered users of NHS Choices, why is it failing to honour the 700,000 objections to secondary use of personal data via the HSCIC?

If the future system is all about personal choice, the NIB should perhaps start by acting on the choices the public has already made.

Past lessons learned – platforms and HealthSpace

The previous NHS personal platform, HealthSpace, came in for some fairly straightforward criticism in the past, including that it offered too little functionality.

The remarks in The Devil’s in the Detail are as relevant today on what users want as they were in 2010. It looked at the then-available Summary Care Record (prescriptions, allergies and reactions) and the HealthSpace web platform, which tried to create a way for users to access it.

Past questions from HealthSpace remain unanswered for today’s care.data, or indeed for future nhs.uk data: what happens if there is a mistake in the record and the patient wants it deleted? How will access be given to third-party carers acting on behalf of individuals without the capacity to consent to access to their records?

Reasons given by non-users of HealthSpace included lack of interest in managing their health in this way, a perception that health information was the realm of health professionals and lack of interest or confidence in using IT.

“In summary, these findings show that ‘self management’ is a much more complex, dynamic, and socially embedded activity than original policy documents and technical specifications appear to have assumed.”

What lessons have been learned? People today are still questioning the value of a centrally imposed system. Are they being listened to?

Digital Health reported that Maurice Smith, GP and governing body member for Liverpool CCG, speaking in a session on self-care platforms at the King’s Fund event, said that driving people towards one national hub for online services was not an option he would prefer, and that he had no objection to a national portal, “but if you try drive everybody to a national portal and expect everybody to be happy with that I think you will be disappointed.”

How will the past problems that hit Healthspace be avoided for the future?

How will the powers-that-be avoid repeating the same problems in the ongoing rollout of care.data and future projects? I have asked this same question of NHS England/NIB leaders three times in the last year and it remains unanswered.

How will you tell patients in advance of any future changes who will access their data records behind the scenes, for what purpose, to future proof any programmes that plan to use the data?

One of the Healthspace 2010 concerns was: “Efforts of local teams to find creative new uses for the SCR sat in uneasy tension with implicit or explicit allegations of ‘scope creep’.”

Any programme using records can’t ethically sign users up to one thing and change it later without informing them before the change. Who will pay for that and how will it be done? care.data pilots, I’d want that answered before starting pilot communications.

As an example of changes to the ‘what’, or content scope creep, future plans will see ‘social care flags added’ to the SCR, states p.17 of the NIB 2020 timeline. What’s the ‘discovery for the use of genomic data complete’ about on p.11? Scope creep of ‘who’ will access records is very current: recent changes allow pharmacists to access the SCR, yet the change went by with little public discussion. Will they in future see social care flags or mental health data under their SCR access? Do I trust the chemist as I trust a GP?

Changes without adequate public consultation and communication cause surprises. Bad idea. Sir Nick Partridge said ensuring ‘no surprises’ is key to citizens’ trust after the audit of HES/SUS data uses. He is right.

The core at the heart of this nhs.uk plan is that it needs to be used by people, and enough people to make the investment vs cost worthwhile. That is what Healthspace failed to achieve.

The change you want to see doesn’t address the needs of the user as a change issue (slide 4). This is all imposed change, not user-need-driven change.

Dear NIB, done this way it seems to ignore the lessons of HealthSpace. The evidence shown is self-referential, citing Dr Foster and NHS Choices. The only other two examples listed are from Wisconsin and the Netherlands, hardly comparable models of UK lifestyle or healthcare systems.

What is really behind the new front door of the nhs.uk platform?

The future nhs.uk looks very much as though it seeks to provide a central front door to data access, in effect an expanded Summary Care Record (GP and secondary care records) – all medical records for direct care – together with a way for users to add their own wider user data.

Will nhs.uk also allow individuals to share their data with digital service providers of other kinds through the nhs.uk platform and apps? Will their data be mined to offer a personalised front door of tailored information and service nudges? Will patients be profiled to know their health needs, use and costs?

If yes, then who will be doing the mining and who will be using that data for what purposes?

If not, then what value will this service offer if it is not personal?

What will drive the need to log on to another new platform, compared with using the existing services of patient online today to access our health records, access GPs via video tools, and without any log-in requirement, browse similar content of information and nudges towards local services offered via NHS Choices today?

If this is core to the future of our “patient experience” of the NHS, the public should be given the full and transparent facts: to understand where the public benefit lies, the business case for nhs.uk, and what lies behind the change expected via online GP consultations.

This NIB programme is building the foundation of the NHS offering for the next ten years. What kind of NHS are the NIB and NHS England planning for our children and our retirement through their current digital designs?

If there is a significant difference in the new nhs.uk platform offering, the key change from what HealthSpace offered and distinct from what Patient Online already offers, it appears to be around managing cost and payments, not delivering any better user service.

Managing more of our payments with pharmacies and personalised budgets would reflect the talk of a push towards patient-responsible-self-management  direction of travel for the NHS as a whole.

More use of personal budgets is, after all, what Simon Stevens called a “radical new option”, and “wider scale rollout of successful projects is envisaged from 2016-17”.

When the system has finely drawn profiles of its users, what effect will that have for individuals in our universal, risk-shared system? Will a wider rollout of personalised budgets mean more choice, or could it start to mirror a private insurance system, in which a detailed user profile would determine your level of risk, and a personal budget, once reached, would mean no more service?

What I’d like to see and why

To date, transparency has a poor track record on sharing central IT/change programme business plans. One thing is said, while another happens in practice. Can that be changed? Why all the effort on NHS Citizen and ‘listening’, if the public is not to be engaged in ‘grown up debate‘ to understand the single biggest driver of planned service changes today: cost?

It’s at best patronising in the extreme, to prevent the public from seeing plans which spend public money.

We risk a wasteful, wearing repeat of the past top down failure of an imposed NPfIT-style HealthSpace, spending public money on a project which purports to be designed to save it.

To understand the practical future we can look back, to avoid what didn’t work, and compare with current plans. I’d suggest they spell out very clearly what the failures of HealthSpace were, and why nhs.uk is different.

If the site offers an additional pathway to access services beyond those we already have, it will cost more, not less. If there is a genuine expected cost reduction compared with today, where precisely will it come from?

I’d suggest you publish the detailed business plan for the nhs.uk platform and have the debate up front. Not only the headline numbers towards the end of these slides, but where and how it fits together in the big picture of Stevens’ “radical new option”.  This is public money and you *need* the public on side for it to work.

Publish the business cases for the NIB plans before the public engagement meet ups, because otherwise what facts will opinion be based on?

What discussion can be of value without them, when we are continually told by leadership those very  details are at the crux of needed change – the affordability of the future of the UK health and care system?

Now, as with past projects, The Devil’s in the Detail.

***

NIB detail on nhs.uk and other concepts: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/437067/nib-delivering.pdf

The Devil’s in the Detail: Final report of the independent evaluation of the Summary Care Record and HealthSpace programmes 2010

Digital revolution by design: infrastructures and the world we want

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge

This is Part 4.  Infrastructures and the world we want

At a high level, physical network infrastructures enable data transfer from one place to another, and average users perceive little of it.

In the wider world of Internet infrastructure, this week might be looked back on as, to use a horrible cliché, a game changer. A two-tier Internet traffic system could be coming to Europe, which would destroy a founding principle of equality: all traffic is created equal.

In other news, Facebook announced it will open an office in the toe of Africa, a foothold on a potential market of a billion people.

Facebook’s Internet.org initiative sees a further ‘magnificent seven’ companies working together. Two of them, Ericsson and Nokia, will between them have “an effective lock down on the U.S. market,” unless another viable network competitor emerges. And massive reach worldwide.

In Africa today there is a hodgepodge of operators, and I’ll be interested to see how much effect the boys ganging up under the protection of everybody’s big brother ‘Facebook’ will have on local markets.

And they’re not alone in wanting in on African action.

Whatever infrastructures China is building on and under the ground of the African continent, or whatever ludicrous showcase gifts it donates, how it is doing so has not gone unnoticed. Chinese working practices and environmental standards can provoke local disquiet.

Will Facebook’s decision makers shape up to offer Africa an ethical package that could include not only a social network, but physical one managing content delivery in the inner workings of tubes and pipes?

In Europe the data connections within and connecting the continent are shifting, as TTIP, CETA and TISA shape how our data and knowledge will be shared or reserved or copyrighted by multinational corporations.

I hope we will ensure transparency is designed into these supra-national agreements on private ownership of public firms.

We don’t want to find commercial companies withholding information, such as their cyber security planning and infrastructure investments, in the name of commercial protectionism, but at a public cost.

The public has opportunities now, as these agreements are being drawn up, that we may not get again soon.

Not only for the physical constructions, the often networked infrastructures, but for the intangible infrastructures of principles and power, the co-dependencies around a physical system: the legal and ethical infrastructures of ownership, governance and accountability.

The Open Data institute has just launched a call for the promotion of understanding around our own data infrastructures:

“A strong data infrastructure will increase interoperability and collaboration, efficiency and productivity in public and private sectors, nationally and internationally.”

Sounds like something we want to get right in, and outside, the UK.

Governance of data is physically geographical, through country-specific legislation, as well as supra-national, such as Europe-wide data protection.

These are in some ways outdated legal concepts in a borderless digital age, but they are at least one route over which there is manageable oversight, and through which citizens should be able to call companies and State to account.

Yet that accountability is questionable when laws seem to be bypassed under the banner of surveillance.

As a result people have somewhat lost trust in national bodies to do the right thing. We want to share data for public good but not for commercial exploitation. And we’re not sure who to trust with it.

Data governance of contractual terms is part of the infrastructure needed to prevent exploitation and enable not restrict sharing. And it needs to catch up with apps whose terms and conditions can change after a user has enrolled.

That comes back down to the individual, and some more ideas on those personal infrastructures are in the previous post.

Can we build lasting foundations fit for a digital future?

Before launching into haphazard steps towards a digital future in 2020, the NIB/NHS decision makers need to consider the wider infrastructures in which it is set, and understand what ethical compass they are steering by.

Can there be oversight to make national and supra-national infrastructures legally regulated, bindingly interoperable and provider and developer Ts and Cs easily understood?

Is it possible to regulate only that which is offered or sold through UK based companies or web providers and what access should be enabled or barriers designed in?

Whose interests should data and knowledge created from data serve?

Any state paid initiative building a part of the digital future for our citizens must decide, is it to be for public good or for private profit?

NHS England’s digital health vision includes: “clinical decision support to be auto populated with existing healthcare information, to take real time feeds of biometric data, and to consider genomics data in the future.”  [NIB plans, Nov 2014]

In that 66-page document, while it talks of data and trust and cyber security, ethics is not mentioned once. The ambition is to create ‘health-as-a-platform’ and its focus is on tech, not on principles.

‘2020’ is the goal, and it’s not a far-away future at all when counted as 1,175 working days from now.
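The figure is easy to check; here’s a minimal sketch, assuming the count runs from 1 July 2015 to 1 January 2020 and excludes weekends but not public holidays:

# Checking the "1,175 working days" figure. Assumptions: start 1 July 2015,
# target 1 January 2020; weekends excluded, public holidays not.
import numpy as np

print(np.busday_count('2015-07-01', '2020-01-01'))  # 1175

Knock off bank holidays and annual leave and the real number of working days is smaller still.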

By 2020 we may have moved on or away in a new digital direction entirely or to other new standards of network or technology. On what can we build?

Facebook’s founder sees a futuristic role for biometric data used in communication. Will he drive it? Should we want him to?

Detail will change, but ethical principles could better define the framework for development, promoting the best of innovation long term and protecting citizens from commercial exploitation. We need them now.

When Tim Berners-Lee called for a Magna Carta on the world wide web he asked for help to achieve the web he wants.

I think it’s about more than the web he wants. This fight is not only for net neutrality. It’s not only challenging the internet of things to have standards, ethics and quality that shape a fair future for all.

While we shape the web we want, we shape the world we want.

That’s pretty exciting, and we’d better try to get it right.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

 

Driving digital health, revolution by design

This follows on from: 1. Digital revolution by design: building for change and people.

***

Talking about the future of digital health in the NHS, Andy Williams went on to ask, what makes the Internet work?

In my head I answered him, freedom.

Freedom from geographical boundaries. Freedom of speech, to share ideas and knowledge in real time with people around the world. The freedom of fair and equal use. Cooperation, creativity, generosity…

Where these freedoms do not exist or are regulated the Internet may not work well for its citizens and its potential is restricted, as well as its risks.

But the answer he gave, was standards.

And of course he was right.  Agreed standards are needed when sharing a global system so that users, their content and how it works behind the screen cooperate and function as intended.

I came away wondering what the digital future embodied in the NHS NIB plans will look like, who has their say in its content and design, and who will control it?

What freedoms and what standards will be agreed upon for the NHS ‘digital future’ to function and to what purpose?

Citizens help shape the digital future as we help define the framework of how our data are to be collected and used, through what public feeling suggests is acceptable and what people actually use.

What are some of the expectations the public have and what potential barriers exist to block achieving its benefits?

It’s all too easy when discussing the digital future of the NHS to see it as a destination. Perhaps we could shift the conversation focus to people, and consider what tools digital will offer the public on their life journey, and how those tools will be driven and guided.

Expectations

One key public expectation will be of trust: if something digital is offered under the NHS brand, it must be of the rigorous standard we expect.

Is every app a safe, useful tool or fun experiment and how will users [especially for mental health apps where the outcomes may be less tangibly measured than say, blood glucose] know the difference?

A second expectation must be around universal equality of access.

A third expectation must be that people know once the app is downloaded or enrolment done, what they have signed up to.

Will the NHS England / NIB digital plans underway meet these expectations, or create and enable barriers?

What barriers exist to the NHS digital vision and why?

Is safety regulation a barrier to innovation?

The ability to broadly share innovation at speed is one of the greatest strengths of digital development, but can also risk spreading harm quickly. Risk management needs to be upfront.

We assume that digital designs will put at their heart the core principles in the spirit of the NHS. But if apps are not available on prescription and are essentially a commercial product with no proven benefit, does that exploit the NHS brand trust?

Regulation of quality and safety must be paramount, or apps risk doing harm to the person just as any other treatment could; regulation must further consider the reputational risk to the NHS and to the app providers.

Regulation shouldn’t be seen as a barrier, but as an enabler to protect and benefit both user and producer, and indirectly the NHS and state.

Once safety regulation is achieved, I hope that spreading benefits will not be undermined by creating artificial boundaries that restrict access to the tools by affordability, in a postcode lottery,  or in language.

But are barriers being built by design in the NHS digital future?

Cost: commercial digital exploitation or digital exclusion?

There appear to be barriers being built by design into the current NHS apps digital framework. The first being cost.

Even for the poorest in the UK today, in maternity care, exclusion is already measurable between those who can and cannot afford the smartphone data allowance needed for e-red book access, attendees were told by its founder at #kfdigital15.

Is digital participation and its resultant knowledge or benefit to become a privilege reserved for those who can afford it? No longer free at the point of service?

I find it disappointing that, for all the talk of digital equality, apps are for sale on the NHS England website and many state they may not be available in your area – a two-tier NHS by design. If it’s an NHS app, surely it should be available on prescription and/or be free at the point of use, for all, like any other treatment? Or is it yet another example of NHS postcode-lottery care?

There are tonnes of health apps on the market which may not have much proven health benefit, but they may sell well anyway.

I hope that decision makers shaping these frameworks and social contracts in health today are also looking beyond the worried well, who may be the wealthiest and able to afford apps, so that the needs of those who can’t afford to pay for them are not left behind.

At home, it is some of the least wealthy who need the most intervention, and from whom there may be little profit to be made. There is little I can see in the 2020 plans that focuses on the most vulnerable: those in prison and IRCs, and those with disabilities.

Regulation, in addition to striving for quality and safety by design, can ensure there is no commercial exploitation of purchasers. However, it is a question of principle that will decide for or against the exclusion of users based on affordability.

Geography: crossing language, culture and country barriers

And what about our place in the wider community, the world wide web, as Andy Williams talked about: what makes the Internet work?

I’d like to think that governance and any “kite marking” of digital tools such as apps, will consider this and look beyond our bubble.

What we create and post online will be on the world wide web.  That has great potential benefits and has risks.

I feel that in the navel gazing focus on our Treasury deficit, the ‘European question’ and refusing refugees, the UK government’s own insularity is a barrier to our wider economic and social growth.

At the King’s Fund event and at the NIB meeting the UK NHS leadership did not discuss one of the greatest strengths of online.

Online can cross geographical boundaries.

How are NHS England-approved apps going to account for geography, language and cross-country regulation?

What geographical and cultural barriers to access are being built by design just through lack of thought into the new digital framework?

Barriers that will restrict access and benefits both in certain communities within the UK, and to the UK.

One of the three questions asked at the end of the NIB session, was how the UK Sikh community can be better digitally catered for.

In other parts of the world both traditional and digital access to knowledge are denied to those who cannot afford it.

[Photo: boys on their way to school, reportedly Indonesia]

This photo, reportedly from Indonesia, is great [via Banksy on Twitter, and apologies that I cannot credit the photographer]: two boys on the way to school pass their peers on their way to work.

I wonder if one of these boys has the capability to find the cure for cancer?
What if he is one of the five, not one of the two?

Will we enable the digital infrastructure we build today to help global citizens access knowledge and benefits, or restrict access?

Will we enable broad digital inclusion by design?

And what of data sharing restrictions: barrier or enabler?

Organisations that talk only of legal, ethical or consent ‘barriers’ to datasharing don’t understand human behaviour well enough.

One of the greatest risks to achieving the potential benefits from data is the damage done to it by organisations that are paternalistic and controlling. They exploit a relationship rather than nurturing it.

The data trust deficit from the Royal Statistical Society has lessons for policymakers. Including finding that: “Health records being sold to private healthcare companies to make money for government prompted the greatest opposition (84%).”

Data are not an abstract to be exploited, but personal information. Unless otherwise informed, people expect that information offered for one purpose, will not be used for another. Commercial misuse is the greatest threat to public trust.

Organisations that believe behavioural barriers to data sharing are an obstacle have forgotten that trust is not something to be overcome, but to be won and continuously reviewed and protected.

The known barrier without a solution is the lack of engagement that is fostered where there is a lack of respect for the citizen behind the data. A consensual data charter could help to enable a way forward.

Where is the wisdom we have lost in knowledge?

Once an app is prescribed and used, and data are exchanged with the NHS health provider and/or app designer, how will users know that what they agreed to in an in-store app does not change over time?

How will ethical guidance be built into the purposes of any digital offerings we see approved and promoted in the NHS digital future?

When Facebook’s recent social media experiment mentioned the use of data for research only after the experiment, it caused an outcry.

It crossed the line between what people felt was acceptable and what was intrusive, by analysing the change in behaviour that Facebook’s intervention caused.

That this manipulation is not only possible but could go unseen is both a risk and a cause for concern in a digital world.

Large digital platforms, even small apps have the power to drive not only consumer, but potentially social and political decision making.

“Where is the knowledge we have lost in information?” asks T. S. Eliot in Choruses from The Rock. “However you disguise it, this thing does not change: The perpetual struggle of Good and Evil.”

Knowledge can be applied to make a change to current behaviour, and offer or restrict choices through algorithmic selection. It can be used for good or for evil.

‘Don’t be evil’, Google’s adopted mantra, is not just some silly slogan.

Knowledge is power. How that power is shared with or withheld from citizens matters, not only for today’s projects, but for the whole future that digital is helping to create. Online and offline. At home and abroad.

What freedoms and what standards will be agreed upon for it to function and to what purpose? What barriers can we avoid?

When designing for the future I’d like to see discussion consider not only the patient need, and potential benefits, but also the potential risk for exploitation and behavioural change the digital solution may offer. Plus, ethical solutions to be found for equality of access.

Regulation and principles can be designed to enable success and benefits, not viewed as barriers to be overcome.

There must be an ethical compass built into the steering of the digital roadmap that the NHS is so set on, towards its digital future.

An ethical compass guiding consumer app regulation, to enable fairness of access, and to ensure that when apps are downloaded or digital programmes begun, users know what they have signed up to.

Fundamental to this, as the NIB speakers all recognised at #kfdigital15, is the ethical and trustworthy extraction, storage and use of data.

There is opportunity to consider when designing the NHS digital future [as the NIB develops its roadmaps for NHS England]:

i. making principled decisions on barriers,
ii. pro-actively designing ethics and change into ongoing projects, and,
iii. ensuring engagement is genuine collaboration and co-production.

The barriers do not need to be got around; solutions need to be built in by design.

***

Part 1. Digital revolution by design: building for change and people
Part 3. Digital revolution by design: building infrastructures

NIB roadmaps: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/384650/NIB_Report.pdf

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about the NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told on the one hand by government that it is necessary to share all our individual-level health data, from all sorts of sources.

And that bulk data collection is vital in the public interest to find surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government  and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups, personalising how they get treated. A recent objection was to marketing misuse by charities.

There is little broad understanding yet of the power of insight that organisations can now have to track and profile due to the power of algorithms and processing capability.

Technology progress that has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous grassroots projects discussed over the two days: bringing the elderly online and connected, and work in housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communication with patients.

The NIB meeting didn’t have real public interaction, or any discussion of those projects ‘in the room’, in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience, it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to ask to understand the intended change and outcome much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

Struggling to get the current design together for now, it may be hard to invite public feedback on the future.

But it’s only made hard if what the public wants is ignored.  If those issues were resolved in the way the public asked for at listening events it could be quite simple to solve.

The NIB leadership clearly felt nervous to have debate, giving only 10 minutes of three hours for public involvement, yet that is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need public debate, and to be dissected by the clinical professions to see if they fit the current and future model of healthcare, because if people are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

“As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating them has been forgotten. If the Internet serves things, it serves consumerism. AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer methods of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and their benefits hard to communicate, I’d say that if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how these projects that want their personal data work, or b) put up with being kept ignorant.

We should be able to question fully why it is needed and get a transparent and complete explanation. We should have fully accountable business plans and public scrutiny of tangible and intangible benefits, before projects launch on the basis of public buy-in which may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am  not sure many would be able to put succinctly: what is the NHS digital forward view really? How is it to be funded?

On the one hand new plans are to bring salvation, while the other stops funding what works already today.

Although the volume of activity planned is vast, what it boils down to, is what is visionary and achievable, and not just a vision.

Digital revolution by design: building for change and people

We have opportunity to build well now, avoiding barriers-by-design, pro-actively designing ethics and change into projects, and to ensure it is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own development bubble as plans are shaped within ever-changing infrastructures in which data, digital, AI and ethics will need to be discussed together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The charter as part of the French Revolution set out a clear, understandable, ethical and fair framework of law in which they trusted their rights would be respected by fellow citizens.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web