Destination smart-cities: design, desire and democracy (Part three)

Smart Technology we have now: A UK Case Study

In places today, where climate surveillance sensors are used to predict and decide which smog-days cars should be banned from cities, automatic number-plate recognition (ANPR) can identify cars driving on the wrong days and send automatic penalties.

Similarly, ANPR technology is used in UK tunnels and congestion-charging systems. One British company encouraging the installation of ANPR in India is also the provider of a significant part of Britain's public administrative data and surveillance software across a range of sectors.

About themselves that company says:

“Northgate Public Services has a unique experience of delivering ANPR software to all Home Office police forces. We developed and managed the NADC, the mission critical solution providing continuous surveillance of the UK’s road network.  The NADC is integrated with other databases, including the Police National Computer, and supports more than 30 million reads a day across the country.”

30 million snapshots from ‘continuous surveillance of the UK’s road network‘. That surprised me. That’s half the population of England, not all of whom drive. 30 million every day. It’s massive, unreasonable, and risks backlash.

Northgate Public Services’ clients also include 80% of UK water companies, as well as many other energy and utility suppliers.

And in the social housing market they stretch to debt collection, or ‘income management’.

So who, I wondered, is this company that owns all this data-driven access to our homes, our roads, our utilities, life insurance, hospital records and registries, and half of all UK calls to emergency services?

Northgate Information Solutions announced the sale of its Public Services division to the private equity firm Cinven in December 2014. Cinven also owns a 62% shareholding in the UK private healthcare provider Spire, with all sorts of influence given its active share of services and markets.

Not only does this private equity firm hold this vast range of data systems across a wide range of sectors, but it is making decisions about how our public policies and money are driven.

Using health screening data they’re even making decisions that affect our future, our behaviour and our private lives: “software provides the information and tools that housing officers need to proactively support residents, such as sending emails, letters or rent reminders by SMS and freeing up time for face-to-face support.”

Of their ANPR systems, Northgate says the data should be even more widely used “to turn CONNECT: ANPR into a critical source of intelligence for proactive policing.”

If the company were to start to ‘proactively’ use all the data it owns across the sectors we should be asking, is ‘smart’ sensible and safe?

Where is the boundary between proactive and predictive? Or public and private?

Where do companies draw the line between public and personal space?

The public services provided by the company seem to encroach into our private lives in many ways. In Northgate’s own words, “It’s also deeply personal.”

Who’s driving decision making is clear. The source of their decision making is data. And it’s data about us.

Today, whether data are collected proactively, as with ANPR, or handed over with consent for a direct administrative purpose, private companies are already the guardians of massive amounts of our personal and public data.

What is shocking to me is how data collected in one area of public services are also used for entirely different secondary purposes, without informed consent or even notification: in schools, for example.

If we don’t know which companies manage our data, how can we trust that it is looked after well and that we are told if things go wrong?

Steps must be taken in administrative personal data security, transparency and public engagement to shore up public trust, the foundation for any future data sharing as part of the critical infrastructure of any future strategy, public or commercial. That strategy must include more than minimal transparency about the processing of our data, and real public involvement, if ‘digital citizenship’ is to be meaningful.

How would our understanding of data improve if anyone using personal data were required to publish clear public statements about their collection, use and analysis of data? If the principles of data protection were actually upheld, in particular that individuals should be informed? How would our understanding improve, especially as regards automated decision-making and monitoring technology? Not ninety-page privacy policies. Plain English. If you need ninety pages, you’re doing too much with my data.

Independent privacy impact assessments should be mandatory and published before data are collected or shared with any party other than the one to which they were given for a specific purpose. Extensions broadening that purpose should require consultation and consent. And if the collection happens on a street, then make it public, in plain sight.

Above all, planning committees in local government, in policy making and in practical application, need to think of data, and its ethical implications, in every public decision they make. We need more robust decision-making in the face of corporate data grabs: to keep data collected in public space safe, and to keep some of it private.

How much less fun is a summer’s picnic spent smooching, if you feel watched? How much more anxious will we make our children if they’re not allowed to ever have their own time to themselves, and every word they type in a school computer is monitored?

How much individual creativity and innovation does that stifle? We are effectively censoring children before they have written a word.

Large corporations have played historically significant and often shadowy roles in surveillance that retrospectively were seen as unethical.

We should consider, sooner rather than later, whether corporations such as BAE Systems, Siemens and the IMSs of the world act in ways worthy of our trust, given such massive reach into our lives with so little transparency and oversight.

“Big data is big opportunity but Government should tackle misuse”

The Select Committee warned in its recent report on Big Data that distrust arising from concerns about privacy and security is often well-founded and must be resolved by industry and Government.

If ‘digital’ means smart technology is in future used in “every part of government”, as announced at #Sprint16, what will be the effects of the involvement and influence of these massive corporations on democracy itself?

******

I thought about this in more depth in Part one here, in “Smart systems and Public Services” here (Part two), and continue after this by looking at “The Best Use of Data” used in predictions and the Future (Part four).

Destination smart-cities: design, desire and democracy (Part one)

When I drop my children at school in the morning I usually tell them three things: “Be kind. Have fun. Make good choices.”

I’ve been thinking recently about what a positive and sustainable future for them might look like. What will England be in 10 years?

The #Sprint16 snippets I read talk about how: “Digital is changing how we deliver every part of government,” and “harnessing the best of digital and technology, and the best use of data to improve public services right across the board.”

From that three things jumped out at me:

  • The first is that the “best use of data” in government’s opinion may conflict with that of the citizen.
  • The second, is how to define “public services” right across the board in a world in which boundaries between private and public in the provision of services have become increasingly blurred.
  • And the third is the power of tech to offer both opportunity and risk if used in “every part of government”, and its effects on access to, involvement in, and the long-term future of, democracy.

What’s the story so far?

In my experience so far of trying to be a digital citizen “across the board” I’ve seen a few systems come and go. I still have my little floppy paper Government Gateway card, navy blue with yellow and white stripes. I suspect it is obsolete. I was a registered Healthspace user, and used it twice. It too, obsolete. I tested my GP online service. It was a mixed experience.

These user experiences are shaping how I interact with new platforms and my expectations of organisations, and I will be interested to see what the next iteration, nhs alpha, offers.

How platforms and organisations interact with me, and my data, is however increasingly assumed without consent. This involves new data collection, not only using data from administrative or commercial settings to which I have agreed, but new scooping of personal data all around us in “smart city” applications.

Simply having these digital applications will be of no benefit, while all the disadvantages of surveillance for its own sake will be realised.

So how do we know that all these data collected are used – and by whom? How do we ensure that all the tracking actually gets turned into knowledge about pedestrian and traffic workflow to make streets and roads safer and smoother in their operation, to make street lighting more efficient, or the environment better to breathe in and enjoy? And that we don’t just gift private providers tonnes of valuable data which they simply pass on to others for profit?

Because without making things better, this Internet-of-Things will be a one-way ticket to power in the hands of providers, and to loss of control and quality of life. We’ll work around it: buying a separate SIM card for trips into London, avoiding certain parks or bridges, managing our FitBits to the nth degree under a pseudonym. But being left no choice but to opt out of places, or of the latest technology, is also tedious. If we want to buy a smart TV to access films on demand, but don’t want it to pass surveillance or tracking information back to the company, how can we find out with ease which products offer that choice?

Companies have taken private information that is none of their business, and quite literally, made it their business.

The consumer technology hijack of “smart” to always mean marketing surveillance creates a divide between those who will comply for convenience and pay the price in their privacy, and those who prize privacy highly enough to take steps that are less convenient, but less compromised.

But even wanting the latter, it can be so hard to find out how to do so that people feel powerless and give in to the easy option on offer.

The system of governance and oversight we have today, for how our personal data are processed by providers of public and private services, in both public and private space, is insufficient to meet the values most people reasonably expect: to be able to live their lives without interference.

We’re busy playing catch up with managing processing and use, when many people would like to be able to control collection.

The Best use of Data: Today

My experience of how the government wants to ‘best use data’ is that, until 2013, I assumed the State handled it responsibly.

I feel bitterly let down.

care.data taught me that the State thinks my personal data and privacy are something to exploit, and that “the best use of my data” for them may be quite at odds with what individuals expect. My trust in government’s use of my health data has been low ever since. Saying one thing and doing another isn’t making it more trustworthy.

I found out in 2014 how my children’s personal data are commercially exploited and given to third parties, including the press, outside safe settings, by the Department for Education. Now my trust is at rock bottom. I tried to take a look at what the National Pupil Database stores on my own children and was refused a subject access request; meanwhile the commercial sector and the Fleet Street press are given not only identifiable data, but ‘highly sensitive’ data. This seems plain wrong in terms of security, transparency and respect for the person.

The attitude that there is an entitlement of the State to individuals’ personal data has to go.

The State has pinched 20 million children’s privacy without asking. Tut tut indeed. [See Very British Problems for a translation.]

And while I support the use of public administrative data in de-identified form in safe settings, it does not follow that anything goes. The feeling of entitlement to access our personal data for purposes other than those for which we consented is growing, as it stretches to commercial sector data. However, suggesting that public feeling, measured through work with 0.0001% of the population, shows “wide public support for the use and re-use of private sector data for social research” seems tenuous.

Even so, comments within that tiny population suggested that “many participants were taken by surprise at the extent and size of data collection by the private sector” and some “felt that such data capture was frequently unwarranted.” “The principal concerns about the private sector stem from the sheer volume of data collected with and without consent from individuals and the profits being made from linking data and selling data sets.”

The Best use of Data: The Future

Young people, despite seniors often saying “they don’t care about privacy”, are leaving social media in search of greater privacy.

These things cannot be ignored if the call for digital transformation between the State and the citizen is genuine, because done to us it will fail. Change must be done with us. And ethically.

And not “ethics” as in ‘how to’, but the ethics of ‘should we’. Qualified, transparent evaluation as done in other research areas: not an add-on, but integral to every project, looking at issues such as:

  • whether participation is voluntary, opt-out or covert
  • how participants can get and give informed consent
  • accessibility to information about the collection and its use
  • small numbers, particularly of vulnerable people included
  • identifiable data collection or disclosure
  • arrangements for dealing with disclosures of harm and recourse
  • and how the population that will bear the risks of participating in the research is likely, or not, to benefit from the knowledge derived from it.

Ethics is not about getting away with using personal data in ways that won’t get caught or hauled over the coals by civil society.

It’s balancing risk and benefit in the public interest, and not always favouring the majority, but doing what is right and fair.

We hear a lot at the moment on how the government sees lives shaped by digital skills, but too little of their vision for what living will look and feel like in the smart cities of the future.

My starting question is: how does government hope society will live there, and is it up to them to design it? If not, who is? Because these smart-city systems are not designing themselves. You’ve heard of Stepford wives. I wonder what we do if we do not want to live like Milton Keynes man?

I hope that the world my children inherit will be more just, more inclusive, kinder than it is today, and with a more sustainable climate to support food and livelihoods. Will ‘smart’ help or hinder?

What is rarely discussed in technology debates is how the service should look regardless of technology. The technology, assumed inevitable, becomes the centre of service delivery.

I’d like first to understand the central and local government vision of “public services” provision for people of the future. What does it mean for everyday services like schools and health, and how does it balance security and our freedoms?

Because without thinking about how and who provides those services for people, there is a hole in the discussion of “the best use of data” and their improvement “right across the board”.

The UK government has big plans for big data sharing: sharing across all public bodies, some of it tailored for individual interventions.

While there are interesting opportunities for public benefit from at-scale systems, the public benefit is at risk not only from lack of trust in how systems gather and use data, but also from the risk that interoperability of services, and the freedom for citizens to change provider, get lost in market competition.

Openness and transparency can be absent in public-private partnerships until things go wrong. Given the scale of smart-cities, we must have more than hope that data management and security will not be one of those things.

How will we know if new plans are designed well, or not?

When I look at my children’s future and how our current government digital decision making may affect it, I wonder if their future will be more or less kind. More or less fun.

Will they be left with the autonomy to make good choices of their own?

The hassle we feel when we are watched all the time, by everything that we own, in every place we go, having to check that every checkbox has a reasonable privacy setting, has a cumulative cost in our time and anxieties.

Smart technology has invaded not only our public space and our private space, but has nudged into our head space.

I for one have had enough already. For my kids I want better. Technology should mean progress for people, not tyranny.

Living in smart cities, connected in the Internet-of-Things, run on their collective Big Data and paid for by commercial corporate providers, threatens not only their private lives and well-being, their individual and independent lives, but ultimately independent and democratic government as we know it.

*****

This is the start of a four-part set of thoughts: beginnings with smart technology and data, triggered by the Sprint16 session (Part one). I think about this in more depth in “Smart systems and Public Services” (Part two) here, then the design and development of smart technology making “The Best Use of Data”, looking at today in a UK company case study (Part three), before thoughts on “The Best Use of Data” used in predictions and the Future (Part four).

Breaking up is hard to do. Restructuring education in England.

This Valentine’s I was thinking about the restructuring of education in England and its wide ranging effects. It’s all about the break up.

The US EdTech market is very keen to break into the UK, and our front door is open.

We have adopted the model of Teach First, partnered with Teach for America, while some worry we do not ask “What is education for?”

Now we hear the next chair of Ofsted is to be sought from the US, someone renowned as “the scourge of the unions.”

Should we wonder how long until the management of schools themselves is US-sourced?

The education system in England has been broken up in recent years into manageable parcels – for private organisations, schools within schools, charity arms of commercial companies, and multi-school chains to take over. In effect, recent governments have made reforms that have dismantled state education as I knew it.

Just as the future vision of education outlined in 2005’s Direct Democracy, co-authored by Michael Gove, said: “The first thing to do is to make existing state schools genuinely independent of the state.”

Free schools, touted as giving parents the ultimate in choice, are in effect another way to nod approval to the outsourcing of the state into private hands, and into big chains. Despite seeing the model fail spectacularly abroad, the government seems set on the same here.

Academies, the route that finagles private corporations into running public education, are the preferred model, says Mr Cameron. While there are no plans to force schools to become academies, the legislation currently in ping-pong under the theme of ‘coasting schools’ enables just that. The Secretary of State can impose academisation, albeit only on schools Ofsted labels ‘failing’.

What fails sometimes appears to be a school that staff and parents cannot understand as anything less than good, but small. While small can be what parents want, small pupil-teacher ratios mean higher costs per pupil. But the direction of growth is towards ‘big is better’.

“There are now 87 primary schools with more than 800 pupils, up from 77 in 2014 and 58 in 2013. The number of infants in classes above the limit of 30 pupils has increased again – with 100,800 pupils in these over-sized classes, an increase of 8% compared with 2014.” [BBC]

All this restructuring creates costs about which the Department wants to be less than transparent, and of which it has lost track.

If only we could see that these new structures raised standards. But “while some chains have clearly raised attainment, others achieve worse outcomes creating huge disparities within the academy sector.”

If not delivering better results for children, then what is the goal?

A Valentine’s view of Public Service Delivery: the Big Break up

Breaking up the State system, once perhaps unthinkable, is possible through the creation of ‘acceptable’ public-private partnerships (as opposed to outright privatisation per se). Schools become academies through a range of providers and different pathways, at least to start with; as some fail, the most successful become the market leaders in an oligopoly. Ultimately perhaps, this could become a near monopoly. Delivering ‘better’. Perhaps a new model, a new beginning: a new provider offering salvation from the flood of ‘failing’ schools, coming to the State’s rescue.

In order to achieve this entry to the market by outsiders, you must first remove conditions seen as restrictive, giving more ‘freedom’ to providers: to cut corners and make efficiency savings on things like food standards, the required curriculum, and the numbers of staff, or their pay.

And what if, as a result, staff leave, or are hard to recruit?

Convincing people that “tech” and “digital” will deliver cash savings and teach the required skills through educational machine learning is key if staff costs are to be reduced, which in times of austerity, when all else has been cut, is the only budget left to slash.

Self-taught systems’ providers are convincing in their arguments that tech is the solution.

Sadly, I remember when a similar thing was tried on paper. My first year of GCSE maths, aged 13-14, was ‘taught’ at our secondary comp by working through booklets in a series that we self-selected from the workbench in the classroom, then picked up the master marking-copy once done. Many of the boys didn’t need long to work out that the first step was an unnecessary waste of time. The teacher had no role in the classroom. We were bored to bits. In the final week at the end of the year, they sellotaped the teacher to his chair.

I kid you not.

Teachers are so much more than knowledge-transfer tools, and yet some today seem to consider them replaceable by technology.

The US is ahead of us in this model, which has grown hand-in-hand with commercialism in schools. Many parents are unhappy.

So is the DfE setting us up for future heartbreak if it wants us to go down the US route of more MOOCs, more tech, and less funding and fewer staff? Where’s the cost benefit risk analysis and transparency?

We risk losing the best of what is human from the classroom if we remove the people who model and inspire its values. Unions, teachers and educationalists are, I am sure, more than aware of all these cumulative changes. The wider public, however, seems little engaged.

For anyone ‘in education’ these changes will all be self-evident, and their balance of risks and benefits a matter of experience and political persuasion. As a parent, I have only come to understand these changes through researching how our pupils’ personal and school data have been commercialised and given away from the National Pupil Database without our consent since legislation changed in 2013, and how Higher Education student and staff data have been sold.

Will more legislative change be needed to keep our private data accessible in public services operating in an increasingly privately-run delivery model? And who will oversee that?

The Education Market is sometimes referred to as ‘The Wild West’. Is it getting a sheriff?

The news that the next chair of Ofsted is to be sought from the US set alarm bells ringing for some in the press, who fear US standards and US-led organisations in British schools.

“The scourge of the unions” means not supportive of staff-based power, and in health our junior doctors have clocked exactly what breaking their ‘union’ bargaining power is all about. So who is driving all this change in education today?

Some education providers might be seen as individuals profiting from the State break-up. Some were accused of ‘questionable practices‘. Oversight has been lacking, others said. Margaret Hodge was reported in 2014 to have said: “It is just wrong to hand money to a company in which you have a financial interest if you are a trustee.”

I wonder if she has an opinion on a lead non-executive board member at the Department for Education also being the director of one of the biggest school chains? Or the ex-Minister now employed by the same chain? Or that his campaign was funded by the same director? Why this register of interests is not transparent is a wonder.

It could appear to an outsider that the private-public revolving door is well oiled with sweetheart deals.

Are the reforms begun by Mr Gove simply to be executed to their end goal, whatever that may be, through Nicky Morgan, or is she driving her own new policies?

If Ofsted were to become US-experience-led, will the Wild West be tamed, or will US providers be invited to join the action, reshaping a new frontier? What is the end game?

Breaking up is not hard to do, but in whose best interest is it?

We need only look to health to see a similar pattern.

The structures are freed up, and boundaries opened up (if you make the other criteria) in the name of ‘choice’. The organisational barriers to break up are removed in the name of ‘direct accountability’. And enabling plans through more ‘business intelligence’ gathered from data sharing, well, those plans abound.

Done well, new efficient systems and structures might bring public benefits, and the right technology can certainly bring great things. But have we first understood what made the old less efficient, if indeed it was, and where are the baselines to look back on?

Where is the transparency of the end goal and what’s the price the Department is prepared to pay in order to reach it?

Is reform in education transparent in its ideology, and in how its success is being measured, if not by improved attainment?

The results of change can also be damaging. In health we see failing systems and staff shortages, and their knock-on effects on patient care. In schools, these failures damage children’s start in life; it’s not just a ‘system’.

Can we assess if and how these reforms are changing the right things for the right reasons? Where is the transparency of what problems we are trying to solve, to assess what solutions work?

How is change impact for good and bad being measured, with what values embedded, with what oversight, and with whose best interests at its heart?

2005’s Direct Democracy could be read as a blueprint for co-author Mr Gove’s education reforms less than a decade later.

Debate over the restructuring of education and its marketisation seems to have bypassed most of us in the public, in a way health has not.

Underperformance as measured by new and often hard to discern criteria, means takeover at unprecedented pace.

And what does this mean for our most vulnerable children? SEN children are not required to be offered places by academies. The 2005 plans co-authored by Mr Gove also included: “killing the government’s inclusion policy stone dead,” without an alternative.

Is this the direction of travel our teachers and society supports?

What happens when breakups happen and relationship goals fail?

Who picks up the pieces? I fear the state is paying heavily for the break up deals, investing heavily in new relationships, and yet will pay again for failure. And so will our teaching staff, and children.

While Mr Hunt is taking all the heat right now for his part in writing Direct Democracy and its proposals to privatise health, set against the current health reforms and the restructuring of junior doctors’ contracts, we should perhaps also look to his co-author Mr Gove, ask to better understand the current impact of his recent education reforms, compare them with what he proposed in 2005, and prepare for the expected outcomes of change before it happens (see p74).

One outcome was that failure was to be encouraged in this new system, with Sweden held up as an exemplary model:

“Liberating state schools would also allow the all-important freedom to fail.”

As Anita Kettunen, principal of JB Akersberga in Sweden, reportedly said when the free schools chain funded by a private equity firm failed:

“if you’re going to have a system where you have a market, you have to be ready for this.”

Breaking up can be hard to do. Failure hurts. Are we ready for this?
******

Abbreviated on Feb 18th.

Thoughts since #UKHC15. UK health datasharing.

The world you will release your technology into, is the world you are familiar with, which is already of the past. Based on old data.

How can you design tools and systems fit for the future? And for all?

For my 100th post, and the first of 2016, here is a summary of some of my thoughts prompted by #UKHC15: several grains of thought related to UK health data that have been growing for some time.

1000 words on “Hard things: identity, data sharing and consent.” The fun run version.

Do we confuse hard with complex? Hard does not have to mean difficult. Some things seem to be harder than necessary, because of politics. I’ve found this hard to write. Where to start?

The search to capture solutions has been elusive.

The starting line: Identity

Then my first thoughts on identity got taken care of by Vinay Gupta in this post, better than I could. (If you want a long read about identity, you might want to get a hot drink like I did and read and re-read. It says it’ll take an hour. It took me several, in absorption and thinking time. And worth it.)

That leaves data sharing and consent. Both of which I have written many of my other 99 posts about in the last year. So what’s new?

Why are we doing this: why aren’t we there yet?

It still feels very much that, in many parts of the health service and in broader government thinking on ‘digital’, the attitude is that we need to do something. The why is missing, and therefore achieving and measuring success is hard.

Often we start with a good idea and set about finding a solution to achieve it. But if the ‘why’ behind the idea is shaky to start with, the solution may falter as soon as it gets difficult. No one seems to know what #paperless actually means in practice.

So why try to change things? Fixing problems, rather than coming up with good ideas, is another way to think of it, as was suggested at #ukhc15. It was a meet-up for people who want to make things better, usually for others; sometimes that involves improving the systems they work with directly, or support others in using.

I no longer work in systems introduction or enhancement processes, although I have a lay role in research and admin data. But as regular readers know, most of the last two years has been all about the data: care.data.

More often than not, in #ukhc15 discussions that focused on “the data”, I would try to bring people back to thinking about what the change is trying to solve, what it wants to “make better”, and why.

There’s a broad tendency simply to think more data = better. Not true, and I’ll show why in a later case study. We must question why.

Why doesn’t everyone volunteer, or want to join in?

Very many people who have spoken with me over the last two years have shared concrete concerns about the plans to share GP data, and they have not been heard. They did not see a need to share their identifiable personal confidential data, or see why truly anonymous data would not be sufficient for health planning, for example.

Homeless men and women at risk, people from the travelling community, those with disabilities, patients with stigmatising conditions, minorities, children, questions of sexual orientation – not to mention the lawyers or agencies representing them. Or the 11 million of our adult population not online. Few of whom we spoke about. Few of whom we heard from at #ukhc15. Yet put together, these individuals not only make up a significant number of people, but account for a disproportionately high share of the highest demands on our health and social care services.

The inverse care law appears magnified in its potential when applied to digital, and should magnify the importance of thinking about access. How will care.data make things better for them, and how will the risks be mitigated? And are those costs being properly assessed if there is no assessment of the current care.data business case and seemingly, since 2012 at least, no serious effort to look at alternatives?

The finish line? We can’t see what it looks like yet.

The #ukhc15 event was well run, and I liked the spontaneity of people braver than me who were keen to lead sessions and did it well. As someone who is white, living in a ‘nice’ area, I am privileged. It was a privilege to spend a day at #ukhc15, packed with people who clearly think about hard things all the time. People who want to make things better. People who were welcoming to nervous first-timers at an ‘un’conference over a shared lunch.

I hope the voices of those who can’t attend these events, and those outside London, are equally accounted for in all government 2016 datasharing plans.

This may be the last chance after years of similar consultations have failed to deliver workable, consensual public data sharing policies.

We have vast streams of population-wide data stored in the UK, about which the population is largely ignorant. And while some of the data may be from 25 years ago, whatever is designed today needs to think long term: not how do we solve what we know, but how do we design solutions that will work for what we don’t.

Transparency here will be paramount to trust, if future decisions are made for us, or those we make for ourselves are ‘influenced’, by algorithms, machine learning and ‘mindspace’ work.

As Thurgood Marshall said,

“Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds.”

Control over who we are, and who the system thinks we are, becomes a whole new level of discussion if we are being told how to make a decision, especially where the decision pushes in a direction of public policy based on political choice. If pensions are not being properly funded, not allocating taxes differently to fund them is a choice the current government has made, while the DWP seeks to influence our decisions, to make us save more into private pensions.

And how about, in data discussions, we make an effort to start talking a little more clearly in the same terms, and stop packaging ‘sharing’ as if it were something voluntary in population-wide compulsory policy.

It’s done to us, not with us, in far too many areas of government we do not see. Perhaps this consultation might change that, but it is the nth in a line of consultations and I want to be convinced that this one intends real change. It’s only open for a few weeks, and this meet-up for discussion appeared to be organised only in London.

I hope we’ll hear commitment to real change in support of people and the uses of our personal data by the state in the new #UkDigiStrategy, not simply more blue-sky thinking and drinking the ‘datasharing’ kool-aid. We’ve been talking in the UK for far too long about getting this right.

Let’s see the government serious about making it happen. Not for government, but in the public interest, in a respectful and ethical partnership with people, and not find changes forced upon us.

No other foundation will be fit for a future in which care.data, the phenotype data, is to be the basis for an NHS so totally personalised.

If you want a longer read, read on below for my ten things in detail.

Comment welcome.

########

Hard things: The marathon version, below.

Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

Here’s my opinion after taking part in the NIB #health2020 Bristol event 24/7/2015 and presentation of plans at the June King’s Fund hosted event. Data sharing includes plans for extraction and uses of primary care data by third parties, charging ahead under the care.data banner.

Wearing my hat from a previous role in change management and communications, I share my thoughts in the hope the current approach can adapt and benefit from outside perspectives.

The aim of “Rebuilding and sustaining Public trust” [1] needs to be refocused to treat the cause, not only the symptoms, of the damage done in 2014. Here’s why:

A Seven Step Top Line Summary

1. Abstract ‘public trust’ is not vital to the future of data sharing. Being demonstrably worthy of public trust is.

2. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is.

3. A timed target to ‘get the public’s data’, is not what is needed. Having a stable, long term future-proofed and governable model is.

4. Tech solutions do not create trust. What does is enabling a positive human response to what the organisation asks of people: their confident ‘yes’ to data-sharing. [It might be supported by technology-based tools.]

5. Communications that tell the public ‘we know best, trust us’ fail. While professional bodies [the BMA [2], the GPES advisory group, the APPG report calling for a public benefits plan, the ICO, and expert advice such as Caldicott] are ignored or remain to be acted upon, it is hard for the public to see how the programme’s needs, motives and methods are trustworthy. The [Caldicott 2] Review Panel found that “commissioners do not need dispensation from confidentiality, human rights & data protection law.” [3] Something’s gotta give. What will it be?

6. care.data consistency. Relationships must be reliable and have integrity.
“Trust us – see the benefits” [But we won’t share the business cost/benefit plan.]
“Trust us – we’re transparent” [But there is nothing published in 2015 at all from the programme board minutes] [4]
“Trust us – we’ll only use your data wisely, with the patient in control” [Ignore that we didn’t before [5] and that we still share your data for secondary uses even if you opted out [6] and no, we can’t tell you when it will be fixed…]

7. Voices do not exist in a vacuum. Being trustworthy on care.data  does not stand alone but is part of the NHS ‘big picture’.
Department of Health to GPs: “Trust us about data sharing.” [And ignore that we haven’t respected your judgement or opinions on many issues.]
NHS England to GPs: “Trust us about data sharing.” [And ignore our lack of general GP support: MPIG withdrawal, misrepresentation in CQC reports.]
NHS England and Department of Health to professionals and public: “The NHS is safe in our hands.”
Everyone: “We see no evidence that plans for cost savings, 7 day working, closures and the 5YFV integration will bring the promised benefits. Let us ‘see the holes’, so that we can trust you based on evidence.”

See the differences?

Target the cause not Symptom:

The focus in the first half, the language used by NHS England/NIB/ DH, sets out their expectations of the public. “You must trust us and how you give us your data.”

The focus should instead be on the second half: a shift to the organisation, NHS England/NIB/DH, setting out expectations from the public point of view. “Enable the public to trust the organisation. Enable individual citizens to trust what is said by individual leaders.” This will enable citizens to be consensual sharers in the activity the organisation imposes: the demand for care.data through a statutory gateway, obliging GPs to disclose patient data.

The fact that trust is broken, and specifically on data-sharing that there is a deficit [A] between how much the public trusts the organisation and how the organisation handles data, is not the fault of the public, or “1.4M NHS staff”, or the media, or patient groups’ pressure. It’s based on proven experience.

It’s based on how organisations have handled data in the past. [5] Specifically on the decisions made by DH, and the Information Centre and leaders in between. Those who chose to sell patient data without asking the public.

The fact that trust is broken is based on how leadership individuals in those organisations have responded to that. Often taking no responsibility for loss.

No matter how often we hear “commissioners will get a better joined up picture of care needs and benefit you”, it does not compensate for past failings.

Only demonstrable actions to show why it will not happen in future can start that healing process.

Target the timing to the solution, not a shipping deadline

“Building trust to enable data sharing” aims at quick fixes, when what is needed is a healing process and ongoing relationship maintenance.

Timing has to be tailored to what needs to be done, not to an ‘artificial deadline’. Despite that being said, it doesn’t seem to match reality.

Addressing the Symptoms and not the Cause will not find a Cure

What needs to be done?

Lack of public trust and the data trust deficit [A] are symptoms in the public to be understood. But it is the causes in the organisations that must be treated.

So far, many NHS England staff I have met in relation to care.data appear to have a “them and us” mentality. It’s almost tangible, wrapped up in the language used at these meetings or in defensive derision of public concerns: “tin foil hat wearers”, “Luddites” [7] and my personal favourite, ‘consent fetishists.’ [8] It’s counterproductive and seems born of either a lack of understanding, or frustration.

The NIB/DH/NHS England/P&I Directorate must accept that they cannot force a consensual change in beliefs the public holds, based on emotion and past experience.

Those people each have different starting points of knowledge and beliefs.  As one attendee said, “There is no single patient replicated 60 million times.”

The NIB/DH/NHS England/ P&I Directorate can only change what they themselves can control. They have to model and be seen to model change that is trustworthy.

How can an organisation demonstrate it is trustworthy?

This means shifting the focus of responsibility for change from the public and professionals, to the leadership organisation.

There is a start in this work stream, but there is little new that is concrete.

The National Data Guardian (NDG) role has been going to be put on a legal footing “at the earliest opportunity” since November 2014. [9] Nine months.

Updated information governance guidance is on the way.

Then there are two really strong new items that would underpin public trust, to be planned in a ‘roadmap’: the first, a system that can record and share consent decisions; the second, to provide information on the use to which an individual’s data has been put.

How and when those two keystones of public trust will actually be offered appears unknown. They would encourage public trust by enabling choice and control of our data. So I would ask: if we’re not there yet on the roadmap, how can consent options be explained to the public in care.data communications, if there is as yet no mechanism to record and effect them? More on that later.

Secondly, when will a usage report be available? That will be the proof to demonstrate that what was offered, was honoured. It is one of the few tools the organisation(s) can offer to demonstrate they are trustworthy: you said, we did. So again, why jeopardise public trust by rolling out data extractions into the existing, less trustworthy environment?
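Those two keystones could be modelled minimally in software. The sketch below is purely illustrative (all class and field names are my own assumptions, not any NHS or HSCIC design): a register that records consent decisions, a release gate that checks them, and a per-person usage report that shows ‘you said, we did’.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """One person's recorded choice (hypothetical fields)."""
    patient_id: str
    allow_secondary_use: bool = False  # direct care is always permitted
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class UsageEvent:
    """One release of one person's data: the 'we did' half."""
    patient_id: str
    purpose: str    # e.g. "direct_care", "research", "commissioning"
    recipient: str

class ConsentLedger:
    def __init__(self) -> None:
        self.consents: dict[str, ConsentRecord] = {}
        self.usage_log: list[UsageEvent] = []

    def record_consent(self, record: ConsentRecord) -> None:
        self.consents[record.patient_id] = record

    def release(self, patient_id: str, purpose: str, recipient: str) -> bool:
        """Release data only if the recorded decision allows it; log every release."""
        record = self.consents.get(patient_id)
        if purpose != "direct_care" and not (record and record.allow_secondary_use):
            return False
        self.usage_log.append(UsageEvent(patient_id, purpose, recipient))
        return True

    def usage_report(self, patient_id: str) -> list[UsageEvent]:
        """The 'you said, we did' report: every recorded use of this person's data."""
        return [e for e in self.usage_log if e.patient_id == patient_id]
```

The point of the sketch is the ordering: the register and the report have to exist before extraction starts, or releases happen that no report can later account for.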

How well this is done will determine whether it can realise its hoped-for benefits. How the driving leadership influences that outcome will depend on the organisational approach to opt out, communicating care.data content decisions, the way and the channels in which they are communicated, accepting what has not worked to date, and planning long-term approaches to communicating change before you start the pathfinders. [Detailed steps on this follow.]

Considering how important we have been told the programme is, it’s vital to get right. [10]

I believe that changing the approach, from explaining benefits and focusing on public trust, to demonstrating changes that make the organisation worthy of trust, will make all the difference.

So before rolling out the next data sharing steps, think hard about what the possible benefits and risks will be, versus waiting for a better environment to do it in.

Conclusion: Trust is not about the public. Public trust is about the organisation being trustworthy. Over to you, orgs.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

This is Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

Part two: a New Approach is needed to understanding Public Trust For those interested in a detailed approach on Trust. What Practical and Policy steps influence trust. On Research and Commissioning. Trust is not homogeneous. Trust  is nuanced even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from. What behaviours influence trust. Fixing what has already been communicated is vital before new communications get rolled out. Vital to content of your communications and vital for public trust and credibility.

Part four: ‘Communicate the Benefits’ won’t work – How communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and fixes.

Part five: Future solutions – why a new approach may work better for future trust. Not to attempt to rebuild trust where there is now none, but to strengthen what is already trusted and fix today’s flawed behaviours; honesty and reliability are vital to future-proofing trust.

####

Background References:

I’m passionate about people using technology to make their jobs and lives better, simpler, and about living well. So much so, that this became over 5000 words. To solve that, I’ve assumed a baseline knowledge and I will follow up with separate posts on why a new approach is needed to understanding “Public Trust”, to “Communicating the benefits” and “Being trustworthy and other future solutions”.

If this is all new, welcome, and I suggest you look over some of the past 18 months’ posts, which include public voice captured from eight care.data events in 2014. care.data is about data sharing for secondary purposes, not direct care.

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] BMA LMC Vote 2014 http://bma.org.uk/news-views-analysis/news/2014/june/patients-medical-data-sacrosanct-declares–bma

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

Polls of public feeling:

[A] Royal Statistical Society Data Trust Deficit http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[B] Dialogue on data – work carried out through the ADRN


Digital revolution by design: infrastructures and the fruits of knowledge

Since the beginning of time and the story of the Garden of Eden, man has found a way to share knowledge and its power.

Modern digital tools have become the everyday way to access knowledge for many across the world, giving quick access to information and sharing power more fairly.

In this third part of my thoughts on digital revolution by design, triggered by the #kfdigi15 event on June 16-17, I’ve been considering some of the constructs we have built; those we accept and those that could be changed, given the chance, to build a better digital future.

Not only the physical constructions, the often networked infrastructures, but intangible infrastructures of principles and power, co-dependencies around a physical system; the legal and ethical infrastructures of ownership, governance and accountability.

Our personal data flow in systems behind the screens, at the end of our fingertips. Controlled in frameworks designed by providers and manufacturers, government and commercial agencies.

Increasingly in digital discussions we hear that the data subject, the citizen, will control their own data.

But if it is on the terms and conditions set by others, how much control is real and how much is the talk of a consenting citizen only a fig leaf behind which any real control is still held by the developer or organisation providing the service?

Data are used, turned into knowledge, as business intelligence that adds value to aid informed decision-making. By human or machine.

How much knowledge is too much knowledge for the Internet of Things to build about its users? As Chris Matyszczyk wrote:

“We have all agreed to this. We click on ‘I agree’ with no thought of consequences, only of our convenience.”

Is not knowing what we have agreed to our fault, or the responsibility of the provider who’d rather we didn’t know?

Citizens’ rights are undermined in unethical interactions if we are exploited by easy one-click access and exchange our wealth of data at unseen cost. Can it be regulated to promote, not stifle innovation?

How can we get those rights back and how will ‘doing the right thing’ help shape and control the digital future we all want?

The infrastructures we live inside

As Andrew Chitty says in this HSJ article: “People live more mobile and transient lives and, as a result, expect more flexible, integrated, informed health services.”

To manage that, do we need to know how systems work, how sharing works, and trust the functionality of what we are not being told and don’t see behind the screens?

At the personal level, whether we sign up for the social network, use a platform for free email, or connect our home and ourselves in the Internet of Things, we each exchange our personal data with varying degrees of willingness. Yet there is often no alternative if we want to use the tool.

As more social and consensual ‘let the user decide’ models are introduced, we hear it’s all about the user in control, but the reality is that users still have to know what they sign up for.

In new models of platform identity sign on, and tools that track and mine our personal data to the nth degree that we share with the system, both the paternalistic models of the past and the new models of personal control and social sharing are merging.

Take a Fitbit as an example. It requires a named account and data sharing with the central app provider. You can choose whether or not to enable ‘social sharing’ with nominated friends whom you want to share your boasts or failures with. You can opt out of only that part.

I fear we are seeing the creation of a Leviathan-sized monster that will be impossible to control, and just as scary as today’s paternalistic data mis-management: some data held by the provider and invisibly shared with third parties beyond our control, some we share with friends, and some stored only on our device.
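The three tiers in that mix can be made concrete with a small sketch (hypothetical, not Fitbit’s actual API or data model): only the social tier sits under the user’s opt-out, while collection by the provider continues regardless of the setting.

```python
from enum import Enum

class Tier(Enum):
    DEVICE_ONLY = "device"   # stored only on the wearable
    PROVIDER = "provider"    # held centrally; may be shared with third parties invisibly
    SOCIAL = "social"        # nominated friends; the only tier with an opt-out

class WearableAccount:
    """Hypothetical model of a wearable's sharing settings."""
    def __init__(self, social_sharing_enabled: bool = True) -> None:
        self.social_sharing_enabled = social_sharing_enabled

    def visible_to(self, tier: Tier) -> bool:
        """Which tiers can see the data under the current settings."""
        if tier is Tier.SOCIAL:
            return self.social_sharing_enabled
        # Opting out of social sharing changes nothing at the other tiers.
        return True
```

The asymmetry is the point: the one switch offered to the user governs the least consequential tier.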

While data are shared with third parties without our active knowledge, loss of trust in what is done behind the settings threatens to derail consumer products, as well as commercial ventures at national scale, and with them the public interest.

Society has somehow seen privacy lost as the default setting. It has become something to have to demand and defend.

“If there is one persistent concern about personal technology that nearly everybody expresses, it is privacy. In eleven of the twelve countries surveyed, with India the only exception, respondents say that technology’s effect on privacy was mostly negative.” [Microsoft survey 2015, of  12,002 internet users]

There’s one part of that I disagree with. It’s not the effect of technology itself, but the designer or developers’ decision making that affects privacy. People have a choice how to design and regulate how privacy is affected, not technology.

Citizens have vastly differing knowledge bases of how data are used and how to best interact with technology. But if they are told they own it, then all the decision making framework should be theirs too.

Given the impression of control, the shock for consumers will be all the greater if a breach ever reveals where fitness wearable users slept and with whom, at what address, and for how long they were active. Could a divorce case demand it?

Fitbit users have already found their data used by police and in the courtroom – probably not what they expected when they signed up to a better health tool. Some may see benefits, while those excluded from accessing the tool may be harmed by default.

Some at org level still seem to find this hard to understand but it is simple:
No trust = no data = no knowledge for commercial, public or personal use and it will restrict the very innovation you want to drive.

Google Gmail users have to make 10+ clicks to restrict all ads and information sharing in their privacy and ad account settings. The default is ad tailoring and data mining. Many don’t even know it is possible to change the settings, and it’s not intuitive how to.

Firms need to consider their own reputational risk if users feel these policies are not explicit and are exploitation. Those caught ‘cheating’ users can get a very public slap on the wrist.

Let the data subjects rule, but on whose terms and conditions?

The question every citizen signing up to digital agreements should ask is: what is in the small print, and how will I know if it changes? Fair processing should offer data protection, but isn’t effective.

If you don’t have access to information, you make decisions based on a lack of information or misinformation. Decisions which may not be in your own best interest or that of others. Others can exploit that.

And realistically and fairly, organisations can’t expect citizens to read pages and pages of Ts&Cs. In addition, we don’t know what we don’t know. Information that is missing can be as vital to understand as that provided. ‘Third parties’ sharing – who exactly does that mean?

The concept of an informed citizenry is crucial to informed decision making but it must be within a framework of reasonable expectation.

How do we grow the fruits of knowledge in a digital future?

Real cash investment is needed now for a well-designed digital future, robust for cybersecurity, supporting enforceable governance and oversight. Collaboration on standards and thorough change plans. I’m sure there is much more, but this is a start.

Figurative investment is needed in educating citizens about the technology that should serve us, not imprison us in constructs we do not understand but cannot live without.

We must avoid the chaos and harm and wasted opportunity of designing massive state-run programmes in which people do not want to participate or cannot participate due to barriers of access to tools. Avoid a Babel of digital blasphemy in which the only wise solution might be to knock it down and start again.

Our legislators and regulators must take up their roles to get data use, and digital contract terms and conditions right for citizens, with simplicity and oversight. In doing so they will enable better protection against risks for commercial and non-profit orgs, while putting data subjects first.

To achieve greatness in a digital future we need: ‘people speaking the same language, then nothing they plan to do will be impossible for them’.

Ethics. It’s more than just a county east of London.

Let’s challenge decision makers to plant the best of what is human at the heart of the technology revolution: doing the right thing.

And from data, we will see the fruits of knowledge flourish.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

Reputational risk. Is NHS England playing a game of public confidence?

“By when will NHS England commit to respect the 700,000 objections  [1] to secondary data sharing already logged* but not enacted?” [gathered from objections to secondary uses in the care.data rollout, Feb 2014*]

Until then, can organisations continue to use health data held by HSCIC for secondary purposes, ethically and legally, or are they placing themselves at reputational risk?

If HSCIC continues to share, what harm may it do to public confidence in data sharing in the NHS?

I should have asked this explicitly at the National Information Board (NIB) June 17th board meeting [2], which rode in for the last three hours of the two-day Digital Health and Care Congress at the King’s Fund.

But I chose to mention it only in passing, since I assumed it is already being worked on and a public communication will follow very soon. I had lots of other constructive things I wanted to hear in the time planned for ‘public discussion’.

Since then it’s been niggling at me that I should have asked more directly. It dawned on me, watching the meeting recording and, more importantly, reading the NIB papers [3], that it’s not otherwise mentioned. And there was no group discussion anyway.

Mark Davies, Director at the UK Department of Health, talked in fairly jargon-free language about transparency. [01:00] I could have asked him when we will see more of it in practice.

Importantly, he said on building and sustaining public trust: “if we do not secure public trust in the way that we collect, store and use their personal confidential data, then pretty much everything we do today will not be a success.”

So why does the talk of securing trust seem at odds with the reality?

Evidence of Public Voice on Opt Out

Is the lack of action based on uncertainty over what to do?

Mark Davies also said “we have only a sense” and we don’t have “a really solid evidence base” of what the public want. He said, “people feel slightly uncomfortable about data being used for commercial gain.” Which he felt was “awkward” as commercial companies included pharma working for public good.

If he has not done so already, though I am sure he will have, he could read NHS England’s own care.data listening feedback. People were strongly against commercial exploitation of data. Many were livid about its use. [see other care.data events] Not ‘slightly uncomfortable.’ And they were able to make a clear distinction between uses by commercial companies they felt were in the public interest, such as bona fide pharma research, and consumer market research, even if by the same company. Risk stratification and commissioning do not need, and should not have according to the Caldicott Review [8], fully identifiable individual-level data sharing.

Uses are actually not so hard to differentiate. In fact, it’s exactly what people want: the choice to have their data used only for direct care, or to permit sharing between different users, say for bona fide research. Or, at minimum, to be able to exclude commercially exploitative uses and re-use. Enabling this would enable more data sharing with confidence.

I’d also suggest there is a significant evidence base gathered in the data trust deficit work from the Royal Statistical Society, a poll on privacy for the Joseph Rowntree Foundation, and work done for the ADRN/ESRC. I’m sure he and the NIB are aware of these projects, and Mark Davies said himself more is currently being done with the Nuffield Trust.

Work with almost 3,000 young people for the Royal Academy of Engineering confirmed what those interested in privacy know, but which is the opposite of what is often said about young people and privacy: they care and want control:

[slide: young people’s views on privacy and control]

NHS England has itself further said it has held ‘over 180’ listening events in 2014 and feedback was consistent with public letters to papers, radio phone-ins and news reports in spring 2014.

Don’t give raw data out; exclude access for commercial companies not working in the public interest; exclude non-bona fide research use and re-use licences; define the future purposes; improve legal protection, including the opt out; and provide transparency to build trust.

How much more evidence does anyone need to have of public understanding and feeling, or is it simply that NHS England and the DH don’t like the answers given? Listening does not equal heard.

Here are some of NHS England’s own slides [4]; points included a common demand from the public to give the opt out legal status:

[slide: public demand for the opt out to have legal status]

Opt out needs legal status

Paul Bate talked about missing pieces of understanding on secondary uses, for [56:00] [3] “Commissioners, researchers, all the different regulators.” He gave an update, which assumed secondary use of data as the norm.

But he missed out any mention of the perceived cost of loss of confidentiality, and the loss of confidence since the failure to respect the 9Nu4 objections made in the 2014 aborted care.data rollout. That’s without mentioning that many did not even recall getting a leaflet, so those 700,000 objections came from the most informed.

When the public sees their opt out is not respected they lose trust in the whole system of data sharing. Whether for direct care, for use by an NHS organisation, or by any one of the many organisations vying to manage their digital health interaction and interventions. If someone has been told data will not be shared with third parties and it is, why would they trust any other governance will be honoured?
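Mechanically, honouring a recorded objection is trivial: filter flagged records out of every secondary-use extract before release. A minimal sketch, with illustrative field names rather than the real HSCIC schema:

```python
def secondary_use_extract(records: list[dict]) -> list[dict]:
    """Return only records whose holders have not objected to secondary use."""
    return [r for r in records if not r.get("secondary_use_objection", False)]

# Illustrative records, not real data.
patients = [
    {"id": "A", "secondary_use_objection": True},   # e.g. one of the 700,000 objections
    {"id": "B", "secondary_use_objection": False},
]

extract = secondary_use_extract(patients)
# Patient A's record must never appear in any secondary-use extract.
```

That the filter is a one-liner is part of what makes ‘we can’t tell you when it will be fixed’ so corrosive to trust: the barrier is organisational, not technical.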

Looking back at the flawed pre-care.data leadership thinking, that ‘no one who uses a public service should be allowed to opt out of sharing their records, nor can people rely on their record being anonymised’, and the resulting disastrous attempt at a rollout without communication, then a second at fair processing, the lessons learned should inform future projects. That includes care.data mark 2. The thinking is simply daft.

“You can object and your data will not be extracted and you can make no contribution to society,” Mr Kelsey answered a critic on Twitter in 2014, revealing that his thinking really hasn’t changed very much, even if he has been forced to make concessions. I should have said at #kfdigital15: ignoring what the public wants is not your call to make.

What legal changes will be made that back up the verbal guarantees given since February? If none are forthcoming, then were the statements made to Parliament untrue? 

“people should be able to opt out from having their anonymised data used for the purposes of scientific research.” [Hunt, 2014]

We are yet to see this legal change and, to date, the only publicly stated choice covers identifiable data only, not all data for secondary purposes including anonymous data, as offered by the Minister in February 2014, and by David Cameron in 2010.

If Mark Davies is being honest about how important he feels trust is to data sharing, implementing the objection should be a) prioritised and b) given legal footing.

Risks and benefits: the need for a new social contract on data

Simon Denegri recently wrote [5] he believes there are “probably five years to sort out a new social contract on data in the UK.”

I’d suggest less, if high-profile data-based projects or breaches irreparably damage public trust first, whether in the NHS or the consumer world. The public will choose to share less and less.

But the public cannot afford to lose the social benefits that those projects may bring to the people who need them.

Big projects, such as care.data, cannot afford for everyone’s sake to continue to repeatedly set off and crash.

Smaller projects, those planned and in progress by each organisation and attendee at the King’s Fund event, cannot afford for those national mistakes to damage the trust the public may otherwise hold in the projects at local level.

I heard care.data mentioned five different times over the two-day event as having harmed different projects through lost trust or delays. We even heard examples of companies in Scotland going bust when rollouts slowed data access, compounded by austerity.

Individuals cannot afford for their reputation to be harmed through association, or by using data in ways the public finds unreasonable and getting splashed across the front page of the Telegraph.

Clarity is needed for everyone using data well, whether for direct care with implied consent or secondary uses without it, and it is in the public interest to safeguard access to that data.

A new social contract on data would be good all round.

Reputational Risk

The June 6th story of the 700,000 unrespected opt outs has been and gone. But the issue has not.

Can organisations continue to use that data ethically and legally knowing it is explicitly without consent?

“When will those objections be implemented?” should be a question that organisations across the country are asking – if reputational risk is a factor in any datasharing decision making – in addition to the fundamental ethical principle: can we continue to use the data from an individual from whom we know consent was not freely given and was actively withheld?
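The ethical principle translates into a very small technical one: an objection has to be applied before data leaves, not patched up afterwards. A minimal sketch of the idea follows; the field names and objection flag are my own illustration, not any real HSCIC schema:

```python
# Minimal sketch: honouring patient objections before a secondary-use
# extract. Field names and the "objection_recorded" flag are
# illustrative only, not any real HSCIC schema.

def build_secondary_use_extract(records):
    """Return only records from patients with no recorded objection."""
    return [r for r in records if not r.get("objection_recorded", False)]

patients = [
    {"nhs_number": "001", "objection_recorded": False},
    {"nhs_number": "002", "objection_recorded": True},   # opted out
    {"nhs_number": "003", "objection_recorded": False},
]

extract = build_secondary_use_extract(patients)
# Patient 002's data never leaves the practice system: the opt out is
# applied at extraction, not after sharing has already happened.
```

The point of the sketch is the ordering: once data has flowed onward without the filter, no later governance can un-share it.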

What of projects that use HES or hospital secondary care sites’ submitted data and rely on the HSCIC POM mechanisms? How do those audits or other projects take HES secondary objections into account?

Sir Nick Partridge said in the April 2014 HSCIC HES/SUS audit that there should be ‘no surprises’ in future.

That future is now. What has NHS England done since to improve?

“Consumer confidence appears to be fragile and there are concerns that future changes in how data may be collected and used (such as more passive collection via the Internet of Things) could test how far consumers are willing to continue to provide data.” [CMA Consumer report] [6]

The problem exists across both state and consumer data sharing. It is not a matter of if, but when, these surprises are revealed to the public with unpredictable degrees of surprise and revulsion, resulting in more objection to sharing for any purposes at all.

The solutions exist: meaningful transparency, excluding commercial purposes which appear exploitative, consensual choices, and no surprises. Shape communications processes by building future change into today’s programmes, to future-proof trust.

Future-proofing does not mean making the purpose and use of data so vague as to be all-encompassing. That is exactly what the public said at care.data listening events they do not want and will not find sufficient to trust, nor, I would argue, would it meet legally adequate fair processing. Instead, every plan today must build and budget for mechanisms to inform patients of future changes to the use or users of data already gathered, and to offer them a new choice to object or consent. And they should have a way to know who used what.

The GP who asked the first of the only three questions possible in the ten-minute Q&A had taken away the same message as I had: the year 2020 is far too late as a public engagement goal. There must be much stronger emphasis on it now. And it is actually very simple: do what the public has already asked for.

The overriding lesson must be, the person behind the data must come first. If they object to data being used, that must be respected.

It starts with fixing the opt outs. That must happen. And now.

Public confidence is not a game [7]. Reputational risk is not something organisations should be forced to gamble with to continue their use of data and potential benefits of data sharing.

If NHS England, the NIB or the Department of Health know how and when it will be fixed, they should say so. If they don’t, they had better have a darn good reason why, and tell us that too.

‘No surprises’, said Nick Partridge.

The question decision makers must address for data management is, do they continue to be part of the problem or offer part of the solution?

******

References:

[1]The Telegraph, June 6th 2015 http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[2]  June 17th NIB meeting http://www.dh-national-information-board.public-i.tv/core/portal/webcast_interactive/180408

[3] NIB papers / workstream documentation https://www.gov.uk/government/publications/plans-to-improve-digital-services-for-the-health-and-care-sector

[4] care.data listening feedback http://www.england.nhs.uk/wp-content/uploads/2015/01/care-data-presentation.pdf

[5] Simon Denegri’s blog http://simondenegri.com/2015/06/18/is-public-involvement-in-uk-health-research-a-danger-to-itself/

[6] CMA findings on commercial use of consumer data https://www.gov.uk/government/news/cma-publishes-findings-on-the-commercial-use-of-consumer-data

[7] New RSS research finds data trust deficit with lessons for policymakers: http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[8] Caldicott review: information governance in the health and care system

Off the record – a case study in NHS patient data access

Patient online medical records’ access in England was promised by April 2015.

Just last month, headlines abounded: “GPs ensure 97% of patients can access summary record online“. Speeches carried the same statistics. So what did that actually mean? The HSCIC figures released in May 2015 showed that while around 57 million patients can potentially access something of their care record, only 2.5 million, or 4.5% of patients, had actively signed up for the service.

In that gap lies a gulf of difference. You cannot access the patient record unless you have signed up for it, so to give the impression that 97% of patients can access a summary record online is untrue. Only 4.5% can, and have done so. Yes, the headline figure assumes patients must first request access, but the impression given is misrepresentative.
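The arithmetic behind that gap can be checked in a couple of lines, using the rounded figures quoted above (a sketch, not the HSCIC’s own calculation):

```python
# Headline eligibility vs actual sign-ups, using the rounded figures
# from the May 2015 HSCIC release quoted above.
potentially_eligible = 57_000_000  # patients whose practice offers online access
signed_up = 2_500_000              # patients who have actively signed up

share = signed_up / potentially_eligible * 100
print(f"{share:.1f}% of patients have signed up")  # prints 4.4%, nowhere near 97%
```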

Here’s my look at what that involved and once signed up, what ‘access your medical records’ actually may mean in practice.

The process to getting access

First I wrote a note to the practice manager about a month ago, and received a phone call a few days later to pop in any time. A week later, I called to check before I would ‘pop in’ and found that the practice manager was not in every day, and it would actually have to be when she was.

I offered to call back and arrange a suitable date and time. On the next call, we usefully agreed a potential date I could go in, but I’d have to wait for confirmation that the paper records had first been retrieved from the external store (since another practice closed down, ours had become busier than ever and run out of space). I was asked whether I had already received permission from the practice manager, and to confirm that I knew there would be a £10 charge.

So, one letter, four phone calls and ten pounds in hard cash later, I signed a disclosure form this morning to say I was me and had asked to see my records, and sat in a corner of the lovely practice manager’s office with a small, thinly stuffed Lloyd George envelope, a few photocopied or printed-out A4 pages (so I didn’t get to actually look at the on-screen record the GP uses) and a receipt.

What did my paper records look like?

My oldest notes on paper went back as far as 1998 and were for the most part handwritten. Having lived abroad, there was then a ten-year gap until my new registration, when notes moved onto paper prints of electronic notes.

These included referrals for secondary care, and correspondence between consultants and my GP and/or to and from me.

The practice manager was very supportive and tolerant of me taking up a corner of her office for half an hour. Clutching a page with my new log-in for the EMIS web for patient records access, I put the papers back, said my thank yous and set off home.

Next step: online

I logged on at home to the patient access system. I had first been given access in 2009 when I registered, but hadn’t used the system since, as it had very limited functionality and I had been in good health. Now I took the opportunity to try it again.

By asking at the GP practice reception, I had been assigned a PIN and given the Practice ID, an Access ID and confirmation of my NHS number, all of which needed entering in Step 1:

[screenshot: registration, step 1]

Step 2: on the next screen, I was asked for my name, DOB, and to create a password.

[screenshot: registration, step 2]

Step 3: the system generated a long number user ID which I noted down.

Step 4: I looked for the data sharing and privacy policy. I didn’t spot with whom data entered would be shared, for what purposes, or any retention periods or restrictions on purposes. I’d like to see that added.

[screenshot: registration, step 3]
Success:

Logged on with my new long user ID and password, I could see an overview page with personal contact details, which were all accurate. There were sections for current meds, allergies, appointments, the medical record, a personal health record and repeat prescriptions. There was space for an overview of height, BMI and basic lifestyle (alcohol and smoking) there too.

[screenshot: record overview]

A note from 2010 read: “refused consent to upload national. sharing. electronic record.” Appropriately, some may think, this was recorded in the “problems” section, which was otherwise empty.

Drilling down into the medication record, the only data held was the single most recent top-line prescription, without any history.

[screenshot: medication record]

And the only other section to view was allergies, similarly and correctly empty:

[screenshot: allergies]

The only error I noted was a line to say I was due an MMR immunization in June 2015. [I will follow up to check whether one of my children should be due for it, rather than me.]

What else was possible?

Order a repeat prescription: if your practice offers this service, there is a link called Make a request in the Repeat Prescriptions section of the home page after you have signed in. This was possible. Our practice already does it directly with the pharmacy.

Book an appointment: with your own GP, from dates in a drop-down.

Apple Health app integration: the most interesting part of online access was the suggestion that it could upload a patient’s Apple Health app data and, with active patient consent, share it with the GP.

[screenshot: Apple Health app integration]

It claims: “You can consent to the following health data types being shared to Patient Access and added to your Personal Health Record (PHR):”

  • Height
  • Weight
  • BMI
  • Blood Glucose
  • Blood Pressure (Diastolic & Systolic)
  • Distance (walked per day)
  • Forced expired volume
  • Forced Vital capacity
  • Heart Rate
  • Oxygen Saturation
  • Peak Expiratory Flow
  • Respiratory rate
  • Steps (taken per day)

“This new feature is only available to users of IOS8 who are using the Apple Health app and the Patient Access app.”

 

With the important caveat for some: iOS 8.1 has removed the ability to manually enter Blood Glucose data via the Health app. Health will continue to support Blood Glucose measurements added via third-party apps such as MySugr and iHealth.

Patient Access will still be able to collect any data entered and we recommend entering Blood Glucose data via one of those free apps until Apple reinstate the capability within Health.
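For context, the data types listed map naturally onto standard quantity-type identifiers in Apple’s HealthKit framework, which is presumably what the Patient Access app reads from. The pairing below is my own reading of the list, not taken from EMIS documentation:

```python
# Mapping of the Patient Access PHR data types listed above to the
# corresponding HealthKit quantity type identifier strings. The pairing
# is my own reading of the list, not taken from EMIS documentation.
PHR_TO_HEALTHKIT = {
    "Height":                     "HKQuantityTypeIdentifierHeight",
    "Weight":                     "HKQuantityTypeIdentifierBodyMass",
    "BMI":                        "HKQuantityTypeIdentifierBodyMassIndex",
    "Blood Glucose":              "HKQuantityTypeIdentifierBloodGlucose",
    "Blood Pressure (Systolic)":  "HKQuantityTypeIdentifierBloodPressureSystolic",
    "Blood Pressure (Diastolic)": "HKQuantityTypeIdentifierBloodPressureDiastolic",
    "Distance (walked per day)":  "HKQuantityTypeIdentifierDistanceWalkingRunning",
    "Forced expired volume":      "HKQuantityTypeIdentifierForcedExpiratoryVolume1",
    "Forced vital capacity":      "HKQuantityTypeIdentifierForcedVitalCapacity",
    "Heart Rate":                 "HKQuantityTypeIdentifierHeartRate",
    "Oxygen Saturation":          "HKQuantityTypeIdentifierOxygenSaturation",
    "Peak Expiratory Flow":       "HKQuantityTypeIdentifierPeakExpiratoryFlowRate",
    "Respiratory rate":           "HKQuantityTypeIdentifierRespiratoryRate",
    "Steps (taken per day)":      "HKQuantityTypeIdentifierStepCount",
}
```

If data of this kind does start flowing into GP records, each of these fourteen streams becomes clinical data subject to the same sharing and objection questions discussed above.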

What was not possible:

To update contact details: The practice configures which details you are allowed to change. It may be their policy to restrict access to change some details only in person at the practice.

Viewing my primary care record: other than the current medication, there was nothing of my current records in the online record. Things like test results were not in my online record at all, only on paper. Pulse noted sensible concerns about this area in 2013.

Make a correction: clearly the MMR jab note is wrong, but I’ll need to ask for help to remove it.

“Currently the Patient Access app only supports the addition of new information; however, we envisage quickly extending this functionality to delete information via the Patient Access service.” How this will ensure accuracy and avoid self-editing, I am unsure.

Questions: Who can access this data?

While the system stated that “the information is stored securely in our accredited data centre that deals solely with clinical data”, there is no indication of where that is, who manages it, or who may access it and why.

In 2014 it was announced that pharmacies would begin to have access to the summary care record.

“A total of 100 pharmacies across Somerset, Northampton, North Derbyshire, Sheffield and West Yorkshire will be able to view a patient’s summary care record (SCR), which contains information such as a patient’s current medications and allergies.”

Yet clearly, in the 2010 Summary Care Record consent process recorded in my notes, pharmacists were not mentioned.

Does the patient online access also use the Summary Care Record or not? If so, did I by asking for online access, just create a SCR without asking for one? Or is it a different subset of data? If they are different, which is the definitive record?

Overall:

From the stories we read, it appears there are broad discrepancies between what is possible in one area of the country and another, and between one practice and another.

Clearly, to give the impression that 97% of patients can access summary records online is untrue to date, if only 4.5% can actually get onto an electronic system and see any part of their records on demand today.

How much value is added for the patients and practitioners in that 4.5% may vary enormously, depending upon what functionality has been enabled at different locations.

For me as a rare user of the practice, there is no obvious benefit right now. I can book appointments during the day by telephone and meds are ordered through the chemist. It contained no other information.

I don’t know what evidence base came from patients to decide that Patient Online should be a priority.

How many really want and need real-time, online access to their records? Would patients not far rather that, in these times of austerity, the cash, time and IT expertise be focused on IT in direct care, visible to their medics? So that when they visit hospital, their records would be available to the different departments within the hospital?

I know which I would rather have.

What would be good to see?

I’d like to see a much clearer distinction between the purposes for which we share data, direct and indirect, and on what legal basis each rests.

Not least because it needs to be understandable within the context of data protection legislation. There is often confusion in discussions of what consent can be implied for direct care and where to draw its limit.

The outcome of the consultation launched in June 2014 is still to be published, though it ended in August 2014, and it too blurred the lines between direct care and secondary purposes. (https://www.gov.uk/government/consultations/protecting-personal-health-and-care-data)

Secondly, if patients start to generate potentially huge quantities of data in the Apple link and upload it to GP electronic records, we need to get this approach correct from the start. Will that data be onwardly shared by GPs through care.data for example?

But first, let’s start with tighter use of language on communications. Not only for the sake of increased accuracy, but so that as a result expectations are properly set for policy makers, practitioners and patients making future decisions.

There are many impressive visions and great ideas how data are to be used for the benefit of individuals and the public good.

We need an established, easy-to-understand legal and ethical framework for our data sharing in the NHS to build on, to turn benefits into an achievable reality.

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over as the NHS England board delegated the final approval of directions to their Chair, Sir Malcolm Grant and Chief Executive, Simon Stevens, on Thursday May 28.

These directions from NHS England will enable the HSCIC to proceed with the next stage of delivery of their pathfinder pilots.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16-month pause, as long as the gestation of a walrus, it appears the directions had flaws. Technical fixes were also needed before the plan could proceed to extract GP care.data and merge it with our hospital data at the HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now no longer be eligible. Commercial data intermediaries? They can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, communications must include an opt out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not been. There is still a lack of transparency: NHS England’s communications materials, and the May to October 2014 and any 2015 programme board minutes, have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun, they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014, before the pause began. From the public listening exercise, the high-level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same, and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When a board member asked, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, the answer wasn’t very clear. [Full detail at the end of this post.]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas.] At least that was what the verbal background given at the board meeting explained.

If the pilots are to be a toe in the water for how a national rollout will proceed, then they need to test not just for today, but for at least the known future of changing content scope and expanding users. Who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will it not make these complications worse under pressure? There may be external pressure ahead as well, as potential changes to EU data protection are expected this year, for which the pilot must be prepared, designing in advance for the expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this if we believed it was absolutely critical in the interests of patients.” [Malcolm Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to make the best of a bad job, we’ve not even asked: are we doing the right thing at all? Is the system designed to best support doctor and patient needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If the focus is on the success of the programme and not the patient, consider this: there’s a real risk that too many opt out, due to these unknowns and the lack of real choice in how their data gets used. It could be done better to reduce that risk.

What’s the percentage of opt out that the programme deems a success to make it worthwhile?

In March 2014, at a London event, a GP told me all her patients who were opting out were the newspaper reading informed, white, middle class. She was worried that the data that would be included, would be misleading and unrepresentative of her practice in CCG decision making.

medConfidential has written a current status for pathfinder areas that makes great sense: focus first on fixing care.data’s big post-election question, the opt out that hasn’t been put into effect. Of course, in February 2014 we had to choose between two opt outs, so how will that work for pathfinders?

In the public interest we need collectively to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, is as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process , one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, at the advisory group last week, is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed, that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility, that they have under the law  in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

Perhaps it may help if I forward this to the board, It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] Request for the release of the June 2014 Open House feedback, still to be published, in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time which the pause in the care.data programme has taken to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: https://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed


15 Jan 2024: Image section in header replaced at the request of likely image tracing scammers who don’t own the rights and since it and this blog is non-commercial would fall under fair use anyway. However not worth the hassle. All other artwork on this site is mine.