Category Archives: transparency

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex. Too little, too late.

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex ed. They should be discussing prevention and protection for all our personal data, not just one company’s, after the event.

Everyone’s been talking about TalkTalk, and for all the wrong reasons: data loss and a 15-year-old suspect, combined with a reportedly reckless response on data protection, compounded by a lack of care.

As Rory Cellan-Jones wrote [1], rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The Science and Technology Committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the National Cyber Security Programme [NCSP]. [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested that British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again [5]. An organisation we trusted has abused that trust by not looking after data with the stringency that customers should be able to expect in the 21st century, and by reportedly not making preventative changes for flaws apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government expects of commercial companies as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where data are used for purposes beyond those we expect, or beyond those explained when we submit our data. And there needs to be a change in risky behaviour in physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean it has broken its contract along with consumer trust. Government departments should also be asking whether their data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply the same logic to government handling of data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own comparable lack of data understanding, and what good practice looks like in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question of whether government can be trusted with the public data it already has, and whether its current handling makes it trustworthy to take more [9]

Vaizey, with the ICO now in his own department, said: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines.”

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service, under current policy, seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And trust, once given, is undermined by changing the purposes or scope of use for which it was given, as care.data plans to do after its pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this, and make suggestions to support trust, such as the Shakespeare Review’s ‘data should never be (and currently is never) released with personal identifiers’, have been ignored by government.

Where government departments do not use our personal data well themselves, they seem content, to date, to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because you pass data on to others. Practices such as giving individual level identifiable personal data to third parties without informing the public, or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at commercial data use poor practices, the care.data debacle is evidence that not all its MPs or civil service understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use often piggy-backs on the public use of our personal data, and how others get access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? Inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, they also need to look closer to home and fix what is broken in government data handling where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, sensitive, individual level data from at least 8m children’s records, ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not just protection, is what we should champion. Rather than protection after the event, MPs and the public must demand emphasis on preventative measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering, surveillance in any of its many forms, is not any less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection: we don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government wait, fingers crossed each month, to see if our data are used safely in unsecured settings by the unknown partners data might be onwardly shared with, hoping we won’t find out and it won’t need to talk about it? Or will it have a grown-up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing a sample of individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data, so the topic should be familiar in Parliament, but little engagement has come about as a result. It suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal and individual level data, for unknown purposes, from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population wide health data ‘below the radar’.

Change: we need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] https://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

George and the Chinese Dragon. Public spending and the cost of dignity.

In 2005 I sat early one morning in an enormous international hotel chain’s breakfast room, in Guangzhou.

Most of the men fetching two adult breakfasts from the vast buffet wore cream-coloured chinos and button-down shirts. They sported standardised haircuts with hints of silver. Stylish women sat at impeccable tables, cradling babies in pink hats or spoon-feeding small children.

On a busy downtown street, close to the Chinese embassy, the hotel was popular with American parents-to-be.

My local colleague explained to me later that her sadness over thousands of Chinese daughters exported from a one-child-policy nation in 2005 was countered by the hope that loving foreign families were found for them.

She repeated, with dignity, party mantras and explanations drilled at school. She has a good job (but still she could not afford children). Too little land, too few schools, and healthcare too expensive. She sighed. Her eyes lit up as she looked at my bump and asked if I knew “girl or boy?” If it were a girl, she added, how beautiful she would be, with large open eyes. We laughed about the contradictory artificial stereotypes of beauty, from East and West, each nation wanting what the other did not have.

Ten years later, in 2015, British Ministers have often been drawing on China as a model for us to follow; in health, education and for the economy. Seeking something they think we do not have. Seeking to instill ‘discipline, hard-working, economy-first’ spin.

At the recent ResearchEd conference, Nick Gibb [1], Minister of State at the Department for Education, talked for several minutes about the BBC documentary “Are Our Kids Tough Enough” and the positive values of China and its education system. It supposedly triggered ‘a global debate’ when British pupils experienced “the harsh discipline of a Chinese classroom”.

The Global Times praised Mr. Osborne as “the first Western official in recent years who focused on business potential rather than raising a magnifying glass to the ‘human rights issue’” during his recent visit [2], when he put economic growth first.

Jeremy Hunt, Secretary of State for Health, was quoted at the party conference saying that he saw tax credit changes as necessary for a cultural shift. He suggested we should adopt the ‘hardworking’ character of the Chinese.

An attribute that is as artificial as it is inane.

Collective efforts over the last year or more to project ‘hard-working’ into politics as a measure of contribution to UK society have become more concentrated, especially around the election. People who are not working are undermined by statements implying that the less productive someone is for the nation, the less value they have as a person. Comments are repeated in a sustained drip feed, from Lord Freud’s remarks a year ago that disabled workers were not worth the full wage, to Hancock’s recent revelation that the decision not to apply the new minimum wage to the under-25s from 2016 “was an active policy choice.” Mr. Hunt spoke about dignity being self-earned, not dependent on richness per se, but on being self-made.

“If that £16,500 is either a high proportion or entirely through the benefit system you are trapped. It matters if you are earning that yourself, because if you are earning it yourself you are independent and that is the first step towards self-respect.”

This choice to value some people’s work less than others’, and the acceptance of spin, is concerning.

What values are Ministers suggesting we adopt in the relentless drive for economic growth? [3] When our Ministers ignore human rights and laud Chinese values in a bid to be seen as an accepting trading partner, I wonder at what cost to our international integrity?

Simple things we take for granted, such as unimpeded internet access, are not available in China. In Chinese society, hard working is not seen as such a positive value. It is a tolerated norm, and sometimes an imposed one at that, where parents leave their child with grandparents in the countryside and visit twice a year on leave from their city-based jobs. Our Ministers’ version of hardworking Chinese is idyllic spin compared with reality.

China is about to launch a scheme to measure sincerity and how each citizen compares with others in terms of compliance and dissent. Using people’s social media data to determine their ‘worth’ is an ominous prospect.

Mark Kitto from 2012 on why you’ll never be Chinese is a great read. I agree, “there are hundreds of well-rounded, wise Chinese people with a modern world view, people who could, and would willingly, help their motherland face the issues that are growing into state-shaking problems .”

Despite such institutional issues, Mr. Osborne appears to have an open door for deals with the Chinese state. Few people missed the announcements he made in China that HS2 will likely be built by Chinese investors, despite home-grown opposition. Ministers and EDF have reportedly agreed a controversial £25bn development of the Hinkley Point C nuclear plant, with most of the upfront costs provided by Chinese companies, although “we are the builders.” [4]

With large parts of UK utilities’ infrastructure founded on Chinese-sourced spending in the UK, it’s hard to see who ‘we’ are meant to be. [5] And that infrastructure is a two-way trade. Just as Chinese money has bought many of our previously publicly owned utilities, we have sold a staggeringly long list of security related items to the Chinese state. [6]

In July 2014 the four House of Commons Select Committees “repeated their previous Recommendation that the Government should apply significantly more cautious judgements when considering arms export licence applications for goods to authoritarian regimes which might be used for internal repression.”

[image: UK to China exports]
Chris Patten, former Hong Kong Governor, criticised Osborne’s lax attitude to human rights, but individual and collective criticism appears to go unheard.

This perhaps is one measure of British economic growth at all costs. Not only is Britain supplying equipment that might be used for internal repression, but the Minister appears to have adopted a singularly authoritarian attitude, and the democratic legitimacy of the Committees has been ignored. That is concerning.

How upcoming cuts will be packaged is clear. We will find out what “hard working families” means to the Treasury. We need to work harder, like the Chinese, and through this approach we will earn our dignity. No doubt rebuilding Britain on great British values. Welfare will continue to be labelled as benefits, and with it comes a value judgement equating economic productivity with human worth. Cutting welfare will be packaged as helping people to help themselves out of self-inflicted ‘bad’ situations, in which they have lost their self-worth or found an easy ‘lifestyle choice’.

As welfare spending is reduced, the percentage spent with big service providers has risen after reforms, and private companies profit where money was once recycled within the state system. There is a glaring gap in the evidence for some of these decisions.

What is next? If for example, universal benefits such as Universal Infant Free School Meals are cut, it will take food literally from the mouths of babes, in families who cannot afford to lose hot school dinners, living in poverty but not qualifying for welfare. The policy may be flawed because Free School Meals based on pupil premium entitlement does not cater for all who need it, but catering for none of them is not an improvement.

Ministers focus the arguments of worth and value around the individual. Doctors have been told to work harder. Schools have been told to offer more childcare to enable parents to work harder. How much harder can we really expect people to work? Is the Treasury’s vision for us all to work more to pay more taxes? It is flawed if, by adopting that political aim, the vast majority of people take home little more pay and sacrifice spare time with their friends and loved ones, running their health into the ground as a result.

The Chinese have a proverb that shows a wisdom missing from Ministers’ recent comments: “Time is money, and it is difficult for one to use money to get time.”

I often remember the hotel breakfast room, and wonder how many mothers, in how many cities in China, miss their daughters, whom they could not afford to keep, through fear of the potential effect. How many young men who would want women in their lives find the gender imbalance a barrier to meeting someone. How many are struggling to care for elderly parents.

Not all costs can be measured in money.

The grandmother I met on the station platform last Wednesday had looked after her grandchild for half the day and has him overnight weekdays, so that Mum can first sleep and then work a night shift stacking shelves. That’s her daughter’s second shift of the day. She hardly sees her son.  The husband works the shelf-stacking third shift to supplement his income as a mechanic.

That is a real British family.

Those parents can’t work any harder. Their family is already at breaking point. They take no state welfare.  They don’t qualify for any support.

Must we be so driven to become ‘hard working families’ that our children will barely know their parents? Are hungry pupils to make do as best they can at lunchtime? Are these the side effects children must pay for, if their parents work ever harder to earn enough to live and to earn their ‘dignity’ as defined by the Secretary of State for Health?

Dignity is surely inherent in being human. Not something you earn by what you do. At the heart of human rights is the belief that everybody should be treated equally and with dignity – no matter what their circumstances.

If we adopt the Ministers’ be-like-the-Chinese mantra, and accept human dignity is something that must be earned, we should ask now what price have they put on it?

MPs must slay the dragon of spin and demand transparency of the total welfare budget and government spend with its delivery providers. There is a high public cost to further public spending cuts. To justify them, it is not the public who must work harder, but the Treasury: it must deliver a transparent business case for what the further sacrifices of ‘hard working families’ will achieve.

 

###

[1] ResearchEd conference, Nick Gibb, Minister of State at the Department for Education

[2] New Statesman

[3] https://www.opendemocracy.net/ournhs/jen-persson/why-is-government-putting-health-watchdogs-on-leash-of-%E2%80%98promoting-economic-growth

[4] The Sun: George Osborne party conference speech with 25 mentions of builders: “We are the builders”, said Mr. Osborne.

[5] The Drum: Li Ka Shing and British investment https://www.thedrum.com/opinion/2015/01/28/meet-li-ka-shing-man-o2-his-sights-has-quietly-become-one-britains-biggest

[6] Arms exports to authoritarian regimes and countries of concern worldwide The Committees http://www.publications.parliament.uk/pa/cm201415/cmselect/cmquad/608/60805.htm#a104

 

[image: Wassily Kandinsky ca 1911, George and the Dragon]

Digital revolution by design: infrastructures and the fruits of knowledge

Since the beginning of time and the story of the Garden of Eden, man has found a way to share knowledge and its power.

Modern digital tools have become the everyday way to access knowledge for many across the world, giving quick access to information and sharing power more fairly.

In this third part of my thoughts on digital revolution by design, triggered by the #kfdigi15 event on June 16-17, I’ve been considering some of the constructs we have built; those we accept and those that could be changed, given the chance, to build a better digital future.

Not only the physical constructions, the often networked infrastructures, but intangible infrastructures of principles and power, co-dependencies around a physical system; the legal and ethical infrastructures of ownership, governance and accountability.

Our personal data flow in systems behind the screens, at the end of our fingertips. Controlled in frameworks designed by providers and manufacturers, government and commercial agencies.

Increasingly in digital discussions we hear that the data subject, the citizen, will control their own data.

But if it is on terms and conditions set by others, how much control is real? How much is the talk of a consenting citizen merely a fig leaf behind which any real control is still held by the developer or organisation providing the service?

When data are used, they are turned into knowledge, business intelligence that adds value to aid informed decision making, by human or machine.

How much knowledge is too much knowledge for the Internet of Things to build about its users? As Chris Matyszczyk wrote:

“We have all agreed to this. We click on ‘I agree’ with no thought of consequences, only of our convenience.”

Is not knowing what we have agreed to our fault, or the responsibility of the provider who’d rather we didn’t know?

Citizens’ rights are undermined in unethical interactions if we are exploited by easy one-click access, exchanging our wealth of data at unseen cost. Can this be regulated to promote, not stifle, innovation?

How can we get those rights back and how will ‘doing the right thing’ help shape and control the digital future we all want?

The infrastructures we live inside

As Andrew Chitty says in this HSJ article: “People live more mobile and transient lives and, as a result, expect more flexible, integrated, informed health services.”

To manage that, do we need to know how systems work, how sharing works, and trust the functionality of what we are not being told and don’t see behind the screens?

At the personal level, whether we sign up for the social network, use a platform for free email, or connect our home and ourselves in the Internet of Things, we each exchange our personal data with varying degrees of willingness. There is often no alternative if we want to use the tool.

As more social and consensual ‘let the user decide’ models are introduced, we hear it’s all about the user in control, but the reality is that users still have to know what they sign up for.

In new models of platform identity sign-on, and tools that track and mine to the nth degree the personal data we share with the system, the paternalistic models of the past and the new models of personal control and social sharing are merging.

Take a Fitbit as an example. It requires a named account and data sharing with the central app provider. You can choose whether or not to enable ‘social sharing’ with nominated friends with whom you want to share your boasts or failures. You can opt out of only that part.

I fear we are seeing the creation of a Leviathan-sized monster that will be impossible to control and just as scary as today’s paternalistic data mismanagement: some data held by the provider and invisibly shared with third parties beyond our control, some shared with friends, and some stored only on our device.

When data are shared with third parties without our active knowledge, the same issue threatens to derail consumer products, as well as commercial ventures at national scale, and with them the public interest: loss of trust in what is done behind the settings.

Society has somehow seen privacy lost as the default setting. It has become something to have to demand and defend.

“If there is one persistent concern about personal technology that nearly everybody expresses, it is privacy. In eleven of the twelve countries surveyed, with India the only exception, respondents say that technology’s effect on privacy was mostly negative.” [Microsoft survey 2015, of  12,002 internet users]

There’s one part of that I disagree with. It’s not the effect of technology itself, but the decisions of designers and developers, that affect privacy. People, not technology, choose how to design and regulate what affects privacy.

Citizens have vastly differing levels of knowledge of how data are used and of how best to interact with technology. But if they are told they own their data, then the entire decision-making framework should be theirs too.

By giving consumers the impression of control, the shock is going to be all the greater if a breach should ever reveal where fitness wearable users slept and with whom, at what address, and were active for how long. Could a divorce case demand it?

Fitbit users have already found their data used by police and in the courtroom – probably not what they expected when they signed up for a better health tool. And benefits some users see could, by default, disadvantage others who are excluded from accessing the tool.

Some at organisational level still seem to find this hard to understand, but it is simple:
No trust = no data = no knowledge for commercial, public or personal use and it will restrict the very innovation you want to drive.

Google Gmail users have to make 10+ clicks to restrict all ads and information sharing in their privacy and ad account settings. The default is ad tailoring and data mining. Many don’t even know it is possible to change the settings, and how to do so is not intuitive.

Firms need to consider their own reputational risk if users feel these policies are not explicit and amount to exploitation. Those caught ‘cheating’ users can get a very public slap on the wrist.

Let the data subjects rule, but on whose terms and conditions?

The question every citizen signing up to digital agreements should ask is: what is in the small print, and how will I know if it changes? Fair processing should offer data protection, but isn’t effective.

If you don’t have access to information, you make decisions based on a lack of information or misinformation. Decisions which may not be in your own best interest or that of others. Others can exploit that.

And realistically and fairly, organisations can’t expect citizens to read pages and pages of Ts&Cs. In addition, we don’t know what we don’t know. Information that is missing can be as vital to understand as that provided. ‘Third party’ sharing – who exactly does that mean?

The concept of an informed citizenry is crucial to informed decision making but it must be within a framework of reasonable expectation.

How do we grow the fruits of knowledge in a digital future?

Real cash investment is needed now for a well-designed digital future, robust for cybersecurity, supporting enforceable governance and oversight. Collaboration on standards and thorough change plans. I’m sure there is much more, but this is a start.

Figurative investment is needed in educating citizens about the technology that should serve us, not imprison us in constructs we do not understand but cannot live without.

We must avoid the chaos and harm and wasted opportunity of designing massive state-run programmes in which people do not want to participate or cannot participate due to barriers of access to tools. Avoid a Babel of digital blasphemy in which the only wise solution might be to knock it down and start again.

Our legislators and regulators must take up their roles to get data use, and digital contract terms and conditions right for citizens, with simplicity and oversight. In doing so they will enable better protection against risks for commercial and non-profit orgs, while putting data subjects first.

To achieve greatness in a digital future we need: ‘people speaking the same language, then nothing they plan to do will be impossible for them’.

Ethics. It’s more than just a county east of London.

Let’s challenge decision makers to plant the best of what is human at the heart of the technology revolution: doing the right thing.

And from data, we will see the fruits of knowledge flourish.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public on how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told by government that it is necessary to share all our individual-level health data, from all sorts of sources.

And that bulk data collection is vital in the public interest, to gather the surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government and civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups, personalising how they get treated. Recent objection was to marketing misuse by charities.

There is little broad understanding yet of the insight organisations can now gain by tracking and profiling, given the power of algorithms and processing capability.

Technological progress has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first, is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second, is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous projects going on at grassroots level discussed over the two days: bringing the elderly online and connected, and work on housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communications with patients.

The NIB meeting didn’t have real public interaction, or any discussion of those projects ‘in the room’, in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to ask to understand the intended change and outcome much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

While they are struggling to get the current design together, it may be hard to invite public feedback on the future.

But it’s only made hard if what the public wants is ignored. If the issues raised at listening events were resolved in the way the public asked, it could be quite simple.

The NIB leadership clearly felt nervous about debate, giving only 10 minutes of three hours to public involvement, yet that is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need public debate and dissection by the clinical professions, to see if they fit the current and future model of healthcare; because if people are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote, “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating them has been forgotten. If the Internet serves things, it serves consumerism. AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer methods of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and their benefits hard to communicate, I’d say that if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how these projects that want their personal data work, or b) put up with being ignorant.

We should be able to fully question why it is needed and get a transparent and complete explanation. We should have fully accountable business plans and scrutiny of tangible and intangible benefits in public, before projects launch based on public buy-in which may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many would be able to put it succinctly: what is the NHS digital forward view, really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other, funding stops for what already works today.

Although the volume of activity planned is vast, what it boils down to is what is visionary and achievable, and not just a vision.

Digital revolution by design: building for change and people

We have opportunity to build well now, avoiding barriers-by-design, pro-actively designing ethics and change into projects, and to ensure it is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own development bubble as it shapes its plans within ever-changing infrastructures, in which data, digital, AI and ethics will become important to discuss together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The charter as part of the French Revolution set out a clear, understandable, ethical and fair framework of law in which they trusted their rights would be respected by fellow citizens.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over as the NHS England board delegated the final approval of directions to their Chair, Sir Malcolm Grant and Chief Executive, Simon Stevens, on Thursday May 28.

Directions from NHS England which will enable the HSCIC to proceed with their pathfinder pilots’ next stage of delivery.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16 month pause, as long as it takes for the gestation of a walrus, it appears the directions had flaws. Technical fixes are also needed before the plan could proceed to extract GP care.data and merge it with our hospital data at HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now no longer be eligible. Commercial data intermediaries? They can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, communications must include an opt out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not. There is still a lack of transparency. NHS England’s communications materials, and the May–Oct 2014 and 2015 programme board minutes, have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun, they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014, before the pause began. From the public listening exercise, the high-level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same, and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When asked by a board member, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, it wasn’t very clear. [full detail at the end of this post]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas.] At least that was what the verbal background given at the board meeting explained.

If the pilots are to be a test of the water for how national rollouts will proceed, then they need to test not just for today, but at least for the known future of changing content scope and expanding users – who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will it not make these complications worse under pressure? There may be external pressure ahead too: potential changes to EU data protection are expected this year, for which the pilot must be prepared and designed in advance to meet expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this if we believed it was absolutely critical in the interests of patients.” [Malcolm Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to make the best of a bad job, we’ve not even asked: are we doing the right thing? Is the system designed to best support doctor and patient needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If focus is on the success of the programme and not the patient, consider this: there’s a real risk too many people opt out due to these unknowns, and to the lack of real choice in how their data gets used. It could be done better to reduce that risk.

What’s the percentage of opt out that the programme deems a success to make it worthwhile?

In March 2014, at a London event, a GP told me all her patients who were opting out were the newspaper reading informed, white, middle class. She was worried that the data that would be included, would be misleading and unrepresentative of her practice in CCG decision making.

medConfidential has written a current status for pathfinder areas; it makes great sense to focus first on fixing care.data’s big post-election question: the opt out that hasn’t been put into effect. Of course, in February 2014 we had to choose between two opt outs – so how will that work for pathfinders?

In the public interest we need collectively to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, is as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process , one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, at the advisory group last week, is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed, that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility, that they have under the law  in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

Perhaps it may help if I forward this to the board, It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] request for the release of June 2014 Open House feedback still to be published in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time which the pause in the care.data programme has taken to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: https://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed


15 Jan 2024: Image in header replaced at the request of likely image-tracing scammers who don’t own the rights; since it and this blog are non-commercial it would fall under fair use anyway, but it’s not worth the hassle. All other artwork on this site is mine.

The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

They say ‘every little helps’.  care.data needs every little it can get.

In my new lay member role on the ADRN panel, I read submissions for research requests for any ethical concerns that may be reflected in wider public opinion.

The driving force for sharing administrative data research is non-commercial, with benefits to be gained for the public good.

So how do we quantify the public good, and ‘in the public interest’?

Is there alignment between the ideology of government, the drivers of policy [for health, such as the commissioning body NHS England] and the citizens of the country on what constitutes ‘the public good’?

There is public good to be gained, for example, from social and health data as a knowledge base, by using it in ‘bona fide’ research, often through linking with other data to broaden insights.

Insight that might result in improving medicines, health applications, and services. Social benefits that should help improve lives, to benefit society.

Although social benefits may be less tangible, they are no harder for the public to grasp than the economic, and often a no-brainer as long as confidentiality and personal control are not disregarded.

When it comes to money making from our data the public is less happy. The economic value of data raises more questions on use.

There is economic benefit to extract from data as a knowledge base to inform decision making, being cost efficient and investing wisely. Saving money.

And there is measurable economic public good in terms of income tax from individuals and corporations who by using the data make a profit, using data as a basis from which to create tools or other knowledge. Making money for the public good through indirect sales.

Then there is economic benefit from data trading as a commodity. Direct sales.

In all of these considerations, how does what the public feels and their range of opinions, get taken into account in the public good cost and benefit accounting?

Do we have a consistent and developed understanding of ‘the public interest’ and how it is shifting to fit public expectation and use?

Public concern

“The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.”  [Wellcome blog, April 2015]

If something is jeopardising that public good it is in the public interest to say so, and for the right reasons.

The loss of public trust in data sharing measured by public feeling in 2014 is a threat to data used in the public interest, so what are we doing to fix it and are care.data lessons being learned?

The three biggest concerns voiced by the public at care.data listening events[1] were repeatedly about commercial companies’ use, and re-use of data, third parties accessing data for unknown purposes and the resultant loss of confidentiality.

Question from Leicester: “Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.” [NHS Open Day, June 2014]

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial purposes.

Much of the debate and upset caused by the revelations of how our hospital episode statistics were managed in the past centred on the sense of loss of ownership. And with that, the inability to consent to who uses it. This despite acknowledgment that patients own their data.

Significant concern centres on use of the information gleaned from data that patients consider commercial exploitation. For use segmenting the insurance markets. For consumer market research. Using data for individual targeting. And its utter lack of governance.

There is also concern about data being directly sold or exchanged as a commodity.

These concerns were raised meeting after meeting in the 2014 care.data “listening process.”

To read in Private Eye that commercially sensitive projects were discussed in various meetings between NHS England and supermarket giant Tesco throughout 2014 [2] by the Patients and Information Director, responsible for care.data, is therefore all the more surprising.

They may of course be quite unrelated.

But when transparency is the mother of trust, it’s perhaps a surprising liaison while ‘listening’ to care.data concerns.

It could appear that greater confidentiality was given to the sensitivity of commercial meetings than citizens’ sensitive data.

Consent package deals may be a costly mistake

People are much more aware since care.data a year ago, that unknown third parties may access data without our consent.

Consent around secondary NHS data sharing and in wider fora is no longer an inconvenient ethical dilemma best left on the shelf, as it has been for the last 25 years in secondary use, dusted off in the care.data crisis. [3]

Consent is front and centre in the latest EU data protection discussions [4] in which consent may become a requirement for all research purposes.

How that may affect social science and health research use, its pros and cons [5] remain to be seen.

However, in principle consent has always been required and is good practice in applied medicine, despite the caveat for data used in medical research. As a general rule: “An intervention in the health field may only be carried out after the person concerned has given free and informed consent to it”. But this is consent for your care. Sharing information when looking after you, for direct care during medical treatment itself, is assumed and does not cause concern.

It is increasingly assumed, in discussions I have heard [at CCG and other public meetings], that because patients have given implied consent to sharing their information for their care, the same data may be shared for other purposes. It may not, and it is those secondary purposes that the public has asked, at care.data events, to see split up and differentiated.

Research uses are secondary uses, and consent for those purposes cannot ethically be assumed. However, legal gateways, which permit access to that data for clearly defined secondary purposes set out in law, may make that data sharing legal.

With that legal assumption comes, for the majority of people polls and dialogue show [though not for everyone 6b], a degree of automatic support for bona fide research in the public interest. But it is not a blanket for all secondary uses by any means, and it is this blanket assumption which has damaged trust.

So if data use in research assumes consent, and any panel is the proxy for personal decision making, the panel must consider the public voice and public interest in its decision making.

So what does the public want?

In those cases where there is no practicable alternative [to consent], there is still pressure to respect patient privacy and to meet reasonable expectations regarding use. The stated ambition of the CAG, for example, is to only advise disclosure in those circumstances where there is reason to think patients would agree it to be reasonable.

Whether or not active, rather than implied, consent becomes a requirement for research purposes without differentiation between kinds, the public already has different expectations of, and trust in, different users.

The biggest challenge for championing the benefits of research in the public good, may be to avoid being lumped in with commercial marketing research for private profit.

The latter’s misuse of data is an underlying cause of the mistrust now surrounding data sharing [6]. It has been a high price to pay for public health research and other research delayed since the Partridge audit.

Consent package deals mean that the public cannot choose how data are used in which kinds of research, and if not happy with one kind, may refuse permission for the other too.

By denying any differentiation between direct, indirect, economic and social value derived from data uses, the public may choose to deny all researchers access to all their personal data.

That may be costly to the public good, for public health and in broader research.

A public good which takes profit into account for private companies and the state, must not be at the expense of public feeling, reasonable expectations and ethical good practice.

A state which allows profit for private companies to harm the perception of good practice by research in the public interest has lost its principles and priorities. And lost sight of the public interest.

Understanding if the public, the research community and government have differing views on what role economic value plays in the public good matters.

It matters when we discuss how we should best protect and approach it moving towards a changing EU legal framework.

“If the law relating to health research is to be better harmonised through the passing of a Regulation (rather than the existing Directive 95/46/EC), then we need a much better developed understanding of ‘the public interest’ than is currently offered by law.”  [M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1]

In the words of Dr Mark Taylor, “we need to do this better.”

How? I took a look at some of this in more detail:

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

Update note: A version of these three posts was combined into an opinion piece – care.data: ‘The Value of Data versus the Public Interest?’ published on StatsLife on June 3rd 2015.

****

image via Tesco media

 

[1] care.data listening event questions: https://jenpersson.com/pathfinder/

[2] Private Eye – on Tesco / NHS England commercial meetings https://twitter.com/medConfidential/status/593819474807148546

[3] HSCIC audit and programme for change www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

[4] EU data protection discussion http://www.digitalhealth.net/news/EHI/9934/eu-ministers-back-data-privacy-changes

[5] Joint statement on EU Data Protection proposals http://www.wellcome.ac.uk/stellent/groups/corporatesite/@policy_communications/documents/web_document/WTP055584.pdf

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[6b] The ‘Dialogue on Data’ Ipsos MORI research 2014 https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx – commissioned by the Economic and Social Research Council (ESRC) and the Office for National Statistics (ONS) to conduct a public dialogue examining the public’s views on using linked administrative data for research purposes,

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

[10] Personalisation in health data plans http://www.england.nhs.uk/iscg/wp-content/uploads/sites/4/2014/01/ISCG-Paper-Ref-ISCG-009-002-Adult-Social-Care-Informatics.pdf

[11] Tim Kelsey Keynote speech at Strata November 2013 https://www.youtube.com/watch?v=s8HCbXsC4z8

[12] Forbes: Illumina CEO on the US$20bn DNA market http://www.forbes.com/sites/luketimmerman/2015/04/29/qa-with-jay-flatley-ceo-of-illumina-the-genomics-company-pursuing-a-20b-market/

 

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

Differentiation. Telling customers apart and grouping them by similarities is what commercial data managers want.

It enables them to target customers with advertising and sales promotion most effectively. They segment the market into chunks and treat one group differently from another.

They use market research data, our loyalty card data, to get that detailed information about customers, and decide how to target each group for what purposes.

As the EU states debate how research data should be used and how individuals should be both enabled and protected through it, they might consider separating research purposes by type.

While people are happy for the state to use their data without active consent for bona fide research, they are not for commercial consumer research purposes. [ref part 1].

Separating consumer and commercial market research from the state’s definition of research purposes for the public good could be key to rebuilding people’s trust in government data use.

Having separate purposes would permit separate consent and control procedures to govern them.

But what role will profit play in the state’s definition of ‘in the public interest’? Is it in the public interest if UK plc makes money from its citizens? And how far along any gauge of public feeling will a government be prepared to go, to push making money for UK plc at our own personal cost?

Pay-for-privacy?

In January this year, the Executive Vice President at Dunnhumby, Nishat Mehta, wrote in this article [7], about how he sees the future of data sharing between consumers and commercial traders:

“Imagine a world where data and services that are currently free had a price tag. You could choose to use Google or Facebook freely if you allowed them to monetize your expressed data through third-party advertisers […]. Alternatively, you could choose to pay a fair price for these services, but use of the data would be forbidden or limited to internal purposes.”

He too talked about health data; specifically about its value when accurate, expressed and consensual:

“As consumers create and own even more data from health and fitness wearables, connected devices and offline social interactions, market dynamics would set the fair price that would compel customers to share that data. The data is more accurate, and therefore valuable, because it is expressed, rather than inferred, unable to be collected any other way and comes with clear permission from the user for its use.”

What his pay-for-privacy model appears to have forgotten is that this future consensual sharing is based on the understanding that privacy has a monetary value. And that depends on understanding the status quo.

It is based on the individual realising that there is money made from their personal data by third parties today, and that there is a choice.

The extent of this commercial sharing and re-selling will be a surprise to most loyalty card holders.

“For years, market research firms and retailers have used loyalty cards to offer money back schemes or discounts in return for customer data.”

However, despite being signed up for years, I believe most of the public are unaware of the implied deal. It may be in the small print. But everyone knows that few read it, in the rush to sign up and save money.

Most shoppers believe the supermarket is buying our loyalty. We return to spend more cash because of the points. Points mean prizes, petrol coupons, or pounds off.

We don’t realise our personal identity and habits are being invisibly analysed to the nth degree and sold by supermarkets as part of those sweet deals.

But is pay-for-privacy discriminatory? By creating the freedom to choose privacy as a pay-for option, it excludes those who cannot afford it.

Privacy should be seen as a human right, not as a pay-only privilege.

Today we use free services online but our data is used behind the scenes to target sales and ads often with no choice and without our awareness.

Today we can choose to opt in to loyalty schemes and trade our personal data for points and with it we accept marketing emails, and flyers through the door, and unwanted calls in our private time.

The free option is to never sign up at all, but by doing so customers pay a premium through missed vouchers and discounts, or trade away the convenience of online shopping.

There is a personal cost in all three cases, albeit in a rather opaque trade off.

 

Does the consumer really benefit in any of these scenarios or does the commercial company get a better deal?

In a sustainable future, only a consensual system based on understanding and trust will work well. That is assuming that by ‘well’ we mean organisations wish to prevent PR disasters and the kind of practical disruption that resulted for NHS data in the last year through care.data.

For some people the personal cost of the infringement of privacy by commercial firms is great. Others care less. But once informed, there is a choice on offer even today: pay for privacy from commercial business, whether by paying a premium for goods when not signed up to loyalty schemes, or by paying with our privacy.

In future we may see a more direct pay-for-privacy offering along the lines Nishat Mehta describes.

And if so, citizens will be asking ever more about how their data is used in all sorts of places beyond the supermarket.

So how can the state profit from the economic value of our data but not exploit citizens?

‘Every little bit of data’ may help consumer marketing companies. Gaining it or using it in ways which are unethical, knowingly continuing bad practices, won’t win back consumers’ and citizens’ trust.

And whether it is a commercial consumer company or the state, people feel exploited when their information is used to make money without their knowledge and for purposes with which they disagree.

Consumer commercial use and use in bona fide research are separate in the average citizen’s mind, and understood as such in theory.

Achieving differentiation in practice in the definition of research purposes could be key to rebuilding consumers’ trust.

And that would be valid for all their data, not only what data protection labels as ‘personal’. For the average citizen, all data about them is personal.

Separating in practice how consumer businesses use data about customers to the benefit of company profits, how the benefits are shared on an individual basis in terms of a trade in our privacy, and how bona fide public research benefits us all, would help win continued access to our data.

Citizens need and want to be offered paths to see how our data are used in ways which are transparent and easy to access.

Cutting away purposes which appear exploitative from purposes in the public interest could benefit commerce, industry and science.

By reducing the private cost to individuals of the loss of control and privacy of our data, citizens will be more willing to share.

That will create more opportunity for data to be used in the public interest, which will increase the public good; both economic and social which the government hopes to see expand.

And that could mean a happy ending for everyone.

The Economic Value of Data vs the Public Good? They need not be mutually exclusive. But if one exploits the other, it has the potential to continue to be corrosive. UK plc cannot continue to assume its subjects are willing creators and repositories of information to be used for making money. [ref 1] Doing so has cost trust in all uses, not only those in which citizens felt exploited. [6]

The economic value of data used in science and health, whether to individual app creators, big business, or the commissioning state in planning and purchasing, is clear. It is perhaps not quantified or often discussed in the public domain, but it clearly exists.

Those uses can co-exist with good practices to help people understand what they are signed up to.

By defining ‘research purposes’, by making how data are used transparent, and by giving real choice in practice to consent to differentiated data for secondary uses, both commercial and state will secure their long term access to data.

Privacy, consent and separation of purposes will be wise investments for growth across commercial and state sectors.

Let’s hope they are part of the coming ‘long-term economic plan’.

****

Related to this:

Part one: The Economic Value of Data vs the Public Good? [1] Concerns and the cost of Consent

Part two: The Economic Value of Data vs the Public Good? [2] Pay-for-privacy and Defining Purposes.

Part three: The Economic Value of Data vs the Public Good? [3] The value of public voice.

****

image via Tesco media

[6] Ipsos MORI research with the Royal Statistical Society into the Trust deficit with lessons for policy makers https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

[7] AdExchanger January 2015 http://adexchanger.com/data-driven-thinking/the-newest-asset-class-data/

[8] Tesco clubcard data sale https://jenpersson.com/public_data_in_private_hands/  / Computing 14.01.2015 – article by Sooraj Shah: http://www.computing.co.uk/ctg/feature/2390197/what-does-tescos-sale-of-dunnhumby-mean-for-its-data-strategy

[9] Direct Marketing 2013 http://www.dmnews.com/tesco-every-little-bit-of-customer-data-helps/article/317823/

 

The future of care.data in recent discussions

Questions were raised at two health events this week, on the status of the care.data programme.

The most recent NHS England announcement about the care.data rollout progress was made in October 2014.

What’s the current status of Public Information?

The IIGOP review in December 2014 [1], set 27 criteria for the programme to address.

The public has not yet seen a response, but according to the GPES minutes one was made at the end of January.

Will it be released in the public domain?

An updated privacy impact assessment “was approved by the care.data programme board and will be published in February 2015.” It has not yet been made public.

Limited and redacted programme board materials were released, and the public waits to see if a business case or more will be released in the public interest.

Risks and issues have been redacted or not released at all, such as the risk register.

There is no business case in place, as confirmed on page 6 of the October 2014 board minutes. I find that astonishing.

It is hard to know if more material will be made public as recommended in their own transparency agenda.

What is the current state of open questions?

Professionals and public are still interested in the current plan, and discussions this week at the Roy Lilley chat with Dr. Sarah Wollaston MP, again raised some open questions.

1. What happened to penalties for misuse and ‘one strike and out’ ?

Penalties were promised in Parliament by Dr. Dan Poulter, Parliamentary Under Secretary of State at the Department of Health, a year ago. Questions about them are still being asked, without a clear public answer on all that has changed since then and what remains to be done:

care.data penalties are unclear

Poulter on care.data penalties

[Hansard, March 25 2014 ] [2]

Some changes are being worked on [written evidence to HSC]*[7], planned for autumn 2015. But does this clarify what has happened concretely to date, and how it will protect patients in the pathfinder?

“The department is working to table these regulations in Parliament in 2015, to come into force in the autumn.”

Did this happen? Are the penalties proportionate for big multi-nationals, or will other safeguards be introduced, such as making misuse a criminal offence, as suggested?

2. What about promises made on opt out?

One year on, the public still has had no fair processing of the personal data released by existing health providers. That data was extracted over the past twenty-five years, and its use by third parties was not public knowledge. (Data from hospital visits (HES), mental health, maternity data, etc.)

The opt out of all data sharing from secondary care such as A&E, stored at the HSCIC, was promised by Jeremy Hunt, Secretary of State for Health, a year ago, on February 25th 2014.

It has still not come into effect, nor been communicated:

Jeremy Hunt on care.data opt out

[Hansard February 25 2014, col 148] [3]

Jeremy Hunt MP

 

In fact, the latest news reported in the media was that the ‘type 2’ opt out was not working as expected. [4]

Many in the public have not been informed at all that they can request opt out, as the last public communication attempt failed to reach all households, yet their data continues to be released.

3. What about clarifying the purposes of the Programme?

The public remains unclear about the purpose of the whole programme and data sharing, noted at the Roy Lilley event:

A business case, and a risk benefit analysis would improve this.

Flimsy assurances based on how data may be used in the initial extraction will not be enough to assure the public how their data will be used in future and by whom, not just over the next six months or so.

Once released, data is not deleted, so a digital health footprint is not just released for care.data; it is given up for life. How much patients trust anonymous, pseudonymous and ‘de-identified’ data depends on the individual, but in a world where state-held data matching from multiple sources is becoming the norm, many in the public are sceptical. [5]

The controls over future use, and assurances that they are ‘rock solid’, will only be trustworthy if what was promised happens.

To date, that is not the case or has not been communicated.

What actions have been taken recently?

Instead of protecting the body which, in my opinion, has over the last two years achieved external scrutiny of care.data and ensured that promises made were kept, the independent assurance committee, the IAG, is to be scrapped.

The data extraction and data release functions are to be separated.

This could give the impression that data is no longer to be extracted only when needed for a specific purpose, and lends weight to the impression that all data is to be “sucked up” and the purposes defined later. If care.data is intended to replace SUS, that would not be a surprise.

It would however contravene fair processing under data protection, which requires the purposes of use to be generally clear before extraction. Should use change, it must still be fair. [For example, to have had consent for data sharing for direct care, but then to use the data for secondary uses by third parties, is such a significant change that one can question whether it falls under ‘fair’, looking at the ICO’s examples.]

So, what now? I asked Dr. Poulter, after the Guardian healthcare debate on Tuesday evening this week, about giving the opt out legal weight.
(I would have asked during the main session, but there was not enough time for all questions.)

care.data opt out open question

 

He was not able to give any concrete commitment to the opt out for HES data or care.data; in fact, he gave no answer at all.

What will happen next? Will the pathfinders be going live before the election in May? I asked.

Without any precise commitment, he said that everything was now dependent on Dame Fiona’s IIGOP response to the proposals [made by NHS England].

cd_metw2 Dan Poulter MP

 

What has happened to Transparency?

The public has not been given access to see what the NHS England response to the IIGOP/ Caldicott December review was.

The public has no visibility of what the risks are, as seen by the programme board.

The public is still unclear on what the expected benefits are, to measure those risks against.

And without a business case, the public does not know how much it is costing.

Without these, the public cannot see how the care.data board and DH is effectively planning, measuring progress, and spending public money, or how they will be accountable for its outcomes.

The sad thing about this is that transparency, and the “intelligent grown up debate” Sir Manning called for last year, would move this programme positively ahead.

Instead it seems secretive, which is not building trust. The deficit of that trust is widely recognised, and it still needs to be solidly rebuilt.

Little seems to have been done since last year to make it so.

Hetan Shah, executive director of the Royal Statistical Society, said: “Our research shows a ‘data trust deficit’. In this data-rich world, companies and government have to earn citizens’ trust in how they manage and use data – and those that get it wrong will pay the price.” [Royal Statistical Society, 22 July 2014] [6]

Shame.

Care.data is, after all, meant to be for the public good.

care.data purposes are unclear
It would be in the public interest to get answers to these questions from recent events.

 

refs:

1. IIGOP care.data report December 2014 https://www.gov.uk/government/publications/iigop-report-on-caredata

2. Hansard March 25th 2014: http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140325/halltext/140325h0002.htm

3. Hansard February 25th 2014: http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140225/debtext/140225-0001.htm

4. NHS England statement on Type 2 opt out http://www.england.nhs.uk/2015/01/23/data-opt-out/

5. Ipsos MORI June 2014 survey: https://www.ipsos-mori.com/researchpublications/researcharchive/3407/Privacy-and-personal-data.aspx

6. Royal Statistical Society on the ‘trust deficit’ http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

7. *additional note made, Sun 15th incl. reference HSC Letter from HSCIC

On Being Human – moral and material values

The long-running rumours of change afoot in human rights policy were confirmed recently, and they have been in the media and on my mind since.

Has human value become not just politically acceptable, but politically valuable?

Paul Bernal in his blog addressed the subject which has been on my mind, ‘Valuing the Human’ and explored the idea, ‘Many people seem to think that there isn’t any value in the human, just in certain kinds of human.’

Indeed, in recent months there appears to have been the creation of a virtual commodity, making this concept of human value “not just politically acceptable, but politically valuable.” The commodity of human value was starkly highlighted by Lord Freud’s recent comments on human worth. How much a disabled person should earn was the focus of the remarks, but they conflated the price of labour with human value.

European Rights undermined

Given the party policy announcements, and the response by others in government or the lack of it, it is unsurprising that those familiar with human rights feel they will be undermined should the policy proposals ever take effect. As the nation gears up into full electioneering mode for May 2015, we have heard much, after party speeches, about rights and responsibilities in our dealings with European partners, and about what Europe contributes to, or takes away from, our sovereignty in terms of UK law. There has been some inevitable back-slapping, and generalisation in some quarters that everything ‘Europe’ is bad.

Whether or not our state remains politically within the EU may be up for debate, but our tectonic plates are not for turning. So I find it frustrating when politicians speak, or we hear in the media, of ‘pulling out of Europe’ or similar.

This conflation of language is careless, but I fear it is also dangerous at a time when the right-wing fringe is taking mainstream votes, and politicians, in by-elections. Both here in the UK and in other European countries this year, far-right groups have taken significant votes.

Poor language about what ‘Europe’ is colours our common understanding of what ‘Europe’ means; the nuances of the roles of its organisational bodies, for example the differences between the European Court of Human Rights and the European Court of Justice, and their purposes, are lost entirely.

The values invoked in the debate are therefore misaligned with the organisations’ duties, and all things ‘European’, and the organisations themselves, are tarred with the same ‘interfering’ brush and devalued.

Human rights were not, at their heart, created by ‘Europe’, nor are they only some sort of treaty to be opted out from [though many are enshrined in treaties and Acts which were, and are], but their values risk being conflated with the structures which support them.

“A withdrawal from the convention could jeopardise Britain’s membership of the EU, which is separate to the Council of Europe whose members are drawn from across the continent and include Russia and Ukraine. Membership of the Council of Europe is a requirement for EU member states.” [Guardian, October 3rd – in a clearly defined article]

Participation in the infrastructure of ‘Brussels’, however, is convenient to conflate with values: a loss of sovereignty, a loss of autonomy, frivolous legislation. Opting out of a convention should not mean changing our values. However, it does seem the party attitude now on show seeks to withdraw from the convention. This would mean withdrawing the protections its structure offers. Would it mean withdrawing the rights offered equally to all citizens as well?

Ethical values undermined

Although it varies culturally, and with few exceptions, I think we do have in England a collective sense of what is fair, and of how we wish to treat each other as human beings. Increasingly, however, it feels as though, through loose use or abuse of language in political debate, we may be giving ground on our ethics. We are being forced to bring the commodity of human value to the podium, and declare on which side we stand in party politics. In a time of austerity, there is a broad range of ideas as to how.

Welfare has become branded ‘benefits’. Migrant workers are ‘foreigners’, over here for ‘benefit tourism’. The disabled are labelled ‘fit for work’ regardless of medical fact. Increasingly in the UK, it appears some citizens are being measured by their economic, material value: what they contribute to or take away from ‘the system’.

I have been struck by the contrast, coming back after 12 years abroad, to find England a place where the emphasis is on living to work, not working to live. If we are not careful, we see our personal output at work as a measure of our value. Are humans to be measured only in terms of our output, by our productivity, by our ‘doing’, or by our intrinsic value as an individual life? Or simply by our ‘being’? If indeed we go along with the concept that we are here to serve some sort of productive goal in society on an economic basis, then our measurement of the value of our ‘doing’ is made on a material basis.

“We hear political speeches talking about ‘decent, hardworking people’ – which implies that there are some people who are not as valuable.”

I strongly agree with this point in Paul’s blog. And, as he does, I disagree with the value statement it implies.

Minority Rights undermined

There are minorities and segments of society whose voice is being either ignored, or actively quietened. Those on the outer edge of the umbrella ‘society’ offers us, in our collective living, are perhaps least easily afforded its protections. Travelers, those deemed to lack capacity, whether ill, old or young, single parents, or ‘foreign’ workers, to take just some examples.

I was told this week that the UK has achieved a first. It was said that we are the first ‘first-world’ country under review by the CRPD for human rights abuse of the disabled. This can be neither confirmed nor denied by the UN, but a recent video indicated as much.

This is appalling in 21st century Britain.

Recently on the Radio 4 news I heard of thousands of ESA claimants assigned to work, although their medical records clearly state they are long-term unfit.

One group at risk, highlighted on October 15th in the Lords in the debate on changes to electoral records [col 206], is women in refuges, women who feel at risk. As yet I still see nothing to assure me that measures have been taken to look after this group, here or for care.data.{*}

These are just a few simplified sample groups that others have flagged as at risk. I feel these groups’ basic rights are being ignored, because for these minorities they can be. Are they viewed as of less value than the majority of ‘decent, hardworking people’, perhaps as having less economic worth to the state?

Politicians may say that any change will continue to offer assurances:
“We promote the values of individual human dignity, equal treatment and fairness as the foundations of a democratic society.”

But I simply don’t see it done fairly for all.

I see society being quite deliberately segmented into different population groups, weak and strong. Some groups need more looking after than others, and I am attentive when I hear of groups portrayed as burdens to society, in contrast with the rest, who are economically ‘productive’.

Indeed we seem to have reached a point at which the default position undermines the rights of the vulnerable, far from assigning additional responsibilities to those who should protect them.

This stance features often in media discussion and in political debate on health and social care: DWP workfare, JSA, the ‘bedroom tax’, to name but a few.


How undermining Rights undermines access

So, as the NHS England five year forward plan was announced recently, I wonder how the plan for the NHS and the visions for the coming five-year parliamentary term will align.

There is a lot of talking about plans, but more important is what happens as a result, not of what we say, but of what we do, or don’t do. Not only in future, but already today.

Politically, socially and economically we do not exist in silos. So too our human rights, which overlap these areas, should be considered together.

Recent years have seen a steady reduction of rights of access for the most vulnerable in society. Access to a lawyer or to judicial review has been made more difficult by charging for it. The Ministry of Justice is currently pushing for changes to judicial review law, though it seems to be losing that quest in the Lords.

If you are a working-age council or housing association tenant, the council limits your housing benefit claim if it decides you have ‘spare’ bedrooms. Changes have hit the disabled and their families hardest. These segments of the population are being denied or given reduced access to health, social and legal support.

Ethical Values need Championing

Whilst it appears the state increasingly measures everything in economic value, I believe the public must not lose sight of our ethical values, and must continue to challenge and champion their importance.

How we manage our ethics today is shaping our children. What do we want their future to be like? It will also be our old age. Will we by then be measured by our success in achievement, by what we ‘do’, by what we achieved financially in life, by our health, or by who we each are? Or, more intrinsically, will we even be judged based on our DNA?

Will the level of education we can access ever be decided by dint of our genes?

Old age brings its own challenges of care and health, and we are an ageing population. Changes today are sometimes packaged as shaping healthcare fit for the 21st century.

I’d suggest that current changes in medical research and the drivers behind parts of the NHS 5YP vision will shape society well beyond that.

What restrictions do we place on value and how are moral and material values to play out together? Are they compatible or in competition?

Because there is another human right we should remember in healthcare, that of striving to benefit from scientific improvement.

This is an area in which the rights of the vulnerable and the responsibilities to uphold them must be clearer than clear.

If Rights are undermined in research, it may impact Responsibilities for research

I would like to understand how the boundaries of science and technology are set, who sets them, and on what value basis, in ethics committees and beyond. How do these control or support the decision-making processes running in the background at NHS England, which have shaped this coming five-year policy?

It appears there are many decisions, on rare disease and on commissioning for example, which despite their terms of reference see limited or no public minutes, which hinders the transparency of their decision making.

The PSSAG has nothing at all. Yet they advise on strategy and hugely significant parts of the NHS budget.

Already we see fundamental changes of approach which appear to have economic rather than ethical reasons behind them. In stem-cell banking, this is a significant shift for the state away from its absolute belief in the non-commercialisation of human tissue, and yet little public debate has been encouraged.

There is a concerted effort from research bodies, and from those responsible for our phenotype data {*}, to undermine the stronger European data protection regulation coming in 2015, with attempts to amend EU legislation in line with [less stringent] UK policy. Policy which is questioned by data experts over its use of pseudonymisation, for example.

How striving to benefit from scientific improvement overlaps with the material values of ‘economic function’ is clear when we hear so often that UK Life Sciences are the jewel in the crown of the UK economy. Less spoken of is how this function overlaps with our moral values.

“We’ve got to change the way we innovate, the way that we collaborate, and the way that we open up the NHS.” [David Cameron, 2011]

care.data – “anticipating things to come” means confidence by design

“By creating these coloured paper cut-outs, it seems to me that I am happily anticipating things to come…I know that it will only be much later that people will realise to what extent the work I am doing today is in step with the future.” Henri Matisse (1869-1954) [1]

My thoughts on the care.data advisory event of Saturday September 6th: “Minority voices, the need for confidentiality and anticipating the future.”

Part one here >> Minority voices

This is Part two >> the need for confidentiality and anticipating the future.

[Video in full > here. Well worth a viewing.]

Matisse – The cut outs

Matisse, when he could no longer paint, took to cutting shapes from coloured paper and pinning them to the walls of his home. To start with, he found the process deeply unsatisfying; he felt it wasn’t right. Initially he was often unsure what he would make from a sheet. He pinned cutouts to his walls, but tacking things on as an afterthought and rearranging them superficially was never as successful as getting it right from the start.

As he became more proficient, he would cut a form out in one piece, from start to finish. He could visualise the finished piece before he started. His later work is very impressive, much more so in real life than on screen or poster. His cut outs took on life and movement; fronds would hang in the air, and multiple pieces which matched up were grouped into large scale collections on his walls. They became no longer just 2D shapes, but 3D, complete pictures.

They would tell a joined-up story, just as our flat 2D pieces of individual data will tell others the story of our colourful 3D lives, once they are matched and grouped together in longitudinal patient tracking from cradle to grave.

Data Confidentiality is not a luxury

From the care.data advisory meeting on September 6th, I picked out the minority voices I think we need to address better.

In addition to the minority groups, there are also cases in which privacy, for both children and adults, is more important to an individual than many of us allow for in the usual discussion. For those at risk of domestic violence, the ability to keep private information confidential is vital. In the cases when this fails, the consequences can be terrible. My local news told this week of just such a woman and child whose privacy was compromised.

“It is understood that the girl’s mother had moved away to escape domestic violence and that her ex-partner had discovered her new address.” (Guardian, Sept 12th)

This story has saddened me greatly.  This could have been one of my children or their classmates.

These are known issues when considering data protection, and for example are addressed in the RCGP Online Roadmap (see Box 9, p20).

“Mitigation against coercion may not have a clear solution. Domestic violence and cyberstalking by the abuser are particularly prevalent issues.”

Systems and processes can design in good privacy, or poor privacy, but the human role is a key part of the process, as human error can be the weakest link in the security chain.

Yet as regards care.data, I’ve yet to hear much mention of preventative steps in place, except an opt out. We don’t know how many people at local commissioning level will access how much of our data, and how often. This may go to show why I still have so many questions about how the opt out will work in practice, [5] and why it matters. It’s not a luxury; it can be vital to an individual. How much difference in safety does using identifiable versus pseudonymised data make, compared with real individual risk or fear?


“The British Crime Survey (BCS) findings of stalking prevalence (highest estimate: 22% lifetime, 7% in the past year) give a 5.5% lifetime risk of interference with online medical records by a partner, and a 1.75% annual risk.”
This Online Access is for direct care use. There is a greater visible benefit for the individual in accessing their own data than in care.data, for secondary uses. But I’m starting to wonder if, in fact, care.data is just one great big pot of data and the uses will be finalised later? Is this why scope is so hard to pin down?
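Why do data experts question pseudonymisation? A minimal sketch may help. This is purely illustrative, not any real NHS process, and all the names and values in it are invented: even after the direct identifier is replaced, the quasi-identifiers left in a record (date of birth, postcode) can still single a person out to someone who already knows them, which is exactly the domestic violence scenario above.

```python
# A minimal, hypothetical sketch of why pseudonymisation alone is
# questioned: the direct identifier is replaced, but quasi-identifiers
# remain and can re-link the record to a known person.
import hashlib

def pseudonymise(record: dict) -> dict:
    """Return a copy of the record with the direct identifier
    replaced by a salted hash (a common pseudonymisation step)."""
    out = dict(record)
    out["nhs_number"] = hashlib.sha256(
        b"secret-salt" + record["nhs_number"].encode()
    ).hexdigest()[:12]
    return out

# Invented example record.
record = {"nhs_number": "943 476 5919",
          "dob": "1980-03-14",
          "postcode": "AB1 2CD",
          "condition": "example"}

pseudo = pseudonymise(record)
assert pseudo["nhs_number"] != record["nhs_number"]

# Someone who already knows the victim's DOB and postcode
# (e.g. an abusive ex-partner) can still pick out the record:
match = pseudo["dob"] == "1980-03-14" and pseudo["postcode"] == "AB1 2CD"
print(match)  # prints True: the pseudonym did not prevent re-identification
```

The point of the sketch is only that removing the NHS number changes the spreadsheet, not the risk; the remaining fields can still identify a person to anyone with outside knowledge of them.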


The slides of who will use care.data included ‘the patient’ at this 6th September meeting. How, and why? I want to have the following explained to me, because I think it’s fundamental to opt out. This is detailed, I warn you now, but I think really important:

How does the system use the Opt out?

If you imagine different users looking at the same item of data in any one record, let’s say prescribing history, then it’s the security role and how the opt out codes work which will determine who gets to see what.



I assume here that there are not multiple copies of “my medications” in my record. The whole point of giant databases is real-time, synchronised data, so “my medications” will not be stored in one place in the Summary Care Record (SCR), copied again in care.data, and a third time in my Electronic Prescription Service (EPS) record. There will be one place in which “my medications” is recorded.


The label under which a user can see that data of mine is their security role; to me that is largely irrelevant, except for opt out.


I have questions: if I opt out of the SCR programme at my GP, but opt in to the EPS at my pharmacy, what have I opted in to? Who has permission to view “my medications” in my core record now? Have I created, in effect, an SCR without realising it?


[I realise these are detailed questions, but ones we need to ask if we are to understand and inform our decision, especially if we have responsibility for the care of others.]


If I want to permit the use of my record for direct care (SCR) but not secondary uses (care.data), how do the two opt outs work together, and what about my other hospital information?


Do we understand what we have and have not given permission for, and to whom?
If there’s only one record but multiple layers of user access to it, how will those layers be built, and where is the overlap?
We should ask these questions on behalf of others, because these under-represented groups and minorities cannot, if they are not in the room.
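The layering question above can be made concrete with a toy sketch. Everything in it is hypothetical: the role names, programme flags and the one-record-many-views design are my assumptions for illustration, not the actual NHS architecture. It shows how, if there is a single stored record, each programme opt-out only gates the roles that read through that programme, which is why one opt-in (EPS) can expose the very field another opt-out (SCR) was meant to withhold.

```python
# Hypothetical sketch of role-based access over one canonical record,
# with per-programme opt-outs. Role and programme names are invented
# for illustration only; this is not the real NHS system design.

# One canonical record: "my medications" is stored exactly once.
RECORD = {"patient": "X", "medications": ["drug A", "drug B"]}

# The patient's opt-out choices, one flag per programme.
OPT_OUTS = {"X": {"SCR": True, "care.data": True, "EPS": False}}

# Which programme each security role reads the record through.
ROLE_PROGRAMME = {
    "a&e_clinician": "SCR",
    "pharmacist": "EPS",
    "researcher": "care.data",
}

def can_view_medications(role: str, patient: str) -> bool:
    """A role sees the single stored record only if the patient has
    not opted out of the programme that role reads through."""
    programme = ROLE_PROGRAMME[role]
    return not OPT_OUTS[patient][programme]

# The overlap the questions above are asking about: the SCR opt-out
# blocks one route to "my medications", but the EPS opt-in reopens
# another route to the same single field.
print(can_view_medications("a&e_clinician", "X"))  # prints False
print(can_view_medications("pharmacist", "X"))     # prints True
print(can_view_medications("researcher", "X"))     # prints False
```

If something like this layering is how it works, then each opt-out is only as meaningful as the map of roles to programmes, and that map is exactly what has not been published or explained.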

Sometimes we all need privacy. What is it worth?

Individuals and minorities in our community may feel strongly about maintaining privacy, for reasons of discrimination or of being ‘found out’ through a system which can trace them; for reasons of fear. Others can’t always see the reasons for it, but that doesn’t take away from the value it has for the person who wants it, or their need for that human right to be respected. How much is it worth?

It seems the more we value keeping data private, the more cash value it has for others. In 2013, the FT created a nifty calculator and, in an interview with Dave Morgan, reckoned our individual data is worth less than $1. General details such as age, gender and location are in the many-decimal-place range of fractions of a cent. The more interesting your life events, the more you add to your data’s total value. Take pregnancy as an example. Or if you add genomic data, it goes up in market value again.

Whilst this data may, on a spreadsheet, be no more than a dollar amount, in real life it may have immeasurably greater value to us, on which you cannot put a price tag. It may be part of our life we do not wish others to see into. We may have personal or medical data, or recorded experiences, we simply do not want to share with anyone but our GP. We might want a layered option like this suggestion by medConfidential to allow some uses but not others. [6]

In this debate it is rare that we mention the PDS (Personal Demographic Service), which holds the name and core contact details of every person with an NHS number, past and present: almost 80 million. This is what can compromise privacy: a patient can be looked up by any A&E, by anyone with Summary Care Record access on N3 and the technical ability to do so. It is a weak link. The security system relies on human validation, effectively asking in audit, ‘does this look-up seem OK?’ These things happen, and can go unchecked for a long period without being traced.

Systems and processes on this scale need security designed in that scales up to match them in size.

Can data be included without cutting out privacy?

Will the richness of GP record / care.data datasharing afford these individuals the level of privacy they want? If properly anonymised, it would go some way towards permitting these groups to feel they could stay opted in, and the data quality and completeness would be better. But as it stands, they may feel the risks created by removing their privacy are too great. The breadth and data quality of care.data will suffer as a consequence.

The requirement of care.data to share identifiable information we may not want to share, the assumed right of others to do so, and the assumed exploitation of it for the benefit of UK plc, especially if an opt-out system proceeds, feel to many like an invasion of the individual’s privacy and right to confidentiality. It can have real personal consequences for the individual.

The right to be open, honest and trusting without fear of repercussion matters. It matters to a traveller, or to someone fleeing domestic violence with fears of being traced. It matters to someone who is transgender, and to others who want to live without stigma. It matters to our young people.

The BMA recognised this with their vote for an opt-in system earlier this year. 

Quality & Confidence by Design

My favourite exhibition piece at Tate Britain is still Barbara Hepworth’s [3] Pelagos from 1946. It is well reviewed artistically, but even if you know little of art, it is simply a beautiful thing to see. (You’re not allowed to touch, even though it makes you want to.) Carved from a single piece of wood, designed with movement, shape, colour and shadow, it contains a section of strings, a symbol of interconnectivity. (Barbara Hepworth: Pelagos [4]). Seen as a precious and valuable collection, the Hepworth room has its own guard and solid walls. As much as I would have liked to take pictures, photography was not permitted and the natural light was too low. Visitors must respect that.

So too I see the system design needs of good tech: set in, and produced in, a changing landscape. Designed with the completed view in mind, fully designed before the build begins, but with flexibility built in. Planned interconnectivity. Precise and professional. Accurate. And the ability to see the whole from the start. Once finished, it is kept securely, with physical as well as system-designed security features.

All these are attributes which care.data failed to present from its conception, but which appear to be in development through the Health and Social Care Information Centre. Plans are in progress [6] following the Partridge Review, and were released on September 3rd with forward-looking dates. For example, a first wave of audits is scheduled for completion on 1/09 for four organisations. HSCIC will ‘pursue a technical solution to allow data access, w/out need to release data out to external orgs. Due 30/11.’ These steps are playing catch-up with what should have been good governance practices and procedures in the past. It need not be this way for GP care.data, if we get the design right from the start.

As I raised on Saturday, at the September 6th advisory workshop, and as others will no doubt have done before me, this designing from the start matters. Designing for change of scope, and incorporating that into the communications process for the future, is vital for the pathfinders. One thing will be certain for pathfinder practices: there will be future changes.

This wave of care.data is only one step along a broad and long data sharing path

To be the best of its kind, care.data must create confidence by design, building in the solutions to all these questions which have been, and continue to be, asked. We should be able to see today the plans for what care.data is intended to be when finished, and design the best practices into the structure from the start. Scope is still a large part of that open question: scope content, future plans, and how the future project will manage its change processes.

As with Matisse, we must ask the designers, planners and comms/intelligence and PR teams: please think ahead, “anticipating things to come”. Then we can be confident that we’ve something fit for the time we’re in, and for all of our kids’ futures. Whether they’ll be travellers, trans, have disabilities, be in care or not. For our majority and all our minorities. We need to build a system that serves all of the society we want to see, not only the ‘easy-to-reach’ parts.

“Anticipating things to come” can mean anticipating problems early, so that costly mistakes can be avoided.

Anticipating the future

One must keep looking to design not for the ‘now’ but for tomorrow. Management of future change, scope and communication is vital to get right.

This is as much a change process as a technical implementation project. In fact, it is perhaps more about the transformation, as it is called at NHS England, than the technology. The NHS landscape is changing: who will deliver our healthcare, and how, as telecare and ever more apps are rolled out. Nothing is constant but change. How do we ensure everyone involved in top-down IT projects understands that the system supports, but does not drive, change? Change is about process and people. The system is a tool to enable people. The system is not the goal.

We need to work today to be ahead of the next step for the future. We must ensure that processes and technology, the way we do things and the tools that enable what we do, design the very best practices into the whole, from the very beginning. From the ground up. Taking into account the fair processing requirements of data protection law, the upcoming changes in EU data protection law, and best practice. Don’t rush to bend a future law in current design, or take a short cut in security for the sake of speed. Those best practices need not cut out the good ethics of consent and confidentiality. They can co-exist with world class research and data management. They just need to be included by design, not tacked on and superficially rearranged afterwards.

So here’s my set of challenge scenarios for NHS England to answer.

1. The integration of health and social care marches on apace, and systems and their users are to follow suit. How is NHS England ensuring it builds a system and processes which ‘anticipate by design’ these new models of data management for this type of care delivery, rather than staying stuck on the top-down mass surveillance database model planned for the last decade?

2. How will NHS England audit that a system check does not replace qualified staff decisions, with algorithms and flags on a social care record, for example? I fear that a risk-averse system will encourage staff to be less likely to make a decision that goes against the system recommendation, ‘for child removal’ for example, even though their judgement, based on human experience, may suggest a different outcome. What are the assumed outcomes built into the system? If you view the new social care promotional videos, at least it’s pretty consistent: the most depressing stereotyped scenarios I’ve seen anywhere, I think. How will this increase in data and sharing work?

“What makes more data by volume, equal more intelligence by default?”

Just as out-of-hours GP call centres today send too many 111 callers to A&E, I wonder if a highly systemised social care system risks sending too many children from A&E into social care. Children who should not be there, but who meet the criteria set by insensitive algorithms; or the converse risk, children who don’t meet them, and get missed through over-reliance on a system, missing what an experienced professional can spot.

3. How will the users of the system use their system data, and how has it been tested and its likely outcomes measured against current data? I.e. will more, or fewer, children taken into care be seen as a measure of success? How will any system sharing be audited, in governance and with what oversight, in future?

Children’s social care is not a system that is doing well as it stands today, by many accounts; you need only glance at the news most days. But integration will change how it delivers services for the needs of our young people. It is an example we can apply in many other cases.

What plan is in place to manage these changes of process and system use? Where is public transparency?

care.data has to build in consent, security and transparency from the start, because it’s a long journey ahead, as data is to be added incrementally over time. As our NHS and social care organisational models are changing, how are we ensuring confidentiality and quality built-in-by-design to our new health and social care data sharing processes?

What is set up now, must be set up fit for the future.

Tacking things on afterwards means lowering your chance of success.

Matisse knew: “anticipating things to come” can mean being positively in step with the future by the time it is needed. By anticipating problems early, costly mistakes can be avoided.

*****

Immediate information and support for women experiencing domestic violence: National Domestic Violence, Freephone Helpline 0808 2000 247

*****

[1] Interested in a glimpse into the Matisse exhibition which has now closed? Check out this film.

[2] Previous post: My six month pause round up [part one] https://jenpersson.com/care-data-pause-six-months-on/

[3] Privacy and Prejudice: http://www.raeng.org.uk/publications/reports/privacy-and-prejudice-views This study was conducted by The Royal Academy of Engineering (the Academy) and Laura Grant Associates and was made possible by a partnership with the YTouring Theatre Company, support from Central YMCA, and funding from the Wellcome Trust and three of the Research Councils (Engineering and Physical and Sciences Research Council; Economic and Social Research Council and Medical Research Council).

[4]  Barbara Hepworth – Pelagos – in Prospect Magazine

[5] Questions remain open on how opt out works with identifiable vs pseudonymous data sharing requirement and what the objection really offers. [ref: Article by Tim Kelsey in Prospect Magazine 2009 “Long Live the Database State.”]
[6] HSCIC current actions published with Board minutes
[8] NIB https://app.box.com/s/aq33ejw29tp34i99moam/1/2236557895/19347602687/1
*****

More information about the Advisory Group is here: http://www.england.nhs.uk/ourwork/tsd/ad-grp/

More about the care.data programme here at HSCIC – there is an NHS England site too, but I think the HSCIC is cleaner and more useful: http://www.hscic.gov.uk/article/3525/Caredata