Category Archives: communications

Data-Driven Responses to COVID-19: Lessons Learned OMDDAC event

A slightly longer version of a talk I gave at the launch event of the OMDDAC Data-Driven Responses to COVID-19: Lessons Learned report on October 13, 2021. I was asked to respond to the findings presented on Young People, Covid-19 and Data-Driven Decision-Making by Dr Claire Bessant at Northumbria Law School.

[ ] indicates text I omitted for reasons of time on the day.

Their final report is now available to download from the website.

You can also watch the full event here via YouTube. The part on young people, presented by Claire and which I follow, is at the start.

—————————————————–

I’m really pleased to congratulate Claire and her colleagues today at OMDDAC and hope that policy makers will recognise the value of this work and it will influence change.

I will reiterate three things they found or included in their work.

  1. Young people want to be heard.
  2. Young people’s views on data and trust include concerns about conflated data purposes.
  3. The concept of being “data driven under COVID conditions”.

This OMDDAC work, together with Investing in Children, is very timely as a rapid response, but I think it is also important to set it in context and recognise that some of its significance lies in reflecting a continuum of similar findings over time, largely unaffected by the pandemic.

Claire’s work comprehensively backs up the consistent findings of over ten years of public engagement, including with young people.

The 2010 study with young people, conducted by The Royal Academy of Engineering and supported by three Research Councils and Wellcome, discussed attitudes towards the use of medical records and concluded: “These questions and concerns must be addressed by policy makers, regulators, developers and engineers before progressing with the design and implementation of record keeping systems and the linking of any databases.”

In 2014, the House of Commons Science and Technology Committee, in their report Responsible Use of Data, said the Government has a clear responsibility to explain to the public how personal data is being used.

The same Committee’s Big Data Dilemma 2015-16 report (p9) concluded: “data (some collected many years before and no longer with a clear consent trail) […] is unsatisfactory left unaddressed by Government and without a clear public-policy position.”

Or see the 2014 Royal Statistical Society and Ipsos MORI work on the data trust deficit with lessons for policymakers, DotEveryone’s 2019 work on Public Attitudes, or the ICO’s 2020 Annual Track survey results.

There is also a growing body of literature demonstrating the implications of being a ‘data driven’ society for the datafied child, as described by Deborah Lupton and Ben Williamson in their own research in 2017.

[This year our own work with young people, published in our report on data metaphors “the words we use in data policy”, found that young people want institutions to stop treating data about them as a commodity and start respecting data as extracts from the stories of their lives.]

The UK government and policy makers are simply ignoring the inconvenient truth that legislation and governance frameworks that exist today, such as the UN General Comment No. 25 on Children in the Digital Environment, demand that people know what is done with data about them, and must be applied to address children’s right to be heard and to enable them to exercise their data rights.

The public perceptions study within this new OMDDAC work shows that it is not only the views of children and young people that are being ignored, but those of adults too.

And perhaps it is worth reflecting here that people often don’t think about all this in terms of data rights and data protection, but rather in terms of human rights and protections for the human being from uses of data that give other people power over our lives.

This project found that young people’s trust in the use of their confidential personal data was affected by understanding who would use the data and why, and how people would be protected from prejudice and discrimination.

We could build easy-reporting mechanisms at public points of contact with state institutions, in education, in social care, in welfare and policing, to produce on-demand reports of the information held about me and to enable corrections. It would benefit institutions by giving them more accurate data, and make them more trustworthy if people can see “here’s what you hold on me and here’s what you did with it”.

Instead, we’re going in the opposite direction. New government proposals suggest making that process harder, by charging for Subject Access Requests.

This research shows that current policy is not what young people want. People want the ability to choose between granular levels of control over the data that is shared. They value autonomy and control; knowing who will have access; records accuracy; how people will be kept informed of changes; who will maintain and regulate the database; data security; anonymisation; and having their views listened to.

Young people also fear the power of data to speak for them: that data about them are taken at face value and listened to by those in authority more than the child’s own voice.

What do these findings mean for public policy? Without respect for what people want, and for fundamental human rights and freedoms for all, there is no social licence for data policies.

Whether it’s confidential GP records or the school census expansion in 2016, when public trust collapses so does your data collection.

Yet the government stubbornly refuses to learn and seems to believe it’s all a communications issue, a bit like the ‘Yes Minister’ English approach to foreigners when they don’t understand: just shout louder.

No, this research shows data policy failures are not fixed by, “communicate the benefits”.

Nor is it fixed by changing Data Protection law. As a comment in the report says, UK data protection law offers a “how-to” not a “don’t-do”.

Data protection law is designed to enable data flows. But that can mean that, where state data processing rightly avoids relying on consent as its lawful basis in data protection terms, the data use is nonetheless not consensual.

[For the sake of time, I didn’t include the thought in the next two paragraphs in the talk, but I think it is important to mention that in our own work we find this contradiction is not lost on young people. Against the backdrop of the efforts after the MeToo movement, and all that Ministers in Education and at the DCMS said about the Everyone’s Invited work earlier this year to champion consent in the relationships, sex and health education (RSHE) curriculum, adults in authority keep saying consent matters but don’t demonstrate it, and when it comes to data, use people’s data in ways they do not want.

The report picks up that young people, and disproportionately those communities that experience harm from authorities, mistrust data sharing with the police. This is now set against the backdrop of not only the recent Wayne Couzens case, but a series of very public misuses of police power, including COVID powers.]

The data powers used “under COVID conditions” are now being used as cover for an attack on future data protections. The DCMS consultation on changing UK data protection law, open until November 19th, suggests that the similarly reduced protections on data distribution used in the emergency should become the norm. While data protection law is written expressly to permit things that are out of the ordinary in extraordinary circumstances, such measures are limited in time. The government is proposing that some things found convenient to do under COVID now become commonplace.

The proposals include things such as removing Article 22 from the UK GDPR, with its protections for people in processes involving automated decision-making.

Young people were those who felt first-hand the risks and harms of those processes in the summer of 2020, and the “mutant algorithm” is something this Observatory Report work also addressed in its research. Again, it found young people felt left out of those decisions about them, despite being the group that would feel the negative effects.

[Data protection law may be enabling increased lawful data distribution across the public sector, but it is not offering people, including young people, the protections they expect of their human right to privacy. We are on a dangerous trajectory for public interest research and for society, if the “new direction” this government goes in, for data and digital policy and practice, goes against prevailing public attitudes and undermines fundamental human rights and freedoms.]

The risks and benefits of the power obtained from the use of admin data are felt disproportionately across different communities, including children, who are not a one-size-fits-all, homogeneous group.

[While views across groups will differ — and we must be careful to understand any popular context at any point in time on a single issue and unconscious bias in and between groups — policy must recognise where there are consistent findings across this research with that which has gone before it. There are red lines about data re-uses, especially on conflated purposes using the same data once collected by different people, like commercial re-use or sharing (health) data with police.]

The golden thread that runs through time and across different sectors’ data use is the set of legal frameworks, underpinned by democratic mandates, that uphold our human rights.

I hope the powers-that-be in the DCMS consultation, and wider policy makers in data and digital policy, take this work seriously and not only listen, but act on its recommendations.


2024 updates: opening paragraph edited to add current links.
A chapter written by Rachel Allsopp and Claire Bessant discussing OMDDAC’s research with children will also be published on 21st May 2024 in Governance, democracy and ethics in crisis-decision-making: The pandemic and beyond (Manchester University Press), as part of its Pandemic and Beyond series (https://manchesteruniversitypress.co.uk/9781526180049/), and an article discussing the research in the open access European Journal of Law and Technology is available at https://www.ejlt.org/index.php/ejlt/article/view/872.

Damage that may last a generation.

Hosted by the Mental Health Foundation, it’s Mental Health Awareness Week until 24th May, 2020. The theme for 2020 is ‘kindness’.

So let’s not comment on the former Education Ministers and MPs, the great-and-the-good and the recently-resigned, involved in the Mail’s continued hatchet job on teachers. They probably believe that they are standing up for vulnerable children when they talk about the “damage that may last a generation”. Yet the evidence of much of their voting, and of their policy design to date, suggests it’s much more about getting people back to work.

Of course there are massive implications for children in families unable to work or living with the stress of financial insecurity on top of limited home schooling. But policy makers should be honest about the return to school as an economic lever, not use children’s vulnerability to pressure professionals to return to full-school early, or make up statistics to up the stakes.

The rush to get back to full-school for the youngest of primary-age pupils has been met with understandable resistance, and too few practical facts. Going back to school under COVID-19 measures will, for very young children, take tonnes of adjustment: to the virus, to seeing friends they cannot properly play with, to grief and stress.

When it comes to COVID-19 risk, many countries with similar population density to the UK locked down earlier and tighter, and now have lower rates of community transmission than we do. Or compare a country that didn’t, Sweden, which has a population density of 24 people per square kilometre. The population density of the United Kingdom is 274 people per square kilometre. In Italy, with 201 inhabitants per square kilometre, you needed a permission slip to leave home.

And that’s leaving aside the unknowns on COVID-19 immunity, or identifying it, or the lack of a testing offer for over a million children under five, the very group expected to return first to full-school.

Children have rights to education, and to life, survival and development. But the blanket target groups and target date don’t appear to take the Best Interests of the Child, for each child, into account at all. ‘Won’t someone think of the children?’ may never have been more apt.

Parenting while poor is highly political

What’s the messaging in the debate, even leaving media extremes aside?

The sweeping assumption by many commentators that ‘the poorest children will have learned nothing‘ (BBC Newsnight, May 19) is unfair, but its blind acceptance as fact, a politicisation of parenting while poor conflated with poor parenting, enables the claimed concern for their vulnerability to pass without question.

Many of these most vulnerable children were not receiving full-time education *before* the pandemic, but look at how the story is told.

It would be more honest, when discussing or publishing ‘statistics’ around the growing gap expected if children are out of school, to consider what the ‘excess’ gap will be and why (just like measuring excess deaths, not only those people who died and had been tested for COVID-19). Thousands of vulnerable children were out of school already, due to budget decisions that had left local authorities unable to fulfil their legal obligation to provide education.

Pupil Referral Units were labelled “a scandal” in 2012, and only last year the constant “gangs at the gates” narrative was highly political.

“The St Giles Trust research provided more soundbites. Pupils involved in ‘county lines’ are in pupil referral units (PRUs), often doing only an hour each day, and rarely returning into mainstream education.” (Steve Howell, Schools Week)

Nearly ten years on, there is still a lack of adequate support for children in Alternative Provision, and a destructive narrative of “us versus them”.

Source: @sarahkendzior

The value of being in school

Schools have remained open for children of key workers and for more than half a million pupils labelled as ‘vulnerable’, which includes those classified as “children in need” as well as 270,000 children with an education, health and care (EHC) plan for special educational needs. Not all of those are ‘at risk’ of domestic violence, abuse or neglect. The reasons for low turnout tend to be conflated.

Assumptions abound about the importance of formal education, and about school being the best place at all for very young children in the Early Years (age 2-5), despite conflicting UK evidence that is thin on the ground. Research for the NFER [the same organisation running the upcoming Baseline Test of four-year-olds, still due to begin this year] (Sharp, 2002) found:

“there would appear to be no compelling educational rationale for a statutory school age of five or for the practice of admitting four-year-olds to school reception classes.” And “a late start appears to have no adverse effect on children’s progress.”

Later research from the IoE, Research Report No. DCSF-RR061 (Sylva et al., 2008), commissioned before the then ‘new’ UK Government took office in 2010, suggested better outcomes for children who are in excellent Early Years provision, but also pointed out that the most vulnerable are more often not those in the best provision.

“quality appears to be especially important for disadvantaged groups.”

What will provision quality be like, under Coronavirus measures? How much stress-free space and time for learning will be left at all?

The questions we should be asking are: a) what has been learned for the second wave? and b) assuming nothing changes by May 2021, what would ideal schooling look like, and how do we get there?

Attainment is not the only gap

While it is not compulsory in England to be in any form of education, including home education, until your fifth birthday, most children start school at age four and turn five in the course of the year. It is one of the youngest starts in Europe. Many hundreds of thousands of children in the UK start formal education even younger, from age two or three. Yet is it truly better for children? We are way down the PISA attainment scores, and comparable regional measures. There has been little change in those outcomes in 13 years, except to find that our children are measured as being progressively less happy.

“As Education Datalab points out, the PISA 2018 cohort started school around 2008, so their period at school not only lines up with the age of austerity and government cuts, but with the “significant reforms” to GCSEs introduced by Michael Gove while he was Education Secretary.”  [source: Schools Week, 2019]

There’s no doubt that some of the harmful economic effects of Brexit will be attributed to the effects of the pandemic. Similarly, many of the outcomes of ten years of policy that have increased children’s vulnerability and the attainment gap, pre-COVID-19, will no doubt be conflated with harms from this crisis in the next few years.

The risk of accepting this misattribution of the gap in outcomes is a willingness to adopt misguided solutions, and to deny accountability.

Children’s vulnerability

Many experts in children’s needs have been in their jobs much longer than most MPs, and have told them for years about the harm their policies are doing to the very children those voices now claim to want to protect. Will the MPs look at that evidence and act on it?

More than a third of babies are living below the poverty line in the UK. The common thread in many [UK] families’ lives, as Helen Barnard, deputy director for policy and partnerships at the Joseph Rowntree Foundation, described in 2019, is “a rising tide of work poverty sweeping across the country.” Now the Coronavirus is hitting those families harder too. The ONS found that in England the death rate in the most deprived areas is 118% higher than in the least deprived.

Charities speaking out this week said that in the decade since 2010, local authority spending on early intervention services dropped by 46%, while spending on late intervention rose from 58% to 78% of spending on children and young people’s services over the same period.

If those advocating for a return to school, for a month before the summer, really want to reduce children’s vulnerability, they might sort out CAMHS to simultaneously support the return to school, and address those areas in which government must first do no harm. Fix these things that increase the “damage that may last a generation”.


Case studies in damage that may last

Adoption and Children (Coronavirus) (Amendment) Regulations 2020

Source: Children’s Commissioner (April 2020)

“These regulations make significant temporary changes to the protections given in law to some of the most vulnerable children in the country – those living in care.” “I would like to see all the regulations revoked, as I do not believe that there is sufficient justification to introduce them. This crisis must not remove protections from extremely vulnerable children, particularly as they are even more vulnerable at this time. As an urgent priority it is essential that the most concerning changes detailed above are reversed.”

CAMHS: Mental health support

Source: Local Government Association CAMHS Facts and Figures

“Specialist services are turning away one in four of the children referred to them by their GPs or teachers for treatment. More than 338,000 children were referred to CAMHS in 2017, but less than a third received treatment within the year. Around 75 per cent of young people experiencing a mental health problem are forced to wait so long their condition gets worse or are unable to access any treatment at all.”

“Only 6.7 per cent of mental health spending goes to children and adolescent mental health services (CAMHS). Government funding for the Early Intervention Grant has been cut by almost £500 million since 2013. It is projected to drop by a further £183 million by 2020.”

“Public health funding, which funds school nurses and public mental health services, has been reduced by £600 million from 2015/16 to 2019/20.”

Child benefit two-child limit

Source: Child Poverty Action Group, May 5
“You could not design a policy better to increase child poverty than this one.” (source: HC51, House of Commons Work and Pensions Committee, The two-child limit, Third Report of Session 2019 (PDF, 1 MB))

“Around sixty thousand families forced to claim universal credit since mid-March because of COVID-19 will discover that they will not get the support their family needs because of the controversial ‘two-child policy’.”

Housing benefit

Source: the Poverty and Social Exclusion in the United Kingdom research project funded by the Economic and Social Research Council.

“The cuts [introduced from 2010 to the 2012 budget] in housing benefit will adversely affect some of the most disadvantaged groups in society and are likely to lead to an increase in homelessness, warns the homeless charity Crisis.”

Legal Aid for all children

Source: The Children’s Society, Cut Off From Justice, 2017

“The enactment of the Legal Aid, Punishment and Sentencing of Offenders Act 2012 (LASPO) has had widespread consequences for the provision of legal aid in the UK. One key feature of the new scheme, of particular importance to The Children’s Society, were the changes made to the eligibility criteria around legal aid for immigration cases. These changes saw unaccompanied and separated children removed from scope for legal aid unless their claim is for asylum, or if they have been identified as victims of child trafficking.”

“To fulfill its obligations under the UNCRC, the Government should reinstate legal aid for all unaccompanied and separated migrant children in matters of immigration by bringing it back within ‘scope’ under the Legal Aid, Sentencing and Punishment of Offenders Act 2012. Separated and unaccompanied children are super-vulnerable.”

Library services

Source: CIPFA’s annual library survey 2018

“the number of public libraries and paid staff fall every year since 2010, with spending reduced by 12% in Britain in the last four years.” “We can view libraries as a bit of a canary in the coal mine for what is happening across the local government sector…” “There really needs to be some honest conversations about the direction of travel of our councils and what their role is, as the funding gap will continue to exacerbate these issues.”

No recourse to public funds: FSM and more

source: NRPF Network
“No recourse to public funds (NRPF) is a condition imposed on someone due to their immigration status. Section 115 Immigration and Asylum Act 1999 states that a person will have ‘no recourse to public funds’ if they are ‘subject to immigration control’.”

“children only get the opportunity to apply for free school meals if their parents already receive certain benefits. This means that families who cannot access these benefits– because they have what is known as “no recourse to public funds” as a part of their immigration status– are left out from free school meal provision in England.”

Sure Start

Source: Institute for Fiscal Studies (2019)

“the reduction in hospitalisations at ages 5–11 saves the NHS approximately £5 million, about 0.4% of average annual spending on Sure Start. But the types of hospitalisations avoided – especially those for injuries – also have big lifetime costs both for the individual and the public purse”.

Youth Services

Source: Barnardo’s (2019) New research draws link between youth service cuts and rising knife crime.

“Figures obtained by the All-Party Parliamentary Group (APPG) on Knife Crime show the average council has cut real-terms spending on youth services by 40% over the past three years. Some local authorities have reduced their spending – which funds services such as youth clubs and youth workers – by 91%.”

Barnardo’s Chief Executive Javed Khan said:

“These figures are alarming but sadly unsurprising. Taking away youth workers and safe spaces in the community contributes to a ‘poverty of hope’ among young people who see little or no chance of a positive future.”

The illusion that might cheat us: ethical data science vision and practice

This blog post is also available as an audio file on soundcloud.


Anaïs Nin wrote in her 1946 diary of the dangers she saw in the growth of technology: expanding our potential for connectivity through machines, but diminishing our genuine connectedness as people. She could hardly have been more contemporary for today:

“This is the illusion that might cheat us of being in touch deeply with the one breathing next to us. The dangerous time when mechanical voices, radios, telephone, take the place of human intimacies, and the concept of being in touch with millions brings a greater and greater poverty in intimacy and human vision.”
[Extract from volume IV 1944-1947]

Echoes from over 70 years ago can be heard in the more recent comments of entrepreneur Elon Musk. Both are concerned with simulation, a lack of connection between the perceived and reality, and the jeopardy this presents for humanity. But both also have a dream. A dream based on the positive potential society has.

How will we use our potential?

Data is the connection between us as humans and what machines and their masters know about us. The values with which those masters underpin their machine design will determine the effect that the machines, and the knowledge they deliver, have on society.

In seeking ever greater personalisation, a wider dragnet of data is putting together ever more detailed pieces of information about an individual person. At the same time data science is becoming ever more impersonal in how we treat people as individuals. We risk losing sight of how we respect and treat the very people whom the work should benefit.

Nin grasped the risk that a wider reach can mean more superficial depth. Facebook might be a model today for the large circle of friends you might gather, but how few you trust with confidences, with personal knowledge about your own personal life, and what a privilege it is when someone chooses to entrust that knowledge to you. Machine data mining increasingly tries to get an understanding of depth, and may also add new layers of meaning through profiling, comparing our characteristics with others in risk stratification.

Data science, research using data, is often talked about as if it were something separate from using information from individual people. Yet it is all about exploiting those confidences.

Today, as the reach of what it is possible for a few people in institutions to gather about most people in the public has grown, whether in scientific research or in surveillance of different kinds, we hear experts repeatedly talk of the risk of losing the valuable part: the knowledge and the insights that benefit us as a society if we can act upon them.

We might know more, but do we know any better? To use a well-known quote from her contemporary, T. S. Eliot: ‘Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?’

What can humans achieve? We don’t yet know our own limits. What don’t we yet know?  We have future priorities we aren’t yet aware of.

To be able to explore the best of what Nin saw as ‘human vision’ and Musk sees in technology, the benefits we have from our connectivity, our collaboration and shared learning need to be driven with an element of humility, accepting values that shape the boundaries of what we should do, while constantly evolving what we could do.

The essence of this applied risk is that technology could harm you, more than it helps you. How do we avoid this and develop instead the best of what human vision makes possible? Can we also exceed our own expectations of today, to advance in moral progress?


OkCupid and Google DeepMind: Happily ever after? Purposes and ethics in datasharing

This blog post is also available as an audio file on soundcloud.


What constitutes the public interest must be set in a universally fair and transparent ethics framework if the benefits of research are to be realised, whether in social science, health, education and more. That framework will provide a strategy for getting the prerequisite success factors right, ensuring research in the public interest is not only fit for the future, but thrives. There has been a climate change in consent. We need to stop talking about barriers that prevent datasharing and start talking about the boundaries within which we can share.

What is the purpose for which I provide my personal data?

‘We use math to get you dates’, says OkCupid’s tagline.

That’s the purpose of the site. It’s the reason people log in and create a profile, enter their personal data and post it online for others who are looking for dates to see. The purpose, is to get a date.

When over 68K OkCupid users registered for the site to find dates, they didn’t sign up to have their identifiable data used and published in ‘a very large dataset’ and onwardly re-used by anyone with unregistered access. The users’ data were extracted “without the express prior consent of the user […].”

Whether the registration consent purposes are compatible with the purposes to which the researcher put the data should be a simple enough question. Are the research purposes what the person signed up to, or would they be surprised to find out their data were used like this?

Questions the “OkCupid data snatcher”, now self-confessed ‘non-academic’ researcher, thought unimportant to consider.

But it appears in the last month, he has been in good company.

Google DeepMind and the Royal Free, big players who should know how to handle data and consent well, paid too little attention to the very same question of purposes.

The boundaries of how the users of OkCupid had chosen to reveal information and to whom, have not been respected in this project.

Nor were these boundaries respected by the Royal Free London trust that gave out patient data for use by Google DeepMind with changing explanations, without clear purposes or permission.

The legal boundaries in these recent stories appear unclear or to have been ignored. The privacy boundaries deemed irrelevant. Regulatory oversight lacking.

The respectful ethical boundaries of consent to purposes have indisputably broken down, with autonomy disregarded, whether by a commercial organisation, a public body, or a lone ‘researcher’.

Research purposes

The crux of data access decisions is purposes. What question is the research to address – what is the purpose for which the data will be used? The intent by Kirkegaard was to test:

“the relationship of cognitive ability to religious beliefs and political interest/participation…”

In this case the question appears intended rather as a test of the data, not the data opened up to answer the question. While methodological studies matter, given the care and attention [or self-stated lack thereof] given to its extraction and any attempt to be representative and fair, it would appear this is not the point of this study either.

The data doesn’t include profiles identified as heterosexual male, because ‘the scraper was’. It is also unknown how many users hide their profiles, “so the 99.7% figure [identifying as binary male or female] should be cautiously interpreted.”

“Furthermore, due to the way we sampled the data from the site, it is not even representative of the users on the site, because users who answered more questions are overrepresented.” [sic]

The paper goes on to say photos were not gathered because they would have taken up a lot of storage space and could be done in a future scraping, and

“other data were not collected because we forgot to include them in the scraper.”

The data are knowingly of poor quality, inaccurate and incomplete. The project cannot be repeated as ‘the scraping tool no longer works’. There is an unclear ethical or peer review process, and the research purpose is at best unclear. We can certainly give someone the benefit of the doubt and assume the intent was benign; it’s not clear what the intent was. I think it is clearly misplaced and foolish, but not malevolent.

The trouble is, it’s not enough to say, “don’t be evil.” These actions have consequences.

When the researcher asserts in his paper that, “the lack of data sharing probably slows down the progress of science immensely because other researchers would use the data if they could,”  in part he is right.

Google and the Royal Free have tried more eloquently to say the same thing. It’s not research, it’s direct care; in effect: ignore that people are no longer our patients and that we’re using historical data without re-consent. We know what we’re doing, we’re the good guys.

However the principles are the same, whether it’s a lone project or global giant. And they’re both wildly wrong as well. More people must take this on board. It’s the reason the public interest needs the Dame Fiona Caldicott review published sooner rather than later.

Just because there is a boundary to data sharing in place, does not mean it is a barrier to be ignored or overcome. Like the registration step to the OkCupid site, consent and the right to opt out of medical research in England and Wales is there for a reason.

We’re desperate to build public trust in UK research right now. So to assert that the lack of data sharing probably slows down the progress of science is misplaced, when it is getting ‘sharing’ wrong that caused the lack of trust in the first place and that harms research.

A climate change in consent

There has been a climate change in public attitude to consent since care.data, clouded by the smoke and mirrors of state surveillance. It cannot be ignored. The EU GDPR supports it. Researchers may not like change, but there needs to be a corresponding adjustment in expectations and practice.

Without change, there will be no change. Public trust is low. As technology advances and if we continue to see commercial companies get this wrong, we will continue to see public trust falter unless broken things get fixed. Change is possible for the better. But it has to come from companies, institutions, and people within them.

Like climate change, you may deny it if you choose to. But some things are inevitable and unavoidably true.

There is strong support for public interest research but that is not to be taken for granted. Public bodies should defend research from being sunk by commercial misappropriation if they want to future-proof public interest research.

The purposes for which people gave consent are the boundaries within which you have permission to use data; they give you freedom, within their limits, to use the data. Purposes and consent are not barriers to be overcome.

If research is to win back public trust, developing a future-proofed, robust ethical framework for data science must be a priority today.

Commercial companies must overcome the low levels of public trust they have generated to date if they ask us to ‘trust us because we’re not evil‘. If you can’t rule out the use of data for other purposes, it’s not helping. If you delay independent oversight, it’s not helping.

This case study, and indeed the recent Google DeepMind episode by contrast, demonstrate the urgency with which working out common expectations and oversight of applied ethics in research, deciding who gets to decide what is ‘in the public interest’, and public engagement on data science must be made a priority, in the UK and beyond.

Boundaries in the best interest of the subject and the user

Society needs research in the public interest. We need good decisions made on what will be funded and what will not be, on what will influence public policy, and on where attention is needed for change.

To do this ethically, we all need to agree what is fair use of personal data, when data are closed and when open, what are direct and what are secondary uses, and how advances in technology are used when they present both opportunities for benefit and risks of harm to individuals, to society and to research as a whole.

The benefits of research are potentially being compromised for the sake of arrogance, greed, or misjudgement, no matter the intent. Those benefits cannot come at any cost, or disregard public concern, or the price will be trust in all research itself.

In discussing this with social science and medical researchers, I realise not everyone agrees. For some, using deidentified data in trusted third-party settings poses such a low privacy risk that they feel the public should have no say in whether their data are used in research, as long as it’s ‘in the public interest’.

The DeepMind researchers and the Royal Free were confident that, even using identifiable data, this is the “right” thing to do, without consent.

For the Cabinet Office datasharing consultation, in the parts that will open up national registries and share identifiable data more widely and with commercial companies, they are convinced it is all the “right” thing to do, without consent.

How can researchers, society and government understand what is good ethics of data science, as technology permits ever more invasive or covert data mining and the current approach is desperately outdated?

Who decides where those boundaries lie?

“It’s research Jim, but not as we know it.” This is one aspect of data use that ethical reviewers will need to deal with as we advance the debate on data science in the UK, whether for independents or commercial organisations. Google said their work was not research. Is ‘OkCupid’ research?

If this research and data publication proves anything at all, and can offer lessons to learn from, it is perhaps these three things:

Who is accredited as a researcher or ‘prescribed person’ matters, if we are considering new datasharing legislation and, for example, who the UK government is granting access to millions of children’s personal data today. Your idea of a ‘prescribed person’ may not be the same as the rest of the public’s.

Researchers and ethics committees need to adjust to the climate change of public consent. Purposes must be respected in research particularly when sharing sensitive, identifiable data, and there should be no assumptions made that differ from the original purposes when users give consent.

Data ethics and laws are desperately behind data science technology. Governments, institutions, civil society, and society as a whole need to reach a common vision and leadership on how to manage these challenges. Who defines these boundaries that matter?

How do we move forward towards better use of data?

Our data and technology are taking on a life of their own, in space which is another frontier, and in time, as data gathered in the past might be used for quite different purposes today.

The public are being left behind in the game-changing decisions made by those who deem they know best about the world we want to live in. We need a say in what shape society wants that to take, particularly for our children as it is their future we are deciding now.

How about an ethical framework for datasharing that supports a transparent public interest, which tries to build a little kinder, less discriminating, more just world, where hope is stronger than fear?

Working with people, with consent, with public support and transparent oversight shouldn’t be too much to ask. Perhaps it is naive, but I believe that with an independent ethical driver behind good decision-making, we could get closer to datasharing like that.

That would bring ‘Better use of data in government’.

Purposes and consent are not barriers to be overcome. Within these, shaped by a strong ethical framework, good data sharing practices can tackle some of the real challenges that hinder ‘good use of data’: training, understanding data protection law, communications, accountability and intra-organisational trust. More data sharing alone won’t fix these structural weaknesses in current UK datasharing which are our really tough barriers to good practice.

How our public data will be used in the public interest will not be a destination or have a well-defined happy ending; it is a long-term process which needs to be consensual, and there needs to be a clear path to setting out together and achieving collaborative solutions.

While we are all different, I believe that society shares for the most part, commonalities in what we accept as good, and fair, and what we believe is important. The family sitting next to me have just counted out their money and bought an ice cream to share, and the staff gave them two. The little girl is beaming. It seems that even when things are difficult, there is always hope things can be better. And there is always love.

Even if some might give it a bad name.

********

img credit: flickr/sofi01/ Beauty and The Beast  under creative commons

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

I’ve been struck by stories I’ve heard on the datasharing consultation, on data science, and on data infrastructures as part of ‘government as a platform’ (#GaaPFuture) in recent weeks. The audio recorded by the Royal Statistical Society on March 17th is excellent, and there were some good questions asked.

There were even questions from insurance-backed panels about opening up more data for commercial users, calls for journalists to be seen as accredited researchers, and calls to include health data sharing. These are three things that some stakeholders, all users of data, feel are missing from the consultation, and possibly some of those with the most widespread public concern and lowest levels of public trust. [1]

What I feel is missing in consultation discussions are:

  1. a representative range of independent public voice
  2. a compelling story of needs – why tailored public services benefit citizens from whom data is taken, not only data users
  3. the impacts we expect to see in local government
  4. any cost/risk/benefit assessment of those impacts, or for citizens
  5. how the changes will be independently evaluated – as some are to be reviewed

The Royal Statistical Society and ODI have good summaries here of their thoughts, more geared towards the statistical and research aspects of data,  infrastructure and the consultation.

I focus on the other strands that use identifiable data for targeted interventions: tailored public services, debt, fraud, and energy companies’ use. I think we talk too little of people, and of real needs.

Why the State wants more datasharing is not yet a compelling story and public need and benefit seem weak.

So far the creation of new data intermediaries, giving copies of our personal data to other public bodies – and let’s be clear that this often means through commercial representatives like G4S, Atos, management consultancies and more – is yet to convince me of true public needs for the people, versus wants from parts of the State.

What the consultation hopes to achieve is new powers in law, to give increased data sharing legal authority. However, this alone will not bring about the social legitimacy of datasharing that the consultation appears to seek through ‘open policy making’.

Legitimacy is badly needed if there is to be public and professional support for change and for increased use of our personal data as held by the State; that support is missing today, as care.data starkly exposed. [2]

The gap between Social Legitimacy and the Law

Almost 8 months ago now, before I knew about the datasharing consultation work-in-progress, I suggested to BIS that there was an opportunity for the UK to drive excellence in public involvement in the use of public data by getting real engagement, through pro-active consent.

The carrot for this is achieving the goal that government wants: greater legal clarity, and the use of a significant number of consented people’s personal data for a complex range of secondary uses as a secondary benefit.

It was ignored.

If some feel entitled to the right to infringe on citizens’ privacy through a new legal gateway because they believe the public benefit outweighs private rights, then they must also take on the increased balance of risk of doing so, and a responsibility to do so safely. It is in principle a slippery slope. Any new safeguards and ethics for how this will be done are however unclear in those data strands which are for targeted individual interventions, especially if predictive.

Upcoming discussions on codes of practice [which have still to be shared] should demonstrate how this is to happen in practice, but codes are not sufficient. Laws which enable will be pushed to the borderline of what is legal, and beyond the borderline of what is ethical.

In England, who would have thought that the 2013 changes that permitted individual children’s data to be given to third parties [3] for educational purposes would mean giving highly sensitive, identifiable data to journalists without pupils’ or parental consent? The wording allows it. It is legal. However, it fails the Data Protection Act’s legal requirement of fair processing. Above all, it lacks social legitimacy and common sense.

In Scotland, there is current anger over the intrusive ‘named person’ laws which lack both professional and public support and intrude on privacy. Concerns raised should be lessons to learn from in England.

Common sense says laws must take into account social legitimacy.

We have been told at the open policy meetings that this change will not remove the need for informed consent. To be informed means creating the opportunity for proper communications, and also knowing how you can use the service without coercion, i.e. not having to consent to secondary data uses in order to get the service, and knowing you can withdraw consent at any later date. How will that be offered, with ways of achieving the removal of data after sharing?

The stick for change is the legal duty to fair processing that the recent 2015 CJEU ruling [4] waved about. Not just a nice-to-have, but State bodies’ responsibility to inform citizens when their personal data are used for purposes other than those for which those data had initially been consented and given. New legislation will not remove this legal duty.

How will it be achieved without public engagement?

Engagement is not PR

Failure to act on what you hear from listening to the public is costly.

Engagement is not done *to* people; don’t think ‘explain why we need the data and its public benefit’ will work. Policy makers must engage with fears and not seek to dismiss or diminish them, but acknowledge and mitigate them by designing technically acceptable solutions. Solutions that enable data sharing in a strong framework of privacy and ethics, not ones that see these concepts as barriers. Solutions that have social legitimacy because people support them.

Mr Hunt’s opt-out of anonymised data being used in health research, promised in February 2014, has yet to be put in place and has had immeasurable costs in delayed public research and in public trust.

How long before people consider suing the DH as data controller for misuse? From where does the arrogance stem that decides to ignore legal rights, moral rights and public opinion of more people than those who voted for the Minister responsible for its delay?

 

This attitude is what failed care.data, and the harm to public trust and to confidence in researchers’ continued access to data is ongoing.

The same failure was pointed out by the public members of the tiny Genomics England public engagement meeting two years ago in March 2014, called to respond to concerns over the lack of engagement and potential harm for existing research. The comms lead made a suggestion that the new model of the commercialisation of the human genome in England, to be embedded in the NHS by 2017 as standard clinical practice, was like steam trains in Victorian England opening up the country to new commercial markets. The analogy was felt by the lay attendees to be, and I quote, ‘ridiculous.’

Exploiting confidential personal data for public good must have support and good two-way engagement if it is to earn that support, and what is said and agreed must be acted on to be trustworthy.

Policy makers must take into account broad public opinion, and that is unlikely to be submitted to a Parliamentary consultation. (Personally, I first knew such processes existed only when care.data was brought before the Select Committee in 2014.) We already know what many in the public think about sharing their confidential data from the work with care.data, and the objections to third-party access and to lack of consent. Just because some policy makers don’t like what was said doesn’t make that public opinion any less valid.

We must bring to the table the public voice from past but recent public engagement work on administrative datasharing [5], the voice of the non-research community, and the voice of those who are not the stakeholders who will use the data but the ‘data subjects’, the public whose data are to be used.

Policy Making must be built on Public Trust

Open policy making is not open just because it says it is. Who has been invited, who has participated, and how their views actually make a difference to content and implementation is what matters.

Adding controversial ideas at the last minute is terrible engagement; it makes the process less trustworthy and diminishes its legitimacy.

This last-minute change suggests some datasharing will be dictated despite critical views in the policy making and without any public engagement. If so, we should ask policy makers: on what mandate?

Democracy depends on social legitimacy. Once you lose public trust, it is not easy to restore.

Can new datasharing laws win social legitimacy, public trust and support without public engagement?

In my next post I’ll look at some of the public engagement work done on datasharing to date, and think about ethics in how data are applied.

*************

References:

[1] The Royal Statistical Society data trust deficit

[2] “The social licence for research: why care.data ran into trouble,” by Carter et al.

[3] FAQs: Campaign for safe and ethical National Pupil Data

[4] CJEU Bara 2015 Ruling – fair processing between public bodies

[5] Public Dialogues using Administrative data (ESRC / ADRN)

img credit: flickr.com/photos/internetarchivebookimages/

Are care.data pilots heading for a breech delivery?

Call the midwife [if you can find one free, the underpaid overworked miracle workers that they are], the care.data ‘pathfinder’ pilots are on their way! [This is under a five minute read, so there should be time to get the hot water on – and make a cup of tea.]

I’d like to be able to say I’m looking forward to a happy new arrival, but I worry care.data is set for a breech birth. Is there still time to have it turned around? I’d like to say yes, but it might need help.

The pause appears to be over, as the NHS England board delegated the final approval of directions to its Chair, Sir Malcolm Grant, and Chief Executive, Simon Stevens, on Thursday May 28.

These are the directions from NHS England which will enable the HSCIC to proceed with their pathfinder pilots’ next stage of delivery.

“this is a programme in which we have invested a great deal, of time and thought in its development.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

And yet. After years of work and planning, and a 16-month pause, as long as it takes for the gestation of a walrus, it appears the directions had flaws. Technical fixes are also needed before the plan can proceed to extract GP care.data and merge it with our hospital data at the HSCIC.

And there are lots of unknowns about what this will deliver.**

Groundhog Day?

The public and MPs were surprised in 2014. They may be even more surprised if 2015 sees a repeat of the same again.

We have yet to hear case studies of who received data in the past and would now no longer be eligible. Commercial data intermediaries? They can still get data, the NHS Open Day was told. And they do, as the HSCIC DARS meeting minutes in 2015 confirm.

By the time the pilots launch, the objection must actually work, and communications must include an opt-out form and must clearly give the programme a name.

I hope that those lessons have been learned, but I fear they have not been. There is still a lack of transparency. NHS England’s communications materials, and the May-October 2014 and any 2015 programme board minutes, have not been published.

We have been here before.

Back to September 2013: the GPES Advisory Committee, the ICO and Dame Fiona Caldicott, as well as campaigners and individuals, could see the issues in the patient leaflet and asked for fixes. The programme went ahead anyway in February 2014 and, although foreseen, failed to deliver. [For some, quite literally.]

These voices aren’t critical for fun, they call for fixes to get it right.

I would suggest that all of the issues raised since April 2014 were broadly known in February 2014, before the pause began. From the public listening exercise, the high-level summary captures some issues raised by patients, but doesn’t address their range or depth.

Some of the difficult and unwanted issues are still there, still the same and still being ignored, at least in the public domain. [4]

A Healthy New Arrival?

How is the approach better now and what happens next to proceed?

“It seems a shame,” the Walrus said, “To play them such a trick, After we’ve brought them out so far, And made them trot so quick!” [Lewis Carroll]

When asked by a board member, “What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?”, the answer wasn’t very clear. [full detail at end of post]

First they must pass the tests asked of them by Dame Fiona [her criteria and 27 questions from before Christmas]. At least that was what the verbal background given at the board meeting explained.

If the pilots should be a dip in the water of how national rollouts will proceed, then they need to test not just for today, but at least for the known future of changing content scope and expanding users – who will pay for the communication materials’ costs each time?

If policy keeps pressing forward, will it not make these complications worse under pressure? There may be external pressure ahead, as potential changes to EU data protection are expected this year as well, for which the pilot must be prepared, designing in advance for the expectations of best practice.

Pushing out the pathfinder directions, before knowing the answers to these practical things and patient questions open for over 16 months, is surely backwards. A breech birth, with predictable complications.

If in Sir Malcolm Grant’s words:

“we would only do this if we believed it was absolutely critical in the interests of patients.” [Sir Malcolm Grant, May 28, 2015, NHS England Board meeting]

then I’d like to see the critical interest of patients put first. Address the full range of patient questions from the ‘listening pause’.

In the rush to just make the best of a bad job, we’ve not even asked: are we even doing the right thing? Is the system designed to best support doctor and patient needs, especially with the integration “blurring the lines” that Simon Stevens seems set on?

If the focus is on the success of the programme and not the patient, consider this: there’s a real risk that too many people opt out due to these unknowns, and to the lack of real choice in how their data get used. It could be done better, to reduce that risk.

What percentage of opt-outs does the programme deem a success, to make it worthwhile?

In March 2014, at a London event, a GP told me that all her patients who were opting out were the newspaper-reading, informed, white middle class. She was worried that the data that would be included would be misleading and unrepresentative of her practice in CCG decision-making.

medConfidential has written a current status for pathfinder areas; it makes great sense to focus first on fixing care.data’s big post-election question: the opt-out that hasn’t been put into effect. Of course, in February 2014 we had to choose between two opt-outs, so how will that work for pathfinders?

In the public interest we need collectively to see this done well. Another mis-delivery will be fatal. “No artificial timelines?”

Right now, my expectations are that the result won’t be as cute as a baby walrus.

******

Notes from the NHS England Board Meeting, May 28, 2015:

TK said:  “These directions [1] relate only to the pathfinder programme and specify for the HSCIC what data we want to be extracted in the event that Dame Fiona, this board and the Secretary of State have given their approval for the extraction to proceed.

“We will be testing in this process a public opt out, a citizen’s right to opt out, which means that, and to be absolutely clear if someone does exercise their right to opt out, no clinical data will be extracted from their general practice,  just to make that point absolutely clearly.

“We have limited access to the data, should it be extracted at the end of the pathfinder phase, in the pathfinder context to just four organisations: NHS England, Public Health England, the HSCIC and CQC.”

“Those four organisations will only be able to access it for analytic purposes in a safe, a secure environment developed by the Information Centre [HSCIC], so there will be no third party hosting of the data that flows from the extraction.

“In the event that Dame Fiona, this board, the Secretary of State, the board of the Information Centre, are persuaded that there is merit in the data analysis that proceeds from the extraction, and that we’ve achieved an appropriate standard of what’s called fair processing, essentially have explained to people their rights, it may well be that we proceed to a programme of national rollout, in that case this board will have to agree a separate set of directions.”

“This is not signing off anything other than a process to test communications, and for a conditional approval on extracting data subject to the conditions I’ve just described.”

CD said: “This is new territory, precedent, this is something we have to get right, not only for the pathfinders but generically as well.”

“One of the consequences of having a pathfinder approach, is as Tim was describing, is that directions will change in the future. So if we are going to have a truly fair process , one of the things we have to get right, is that for the pathfinders, people understand that the set of data that is extracted and who can use it in the pathfinders, will both be a subset of, the data that is extracted and who can use it in the future. If we are going to be true to this fair process, we have to make sure in the pathfinders that we do that.

“For example, at the advisory group last week, is that in the communication going forward we have to make sure that we flag the fact there will be further directions, and they will be changed, that we are overt in saying, subject to what Fiona Caldicott decides, that process itself will be transparent.”

Questions from Board members:
Q: What is it we seek to learn from the pathfinder approach that will guide us in the decision later if this will become a national approach?
What are the top three objectives we seek to achieve?

TK: So, Dame Fiona has set a series of standards she expects the pathfinders to demonstrate, in supporting GPs to be able to discharge this rather complex communication responsibility, that they have under the law  in any case.

“On another level how we can demonstrate that people have adequately understood their right to opt out [..]

“and how do we make sure that populations who are relatively hard to reach, although listed with GPs, are also made aware of their opportunity to opt out.

“Perhaps it may help if I forward this to the board. It is in the public domain. But I will forward the letter to the board.”

“So that lays out quite a number of specific tangible objectives that we then have to evaluate in light of the pathfinder experience. “

Chair: “this is a programme in which we have invested a great deal, of time and thought in its development, we would only do this  if we believed it was absolutely critical in the interests of patients, it was something that would give us the information the intelligence that we need to more finely attune our commissioning practice, but also to get real time intelligence about how patients lives are lived, how treatments work and how we can better provide for their care.

“I don’t think this is any longer a matter of huge controversy, but how do we sensitively attune ourselves to patient confidentiality.”

“I propose that […] you will approve in principle the directions before you and also delegate to the Chief Executive and to myself to do final approval on behalf of the board, once we have taken into account the comments from medConfidential and any other issues, but the substance will remain unchanged.”

******

[4] Request for the release of the June 2014 Open House feedback, still to be published, in the hope that the range and depth of public questions can be addressed.

care.data comms letter

******
“The time has come,” the walrus said, “to talk of many things.”
[From ‘The Walrus* and the Carpenter’ in Through the Looking-Glass by Lewis Carroll]

*A walrus has a gestation period of about 16 months.
The same amount of time the pause in the care.data programme has taken to give birth to the pathfinder sites.

references:
[1] NHS England Directions to HSCIC: May 28 2015 – http://www.england.nhs.uk/wp-content/uploads/2015/05/item6-board-280515.pdf
[2] Notes from care.data advisory group meeting on 27th February 2015
[3] Patient questions: https://jenpersson.com/pathfinder/
[4] Letter from NHS England in response to request from September, and November 2014 to request that public questions be released and addressed


15 Jan 2024: Image in the header replaced at the request of likely image-tracing scammers who don’t own the rights; since this blog is non-commercial the image would fall under fair use anyway, but it’s not worth the hassle. All other artwork on this site is mine.

A review of NHS news in 2014, from ‘the Spirit of the NHS Future’.

Respectful of all the serious current news, and that of the past year, this is a lighthearted look back at some of the stories of 2014. ‘The Spirit of the NHS Future’ looks forward into 2015 and at what may still be changed.

***

The Spirit of the NHS Future visits the Powers-that-be
(To the tune of The 12 Days of Christmas)

[click to open music in another window]

On the first day of Christmas
the Spirit said to me:
I’m the ghost of the family GP.

On the second day of Christmas
the Spirit said to me: a
two-tiered system,
in the future I foresee.

On the third day of Christmas
the Spirit said to me:
You told GPs,
merge or hand in keys,
feder-ate or salaried please.

On the fourth day of Christmas
the Spirit said, I hear:
“Save our surgeries”,
MPIG freeze,
partners on their knees,
blame commissioning on local CCGs.

On the fifth day of Christmas
the Spirit said to me:
Five Ye-ar Plan!
Call it Forward View,
digital or screwed.
Let’s have a new review,
keep ‘em happy at PWC.

On the sixth day of Christmas
the Spirit said to me:
Ill patients making,
out-of-Ho-urs-rings!
Callbacks all delayed,
six hours wait,
one one one mistakes.
But must tell them not to visit A&E.

On the seventh day of Christmas
the Spirit said, GPs:
see your service contract,
with the QOF they’re trimming,
what-will-this-bring?
Open Christmas Eve,
New Year’s no reprieve,
please don’t cheat our Steve,
or a breach notice will you see.

On the eighth day of Christmas
the Spirit said to me:
Population’s ageing,
social care is straining,
want is pro-creating,
obe-si-ty’s the thing!
Cash to diagnose,
statins no one knows,
indicator woes,
and Doc Foster staff employed at CQC.

On the ninth day of Christmas
the Spirit said to me:
Cash for transforming,
seven days of working.
Think of emigrating,
ten grand re-registration.
Four-teen hour stints!
DES and LES are fixed.
Called to heal the sick,
still they love the gig,
being skilled, conscientious GPs.

On the tenth day of Christmas
the Spirit said to me:
Many Lords a-leaping,
Owen’s not been sleeping,
private contracts creeping,
Circle’s ever growing.
Care home sales not slowing.
Merge-eve-ry-thing!
New bidding wars,
tenders are on course
top nine billion, more,
still you claim to run it nation-al-ly.

On the eleventh day of Christmas
the Spirit said to me:
Patient groups are griping,
records you’ve been swiping,
listening while sharing,
data firms are buying,
selling it for mining,
opt-out needs defining,
block Gold-acre tweets!
The care dot data* board
minutes we shall hoard,
troubled pilots loom.
Hi-de Partridge’s report behind a tree?

On the twelfth day of Christmas
the Spirit said to me:
disabled are protesting
sanctions, need arresting,
mental health is failing,
genomes we are trading,**
staff all need more paying,
boundaries set for changing,
top-down re-arranging,
All-this-to-come!
New hires, no absurd,
targets rule the world,
regulation first.
What’s the plan to save our service, Jeremy?

– – – – – –

Thanks to the NHS staff, whose hard work, grit and humour continue to offer the service we know. You keep us and our loved ones healthy and whole whenever possible, and deal with us and our human frailty when it is not.

Dear GPs & other NHS staff who’ve had a Dickens of a year. Please, don’t let the system get you down.

You are appreciated, & not just at Xmas. Happy New Year everyone.

“It is a fair, even-handed, noble adjustment of things, that while there is infection in disease and sorrow, there is nothing in the world so irresistibly contagious as laughter and good humour.”
Charles Dickens,   A Christmas Carol, 1843

– – – – –

*New Statesman, Dr Phil Whitaker’s Health Matters column, 20th March 2014, ‘Hunt should be frank about the economic imperative behind the urgency to establish the [care.data] database and should engage in a sensible discussion about what might be compromised by undue haste.’

**Genomics England Kickstarting a Genomics Industry

The care.data engagement – is it going to jilt citizens after all? A six month summary in twenty-five posts.

[Note, update Sept 19th: after the NHS England AGM on the evening of Sept 18th – 18 hours after this care.data engagement post was published – I managed to ask Mr Kelsey, National Director for Patients and Information, in person what was happening with all the engagement feedback, and why it had not been made publicly available.

He said that the events’ feedback will be published before the pathfinder rollout begins, so that all questions and concerns can be responded to and that they will be taken into account before the pathfinders launch.

When might that be, I asked? ‘Soon’.

Good news? I look forward to seeing that happen. My open questions on commercial uses and more, and those of many others I have heard, have been captured in previous posts, in particular the most recent at the end of this post. – end of update.]

Medical data has huge power to do good, but it presents risks too. When leaked, it cannot be unleaked. When lost, public trust cannot be easily regained. That’s what broken-hearted Ben Goldacre wrote about care.data on February 28th of this year, ten days after the pause was announced on February 18th [The Guardian].

Fears and opinions, facts and analysis, with lots and lots of open questions. That’s what I’ve written up in the following posts related to care.data since then, including my own point-of-view and feedback from other citizens, events and discussions. All my care.data posts are listed here below, in one post, to give an overview of the whole story, and any progress in the six months ‘listening’ and ‘engagement’.

So what of that engagement? If there really have been all these events and all this listening, why has not one jot of public feedback been published? This is from September 2014; I find it terrifyingly empty of anything but discussion of changes to the communications of a status quo programme.

I was at that workshop the article mentions, hosted by Mencap, on communicating with vulnerable and excluded groups. It was carefully managed, with little open-room discussion to share opinions across groups (as the Senior Policy Adviser at Signature pointed out). Whilst we got the NHS England compilation of the group feedback afterwards, it was not published. Maybe I should publish it and ask how each concern will be addressed? I didn’t want to step on the toes of NHS England’s national comms team, assuming they would publish it, but you know what? If the raw feedback from all these meetings says these are our concerns and we want these changes, and none are forthcoming, then the public can justifiably question the whole engagement process.

It’s public money, and the public’s data. How both are used and why, is not to be hidden away in some civil service spreadsheet. Publish the business case. Publish the concerns. Publish how they are to be addressed.

From that meeting and the others I have been to, many intelligent questions from the public remain unanswered. The most recent care.data advisory workshop summarised many from the last year, and brought out some minority voices as well.


On the day of NHS Citizen, the new flagship of public involvement, people like me who attended the NHS England Open Day on June 17th, or the care.data listening events, may be understandably frustrated that there is no publicly available feedback or plan of any next steps.

care.data didn’t make it into the NHS Citizen agenda for discussion on the 18th. [Many other equally worthy subjects did; check them out here if not attending, or watch it online.] So from where will we get any answers? Almost all the comment, question and feedback I have heard at events has been constructively critical, and worthy of response. None is forthcoming.


Instead, the article above – this reported speech by Mr Kelsey and its arguments – makes me think engagement is going nowhere. No concerns are addressed. PR is repeated. More facts and figures, which conflate data use for clinical treatment with all sorts of other uses, are presented as an argument for gathering more data.

Citizens do not need to be told of the benefits. We need concrete steps taken in policy, process and practice to demonstrate why we can now trust the new system.

Only then is it worthwhile to come back to communications.

How valued is patient engagement in reality, if it is ignored?

How will involvement continue to be promoted in NHS Citizen and other platforms, if it is seen to be ineffective?

How might this affect future programmes and our willingness to get involved in clinical research?

I sincerely hope to see the raw feedback which NHS England has gathered in its listening events published very soon. How that feedback is incorporated into programme changes, as well as communications, will go a long way towards assuring both the quantity and the quality of cross-population participation.

The current care.data status is in limbo, as we wait to see if and when any ‘pathfinder’ CCGs will be announced to guinea-pig the extraction of patient records from GP practices in a trial rollout, in whatever form that may take. The latest official statements from Mr Kelsey have mentioned 100-500 practices, but without any indication of where or when. He suggests ‘shortly’.

What next for care.data? I’ll keep asking the questions and hope we hear some answers from the NHS England Patients and Information Directorate. Otherwise, what was the [&88!@xY!] point of a six month pause and all these efforts and listening?

Publish the business case. Publish the concerns. Publish how they are to be addressed.

What is there to hide?

After this six-month engagement, will there be a happy ending? I feel patients are about to be left jilted at the eleventh hour.
******

You’ll find that my more recent posts [last] have more depth and linked documents, if you are looking for more detailed information.

******

March 31st: A mother’s journey – intro

March 31st: Transparency

April 3rd: Communication & Choice

April 4th: Fears & Facts

April 7th: What is care.data? Defined Scope is vital for Trust

April 10th: Raw Highlights from the Health Select Committee

April 12th: care.data Transparency & Truth, Remit & Responsibility

April 15th: No Security Blanket : why consent packages fail our kids

April 18th: care.data : Getting the Ducks in a Row

April 23rd: an Ode to care.data (on Shakespeare’s anniversary)

May 3rd: care.data, riding the curve: Change Management

May 15th: care.data the 4th circle: Empowerment

May 24th: Flagship care.data – commercial uses in theory [1]

June 6th: Reality must take Precedence over Public Relations

June 14th: Flagship care.data – commercial use with brokers [2]

June 20th: The Impact of the Partridge Review on care.data

June 24th: On Trying Again – Project Lessons Learned

July 1st: Communications & Core Concepts [1] Ten Things Learned at the Open House on care.data and part two: Communications and Core Concepts [2] – Open House 17th June Others’ Questions

July 12th: Flagship care.data – commercial use in Practice [3]

July 25th: care.data should be like playing Chopin – review after the HSCIC Data Sharing review ‘Driving Positive Change’ meeting

July 25th: Care.data should be like playing Chopin – but will it be all the right notes, in the wrong order? Looking forwards.

August 9th: care.data and genomics : launching lifeboats [Part One] the press, public reaction and genomics & care.data interaction

August 9th: care.data and genomics : launching lifeboats [Part Two] Where is the Engagement?

September 3rd: care.data – a Six Month Pause, Anniversary round up [Part one] Open questions: What and Who?

September 3rd: care.data – a Six Month Pause, Anniversary round up [Part two] Open questions: How, Why, When?

September 16th: care.data cutouts – Listening to Minority Voices Includes questions from those groups.

September 16th: care.data – “Anticipating Things to Come” means Confidence by Design

October 30th: patient questions on care.data – an open letter

November 19th: questions remain unanswered: what do patients do now?

December 9th: Rebuilding trust in care.data

December 24th: A care.data wish list for 2015

2015 (updated after this post was published, throughout the year)

January 5th 2015: care.data news you may have missed

January 21st 2015: care.data communications – all change or the end of the line?

February 25th 2015: care.data – one of our Business Cases is Missing.

March 14th 2015: The future of care.data in recent discussions

March 26th 2015: Wearables: patients will ‘essentially manage their data as they wish’. What will this mean for diagnostics, treatment and research and why should we care? [#NHSWDP 3]

May 10th 2015: The Economic Value of Data vs the Public Good? [1] care.data, Concerns and the cost of Consent

The Economic Value of Data vs the Public Good? [2] Pay-for-privacy, defining purposes

The Economic Value of Data vs the Public Good? [3] The value of public voice.

May 14th 2015: Public data in private hands – should we know who manages our data?

June 20th 2015: Reputational risk. Is NHS England playing a game of public confidence?

June 25th 2015: Digital revolution by design: building for change and people (1)

July 13th 2015: The nhs.uk digital platform: a personalised gateway to a new NHS?

July 27th 2015: care.data : the economic value of data versus the public interest? (First published in StatsLife)

August 4th 2015: Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

August 5th, 2015: Building Public Trust [2]: a detailed approach to understanding Public Trust in data sharing

August 6th 2015: Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

August 12th 2015: Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

August 17th 2015: Building Public Trust [5]: Future solutions for health data sharing in care.data

September 12th 2015: care.data: delayed or not delayed? The train wreck that is always on time

****

Questions, ideas, info & other opinions continue to be all welcome. I’ll do my best to provide answers, or point to source sites.

For your reference and to their credit, I’ve found the following three websites useful and kept up to date with news and information:

Dr. Bhatia, GP in Hampshire’s care.data info site

HSCIC’s care.data site

medConfidential – campaign for confidentiality and consent in health and social care – seeking to ensure that every flow of data into, across and out of the NHS and care system is consensual, safe and transparent


care.data – the cut-outs: questions from minority voices

“By creating these coloured paper cut-outs, it seems to me that I am happily anticipating things to come…I know that it will only be much later that people will realise to what extent the work I am doing today is in step with the future.” Henri Matisse (1869-1954) [1]

My thoughts on the care.data advisory event of Saturday September 6th: “Minority voices, the need for confidentiality and anticipating the future.”

[Video in full > here. Well worth a viewing.]

After taking part in the care.data advisory group public workshop, 10.30-1pm on Saturday Sept 6th in London, I took advantage of a recent, generous gift: membership of the Tate. I went to the ‘Matisse – The Cut-Outs’ exhibition. Whilst looking around, it was hard to switch off the questions from the morning, and it struck me that we still have so many voices not heard in the discussion of the benefits, risks and background to the care.data programme. So many ‘cut out’ of any decision-making.

Most impressive that morning was the depth and granularity of the questions asked. I have heard varying aspects of these questions at public events, and the subjects differ a little based on the variety of organisations involved. Increasingly, however, there are no new questions; rather, I hear deeper versions of the questions which have already been asked over the last eighteen months – asked intensely during the six-month pause since February 2014 [2] – and which remain unanswered. Those from the care.data advisory committee hosting the event said the same thing, based on a previous care.data advisory event.

What stood out, were a number of minority group voices.

A representative for the group Friends, Families and Travellers (FFT) raised a number of excellent questions, including that of communications and ‘home’ GP practices for the Traveller community. How will they be informed about care.data and know where their ‘home’ practice is and how to contact them? Whose responsibility will that be?

I spoke with a small group a few weeks ago simply about NHS use in general. One said they feared being tracked down through a government system [one used for anything other than clinical care]. They register with new names if they need to access A&E. That tells you already how much they trust ‘the system’. For the most part, he said, they would avoid NHS care unless they were desperately in need and beyond the capability of their own traveller community ‘nurse’. The exception was childbirth, when this group said they would encourage expectant mums to go into hospital for delivery. They must continue to do so when they need to, and must feel safe to do so. Whether or not they use primary care in general, many travellers are registered at GP practices and, unless their names have been inadvertently cleansed recently, they should be contacted before any data extraction as much as anyone else.

Our NHS is constitutionally there for all. That includes groups who may be cut off from mainstream inclusion in society, through their actions, inaction or others’ prejudice. Is the reality of this national programme actively inclusive? Does it demonstrate in practice the exemplary model we hear the NHS aims to promote?

Transgender and other issues

The question was posed on Twitter to the event whether trans issues would be addressed by care.data. The person suggested that the data to be extracted would “out us as probably being trans people.” As a result, she said, “I’d want to see all trans ppl excluded from care.data.”

Someone who addressed ‘her complex gender identity’ through her art was another artist I respect, Fiore de Henriquez. She was ‘shy of publicity’. One of her former studios is filled with work based on two faces or symbiotic heads, aside from practice pieces for her more famous commissioned work. For her biography she insisted that nothing be concealed: “Put in everything you can find out about me, darling. I am proud to be hermaphrodite, I think I am very lucky, actually.” However, in her lifetime she acknowledged the need for a private retreat and was shy until old age, despite her flamboyant appearance and behaviour. You can see why the tweet suggested excluding any transgender data or people.

‘Transgender issues’ is also an upcoming topic to be addressed at the NHS Citizen event on 18th September. How are we making sure these groups, and the ‘other’ conditions, are not forgotten by care.data and other initiatives? Minorities included by design will be better catered for, and more likely to participate, than if they are simply tacked on as an afterthought in tick-box participation.

However, another aspect of risk is to be considered: missing minorities.

Any groups who opt themselves out completely may find that they and their issues are under-represented in decision-making about them, by commissioners and in budget planning for example. If authorities or researchers choose to base decisions only on care.data, these discrepancies will need to be taken into account.

Ciarán Devane highlighted this two-sided coin of discrimination for some people. There are conditions which are excluded from care.data scope, for example HIV, which is included in HARS reporting but not in care.data. Will the conditions excluded from the data be discriminated against somehow? Why are they included in one place and not in another? Where data is duplicated in different collections, where is it necessary and where is the benefit? How can you make sure the system is safe and transparent enough for minorities’ data to be included, so that their trust is not undermined by taking part in a system in which they may fear being identified?

Missing voices

These are just two examples of groups from whom, until now, there had been little involvement, or at least few public questions asked: the Traveller and transgender communities. But there are many more, notably BME groups, and many, many others not represented at any public meeting I have been to. If they have been well represented elsewhere, the raw feedback, with issues addressed, is yet to be shared publicly.

Missing voices – youth

A further voice we hear little of at these meetings – which, as far as I have seen, have been attended mainly by older people – is the voice of our youth.

In my opinion they are left out of the care.data discussion, but they should be directly involved. It is, after all, for them that we most need to think about how consent should work, since once extracted, our data is never deleted.

Whilst consent is overridden in law by the Health and Social Care Act, it is still the age-old and accepted ethical best practice. If care.data is to be used in research in future, it must design in best practices now, fit for those future purposes.

How will our under-18s future lives be affected by choices others make now on their behalf?

Both for them as the future society and as individuals: decisions which will affect research, public health planning and the delivery of NHS services, as well as decisions which will affect the risk of individual discrimination or harm, or simply mean that others hold knowledge about their health and lifestyle which they did not choose to share themselves.

Some people assume that, because of social networks, young people don’t care about privacy. This is just not true. In fact, studies show that younger people are more conscious of the potential harm to their reputation than we may want to give them credit for.

This Royal Academy of Engineering report from 2010, [3] “Privacy and Prejudice – Young People’s Views on the Development of Electronic Patient Records”, produced in conjunction with Wellcome, examines in some depth the opinions of 14-18 year olds. It tackles questions on medical data use: consent, control and commercialism. The hairy questions are asked about teen access to records: when does Gillick competence apply in practice, and who decides?

The summary is a collection of their central questions, with a discussion towards the end, which are just as valid for care.data today, and worth considering in the Patient Online discussion of direct care access. I hope you’ll take time to read it; it’s worth it.

And what about the Children?

Some of our most vulnerable will have their data and records held at the HSCIC. There are plans for rapid expansion into social care data management, aligned with the transformation of health and social services. Where’s the discussion of this? Does the HSCIC even have the legal capacity to handle children’s social care data?

How will at-risk groups be safer using this system in which their identities are less protected? How will the information gathered be used intelligently in practice to make a difference and bring benefit? What safeguards are in place?

“Future releases of new functionality are planned over the next 12 months, including the introduction of the Child Protection – Information Sharing application which will help to improve the protection of children who have previously been identified as vulnerable by social services.” (ref: HSCIC Spine transition)

“Domestic violence can affect anyone, but women, transgender people and people from BME groups are at higher risk than the general population.”
(Ref: Islington’s JSNA Executive Summary – 9 – August 2014)


We must ask these questions about data sharing and its protection on behalf of others, because these under-represented groups and minorities cannot ask them themselves if they are not in the room.

Where’s the Benefit?

We should also be asking the question raised at the event about the benefits compared with the data already shared today. “Where’s the benefit?”, asked another blogger some time ago, raising his concerns for those with disabilities. We should be asking this of new data sharing versus the many existing research databases and registries we already have, with years of history behind them. Ciarán Devane wisely asked this on the 6th, succinctly capturing what attendees had expressed:

“It will be interesting to know if they can demonstrate benefits. Not just: ‘Can we technically do this?’ but: ‘If we see primary care data next to HES data, can we see something we didn’t see before’?”

An attendee at the Healthwatch-run care.data event in Oxford last week asked the same thing. NHS England and IT providers would, one would think, be falling over themselves to demonstrate the cost/benefit, to show why this care.data programme is well managed compared with past failures. There is form for expensive top-down programmes going awry at huge public expense, time and effort. On NPfIT, the NAO noted that “…it was not demonstrated that the financial value of the benefits exceeds the cost of the Programme.”

Where is the benefits case for care.data, to weigh against the risks? I have yet to see a publicly available business case.

The public donation

Like my museum membership, the donation of our data will be a gift. It deserves to be treated with the respect each individual would deserve if you were to meet them face-to-face in the park.

As I enjoyed early evening sun  leaving the exhibition, the grassy area outside was packed with people. There were families, friends, children, and adults on their own. A woman rested heavily pregnant, her bump against her partner. Children chased wasps and stamped on empty cans. One man came and sold me a copy of the Big Issue, I glimpsed a hearing aid tucked into a young woman’s beehive hair, one amputee, a child with Down Syndrome giggling with a sister. Those glimpses of people gave me images I could label without a second glance. Disabled. Deaf. Downs. There were potentially conditions I could not see in others. Cancer. Crohn’s. Chlamydia. Some were drinking wine, some smoking. A small group possibly high. I know nothing about any of those individuals. I knew no names, no addresses. Yet I could see some familial relationships. Some connections were obvious. It struck me, that they represented part of a care.data population, whom buyers and researchers  may perceive as only data. I hope that we remember them as people. People from whom this programme wants to extract knowledge of their lifestyles and lives, and who have rights to express if, and how they want to share that knowledge. How will that process work?

Pathfinders – the rollout challenges that remain?

At the advisory group-led meeting it was confirmed that pathfinders would be chosen shortly.

[CCGs were subsequently announced here; see related links at the end of the page for detail. Note added Oct 7th.]

But the care.data programme is “still delivering without a business case”. Despite this, “between two and four clinical commissioning groups will be selected ‘in the coming weeks’ to begin the pathfinder stage of the care.data programme,” reports the NIB meeting. [8]

The report records what was discussed at the meeting:

“The pathfinders will test different communication strategies before moving forward with the data extraction part of the project.”

I for one would be extremely disappointed if pathfinders go ahead in ‘as is’ mode. Communications are still not the underlying issue, and communications are not what most people ask about. They ask questions of substance, for which there still appears to be insufficient information to give sound answers.

Answers would acknowledge the trust in confidentiality owed to the individual men, women, and children whose data this is. The people represented by those in the park. Or by the fifty who gave up their time on a sunny Saturday to come and ask their questions. Many without pay or travel expenses just giving up their time. Bringing their questions in search of some answers.

The pathfinder communications cannot be meaningfully trialled to meet the needs of today and the future design, when the substance of key parts of the message is uncertain. Like scope.

The care.data advisory group and the Health and Social Care Information Centre, based on the open discussion at the workshop, both appear to be working, “anticipating things to come…”, and doing their best to put processes and change in place today which will be “in step with the future.”

To what extent that work is given the right tools, time and support to succeed with all of the public, including our minorities, I don’t know. It will depend largely now on the answers to all the open questions, which need to come from the Patients and Information Directorate at the Commissioning Board, NHS England.

After all, as Mr Kelsey himself says,

“The NHS should be engaging, empowering and hearing patients and their carers throughout the whole system all the time. The goal is not for patients to be the passive recipients of increased engagement, but rather to achieve a pervasive culture that welcomes authentic patient participation.”

What could be less empowering than to dismiss patient rights?

The challenge is: how will the Directorate at NHS England meet all these technical, governance and security needs, and yet put the most important factors first in the design – confidentiality and the voice of the empowered patient: the voice of consent?

*****

This post captured my thoughts on the care.data advisory event of Saturday September 6th: “Minority voices, the need for confidentiality and anticipating the future.” This was about the people side of things. Part two focuses on the system side.

*****

Immediate information and support for women experiencing domestic violence: National Domestic Violence, Freephone Helpline 0808 2000 247

*****

[1] Interested in a glimpse into the Matisse exhibition which has now closed? Check out this film.

[2] Previous post: My six month pause round up [part one] https://jenpersson.com/care-data-pause-six-months-on/

[3] Privacy and Prejudice: http://www.raeng.org.uk/publications/reports/privacy-and-prejudice-views This study was conducted by The Royal Academy of Engineering (the Academy) and Laura Grant Associates and was made possible by a partnership with the Y Touring Theatre Company, support from Central YMCA, and funding from the Wellcome Trust and three of the Research Councils (the Engineering and Physical Sciences Research Council, the Economic and Social Research Council and the Medical Research Council).

[4]  Barbara Hepworth – Pelagos – in Prospect Magazine

[5] Questions remain open on how opt out works with identifiable vs pseudonymous data sharing requirement and what the objection really offers. [ref: Article by Tim Kelsey in Prospect Magazine 2009 “Long Live the Database State.”]
[6] HSCIC current actions published with Board minutes
[8] NIB https://app.box.com/s/aq33ejw29tp34i99moam/1/2236557895/19347602687/1


*****

More information about the Advisory Group is here: http://www.england.nhs.uk/ourwork/tsd/ad-grp/

More about the care.data programme here at HSCIC – there is an NHS England site too, but I think the HSCIC is cleaner and more useful: http://www.hscic.gov.uk/article/3525/Caredata