
Breaking up is hard to do. Restructuring education in England.

This Valentine’s I was thinking about the restructuring of education in England and its wide ranging effects. It’s all about the break up.

The US EdTech market is very keen to break into the UK, and our front door is open.

We have adopted the Teach First model, partnered with Teach For America, while some worry we do not ask, “What is education for?”

Now we hear the next chair of Ofsted is to be sought from the US, someone renowned as “the scourge of the unions.”

Should we wonder how long until the management of schools themselves is US-sourced?

The education system in England has been broken up in recent years into manageable parcels – for private organisations, schools within schools, charity arms of commercial companies, and multi-school chains to take over. In effect, recent governments have made reforms that have dismantled state education as I knew it.

Just as the future vision of education outlined in Direct Democracy (2005), co-authored by Michael Gove, said: “The first thing to do is to make existing state schools genuinely independent of the state.”

Free schools, touted as giving parents the ultimate in choice, are in effect another way to nod approval to the outsourcing of the state into private hands, and into big chains. Despite seeing the model fail spectacularly abroad, the government seems set on the same here.

Academies, the route that finagles private corporations into running public education, are the preferred model, says Mr Cameron. While there are no plans to force schools to become academies, the legislation currently in ping-pong under the theme of ‘coasting’ schools enables just that. The Secretary of State can impose academisation, albeit only on schools Ofsted has labelled ‘failing’.

What ‘fails’ sometimes appears to be a school that staff and parents cannot see as anything less than good, but that is small. While small can be what parents want, small pupil-teacher ratios mean higher costs per pupil. But the direction of growth is that ‘big is better’.

“There are now 87 primary schools with more than 800 pupils, up from 77 in 2014 and 58 in 2013. The number of infants in classes above the limit of 30 pupils has increased again – with 100,800 pupils in these over-sized classes, an increase of 8% compared with 2014.” [BBC]

All this restructuring creates costs about which the Department wants to be less than transparent, and of which it has lost track.

If only we could see that these new structures raised standards. But “while some chains have clearly raised attainment, others achieve worse outcomes, creating huge disparities within the academy sector.”

If not delivering better results for children, then what is the goal?

A Valentine’s view of Public Service Delivery: the Big Break up

Breaking up the State system, once perhaps unthinkable, is possible through the creation of ‘acceptable’ public-private partnerships (as opposed to outright privatisation per se). Schools become academies through a range of providers and different pathways, at least to start with; as some fail, the most successful become the market leaders in an oligopoly. Ultimately perhaps, this could become a near monopoly. Delivering ‘better’. Perhaps a new model, a new beginning: a new provider offering salvation from the flood of ‘failing’ schools, coming to the State’s rescue.

In order to achieve this entry to the market by outsiders, you must first remove conditions seen as restrictive, giving more ‘freedom’ to providers: to cut corners and make efficiency savings on things like food standards, the required curriculum, and numbers of staff or their pay.

And what if, as a result, staff leave, or are hard to recruit?

Convincing people that “tech” and “digital” will deliver cash savings and teach required skills through educational machine learning is key if staff costs are to be reduced, which in times of austerity and if all else has been cut, is the only budget left to slash.

Providers of ‘self-taught’ systems are convincing in their arguments that tech is the solution.

Sadly I remember when a similar thing was tried on paper. My first year of GCSE maths, aged 13-14, was ‘taught’ at our secondary comp by working through booklets in a series that we self-selected from the workbench in the classroom. Then we picked up the master marking-copy once done. Many of the boys didn’t need long to work out that the first step was an unnecessary waste of time. The teacher had no role in the classroom. We were bored to bits. In the final week at the end of the year they sellotaped the teacher to his chair.

I kid you not.

Teachers are so much more than knowledge transfer tools, and yet some today seem to consider them replaceable by technology.

The US is ahead of us in this model, which has grown hand-in-hand with commercialism in schools. Many parents are unhappy.

So is the DfE setting us up for future heartbreak if it wants us to go down the US route of more MOOCs, more tech, and less funding and fewer staff? Where’s the cost-benefit risk analysis and the transparency?

We risk losing the best of what is human from the classroom if we remove the values teachers model and inspire. Unions, teachers and educationalists are, I am sure, more than aware of all these cumulative changes. However, the wider public seems little engaged.

For anyone ‘in education’ these changes will all be self-evident, and their balance of risks and benefits a matter of experience and political persuasion. As a parent I’ve only come to understand these changes through researching how our pupils’ personal and school data have been commercialised, given away from the National Pupil Database without our consent since legislation changed in 2013; and how Higher Education student and staff data have been sold.

Will more legislative change be needed to keep our private data accessible in public services operating in an increasingly privately-run delivery model? And who will oversee that?

The Education Market is sometimes referred to as ‘The Wild West’. Is it getting a sheriff?

The news that the next chair of Ofsted is to be sought from the US did set alarm bells ringing for some in the press, who fear US standards and US-led organisations in British schools.

“The scourge of unions” means not supportive of staff-based power, and in health our junior doctors have clocked exactly what breaking their ‘union’ bargaining power is all about. So who is driving all this change in education today?

Some education providers might be seen as individuals profiting from the State break-up. Some were accused of ‘questionable practices’. Oversight has been lacking, others said. Margaret Hodge in 2014 was reported to have said: “It is just wrong to hand money to a company in which you have a financial interest if you are a trustee.”

I wonder if she has an opinion on a lead non-executive board member at the Department for Education also being the director of one of the biggest school chains? Or the ex-Minister now employed by the same chain? Or that his campaign was funded by the same director? Why this register of interests is not transparent is a wonder.

It could appear to an outsider that the private-public revolving door is well oiled with sweetheart deals.

Are the reforms begun by Mr Gove simply to be executed to their end goal, whatever that may be, through Nicky Morgan, or is she driving her own new policies?

If Ofsted were to become US-experience led, will the Wild West be tamed, or will US providers be invited to join the action, reshaping a new frontier? What is the end game?

Breaking up is not hard to do, but in whose best interest is it?

We need only look to health to see a similar pattern.

The structures are freed up, and boundaries opened up (if you meet the other criteria) in the name of ‘choice’. The organisational barriers to break-up are removed in the name of ‘direct accountability’. And as for enabling plans through more ‘business intelligence’ gathered from data sharing, well, those plans abound.

Done well, new efficient systems and structures might bring public benefits, and the right technology can certainly bring great things. But have we first understood what made the old less efficient, if indeed it was? And where are the baselines to look back on?

Where is the transparency of the end goal and what’s the price the Department is prepared to pay in order to reach it?

Is reform in education transparent in its ideology, and in how its success is being measured, if not by improved attainment?

The results of change can also be damaging. In health we see failing systems, staff shortages, and their knock-on effects on patient care. In schools, these failures damage children’s start in life; it’s not just a ‘system’.

Can we assess if and how these reforms are changing the right things for the right reasons? Where is the transparency of what problems we are trying to solve, to assess what solutions work?

How is change impact for good and bad being measured, with what values embedded, with what oversight, and with whose best interests at its heart?

2005’s Direct Democracy could be read as a blueprint for co-author Mr Gove’s education reforms less than a decade later.

Debate over the restructuring of education and its marketisation seems to have bypassed most of us in the public, in a way health has not.

Underperformance, as measured by new and often hard-to-discern criteria, means takeover at unprecedented pace.

And what does this mean for our most vulnerable children? SEN children are not required to be offered places by academies. The 2005 plans co-authored by Mr Gove also included: “killing the government’s inclusion policy stone dead,” without an alternative.

Is this the direction of travel our teachers and society supports?

What happens when breakups happen and relationship goals fail?

Who picks up the pieces? I fear the state is paying heavily for the break up deals, investing heavily in new relationships, and yet will pay again for failure. And so will our teaching staff, and children.

While Mr Hunt is taking all the heat right now for his part in writing Direct Democracy and its proposals to privatise health – set against the current health reforms and restructuring of junior doctors’ contracts – we should perhaps also look to his co-author, Mr Gove, and ask to better understand the current impact of his recent education reforms, compare them with what he proposed in 2005, and prepare for the expected outcomes of change before it happens (see p74).

One outcome was that failure was to be encouraged in this new system, with Sweden held up as an exemplary model:

“Liberating state schools would also allow the all-important freedom to fail.”

As Anita Kettunen, principal of JB Akersberga in Sweden, reportedly said when the free schools chain funded by a private equity firm failed:

“if you’re going to have a system where you have a market, you have to be ready for this.”

Breaking up can be hard to do. Failure hurts. Are we ready for this?
******

 

Abbreviated on Feb 18th.

 

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (2)

“Children do not lose their human rights by virtue of passing through the school gates” (UN Committee on the Rights of the Child, General Comment on ‘The aims of education’, 2001).

The Digital Skills in Schools inquiry [1] is examining the gap in education of our children to enable them to be citizens fit for the future.

We have an “educational gap” in digital skills and I have suggested it should not be seen only as functional or analytical, but should also address a gap in ethical skills and framework to equip our young people to understand their digital rights, as well as responsibilities.

Children must be enabled in education with opportunity to understand how they can grow “to develop physically, mentally, morally, spiritually and socially in a healthy and normal manner and in conditions of freedom and dignity”. [2]

Freedom to use the internet in privacy does not mean having to expose children to risks, but we should ask, are there ways of implementing practices which are more proportionate, and less intrusive than monitoring and logging keywords [3] for every child in the country? What problem is the DfE trying to solve and how?

Nicky Morgan’s “fantastic” GPS tracking App

The second technology tool Nicky Morgan mentioned in her BETT speech on January 22nd is an app with GPS tracking and alert creation. Her app verdict was “excellent” and “fantastic”:

“There are excellent examples at the moment such as the Family First app by Group Call. It uses GPS in mobile phones to help parents keep track of their children’s whereabouts, allowing them to check that they have arrived safely to school, alerting them if they stray from their usual schedule.” [4]

I’m not convinced tracking every child’s every move is either excellent or fantastic. Primarily because it will foster a nation of young people who feel untrusted, and I see a risk it could create a lower sense of self-reliance, self-confidence and self-responsibility.

Just as with the school software monitoring [see part one], there will be a chilling effect on children’s freedom if these technologies become the norm. If you feared misusing a word in an online search, or worried over stigma and what others might think, would you not change your behaviour? Our young people need to feel both secure and trusted at school.

How we use digital in schools shapes our future society

A population that trusts one another, and trusts its government, organisations and press, is vital to a well-functioning society.

If we want the benefits of a global society, datasharing for example to contribute to medical advance, people must understand how their own data and digital footprint fits into a bigger picture to support it.

In schools today pupils and parents are not informed that their personal confidential data are given to commercial third parties by the Department for Education at national level [5]. Preventing public engagement, hiding current practices and downplaying the risks of how data are misused also prevent fair and transparent discussion of the benefits and of how to do it better. Better, like making data accessible only in a secure setting, not handing them out to Fleet Street.

For children this holds back public involvement in the discussion of the roles of technology in their own future. Fear of public backlash over poor practices must not hold back empowering our children’s understanding of digital skills and how their digital identity matters.

Digital skills are not shorthand for coding, but critical life skills

The skills our society will need must let us simultaneously manage the benefits to society and deal with the great risks that will come with advances in technology: advances in artificial intelligence, genomics, and autonomous robots, to select only three examples.

There is a glaring gap in their education: how their own confidential personal data and digital footprint fit into a globally connected society, and how they are used by commercial businesses and third parties.

There are concerns about how apps could be misused by others, too.

If we are to consider what is missing in our children’s preparation for a life in which digital will no longer be a label but a way of life, then to identify the gap we must first consider what we see as the whole.

Rather than keeping children safe in education, as regards data sharing and digital privacy, the DfE seems happy to keep them ignorant. This is no way to treat our young people and develop their digital skills, just as giving their data away is not good cyber security.

What does a Dream for a great ‘digital’ Society look like?

Had Martin Luther King lived to be 87 he would have continued to inspire hope and to challenge us to fulfill his dream for society – where everyone would have an equal opportunity for “life, liberty and the pursuit of happiness.”

Moving towards that goal, supported with technology, with ethical codes of practice, my dream is we see a more inclusive, fulfilled, sustainable and happier society. We must educate our children as fully rounded digital and data savvy individuals, who trust themselves and systems they use, and are well treated by others.

Sadly, introducing these types of freedom-limiting technologies for our children risks instead a society in which many people do not feel comfortable, one that has lost sight of the value of privacy.

References:

[1] Digital Skills Inquiry: http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/inquiries/parliament-2015/digital-skills-inquiry-15-16/

[2] UN Convention of the Rights of the Child

[3] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

[4] Nicky Morgan’s full speech at BETT

[5] The defenddigitalme campaign to ask the Department for Education to change practices and policy around the National Pupil Database

 

 

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children’s individual actions; one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately for this consultation there is no supporting code of practice to explain what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools”: “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: “The Opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24 is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?

 

Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our children aged 2-19 attend a wide range of schools, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring would be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK. We hear little about its applied use.

The cases of cyberbullying or sexting in schools I hear of locally, or read about in the press, happen through smartphones. Unless the school snoops on individual devices, I wonder whether the cost, implementation and impact are proportionate to the benefit?

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: what problem is it trying to solve, and how will they measure if it is successful, or stop its use if it is not? Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting and understand who they are? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs – has anyone asked our children’s feelings on this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we Creating a Solution that Solves or creates a Problem?

The private providers would have no incentive to say their reports don’t work and schools, legally required to be risk averse, would be unlikely to say stop if there is no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention” and “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multipurpose, and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school – what are we doing to fix the causes, rather than simply spying on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust in the system and in teaching staff, and on their ability to look for content to support their own sensitive concerns or development, content they may not feel safe to look for at home? What limitation will that put on their creativity?

These are all questions that should be asked in order to understand the consultation thoroughly, and they require wide public examination.

Since key logging is already common practice (according to provider websites) and will effectively in practice become statutory, where is the public discussion? If it’s not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?

What exactly might this part of the new guidance mean for pupils?

In part two, I cover the other part of her speech, the GPS app, and ask: if we track every child in and outside school, are we moving closer to the digital dream, or nightmare, in the search to close the digital skills gap?

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online.” [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that this ‘system’ to be ‘in place’ should be computer-based. How will it be applied in practice? A number of the software providers for schools already provide services that include key logging, using “keyword detection libraries” that provide “a complete log of all online activity”.

(*It’s hard to read more about, as many of NEN’s links are dead.)
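To make concrete what keyword-detection logging involves, here is a minimal sketch. The watchlist, function name and log format are all hypothetical illustrations, not any vendor’s actual product:

```python
# Illustrative sketch only: keyword-detection monitoring.
# The watchlist and log format are hypothetical, not a real vendor's design.

KEYWORD_LIBRARY = {"bully", "self-harm"}  # stand-in "keyword detection library"

def monitor(user, query, log):
    """Record every query, flagging any that match the keyword library."""
    flagged = any(word in query.lower() for word in KEYWORD_LIBRARY)
    log.append({"user": user, "query": query, "flagged": flagged})
    return flagged

activity_log = []
monitor("pupil42", "homework on the Tudors", activity_log)  # logged, not flagged
monitor("pupil42", "how do I stop a bully", activity_log)   # logged and flagged
```

Even in this toy version, every query is recorded whether flagged or not, which is precisely why questions of proportionality and chilling effect arise.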

Ethics, standards and digital rights – time for a citizens’ charter

Central to future data sharing [1] plans is the principle of public interest, intended to be underpinned by transparency in all parts of the process and supported by an informed public. Three principles that are also key in the plan for open policy.

The draft ethics proposals [2] start with user need (i.e. what government wants, researchers want, the users of the data) and public benefit.

With these principles in mind, I wonder how compatible the plans are in practice: plans that will remove some of the decision-making about information sharing from the citizen; that is, from you and me.

When talking about data sharing it is all too easy to forget we are talking about people, in this case 62 million individual people’s personal information, especially when users of data focus on how data are released or published. The public thinks of personal data as information related to them. And the ICO says privacy and an individual’s rights are engaged at the point of collection.

The trusted handling, use and re-use of population-wide personal data sharing and ID assurance are vital to innovation and digital strategy. So in order to make these data uses secure and trusted, fit for the 21st century, when will the bad bits of current government datasharing policy and practice [3] be replaced by the good parts of the ethical plans?

Current practice and Future Proofing Plans

How is policy being future-proofed at a time when changes to regulation in the new EUDP are being made in parallel? Changes that clarify consent and the individual’s role, requiring clear affirmative action by the data subject. [4] How do public bodies and departments plan to meet the current moral and legal obligation to ensure that persons whose personal data are transferred and processed between two public administrative bodies are informed in advance?

How is public perception [5] being taken into account?

And how are digital identities to be protected when they are literally our passport to the world, and their integrity is vital to maintain, especially for our children in the world of big data [6] we cannot imagine today? How do we verify identity but not have to reveal the data behind it, if those data are to be used in ever more government transactions – done badly that could mean the citizen loses sight of who knows what information and who it has been re-shared with.

From the 6th January there are lots of open questions, and no formal policy document or draft legislation to review. The plan appears far off being ready for public consultation, needing concrete input on the practical aspects of what the change would mean in practice.

Changing the approach to the collection of citizens’ personal data, and removing the need for consent to wide re-use and onward sharing, will open up a massive change to the data infrastructure of the country, in terms of who is involved in administrative roles in the process and when. As a country, to date we have not treated data as part of our infrastructure. Some suggest we should. Changing the construction of roads would require impact planning, mapping and a thought-out budget before beginning the project, to assess its impact. Such an assessment appears to be missing entirely from this data infrastructure change.

I’ve considered the plans in terms of case studies of policy and practice, transparency and trust, the issues of data quality and completeness and digital inclusion.

But I’m starting by sharing only my thoughts on ethics.

Ethics, standards and digital rights – time for a public charter

How do you want your own, or your children’s personal data handled?

This is not theoretical. Every one of us in the UK has our own confidential data used in a number of ways about which we are not aware today. Are you OK with that? With academic researchers? With GCHQ? [7] What about charities? Or Fleet Street press? All of these bodies have personal data from population wide datasets and that means all of us or all of our children, whether or not we are the subjects of research, subject to investigation, or just an ordinary citizen minding their own business.

On balance, where do you draw the line between your own individual rights and public good? What is fair use without consent and where would you be surprised and want to be informed?
I would like to hear more about how others feel about and weigh the risks and benefits trade off in this area.

Some organisations working on debt have concerns about digital exclusion. Others worry about compiling single-view data in coercive relationships. Some organisations are campaigning for a digital bill of rights. I had some thoughts on this, specifically for health data, in the past.

A charter of digital standards and ethics could be enabling, not a barrier, and should be a tool that comes to consultation before new legislation.

Discussing datasharing that will open up every public dataset “across every public body” without first having defined a clear policy is a challenge. Without first defining ethical good practice as a reference framework, it’s dancing in the dark. This draft plan is running in parallel to, but not as part of, the datasharing discussion.
Ethical practice and principles must be the foundation of data sharing plans, not an afterthought.

Why? Because this stuff is hard. The kinds of research that use sensitive de-identified data are sometimes controversial and will become more challenging as the capabilities of what is possible increase with machine learning, genomics, and increased personalisation and targeting of marketing, and interventions.

The ADRN had spent months on its ethical framework and privacy impact assessment, before I joined the panel.

What does Ethics look like in sharing bulk datasets?

What do you think about the commercialisation of genomic data by the state – often from children whose parents are desperate for a diagnosis – to ‘kick start’ the UK genomics industry?  What do you think about data used in research on domestic violence and child protection? And in predictive policing?

Or research on religious affiliations and home schooling? Or abortion and births in teens matching school records to health data?

Will the results of the research encourage policy change or interventions with any group of people? Could these types of research have unintended consequences, or be used in ways researchers did not foresee, supporting not social benefit but a particular political or scientific objective? If so, how is that governed?

What research is done today, what is good practice, what is cautious and what would Joe Public expect? On domestic violence for example, public feedback said no.

And while there’s a risk of not making the best use of data, there are also risks in releasing even anonymised data [8] in today’s world, where jigsawing together the pieces of poorly anonymised data can make it identifying. Profiling or pigeonholing individuals or areas was a concern raised in public engagement work.
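The “jigsaw” risk is easy to demonstrate. Below is a minimal, hypothetical sketch (all names, fields and records are invented for illustration) of how two separately released datasets, neither identifying on its own, can be joined on shared quasi-identifiers to put a name back against a sensitive record:

```python
# Hypothetical illustration of "jigsaw" re-identification: two separately
# released datasets can identify a person when joined on shared
# quasi-identifiers. All names, fields and records here are invented.

# Release A: "de-identified" records (no names) with quasi-identifiers.
release_a = [
    {"postcode": "SW1A", "birth_year": 2004, "sex": "F", "attainment": "below"},
    {"postcode": "M1",   "birth_year": 2003, "sex": "M", "attainment": "above"},
]

# Release B: a separate public source carrying names plus the same fields.
release_b = [
    {"name": "Jane Doe", "postcode": "SW1A", "birth_year": 2004, "sex": "F"},
    {"name": "John Roe", "postcode": "M1",   "birth_year": 2003, "sex": "M"},
]

def reidentify(anon, public, keys=("postcode", "birth_year", "sex")):
    """Link anonymised rows to named rows matching on every quasi-identifier."""
    index = {}
    for row in public:
        index.setdefault(tuple(row[k] for k in keys), []).append(row["name"])
    linked = []
    for row in anon:
        matches = index.get(tuple(row[k] for k in keys), [])
        if len(matches) == 1:  # a unique match identifies the individual
            linked.append((matches[0], row["attainment"]))
    return linked

print(reidentify(release_a, release_b))
# → [('Jane Doe', 'below'), ('John Roe', 'above')]
```

Real individual-level releases hold far more fields (full date of birth, postcode, school), so unique matches are common; that is why “anonymised” data in the wild can still be personal data.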

The Bean Report is used to draw out some of the reasoning behind the need for increased access to data: “Remove obstacles to the greater use of public sector administrative data for statistical purposes, including through changes to the associated legal framework, while ensuring appropriate ethical safeguards are in place and privacy is protected.”

The Report doesn’t outline how appropriate ethical safeguards are to be put in place and privacy protected, or what ‘ethical’ looks like.

‘In the public interest’ is not clear cut.

The boundary between public and private interest shifts over time as well as across cultures. While UK law today says we all have the right to be treated as equals, regardless of our gender, identity or sexuality, it has not always been so.

By putting the rights of the individual on a lower par than the public interest in this change, we risk jeopardising having any data at all to use. Yet data will be central to the digital future strategy with which, we are told, the government wants to “show the rest of the world how it’s done.”

If they’re serious, and if all our future citizens must have a digital identity to use with government with any integrity, then the use of not only our current adult data but our children’s data really matters – and current practices must change. Here’s a case study of why:

Pupil data: The Poster Child of Datasharing Bad Practice

Right now, the National Pupil Database, containing the personal data of 8 million or more children in England, is unfortunately the poster child of what a change in legislation and policy around data sharing can mean in practice. Bad practice.

The “identity of a pupil will not be discovered using anonymised data in isolation”, says the User Guide. But when named data and identifiable data have been given away in all but 11 requests since 2012, it’s not anonymised. That is anything but the ‘anonymised data’ of the publicly announced plans presented in 2011, yet precisely what two legal changes permitted: broadening the range of users under the Prescribed Persons Act 2009, and expanding the purposes in the amended Education (Individual Pupil Information) (Prescribed Persons) Regulations introduced in June 2013. It was opened up to:

“(d)persons who, for the purpose of promoting the education or well-being of children in England are—

(i)conducting research or analysis,

(ii)producing statistics, or

(iii)providing information, advice or guidance,

and who require individual pupil information for that purpose(5);”.

The law was changed so that individual pupil-level data, including pupil names, are extracted, stored and have also been released at national level: raw data sent to commercial third parties, charities and the press as identifiable, individual-level and often sensitive data items.

This is a world away from safe setting, statistical analysis of de-identified data by accredited researchers, in the public interest.

Now our children’s confidential data sit on servers on Fleet Street – is this the model for all our personal administrative data in future?

If not, how do we ensure it is not? How will the new all-datasets’ datasharing legislation permit wider sharing with more people than currently have access and not end up with all our identifiable data sent ‘into the wild’ without audit as our pupil data are today?

Consultation, transparency, oversight and public involvement in ongoing data decision making are key, as is well-written legislation.

The public interest alone is not a strong enough description to keep data safe. This same government brought in its National Pupil Database policy thinking it too was ‘in the public interest’, after all.

We need a charter of ethics and digital rights that focuses on the person, not exclusively the public interest use of data.

They are not mutually exclusive, but enhance one another.

Getting ethics in the right place

These ethical principles start in the wrong place. To me, this is not an ethical framework; it’s a ‘how to do data sharing and avoid repeating care.data’ guideline. Ethics is not first about the public interest, or economic good, or government interest. Instead, as the Nuffield Council’s view has it [9], you start with the person.

“The terms of any data initiative must take into account both private and public interests. Enabling those with relevant interests to have a say in how their data are used and telling them how they are, in fact, used is a way in which data initiatives can demonstrate respect for persons.”

Professor Michael Parker, Member of the Nuffield Council on Bioethics Working Party and Professor of Bioethics and Director of the Ethox Centre, University of Oxford:

“Compliance with the law is not enough to guarantee that a particular use of data is morally acceptable – clearly not everything that can be done should be done. Whilst there can be no one-size-fits-all solution, people should have say in how their data are used, by whom and for what purposes, so that the terms of any project respect the preferences and expectations of all involved.”

The partnership between members of the public and public administration must be consensual if it is to continue to enjoy support. [10] If personal data are used for research or other purposes in the public interest, without explicit consent, that use should be understood by those using the data as a privilege, not a right.

As such, we need to see data as about the person, as they see it themselves: data at the point of collection is information about individual people, not just statistics. Personal data are sensitive, some research uses are highly sensitive, and data used badly can do harm. Designing new patterns of datasharing must consider the private as well as the public interest, co-operating for the public good.

And we need a strong ethical framework to shape that in.

******

[1] http://datasharing.org.uk/2016/01/13/data-sharing-workshop-i-6-january-2016-meeting-note/

[2] Draft data science ethical framework: https://data.blog.gov.uk/wp-content/uploads/sites/164/2015/12/Data-science-ethics-short-for-blog-1.pdf

[3] defenddigitalme campaign to get pupil data in England made safe http://defenddigitalme.com/

[4] On the European Data Protection regulations: https://www.privacyandsecuritymatters.com/2015/12/the-general-data-protection-regulation-in-bullet-points/

[5] Public engagement work – ADRN/ESRC/Ipsos MORI 2014 https://adrn.ac.uk/media/1245/sri-dialogue-on-data-2014.pdf

[6] Written evidence submitted to the parliamentary committee on big data: http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/big-data-dilemma/written/25380.pdf

[7] http://www.bbc.co.uk/news/uk-politics-35300671 Theresa May affirmed bulk datasets use at the IP Bill committee hearing and did not deny use of bulk personal datasets, including medical records

[8] http://www.economist.com/news/science-and-technology/21660966-can-big-databases-be-kept-both-anonymous-and-useful-well-see-you-anon

[9] Nuffield Council on Bioethics http://nuffieldbioethics.org/report/collection-linking-use-data-biomedical-research-health-care/ethical-governance-of-data-initiatives/

[10] Royal Statistical Society –  the data trust deficit https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

Background: Why datasharing matters to me:

When I recently joined the data sharing discussions that have been running for almost two years, I was wearing two hats, both in a personal capacity.

The first was an interest in how any public policy and legislation may be changing and how it will affect de-identified datasharing for academic research, as I am one of two lay people offering public voice on the ADRN approvals panel.

Its aim is to make sure the process of granting access to sensitive, linked administrative data from population-wide datasets is fair, equitable and transparent – for de-identified use by trusted researchers, for non-commercial purposes, under strict controls and in safe settings. Once a research project is complete, the data are securely destroyed. It does not do work that “a government department or agency would carry out as part of its normal operations.”

Wearing my second hat, I am interested in how the new policy and practice will affect current practice. I coordinate the campaign efforts to get the Department for Education to stop giving away the identifiable, confidential and sensitive personal data of our 8m children in England from the National Pupil Database to commercial third parties and the press.

Access to school pupil personal data by third parties is changing

The Department for Education in England and Wales [DfE] has lost control of who can access our children’s identifiable school records, by giving individual and sensitive personal data out to a range of third parties since government changed policy in 2012. It looks now like they’re panicking about how to fix it.

Applicants who want children’s identifiable and/or sensitive personal data from the National Pupil Database must now first obtain the lowest level of criminal record check, a basic DBS check, as part of the access process.

Schools Week wrote about it and asked for comment on the change [1] (as discussed by Owen in his blog [2] and our tweets).

At first glance it sounds like a great idea, but what real difference will this make to who can receive 8 million school pupils’ data?

Yes, you did read that right.

The National Pupil Database gives away the personal data of eight million children, aged 2-19. Gives it away outside its own protection, because users get sent raw data to their own desks. [3]

It would be good to know that people receiving your child’s data hadn’t ever been cautioned or convicted of something related to children in their past, right?

Unfortunately, this DBS check won’t tell the Department for Education (DfE) that – because it’s the basic £25 DBS check [4], not the full version.

So this change seems less about keeping children’s personal data safe than about being seen to do something. Anything. Anything but the thing that needs doing. Which is to keep the data secure.

Why is this not a brilliant solution? 

Moving towards the principle of keeping the data more secure is right, but in practice the DBS check is only useful if it actually makes data safer, by stopping people receiving data who pose a risk of misusing it. So how will this DBS check achieve that? It’s not designed for people who handle data. It’s designed for people working with children.

There is plenty of evidence, often in the news, of data used inappropriately for commercial purposes – often through inappropriate storage and sharing of data, as well as malicious breaches. I am not aware – and refer to this paper [5] – of risks realised through malicious misuse of data for academic purposes in safe settings. Though mistakes do happen, through inappropriate processes, human error and misjudgement.

However, a background check is not necessary for its own sake. What is necessary is to know that users handle children’s data securely and appropriately, with transparent oversight. There is no suggestion at all that people at TalkTalk are abusing data, but their customers’ data were not secure, and those data held in trust are now being misused.

That risk is the harm that is likely to affect a high number of individuals if bulk personal data are not securely managed. Measures to make it so must be proportionate to that risk. [6]

Coming back to what this will mean for individual applicants and its purpose: Basic Disclosure contains only convictions considered unspent under The Rehabilitation of Offenders Act 1974. [7]

The absence of a criminal record does not mean data are securely stored or appropriately used by the recipient.

The absence of a criminal record does not mean data will not be forwarded to another undisclosed recipient, with no way for the DfE to ever know it happened.

The absence of a criminal record on the basic DBS check does not even prove that the person has no previous conviction related to the misuse of people or of data. And any conviction you might consider ‘relevant’ to children, for example, may have expired.


So for these reasons, I disagree that the decision to require a basic DBS check is worthwhile. Why? Because it’s effectively meaningless and doesn’t solve the problem, which is this:

Anyone can apply for 8m children’s personal data, and as long as they meet some purposes and application criteria, they get sent sensitive and identifiable children’s data to their own setting. And they do. [8]

The ‘prescribed persons’ and researchers defined in the 2009 legislation have come to include journalists. Like BBC Newsnight, or Fleet Street papers. Is it right that journalists can access my children’s data, while as pupils and parents we cannot, and are not even informed? Clearly not.

It would be foolish to be reassured by this DBS check. The DfE is kidding itself if it thinks this is a workable or useful solution.

This step is simply a tick box, and it won’t stop the DfE regularly giving away eight million children’s individual-level and sensitive records.

What problem is this trying to solve and how will it achieve it?

Before panicking to implement a change DfE should first answer:

  • who will administer and store potentially sensitive records of criminal convictions, even if unrelated to data?
  • what implications does this have for other government departments handling individual personal data?
  • why are 8m children’s personal and sensitive data given away ‘into the wild’ beyond DfE oversight in the first place?

Until the DfE properly controls the individual personal data flowing out of the NPD – from multiple locations, in raw form – and its governance, it makes little material difference whether the named user is shown to have a previous criminal record or not. [9] Because the DfE has no idea if they are the only person who uses it.

The last line from DfE in the article is interesting: “it is entirely right that we continue to make sure that those who have access to it have undergone the necessary background checks.”

Continue, from not doing it before? That is tantamount to a denial of change, to avoid scrutiny of the past and the status quo. They have no idea who has “access” to our children’s data today after they have released it, except on paper and on trust, as there’s no audit process. [10]

If this is an indicator of the transparency and type of wording the DfE wants to use to communicate with schools, parents and pupils, I am concerned. Instead we need full transparency, assessment of privacy impact, and a public consultation on coordinated changes.

Further, if I were an applicant, I’d be concerned that DfE is currently handling sensitive pupil data poorly, and wants to collect more of mine.

In summary: because of a change in Government policy in 2012, and the way in which it is carried out in practice, the Department for Education in England and Wales [DfE] has lost control of who can access our 8m children’s identifiable school records. Our children deserve proper control of their personal data, and proper communication about who can access it and why.

Discovering through FOI [11] the sensitivity level and volume of identifiable data access journalists are being given, shocked me. Discovering that schools and parents have no idea about it, did not.

This is what must change.

 

*********

If you have questions or concerns about the National Pupil Database or your own experience, or your child’s data used in schools, please feel free to get in touch, and let’s see if we can make this better to use our data well, with informed public support and public engagement.

********

References:
[1] National Pupil Database: How to apply: https://www.gov.uk/guidance/national-pupil-database-apply-for-a-data-extract

[2]Blogpost: http://mapgubbins.tumblr.com/post/132538209345/no-more-fast-track-access-to-the-national-pupil

[3] Which third parties have received data since 2012 (Tier 1 and 2 identifiable, individual and/or sensitive): release register https://www.gov.uk/government/publications/national-pupil-database-requests-received

[4] The Basic statement content http://www.disclosurescotland.co.uk/disclosureinformation/index.htm

[5] Effective Researcher management: 2009 T. Desai (London School of Economics) and F. Ritchie (Office for National Statistics), United Kingdom http://www.unece.org/fileadmin/DAM/stats/documents/ece/ces/ge.46/2009/wp.15.e.pdf

[6] TalkTalk is not the only recent significant data breach of public trust. An online pharmacy that sold details of more than 20,000 customers to marketing companies has been fined £130,000 https://ico.org.uk/action-weve-taken/enforcement/pharmacy2u-ltd/

[7] Guidance on the Rehabilitation of Offenders Act 1974 https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/299916/rehabilitation-of-offenders-guidance.pdf

[8] the August 2014 NPD application from BBC Newsnight https://www.whatdotheyknow.com/request/293030/response/723407/attach/10/BBC%20Newsnight.pdf

[9] CPS Guidelines for offences involving children https://www.sentencingcouncil.org.uk/wp-content/uploads/Final_Sexual_Offences_Definitive_Guideline_content_web1.pdf

[10] FOI request https://www.whatdotheyknow.com/request/pupil_data_application_approvals#outgoing-482241

[11] #saveFOI – I found out exactly how many requests had been fast tracked and not scrutinised by the data panel via a Freedom of Information Request, as well as which fields journalists were getting access to. The importance of public access to this kind of information is a reason to stand up for FOI  http://www.pressgazette.co.uk/press-gazette-launches-petition-stop-charges-foi-requests-which-would-be-tax-journalism

 

Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public, who has ever or who has never used their rights under the Freedom of Information Act laws, the government consultation on changing them that closes today is worth caring about. If you haven’t yet had your say, go and take action here >> now.  If it is all you have time for before the end of today, you can sign 38 degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and why it should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained from standing up for human rights by cost.  The press will be restrained in what they can ask.

MySociety has a brilliant summary.  Michael Sheen spoke up calling it “nothing short of a full frontal attack” on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley Point

Why does FOI matter to me personally? In Education.

Because it’s enabled me to start a conversation to get the Department for Education to improve its handling of the personal and sensitive data of 8 million children that it holds in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight, and to see the panel’s terms of reference, which are still not on its website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner’s Office upheld it.

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions were necessary, using Section 36, when the minutes were finally released.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed about, and fully understand, the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others, I hoped, could use this information to ask the right questions about missing meeting minutes and transparency, and to question why there was no cost-benefit business plan at all, even in private, while the public kept being told of the benefits. And it shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again, and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs’ expenses scandal. Who knows how much future spending on duck houses this Freedom of Information request has saved the taxpayer, since MPs have been required to publish expenses from 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes, and what appeared to the public as silly spending on sundries, were revealed. Mr Cameron apologised in 2009, saying he was “appalled” by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request started a process of increased transparency which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people supportive of these changes say they are concerned about the cost of answering requests, but have perhaps not considered the savings in exceptional cases (like the expenses scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”

Why does the government want to make itself less transparent? Even the Information Commissioner’s office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law, that gives the public rights in our favour and transparency into how we are governed and tax money spent.

How will the value of FOI be measured, and what would be lost if the changes are made?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

Parliament’s talking about Talk Talk and Big Data, like some parents talk about sex. Too little, too late.

Parliament’s talking about TalkTalk and Big Data like some parents talk about sex ed. They should be discussing prevention and personal data protection for all our personal data, not just one company’s, and not only after the event.

Everyone’s been talking about TalkTalk and for all the wrong reasons. Data loss and a 15-year-old combined with a reportedly reckless response to data protection, compounded by lack of care.

As Rory Cellan-Jones wrote [1] rebuilding its reputation with customers and security analysts is going to be a lengthy job.

In Parliament Chi Onwurah, Shadow Minister for Culture & the Digital Economy, summed it up in her question, asking the Minister to acknowledge “that all the innovation has come from the criminals while the Government sit on their hands, leaving it to businesses and consumers to suffer the consequences?” [Hansard 2]

MPs were concerned for the 4 million* customers’ loss of name, date of birth, email, and other sensitive data, and called for an inquiry. [It may now be fewer*.] [3] The SciTech committee got involved too.

I hope this means Parliament will talk about TalkTalk not as the problem to be solved, but as one case study in a review of contemporary policy and practices in personal data handling.

Government spends money on data protection work through the “National Cyber Security Programme” [NCSP]. [4] What is the measurable outcome – particularly for TalkTalk customers and public confidence – from its £860M budget? If you look at the breakdown of those sums, with little going towards data protection and security compared with the Home Office and Defence, we should ask whether government is spending our money in an appropriately balanced way on the different threats it perceives. Keith Vaz suggested British companies lose £34 billion every year to cybercrime. Perhaps this question will come into the inquiry.

This all comes after things have gone wrong. Again. [5] An organisation we trusted has abused that trust by not looking after data with the stringency customers should be able to expect in the 21st century, and reportedly by not making preventative changes for weaknesses apparent a year ago. Will there be consequences this time?

The government now saying it is talking about data protection and consequences, is like saying they’re talking sex education with teens, but only giving out condoms to the boys.

It could be too little too late. And they want above all to avoid talking about their own practices. Let’s change that.

Will this mean a review to end risky behaviour, bring in change, and be wiser in future?

If MPs explore what the NCSP does, then we, the public, should learn more about what government’s expectations of commercial companies are as regards modern practices.

In addition, any MPs’ inquiry should address government’s own role in its own handling of the public’s personal data. Will members of government act in a responsible manner or simply tell others how to do so?

Public discussion around both commercial and state use of our personal data should mean genuine public engagement. It should involve a discussion of consent where data are used for purposes beyond those we expect, or beyond those explained when we submit our data. And there needs to be a change in risky behaviour in physical storage and release practices, or all the talk is wasted.

Some say TalkTalk’s practices mean they have broken their contract along with consumer trust. Government departments should also be asking whether their own data handling would constitute a breach of the public’s trust and reasonable expectations.

Mr Vaizey should apply his same logic to government handling data as he does to commercial handling. He said he is open to suggestions for improvement. [6]

Let’s not just talk about TalkTalk.

    • Let’s Talk Consequences: organisations taking risk seriously and meaningful consequences if not [7]
    • Let’s Talk Education: the education of the public on personal data use by others and rights and responsibilities we have [8]
    • Let’s Talk Parliament’s Policies and Practices: about government’s own comparable lack of data understanding, and what good practice is in physical storage, good governance and transparent oversight
    • Let’s Talk Public Trust: and the question whether government can be trusted with public data it already has and whether its current handling makes it trustworthy to take more [9]

Vaizey said of the ICO now in his own department: “The Government take the UK’s cyber-security extremely seriously and we will continue to do everything in our power to protect organisations and individuals from attacks.”

“I will certainly meet the Information Commissioner to look at what further changes may be needed in the light of this data breach. [..] It has extensive powers to take action and, indeed, to levy significant fines.”

So what about consequences when data are used in ways the public would consider a loss, and not through an attack or a breach, but government policy? [10]

Let’s Talk Parliament’s Policies and Practices

Commercial companies are not alone in screwing up the use, processing [11] and management of our personal data. The civil service under current policy seems perfectly capable of doing so by itself. [12]

Government data policy has not kept up with 21st century practices and to me seems to work in the dark, as Chi Onwurah said,

‘illuminated by occasional flashes of incompetence.’

This incompetence can risk harm to people’s lives, to business and to public confidence.

And once given, trust would be undermined by changing the purposes or scope of use for which it was given, as for example care.data plans to do after the pilot. A most risky idea.

Trust in these systems, whether commercial or state, is crucial. Yet reviews which highlight this and make suggestions to support trust, such as ‘data should never be (and currently is never) released with personal identifiers’ in The Shakespeare Review, have been ignored by government.

Where our personal data are not used well in government departments, by the departments themselves, they seem content to date to rely on public ignorance to get away with current shoddy practices.

Practices such as not knowing who all your customers are, because you pass data on to others. Practices such as giving individual level, identifiable personal data to third parties without informing the public or asking for consent. Practices such as never auditing or measuring any benefit of giving away others’ personal data.

“It is very important that all businesses, particularly those handling significant amounts of sensitive customer data, have robust procedures in place to protect those data and to inform customers when there may have been a data breach.” Ed Vaizey, Oct 26th, HOC

If government departments prove to be unfit to handle the personal data we submit in trust to the state today, would we be right to trust them with even more?

While the government is busy wagging fingers at poor commercial data practices, the care.data debacle is evidence that not all its MPs or civil servants understand how data are used in commercial business or through government departments.

MPs calling for commercial companies to sharpen up their data protection must understand how commercial use of data often piggy-backs on the public use of our personal data, and how others get access to it via government for purposes that were unintended.

Let’s Talk Education

If the public is to understand how personal data are to be kept securely with commercial organisations, why should they not equally ask to understand how the state secures their personal data? Educating the public could lead to better engagement with research, better understanding of how we can use digital services and a better educated society as a whole. It seems common sense.

At a recent public event [13], civil servants announced big upcoming data plans, linking school data with further education and employment data. I asked how they planned to involve the people whose data they would use. There was no public engagement to mention. Why not? That is inexcusable in this climate.

Public engagement is a matter of trust and developing understanding in a relationship. Organisations must get this right.[14]

If government is discussing risky practices by commercial companies, it also needs to look closer to home and fix what is broken in government data handling, where it exposes us to risk through loss of control of our personal data.

The National Pupil Database, for example, stores and onwardly shares identifiable, individual, sensitive data from at least 8m children’s records, ages 2 to 19. That’s twice as big as the TalkTalk loss was first thought to be.

Prevention, not just protection, is what we should champion. Rather than protection after the event, MPs and the public must demand an emphasis on preventative measures in our personal data use.

This week sees more debate on how and why the government will legislate to have more powers to capture more data about all the people in the country. But are government policy, process and practices fit to handle our personal data, what they do with it and who they give it to?

Population-wide data gathering and surveillance in any of its many forms is no less real just because you don’t see it. Children’s health, schools, increases in the volume of tax data collection. We don’t discuss enough how these policies can be used every day without the right oversight. MPs are like the conservative parents not comfortable talking to their teens about sleeping with someone. Just because you don’t know, it doesn’t mean they’re not doing it. [15] It just means you don’t want to know, because if you find out they’re not doing it safely, you’ll have to do something about it.

And it might be awkward. (Meanwhile in schools, real, meaningful PSHE has been left off the curriculum.)

Mr. Vaizey asked in the Commons for suggestions for improvement.

My suggestion is this. How government manages data has many options, but the principle should be simple. Our personal data need not only to be protected, but also not to be exposed to unnecessary risk in the first place, by commercial or state bodies. Doing nothing is not an option.

Let’s Talk about more than TalkTalk

Teens will be teens. If commercial companies can’t manage their systems well enough to prevent a child successfully hacking them, then it’s not enough to point at criminal behaviour. There is fault to learn from on all sides, in both commercial and state uses of personal data.

There is talk of new, and bigger, data sharing plans. [16]

Will the government keep its fingers crossed each month, waiting to see whether our data are used safely in unsecured settings by the unknown partners with whom data might be onwardly shared, hoping we won’t find out and it won’t need to talk about it? Or will it have a grown up public debate based on public education?

Will it put preventative measures in place appropriate to the sensitivity and volume of the data it is itself responsible for?

Will moving forward with new plans mean safer practices?

If government genuinely wants our administrative data at the heart of digital government fit for the 21st century, it must first understand how all government departments collect and use public data. And it must educate the public in this and commercial data use.

We need a fundamental shift in the way the government respects public opinion, and a shift towards legal and privacy compliance – both of which are lacking.

Let’s not talk about TalkTalk. Let’s have meaningful grown up debate with genuine engagement. Let’s talk about prevention measures in our data protection. Let’s talk about consent. It’s personal.

******

[1] Questions for TalkTalk: http://www.bbc.co.uk/news/technology-34636308

[2] Hansard: http://www.publications.parliament.uk/pa/cm201516/cmhansrd/cm151026/debtext/151026-0001.htm#15102612000004

[3] TalkTalk update: http://www.talktalkgroup.com/press/press-releases/2015/cyber-attack-update-tuesday-october-30-2015.aspx

[4] The Cyber Security Programme: http://www.civilserviceworld.com/articles/feature/depth-look-national-cyber-security-programme

[5] Paul reviews TalkTalk; https://paul.reviews/value-security-avoid-talktalk/

[6] https://ico.org.uk/for-organisations/guide-to-data-protection/conditions-for-processing/

[7] Let’s talk Consequences: the consequences of current failures to meet customers’ reasonable expectations of acceptable risk, are low compared with elsewhere.  As John Nicolson (East Dunbartonshire) SNP pointed out in the debate, “In the United States, AT&T was fined £17 million for failing to protect customer data. In the United Kingdom, the ICO can only place fines of up to £500,000. For a company that received an annual revenue of nearly £1.8 billion, a fine that small will clearly not be terrifying. The regulation of telecoms must be strengthened to protect consumers.”

[8] Let’s talk education: FOI request revealing a sample of individual level data released to members of the press: http://www.theyworkforyou.com/debates/?id=2015-10-26b.32.0

The CMA brought out a report in June on the use of consumer data. The topic should be familiar in Parliament, but little engagement has come about as a result. The report suggested the benefit:

“will only be realised if consumers continue to provide data and this relies on them being able to trust the firms that collect and use it”, and that “consumers should know when and how their data is being collected and used and be able to decide whether and how to participate. They should have access to information from firms about how they are collecting, storing and using data.”

[9] Let’s Talk Public Trust – are the bodies involved trustworthy? Government lacks an effective data policy and is resistant to change. Yet it wants to collect ever more personal, individual level data for unknown purposes from the majority of 60m people, with an unprecedented PR campaign. When I heard the words ‘we want a mature debate’ it was reminiscent of HSCIC’s ‘intelligent grown up debate’ requested by Kingsley Manning, in a speech in which he admitted lack of public knowledge was akin to a measure of past success, and that effectively they would rather have kept the use of population wide health data ‘below the radar’.

Change: We need change; the old way, after all, didn’t work, according to Minister Matt Hancock: “The old model of government has failed, so we will build a new one.” I’d like to see what that new one will look like. Does he mean to expand only data sharing policy, or the powers of the civil service?

[10] National Pupil Database detailed data releases to third parties https://www.whatdotheyknow.com/request/pupil_data_national_pupil_databa

[11] http://adrn.ac.uk/news-events/latest-news/adrn-rssevent

[12] http://jenpersson.com/public-trust-datasharing-nib-caredata-change/

[13] https://www.liberty-human-rights.org.uk/human-rights/privacy/state-surveillance

[14] http://www.computerweekly.com/news/4500256274/Government-will-tackle-barriers-to-sharing-and-linking-data-says-Cabinet-Office-minister-Hancock

Building Public Trust [4]: “Communicate the Benefits” won’t work for care.data

Communicating the benefits, care.data’s response to the failed communications of spring 2014, has failed to deliver public trust. Here’s why:

To focus on the benefits is a shortcut for avoiding the real issues

Talking about benefits is about telling people what the organisation wants to tell them. This fails to address what the public and professionals want to know. The result is not communication, but a PR exercise.

Talking about benefits in response to the failed communications of spring 2014, while failing to address criticism since, ignores the concerns that public and professionals raised at macro and micro level. It appears disingenuous about real engagement despite saying ‘we’re listening’, and seems uncaring.

Talking only about the benefits provides no demonstration that they outweigh the potential risks of individual and public health harm: loss of trust in the confidential GP relationship, data inaccuracy, or data loss. Ignoring these risks seems unrealistic.

Talking about short term benefits and not long term solutions [to the broken opt out, long term security, long term scope change of uses and users and how those will be communicated] does not demonstrate competency or reliability.

Talking only about the benefits of commissioning and research for the merged dataset CES doesn’t mention all the secondary uses to which all HSCIC patient level health data are put [those reflected in the Type 2 opt out], including commercial re-use and the National Back Office: “2073 releases made from the National Back Office between April 2013 and December 2013. This includes 313 releases to police forces, 1531 to the Home Office and 229 to the National Crime Agency.” [HSCIC, July 2, 2014]

This use of hospital records and other secondary data by the back office, without openly telling the public, does not feel ethical or transparent.

Another example is past patient communications that expressly said ‘we do not collect name’, apparently intended to assure patients of anonymity, without mentioning that name is already stored at the HSCIC on the Personal Demographics Service, or that name is not needed for data to be identifiable.

We hear a lot about transparency. But is transparent the same as fully accurate, complete and honest? Honest about the intended outcomes of the programme. Honest about all the uses to which health data are put. Honest about potential future scope changes and those already planned.

Being completely truthful in communications is fundamental to future-proofing trust in the programme.

NHS England’s care.data programme, through its focus on ‘the benefits’, lacks balance and appears disingenuous, uninterested, unrealistic and lacking in reliability, competency and honesty. Through these actions it does not demonstrate that the organisation is trustworthy. This could be changed.

care.data fundamentally got it wrong with the intention not to communicate the programme at all. It got it wrong in the tool and tone of communications in the patient leaflet. There is a chance to get it right now, if the organisation would only stop the focus on communicating the benefits.

I’m going to step through with a couple of examples why to-date, some communications on care.data and use of NHS data are not conducive to trust.

Communication designed to ‘future-proof’ an ongoing relationship and trust must be by design, not afterthought.

Communications need to start addressing the changes that are happening, how they make people feel, and the changes that create concern – among the public and professionals – not the goals that the organisation has.

Sound familiar? Communications to date have been flawed in the same way that the concept of ‘building trust’ has been flawed. It has aimed to achieve the wrong thing and with the wrong audience.

Communications in care.data need to stop focusing on what the organisation wants from the public and professionals – the benefits it sees in getting data – and instead address, firstly at a macro level, why the change is necessary and why the organisation should be trusted to bring it about.

When explaining benefits there are clearly positives to be had from using primary and secondary data in the public interest. But what benefits will be delivered in care.data that are not already on offer today?

Why, if commissioning is done today with less identifiable data, can there be no alternative to the care.data level of identifiable data extraction? Why, if the CPRD offers research in both primary and secondary care today, will care.data offer better research possibilities? Secondly, at a micro level, communications must address the questions individuals asked up and down the country in 2014.

What’s missing and possible to be done?

  1. aim to meet genuine ongoing communication needs not just legal data protection fair processing tick-boxes
  2. change the organisational attitude, to encourage people to ask what they each want to know at macro and micro level – why the programme at all, and what’s in it for me? What’s new, and what benefit differs from the status quo? This is only possible if you will answer what is asked.
  3. deliver robust explanations of the reason why the macro and micro benefits demonstrably outweigh the risk of individual potential harms
  4. demonstrate reliability, honesty and competency, and that you are trustworthy
  5. agree how scope changes will trigger communication to ‘future-proof’ an ongoing relationship and trust by design.

As the NIB work stream on Public Trust says, “This is not merely a technical exercise to counter negative media attention; substantial change and long-term work is needed to deliver the benefits of data use.”

If they’re serious about that long term work, then why continue to roll out pathfinder communications based on a model that doesn’t work, with an opt out that doesn’t work? Communications isn’t an afterthought to public trust. It’s key.

If you’re interested in details and my proposals for success in communications I’ve outlined in depth below:

  • Why Communicate Changes at all?
  • What is change in care.data about?
  • Is NHS England being honest about why this is hard?
  • Communicate the Benefits is not working
  • A mock case study in why ‘communicate the benefits’ will fail
  • Long term trust needs a long term communications solution
  • How a new model for NHS care.data Communication could deliver


Building Public Trust in care.data datasharing [3]: three steps to begin to build trust

Let’s assume the question of public trust is as important to those behind data sharing plans in the NHS [1] as they say it is. That the success of the care.data programme today and as a result, the very future of the NHS depends upon it.

“Without the care.data programme, the health service will not have a future,” said Tim Kelsey, national director for patients and information, NHS England. [12]

And let’s assume we accept that public trust is not about the public, but about the organisation being trustworthy.[2]

The next step is to ask, how trustworthy is the programme and organisation behind care.data? And where and how do they start to build?

The table discussion on [3] “Building Public Trust in Data Sharing” considered “what is the current situation?” and “why?”

What’s the current situation? On trust, public opinion is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are low for the state and government, but higher for GPs. It is therefore important that the medical profession themselves trust the programme in principle and practice. They are, after all, the care.data point of contact for patients.

The current status of the rollout, according to news reports, is that pathfinder practices are preparing to roll out [4] communications in the next few weeks. Engagement is reportedly being undertaken ‘over the summer months’.

Understanding both public trust and the current starting point matters as the rollout is moving forwards and as leading charity and research organisation experts said: “Above all, patients, public and healthcare professionals must understand and trust the system. Building that trust is fundamental. We believe information from patient records has huge potential to save and improve lives but privacy concerns must be taken seriously. The stakes are too high to risk any further mistakes.” [The Guardian Letters, July 27, 2015]

Here are three steps I feel could be addressed in the short term, to start to demonstrate why the public and professionals should trust both organisation and process.

What is missing?

1. Opt out: The type 2 opt out does not work. [5]  

2a. Professional voices called for answers and change: As mentioned in my previous summary, various bodies called for change, including the BMA, whose policy [6] remains that care.data should be on a patient opt-in basis.

2b. Public voices called for answers and change: care.data’s own listening event feedback [7] concluded there was much more that needed doing than ‘communicate the benefits’. Much is missing: questions on confusion between SCR and care.data, legislation and concern over controlling its future change, GPs’ concerns about their ethical stance, the Data Guardian’s statutory footing, correction of mistakes, future funding and more.
How are these open questions being addressed? If at all?

3. A single clear point of ownership of data sharing and public trust communications: Is this now the NIB, the NHS England Patients and Information Directorate, or the DH? Who owns care.data now? It’s hard to ask questions if you don’t know where to go, and the boards seem to have stopped any public communications. Why? The public needs clarity of organisational oversight.

What’s the Solution? 

1. Opt out: The type 2 opt out does not work. See the post graphic: the public wanted more clarity over opt out in 2014, so this needs explaining clearly. >> Solution: follows below, from a detailed conversation with Mr. Kelsey.

2. Answers to professional opinions: The Caldicott panel raised 27 questions in areas of concern in their report. [8] No response addressing them has yet been made available in the public domain by NHS England. Ditto the APPG report, the BMA LMC vote, and others. >> Solution: publish the responses to these concerns and demonstrate what actions are being taken to address them.

2b. Fill in the lack of transparency: There is no visibility of any care.data programme board meeting minutes or materials from 2015. In eight months, nothing has been published. Their 2014 proposal for transparency appears to have come to nothing. Why? The minutes from June to October 2014 are also missing entirely, and the October to December 2014 materials published were heavily redacted. There is a care.data advisory board, which seems to have had little public visibility recently either. >> Solution: the care.data programme business case must be detailed and open to debate in the public domain by professionals and public. Scrutiny of its current costs and time requirements, and of its ongoing financial implications at all levels, should be welcomed by national, regional (CCG) and local level providers (GPs). Proactive publishing creates demonstrable reasons why both the organisation and the plans are trustworthy. Refusing this without clear justification seems counterproductive, which is why I have challenged it in the public interest. [10]

3. Address public and professional confusion over ownership: Since data sharing and public trust are two key components of the care.data programme, it seems to come under the NIB umbrella, but there is a care.data programme board [9] of its own, with a care.data Senior Responsible Owner and Programme Director. >> Solution: an overview of where all the different nationally driven NHS initiatives fit together, and their owners, would be helpful.

[Anyone got an interactive Gantt chart for all national level driven NHS initiatives?]

This would also help public and professionals see how and why different initiatives have co-dependencies. It could be a tool to reduce the ‘them and us’ mentality, and useful for modelling what-if scenarios and reality checks on 5YFV roadmaps: for example, if care.data pushes back six months, what else is delayed?

If the public can understand how things fit together, they are more likely to ask questions, and an engaged public is more likely to be a supportive public. Criticism can be quashed if it’s incorrect. If it is justified criticism, then act on it.

Yes, these are hard decisions. Yes, to delay again would be awkward. If it were the right decision, would it be worse to ignore it and carry on regardless? Yes.

The most important of the three steps in detail: a conversation with Mr. Kelsey on Type 2 opt out. What’s the Solution?

We’re told “it’s complicated.” I’d say “it’s simple.” Here’s why.

At the table of about fifteen participants at the Bristol NIB event, Mr. Kelsey spoke very candidly and in detail about consent and the opt out.

On the differences between consent in direct care and other uses he first explained the assumption in direct care. Doctors and nurses are allowed to assume that you are happy to have your data shared, without asking you specifically. But he said, “beyond that boundary, for any other purpose, that is not a medical purpose in law, they have to ask you first.”

He went on to explain that what has changed the whole dynamic of the conversation is that the current Secretary of State decided that when your data are shared for purposes other than your direct care, you not only have the right to be asked, but if you say you don’t want them shared, that decision has to be respected by your clinician.

He said: “So one of the reasons we’re in this rather complex situation now, is because if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Therefore I asked him where the public stands with that now, because at the moment there are ca. 700,000 people who we know said no in spring 2014.

Simply: they opted out of their data being used for secondary purposes, and HSCIC continues to share their data.

“Is anything more fundamentally damaging to trust, than feeling lied to?”

Mr. Kelsey told the table there is a future solution, but asked us not to tweet when. I’m not sure why, it was mid conversation and I didn’t want to interrupt:

“we haven’t yet been able to respect that preference, because technically the Information Centre doesn’t have the digital capability to actually respect it.”

He went on to say that they have hundreds of different databases and that, at the moment, it takes 24 hours for a single person’s opt out to be respected across all of them: a person has to manually enter a field on each database to record that someone has opted out. He asked that the hoped-for timing not be tweeted, but explained that all the historic objections which have been registered will be respected at a future date.

One of the other attendees expressed surprise that GP practices hadn’t been informed of that, having gathered consent choices in 2014, and suggested the dissent code could be extracted now.

The table discussion then took a different turn with other attendee questions, so I’m going to ask here what I would have asked next in response to his statement, “if it’s for analysis, not only should you be asked, but also when you say no, it means no.”

Where is the logic to proceed with pathfinder communications?

What was said has not been done and you therefore appear untrustworthy.

If there will be a future solution, it will need communicating (again).

“Trust is not about the public. Public trust is about the organisation being trustworthy.”

There needs to be demonstrable action that what the org said it would do, the org did. Respecting patient choice is not an optional extra. It is central in all current communications. It must therefore be genuine.

Knowing that what was promised was not respected might mean millions of people choose to opt out who would not do so if the process worked as communicated.

Until then, any public communications in Blackburn with Darwen, Somerset, Hampshire and Leeds surely don’t make sense.

Either the pathfinders will test the same communications that are to be rolled out nationally, or they will not. Either those communications will explain the secondary uses opt out, or they will not. Either they will explain the opt out as it is [type 2, not working] or as they hope it will be in future [working]. Not all of these can be true.

People who opt out on the basis of a process broken by a technical flaw are unlikely ever to opt back in again. If it works to start with, they might choose to stay in.

Or will the communications roll out in the pathfinders with a forward looking promise, repeating what was promised but has not yet been done? We will respect your preference (and this time we really mean it)? Would public trust survive that level of uncertainty? In my opinion, no.

There needs to be demonstrable action in future as well, that what the org said it would do, the org did. So the data use audit report, and how any future changes will be communicated, both seem basic principles to clarify for the current rollout as well.

So what’s missing and what’s the solution on opt out?

We’re told “it’s complicated.” I say “it’s simple.” The promised opt out must work before moving forward with anything else. If I’m wrong, then let’s get the communications materials out for broad review, to see how they accommodate this and the future re-communication of a second process.

There must be a budgeted and planned future change communication process.

So how trustworthy is the programme and organisation behind care.data?

Public opinion on trust levels is measurable. The Royal Statistical Society’s Data Trust Deficit shows that the starting points are clear. The current position must address the opt out issue before anything else. Don’t say one thing and do another.

To score more highly on the ‘trustworthy scale’ there must be demonstrable action, not simply more communications.

Behaviours need to change and to be modelled in practice, focusing on people, not on tools and tech solutions which make patients feel as if they are less important to the organisations than their desire to ‘enable data sharing’.

Actions need to demonstrate that they are ethical and robust enough for a 21st century solution.

Policies, practical steps and behaviours all play vital roles in demonstrating that the organisations and people behind care.data are trustworthy.

These three suggestions are short term, by that I mean six months. Beyond that further steps need to be taken to be demonstrably trustworthy in the longer term and on an ongoing basis.

Right now, do I trust that the physical security of HSCIC is robust? Yes.

Do I trust that the policies in the programme would not pass my data in the future to third party commercial pharma companies? No.
Do I believe that, to enable commissioning, my fully identifiable confidential health records should be stored indefinitely with a third party? No.
Do I trust that the programme would not potentially pass my data to non-health organisations, such as the police or the Home Office? No.
Do I trust the programme to tell me if the purposes are changed from those outlined now? No.

I am open to being convinced.

*****

What is missing from any communications to date, why it looks unlikely to be included in the current round, and why that matters, I address in my next post, Building Public Trust [4]: why ‘Communicate the Benefits’ won’t work for care.data. Why a future change management model of consent needs to be approached now, and not after the pilot, I wrap up in [5]: Future solutions.

Continue reading “Building Public Trust in care.data datasharing [3]: three steps to begin to build trust” »

Building Public Trust in care.data sharing [1]: Seven step summary to a new approach

Here’s my opinion after taking part in the NIB #health2020 Bristol event on 24/7/2015 and the presentation of plans at the King’s Fund hosted event in June. Data sharing includes plans for the extraction and use of primary care data by third parties, charging ahead under the care.data banner.

Wearing my hat from a previous role in change management and communications, I share my thoughts in the hope the current approach can adapt and benefit from outside perspectives.

The aim of “Rebuilding and sustaining Public trust” [1] needs to be refocused to treat the cause, not only the symptoms, of the damage done in 2014. Here’s why:

A Seven Step Top Line Summary

1. Abstract ‘public trust’ is not vital to the future of data sharing. Being demonstrably worthy of public trust is.

2. Data-sharing is not vital to future-proof the NHS. Using knowledge wisely is.

3. A timed target to ‘get the public’s data’, is not what is needed. Having a stable, long term future-proofed and governable model is.

4. Tech solutions do not create trust. Enabling the positive human response to what the organisation asks of people does: their confident ‘yes’ to data-sharing. [It might be supported by technology-based tools.]

5. Communications that tell the public ‘we know best, trust us’ fail. While professional bodies [the BMA [2], the GPES advisory group, the APPG report calling for a public benefits plan, the ICO, and expert advice such as Caldicott] are ignored or remain to be acted upon, it is hard for the public to see how the programme’s needs, motives and methods are trustworthy. The [Caldicott 2] Review Panel found that “commissioners do not need dispensation from confidentiality, human rights & data protection law.” [3] Something’s gotta give. What will it be?

6. care.data consistency. Relationships must be reliable and have integrity.
“Trust us – see the benefits” [But we won’t share the business cost/benefit plan.]
“Trust us – we’re transparent” [But there is nothing published in 2015 at all from the programme board minutes] [4]
“Trust us – we’ll only use your data wisely, with the patient in control” [Ignore that we didn’t before [5] and that we still share your data for secondary uses even if you opted out [6] and no, we can’t tell you when it will be fixed…]

7. Voices do not exist in a vacuum. Being trustworthy on care.data does not stand alone, but is part of the NHS ‘big picture’.
Department of Health to GPs: “Trust us about data sharing.” [And ignore that we haven’t respected many of your judgements or opinions.]
NHS England to GPs: “Trust us about data sharing.” [And ignore our lack of general GP support: MPIG withdrawal, misrepresentation in CQC reports.]
NHS England and Department of Health to professionals and public: “The NHS is safe in our hands.”
Everyone: “We see no evidence that plans for cost savings, 7 day working, closures and the 5YFV integration will bring the promised benefits. Let us ‘see the holes’, so that we can trust you based on evidence.”

See the differences?

Target the cause, not the symptom:

The focus in the first half, the language used by NHS England/NIB/DH, sets out their expectations of the public: “You must trust us, and give us your data.”

The focus should instead be on the second half: a shift to the organisation, NHS England/NIB/DH, setting out expectations from the public point-of-view. Enable the public to trust the organisation. Enable individual citizens to trust what is said by individual leaders. This will enable citizens to be consensual sharers in the activity the organisation imposes – the demand for care.data through a statutory gateway, obliging GPs to disclose patient data.

The fact that trust is broken – and that, specifically on data-sharing, there is a deficit [A] between how much the public trusts the organisation and how the organisation handles data – is not the fault of the public, or of the “1.4M NHS staff”, or the media, or patient groups’ pressure. It’s based on proven experience.

It’s based on how organisations have handled data in the past. [5] Specifically on the decisions made by DH, and the Information Centre and leaders in between. Those who chose to sell patient data without asking the public.

The fact that trust is broken is also based on how leadership individuals in those organisations have responded to that, often taking no responsibility for the loss.

No matter how often we hear “commissioners will get a better joined up picture of care needs and benefit you”, it does not compensate for past failings.

Only demonstrable actions to show why it will not happen in future can start that healing process.

Target the timing to the solution, not a shipping deadline

“Building trust to enable data sharing” aims at quick fixes, when what is needed is a healing process and ongoing relationship maintenance.

Timing has to be tailored to what needs to be done, not to an ‘artificial deadline’. That has been said, but it doesn’t seem to match reality.

Addressing the Symptoms and not the Cause, will not find a Cure

What needs to be done?

Lack of public trust and the data trust deficit [A] are symptoms, found in the public, to be understood. But it is the causes, in the organisations, that must be treated.

So far, many NHS England staff I have met in relation to care.data appear to have a “them and us” mentality. It’s almost tangible, wrapped up in the language used at these meetings, or in defensive derision of public concerns: “tin foil hat wearers”, “Luddites” [7] and my personal favourite, ‘consent fetishists.’ [8] It’s counterproductive, and seems borne of either a lack of understanding, or frustration.

The NIB/DH/NHS England/P&I Directorate must accept that they cannot force consensual change in an emotion-based belief held by the public and grounded in past experience.

Those people each have different starting points of knowledge and beliefs.  As one attendee said, “There is no single patient replicated 60 million times.”

The NIB/DH/NHS England/ P&I Directorate can only change what they themselves can control. They have to model and be seen to model change that is trustworthy.

How can an organisation demonstrate it is trustworthy?

This means shifting the focus of the responsibility for change from public and professionals, to leadership organisation.

There is a start in this work stream, but there is little new that is concrete.

The National Data Guardian (NDG) role has been going to be put on a legal footing “at the earliest opportunity” since November 2014. [9] Nine months.

Updated information governance guidance is on the way.

Then there are two really strong new items that would underpin public trust, to be planned in a ‘roadmap’: first, a system that can record and share consent decisions, and second, a means to provide information on the use to which an individual’s data has been put.

How and when those two keystones of public trust will actually be offered appears unknown. They would encourage public trust by enabling choice and control over our data. So I would ask: if we’re not there yet on the roadmap, how can consent options be explained to the public in care.data communications, when there is as yet no mechanism to record and effect them? More on that later.

Secondly, when will a usage report be available? That will be the proof to demonstrate that what was offered, was honoured. It is one of the few tools the organisation(s) can offer to demonstrate they are trustworthy: you said, we did. So again, why jeopardise public trust by rolling out data extractions into the existing, less trustworthy environment?

How well this is done will determine whether the programme can realise its hoped-for benefits. How the driving leadership influences that outcome will come down to the organisational approach to opt out, decisions on care.data communications content, the ways and channels in which they are communicated, accepting what has not worked to date, and planning long-term approaches to communicating change before the pathfinders start. [Detailed steps on this follow.]

Considering the importance we have been told the programme has, it’s vital to get this right. [10]

I believe that changing the approach, from explaining benefits and focusing on public trust, to demonstrating changes that make the organisation worthy of that trust, will make all the difference.

So before rolling out the next data sharing steps, think hard about what the possible benefits and risks will be, versus waiting for a better environment in which to do it.

Conclusion: Trust is not about the public. Public trust is about the organisation being trustworthy. Over to you, orgs.

####

To follow, for those interested in nitty gritty, some practical suggestions for progress in Building Public Trust in data sharing:

This is Part one: A seven step top line summary – What I’d like to see change addressing public trust in health data sharing for secondary purposes.

Part two: a New Approach is needed to understanding Public Trust. For those interested in a detailed approach to trust: what practical and policy steps influence trust, on research and commissioning. Trust is not homogeneous. Trust is nuanced even within the single relationship between one individual and another. It doesn’t exist in a vacuum.

Part three: Know where you’re starting from. What behaviours influence trust. Fixing what has already been communicated is vital before new communications get rolled out – vital to the content of those communications, and vital for public trust and credibility.

Part four: Communicate the Benefits won’t work – how communications influence trust. For those interested in more in-depth reasons, I outline why the communications approach is not working, why the focus on ‘benefits’ is wrong, and what the fixes are.

Part five: Future solutions – why a new approach may work better for future trust: not attempting to rebuild trust where there is now none, but strengthening what is already trusted and fixing today’s flawed behaviours. Honesty and reliability are vital to future-proofing trust.

####

Background References:

I’m passionate about people using technology to make their jobs and lives better, simpler, and about living well. So much so, that this became over 5000 words. To solve that, I’ve assumed a baseline knowledge and I will follow up with separate posts on why a new approach is needed to understanding “Public Trust”, to “Communicating the benefits” and “Being trustworthy and other future solutions”.

If this is all new, welcome, and I suggest you look over the past 18 months’ posts, which include public voice captured from eight care.data events in 2014. care.data is about data sharing for secondary purposes, not direct care.

[1] NHS England October 2014 http://www.england.nhs.uk/2014/10/23/nhs-leaders-vision/

[2] BMA LMC Vote 2014 http://bma.org.uk/news-views-analysis/news/2014/june/patients-medical-data-sacrosanct-declares–bma

[3] Caldicott Review 2: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf

[4] Missing Programme Board documents: 2015 and June-October 2014

[5] HSCIC Data release register

[6] Telegraph article on Type 2 opt out http://www.telegraph.co.uk/news/health/news/11655777/Nearly-1million-patients-could-be-having-confidential-data-shared-against-their-wishes.html

[7] Why Wanting a Better Care.Data is not Luddite: http://davidg-flatout.blogspot.co.uk/2014/04/why-wanting-better-caredata-is-not.html

[8] Talking to the public about using their data is crucial- David Walker, StatsLife http://www.statslife.org.uk/opinion/1316-talking-to-the-public-about-using-their-data-is-crucial

[9] Dame Fiona Caldicott appointed in new role as National Data Guardian

[10] Without care.data health service has no future says director http://www.computerweekly.com/news/2240216402/Without-Caredata-we-wont-have-a-health-service-for-much-longer-says-NHS

Polls of public feeling:

[A] Royal Statistical Society Data Trust Deficit http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers

[B] Dialogue on data – work carried out through the ADRN