Tag Archives: transparency

Destination smart-cities: design, desire and democracy (Part one)

When I drop my children at school in the morning I usually tell them three things: “Be kind. Have fun. Make good choices.”

I’ve been thinking recently about what a positive and sustainable future for them might look like. What will England be in 10 years?

The #Sprint16 snippets I read talk about how “digital is changing how we deliver every part of government,” and about “harnessing the best of digital and technology, and the best use of data to improve public services right across the board.”

From that, three things jumped out at me:

  • The first is that the “best use of data” in the government’s opinion may conflict with that of the citizen.
  • The second is how to define “public services” right across the board, in a world in which the boundaries between private and public provision of services have become increasingly blurred.
  • And the third is the power of tech to offer both opportunity and risk if used in “every part of government”, and its effects on access to, involvement in, and the long-term future of, democracy.

What’s the story so far?

In my experience so far of trying to be a digital citizen “across the board” I’ve seen a few systems come and go. I still have my little floppy paper Government Gateway card, navy blue with yellow and white stripes. I suspect it is obsolete. I was a registered Healthspace user, and used it twice. It too, obsolete. I tested my GP online service. It was a mixed experience.

These user experiences are shaping how I interact with new platforms and my expectations of organisations, and I will be interested to see what the next iteration, nhs alpha, offers.

How platforms and organisations interact with me, and my data, is however increasingly assumed without consent. This involves new data collection: not only using data from administrative or commercial settings to which I have agreed, but new scooping-up of personal data all around us in “smart city” applications.

Just having these digital applications will be of no benefit in itself; all the disadvantages of surveillance for its own sake will simply be realised.

So how do we know how all these collected data are used – and by whom? How do we ensure that all the tracking actually gets turned into knowledge about pedestrian and traffic flow to make streets and roads safer and smoother in their operation, to make street lighting more efficient, or the environment better to breathe in and enjoy? And that we don’t just gift private providers tonnes of valuable data which they simply pass on to others for profit?

Because without making things better, this Internet-of-Things will be a one-way ticket to power in the hands of providers, and a loss of control and quality of life. We’ll work around it: buying a separate SIM card for trips into London, avoiding certain parks or bridges, managing our FitBits to the nth degree under a pseudonym. But being left no choice but to opt out of places or the latest technology is also tedious. If we want to buy a smart TV to access films on demand, but don’t want it to pass surveillance or tracking information back to the company, how can we find out with ease which products offer that choice?

Companies have taken private information that is none of their business, and quite literally, made it their business.

The consumer technology hijack of “smart” to always mean marketing surveillance creates a divide between those who will comply for convenience and pay the price in their privacy, and those who prize privacy highly enough to take steps that are less convenient, but less compromised.

But even for those who want the latter, it can be so hard to find out how that people feel powerless and give in to the easy option on offer.

Today’s system of governance and oversight, which manages how our personal data are processed by providers of public and private services, in both public and private space, is insufficient to meet the values most people reasonably expect: to be able to live their life without interference.

We’re busy playing catch up with managing processing and use, when many people would like to be able to control collection.

The Best use of Data: Today

My experience of how the government wants to ‘best use data’ is that until 2013 I assumed the State was responsible with it.

I feel bitterly let down.

care.data taught me that the State thinks my personal data and privacy are something to exploit, and that “the best use of my data” for them may be quite at odds with what individuals expect. My trust in the use of my health data by government has been low ever since. Saying one thing and doing another isn’t making it more trustworthy.

I found out in 2014 how my children’s personal data are commercially exploited and given to third parties, including the press, outside safe settings, by the Department for Education. Now my trust is at rock bottom. I tried to take a look at what the National Pupil Database stores on my own children and was refused a subject access request, while the commercial sector and the Fleet Street press are given not only identifiable data, but ‘highly sensitive’ data. This seems plain wrong in terms of security, transparency and respect for the person.

The attitude that there is an entitlement of the State to individuals’ personal data has to go.

The State has pinched 20 million children’s privacy without asking. Tut tut indeed. [see Very British Problems for a translation]

And while I support the use of public administrative data in de-identified form in safe settings, it is not to be expected that anything goes. The feeling of entitlement to access our personal data, for purposes other than those for which we consented, is growing as it stretches to commercial sector data. However, suggesting that public feeling, measured in work with 0.0001% of the population, shows “wide public support for the use and re-use of private sector data for social research” seems tenuous.

Even so, comments even in that tiny population suggested, “many participants were taken by surprise at the extent and size of data collection by the private sector” and some “felt that such data capture was frequently unwarranted.” “The principal concerns about the private sector stem from the sheer volume of data collected with and without consent from individuals and the profits being made from linking data and selling data sets.”

The Best use of Data: The Future

Young people, despite seniors often saying “they don’t care about privacy”, are leaving social media in search of greater privacy.

These things cannot be ignored if the call for digital transformation between the State and the citizen is genuine: try to do it to us and it will fail. Change must be done with us. And ethically.

And not “ethics” as in ‘how to’, but the ethics of “should we”. Qualified, transparent evaluation, as done in other research areas; not an add-on, but integral to every project, looking at issues such as:

  • whether participation is voluntary, opt-out or covert
  • how participants can get and give informed consent
  • accessibility to information about the collection and its use
  • small numbers, particularly of vulnerable people included
  • identifiable data collection or disclosure
  • arrangements for dealing with disclosures of harm and recourse
  • and how the population that will bear the risks of participating in the research is likely to benefit from the knowledge derived from the research or not.

Ethics is not about getting away with using personal data in ways that won’t get caught or hauled over the coals by civil society.

It’s balancing risk and benefit in the public interest, and not always favouring the majority, but doing what is right and fair.

We hear a lot at the moment about how the government sees lives shaped by digital skills, but too little of their vision for what living will look and feel like in the smart cities of the future.

My starting question is: how does government hope society will live there, and is it up to them to design it? If not, who is? Because these smart-city systems are not designing themselves. You’ve heard of the Stepford Wives. I wonder what we do if we do not want to live like Milton Keynes man?

I hope that the world my children will inherit will be more just, more inclusive and kinder than it is today, with a more sustainable climate to support food and livelihoods. Will ‘smart’ help or hinder?

What is rarely discussed in technology debates is how the service should look regardless of the technology. The technology, assumed to be inevitable, becomes the centre of service delivery.

I’d like first to understand the central and local government vision for “public services” provision for people of the future. What does it mean for everyday services like schools and health, and how does it balance security and our freedoms?

Because without thinking about how and who provides those services for people, there is a hole in the discussion of “the best use of data” and their improvement “right across the board”.

The UK government has big plans for big data: sharing across all public bodies, some of it tailored for individual interventions.

While there are interesting opportunities for public benefit from at-scale systems, that benefit is at risk not only from lack of trust in how systems gather and use data, but also from interoperability of services, and the freedom for citizens to switch provider, getting lost in market competition.

Openness and transparency can be absent in public-private partnerships until things go wrong. Given the scale of smart-cities, we must have more than hope that data management and security will not be one of those things.

How will we know if new plans are designed well, or not?

When I look at my children’s future and how our current government digital decision making may affect it, I wonder if their future will be more or less kind. More or less fun.

Will they be left with the autonomy to make good choices of their own?

The hassle of feeling watched all the time, by every thing that we own, in every place we go, having to check that every check box has a reasonable privacy setting, has a cumulative cost in our time and our anxieties.

Smart technology has invaded not only our public space and our private space, but has nudged into our head space.

I for one have had enough already. For my kids I want better. Technology should mean progress for people, not tyranny.

Living in smart cities, connected in the Internet-of-Things, run on their collective Big Data and paid for by commercial corporate providers, threatens not only their private lives and well-being, their individual and independent lives, but ultimately independent and democratic government as we know it.

*****

This is the start of a four-part set of thoughts: beginnings with smart technology and data, triggered by the Sprint16 session (part one). I think about this in more depth in “Smart systems and Public Services” (part two), and about the design and development of smart technology making “The Best Use of Data”, looking at today in a UK company case study (part three), before thoughts on “The Best Use of Data” in predictions and the future (part four).

The front door to our children’s personal data in schools

“EdTech UK will be a pro-active organisation building and accelerating a vibrant education and learning technology sector and leading new developments with our founding partners. It will also be a front door to government, educators, companies and investors from Britain and globally.”

Ian Fordham, CEO, EdTech UK

This front door is a gateway to access our children’s personal data, and through it some companies are coming into our schools and homes and taking our data without asking. And with that, our children lose control over their safeguarded digital identity. Forever.

Companies are all “committed to customer privacy” in those privacy policies that exist at all. However, typically this means they also share your information with ‘our affiliates, our licensors, our agents, our distributors and our suppliers’, and their circles are wide and often in perpetuity. Many simply don’t have a published policy.

Where do they store any data produced in the web session? Who may access it and use it for what purposes? Or how may they use the personal data associated with staff signing up with payment details?

According to research from London & Partners, championed by Boris Johnson, Martha Lane-Fox and others in EdTech, education is one of the fastest-growing tech sectors in Britain and is worth £45bn globally; a number set to reach a staggering £129bn by 2020. And perhaps the EdTech diagrams in US dollars show where the UK plans to draw companies from. If you build it, they will come.

The enthusiasm of some US EdTech-type entrepreneurs I have met or listened to speak is akin to religious fervour. Such is their drive for tech, however, that they appear to forget that education is all about the child. Individual children. Not cohorts, or workforces. And even when they do remember, it can be sincerely said, but lacks substance when you examine their policies in practice.

How is the DfE measuring the cost and benefit of tech and its applications in education?

Is anyone willing to say that not all tech is good tech, and not every application is a wise application? Because every child is unique, not every app is one-size-fits-all.

My 7-year-old got so caught up in the game and in the mastery of the app her class was prescribed for homework in the past that she couldn’t master the maths, and it harmed her confidence. (Imagine something like this: clicking on the two correct sheep with numbers stamped on them that together add up to 12, for example, before they fall off and die.)

She has no problem with maths. Nor doing sums under pressure. She told me happily today she’d come joint second in a speed tables test. That particular app style simply doesn’t suit her.

I wonder if other children and parents find the same and if so, how would we know if these apps do more harm than good?

Nearly 300,000 young people in Britain have an anxiety disorder, according to the Royal College of Psychiatrists. Feeling watched all the time, on- and offline, is unlikely to make anxiety any better.

How can the public and parents know that edTech which comes into the home with their children, is behaviourally sound?

How can the public and parents know that edTech which affects their children, is ethically sound in both security and application?

Where is the measured realism in the providers’ and policy makers’ fervour, when both seek to marketise edTech and our personal data for the good of the economy, and ‘in the public interest’?

Just because we can, does not always mean we should. Simply because data linkage is feasible, even if it brings public benefit, cannot point blank mean it will always be in our best interest.

In whose best Interest is it anyway?

Right now, I’m not convinced that the digital policies at the heart of the Department for Education, the EdTech drivers or many providers have our children’s best interests at heart at all. It’s all about the economy; when they talk at all about children using the technology, many talk only of ‘preparing the workforce’.

Are children and parents asked to consent at individual level to the terms and conditions of the company and told what data will be extracted from the school systems about their child? Or do schools simply sign up their children and parents en masse, seeing it as part of their homework management system?

How much ‘real’ personal data they use varies. Some use only pseudo-IDs assigned by the teacher. Others log, store and share everything children do, tied to their ID or real email address, store performance over time, and provide personalised reports of results.

Teachers and schools have a vital role to play in understanding data ethics and privacy to get this right, and speaking to many, it doesn’t seem something they feel well equipped to do. Parents aren’t always asked. But should schools not always have to ask before giving data to a commercial third party, except in an ‘emergency’ situation?

I love tech. My children love making lego robots move with code. Or driving drones with bananas. Or animation. Technology offers opportunity for application in and outside schools for children that are fascinating, and worthy, and of benefit.

If, however, parents are to protect children’s digital identity for the future, and be able to hand control of their personal data over to them intact as adults, we must better accommodate children’s data privacy in this 2016 gold rush for EdTech.

Pupils and parents need to be assured their software is both educationally and ethically sound. Who defines those standards?

Who is in charge of Driving, Miss Morgan?

Microsoft’s vice-president of worldwide education recently opened the BETT exhibition, praised teachers for using technology to achieve amazing things in the classroom, and urged innovators to “join hands as a global community in driving this change”.

While there is a case to say no exposure to technology in today’s teaching would be neglectful, there is a stronger duty to ensure exposure to technology is positive and inclusive, not harmful.

Who regulates that?

We are on the edge of an explosion of tech and children’s personal data ‘sharing’ with third parties in education.

Where is its oversight?

The community of parents and children is at real risk of being completely left out of these decisions, and exploited.

The upcoming “safeguarding” policies online are a joke if the DfE tells us loudly to safeguard children’s identity out front, and quietly gives their personal data away for cash round the back.

The front door to our children’s data “for government, educators, companies and investors from Britain and globally” is wide open.

Behind the scenes in pupil data privacy, it’s a bit of a mess. And these policy makers and providers forgot to ask first if they could come in.

If we build it, would you come?

My question now is, if we could build something better on pupil data privacy AND better data use, what would it look like?

Could we build an assessment model of the collection, use and release of data in schools that could benefit pupils and parents, AND educational establishments and providers?

This could be a step towards future-proofing the public trust which will be vital for companies who want a foot in the door of EdTech: design an ethical framework for digital decision making and a practical data model for use in education.

Educationally and ethically sound.

If providers, policy makers and schools at group Trust level could meet with Data Protection and Privacy civil society experts to shape a toolkit for assessing privacy impact, to ensure safeguarding and freedoms, enable safe data flow, and help design the cybersecurity that works for them and protects children’s privacy, which is lacking today, designing for tomorrow, would you come?

Which door will we choose?

*******

image credit: @ Ben Buschfeld Wikipedia

*added February 13th: Ofsted Chair sought from US

Commission on Freedom of Information: submission

Since it appears that the Independent Commission on Freedom of Information [FOI] has not published all of the received submissions, I thought I’d post what I’d provided via email.

I’d answered two of the questions with two case studies. The first on application of section 35 and 36 exemptions and the safe space. The second on the proposal for potential charges.

On the Commission website, tab 2 of the excel spreadsheet of evidence submitted online notes that NHS England belatedly asked for its submission to be unpublished.

I wonder why.

Follow-ups to both these FOI requests are now long overdue in 2016. The first, from NHS England, concerns the care.data decision making behind the 2015 decision not to publish a record of whether parts of the board meetings were to be secret. Transparency needs to be seen in action, to engender public trust. After all, they’re deciding things like how care.data and genomics will be at the “heart of the transformation of the NHS.”

The second is overdue at the Department for Education, on the legal basis for identifiable, sensitive data releases from the National Pupil Database that meets Schedule 3 of the Data Protection Act 1998 to permit this data sharing with commercial third parties.

Both are in line with the apparently recommended use of FOI, according to Mr. Grayling, who most recently said:

“It is a legitimate and important tool for those who want to understand why and how Government is taking decisions and it is not the intention of this Government to change that”.  [Press Gazette]

We’ll look forward to seeing whether that final sentence is indeed true.

*******

Independent Commission on Freedom of Information Submission
Question 1: a) What protection should there be for information relating to the internal deliberations of public bodies? b) For how long after a decision does such information remain sensitive? c) Should different protections apply to different kinds of information that are currently protected by sections 35 and 36?

A “safe space” in which to develop and discuss policy proposals is necessary. I can demonstrate where it was [eventually] used well, in a case study of a request I made to NHS England. [1]

The current protection afforded to the internal deliberations of public bodies is sufficient, given the section 35 and 36 exemptions. I asked in October 2014 for NHS England to publish the care.data planning and decision making for the national NHS patient data extraction programme. This programme has been controversial [2]. It will come at great public expense and to date has been harmful to public and professional trust, with no public benefit. [3]

NHS England refused my request based on section 22 [intended for future publication]. [4] However, ten months later the meeting minutes had still not been published. In July 2015, after appeal, the Information Commissioner issued an Information Notice, and NHS England published sixty-three minutes and papers in August 2015.

In these released documents, the section 36 exemption was then applied to only a tiny handful of redacted comments. This was sufficient to protect the decisions that NHS England had felt to be most sensitive, and yet still enabled the release of a year’s worth of minutes.

Transparency does not mean that difficult decisions cannot be debated since only outcomes and decisions are recorded, not every part of every discussion verbatim.

The current provision for safe space using these exemptions is effective, and in this case would have been no different whether applied immediately after the meetings or one and a half years later. If anything, earlier publication may have resulted in better-informed policy and decision making, through wider involvement from professionals and civil society. The secrecy in the decision making did not build trust.

When policies such as these are found to have no financial business cost-benefit case, for example, I believe it is strongly in the public interest to have transparency of these facts, and to scrutinise the policy’s governance in the public interest, enabling early intervention when seen to be necessary.
In the words of the Information Commissioner:

“FOIA can rightly challenge and pose awkward questions to public authorities. That is part of democracy. However, checks and balances are needed to ensure that the challenges are proportionate when viewed against all the other vital things a public authority has to do.

“The Commissioner believes that the current checks and balances in the legislation are sufficient to achieve this outcome.” [5]

Given that most public bodies, including NHS England’s Board, routinely publish their minutes, this would seem standard good practice to be expected, and I believe routine publication of meeting minutes would have raised the trustworthiness of the programme and its oversight and leadership.

The same section 36 exemption could have been applied from the start to the small redactions that were felt necessary balanced against the public interest of open and transparent decision making.

I do not believe more restrictive applications should be made than are currently under sections 35 and 36.

_____________________________________________________________________

Question 6: Is the burden imposed on public authorities under the Act justified by the public interest in the public’s right to know? Or are controls needed to reduce the burden of FoI on public authorities?

As an individual I made 40 requests to schools and 2 to the Department for Education, which may now result in benefit for 8 million children and their families, as well as future citizens.

The transparency achieved through these Freedom of Information requests will, I hope, soon transform the culture at the Department for Education from one of secrecy to one of openness.

There is the suggestion that a Freedom of Information request would incur a charge to the applicant.

I believe that the benefits of the FOI Act in the public interest outweigh the cost of FOI to public authorities. In this second example [6], I would ask the Commission to consider: had I not been able to make these Freedom of Information requests due to cost, and therefore been unable to present evidence to the Minister, the Department and the Information Commissioner, would the panel members support the secrecy around the ongoing risk that current practices pose to children and our future citizens?

Individual, identifiable and sensitive pupil data are released to third parties from the National Pupil Database without telling pupils, parents and schools, or seeking their consent. This Department for Education (DfE) FOI request aimed to obtain an understanding of any due diligence and of the release process: the privacy impact and DfE decision making, with a focus on its accountability.

This was to enable transparency and scrutiny in the public interest, to increase the understanding of how our nation’s children’s personal data are used by government, commercial third parties, and even identifiable and sensitive data given to members of the press.

Chancellor Mr. Osborne spoke on November 17 about the importance of online data protection:

“Each of these attacks damages companies, their customers, and the public’s trust in our collective ability to keep their data and privacy safe.”[…] “Imagine the cumulative impact of repeated catastrophic breaches, eroding that basic faith… needed for our online economy & social life to function.”

Free access to FOI enabled me as a member of the public to ask and take action with government and get information from schools to improve practices in the broad public interest.

If there were a cost to this process, I could not have afforded to ask schools to respond. Schools are managed individually, and as such I asked each school whether it was aware of the National Pupil Database and how the Department shared its pupils’ data onwards with third parties.

I asked a range of schools in the South and East. In order to give a fair picture of more than one county, I made requests to a range of types of school, from academy trusts to voluntary controlled schools: 20 primary and 20 secondary. Given the range of schools in England and Wales [7], this was a small sample.

Building even a small representative picture of pupil data privacy arrangements in the school system therefore required a separate request to each school.

I would not have been able to do this had there been a charge imposed for each request. This research subsequently led me to write to the Information Commissioner’s Office with my findings.

Were access costs to make this a process that only organisations or the press could afford to enter into, then the public would only be able to find out what matters, or is felt to be important, to those organisations, not what matters to individuals.

However what matters to one individual might end up making a big difference to many people.

Individuals may be interested in what are seen as minority topics, perhaps related to discrimination according to gender, sexuality, age, disability, class, race or ethnicity. If individuals cannot afford to challenge government policies that matter to them as individuals, we may lose the benefit that they can bring when they go on to champion the rights of more people in the country as a whole.

Eight million children’s records, for children aged 2-19, are stored in the National Pupil Database. I hope that, thanks to the FOI request, increased transparency and better practices will help restore data protections for individuals and re-establish organisational trust in the Department.

Information can be used to enable or constrain citizenship. In order to achieve universal access to human rights to support participation, transparency and accountability, I appeal that the Commission recognise the need for individuals to tackle vested interests, unjust laws and policies.

Any additional barriers such as cost, only serve to reduce equality and make society less just. There is however an immense intangible value in an engaged public which is hard to measure. People are more likely to be supportive of public servant decision making if they are not excluded from it.

Women, for example, are underrepresented in Parliament and therefore in public decision making. Further, the average pay gap within the EU is 16 per cent, but pay levels throughout Europe differ hugely, and in the South East of the UK men earn 25 per cent more than their female counterparts. [8] Women and mothers like me may therefore find it more difficult to participate in public life and to make improvements on behalf of other families and children across the country.

To charge for access to information about our public decision making process could therefore be excluding and discriminatory.

I believe these two case studies show that the Act’s intended objectives, on parliamentary introduction — to ‘transform the culture of Government from one of secrecy to one of openness’; ‘raise confidence in the processes of government, and enhance the quality of decision making by Government’; and to ‘secure a balance between the right to information…and the need for any organisation, including Government, to be able to formulate its collective policies in private’ — work in practice.

If anything, they need strengthening to ensure accessibility.

Any actions to curtail free and equal access to these kinds of information would not be in the public interest and would be a significant threat to the equality of opportunity the Act offers the public in making requests. Charging would particularly restrict access to FOI for poorer individuals and communities, who are often those already excluded from full participation in public life.
___________________________________________________________________________

[1] https://www.whatdotheyknow.com/request/caredata_programme_board_minutes
[2] http://www.theguardian.com/society/2014/dec/12/nhs-patient-care-data-sharing-scheme-delayed-2015-concerns
[3] http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers
[4] https://jenpersson.com/wp-content/uploads/2015/11/caredataprogramme_FOI.pdf
[5] https://ico.org.uk/media/about-the-ico/consultation-responses/2015/1560175/ico-response-independent-commission-on-freedom-of-information.pdf
[6] https://jenpersson.com/wp-content/uploads/2015/11/NPD_FOI_submissionv3.pdf
[7] http://www.newschoolsnetwork.org/sites/default/files/Comparison%20of%20school%20types.pdf
[8] http://www.equalpayportal.co.uk/statistics/

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (2)

“Children do not lose their human rights by virtue of passing through the school gates” (UN Committee on the Rights of the Child, General Comment on ‘The aims of education’, 2001).

The Digital Skills in Schools inquiry [1] is examining the gap in our children's education that must be closed to enable them to be citizens fit for the future.

We have an "educational gap" in digital skills, and I have suggested it should not be seen only as functional or analytical: it should also address a gap in ethical skills, and in the framework needed to equip our young people to understand their digital rights as well as their responsibilities.

Children must be enabled in education with opportunity to understand how they can grow “to develop physically, mentally, morally, spiritually and socially in a healthy and normal manner and in conditions of freedom and dignity”. [2]

Freedom to use the internet in privacy does not mean having to expose children to risks, but we should ask, are there ways of implementing practices which are more proportionate, and less intrusive than monitoring and logging keywords [3] for every child in the country? What problem is the DfE trying to solve and how?

Nicky Morgan’s “fantastic” GPS tracking App

The second technology tool Nicky Morgan mentioned in her BETT speech on January 22nd, is an app with GPS tracking and alerts creation. Her app verdict was “excellent” and “fantastic”:

“There are excellent examples at the moment such as the Family First app by Group Call. It uses GPS in mobile phones to help parents keep track of their children’s whereabouts, allowing them to check that they have arrived safely to school, alerting them if they stray from their usual schedule.” [4]

I’m not convinced tracking every child’s every move is either excellent or fantastic. Primarily because it will foster a nation of young people who feel untrusted, and I see a risk it could create a lower sense of self-reliance, self-confidence and self-responsibility.

Just as with the school software monitoring [see part one], there will be a chilling effect on children's freedom if these technologies become the norm. If you fear misusing a word in an online search, or worry over the stigma of what others might think, would you not change your behaviour? Our young people need to feel both secure and trusted at school.

How we use digital in schools shapes our future society

A population that trusts one another, and trusts its government, organisations and press, is vital to a well-functioning society.

If we want the benefits of a global society, data sharing for example to contribute to medical advances, people must understand how their own data and digital footprint fit into a bigger picture to support it.

In schools today pupils and parents are not informed that their personal confidential data are given to commercial third parties by the Department for Education at national level [5]. Preventing public engagement, hiding current practices, and downplaying the risks of how data are misused also prevent fair and transparent discussion of the benefits, and of how to do it better. Better, for example, would mean making data accessible only in a secure setting, not handing them out to Fleet Street.

For children this holds back public involvement in the discussion of the roles of technology in their own future. Fear of public backlash over poor practices must not hold back empowering our children’s understanding of digital skills and how their digital identity matters.

Digital skills are not shorthand for coding, but critical life skills

The skills our society will need must enable us simultaneously to manage the benefits to society and to deal with the great risks that will come with advances in technology: advances in artificial intelligence, genomics, and autonomous robots, to select only three examples.

There is a glaring gap in their education: how their own confidential personal data and digital footprint fit into a globally connected society, and how they are used by commercial businesses and third parties.

There are also concerns about how apps like these could be misused by others.

If we are to consider what is missing in our children's preparation for a life in which digital will no longer be a label but a way of life, then to identify the gap we must first consider what we see as the whole.

Rather than keeping children safe in education, as regards data sharing and digital privacy, the DfE seems happy to keep them ignorant. This is no way to treat our young people and develop their digital skills, just as giving their data away is not good cyber security.

What does a dream for a great 'digital' society look like?

Had Martin Luther King lived to be 87, he would have continued to inspire hope and to challenge us to fulfil his dream for society – one where everyone would have an equal opportunity for "life, liberty and the pursuit of happiness."

Moving towards that goal, supported by technology and by ethical codes of practice, my dream is that we see a more inclusive, fulfilled, sustainable and happier society. We must educate our children as fully rounded, digital and data savvy individuals, who trust themselves and the systems they use, and are well treated by others.

Sadly, introducing these types of freedom-limiting technologies for our children risks instead a society in which many people do not feel comfortable, one that has lost sight of the value of privacy.

References:

[1] Digital Skills Inquiry: http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/inquiries/parliament-2015/digital-skills-inquiry-15-16/

[2] UN Convention of the Rights of the Child

[3] Consultation: Keeping Children Safe in Education – closing Feb 16th. The "opportunities to teach safeguarding" section (paras 77-78) has been updated and now says governing bodies and proprietors "should ensure" rather than "should consider" that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

[4] Nicky Morgan’s full speech at BETT

[5] The defenddigitalme campaign to ask the Department for Education to change practices and policy around the National Pupil Database

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children's individual actions: one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately for this consultation there is no supporting code of practice explaining what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to "Buying advice for schools": "Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network."

Guidance: para 78 "Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that "over blocking" does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding."

Consultation: "The Opportunities to teach safeguarding" section (para 77-78) has been updated and now says governing bodies and proprietors "should ensure" rather than "should consider" that children are taught about safeguarding, including online, through teaching and learning opportunities. "This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis."

Paragraph 88 on p24  is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?

Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children's waking hours are spent outside school, that our schools cover a wide range of children aged 2-19, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK, yet we hear little about its use in practice.

The cases of cyberbullying or sexting in schools that I hear of locally, or read about in the press, happen through smartphones. Unless the school snoops on individual devices, I wonder whether the cost, implementation and impact are proportionate to the benefit?

  1. Necessary and proportionate? How does this type of monitoring compare with other alternatives?
  2. Privacy impact assessment? Has any been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs Solution: What problem is it trying to solve and how will they measure if it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting, and who they are? After all, these providers have access to millions of our children's online interactions.
  6. Evidence: If it has been used for years in school, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs – has anyone asked our children's feelings on this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we creating a solution that solves or creates a problem?

The private providers would have no incentive to say their reports don’t work and schools, legally required to be risk averse, would be unlikely to say stop if there is no outcome at all.

Some providers include "review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention" and "trends such as top users can be viewed." How involved are staff who know the child as a first point of information sharing?

Most tools are multipurpose and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justify this mass monitoring in every school – what are we doing to fix the causes, not simply spy on every child’s every online action in school? (I look at how it extends outside in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust of the system, of teaching staff, and on their ability to look for content to support their own sensitive concerns or development, content they may not feel safe to look for at home? What limitation will that put on their creativity?

These are all questions that should be asked to thoroughly understand the consultation, and that require wide public examination.

Since key logging is already common practice (according to provider websites) and will in practice become statutory, where is the public discussion? If it's not explicitly statutory, should pupils be subject to spying on their web searches in a postcode lottery?

What exactly might this part of the new guidance mean for pupils?

In part two, I look at the other part of her speech, the GPS app, and ask whether, if we track every child in and outside school, we are moving closer to the digital dream or the nightmare in the search to close the digital skills gap.

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The "opportunities to teach safeguarding" section (paras 77-78) has been updated and now says governing bodies and proprietors "should ensure" rather than "should consider" that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

"Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online." [Proposed statutory guidance]

Since "guidance on procuring appropriate ICT" from the National Education Network (NEN*) is offered, it is clearly intended that this 'system' to be 'in place' should be computer-based. How will it be applied in practice? A number of the software providers for schools already offer services that include key logging, using "keyword detection libraries" to provide "a complete log of all online activity".

(*It's hard to read more about this, as many of NEN's links are dead.)

Act now: Stand up and speak out for your rights to finding out the facts #saveFOI

The Freedom of Information Act has enabled me to stand up for my children’s rights. It really matters to me. And we might lose it.

For every member of the public who has ever, or who has never, used their rights under the Freedom of Information Act, the government consultation on changing those laws, which closes today, is worth caring about. If you haven't yet had your say, go and take action here >> now. If it is all you have time for before the end of today, you can sign the 38 Degrees petition or write an email to your MP.

Or by the end of today you can reply to the call for evidence. There is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also got this plain English version.

Please do. Now. It closes today, on November 20th.

If you need convincing why it matters to me, and should matter to you, read on.

What will happen

If the proposed changes come to pass, information about public accountability will be lost. Political engagement will not be open to all equally. It will promote an unfair society in which individuals are not only prevented from taking part in full public life, but prevented from understanding decisions made about them or that affect them. Campaign groups will be constrained by cost from standing up for human rights. The press will be restrained in what it can ask.

MySociety has a brilliant summary. Michael Sheen spoke up, calling it "nothing short of a full frontal attack" on the principles of democratic government. And Tom Watson spoke of three serious instances where facts would have stayed hidden, were it not for access made using the law of Freedom of Information:

1. death rates in cardiac patient care
2. cases when the police use Tasers on children
3. the existence of cracks in the nuclear power station at Hinkley

Why does FOI matter to me personally? In Education.

Because it's enabled me to start a conversation with the Department for Education about improving its handling of the personal and sensitive data of 8 million children that it holds in the National Pupil Database for England and Wales. Through FOI I asked for unpublished facts: how many releases of identifiable personal data of school pupils have been fast-tracked at the Department for Education without panel oversight. And I asked to see the panel's terms of reference, which are still not on their website.

The request: whatdotheyknow.com
The outcome:
National Pupil Database FOI case study summary here.

I’m now coordinating calls for changes on behalf of the 8m children whose records they hold and parents across the country.

******

Why does FOI matter to me personally? In Health.

Because Freedom of Information law has enabled public transparency and accountability of care.data programme board decision making that was kept secret for over a year. NHS England refused to publish the minutes. Their internal review declined my appeal. The Information Commissioner's Office upheld it.

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. In fact my case study, while highlighting that NHS England refused to release information, also shows that only a handful of genuine redactions were necessary, using section 36 to keep them hidden, when the minutes were finally released.

In October 2014 I simply wanted the meeting minutes to form part of the public record of care.data planning. I wanted to see the cost-benefit business case and scrutinise it against the benefits case that the public were told of at every public engagement event I had been to. When at every turn the public is told how little money the NHS can afford to spend, I wanted scrutiny of what the programme would cost at national and local levels. It was in the public interest to better inform public debate about the merits of the national programme. And I strongly believe that it is in the public interest to be informed and to fully understand the intention of a programme that demands the use of sensitive personal data.

The request: whatdotheyknow.com
The outcome: care.data FOI case study summary here.

Others could use this information, I hoped, to ask the right questions about missing meeting minutes and transparency, and everyone could question why there was no cost-benefit business plan at all, even in private, while the public kept being told of the benefits. It also shows that data collection is set to expand further, without public debate.

Since then the programme has been postponed again, and work is in progress on improved public engagement to enable public and professional confidence.

What has Freedom of Information achieved?

One of the most memorable results of Freedom of Information was the MPs' expenses scandal. Who knows how much that Freedom of Information request has saved taxpayers in future spending on duck houses, given that MPs have been required to publish expenses since 2010? Four MPs were jailed for false accounting. Peers were expelled. Second homes and what appeared to the public as silly spending on sundries were revealed. Mr Cameron apologised in 2009, saying he was "appalled" by the expenses. The majority of MPs had done nothing illegal, but the Freedom of Information request enabled the start of a process of increased transparency which showed where activities, while permitted by law, were simply unethical or unreasonable.

Historical record

Information published under the Freedom of Information Act can help to ensure that important records of decision-making processes are retained as part of the historic background to government.

Increased trust

The right information at the right time helps make better decisions, make spending more transparent and makes policies and practices more trustworthy.

Access to official information can also improve public confidence where public sector bodies are seen as being open. In a 2011 survey carried out on behalf of the Information Commissioner’s Office, 81% of public bodies questioned agreed that the Act had increased the public’s trust in their organisation.

A key argument made by the commission is that those in public office need private space for decision making. The Information Commissioner’s Office countered this in their submission to the consultation saying,

“there is a distinction between a need for a private space, depending on the circumstances and a desire for secrecy across a broad area of public sector activity. It was the latter tendency that FOIA was intended to correct.”

So how much more “private space” do public servants need?

Holding back information

When it is judged that information should not be released in the public interest, there are already exemptions that can be applied to prevent its disclosure under the Freedom of Information Act. [1]

The exemptions include:

  • if the information can easily be accessed by other means – e.g. the internet or published documents
  • if the information is personal information
  • if the information is provided in confidence (but only if legally enforceable)
  • when there is a legal reason not to disclose
  • if the information is about national security, defence, the economy, law enforcement, formulation of Government policy, health and safety, communications with Her Majesty or other royalty, international relations, intended for future publication and commercial interests. (All exemptions in this group must be tested to see if disclosure is in the public interest.)

In addition to these exemptions, organisations can withhold information if it will take more than two-and-a-half days to provide it, or they cannot identify what information is needed (although they have to work with the requester to clarify what is being requested).

They can also withhold information if they decide the request is vexatious.

Does it cost us too much to administer?

Some people who are supportive of these changes say they are concerned about costs in answering requests but have perhaps not considered the savings in exceptional cases (like the Expenses Scandal outcome). And as mySociety has reported [2], money spent responding to Freedom of Information requests also needs to be considered fairly in the context of wider public spending. In 2012 it was reported that Staffordshire County Council had spent £38,000 in a year responding to Freedom of Information requests. The then Director of mySociety, Tom Steinberg, commented:

“From this I can see that oversight by citizens and journalists cost only £38,000 from a yearly total budget of £1.3bn. I think it is fantastic that Staffordshire County Council can provide such information for only 0.002 per cent of its operating budget.”
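The proportion in Tom Steinberg's quote is simple to sanity-check with back-of-envelope arithmetic (a quick sketch using only the figures reported above; it comes out at roughly 0.003 per cent, the same order of magnitude as quoted):

```python
# Rough check of the FOI cost as a share of Staffordshire's reported budget.
# Figures are those quoted in the post; this is illustrative arithmetic only.
foi_cost_gbp = 38_000              # reported annual cost of answering FOI requests
annual_budget_gbp = 1_300_000_000  # reported yearly council budget (GBP 1.3bn)

share_percent = foi_cost_gbp / annual_budget_gbp * 100
print(f"FOI responses cost {share_percent:.4f}% of the annual budget")
```

Either way the point stands: the scrutiny comes at a vanishingly small fraction of what is being scrutinised.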

Why does the government want to make itself less transparent? Even the Information Commissioner's Office has replied to the consultation to say that the Commissioner does not consider that significant changes to the core principles of the legislation are needed. This is a good law, one that gives the public rights in our favour, and transparency into how we are governed and how tax money is spent.

How will the value of FOI, and of what would be lost if the changes are made, be measured?

What can you do?

The call for evidence is here and there is further guidance on the Campaign For Freedom of Information’s website. 38 Degrees have also put together this super-easy Plain English version.

To have your say in the consultation closing on November 20th go online.

Or simply call or write to your MP.  Today. This really matters.


References:

[1] Requests can be refused https://ico.org.uk/for-organisations/guide-to-freedom-of-information/refusing-a-request/

[2] MySociety opposes restrictions https://www.mysociety.org/2015/11/11/voices-from-whatdotheyknow-why-we-oppose-foi-act-restrictions/

[3] National Pupil Database FOI case study summary here

[4] My care.data programme board FOI case study summary here

care.data : the economic value of data versus the public interest?

 This is a repost of my opinion piece published in StatsLife in June 2015.

The majority of the public supports the concept of using data for public benefit.[1] But the measurable damage done in 2014 to the public’s trust in data sharing [2] and reasons for it, are an ongoing threat to its achievement.

Rebuilding trust and the public legitimacy of government data gathering could be a task for Sisyphus, given the media atmosphere clouded by the smoke and mirrors of state surveillance. As Mark Taylor, chair of the NHS’s Confidentiality Advisory Group wrote when he considered the tribulations of care.data [3] ‘…we need a much better developed understanding of ‘the public interest’ than is currently offered by law.’

So what can we do to improve this as pilot sites move forward, and for other research? Can we consistently quantify the value of the public good, and account for intangible concerns and risks alongside demonstrable benefits? Do we have a common understanding of what the public feels is in its own best interests?

And how are shifting public and professional expectations to be reflected in the continued approach to accessing citizens’ data, with the social legitimacy upon which research depends?

Listening and lessons learned

Presented as an interval to engage the public and professionals, the 18 month long pause in care.data involved a number of ‘listening’ events. I attended several of these to hear what people were saying about the use of personal health data. The three biggest areas of concern raised frequently [4] were:

  • Commercial companies’ use and re-use of data
  • Lack of transparency and control over who was accessing data for what secondary purposes, and
  • Potential resulting harms: from data inaccuracy, loss of trust and confidentiality, and fear of discrimination.

It’s not the use of data per se that the majority of the public raises objection to. Indeed many people would object if health data were not used for research in the public interest. Objections were more about the approach to this in the past and in the future.

There is a common understanding of what bona fide research is and how it serves the public interest, and polls confirm a widespread acceptance of 'reasonable' research use of data. The HSCIC audit under Sir Nick Partridge [5] acknowledged that some past uses and sharing of raw data had not always met public expectations of what was 'reasonable'. The new secure facility should provide a safe setting for managing this better, but open questions remain on governance and transparency.

As one question from a listening event succinctly put it [6]:

‘Are we saying there will be only clinical use of the data – no marketing, no insurance, no profit making? This is our data.’

Using the information gleaned from data was often seen as exploitation when used in segmenting the insurance markets, consumer market research or individual targeting. There is also concern, even outright hostility, about raw health data being directly sold, re-used or exchanged as a commodity, regardless of whether this is packaged as 'for profit' or 'covering administrative costs'.

Add to that, the inability to consent to, control or find out who uses individual level data and for what purpose, or to delete mistakes, and there is a widespread sense of disempowerment and loss of trust.

Quantifying the public perception of care.data’s value

While the pause was to explain the benefits of the care.data extraction, it actually seemed clear at meetings that people already understood the potential benefits. There is clear public benefit to be gained for example, from using data as a knowledge base, often by linking with other data to broaden scientific and social insights, generating public good.

What people were asking was what new knowledge would be gained that isn't already gathered from non-identifiable data. Perhaps more tangible, yet less discussed at care.data events, are the economic benefits for commissioning: using data as business intelligence to inform decisions in financial planning and cost cutting.

There might be measurable economic public good from data, from outside interests who will make a profit by using data to create analytic tools. Some may even sell information back into the NHS as business insights.

Care.data is also to be an ‘accelerator’ for other projects [7]. But it is hard to find publicly available evidence to a) support the economic arguments for using primary care data in any future projects, and b) be able to compare them with the broader current and future needs of the NHS.

A useful analysis could find that potential personal benefits and the public good overlap, if the care.data business case were to be made available by NHS England in the public domain. In a time when the NHS budget is rarely out of the media it seems a no-brainer that this should be made open.

Feedback consistently shows that making money from data raises more concern over its uses. Who all future users might be remains open as the Care Act 2014 clause is broadly defined. Jamie Reed MP said in the debate [8]: ‘the new clause provides for entirely elastic definitions that, in practice, will have a limitless application.’

Unexpected uses and users of public data have created many of its historical problems. But has the potential future cost of 'limitless' applications been considered in the long-term public interest? And what of the confidentiality costs [9]? The NHS's own Privacy Impact Assessment on care.data says [10]:

'The extraction of personal confidential data from providers without consent carries the risk that patients may lose trust in the confidential nature of the health service.'

Who has quantified the cost of that loss of confidence, and have public and professional opinions been accounted for in any cost/benefit calculations? All these tangible and intangible factors should be measured in calculating the programme's value in the public interest, and we should ask: 'what does the public want?' It is, after all, our data and our NHS.

Understanding shifting public expectations

‘The importance of building and maintaining trust and confidence among all stakeholder groups concerned – including researchers, institutions, ethical review boards and research participants – as a basis for effective data sharing cannot be overstated.’ – David Carr, policy adviser at the Wellcome Trust [11]

To rebuild trust in data sharing, individuals need the imbalance of power corrected, so that they can control 'their data'. Before care.data, the public was mostly unaware that health records were being used for secondary purposes by third parties. In February 2014, the secretary of state stepped in to confirm that an opt-out would be offered, as promised by the prime minister in his 2010 'every patient a willing research patient' speech.

So, leaving aside the arguments for and against opt-in versus opt-out (and that, for now, it is not technically possible to apply the 700,000 opt-outs already made), the trouble is that it’s all or nothing. By not offering any differentiation between purposes, the public may feel forced to opt out of secondary data sharing, denying all access to all their data even if they want to permit some uses and not others.

Defining and differentiating secondary uses and types of ‘research purposes’ could be key to rebuilding trust. The HSCIC can disseminate information ‘for the purposes of the provision of health care or adult social care, or the promotion of health’. This does not exclude commercial use. Cutting away commercial purposes which appear exploitative from purposes in the public interest could benefit the government, commerce and science if, as a result, more people would be willing to share their data.

This choice is what the public has asked for at care.data events, at other research events [12] and in polls, but has yet to see any move towards. I feel strongly that the government cannot continue to ignore public opinion and assume its subjects are creators of data, willing to be exploited, without expecting further backlash. If privacy is a basic human right, should a citizen’s privacy become a commodity with a price tag?

One way to protect that right is to require an active opt-in to sharing. With the ongoing renegotiation of public rights and data privacy at EU level, consent is no longer just a question best left ignored in the Pandora’s box of ethics, as it has been for the last 25 years of secondary use of hospital data. [13]

The public has a growing awareness, differing expectations, and different degrees of trust around data use by different users. Policy makers who ignore these expectations risk continuing to build on a shaky foundation and jeopardising the future data sharing infrastructure. Profiting at the expense of public feeling and ethical good practice is an unsustainable status quo.

Investing in the public interest for future growth

The care.data pause has revealed differences between the thinking of government, the drivers of policy, the research community, ethics panels and the citizens of the country. This is not only about what value we place on our own data, but how we value it as a public good.

Projects that ignore the public voice, that ‘listen’ but do not act, risk their own success and by implication that of others. And with it they risk the public good they should create. A state which allows profit for private companies to harm the perception of good research practice sacrifices the long term public interest for short term gain. I go back to the words of Mark Taylor [3]:

‘The commitment must be an ongoing one to continue to consult with people, to continue to work to optimally protect both privacy and the public interest in the uses of health data. We need to use data but we need to use it in ways that people have reason to accept. Use ‘in the public interest’ must respect individual privacy. The current law of data protection, with its opposed concepts of ‘privacy’ and ‘public interest’, does not do enough to recognise the dependencies or promote the synergies between these concepts.’ 

The economic value of data, personal rights and the public interest are not opposed to one another, but have synergies and a co-dependency. The public voice from care.data listening could positively help shape a developing consensual model of data sharing if the broader lessons learned are built upon in an ongoing public dialogue. As Mark Taylor also said, ‘we need to do this better.’

*******

[1] according to various polls and opinions gathered from my own discussions and attendance at care.data events in 2014 [refs: 2, 4, 6, 12]

[2] The data trust deficit, work by the Royal Statistical Society in 2014

[3] M Taylor, “Information Governance as a Force for Good? Lessons to be Learnt from Care.data”, (2014) 11:1 SCRIPTed 1 http://script-ed.org/?p=1377

[4] Communications and Change – blogpost https://jenpersson.com/care-data-communications-change/

[5] HSCIC audit under Sir Nick Partridge https://www.gov.uk/government/publications/review-of-data-releases-made-by-the-nhs-information-centre

[6] Listening events, NHS Open Day blogpost https://jenpersson.com/care-data-communications-core-concepts-part-two/

[7] Accelerator for projects mentioned include the 100K Genomics programme https://www.youtube.com/watch?v=s8HCbXsC4z8

[8] Hansard http://www.publications.parliament.uk/pa/cm201314/cmhansrd/cm140311/debtext/140311-0002.htm

[9] Confidentiality Costs; StatsLife http://www.statslife.org.uk/opinion/1723-confidentiality-costs

[10] care.data privacy impact assessment Jan 2014 [newer version has not been publicly released] http://www.england.nhs.uk/wp-content/uploads/2014/01/pia-care-data.pdf

[11] Wellcome Trust http://blog.wellcome.ac.uk/2015/04/08/sharing-research-data-to-improve-public-health/

[12] Dialogue on Data – Exploring the public’s views on using linked administrative data for research purposes: https://www.ipsos-mori.com/researchpublications/publications/1652/Dialogue-on-Data.aspx

[13] HSCIC Lessons Learned http://www.hscic.gov.uk/article/4780/HSCIC-learns-lessons-of-the-past-with-immediate-programme-for-change

The views expressed in this article originally published in the Opinion section of StatsLife are solely mine, the original author. These views and opinions do not necessarily represent those of The Royal Statistical Society.

The nhs.uk digital platform: a personalised gateway to a new NHS?

In recent weeks the rebranding of poverty definitions and the living wage in the UK deservedly received more attention than the rebrand of the website NHS Choices into ‘nhs.uk’.

The site, which will be available only in England and Wales despite its domain name, will be the doorway to a personalised digital NHS offering.

As the plans proceed without public debate, I took some time to consider the proposal announced through the National Information Board (NIB), because it may be a gateway to a whole new world in our future NHS. And if not, will it be a big splash of cash that creates nothing more than a storm in a teacup?

In my previous post I’d addressed some barriers to digital access. Will this be another? What will it offer that isn’t on offer already today and how will the nhs.uk platform avoid the problems of its predecessor HealthSpace?

Everyone it seems is agreed, the coming cuts are going to be ruthless. So, like Alice, I’m curious. What is down the rabbit hole ahead?

What’s the move from NHS Choices to nhs.uk about?

The new web platform nhs.uk will invite users to log on using a system that requires identity. If compulsory, that would be another barrier to access simply from a convenience point of view, even leaving digital security risks aside.

What will nhs.uk offer to incentivise users, and what benefit will it offer as a trade-off against these risks, to make them go down the new path into the unknown and like it?

“At the heart of the domain, will be the development of nhs.uk into a new integrated health and care digital platform that will be a source of access to information, directorate, national services and locally accredited applications.”

In that there is nothing new compared with the information, top-down governance and signposting done by NHS Choices today.

What else?

“Nhs.uk will also become the citizen’s gateway to the creation of their own personal health record, drawing on information from the electronic health records in primary and secondary care.”

nhs.uk will be an access point to patient personal confidential records

Today’s Patient Online programme, we are told, offers 97% of patients access to their own GP-created records. So what will nhs.uk offer beyond what is supposed to be available already? Adding wearables data to the health record is already possible for some EMIS users, so that won’t be new either. The plan does state it will draw on both primary and secondary records, which means achieving some sort of interoperability to show both hospital systems data and GP records. How will the platform do this?

Until care.data, many people didn’t know their hospital record was stored anywhere outside the hospital. In all the care.data debates the public was told that HES/SUS was not like a normal record in the sense we think of it. So which system will secondary care records come from? [Some places may have far to go. My local hospital pushes patients round with beige paper folders.] The answer appears to be either an unpublished known or an unknown.

What else?

nhs.uk will be an access point to tailored ‘signposting’ of services

In addition to access to your personal medical records, in the new “pull not push” process the nhs.uk platform will also offer information and services, in effect ‘advertising’ local services, to draw users to want to use it rather than force its use. And through the power of web tracking tools combined with log-in, it can all be ‘tailored’ or ‘targeted’ to you, the user.

“Creating an account will let you save information, receive emails on your chosen topics and health goals and comment on our content.”

Do you want to receive emails on your chosen topics or comment on content today? How does it offer more than can already be done by signing up now to NHS Choices?

NHS Choices today already offers information on local services, on care provision and symptoms’ checker.

What else?

Future nhs.uk users will be able to “Find, Book, Apply, Pay, Order, Register, Report and Access,” according to the NIB platform headers.


“Convenient digital transactions will be offered like ordering and paying for prescriptions, registering with GPs, claiming funds for treatment abroad, registering as an organ and blood donor and reporting the side effects of drugs. This new transactional focus will complement nhs.uk’s existing role as the authoritative source of condition and treatment information, NHS services and health and care quality information.

“This will enable citizens to communicate with clinicians and practices via email, secure video links and fill out pre-consultation questionnaires. They will also be able to include data from their personal applications and wearable devices in their personal record. Personal health records will be able to be linked with care accounts to help people manage their personal budget.”

Let’s consider those future offerings more carefully.

Separating out the transactions that for most people will be one-off, extremely rare or never events (my blue) leaves other activities which you can already do, or will be able to do, via the Patient Online programme (in purple).

The question is: although video and email are not yet widespread, where they work today and would in future, would they not be done via a GP practice system rather than a centralised service? Or is the plan not that you could have an online consultation with ‘your’ named GP through nhs.uk, but perhaps with just ‘any’ GP from a centrally provided pool? Something like this?

That leaves two other things, which are both payment tools (my bold).

i. digital transactions will be offered like ordering and paying for prescriptions
ii. …linked with care accounts to help people manage their personal budget.”

Is the core of the new offering about managing money at individual and central level?

Beverly Bryant, Director of Strategic Systems and Technology at NHS England, said at the #kfdigi2015 June 16th event that implementing these conveniences had cost-saving benefits as well: “The driver is customer service, but when you do it it actually costs less.”

How are GP consultations to cost significantly less, enough to be really cost-effective once the central platform enabling them is paid for, when GP time is the most valuable part and the time spent on the patient consultation, paperwork and referral, for example, remains unchanged?

That most valuable part to the patient may be seen as what is most costly to ‘the system’.

If the emphasis is on the service saving money, it’s not clear what is in it for people to want to use it, and it risks becoming another HealthSpace: a high-cost, top-down IT rollout without a clear customer-driven need.

The stated aim is that it will personalise the user content and experience.

That gives the impression that the person using the system will get access to information and benefits unique and relevant to them.

If this is to be something patients want to use (pull) and are not to be forced to use (push) I wonder what’s really at its core, what’s in it for them, that is truly new and not part of the existing NHS Choices and Patient online offering?

What kind of personalised tailoring do today’s NHS Choices Ts&Cs sign users up to?

“Any information provided, or any information the NHS.uk site may infer from it, are used to provide content and information to your account pages or, if you choose to, by email.  Users may also be invited to take part in surveys if signed up for emails.

“You will have an option to submit personal information, including postcode, age, date of birth, phone number, email address, mobile phone number. In addition you may submit information about your diet and lifestyle, including drinking or exercise habits.”

“Additionally, you may submit health information, including your height and weight, or declare your interest in one or more health goals, conditions or treatments.”

“With your permission, academic institutions may occasionally use our data in relevant studies. In these instances, we shall inform you in advance and you will have the choice to opt out of the study. The information that is used will be made anonymous and will be confidential.”

Today’s NHS Choices terms and conditions say that “we shall inform you in advance and you will have the choice to opt out of the study.”

If that happens already, and the NHS is honest about its intent to give patients the right to opt out of studies using data gathered from registered users of NHS Choices, why is it failing to honour the 700,000 objections to secondary use of personal data via the HSCIC?

If the future system is all about personal choice, the NIB should perhaps start by enforcing action on the choices the public has already made.

Past lessons learned – platforms and HealthSpace

In the past, the previous NHS personal platform, HealthSpace, came in for some fairly straightforward criticism including that it offered too little functionality.

The Devil’s in the Detail report’s remarks on what users want are as relevant today as they were in 2010. It looked at the then available Summary Care Record (prescriptions, allergies and reactions) and the web platform HealthSpace, which tried to create a way for users to access it.

Past questions from HealthSpace remain unanswered for today’s care.data, and indeed for the future nhs.uk data: What happens if there is a mistake in the record and the patient wants it deleted? How will access be given to third-party carers/users acting on behalf of individuals who lack the capacity to consent to access to their records?

Reasons given by non-users of HealthSpace included lack of interest in managing their health in this way, a perception that health information was the realm of health professionals and lack of interest or confidence in using IT.

“In summary, these findings show that ‘self management’ is a much more complex, dynamic, and socially embedded activity than original policy documents and technical specifications appear to have assumed.”

What lessons have been learned? People today are still questioning the value of a centrally imposed system. Are they being listened to?

Digital Health reported that Maurice Smith, GP and governing body member for Liverpool CCG, speaking in a session on self-care platforms at the King’s Fund event, said that driving people towards one national hub for online services was not an option he would prefer, and that he had no objection to a national portal, “but if you try drive everybody to a national portal and expect everybody to be happy with that I think you will be disappointed.”

How will the past problems that hit HealthSpace be avoided in future?

How will the powers-that-be avoid repeating the same problems in the ongoing rollout of care.data and future projects? I have asked NHS England/NIB leaders this same question three times in the last year and it remains unanswered.

How will you tell patients in advance of any future changes who will access their data records behind the scenes, for what purpose, to future proof any programmes that plan to use the data?

One of the Healthspace 2010 concerns was: “Efforts of local teams to find creative new uses for the SCR sat in uneasy tension with implicit or explicit allegations of ‘scope creep’.”

Any programme using records can’t ethically sign users up to one thing and change it later without informing them before the change. Who will pay for that and how will it be done? If I were in the care.data pilots, I’d want that answered before starting pilot communications.

As an example of changes to ‘what’, or content scope creep, future plans will see ‘social care flags added’ to the SCR, states p.17 of the NIB 2020 timeline. And what is the ‘discovery for the use of genomic data complete’ on p.11 about? Scope creep of ‘who’ will access records is very current: recent changes allow pharmacists to access the SCR, yet the change went by with little public discussion. Will they in future see social care flags or mental health data under their SCR access? Do I trust the chemist as I trust a GP?

Changes without adequate public consultation and communication cause surprises. Bad idea. Sir Nick Partridge said ensuring ‘no surprises’ is key to citizens’ trust after the audit of HES/SUS data uses. He is right.

At the heart of this nhs.uk plan is that it needs to be used by people, and by enough people, to make the investment worthwhile. That is what HealthSpace failed to achieve.

The change proposed doesn’t address the needs of the user as a change issue (slide 4). This is all imposed change, not user need-driven change.

Dear NIB: done this way, the plan seems to ignore the learning from HealthSpace. The evidence shown is self-referring, to Dr Foster and NHS Choices. The only other two examples listed are from Wisconsin and the Netherlands, hardly comparable models of UK lifestyle or healthcare systems.

What is really behind the new front door of the nhs.uk platform?

The future nhs.uk looks very much as though it seeks to provide a central front door to data access, in effect an expanded Summary Care Record (GP and secondary care records) – all medical records for direct care – together with a way for users to add their own wider user data.

Will nhs.uk also allow individuals to share their data with digital service providers of other kinds through the nhs.uk platform and apps? Will their data be mined to offer a personalised front door of tailored information and service nudges? Will patients be profiled to know their health needs, use and costs?

If yes, then who will be doing the mining and who will be using that data for what purposes?

If not, then what value will this service offer if it is not personal?

What will drive the need to log on to another new platform, compared with using the existing Patient Online services today to access our health records and reach GPs via video tools, and, without any log-in requirement, browsing the similar content of information and nudges towards local services offered via NHS Choices?

If this is core to the future of our “patient experience” of the NHS, the public should be given the full and transparent facts to understand the public benefit and the business case for nhs.uk, and what lies behind the change expected via online GP consultations.

This NIB programme is building the foundation of the NHS offering for the next ten years. What kind of NHS are the NIB and NHS England planning for our children and our retirement through their current digital designs?

If there is a significant difference in the new nhs.uk platform offering, a key change from what HealthSpace offered and separate from what Patient Online already provides, it appears to be around managing cost and payments, not delivering any better user service.

Managing more of our payments with pharmacies, and personalised budgets, would reflect the talk of a push towards a patient-responsible self-management direction of travel for the NHS as a whole.

More use of personal budgets is, after all, what Simon Stevens called a “radical new option”, and we would expect to see “wider scale rollout of successful projects is envisaged from 2016-17”.

When the system has finely drawn profiles of its users, what effect will that have for individuals in our universal, risk-shared system? Will a wider rollout of personalised budgets mean more choice, or could it start to mirror a private insurance system, in which a detailed user profile would determine your level of risk, and a personal budget, once spent, would mean no more service?

What I’d like to see and why

To date, there is a poor track record of transparency in sharing central IT/change programme business plans. While saying one thing, another happens in practice. Can that be changed? Why all the effort on NHS Citizen and ‘listening’, if the public is not to be engaged in ‘grown up debate‘ to understand the single biggest driver of planned service changes today: cost?

It is, at best, patronising in the extreme to prevent the public from seeing plans which spend public money.

We risk a wasteful, wearing repeat of the past top down failure of an imposed NPfIT-style HealthSpace, spending public money on a project which purports to be designed to save it.

To understand the practical future we can look back, to avoid what didn’t work, and compare with current plans. I’d suggest they spell out very clearly what the failures of HealthSpace were, and why nhs.uk is different.

If the site offers an additional pathway to access services beyond those we already have, it will cost more, not less. If genuine cost reduction compared with today is expected, where precisely will it come from?

I’d suggest they publish the detailed business plan for the nhs.uk platform and have the debate up front. Not only the headline numbers towards the end of these slides, but where and how it fits together in the big picture of Stevens’ “radical new option”. This is public money, and you *need* the public on side for it to work.

Publish the business cases for the NIB plans before the public engagement meet ups, because otherwise what facts will opinion be based on?

What discussion can be of value without them, when we are continually told by leadership that those very details are at the crux of the needed change: the affordability of the future of the UK health and care system?

Now, as with past projects, The Devil’s in the Detail.

***

NIB detail on nhs.uk and other concepts: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/437067/nib-delivering.pdf

The Devil’s in the Detail: Final report of the independent evaluation of the Summary Care Record and HealthSpace programmes 2010

Digital revolution by design: infrastructures and the fruits of knowledge

Since the beginning of time and the story of the Garden of Eden, man has found a way to share knowledge and its power.

Modern digital tools have become the everyday way to access knowledge for many across the world, giving quick access to information and sharing power more fairly.

In this third part of my thoughts on digital revolution by design, triggered by the #kfdigi15 event on June 16-17, I’ve been considering some of the constructs we have built; those we accept and those that could be changed, given the chance, to build a better digital future.

Not only the physical constructions, the often networked infrastructures, but intangible infrastructures of principles and power, co-dependencies around a physical system; the legal and ethical infrastructures of ownership, governance and accountability.

Our personal data flow in systems behind the screens, at the end of our fingertips. Controlled in frameworks designed by providers and manufacturers, government and commercial agencies.

Increasingly in digital discussions we hear that the data subject, the citizen, will control their own data.

But if it is on terms and conditions set by others, how much control is real, and how much is the talk of a consenting citizen only a fig leaf behind which real control is still held by the developer or organisation providing the service?

When data are used, they are turned into knowledge: business intelligence that adds value to aid informed decision making, by human or machine.

How much knowledge is too much knowledge for the Internet of Things to build about its users? As Chris Matyszczyk wrote:

“We have all agreed to this. We click on ‘I agree’ with no thought of consequences, only of our convenience.”

Is not knowing what we have agreed to our fault, or the responsibility of the provider who’d rather we didn’t know?

Citizens’ rights are undermined in unethical interactions if we are exploited by easy one-click access and exchange our wealth of data at unseen cost. Can this be regulated to promote, not stifle, innovation?

How can we get those rights back and how will ‘doing the right thing’ help shape and control the digital future we all want?

The infrastructures we live inside

As Andrew Chitty says in this HSJ article: “People live more mobile and transient lives and, as a result, expect more flexible, integrated, informed health services.”

To manage that, do we need to know how systems work and how sharing works, and to trust the functionality of what we are not told and do not see behind the screens?

At the personal level, whether we sign up for the social network, use a platform for free email, or connect our homes and ourselves in the Internet of Things, we each exchange our personal data with varying degrees of willingness. There is often no alternative if we want to use the tool.

As more social and consensual ‘let the user decide’ models are introduced, we hear it is all about the user in control, but the reality is that users still have to know what they are signing up for.

In new models of platform identity sign on, and tools that track and mine our personal data to the nth degree that we share with the system, both the paternalistic models of the past and the new models of personal control and social sharing are merging.

Take a Fitbit as an example. It requires a named account and data sharing with the central app provider. You can choose whether or not to enable ‘social sharing’ with nominated friends whom you want to share your boasts or failures with. You can opt out of only that part.

I fear we are seeing the creation of a Leviathan-sized monster that will be impossible to control, and just as scary as today’s paternalistic data mismanagement: some data held by the provider and invisibly shared with third parties beyond our control, some shared with friends, and some stored only on our device.

While data are shared with third parties without our active knowledge, the same issue threatens to derail consumer products and commercial ventures at national scale, and with them the public interest: loss of trust in what is done behind the settings.

Society has somehow let privacy be lost as the default setting. It has become something we have to demand and defend.

“If there is one persistent concern about personal technology that nearly everybody expresses, it is privacy. In eleven of the twelve countries surveyed, with India the only exception, respondents say that technology’s effect on privacy was mostly negative.” [Microsoft survey 2015, of 12,002 internet users]

There’s one part of that I disagree with. It is not the effect of technology itself, but the designers’ and developers’ decision making, that affects privacy. People choose how technology is designed and regulated; those choices, not the technology, determine how privacy is affected.

Citizens have vastly differing knowledge of how data are used and of how best to interact with technology. But if they are told they own their data, then all the decision-making framework should be theirs too.

By giving consumers the impression of control, the shock will be all the greater if a breach ever reveals where fitness wearable users slept, with whom, at what address, and for how long they were active. Could a divorce case demand it?

Fitbit users have already found their data used by police and in the courtroom – probably not what they expected when they signed up for a better health tool. Others may see benefits from tools that, by default, harm those excluded from accessing them.

Some at org level still seem to find this hard to understand but it is simple:
No trust = no data = no knowledge for commercial, public or personal use and it will restrict the very innovation you want to drive.

Google Gmail users have to make 10+ clicks to restrict all ads and information sharing in their privacy and ad account settings. The default is ad tailoring and data mining. Many don’t even know it is possible to change the settings, and how to do so is not intuitive.

Firms need to consider their own reputational risk if users feel these policies are not explicit and amount to exploitation. Those caught ‘cheating’ users can get a very public slap on the wrist.

Let the data subjects rule, but on whose terms and conditions?

The question every citizen signing up to digital agreements should ask is: what is in the small print, and how will I know if it changes? Fair processing should offer data protection, but it isn’t effective.

If you don’t have access to information, you make decisions based on a lack of information or misinformation. Decisions which may not be in your own best interest or that of others. Others can exploit that.

And, realistically and fairly, organisations can’t expect citizens to read pages and pages of Ts&Cs. In addition, we don’t know what we don’t know. Information that is missing can be as vital to understand as that provided. ‘Third parties’ sharing – who exactly does that mean?

The concept of an informed citizenry is crucial to informed decision making but it must be within a framework of reasonable expectation.

How do we grow the fruits of knowledge in a digital future?

Real cash investment is needed now for a well-designed digital future, robust for cybersecurity, supporting enforceable governance and oversight. Collaboration on standards and thorough change plans. I’m sure there is much more, but this is a start.

Figurative investment is needed in educating citizens about the technology that should serve us, not imprison us in constructs we do not understand but cannot live without.

We must avoid the chaos and harm and wasted opportunity of designing massive state-run programmes in which people do not want to participate or cannot participate due to barriers of access to tools. Avoid a Babel of digital blasphemy in which the only wise solution might be to knock it down and start again.

Our legislators and regulators must take up their roles to get data use, and digital contract terms and conditions right for citizens, with simplicity and oversight. In doing so they will enable better protection against risks for commercial and non-profit orgs, while putting data subjects first.

To achieve greatness in a digital future we need: ‘people speaking the same language, then nothing they plan to do will be impossible for them’.

Ethics. It’s more than just a county east of London.

Let’s challenge decision makers to plant the best of what is human at the heart of the technology revolution: doing the right thing.

And from data, we will see the fruits of knowledge flourish.

******

1. Digital revolution by design: building for change and people
2. Digital revolution by design: barriers by design
3. Digital revolution by design: infrastructures and the fruits of knowledge
4. Digital revolution by design: infrastructures and the world we want

Digital revolution by design: building for change and people (1)

Andy Williams said* that he wants not evolution, but a revolution in digital health.

It strikes me that few revolutions have been led top down.

We expect revolution from grass roots dissent, after a growing consensus in the population that the status quo is no longer acceptable.

As the public discourse over the last 18 months about NHS use of patient data has proven, we lack a consensual agreement between state, organisations and the public about how the data in our digital lives should be collected, used and shared.

The 1789 Declaration of the Rights of Man and Citizen as part of the French Revolution set out a charter for citizens, an ethical and fair framework of law in which they trusted their rights would be respected by fellow men.

That is something we need in this digital revolution.

We are told by government that it is necessary to share all our individual-level data in health, from all sorts of sources.

And that bulk data collection is vital in the public interest to find surveillance knowledge that government agencies want.

At the same time other government departments plan to restrict citizens’ freedom of access to knowledge that could be used to hold the same government and its civil servants to account.

On the consumer side, there is public concern about the way we are followed around on the web by companies including global platforms like Google and Facebook, that track our digital footprint to deliver advertising.

There is growing objection to the ways in which companies scoop up data to build profiles of individuals and groups, personalising how they are treated. A recent objection was to marketing misuse by charities.

There is little broad understanding yet of the power of insight that organisations can now have to track and profile due to the power of algorithms and processing capability.

It is technological progress that has left legislation behind.

But whenever you talk to people about data there are two common threads.

The first is that although the public is not happy with the status quo of how paternalistic organisations or consumer companies ‘we can’t live without’ manage our data, there is a feeling of powerlessness that it can’t change.

The second is frustration with organisations that show little regard for public opinion.

What happens when these feelings both reach tipping point?

If Marie Antoinette were involved in today’s debate about the digital revolution I suspect she may be the one saying: “let them eat cookies.”

And we all know how that ended.

If there is to be a digital revolution in the NHS where will it start?

There were marvellous projects going on at grassroots level discussed over the two days: bringing the elderly online and connected, and work in housing and deprivation. Young patients with rare diseases are designing apps and materials to help consultants improve communications with patients.

The NIB meeting didn’t have real public interaction, or any discussion of those projects ‘in the room’, in the 10 minutes offered. Considering the wealth of hands-on digital health and care experience in the audience, it was a missed opportunity for the NIB to hear common issues and listen to suggestions for co-designed solutions.

While white middle class men (for the most part) tell people of their grand plans from the top down, the revolutionaries of all kinds are quietly getting on with action on the ground.

If a digital revolution is core to the NHS future, then we need to ask to understand the intended change and outcome much more simply and precisely.

We should all understand why the NHS England leadership wants to drive change, and be given proper opportunity to question it, if we are to collaborate in its achievement.

It’s about the people, stoopid

Passive participation will not be enough from the general public if the revolution is to be as dramatic as it is painted.

Consensual co-design of plans and co-writing policy are proven ways to increase commitment to change.

Evidence suggests citizen involvement in planning is more likely to deliver success. Change done with, not to.

When constructive solutions have been offered, what impact has engagement had if no change is made to any plans?

If that’s engagement, you’re doing it wrong.

While the leadership is still struggling to get the current design together, it may find it hard to invite public feedback on the future.

But it is only made hard if what the public wants is ignored. If the issues raised at listening events were resolved in the way the public asked for, it could be quite simple.

The NIB leadership clearly felt nervous about debate, giving only 10 minutes of three hours to public involvement, yet debate is what it needs. Questions and criticism are not something to be scared of, but opportunities to make things better.

The NHS top-down digital plans need to be publicly debated and dissected by the clinical professions to see if they fit the current and future model of healthcare, because if people are not involved in the change, the ride to get there will be awfully bumpy.

For data about us, to be used without us, is certainly an outdated model incompatible with a digital future.

The public needs to fight for citizen rights in a new social charter that demands change along lines we want, change that doesn’t just talk of co-design but that actually means it.

If unhappy about today’s data use, then the general public has to stop being content to be passive cash cows as we are data mined.

If we want data used only for public benefit research and not market segmentation, then we need to speak up. To the Information Commissioner’s Office if the organisation itself will not help.

As Nicole Wong, who was one of President Obama’s top technology advisors, recently wrote: “[t]here is no future in which less data is collected and used.”

“The challenge lies in taking full advantage of the benefits that the Internet of Things promises while appropriately protecting consumers’ privacy, and ensuring that consumers are treated fairly.” Julie Brill, FTC, May 4 2015, Berlin

In the rush to embrace the ‘Internet of Things’ it can feel as though the reason for creating these things has been forgotten. If the Internet serves things, it serves consumerism. AI must tread an enlightened path here. If the things are designed to serve people, then we would hope they offer ways of enhancing our life experience.

In the dream of turning a “tsunami of data” into a “tsunami of actionable business intelligence,” it seems all too often the person providing the data is forgotten.

While the Patient and Information Directorate, NHS England or NIB speakers may say these projects are complex and their benefits hard to communicate, I’d say that if you can’t communicate the benefits, it’s not the fault of the audience.

People shouldn’t have to either a) spend immeasurable hours of their personal time understanding how the projects that want their personal data work, or b) put up with being kept ignorant.

We should be able to question fully why it is needed and get a transparent and complete explanation. We should have fully accountable business plans, and scrutiny in public of tangible and intangible benefits, before projects launch on the basis of public buy-in which may be misplaced. We should expect plans to be accessible to everyone, and documents straightforward enough to be so.

Even after listening to a number of these meetings and board meetings, I am not sure many would be able to put it succinctly: what is the NHS digital forward view, really? How is it to be funded?

On the one hand new plans are to bring salvation, while on the other, funding stops for what already works today.

Although the volume of activity planned is vast, what it boils down to, is what is visionary and achievable, and not just a vision.

Digital revolution by design: building for change and people

We have opportunity to build well now, avoiding barriers-by-design, pro-actively designing ethics and change into projects, and to ensure it is collaborative.

Change projects must map out their planned effects on people before implementing technology. For the NHS that’s staff and public.

The digital revolution must ensure the fair and ethical use of the big data that will flow for direct care and secondary uses if it is to succeed.

It must also look beyond its own development bubble as leaders shape their plans, within ever-changing infrastructures in which data, digital, AI and ethics will become important to discuss together.

That includes in medicine.

Design for the ethics of the future, and enable change mechanisms in today’s projects that will cope with shifting public acceptance, because that shift has already begun.

Projects whose ethics and infrastructures of governance were designed years ago, have been overtaken in the digital revolution.

Projects with an old style understanding of engagement are not fit-for-the-future. As Simon Denegri wrote, we could have 5 years to get a new social charter and engagement revolutionised.

Tim Berners-Lee when he called for a Magna Carta on the Internet asked for help to achieve the web he wants:

“do me a favour. Fight for it for me.”

The 1789 Declaration, part of the French Revolution, set out a clear, understandable, ethical and fair framework of law in which citizens trusted their rights would be respected by fellow citizens.

We need one for data in this digital age. The NHS could be a good place to start.

****

It’s exciting hearing about the great things happening at grassroots. And incredibly frustrating to then see barriers to them being built top down. More on that shortly, on the barriers of cost, culture and geography.

****

* at the NIB meeting held on the final afternoon of the Digital Conference on Health & Social Care at the King’s Fund, June 16-17.

NEXT>>
2. Driving Digital Health: revolution by design
3. Digital revolution by design: building infrastructure

Refs:
Apps for sale on the NHS website
Whose smart city? Resident involvement
Data Protection and the Internet of Things, Julie Brill FTC
A Magna Carta for the web