Destination smart-cities: design, desire and democracy (Part three)

Smart Technology we have now: A UK Case Study

In some places today, climate sensors are used to predict smog days and decide when cars should be banned from cities; automatic number-plate recognition (ANPR) can then identify cars driven on the wrong days and issue automatic penalties.
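
The enforcement logic is simple to state. Here is a minimal sketch (purely illustrative, no real city's system implied): a pollution forecast above a threshold triggers a restriction, and each ANPR read is checked against it. The odd/even plate rationing is an assumption borrowed from schemes trialled elsewhere.

```python
from datetime import date

def banned_today(pm25_forecast: float, plate: str, threshold: float = 75.0) -> bool:
    """True if this plate may not be driven today: a pollution forecast
    above the threshold triggers odd/even plate rationing.
    The scheme, threshold and values are all illustrative assumptions."""
    if pm25_forecast < threshold:
        return False                         # no smog-day restriction today
    plate_odd = int(plate[-1]) % 2 == 1      # assumes the plate ends in a digit
    day_odd = date.today().day % 2 == 1
    return plate_odd != day_odd              # drive only on matching-parity days

# Each ANPR read on a smog day becomes a yes/no penalty decision:
if banned_today(pm25_forecast=90.0, plate="DL3CAM6421"):
    print("Automatic penalty notice issued")
```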

Similarly, ANPR technology is used in UK tunnels and congestion-charging systems. One British company encouraging the installation of ANPR in India is also the provider of a significant part of Britain's public administrative data and surveillance software across a range of sectors.

The company says of itself:

“Northgate Public Services has a unique experience of delivering ANPR software to all Home Office police forces. We developed and managed the NADC, the mission critical solution providing continuous surveillance of the UK’s road network.  The NADC is integrated with other databases, including the Police National Computer, and supports more than 30 million reads a day across the country.”

Thirty million snapshots a day from ‘continuous surveillance of the UK’s road network‘. That surprised me. It's roughly half of England's population of around 55 million, not all of whom drive. Thirty million, every day. It's massive, unreasonable, and risks backlash.

Northgate Public Services’ clients also include 80% of UK water companies, as well as many other energy and utility suppliers.

And in the social housing market they stretch to debt collection, or ‘income management’.

So who, I wondered, is this company that owns all this data-driven access to our homes, our roads, our utilities, life insurance, hospital records and registries, and half of all UK calls to emergency services?

Northgate Information Solutions announced the sale of its Public Services division in December 2014 to the private equity firm Cinven. Cinven also owns a 62% shareholding in the UK private healthcare provider Spire, with all sorts of influence given its active share of services and markets.

Not only does this private equity firm hold a vast range of data systems across a wide range of sectors, but it is making decisions about how our public policies and money are directed.

Using health screening data, they're even making decisions that affect our future, our behaviour and our private lives. Their “software provides the information and tools that housing officers need to proactively support residents, such as sending emails, letters or rent reminders by SMS and freeing up time for face-to-face support.”

Of their ANPR systems, Northgate says the data should be even more widely used “to turn CONNECT: ANPR into a critical source of intelligence for proactive policing.”

If the company were to start to ‘proactively’ use all the data it owns across these sectors, we should be asking: is ‘smart’ sensible and safe?

Where is the boundary between proactive and predictive? Or public and private?

Where do companies draw the line between public and personal space?

The public services provided by the company seem to encroach into our private lives in many ways. In Northgate's own words, “It's also deeply personal.”

Who’s driving decision making is clear. The source of their decision making is data. And it’s data about us.

Already today, whether data are collected proactively, as with ANPR, or through managing data we give them with consent for a direct administrative purpose, private companies are the guardians of massive amounts of our personal and public data.

What is shocking to me is how data collected in one area of public services are also used for entirely different secondary purposes, without informed consent or even an FYI, for example in schools.

If we don’t know which companies manage our data, how can we trust that it is looked after well and that we are told if things go wrong?

Steps must be taken on administrative personal data security, transparency and public engagement to shore up public trust: the foundation for any future data sharing, and part of the critical infrastructure of any future strategy, public or commercial. Strategy must include more than minimal transparency about the processing of our data, and real public involvement, if ‘digital citizenship’ is to be meaningful.

How would our understanding of data improve if anyone using personal data were required to put in place clear public statements about their collection, use and analysis of data? If the principles of data protection were actually upheld, in particular that individuals should be informed? How would our understanding improve, especially as regards automated decision making and monitoring technology? Not ninety-page privacy policies. Plain English. If you need ninety pages, you're doing too much with my data.
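
What might such a plain statement contain? A minimal sketch, with every field name and value hypothetical, of a short, structured public declaration that could sit alongside any collection of personal data:

```python
# A hypothetical public data-use statement: short enough to read,
# structured enough to audit. Nothing here describes a real register.
data_use_statement = {
    "controller": "Example Council",                  # assumed name
    "data_collected": ["number plate", "time", "location"],
    "purpose": "congestion charging",
    "retention": "28 days",
    "shared_with": [],                                # empty = no onward sharing
    "automated_decisions": "penalty notice issued on a match",
    "contact": "dataprotection@example.gov.uk",
}
```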

Independent privacy impact assessments should be mandatory and published before data are collected and shared with any party other than the one to which they were given for a specific purpose. Extensions broadening that purpose should require consultation and consent. If the collection point is a street, then publish the assessment in plain sight.

Above all, planning committees in local government, in policy making and practical application, need to think about data, and its ethical implications, in every public decision they make. We need more robust decision-making in the face of corporate data grabs, to keep data collected in public space safe, and to keep some of it private.

How much less fun is a summer's picnic spent smooching, if you feel watched? How much more anxious will we make our children if they are never allowed time to themselves, and every word they type on a school computer is monitored?

How much individual creativity and innovation does that stifle? We are effectively censoring children before they have written a word.

Large corporations have played historically significant and often shadowy roles in surveillance that retrospectively were seen as unethical.

We should consider, sooner rather than later, whether corporations such as BAE Systems, Siemens and the IMSs of the world act in ways worthy of our trust, given such massive reach into our lives with so little transparency and oversight.

“Big data is big opportunity but Government should tackle misuse”

The Select Committee warned in its recent report on Big Data that distrust arising from concerns about privacy and security is often well-founded and must be resolved by industry and Government.

If ‘digital’ means smart technology will in future be used in “every part of government”, as announced at #Sprint16, what will be the effects of the involvement and influence of these massive corporations on democracy itself?

******

I thought about this in more depth in Part one here, in “Smart systems and Public Services” here (Part two), and continue after this by looking at “The Best Use of Data” in predictions and the future (Part four).

Destination smart-cities: design, desire and democracy (Part two)

Smart cities: private reach in public space and personal lives

Smart-cities are growing in the UK through private investment and encroachment on public space. They are being built by design at home, and supported by UK money abroad, with enormous expansion plans in India, for example, in almost 100 cities.

With this rapid expansion of “smart” technology not only within our living rooms but across our living space, and indeed all areas of life, how do we ensure that equitable service delivery (what citizens generally want, as demonstrated by the strength of feeling on the NHS) continues in public ownership, when the boundary in current policy between public and private corporate ownership is ever more blurred?

How can we know, and plan by design, that the values we hope for are good values, and that they will be embedded in systems, policies and planning? Values that most people really care about. How do we ensure “smart” does not ultimately mean less good? That “smart” does not, in the end, mean less human?

Economic benefits seem to be the key driver in current government thinking around technology – more efficient = costs less.

While technology progressing towards replacing repetitive work may be positive, how will we accommodate those whose skills will no longer be needed? Consider in particular its gendered aspect, and the more vulnerable in the workforce, since it is women and other minorities who work disproportionately in part-time, low-skill jobs. Even jobs we think of as intrinsically human, such as caring, mainly held by women, are being trialled for outsourcing to, or assistance by, technology. These robots monitor people in their own homes, and reduce staffing levels and care home occupancy. We'll no doubt hear how good it is that we need fewer carers because, after all, we have a shortage of care staff. We'll find out whether it is positive for the cared-for, or whether they find it less ‘human'[e]. How will we measure those costs?

The ideal future in which we all therefore have more leisure time sounds fab, but if we can't afford it, we won't be spending more of our time in leisure. Some think we'll simply be unemployed. And more people live in the slums of Calcutta than in Soho.

One of the greatest benefits of technology is how much more connected the world can be, but will it also be more equitable?

There are benefits in remote sensors monitoring changes in the atmosphere that dictate when cars should be taken off the roads on smog days, or in indicators of when asthma risk factors are high.

Crowdsourcing information about things which are broken, like FixMyStreet, or lifts that are out of order, is invaluable in cities for wheelchair users.

Innovative thinking, and building through technology, can create tools which solve simple problems and add value for the person using them.

But what of the people who cannot afford data, cannot be included in the skilled workforce, or will not navigate apps on a phone?

When technology dis-incentivises the person using it, the effect is not only their disappointment with the tool, but poorer service delivery, and potentially, wider still, societal exclusion or stigma. These were the findings of the e-red book in Glasgow, explained at the digital event in health held at the King's Fund in summer 2015.

Further along the scale of systems and potential for negative user experience, how do we expect citizens to react to punishments handed out by unseen monitoring systems, to finding out their behaviour was ‘nudged’, or to decisions taken about us, without us?

And what oversight and system of redress exist for people using these systems, or for those whose inaccurate data are used in a system and cause injustice?

And wider still: while we encourage big money spent on big data in our part of the world, how is it contributing to solving problems for the millions for whom it will never matter? Digital and social media make our one connected world increasingly transparent, leaving even less excuse for closing our eyes.

Approximately 15 million girls worldwide are married each year – that’s one girl, aged under 18, married off against her will every two seconds. [Huff Post, 2015]

Tinder-type apps are luxury optional extras for many in the world.

Without embedding values and oversight into some of what we do through digital tools implemented by private corporations for profit, ‘smart’ could mean less fair, less inclusive, less kind. Less global.

If digital becomes a destination, and how much of it is implemented is seen as the measure of success, then measuring how “smart” we become risks losing sight of technology as solutions, and steps towards solving real problems for real people.

We need to be both clever and sensible, in our ‘smart’.

Are public oversight and regulation built in, to make ‘smart’ also safe?

If there were public consultation on how a “smart” society should look, would we all agree on whether, and how, we want it?

Thinking globally, we need to ask whether we are prioritising the wrong problems. Are we creating more tech for problems we have already solved, in places where governments are willing to spend on it? And will it, in those places, make society more connected across class and improve it for all, or enhance the lives of the ‘haves’ by having more, while the ‘have-nots’ are excluded?

Does it matter how smart your TV gets, or your carer, or your car, if you cannot afford any of these convenient add-ons to Life v1.1?

As we are ever more connected, we are a global society, and being ‘smart’ in one area may be reckless if it comes at the expense, or in ignorance, of another.

People need to Understand what “Smart” means

“Consistent with the wider global discourse on ‘smart’ cities, in India urban problems are constructed in specific ways to facilitate the adoption of “smart hi-tech solutions”. ‘Smart’ is thus likely to mean technocratic and centralized, undergirded by alliances between the Indian government and hi-technology corporations.”  [Saurabh Arora, Senior Lecturer in Technology and Innovation for Development at SPRU]

Those investing in both countries are often the same large corporations. Very often, venture capitalists.

Systems designed and owned by private companies provide the information technology infrastructure that is:

‘the basis for providing essential services to residents. There are many technological platforms involved, including but not limited to automated sensor networks and data centres.’

What happens when the commercial and public interests conflict, and who decides that they do?

Decision making, Mining and Value

The massive amounts of data generated are being mined to make predictions, take decisions and influence public policy: in effect, using Big Data for research purposes.

Using population-wide datasets for social and economic research today is done in safe settings, using de-identified data, in the public interest, with independent analysis of the risks and benefits of projects as part of the data access process.

Each project goes before ethics committee review to assess its privacy considerations, and not only whether the project can be done, but whether it should be done, before it comes for central review.

Similarly, our smart-cities need ethics committee review, assessing the privacy impact and potential of projects before smart technology is commissioned or approved. Not only assessing whether they are feasible and whether we ‘can’ do it, but whether we ‘should’. Not only assessing the use of the data generated by the projects, but the ethical and privacy implications of the technology implementation itself.

The Select Committee's recent recommendations on Big Data proposed that a ‘Council of Data Ethics’ be created to address these consent and trust issues head on. But how?

Unseen smart-technology continues to grow unchecked often taking root in the cracks between public-private partnerships.

We keep hearing about Big Data improving public services, but that “public” data is often held by private companies. In fact, our personal data for public administration have been widely outsourced to private companies over which we have little oversight.

We’re told we paid the price in terms of skills and are catching up.

But if we simply roll forward in first gear into the connected city that sees all, we may find we arrive at a destination that was neither designed nor desired by the majority.

We may find that the “revolution, not evolution” hoped for in digital services will be of the unwanted kind, if companies keep pushing for more and more data without the individual's consent, and without our collective public buy-in to decisions made about data use.

Having written all this, I've now read the Royal Statistical Society's publication, which eloquently summarises their recent work and thinking. But I wonder how we tie all this into practical application.

How we do governance and regulation is tied tightly to the practicality of public-private relationships, but also to deciding what society should look like. That is what our collective and policy decisions about what smart-cities should be, and may do, are ultimately defining.

I don't think we are yet addressing in depth the complexity of regulation and governance that will be needed to make Big Data and public spaces safe, because companies say too much regulation risks choking off innovation and creativity.

But that risk need not be realised if it is managed well.

Rather, we must quickly see action to manage the application of smart technology in a thoughtful way, because if we do not, very soon we'll have lost any say in how our service providers deliver.

*******

I began my thoughts about this in Part one, on smart technology and data from the Sprint16 session, and after this (Part two) continue by looking at the design and development of smart technology making “The Best Use of Data”, with a UK company case study (Part three), and “The Best Use of Data” in predictions and the future (Part four).

Destination smart-cities: design, desire and democracy (Part one)

When I drop my children at school in the morning I usually tell them three things: “Be kind. Have fun. Make good choices.”

I’ve been thinking recently about what a positive and sustainable future for them might look like. What will England be in 10 years?

The #Sprint16 snippets I read talk about how: “Digital is changing how we deliver every part of government,” and “harnessing the best of digital and technology, and the best use of data to improve public services right across the board.”

From that three things jumped out at me:

  • The first is that the “best use of data” in government’s opinion may conflict with that of the citizen.
  • The second, is how to define “public services” right across the board in a world in which boundaries between private and public in the provision of services have become increasingly blurred.
  • And the third is the power of tech to offer both opportunity and risk if used in “every part of government”, and its effects on access to, involvement in, and the long-term future of, democracy.

What’s the story so far?

In my experience so far of trying to be a digital citizen “across the board” I’ve seen a few systems come and go. I still have my little floppy paper Government Gateway card, navy blue with yellow and white stripes. I suspect it is obsolete. I was a registered Healthspace user, and used it twice. It too, obsolete. I tested my GP online service. It was a mixed experience.

These user experiences are shaping how I interact with new platforms and my expectations of organisations, and I will be interested to see what the next iteration, nhs alpha, offers.

How platforms and organisations interact with me, and my data, is however increasingly assumed without consent. This involves new data collection, not only using data from administrative or commercial settings to which I have agreed, but new scooping of personal data all around us in “smart city” applications.

Just having these digital applications will, by itself, be of no benefit, while all the disadvantages of surveillance for its own sake will be realised.

So how do we know how all these data collected are used, and by whom? How do we ensure that all the tracking actually gets turned into knowledge about pedestrian and traffic flow, to make streets and roads safer and smoother in their operation, to make street lighting more efficient, or the environment better to breathe in and enjoy? And that we don't just gift private providers tonnes of valuable data which they simply pass on to others for profit?
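
The difference between knowledge and hoarded tracking data can be made concrete. A minimal sketch (hypothetical, no real provider's pipeline implied) of reducing raw sensor reads to the aggregate counts a city actually needs, discarding the identifying element at the point of aggregation:

```python
from collections import Counter
from datetime import datetime

def aggregate_footfall(events):
    """Reduce raw sensor events to hourly counts per street.

    events: iterable of (device_id, street, timestamp) tuples.
    The device_id, the only personal element, is discarded here;
    only the anonymous (street, hour) tallies are kept.
    """
    counts = Counter()
    for device_id, street, ts in events:
        counts[(street, ts.replace(minute=0, second=0, microsecond=0))] += 1
    return counts

# Three sightings of two devices become two anonymous counts:
events = [
    ("mac-a1", "High St", datetime(2016, 3, 1, 8, 10)),
    ("mac-b2", "High St", datetime(2016, 3, 1, 8, 40)),
    ("mac-a1", "Bridge Rd", datetime(2016, 3, 1, 9, 5)),
]
print(aggregate_footfall(events))
```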

Because without making things better, this Internet of Things will be a one-way ticket to power in the hands of providers, and to loss of control and quality of life. We'll work around it: buying a separate SIM card for trips into London, avoiding certain parks or bridges, managing our FitBits to the nth degree under a pseudonym. But being left no choice but to opt out of places, or of the latest technology, is also tedious. If we want to buy a smart TV to access films on demand, but don't want it to pass surveillance or tracking information back to the company, how can we easily find out which products offer that choice?

Companies have taken private information that is none of their business, and quite literally, made it their business.

The consumer technology hijack of “smart” to always mean marketing surveillance creates a divide between those who will comply for convenience and pay the price in their privacy, and those who prize privacy highly enough to take steps that are less convenient, but less compromised.

But even for those wanting the latter, it can be so hard to find out how that people feel powerless and give in to the easy option on offer.

Today's system of governance and oversight, managing how our personal data are processed by providers of public and private services, in both public and private space, is insufficient to meet the values most people reasonably expect: to be able to live their lives without interference.

We’re busy playing catch up with managing processing and use, when many people would like to be able to control collection.

The Best use of Data: Today

My experience of how the government wants to ‘best use data’ is that until 2013 I assumed the State was responsible with it.

I feel bitterly let down.

care.data taught me that the State thinks my personal data and privacy are something to exploit, and that “the best use of my data” for them may be quite at odds with what individuals expect. My trust in the use of my health data by government has been low ever since. Saying one thing and doing another isn't making it more trustworthy.

I found out in 2014 how my children's personal data are commercially exploited and given to third parties, including the press, outside safe settings, by the Department for Education. Now my trust is at rock bottom. I tried to take a look at what the National Pupil Database stores on my own children and was refused a subject access request, while the commercial sector and Fleet Street press are given not only identifiable data, but ‘highly sensitive’ data. This just seems plain wrong in terms of security, transparency and respect for the person.

The attitude that there is an entitlement of the State to individuals’ personal data has to go.

The State has pinched 20 million children's privacy without asking. Tut tut indeed. [see Very British Problems for a translation]

And while I support the use of public administrative data in de-identified form in safe settings, it does not follow that anything goes. The feeling of entitlement to access our personal data, for purposes other than those for which we consented, is growing as it stretches to commercial sector data. However, suggesting that public feeling, measured in work with 0.0001% of the population, shows “wide public support for the use and re-use of private sector data for social research” seems tenuous.

Even so, comments even in that tiny population suggested, “many participants were taken by surprise at the extent and size of data collection by the private sector” and some “felt that such data capture was frequently unwarranted.” “The principal concerns about the private sector stem from the sheer volume of data collected with and without consent from individuals and the profits being made from linking data and selling data sets.”

The Best use of Data: The Future

Young people, despite seniors often saying “they don't care about privacy”, are leaving social media in search of greater privacy.

These things cannot be ignored if the call for digital transformation between the State and the citizen is genuine, because try to do it to us and it will fail. Change must be done with us. And ethically.

And not “ethics” as in ‘how to’, but the ethics of “should we”. Qualified, transparent evaluation, as done in other research areas, not an add-on but integral to every project, looking at issues such as:

  • whether participation is voluntary, opt-out or covert
  • how participants can get and give informed consent
  • accessibility to information about the collection and its use
  • small numbers, particularly of vulnerable people included
  • identifiable data collection or disclosure
  • arrangements for dealing with disclosures of harm and recourse
  • and how the population that will bear the risks of participating in the research is likely to benefit from the knowledge derived from the research or not.

Ethics is not about getting away with using personal data in ways that won’t get caught or hauled over the coals by civil society.

It’s balancing risk and benefit in the public interest, and not always favouring the majority, but doing what is right and fair.

We hear a lot at the moment about how the government may see lives shaped by digital skills, but too little of their vision for what living will look and feel like in the smart cities of the future.

My starting question is: how does government hope society will live there, and is it up to them to design it? If not, who is? Because these smart-city systems are not designing themselves. You've heard of the Stepford Wives. What do we do if we do not want to live like Milton Keynes Man?

I hope that the world my children will inherit will be more just, more inclusive and kinder than it is today, with a more sustainable climate to support food and livelihoods. Will ‘smart’ help or hinder?

What is rarely discussed in technology debates is how the service should look, regardless of technology. The technology, assumed to be inevitable, becomes the centre of service delivery.

I'd first like to understand the central and local government vision for “public services” provision for people of the future. What does it mean for everyday services like schools and health, and how does it balance security and our freedoms?

Because without thinking about how and who provides those services for people, there is a hole in the discussion of “the best use of data” and their improvement “right across the board”.

The UK government has big plans for big data sharing, sharing across all public bodies, some tailored for individual interventions.

While there are interesting opportunities for public benefit from at-scale systems, the public benefit is at risk not only from lack of trust in how systems gather and use data, but also from the risk that interoperability in service, and the freedom for citizens to change provider, get lost in market competition.

Openness and transparency can be absent in public-private partnerships until things go wrong. Given the scale of smart-cities, we must have more than hope that data management and security will not be one of those things.

How will we know if new plans are designed well, or not?

When I look at my children’s future and how our current government digital decision making may affect it, I wonder if their future will be more or less kind. More or less fun.

Will they be left with the autonomy to make good choices of their own?

The hassle of feeling watched all the time, by everything we own, in every place we go, having to check that every checkbox has a reasonable privacy setting, has a cumulative cost in our time and our anxieties.

Smart technology has invaded not only our public space and our private space, but has nudged into our head space.

I for one have had enough already. For my kids I want better. Technology should mean progress for people, not tyranny.

Living in smart cities, connected in the Internet-of-Things, run on their collective Big Data and paid for by commercial corporate providers, threatens not only their private lives and well-being, their individual and independent lives, but ultimately independent and democratic government as we know it.

*****

This is the start of a four-part set of thoughts: beginnings with smart technology and data, triggered by the Sprint16 session (Part one). I think about this more in depth in “Smart systems and Public Services” (Part two) here, and in the design and development of smart technology making “The Best Use of Data”, looking at today in a UK company case study (Part three), before thoughts on “The Best Use of Data” in predictions and the future (Part four).

Breaking up is hard to do. Restructuring education in England.

This Valentine’s I was thinking about the restructuring of education in England and its wide ranging effects. It’s all about the break up.

The US EdTech market is very keen to break into the UK, and our front door is open.

We have adopted the model of Teach First, partnered with Teach for America, while some worry we do not ask “What is education for?”

Now we hear the next chair of Ofsted is to be sought from the US, someone who is renowned as “the scourge of the unions.”

Should we wonder how long until the management of schools themselves is US-sourced?

The education system in England has been broken up in recent years into manageable parcels – for private organisations, schools within schools, charity arms of commercial companies, and multi-school chains to take over. In effect, recent governments have made reforms that have dismantled state education as I knew it.

Just as the future vision of education outlined in the 2005 Direct Democracy co-authored by Michael Gove said, “The first thing to do is to make existing state schools genuinely independent of the state.”

Free schools, touted as giving parents the ultimate in choice, are in effect another way to nod approval to the outsourcing of the state into private hands, and into big chains. Despite seeing the model fail spectacularly abroad, the government seems set on the same here.

Academies, the route that finagles private corporations into running public education, are the preferred model, says Mr Cameron. While there are no plans to force schools to become academies, the legislation currently in ping-pong under the theme of ‘coasting schools’ enables just that. The Secretary of State can impose academisation, albeit only on schools Ofsted has labelled ‘failing’.

What ‘fails’ sometimes appears to be a school that staff and parents cannot understand as anything less than good, but small. While small can be what parents want, small pupil-teacher ratios mean higher costs per pupil. But the direction of growth is towards ‘big is better’.

“There are now 87 primary schools with more than 800 pupils, up from 77 in 2014 and 58 in 2013. The number of infants in classes above the limit of 30 pupils has increased again – with 100,800 pupils in these over-sized classes, an increase of 8% compared with 2014.” [BBC]

All this restructuring creates costs about which the Department wants to be less than transparent, and of which it has lost track.

If only we could see that these new structures raised standards. But “while some chains have clearly raised attainment, others achieve worse outcomes creating huge disparities within the academy sector.”

If not delivering better results for children, then what is the goal?

A Valentine’s view of Public Service Delivery: the Big Break up

Breaking up the State system, once perhaps unthinkable, is possible through the creation of ‘acceptable’ public-private partnerships (as opposed to outright privatisation per se). Schools become academies through a range of providers and different pathways, at least to start with; as some fail, the most successful become the market leaders in an oligopoly. Ultimately, perhaps, this could become a near monopoly. Delivering ‘better’. Perhaps a new model, a new beginning: a new provider offering salvation from the flood of ‘failing’ schools, coming to the State's rescue.

In order to achieve this entry to the market by outsiders, you must first remove conditions seen as restrictive, giving more ‘freedom’ to providers: to cut corners, or ‘make efficiency savings’, on things like food standards, the required curriculum, and the numbers of staff, or their pay.

And what if, as a result, staff leave, or are hard to recruit?

Convincing people that “tech” and “digital” will deliver cash savings and teach required skills through educational machine learning is key if staff costs are to be reduced; in times of austerity, when all else has been cut, staffing is the only budget left to slash.

Providers of self-teaching systems are convincing in their arguments that tech is the solution.

Sadly, I remember when a similar thing was tried on paper. My first year of GCSE maths, aged 13-14, was ‘taught’ at our secondary comp by working through booklets in a series that we self-selected from the workbench in the classroom. Then we picked up the master marking-copy once done. Many of the boys didn't need long to work out that the first step was an unnecessary waste of time. The teacher had no role in the classroom. We were bored to bits. By the final week at the end of the year, they had sellotaped the teacher to his chair.

I kid you not.

Teachers are so much more than knowledge transfer tools, and yet some today seem to consider them replaceable by technology.

The US is ahead of us in this model, which has grown hand-in-hand with commercialism in schools. Many parents are unhappy.

So is the DfE setting us up for future heartbreak if it wants us to go down the US route of more MOOCs, more tech, and less funding and fewer staff? Where’s the cost benefit risk analysis and transparency?

We risk losing the best of what is human from the classroom if we remove the values teachers model and inspire. Unions, teachers and educationalists are, I am sure, more than aware of all these cumulative changes. However, the wider public seems little engaged.

For anyone ‘in education’ these changes will all be self-evident, and their balance of risks and benefits a matter of experience and political persuasion. As a parent, I've only come to understand these changes through researching how our pupils' personal and school data have been commercialised, given away from the National Pupil Database without our consent since legislation changed in 2013, and the Higher Education student and staff data sold.

Will more legislative change be needed to keep our private data accessible in public services operating in an increasingly privately-run delivery model? And who will oversee that?

The Education Market is sometimes referred to as ‘The Wild West’. Is it getting a sheriff?

The news that the next chair of Ofsted is to be sought from the US did set alarm bells ringing for some in the press, who fear US standards and US-led organisations in British schools.

“The scourge of the unions” means not supportive of staff-based power, and in health our junior doctors have clocked exactly what breaking their ‘union’ bargaining power is all about. So who is driving all this change in education today?

Some education providers might be seen as individuals profiting from the State break-up. Some were accused of ‘questionable practices‘. Oversight has been lacking, others said. Margaret Hodge in 2014 was reported to have said: “It is just wrong to hand money to a company in which you have a financial interest if you are a trustee.”

I wonder if she has an opinion on a lead non-executive board member at the Department for Education also being the director of one of the biggest school chains? Or the ex-Minister now employed by the same chain? Or that his campaign was funded by the same director? Why this register of interests is not transparent is a wonder.

It could appear to an outsider that the private-public revolving door is well oiled with sweetheart deals.

Are the reforms begun by Mr Gove simply to be executed to their end goal, whatever that may be, through Nicky Morgan, or is she driving her own new policies?

If Ofsted were to become US-experience-led, will the Wild West be tamed, or will US providers be invited to join the action, reshaping a new frontier? What is the end game?

Breaking up is not hard to do, but in whose best interest is it?

We need only look to health to see a similar pattern.

The structures are freed up, and boundaries opened up (if you meet the other criteria), in the name of ‘choice’. The organisational barriers to break-up are removed in the name of ‘direct accountability’. And as for enabling plans through more ‘business intelligence’ gathered from data sharing, well, those plans abound.

Done well, new efficient systems and structures might bring public benefits, and the right technology can certainly bring great things. But have we first understood what made the old less efficient, if indeed it was, and where are the baselines to look back on?

Where is the transparency of the end goal and what’s the price the Department is prepared to pay in order to reach it?

Is reform in education transparent in its ideology, and in how its success is being measured, if not by improved attainment?

The results of change can also be damaging. In health we see failing systems and staff shortages, and their knock-on effects on patient care. In schools, these failures damage children's start in life; it's not just a ‘system’.

Can we assess if and how these reforms are changing the right things for the right reasons? Where is the transparency of what problems we are trying to solve, to assess what solutions work?

How is change impact for good and bad being measured, with what values embedded, with what oversight, and with whose best interests at its heart?

2005’s Direct Democracy could be read as a blueprint for co-author Mr Gove’s education reforms less than a decade later.

Debate over the restructuring of education and its marketisation seems to have bypassed most of us in the public, in a way health has not.

Underperformance, as measured by new and often hard-to-discern criteria, means takeover at an unprecedented pace.

And what does this mean for our most vulnerable children? Academies are not required to offer places to children with special educational needs (SEN). The 2005 plans co-authored by Mr Gove also included: “killing the government's inclusion policy stone dead,” without an alternative.

Is this the direction of travel our teachers and society supports?

What happens when breakups happen and relationship goals fail?

Who picks up the pieces? I fear the state is paying heavily for the break up deals, investing heavily in new relationships, and yet will pay again for failure. And so will our teaching staff, and children.

While Mr Hunt is taking all the heat right now for his part in writing Direct Democracy and its proposals to privatise health, set against the current health reforms and restructuring of junior doctors' contracts, we should perhaps also look to co-author Mr Gove, and ask to better understand the current impact of his recent education reforms, compare them with what he proposed in 2005, and prepare for the expected outcomes of change before it happens (see p74).

One outcome was that failure was to be encouraged in this new system, and Sweden held up as an exemplary model:

“Liberating state schools would also allow the all-important freedom to fail.”

As Anita Kettunen, principal of JB Akersberga in Sweden, reportedly said when the free schools chain funded by a private equity firm failed:

“if you’re going to have a system where you have a market, you have to be ready for this.”

Breaking up can be hard to do. Failure hurts. Are we ready for this?
******

 

Abbreviated on Feb 18th.

 

The front door to our children’s personal data in schools

“EdTech UK will be a pro-active organisation building and accelerating a vibrant education and learning technology sector and leading new developments with our founding partners. It will also be a front door to government, educators, companies and investors from Britain and globally.”

Ian Fordham, CEO, EdTech UK

This front door is a gateway to access our children's personal data, and through it some companies are coming into our schools and homes and taking our data without asking. And with that, our children lose control over their safeguarded digital identity. Forever.

Companies are all “committed to customer privacy”, in those privacy policies that exist at all. Typically, however, this still means they share your information with ‘our affiliates, our licensors, our agents, our distributors and our suppliers’, and those circles are wide, often in perpetuity. Many simply don't have a published policy.

Where do they store any data produced in the web session? Who may access it and use it for what purposes? Or how may they use the personal data associated with staff signing up with payment details?

According to research from London & Partners, championed by Boris Johnson, Martha Lane-Fox and others in EdTech, education is one of the fastest growing tech sectors in Britain and is worth £45bn globally; a number set to reach a staggering £129bn by 2020. And perhaps the EdTech diagrams in US dollars show where the UK plans to draw companies from. If you build it, they will come.

The enthusiasm of some US EdTech-type entrepreneurs I have met, or listened to speak, is akin to religious fervour. Such is their drive for tech, however, that they appear to forget that education is all about the child. Individual children. Not cohorts, or workforces. And even when they do remember, it can be sincerely said, but lacks substance when you examine their policies in practice.

How is the DfE measuring the cost and benefit of tech and its applications in education?

Is anyone willing to say that not all tech is good tech, and not every application is a wise application? Because every child is unique, not every app is one-size-fits-all.

My 7-year-old got so caught up in the game, and in the mastery of the app her class was once prescribed for homework, that she couldn't master the maths, and it harmed her confidence. (Imagine something like clicking on the two correct sheep with numbers stamped on them that together add up to 12, for example, before they fall off and die.)

She has no problem with maths. Nor doing sums under pressure. She told me happily today she’d come joint second in a speed tables test. That particular app style simply doesn’t suit her.

I wonder if other children and parents find the same and if so, how would we know if these apps do more harm than good?

Nearly 300,000 young people in Britain have an anxiety disorder, according to the Royal College of Psychiatrists. Feeling watched all the time, on- and offline, is unlikely to make anxiety any better.

How can the public and parents know that edTech which comes into the home with their children, is behaviourally sound?

How can the public and parents know that edTech which affects their children, is ethically sound in both security and application?

Where is the measured realism in the providers' and policy makers' fervour, when both seek to marketise EdTech and our personal data for the good of the economy, and ‘in the public interest’?

Just because we can does not always mean we should. Simply because data linkage is feasible, even if it brings public benefit, does not mean it will always be in our best interest.

In whose best Interest is it anyway?

Right now, I'm not convinced that the digital policies at the heart of the Department for Education, the EdTech drivers, or many providers have our children's best interests at heart at all. It's all about the economy; when talking, if at all, about children using the technology, many speak only of ‘preparing the workforce’.

Are children and parents asked to consent at individual level to the terms and conditions of the company and told what data will be extracted from the school systems about their child? Or do schools simply sign up their children and parents en masse, seeing it as part of their homework management system?

How much ‘real’ personal data they use varies. Some use only pseudo-IDs assigned by the teacher. Others log, store and share everything a child does, tied to their ID or real email address; they store performance over time and provide personalised reports of results.
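
The gap between those two models is worth making concrete. A minimal sketch (all names, keys and records hypothetical, no real product's design implied) of the difference between a pseudonym only the school can reverse, and an account keyed to a real, shareable identity:

```python
import hashlib
import hmac

# Assumption: this key never leaves the school, so only the school can
# link a pseudonym back to a pupil.
SCHOOL_SECRET = b"held-by-the-school-never-shared"

def pseudo_id(pupil_name: str) -> str:
    """Derive a stable pseudonym from a pupil's name."""
    return hmac.new(SCHOOL_SECRET, pupil_name.encode(), hashlib.sha256).hexdigest()[:12]

# Pseudonymous record: useful to the app, meaningless to third parties.
record_pseudo = {"user": pseudo_id("Jane Doe"), "score": 9, "topic": "times tables"}

# Identifiable record: a performance history tied to a real identity,
# and so shareable, linkable and persistent far beyond the classroom.
record_real = {"user": "jane.doe@example.com", "score": 9, "topic": "times tables"}
```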

Teachers and schools have a vital role to play in understanding data ethics and privacy to get this right, and, speaking to many, it doesn't seem something they feel well equipped to do. Parents aren't always asked. But shouldn't schools always have to ask before giving data to a commercial third party, other than in an ‘emergency’ situation?

I love tech. My children love making lego robots move with code. Or driving drones with bananas. Or animation. Technology offers opportunity for application in and outside schools for children that are fascinating, and worthy, and of benefit.

If, however, parents are to protect children's digital identity for the future, and to be able to hand control of, and integrity over, their personal data to them as adults, we must better accommodate children's data privacy in this 2016 gold rush for EdTech.

Pupils and parents need to be assured their software is both educationally and ethically sound. Who defines those standards?

Who is in charge of Driving, Miss Morgan?

Microsoft's vice-president of worldwide education recently opened the BETT exhibition, praised teachers for using technology to achieve amazing things in the classroom, and urged innovators to “join hands as a global community in driving this change”.

While there is a case to say no exposure to technology in today’s teaching would be neglectful, there is a stronger duty to ensure exposure to technology is positive and inclusive, not harmful.

Who regulates that?

We are on the edge of an explosion of tech and children’s personal data ‘sharing’ with third parties in education.

Where is its oversight?

The community of parents and children is at real risk of being completely left out of these decisions, and exploited.

The upcoming “safeguarding” policies online are a joke if the DfE tells us loudly to safeguard children’s identity out front, and quietly gives their personal data away for cash round the back.

The front door to our children’s data “for government, educators, companies and investors from Britain and globally” is wide open.

Behind the scenes in pupil data privacy, it's a bit of a mess. And these policy makers and providers forgot to ask first if they could come in.

If we build it, would you come?

My question now is, if we could build something better on pupil data privacy AND better data use, what would it look like?

Could we build an assessment model of the collection, use and release of data in schools that could benefit pupils and parents, AND educational establishments and providers?

This could be a step towards future-proofing public trust, which will be vital for companies who want a foot in the door of EdTech: design an ethical framework for digital decision making, and a practical data model for use in education.

Educationally and ethically sound.

If providers, policy makers, schools at group Trust level, and Data Protection and Privacy civil society experts could meet together to shape a toolkit for assessing privacy impact, to ensure safeguarding and freedoms, to enable safe data flow, and to help design the cybersecurity that works for them and protects children's privacy, lacking today, designing for tomorrow, would you come?

Which door will we choose?

*******

image credit: @ Ben Buschfeld Wikipedia

*added February 13th: Ofsted Chair sought from US

Commission on Freedom of Information: submission

Since it appears that the Independent Commission on Freedom of Information [FOI] has not published all of the received submissions, I thought I’d post what I’d provided via email.

I'd answered two of the questions with two case studies: the first on the application of the section 35 and 36 exemptions and the ‘safe space’; the second on the proposal for potential charges.

On the Commission website, tab 2 of the Excel spreadsheet of evidence submitted online notes that NHS England belatedly asked for its submission to be unpublished.

I wonder why.

Follow-ups to both these FOI requests are now long overdue in 2016. The first, from NHS England, concerns the care.data decision making behind the 2015 decision not to publish a record of whether parts of the board meetings were to be secret. Transparency needs to be seen in action to engender public trust. After all, they're deciding things like how care.data and genomics will be at the “heart of the transformation of the NHS.”

The second is overdue at the Department for Education, on the legal basis for identifiable, sensitive data releases from the National Pupil Database that meets Schedule 3 of the Data Protection Act 1998, to permit this data sharing with commercial third parties.

Both are in line with the apparently recommended use of FOI, according to Mr Grayling, who most recently said:

“It is a legitimate and important tool for those who want to understand why and how Government is taking decisions and it is not the intention of this Government to change that”.  [Press Gazette]

We'll look forward to seeing whether that final sentence is indeed true.

*******

Independent Commission on Freedom of Information Submission
Question 1: a) What protection should there be for information relating to the internal deliberations of public bodies? b) For how long after a decision does such information remain sensitive? c) Should different protections apply to different kinds of information that are currently protected by sections 35 and 36?

A “safe space” in which to develop and discuss policy proposals is necessary. I can demonstrate where it was [eventually] used well, in a case study of a request I made to NHS England. [1]

The current protections afforded to the internal deliberations of public bodies are sufficient, given the section 35 and 36 exemptions. I asked in October 2014 for NHS England to publish the care.data planning and decision making for the national NHS patient data extraction programme. This programme has been controversial [2]. It will come at great public expense and to date has been harmful to public and professional trust, with no public benefit. [3]

NHS England refused my request based on Section 22 [intended for future publication]. [4] However, ten months later the meeting minutes had still not been published. In July 2015, after appeal, the Information Commissioner issued an Information Notice, and NHS England published sixty-three minutes and papers in August 2015.

In these released documents, the section 36 exemption was then applied to only a tiny handful of redacted comments. This was sufficient to protect the decisions that NHS England felt to be most sensitive, and yet still enabled the release of a year's worth of minutes.

Transparency does not mean that difficult decisions cannot be debated since only outcomes and decisions are recorded, not every part of every discussion verbatim.

The current provision for safe space using these exemptions is effective, and in this case would have been no different whether applied immediately after the meeting or one and a half years later. If anything, earlier publication may have resulted in better-informed policy and decision making, through wider involvement from professionals and civil society. The secrecy in the decision making did not build trust.

When policies such as these are found to have no financial cost-benefit case, for example, I believe it is strongly in the public interest to have transparency of those facts, to scrutinise the policy's governance, and to enable early intervention where necessary.
In the words of the Information Commissioner:

“FOIA can rightly challenge and pose awkward questions to public authorities. That is part of democracy. However, checks and balances are needed to ensure that the challenges are proportionate when viewed against all the other vital things a public authority has to do.

“The Commissioner believes that the current checks and balances in the legislation are sufficient to achieve this outcome.” [5]

Given that most public bodies, including NHS England's Board, routinely publish their minutes, this would seem standard good practice to be expected, and I believe routine publication of meeting minutes would have raised the trustworthiness of the programme and its oversight and leadership.

The same section 36 exemption could have been applied from the start to the small redactions that were felt necessary, balanced against the public interest in open and transparent decision making.

I do not believe more restrictive applications should be made than are currently under sections 35 and 36.

_____________________________________________________________________

Question 6: Is the burden imposed on public authorities under the Act justified by the public interest in the public’s right to know? Or are controls needed to reduce the burden of FoI on public authorities?

As an individual, I made 40 requests of schools and 2 of the Department for Education, which may now result in benefits for 8 million children and their families, as well as future citizens.

The transparency achieved through these Freedom of Information requests will, I hope, soon transform the culture at the Department for Education from one of secrecy to one of openness.

There is the suggestion that a Freedom of Information request would incur a charge to the applicant.

I believe that the benefits of the FOI Act in the public interest outweigh the cost of FOI to public authorities. In this second example [6], I would ask the Commission to consider: had I not been able to make these Freedom of Information requests due to cost, and therefore been unable to present evidence to the Minister, the Department and the Information Commissioner, would the panel members support the secrecy around the ongoing risk that current practices pose to children and our future citizens?

Individual, identifiable and sensitive pupil data are released to third parties from the National Pupil Database without telling pupils, parents and schools, and without their consent. This Department for Education (DfE) FOI request aimed to obtain an understanding of any due diligence and of the release process: the privacy impact and DfE decision making, with a focus on its accountability.

This was to enable transparency and scrutiny in the public interest, to increase the understanding of how our nation's children's personal data are used by government and commercial third parties, and of how even identifiable and sensitive data are given to members of the press.

Chancellor Mr. Osborne spoke on November 17 about the importance of online data protection:

“Each of these attacks damages companies, their customers, and the public’s trust in our collective ability to keep their data and privacy safe.”[…] “Imagine the cumulative impact of repeated catastrophic breaches, eroding that basic faith… needed for our online economy & social life to function.”

Free access to FOI enabled me as a member of the public to ask and take action with government and get information from schools to improve practices in the broad public interest.

If there were a cost to this process, I could not have afforded to ask schools to respond. Schools are managed individually, and as such I requested from each an answer to the same question: whether they were aware of the National Pupil Database, and of how the Department shared their pupils' data onward with third parties.

I asked a range of schools in the South and East. In order to give a fair picture of more than one county, I made requests of a range of types of school – from academy trusts to voluntary controlled schools – 20 primary and 20 secondary. Given the range of schools in England and Wales [7], this was a small sample.

Building even a small representative picture of pupil data privacy arrangements in the school system therefore required a separate request to each school.

I would not have been able to do this, had there been a charge imposed for each request.  This research subsequently led me to write to the Information Commissioner’s Office, with my findings.

Were access costs to make this a process that only organisations or the press could afford to enter into, then the public would only be able to find out what matters, or is felt important, to those organisations, not what matters to individuals.

However what matters to one individual might end up making a big difference to many people.

Individuals may be interested in what are seen as minority topics, perhaps related to discrimination according to gender, sexuality, age, disability, class, race or ethnicity. If individuals cannot afford to challenge government policies that matter to them, we may lose the benefit they can bring when they go on to champion the rights of more people in the country as a whole.

Eight million children's records, from children aged 2-19, are stored in the National Pupil Database. I hope that, as a result of the FOI request, increased transparency and better practices will help restore data protections for individuals and also re-establish organisational trust in the Department.

Information can be used to enable or constrain citizenship. In order to achieve universal access to human rights, supporting participation, transparency and accountability, I appeal to the Commission to recognise the need for individuals to be able to tackle vested interests, unjust laws and policies.

Any additional barrier, such as cost, only serves to reduce equality and make society less just. There is, however, an immense intangible value in an engaged public, one which is hard to measure. People are more likely to be supportive of public servants’ decision making if they are not excluded from it.

Women, for example, are underrepresented in Parliament and therefore in public decision making. Further, the average gender pay gap within the EU is 16 per cent, but pay levels throughout Europe differ hugely, and in the South East of the UK men earn 25 per cent more than their female counterparts. [8]  Women and mothers like me may therefore find it more difficult to participate in public life and to make improvements on behalf of other families and children across the country.

To charge for access to information about our public decision-making processes could therefore be exclusionary and discriminatory.

I believe these two case studies show that the Act’s intended objectives, on parliamentary introduction — to ‘transform the culture of Government from one of secrecy to one of openness’; ‘raise confidence in the processes of government, and enhance the quality of decision making by Government’; and to ‘secure a balance between the right to information…and the need for any organisation, including Government, to be able to formulate its collective policies in private’ — work in practice.

If anything, they need strengthening to ensure accessibility.

Any action to curtail free and equal access to these kinds of information would not be in the public interest, and would be a significant threat to the equality of opportunity the Act offers the public in making requests. Charging would particularly restrict access to FOI for poorer individuals and communities, who are often those already excluded from full participation in public life.
___________________________________________________________________________

[1] https://www.whatdotheyknow.com/request/caredata_programme_board_minutes
[2] http://www.theguardian.com/society/2014/dec/12/nhs-patient-care-data-sharing-scheme-delayed-2015-concerns
[3] http://www.statslife.org.uk/news/1672-new-rss-research-finds-data-trust-deficit-with-lessons-for-policymakers
[4] https://jenpersson.com/wp-content/uploads/2015/11/caredataprogramme_FOI.pdf
[5] https://ico.org.uk/media/about-the-ico/consultation-responses/2015/1560175/ico-response-independent-commission-on-freedom-of-information.pdf
[6] https://jenpersson.com/wp-content/uploads/2015/11/NPD_FOI_submissionv3.pdf
[7] http://www.newschoolsnetwork.org/sites/default/files/Comparison%20of%20school%20types.pdf
[8] http://www.equalpayportal.co.uk/statistics/

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (2)

“Children do not lose their human rights by virtue of passing through the school gates” (UN Committee on the Rights of the Child, General Comment on ‘The aims of education’, 2001).

The Digital Skills in Schools inquiry [1] is examining the gap in our children’s education that must close if they are to be citizens fit for the future.

We have an “educational gap” in digital skills, and I have suggested it should not be seen as only functional or analytical: it should also address a gap in ethical skills, and provide a framework to equip our young people to understand their digital rights as well as their responsibilities.

Children must be given the opportunity in education to understand how they can grow “to develop physically, mentally, morally, spiritually and socially in a healthy and normal manner and in conditions of freedom and dignity”. [2]

Freedom to use the internet in privacy does not have to mean exposing children to risks, but we should ask: are there ways of implementing practices which are more proportionate, and less intrusive, than monitoring and logging keywords [3] for every child in the country? What problem is the DfE trying to solve, and how?

Nicky Morgan’s “fantastic” GPS tracking App

The second technology tool Nicky Morgan mentioned in her BETT speech on January 22nd is an app with GPS tracking and alert creation. Her verdict on the app was “excellent” and “fantastic”:

“There are excellent examples at the moment such as the Family First app by Group Call. It uses GPS in mobile phones to help parents keep track of their children’s whereabouts, allowing them to check that they have arrived safely to school, alerting them if they stray from their usual schedule.” [4]

I’m not convinced tracking every child’s every move is either excellent or fantastic, primarily because it will foster a nation of young people who feel untrusted. I see a risk it could create a lower sense of self-reliance, self-confidence and self-responsibility.

Just as with the school software monitoring [see part one], there will be a chilling effect on children’s freedom if these technologies become the norm. If you feared misusing a word in an online search, or worried over stigma and what others might think, would you not change your behaviour? Our young people need to feel both secure and trusted at school.

How we use digital in schools shapes our future society

A population that trusts one another, and trusts its government, organisations and press, is vital to a well-functioning society.

If we want the benefits of a global society, for example data sharing that contributes to medical advances, people must understand how their own data and digital footprint fit into the bigger picture that supports it.

In schools today pupils and parents are not informed that their personal confidential data are given to commercial third parties by the Department for Education at national level [5]. Preventing public engagement, hiding current practices and downplaying the risks of how data are misused also prevent fair and transparent discussion of the benefits, and of how to do it better. Better, like making data accessible only in a secure setting, not handing it out to Fleet Street.

For children this holds back public involvement in the discussion of the roles of technology in their own future. Fear of public backlash over poor practices must not hold back empowering our children’s understanding of digital skills and how their digital identity matters.

Digital skills are not shorthand for coding, but critical life skills

The skills our society will need must simultaneously capture the benefits to society and deal with the great risks that will come with advances in technology: advances in artificial intelligence, genomics and autonomous robots, to select only three examples.

There is a glaring gap in their education: how their own confidential personal data and digital footprint fit into a globally connected society, and how they are used by commercial businesses and third parties.

There are concerns, too, about how apps could be misused by others.

If we are to consider what is missing in our children’s preparation for a life in which digital will no longer be a label but a way of life, then to identify the gap we must first consider what we see as the whole.

Rather than keeping children safe in education, as regards data sharing and digital privacy, the DfE seems happy to keep them ignorant. This is no way to treat our young people and develop their digital skills, just as giving their data away is not good cyber security.

What does a Dream for a great ‘digital’ Society look like?

Had Martin Luther King lived to be 87, he would have continued to inspire hope and to challenge us to fulfil his dream for society – one in which everyone would have an equal opportunity for “life, liberty and the pursuit of happiness.”

Moving towards that goal, supported by technology and by ethical codes of practice, my dream is that we see a more inclusive, fulfilled, sustainable and happier society. We must educate our children as fully rounded, digital- and data-savvy individuals, who trust themselves and the systems they use, and who are well treated by others.

Sadly, introducing these types of freedom-limiting technologies for our children risks instead a society in which many people do not feel comfortable, one that has lost sight of the value of privacy.

References:

[1] Digital Skills Inquiry: http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/inquiries/parliament-2015/digital-skills-inquiry-15-16/

[2] UN Convention of the Rights of the Child

[3] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

[4] Nicky Morgan’s full speech at BETT

[5] The defenddigitalme campaign to ask the Department for Education to change practices and policy around the National Pupil Database

Monitoring software in schools: the Department for Education’s digital dream or nightmare? (1)

Nicky Morgan, the Education Secretary, gave a speech [1] this week and shared her dream of the benefits of technology for pupils.

It mentioned two initiatives to log children’s individual actions: one is included in a consultation on new statutory guidance, and the other, which she praised, is a GPS-based mobile monitoring app.

As with many new applications of technology, how the concept is to be implemented in practice is important to help understand how intrusive any new use of data is going to be.

Unfortunately this consultation has no supporting code of practice setting out what the change will mean, and questions need to be asked.

The most significant aspects in terms of changes to data collection through required monitoring are in the areas of statutory monitoring, systems, and mandatory teaching of ‘safeguarding’:

Consultation p11/14: “We believe including the requirement to ensure appropriate filtering and monitoring are in place, in statutory guidance, is proportional and reasonable in order to ensure all schools and colleges are meeting this requirement. We don’t think including this requirement will create addition burdens for the vast majority of schools, as they are already doing this, but we are keen to test this assumption.”

Consultation: paragraph 75 on page 22 introduces this guidance section and ends with a link to “Buying advice for schools”: “Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. Guidance on e-security is available from the National Education Network.”

Guidance: para 78 “Whilst it is essential that governing bodies and proprietors ensure that appropriate filters and monitoring systems are in place they should be careful that “over blocking” does not lead to unreasonable restrictions as to what children can be taught with regards to online teaching and safeguarding.”

Consultation: “The Opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities. “This is an important topic and the assumption is the vast majority of governing bodies and proprietors will already be ensuring the children in their school are suitably equipped with regards to safeguarding. But we are keen to hear views as to the change in emphasis.”

Paragraph 88 on p24 is oddly phrased: “Governing bodies and proprietors should ensure that staff members do not agree confidentiality and always act in the best interests of the child.”

What if confidentiality may sometimes be in the best interests of the child? What would that mean in practice?


Keeping Children Safe in Education – Questions on the Consultation and Use in practice

The consultation [2] is open until February 16th, and includes a new requirement to have web filtering and monitoring systems.

Remembering that 85% of children’s waking hours are spent outside school, that our schools cover a wide range of children aged 2 to 19, and that not every moment is spent unsupervised and on-screen, is it appropriate that this 24/7 monitoring would be applied to all types of school?

This provider software is reportedly being used in nearly 1,400 secondary schools in the UK, yet we hear little about its use in practice.

The cases of cyberbullying or sexting in schools that I hear of locally, or read of in the press, happen through smartphones. Unless the school snoops on individual devices, I wonder whether the cost, implementation and impact are proportionate to the benefit.

  1. Necessary and proportionate? How does this type of monitoring compare with the alternatives?
  2. Privacy impact assessment? Has one been done – surely required as a minimum measure?
  3. Cost benefit risk assessment of the new guidance in practice?
  4. Problem vs solution: what problem is it trying to solve, and how will they measure whether it is successful, or stop its use if it is not?  Are other methods on offer?
  5. Due diligence: how do parents know that the providers have undergone thorough vetting, and understand who they are? After all, these providers have access to millions of our children’s online interactions.
  6. Evidence: if it has been used for years in schools, how has it been assessed against other methods of supervision?
  7. The national cash cost: this must be enormous when added up for every school in the country; how is cost balanced against risk?
  8. Intangible costs: has anyone asked our children how they feel about this? Where is the boundary between what is constructive and what is creepy? Is scope change documented if they decide to collect more data?

Are we creating a solution that solves, or creates, a problem?

The private providers have no incentive to say their reports don’t work, and schools, legally required to be risk averse, would be unlikely to say stop if there is no outcome at all.

Some providers include “review of all incidents by child protection and forensic experts; freeing up time for teachers to focus on intervention” and note that “trends such as top users can be viewed.” How involved are staff who know the child, as a first point of information sharing?

Most tools are multipurpose, and I understand the reasons given for them, but how they are implemented concerns me.

If the extent of these issues really justifies this mass monitoring in every school, what are we doing to fix the causes, rather than simply spying on every child’s every online action in school? (I look at how it extends outside school in part two.)

Questions on Public engagement: How are children and families involved in the implementation and with what oversight?

  1. Privacy and consent: Has anyone asked pupils and parents what they think and what rights they have to say no to sharing data?
  2. Involvement: Are parents to be involved and informed in software purchasing and in all data sharing decisions at local or regional level? Is there consistency of message if providers vary?
  3. Transparency: Where are the data created through the child’s actions stored, and for how long? Who has access to the data? What actions may result from it? And with what oversight?
  4. Understanding: How will children and parents be told what is “harmful and inappropriate content” as loosely defined by the consultation, and what they may or may not research? Children’s slang changes often, and “safeguarding” terms are subjective.
  5. Recourse: Will it include assessment of unintended consequences from misinterpretation of information gathered?
  6. Consent: And can I opt my child out from data collection by these unknown and ‘faceless’ third parties?

If children and parents are told their web use is monitored, what chilling effect may that have on their trust in the system and in teaching staff, and on their ability to look for content to support their own sensitive concerns or development, content they may not feel safe looking for at home? What limitation will that put on their creativity?

These are all questions that should be asked if the consultation is to be thoroughly understood, and they require wide public examination.

Since key logging is already common practice (according to provider websites) and will in practice effectively become statutory, where is the public discussion? If it is not explicitly statutory, should pupils be subject to spying on their web searches by postcode lottery?
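To make concrete what that kind of key logging involves, here is a minimal sketch of keyword-based monitoring, written to match the way provider websites describe it. The keyword list, matching rule and log format are my own illustrative assumptions, not any named provider’s product.

```python
# A minimal, illustrative sketch of keyword-based monitoring.
# The keyword list and log format are assumptions for illustration,
# not any real provider's "keyword detection library".
from datetime import datetime, timezone

FLAGGED_KEYWORDS = {"self-harm", "suicide", "drugs"}  # hypothetical examples

def monitor(pupil_id: str, typed_text: str, log: list) -> None:
    """Record every line of activity; flag any line containing a keyword."""
    text = typed_text.lower()
    hits = sorted(k for k in FLAGGED_KEYWORDS if k in text)
    log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "pupil": pupil_id,
        "text": typed_text,   # 'a complete log of all online activity'
        "flagged": hits,      # keyword matches, with no sense of context
    })

log: list = []
monitor("pupil-042", "history homework on the suffragettes", log)
monitor("pupil-042", "biology: effects of drugs on the nervous system", log)
for entry in log:
    print(entry["pupil"], entry["flagged"] or "-", entry["text"])
```

Note that everything is retained whether flagged or not, and that a match carries no context: the innocent biology query above is flagged exactly like a worrying one. That is where the chilling effect begins.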

What exactly might this part of the new guidance mean for pupils?

In part two, I include the other part of her speech, the GPS app, and ask: if we track every child in and outside school, are we moving closer to the digital dream, or the nightmare, in the search to close the digital skills gap?

****

References:

[1] Nicky Morgan’s full speech at BETT

[2] Consultation: Keeping Children Safe in Education – closing Feb 16th. The “opportunities to teach safeguarding” section (para 77-78) has been updated and now says governing bodies and proprietors “should ensure” rather than “should consider” that children are taught about safeguarding, including online, through teaching and learning opportunities.

The Consultation Guidance: most relevant paragraphs 75 and 77 p 22

“Governing bodies and proprietors should be confident that systems are in place that will identify children accessing or trying to access harmful and inappropriate content online. [Proposed statutory guidance]

Since “guidance on procuring appropriate ICT” from the National Education Network NEN* is offered, it is clearly intended that the ‘system’ to be ‘in place’ should be computer based. How will it be applied in practice? A number of the software providers for schools already provide services that include key logging, using “keyword detection libraries” that provide “a complete log of all online activity”.

(*It’s hard to read more, as many of NEN’s links are dead.)

Ethics, standards and digital rights – time for a citizens’ charter

Central to future data sharing plans [1] is the principle of public interest, intended to be underpinned by transparency in all parts of the process and supported by an informed public: three principles that are also key in the plan for open policy.

The draft ethics proposals [2] start with user need (i.e. what government wants, researchers want, the users of the data) and public benefit.

With these principles in mind, I wonder how compatible the plans are in practice: plans that will remove some of the decision making about information sharing from the citizen; that is, from you and me.

When talking about data sharing it is all too easy to forget we are talking about people – in this case, 62 million individual people’s personal information – especially when users of data focus on how data are released or published. The public thinks of personal data as information related to them. And the ICO says privacy and an individual’s rights are engaged at the point of collection.

The trusted handling, use and re-use of population-wide personal data, and ID assurance, are vital to innovation and to the digital strategy. So, in order to make these data uses secure and trusted, fit for the 21st century, when will the bad parts of current government data sharing policy and practice [3] be replaced by the good parts of the ethical plans?

Current practice and Future Proofing Plans

How is policy being future proofed at a time when changes to regulation under the new EU Data Protection framework are being made in parallel? Changes that clarify consent, requiring clear affirmative action by the data subject. [4]  How do public bodies and departments plan to meet the current moral and legal obligation to ensure that persons whose personal data are subject to transfer and processing between two public administrative bodies are informed in advance?

How is public perception [5] being taken into account?

And how are digital identities to be protected when they are literally our passport to the world, and their integrity is vital to maintain, especially for our children in a world of big data [6] we cannot imagine today? How do we verify identity without having to reveal the data behind it, if those data are to be used in ever more government transactions? Done badly, that could mean the citizen loses sight of who knows what information and who it has been re-shared with.
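As a toy sketch of how ‘verify without revealing’ can work in principle: a trusted issuer signs a minimal claim, and a service checks the signature without ever seeing the underlying record. This uses the third-party Python cryptography package; the claim format and every name here are my illustrative assumptions, not the design of any real government identity scheme.

```python
# Toy sketch: prove a predicate ("over 18") without revealing the birthdate.
# Requires the third-party 'cryptography' package (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The issuer (e.g. an identity provider) holds a signing key.
issuer_key = Ed25519PrivateKey.generate()
issuer_public = issuer_key.public_key()

# The issuer knows the full record but signs only the minimal claim.
claim = b"subject=alice;over_18=true"   # no date of birth disclosed
signature = issuer_key.sign(claim)

def service_accepts(claim: bytes, signature: bytes) -> bool:
    """The relying service verifies the claim without seeing anything else."""
    try:
        issuer_public.verify(signature, claim)  # raises if forged or altered
    except InvalidSignature:
        return False
    return b"over_18=true" in claim

print(service_accepts(claim, signature))                           # True
print(service_accepts(b"subject=alice;over_18=false", signature)) # False
```

The service learns one fact and nothing else, so the citizen keeps sight of exactly what has been shared.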

As of the 6th January there are lots of open questions, and no formal policy document or draft legislation to review. It appears far from ready for public consultation, needing concrete input on the practical aspects of what the change would mean in practice.

Changing the approach to the collection of citizens’ personal data, and removing the need for consent to wide re-use and onward sharing, will open up a massive change to the data infrastructure of the country, in terms of who is involved in administrative roles in the process, and when. As a country, to date we have not treated data as part of our infrastructure. Some suggest we should. To change the construction of roads would require impact planning, mapping and a thought-out budget before the project began. Such an assessment appears to be entirely missing from this data infrastructure change.

I’ve considered the plans in terms of case studies of policy and practice, transparency and trust, the issues of data quality and completeness and digital inclusion.

But I’m starting by sharing only my thoughts on ethics.

Ethics, standards and digital rights – time for a public charter

How do you want your own, or your children’s personal data handled?

This is not theoretical. Every one of us in the UK has our own confidential data used in a number of ways about which we are not aware today. Are you OK with that? With academic researchers? With GCHQ? [7] What about charities? Or the Fleet Street press? All of these bodies have personal data from population-wide datasets, and that means all of us, or all of our children, whether we are the subjects of research, subject to investigation, or just ordinary citizens minding our own business.

On balance, where do you draw the line between your own individual rights and the public good? What is fair use without consent, and where would you be surprised and want to be informed?
I would like to hear more about how others feel about, and weigh, the risks and benefits trade-off in this area.

Some organisations working on debt have concerns about digital exclusion; others about compiling single-view data in coercive relationships. Some organisations are campaigning for a digital bill of rights. I have had some thoughts on this specifically for health data in the past.

A charter of digital standards and ethics could be enabling, not a barrier, and should be a tool that comes to consultation before new legislation.

Discussing data sharing that will open up every public data set “across every public body” without first having defined a clear policy is a challenge. Without first defining ethical good practice as a reference framework, it is dancing in the dark. The draft ethics plan is running in parallel to, but is not part of, the data sharing discussion.
Ethical practice and principles must be the foundation of data sharing plans, not an afterthought.

Why? Because this stuff is hard. The kinds of research that use sensitive de-identified data are sometimes controversial, and will become more challenging as the capabilities of what is possible increase with machine learning, genomics, and the increased personalisation and targeting of marketing and interventions.

The ADRN had spent months on its ethical framework and privacy impact assessment, before I joined the panel.

What does Ethics look like in sharing bulk datasets?

What do you think about the commercialisation of genomic data by the state – often from children whose parents are desperate for a diagnosis – to ‘kick start’ the UK genomics industry?  What do you think about data used in research on domestic violence and child protection? And in predictive policing?

Or research on religious affiliations and home schooling? Or abortion and births in teens matching school records to health data?

Will the results of the research encourage policy change or interventions with any group of people? Could these types of research have unintended consequences, or be used in ways researchers did not foresee, supporting not social benefit but a particular political or scientific objective? If so, how is that governed?

What research is done today, what is good practice, what is cautious and what would Joe Public expect? On domestic violence for example, public feedback said no.

And while there is a risk of not making the best use of data, there are also risks in releasing even anonymised data [8] in today’s world, in which jigsawing together the pieces of poorly anonymised data makes it identifying. Profiling or pigeonholing individuals or areas was a concern raised in public engagement work.
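That jigsaw risk is easy to demonstrate. Below is a minimal sketch of a linkage attack using entirely invented records: an ‘anonymised’ extract that keeps quasi-identifiers (postcode sector, birth year, sex) can be joined to public information to re-attach names.

```python
# Invented data throughout: a linkage ('jigsaw') re-identification sketch.

# 'Anonymised' release: names removed, quasi-identifiers retained.
released = [
    {"postcode": "GU22 7", "birth_year": 2003, "sex": "F", "sensitive": "SEN: yes"},
    {"postcode": "GU22 7", "birth_year": 2004, "sex": "M", "sensitive": "SEN: no"},
]

# Public or easily gathered information (press, social media, registers).
public_info = [
    {"name": "Alice Example", "postcode": "GU22 7", "birth_year": 2003, "sex": "F"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def jigsaw(released, public_info):
    """Join the two sources on quasi-identifiers to re-attach names."""
    for record in released:
        key = tuple(record[q] for q in QUASI_IDENTIFIERS)
        for person in public_info:
            if tuple(person[q] for q in QUASI_IDENTIFIERS) == key:
                yield person["name"], record["sensitive"]

for name, sensitive in jigsaw(released, public_info):
    print(f"{name} re-identified; attribute revealed: {sensitive}")
```

No single source identifies anyone; the join does. The more data items released, the more unique each combination becomes.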

The Bean Report can be used to draw out some of the reasoning behind the need for increased access to data: “Remove obstacles to the greater use of public sector administrative data for statistical purposes, including through changes to the associated legal framework, while ensuring appropriate ethical safeguards are in place and privacy is protected.”

The Report does not outline how those appropriate ethical safeguards are to be put in place, or how privacy is to be protected. Or what ‘ethical’ looks like.

‘In the public interest’ is not clear cut.

The boundary between public and private interest shifts over time, as well as across cultures. While in the UK the law today says we all have the right to be treated as equals, regardless of our gender, identity or sexuality, it has not always been so.

By putting the rights of the individual on a lower par than the public interest in this change, we risk jeopardising having any data at all to use. But data will be central to the digital future strategy, with which we are told the government wants to “show the rest of the world how it’s done.”

If they’re serious, if all our future citizens must have a digital identity to use with government with any integrity, then the use not only of our current adult data but of our children’s data really matters – and current practices must change.  Here’s a case study showing why:

Pupil data: The Poster Child of Datasharing Bad Practice

Right now the National Pupil Database, containing the personal data of 8 million or more of our children in England, is unfortunately the poster child of what a change in legislation and policy around data sharing can mean in practice.  Bad practice.

The “identity of a pupil will not be discovered using anonymised data in isolation”, says the User Guide. But when named data, and identifiable data, have been given away in all but 11 requests since 2012, it is not anonymised. It is anything but the ‘anonymised data’ of the publicly announced plans presented in 2011, yet it is precisely what was permitted by the change in law that broadened the range of users under the Prescribed Persons Act 2009, and by the expansion of purposes in the amended Education (Individual Pupil Information) (Prescribed Persons) Regulations introduced in June 2013.  Access was opened up to:

“(d)persons who, for the purpose of promoting the education or well-being of children in England are—

(i)conducting research or analysis,

(ii)producing statistics, or

(iii)providing information, advice or guidance,

and who require individual pupil information for that purpose(5);”.

The law was changed so that individual pupil-level data, including pupil names, are extracted, stored and have also been released at national level: raw data sent to commercial third parties, charities and the press, at identifiable individual level, often including sensitive data items.

This is a world away from safe-setting, statistical analysis of de-identified data by accredited researchers, in the public interest.
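For contrast, here is a minimal sketch of what a statistical output, rather than a raw-record release, can look like: aggregate counts only, with small cells suppressed. The threshold of 5 is a common illustrative convention in disclosure control, not a statement of any department’s actual rule.

```python
# Sketch of small-cell suppression, a basic statistical disclosure control.
from collections import Counter

MIN_CELL = 5  # illustrative threshold: suppress counts smaller than this

def safe_counts(records, attribute):
    """Aggregate individual records into counts, suppressing small cells."""
    counts = Counter(r[attribute] for r in records)
    return {
        value: (n if n >= MIN_CELL else "suppressed")
        for value, n in counts.items()
    }

pupils = (
    [{"region": "South East"}] * 12
    + [{"region": "East"}] * 7
    + [{"region": "Wales"}] * 2   # too few individuals to publish safely
)
print(safe_counts(pupils, "region"))
# {'South East': 12, 'East': 7, 'Wales': 'suppressed'}
```

Researchers in a safe setting can still analyse the record level; only safe aggregates leave the room.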

Now our children’s confidential data sit on servers on Fleet Street – is this the model for all our personal administrative data in future?

If not, how do we ensure it is not? How will the new all-datasets data sharing legislation permit wider sharing, with more people than currently have access, and not end up with all our identifiable data sent ‘into the wild’ without audit, as our pupil data are today?

Consultation, transparency, oversight and public involvement in ongoing data decision making are key, as is well-written legislation.

The public interest alone is not a strong enough criterion to keep data safe. After all, this same government brought in the National Pupil Database policy thinking that it, too, was ‘in the public interest’.

We need a charter of ethics and digital rights that focuses on the person, not exclusively the public interest use of data.

They are not mutually exclusive, but enhance one another.

Getting ethics in the right place

These ethical principles start in the wrong place. To me, this is not an ethical framework; it is a ‘how to do data sharing’ guideline that tries to avoid repeating care.data. Ethics is not first about the public interest, or economic good, or government interest. Instead, referencing the ethics council’s view [9], you start with the person.

“The terms of any data initiative must take into account both private and public interests. Enabling those with relevant interests to have a say in how their data are used and telling them how they are, in fact, used is a way in which data initiatives can demonstrate respect for persons.”

Professor Michael Parker, Member of the Nuffield Council on Bioethics Working Party and Professor of Bioethics and Director of the Ethox Centre, University of Oxford:

“Compliance with the law is not enough to guarantee that a particular use of data is morally acceptable – clearly not everything that can be done should be done. Whilst there can be no one-size-fits-all solution, people should have say in how their data are used, by whom and for what purposes, so that the terms of any project respect the preferences and expectations of all involved.”

The partnership between members of the public and public administration must be consensual if it is to continue to enjoy support [10]. If personal data are used for research or other purposes, in the public interest, without explicit consent, that use should be understood by those using the data as a privilege, not a right.

As such, we need to see data as being about the person, as they see it themselves, and to treat data at the point of collection as information about individual people, not just as statistics. Personal data are sensitive, some research uses are highly sensitive, and data used badly can do harm. Designing new patterns of data sharing must consider the private as well as the public interest, co-operating for the public good.

And we need a strong ethical framework to shape that in.

******

[1] http://datasharing.org.uk/2016/01/13/data-sharing-workshop-i-6-january-2016-meeting-note/

[2] Draft data science ethical framework: https://data.blog.gov.uk/wp-content/uploads/sites/164/2015/12/Data-science-ethics-short-for-blog-1.pdf

[3] defenddigitalme campaign to get pupil data in England made safe http://defenddigitalme.com/

[4] On the European Data Protection regulations: https://www.privacyandsecuritymatters.com/2015/12/the-general-data-protection-regulation-in-bullet-points/

[5] Public engagement work – ADRN/ESRC/Ipsos MORI 2014 https://adrn.ac.uk/media/1245/sri-dialogue-on-data-2014.pdf

[6] Written evidence submitted to the parliamentary committee on big data: http://data.parliament.uk/writtenevidence/committeeevidence.svc/evidencedocument/science-and-technology-committee/big-data-dilemma/written/25380.pdf

[7] http://www.bbc.co.uk/news/uk-politics-35300671 Theresa May affirmed bulk datasets use at the IP Bill committee hearing and did not deny use of bulk personal datasets, including medical records

[8] http://www.economist.com/news/science-and-technology/21660966-can-big-databases-be-kept-both-anonymous-and-useful-well-see-you-anon

[9] Nuffield Council on Bioethics http://nuffieldbioethics.org/report/collection-linking-use-data-biomedical-research-health-care/ethical-governance-of-data-initiatives/

[10] Royal Statistical Society – the data trust deficit https://www.ipsos-mori.com/researchpublications/researcharchive/3422/New-research-finds-data-trust-deficit-with-lessons-for-policymakers.aspx

Background: Why datasharing matters to me:

When, only very recently, I joined the data sharing discussions that have been running for almost two years, I was wearing two hats, both in a personal capacity.

The first reflects my interest in how public policy and legislation may be changing and will affect de-identified data sharing for academic research, as I am one of two lay people offering a public voice on the ADRN approvals panel.

Its aim is to make sure the process of granting access to sensitive, linked administrative data from population-wide datasets is fair, equitable and transparent: de-identified use by trusted researchers, for non-commercial purposes, under strict controls and in safe settings. Once a research project is complete, the data are securely destroyed. It does not do work that “a government department or agency would carry out as part of its normal operations.”

Wearing my second hat, I am interested to see how the new policy and practice plans will affect current practice. I coordinate the campaign efforts to get the Department for Education to stop giving away the identifiable, confidential and sensitive personal data of our 8 million children in England to commercial third parties and the press from the National Pupil Database.

Thoughts since #UKHC15. UK health datasharing.

The world you will release your technology into is the world you are familiar with, which is already of the past. Based on old data.

How can you design tools and systems fit for the future? And for all?

For my 100th post, and the first of 2016, here is a summary of some of my thoughts prompted by #UKHC15: several grains of thought related to UK health data that have been growing for some time.

1000 words on “Hard things: identity, data sharing and consent.” The fun run version.

Do we confuse hard with complex? Hard does not have to mean difficult. Some things seem to be harder than necessary, because of politics. I’ve found this hard to write. Where to start?

The search to capture solutions has been elusive.

The starting line: Identity

My first thoughts on identity were then taken care of by Vinay Gupta in this post, better than I could have done. (If you want a long read about identity, you might want to get a hot drink, as I did, and read and re-read. It says it will take an hour. It took me several, in absorption and thinking time. And it was worth it.)

That leaves data sharing and consent. Both of which I have written many of my other 99 posts about in the last year. So what’s new?

Why are we doing this: why aren’t we there yet?

It still feels very much as if the thinking on ‘digital’ in many parts of the health service, and in broader government, is ‘we need to do something’. The why is missing, and therefore achieving and measuring success is hard.

Often we start with a good idea and set about finding a solution to achieve it. But if the ‘why’ behind the idea is shaky to start with, the solution may falter as soon as it gets difficult. No one seems to know what #paperless actually means in practice.

So why try to change things? Fixing problems, rather than coming up with good ideas, is another way to think of it, as was suggested at #ukhc15. It was a meet-up for people who want to make things better, usually for others; sometimes that involves improving the systems they work with directly, or support others in.

I no longer work in systems introductions or enhancement processes, although I have a lay role in research and administrative data. But regular readers know most of the last two years has been all about the data: care.data.

More often than not, in #ukhc2015 discussions that focused on “the data”, I would try to bring people back to thinking about what the change is trying to solve, what it wants to “make better”, and why.

There is a broad tendency to think simply that more data = better. Not true, and I will show a case study later as to why. We must question why.

Why doesn’t everyone volunteer, or want to join in?

Very many people who have spoken with me over the last two years have shared concrete concerns over the plans to share GP data, and they do not get heard. They did not see a need to share their identifiable personal confidential data, or why truly anonymous data would not be sufficient for health planning, for example.

Homeless men and women at risk, people from the travelling community, those with disabilities, patients with stigmatising conditions, minorities, children, people with questions around sexual orientation – not to mention the lawyers or agencies representing them. Or the 11 million of our adult population who are not online. Few of them did we speak about; few of them did we hear from at #ukhc15. Yet put together, these individuals not only make up a significant number of people, they also account for a disproportionately high share of the highest demands on our health and social care services.

The inverse care law appears magnified in its potential when applied to digital, and should magnify the importance of thinking about access. How will care.data make things better for them, and how will the risks be mitigated? And are those costs being properly assessed if there is no assessment of the current care.data business case and seemingly, since 2012 at least, no serious effort to look at alternatives?

The finish line? We can’t see what it looks like yet.

The #ukhc2015 event was well run, and I liked the spontaneity of people braver than me who were keen to lead sessions and did so well.  As someone who is white and living in a ‘nice’ area, I am privileged. It was a privilege to spend a day at #UKHC15, packed with people who clearly think about hard things all the time; people who want to make things better; people who were welcoming to nervous first-timers at an ‘un’conference over a shared lunch.

I hope the voices of those who cannot attend these events, and of those outside London, are equally accounted for in all the government’s 2016 data sharing plans.

This may be the last chance after years of similar consultations have failed to deliver workable, consensual public data sharing policies.

We have vast streams of population-wide data stored in the UK, about which the population is largely ignorant. And while some of the data may be from 25 years ago, whatever is designed today will need to think long term: not how do we solve what we know, but how do we design solutions that will work for what we don’t.

Transparency here will be paramount to trust, if future decisions are made for us, or if those we make for ourselves are ‘influenced’ by algorithms, machine learning and ‘mindspace’ work.

As Thurgood Marshall said,

“Our whole constitutional heritage rebels at the thought of giving government the power to control men’s minds.”

Control over who we are, and who the system thinks we are, becomes a whole new level of discussion if we are being told how to make a decision, especially where the decision steers us toward a direction of public policy based on political choice. If pensions are not being properly funded, then not allocating taxes differently to fund them is a choice the current government has made, while the DWP seeks to influence our decision, to make us save more in private pensions.

And how about, in data discussions, we make an effort to start talking a little more clearly in the same terms, and stop packaging ‘sharing’ as if it were something voluntary within population-wide compulsory policy.

It’s done to us, not with us, in far too many areas of government we do not see. Perhaps this consultation might change that, but it is the nth such consultation, and I want to be convinced that this one intends real change. It is only open for a few weeks, and the meet-up for discussion appeared to be organised only in London.

I hope we’ll hear commitment in the new #UkDigiStrategy to real change in support of people and in the uses of our personal data by the state, not simply more blue-sky thinking and drinking the ‘datasharing’ Kool-Aid.  We’ve been talking in the UK for far too long about getting this right.

Let’s see the government get serious about making it happen. Not for government, but in the public interest, in a respectful and ethical partnership with people, and not with changes forced upon us.

No other foundation will be fit for a future in which care.data, the phenotype data, is to be the basis for an NHS so totally personalised.

If you want a longer read, read on below for my ten things in detail.

Comment welcome.

########

Hard things: The marathon version, below.

Thinking to some purpose