The Future of Data in Public Life

What it means to be human is going to be different. That was the closing thought from a panel of four excellent speakers, chaired with sparkling wit and charm by Timandra Harkness, at tonight’s Turing Institute event on the future of data, hosted at the British Library.

The first speaker, Bernie Hogan of the Oxford Internet Institute, spoke of Facebook’s emotion experiment, of the challenges of commercial companies’ ownership and concentration of knowledge, and of their decisions controlling what content you get to see.

He also explained, in simple human terms, what an API is: like a plug in a socket, except that instead of electricity you get a flow of data, and the data controller can control which data come out of the socket.
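In code terms, a minimal, hypothetical sketch of that idea might look like the following (the record, field names and function are invented for illustration, not any real platform’s API): the controller holds the full record, but the “socket” only lets through the fields it has chosen to expose.

```python
# Hypothetical sketch: the data controller decides which fields flow out of the "socket".

# Everything the controller actually holds about a user
full_record = {
    "name": "Alice",
    "posts": ["..."],
    "private_messages": ["..."],
    "inferred_interests": ["..."],
}

# Fields the controller has chosen to expose through its public API
EXPOSED_FIELDS = {"name", "posts"}

def api_get_user(record: dict) -> dict:
    """Return only the fields the data controller allows out of the 'socket'."""
    return {k: v for k, v in record.items() if k in EXPOSED_FIELDS}

print(api_get_user(full_record))
# -> {'name': 'Alice', 'posts': ['...']}  (private messages and inferred data never leave)
```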

And he brilliantly posed a thought experiment: what would it mean to be able to go back in time to the Nuremberg trials and regulate not only medical ethics, but also the data ethics of indirect and computational uses of information? How would it affect today’s thinking on AI and machine learning, and where we are now?

“Available does not mean accessible, transparent does not mean accountable”

Charles from the Bureau of Investigative Journalism, who had also worked for Trinity Mirror using data analytics, introduced some of the issues that large datasets pose for the public.

  • People rarely have the means to do any analytics well.
  • Even if open data are available, they are not necessarily accessible, given the sheer volume of data, the constraints of common software (such as Excel), and limited time.
  • Without the facts, they cannot go to a [parliamentary] representative or community group to try to solve the problem.
  • Local journalists often have targets for the number of stories they need to write, and target numbers of Internet views/hits to meet.

Putting data out there is transparency, but not accountability, if we cannot turn information into knowledge that can benefit the public.

“Trust is like personal privacy. Once lost, it is very hard to restore.”

Jonathan Bamford, Head of Parliamentary and Government Affairs at the ICO, took us back to why we need to control data at all. Democracy. Fairness. The balance between people’s rights, like privacy and freedom of information, and the power of data holders. The awareness that the power of authorities and companies will affect the lives of ordinary citizens. And he said that even early on there was a feeling that there was a need to regulate who knows what about us.

The third generation of data protection law, he said, is now more important than ever, to manage a whole new era of technology and uses of data that did not exist when previous laws were made.

But, he said, the principles stand true today. Don’t be unfair. Use data for the purposes people expect. Security of data matters, as do rights to see the data held about us. Make sure data are relevant, accurate, necessary, and kept for a sensible amount of time.

And even as technology changes, he argued, the principles will stand. Organisations need to consider them before they act, treating privacy as a fundamental human right by default, and building in data protection by design.

After all, we should remember the Information Commissioner herself recently said,

“privacy does not have to be the price we pay for innovation. The two can sit side by side. They must sit side by side.

It’s not always an easy partnership and, like most relationships, a lot of energy and effort is needed to make it work. But that’s what the law requires and it’s what the public expects.”

“We must not forget, evil people want to do bad things. AI needs to be audited.”

Joanna J. Bryson was brilliant in her multifaceted talk summing up how data will affect our lives. She explained how implicit biases work, how we reason and make decisions, and how some of the ways we think show up in Internet searches. She showed, in practical ways, how machine learning is shaping our future in ways we cannot see. And she said that firms asserting that they do these things fairly and openly, or that regulation no longer fits new tech, “is just hoo-hah”.

She talked about the exciting possibilities and good uses of data, but said that, “we must not forget, evil people want to do bad things. AI needs to be audited.” She summed up: we will use data to predict ourselves. And she said:

“What it means to be human is going to be different.”

That is perhaps the crux of this debate. How do data and machine learning, the mining of massive datasets, and their uses for ‘prediction’ affect us as individual human beings, and our humanity?

The last audience question addressed inequality. Solutions like transparency, subject access, accountability, and understanding biases and how we are used will never be accessible to all. They demand a far greater digital understanding across all levels of society. How can society both benefit from and be involved in the future of data in public life? The conclusion was that we need more faith in public institutions working for people at scale.

But what happens when those institutions let people down, at scale?

And some institutions do let us down. Such as over plans for how our NHS health data will be used. Or when our data are commercialised without consent, breaking data protection law. Why do 23 million people not know how their education data are used? The government itself does not use our data in ways we expect, at scale. School children’s data used in immigration enforcement is not fair, is not the purpose for which the data were collected, and causes harm and distress when used in direct interventions, including “to effect removal from the UK” and to “create a hostile environment.” There can be a lack of commitment to independent oversight in practice, compared with what is promised by the State. Or no oversight at all after data are released. And the ethics of researchers using data are inconsistent.

The debate was less about the Future of Data in Public Life, and much more about how big data affects our personal lives. Most of the discussion was about how we understand the use of our personal information by companies and institutions, and how we will ensure democracy, fairness and equality in future.

One audience member’s question went unanswered: how do we protect ourselves from the harms we cannot see, or protect the most vulnerable, who are least able to protect themselves?

“How can we future proof data protection legislation and make sure it keeps up with innovation?”

That audience question is timely given the new Data Protection Bill. But what legislation means in practice, I am learning rapidly, can be very different from what is written down in law.

One additional tool in data privacy and rights legislation is up for discussion, right now, in the UK. If it matters to you, take action.

NGOs could be enabled to make complaints on behalf of the public under Article 80 of the General Data Protection Regulation (GDPR). However, the government has excluded that right from the draft UK Data Protection Bill launched last week.

“Paragraph 53 omits from Article 80 (representation of data subjects) the words ‘where provided for by Member State law’ from paragraph 1 and paragraph 2” [Data Protection Bill Explanatory Notes, paragraph 681, p.84/112]. Article 80(2) gives Member States the option to provide for NGOs to take action independently on behalf of the many people who may have been affected.

If you want that right, a right others will be getting in other EU countries, then take action. Call your MP or write to them. Ask for Article 80, the right to representation, in UK law. We need to ensure that our human rights continue to be enacted and enforceable to the maximum, if “what it means to be human is going to be different.”

For the Future of Data has never been more personal.

Data Protection Bill 2017: summary of source links

The Data Protection Bill [Exemptions from GDPR] was introduced to the House of Lords on 13 September 2017
*Current status April 6, 2018*: Report Stage, House of Commons; dates to be announced
Debates

Dates for all stages of the passage of the Bill, including links to the debates.

EU GDPR Progress Overviews

Updates of GDPR age of consent mapping: Better Internet for Kids

Bird and Bird GDPR Tracker [Shows how and where GDPR has been supplemented locally, highlighting where Member States have taken the opportunities available in the law for national variation.]

ISiCo Tracker (Site in German language) with links.

UK Data Protection Bill Overview
  • Data Protection Bill Explanatory Notes [PDF], 1.2MB, 112 pages
  • Data Protection Bill Overview Factsheet [PDF], 229KB, 4 pages
  • Data Protection Bill Impact Assessment [PDF], 123KB, 5 pages
The General Data Protection Regulation

The General Data Protection Regulation [PDF] 959KB, 88 pages

Related Factsheets
  • General Processing Factsheet, [PDF], 141KB, 3 pages
  • Law Enforcement Data Processing Factsheet [PDF], 226KB, 3 pages
  • National Security Data Processing Factsheet [PDF], 231KB, 4 pages
These parts of the bill concern the function of the Information Commissioner and her powers of enforcement
  • Information Commissioner and Enforcement Factsheet [PDF] 223KB, 4 pages
  • Data sharing code of practice [PDF]
GDPR possible derogations

Source credit: Amberhawk, Chris Pounder

Member State law can allow modifications to Articles 4(7), 4(9),  6(2), 6(3)(b), 6(4),  8(1), 8(3), 9(2)(a), 9(2)(b), 9(2)(g), 9(2)(h), 9(2)(i), 9(2)(j), 9(3), 9(4),  10,  14(5)(b), 14(5)(c), 14(5)(d),  17(1)(e), 17(3)(b), 17(3)(d), 22(2)(b),  23(1)(e),  26(1),  28(3), 28(3)(a), 28(3)(g), 28(3)(h), 28(4),  29,  32(4),  35(10), 36(5),  37(4),  38(5),  49(1)(g), 49(4), 49(5),  53(1), 53(3),  54(1), 54(2),  58(1)(f), 58(2), 58(3), 58(4), 58(5),  59,  61(4)(b),  62(3),  80,  83(5)(d), 83(7), 83(8),  85,  86,  87,  88,  89,  and 90 of the GDPR.

Other relevant significant connected legislation
  • The Police and Crime Directive [web link] 
  • EU Charter of Fundamental Rights – European Commission [link]
  • The proposed Regulation on Privacy and Electronic Communications [web link]
  • Draft modernised convention for the protection of individuals with regard to the processing of personal data (convention 108)
Data Protection Bill Statement of Intent
  • DCMS Statement of Intent [PDF] 229KB, 4 pages
  • Letter to Stakeholders [PDF] 184KB, 2 pages 7 Aug 2017
Other links on derogations and data processing
  • On Adequacy: Data transfers between the EU and UK post Brexit? Andrew D. Murray Article [link]
  • Two Birds [web link]
  • ICO legal basis for processing and children [link]
  • Public authorities under the Freedom of Information Act (ICO), Version 2.2, 20160901 [link]
  • ICO information for education [link]

Blogs on key issues [links in date of post]

  • Amberhawk
    • DP Bill’s new immigration exemption can put EU citizens seeking a right to remain at considerable disadvantage [09.10] re: Schedule 2, paragraph 4, new Immigration exemption.
    • On Adequacy:  Draconian powers in EU Withdrawal Bill can negate new Data Protection law [13.09]
    • Queen’s Speech, and the promised “Data Protection (Exemptions from GDPR) Bill [29.06]
  • defenddigitalme
    • Response to the Data Protection Bill debate and Green Paper on Online Strategy [11.10.2017]
  • Jon Baines
    • Serious DCMS error about consent in data protection [11.08]
  • Eoin O’Dell
    • The UK’s Data Protection Bill 2017: repeals and compensation – updated: On DCMS legislating for Art 82 GDPR. [14.09]

Data Protection Bill Consultation: General Data Protection Regulation Call for Views on exemptions
  • New Data Protection Bill: Our planned reforms [PDF] 952KB, 30 pages
  • London Economics: Research and analysis to quantify benefits arising from personal data rights under the GDPR [PDF] 3.76MB 189 pages
  • ICO response to DCMS [link]
  • ESRC joint submissions on the EU General Data Protection Regulation in the UK – Wellcome-led multi-organisation submission, plus submission from the British Academy / Erdos [link]
  • defenddigitalme response to the DCMS [link]
Minister for Digital Matt Hancock’s keynote address to the UK Internet Governance Forum, 13 September [link].

“…the Data Protection Bill, which will bring our data protection regime into the twenty first century, giving citizens more sovereignty over their data, and greater penalties for those who break the rules.

“With AI and machine learning, data use is moving fast. Good use of data isn’t just about complying with the regulations, it’s about the ethical use of data too.

“So good governance of data isn’t just about legislation – as important as that is – it’s also about establishing ethical norms and boundaries, as a society.  And this is something our Digital Charter will address too.”

Media links

14.09 BBC: UK proposes exemptions to Data Protection Bill

Edits:

11.10.2017 to add links to the Second Reading in the House of Lords