
Pirates and their stochastic parrots

It’s a privilege to have a letter published in the FT as I do today, and thanks to the editors for all their work in doing so.

I’m a bit sorry that it lost the punchline, which was supposed to bring a touch of AI humour about pirates and their stochastic parrots. And a rather key point was cut, namely that:

“Nothing in current European laws, including Convention 108 for the UK, prevents companies developing AI lawfully.”

So for the record, and since it’s behind a paywall (£), my agreed edited version was:

“The multi-signatory open letter advertisement, paid for by Meta, entitled “Europe needs regulatory certainty on AI” (September 19) was fittingly published on International Talk Like a Pirate Day.

It seems the signatories believe they cannot do business in Europe without “pillaging” more of our data and are calling for new law.

Since many companies lobbied against the General Data Protection Regulation or for the EU AI Act to be weaker, or that the Council of Europe’s AI regulation should not apply to them, perhaps what they really want is approval to turn our data into their products without our permission.

“Nothing in current European laws, including Convention 108 for the UK, prevents companies developing AI lawfully. If companies want more consistent enforcement action, I suggest Data Protection Authorities comply and act urgently to protect us from any pirates out there, and their greedy stochastic parrots.”

Prior to printing, they asked to cut a middle paragraph too.

“In the same week, LinkedIn sneakily switched on a ‘use me for AI development’ feature for UK users without telling us (paused the next day); Larry Ellison suggested at Oracle’s Financial Analyst Meeting that more AI should usher in an era of mass citizen surveillance, and our Department for Education has announced it will allow third parties to exploit school children’s assessment data for AI product building, and can’t rule out it will include personal data.”

It is in fact the cumulative effect of the recent flurry of AI activity by various parties, state and commercial, that deserves greater attention, rather than this being only about the Meta-led complaint. Who is grabbing what data and what infrastructure contracts, and creating what state dependencies and strengths, to what end game? While some present the “AI race” as China or India versus the EU or the US to become AI “superpowers”, is what “Silicon Valley” offers, that their way is the only way, really the better offer?

It’s not, in fact, “Big Tech” I’m concerned about, but the arrogance of so many companies that, in the middle of regulatory scrutiny, would align themselves with one that would rather put out PR omitting the fact it is under such scrutiny, calling only for the law to be changed, and frankly misleading the public by suggesting it is all for our own good rather than talking about how this serves their own interests.

Who do they think they are to dictate what new laws must look like when they seem simply unwilling to stick to those we have?

Perhaps this open letter serves as a useful starting point to direct DPAs to the companies most in need of scrutiny around their data practices. They seem to be saying they want either weaker laws or more enforcement. Some are already well known for challenging both. Who could forget Meta (Facebook’s) secret emotional contagion study involving children, in which friends’ postings were manipulated to influence moods, or the case of letting third parties, including Cambridge Analytica, access users’ data? Then there are the data security issues, the fine over international transfers and the anti-trust issues. And there are the legal problems with their cookies. And all of this built from humble beginnings by the same founder of Facemash, “a prank website” to rate women as hot or not.

As Congressman Long reportedly told Zuckerberg in 2018, “You’re the guy to fix this. We’re not. You need to save your ship.”

The Meta-led ad called for “harmonisation enshrined in regulatory frameworks like the GDPR” and I absolutely agree. The DPAs need to stand tall and stand up to OpenAI and friends (ever dwindling in number, so it seems) and reassert the basic, fundamental principles of data protection law, from the GDPR to Convention 108, to protect fundamental human rights. Our laws should do so whether companies like them or not. After all, it is often abuse of data rights by companies, and states, that populations need protection from.

Data protection ‘by design and by default’ is not optional under European data laws established for decades. It is not enough to argue that processing is necessary because you have chosen to operate your business in a particular way, nor that it is a necessary part of your chosen methods.

The Netherlands DPA is right to say scraping is almost always unlawful. A legitimate interest cannot simply be plucked from thin air by anyone who is neither an existing data controller nor a processor, who has no prior relationship with the data subjects, who in turn have no reasonable expectation of the re-use of data they posted online for purposes other than those for which they posted it, and where there is no informed processing or offer of an opt-out. Instead, the only possible basis for this kind of brand-new controller should be consent. Having to break the law hardly screams ‘innovation’.

Regulators do not exist to pander to wheedling, but to independently uphold the law in a democratic society in order to protect people, not to prioritise the creation of products. The GDPR’s principles are clear:

  • Lawfulness, fairness and transparency.
  • Purpose limitation.
  • Data minimisation.
  • Accuracy.
  • Storage limitation.
  • Integrity and confidentiality (security), and
  • Accountability.

In my view, it is the lack of dissuasive enforcement as part of checks and balances on big power like this, regardless of where it resides, that poses one of the biggest data-related threats to humanity.

Not AI, nor being “left out” of being used to build it for their profit.

Shifting power and sovereignty. Please don’t spaff our data laws up the wall.

Duncan Green’s book, How Change Happens, reflects on how power and systems shape change, and its key theme is most timely after the General Election.

Critical junctures shake the status quo and throw all the power structures in the air.

The Sunday Times ran several post-election stories this weekend. Their common thread is about repositioning power; realigning the relationships across Whitehall departments, and with the EU.

It appears that meeting the political want, to be seen by the public to re-establish sovereignty for Britain, is going to come at a price.

The Sunday Times article suggests our privacy and data rights are likely to be high up on the list in any post-Brexit fire sale:

“if they think we are going to be signing up to stick to their data laws and their procurement rules, that’s not going to happen”.

Whether it was simply a politically calculated statement or not, our data rights are clearly on the table in current wheeling and dealing.

Since there’s nothing in EU data protection law that is a barrier to trade doing what is safe, fair and transparent with personal data, it may simply be politically opportunistic to be seen to be doing something that was readily associated with the EU. “Let’s take back control of our cookies”, no less.

But the reality is that, either way, the UK_GDPR is already weaker for UK residents than what is now being labelled here as the EU_GDPR.

If anything, GDPR is already too lenient to organisations and does little, especially for children, to shift the power balance required to build the data infrastructures we need to use data well. The social contract for research and other things, appropriate to ever-expanding technological capacity, is still absent in UK practice.

But instead of strengthening it, what lies ahead is expected divergence between the UK_GDPR and the EU_GDPR, via the powers in the European Union (Withdrawal) Act 2018.

A post-Brexit majority government might pass all the law it likes to remove the ability to exercise our human rights or data rights under UK data protection law. Henry VIII powers adopted in the last year allow space for top-down authoritarian rule-making across many sectors. The UK government stood alone among other countries when it created its own exemption for immigration purposes in the Data Protection Act 2018, removing the ability from all of us to exercise rights under the GDPR. It might choose to further reduce our freedom of speech, and access to the courts.

But would the harmful economic side effects be worth it?

If Britain is to become a ‘buzz of tech firms in the regions’, and since much of tech today relies on personal data processing, then a ‘break things and move fast’ approach (yes, that way round) won’t protect SMEs from reputational risk or from losing public trust. Divergence may in fact break many businesses. Self-imposed UK double standards will cause confusion and chaos, increasing the workload for many.

Weakened UK data laws for citizens will limit and weaken UK business, both in its positioning to trade with others and in its ability to manage trusted customer relations. Weakened UK data laws will weaken the position of UK research.

Having an accountable data protection officer can be seen as a challenge. But how much worse might challenges in court be, when you cock up handling millions of patients’ pharmaceutical records [1], or school children’s biometric data? To say nothing of the potential implications for national security [2], or for politicians, when lists of millions of people could be open to blackmail or abuse for a generation.

The level playing field that every company can participate in is improved, not harmed, by good data protection law. Small businesses that moan about it might simply never have been good at doing data well. Few changes of substance have been made to Britain’s data protection laws over the last twenty years.

Data laws are neither made-up, bonkers banana-shaped standards, nor a meaningful symbol of sovereignty.

GDPR is also far from the only law the UK must follow when it comes to data.  Privacy and other rights may be infringed unlawfully, even where data protection law is no barrier to processing. And that’s aside from ethical questions too.

There isn’t so much a reality of “their data laws”, but rather *our* data laws, good for our own protection, for firms, *and* the public good.

Policy makers who might want such changes to weaken rights may not care, looking out for fast headlines, not slow-to-realise harms.

But if they want a legacy of having built a better infrastructure that positions the UK for tech firms, for UK research, for citizens and for the long game, then they must not spaff our data laws up the wall.


Duncan Green’s book, How Change Happens, is available via Open Access.


Updated December 26, 2019 to add links to later news:

[1]   20/12/2019 The Information Commissioner’s Office (ICO) has fined a London-based pharmacy £275,000 for failing to ensure the security of special category data. https://ico.org.uk/action-weve-taken/enforcement/doorstep-dispensaree-ltd-mpn/

[2] 23/12/2019 Pentagon warns military members DNA kits pose ‘personal and operational risks’ https://www.yahoo.com/news/pentagon-warns-military-members-dna-kits-pose-personal-and-operational-risks-173304318.html