Category Archives: Tech

Chinese whispers, modern weapons #fiction

When you play the party game as children, what starts off whispered into the ear of one player becomes something quite unintelligible by the time it comes full circle.

It can cause chaos and it’s quite fun. Unless it ends up as something hurtful the hosts would rather hadn’t been shared.

Not so fun is the potential for chaos caused by technology with the capability to spread information from one place to another, information damaging enough to bring business to a standstill. Or security. Or utilities. Or our medical devices.

In my spare time, I write fiction. [I have a long work-in-progress set against stories of post World War II emigration.] Here’s some flash fiction from today.

###

December 2015 in London.

After an apathetic run-up to the election, when few concrete policies emerged with detail that could be pinned down publicly and become humiliating in the case of coalition concessions, the election was decided. A weak power-sharing agreement was reached: Conservatives and right-wingers, with the dash of yellow that had survived.

Admitting another poor win, the party has ousted Cameron and elected a new leadership. Boris and Nigel have already had a few laughs and a few run-ins.

On her way home, twenty-eight-year-old Kate grabs a copy of the Evening Express with their garish grins on the front cover. Again.

Barely 100 days into a winter government, May is long gone. Little of substance has changed, save some minor screwing-down of welfare access rights for foreigners or those deemed ‘fit to work’. Legacy policies remain.

One of those was on cybersecurity: the technology that protects online communications, banking, shopping, health data and more.

Kate reads the page 3 article:

“In a knee-jerk reaction to recent violent attacks, the cyber security ban first proposed in the last parliament has been rushed through.

Campaigners claim the MPs understand so little of what they are legislating that they ‘believe it would be possible to stop terrorists communicating privately without astonishing collateral damage to Britain’s economy, freedom, and security’.”

Businesses and government bodies whose security is affected under the new laws consider what to do.

Kate, working in her finance IT admin job, spent the day running reports on what software and historical data she needs in the system. Some sort of internal review.

Banking still has IT in place from the seventies. Building anything from the ground up is hard work. Patches are added on for as long as they’ll work. They’ll get round to fixing it. Soon.

Curled up on the sofa, she uses her single log-on for government agencies, for identity, administration and payments. Finally she submits her passport renewal and thinks about visiting her cousin in San Francisco in February.

She books the bargain deal seen in a Facebook ad that suits exactly what she wants.

Her cousin joked ‘please bring wine’; theirs has run out. His last post mentioned the price of bottled water. And China extracting it from the sea. Crazy.

Kate decides to catch up with that BBC Radio 4 IT podcast she missed. “There’s a good bit on banking,” her colleague Dan had said, “and bring-your-own-device at airports.” He was cute, and she was interested. Earphones, PJs and slippers on.

She worries she’s turning into her mother.

Finishing her popcorn, Kate’s halfway up before she remembers she need do nothing. She isn’t yet used to the clinic’s networked library system administering her insulin dose. [Something else she’d inherited.]

Medical devices have expanded exponentially. Thousands of people have insulin pumps or heart monitors installed, running citizens on invisible software. The transport system for data, the lifeblood for humans and high-tech.

Kate’s delighted to get independence from appointments. Her consultant is delighted to cut running costs. Telehealth permits data sharing. Algorithms flag warning reports for abnormal statistics.

The individual products are pre-integrated and powered by central backbone systems. The clinic has an overview of everyone it manages remotely.

Kate’s numbers are usually fine, and ignored with a normal label somewhere in the system. But they have started to show she needs a greater dose. She’ll get a call in the morning to discuss a change in her meds, to be adjusted remotely.

Thirsty, she gets up and fills her glass from the tap. That reminds her. She should check her bill online.

Utilities across the UK, power and water, many owned by one Chinese conglomerate, have replaced old mainframe customer billing systems with integrated modern software. Behind the scenes too, distribution has the interoperability compliance permitted by deregulation and required by globalisation.

Checking her alarm before she puts out the light, Kate smiles at the perfect step count for the day. Her fitapp makes her think of Dan.

She dreams of him. Somehow he’s landed with her in San Francisco. They won’t let him through immigration as he doesn’t want to give up his work laptop. Or phone. The flight can’t take off again, and every staff member is using their own device to try to control an airspace filled with Pac-Man eating the planes.

While she travels in her sleep in the small hours, an organisation in a country she’s never been to starts sending massive amounts of data into systems around the world. Some with bugs.

Overloaded, her hospital system shuts down, spewing out warning reports, including Kate’s, into the nighttime corridor. It doesn’t report exactly how her own device is affected, because they haven’t researched it.

Still tired, she gets up and hears the radio news in drips through the shower.

“A military coup in China, thousands of private business owners rounded up.”

“Concern is growing after what appears to be a mass cyber attack spreading malware to banks…”

She wonders if hers is affected. It’s going to snow, says the forecaster, just as the water stops. And the lights and radio cut off.

Kate curses the landlord, towelling soap suds from her hair. She picks up her phone but can’t call out as there’s no network: either down or, as after 7/7, perhaps overloaded.

She swears again.

She hopes no one is trying to call her.

It looks like today is going to be very inconvenient.

In the tube queue, Kate starts contemplating a duvet day; she’s squashed in an impatient mass of people. Ticket machines are down and it’s bedlam. She picks up a paper from the stand.

Water cannon are back on the front page. The story is about the role of government, state security and how to keep control.

The headline asks:

“Is London’s newest weapon out of date?”

****

 

“Alongside the Great Firewall, China has been developing a new way to intercept and redirect internet traffic, according to a new report from Citizen Lab.” [The Verge, April 2015]

When the intelligence services know that states have infiltrated commercial companies’ systems, and governments have these tools, how will they be used for good, and who defines those purposes?

How do citizens of all nations make sure that our commercial businesses, our everyday life-support systems and our legislation are well designed?

Does government understand what it should and can control, and is it investing in the right tools to do so? Are our MPs sufficiently skilled for the requirements of cyber security and digital rights?

Does water have the potential to be the next weapon of mass destruction?

[image: Telegraph]

Non-human authors wanted. Drones, robots, and our relationship with technology.

“My relationship with the drone is like playing a video game: I feel out a composition and the drone will agree or challenge me. Eventually, though, the drone will develop a creative mind of its own.”  [KATSU, in interview with Mandi Keighran and N magazine, summer 2014].

KATSU, the New York City-based artist/vandal/hacker, depending on your point of view, raises the question in that interview for Norwegian Airlines’ magazine of the relationship of “technology to graffiti,” or more broadly, of technology to art as a whole.

This, combined with another seemingly unrelated recent story, the David Slater macaque photo, made me wonder about drones, robots, and the role of the (non-)human author – our relationship with technology in art and beyond.

Ownership and Responsibility – Human or non-Human?

In both stories, I wondered how it may affect ownership and copyright: rights, which led me to consider the boundaries of responsibility.

I should preface this by saying I know little about copyright and less about drones. But I’m thinking my lay thoughts out loud.

In the first instance, if drones are used for creating something, as in this story, is it as simple as ‘he who owns the drone owns, or is responsible for, the art it creates’? I wonder, because I don’t know, and while it may be clear today, I wonder if it is changing.

As regards the second story, when the monkey selfie went around the world, the focus was sharper on copyright law than on the majority of the photos the macaque had taken. “Can a monkey own a picture?” asked many, including Metro at the time.

”Wikimedia, the non-profit organisation behind Wikipedia, has refused a photographer’s repeated requests to stop distributing his most famous shot for free – because a monkey pressed the shutter button and should own the copyright,” said the Telegraph.

But whilst most on social media and in the press I read focused on the outcome for this individual photographer, I wondered: what is the impact for the future of photography?

I’ve come to the conclusion that in this particular case, it matters less that the monkey took the photo, and more that it was decided a human did not.

This decision was not (yet) made by a UK court, but was reached in Wikimedia’s own report.

Since then, the LA Times reported on August 21st, that:

“the public draft of the Compendium of U.S. Copyright Office Practices, Third Edition —was released this week[1], and, after final review, is to take effect in mid-December [2] — says the office will register only works that were created by human beings.”

This is the first major revision in over twenty years and is an internal manual, so it does not have the force of law.  But it’s still significant.

Copyright eligibility depends on the fact that the work “was created by a human being,” and only protects “…the fruits of intellectual labor” which are “founded in the creative powers of the mind.” Animal ownership is expressly excluded (Section 306 – The Human Authorship Requirement). Pantomimes performed by a machine or robot are similarly expressly non-copyrightable (p. 527), and the Compendium continues:

“Similarly the Office will not register works produced by a machine or mere mechanical process that operates randomly, or automatically without any creative input or intervention from a human author.” (p 55)

The Telegraph article [August 6th] by Matthew Sparkes said:

‘In its report Wikimedia said that it “does not agree” that the photographer owns the copyright, but also that US law means that “non-human authors” do not have the right to automatic copyright of any photographs that they take.

“To claim copyright, the photographer would have had to make substantial contributions to the final image, and even then, they’d only have copyright for those alterations, not the underlying image. This means that there was no one on whom to bestow copyright, so the image falls into the public domain,” it said.’

One would think common sense would mean that without the work by British photographer David Slater, there would have been no photograph; that his travel, equipment preparations and interaction with the animals were a ‘substantial contribution’.

I wonder, could this become a significant argument in the future of copyright and access to material in the public domain? Because the argument came down NOT to whether a monkey can own copyright, but to whether there was any human in whom copyright was vested.

[image: copyright]

 

Photography is changing. Increasingly, technology is being used to take pictures. If photographic copyright depends on human ownership, I wonder if the way is opened for claims to creative images produced by drones or other forms of AI? I don’t know, and copyright law is best left to experts, but I’d like to ask the questions I have. I’ve read UK and US legislation around ownership and around the use of computers, but it could appear to an ordinary lay eye that technology is evolving faster than the laws to govern it. Users and uses are growing in hobby and commercial markets perhaps even faster.

In UK legislation:

“In this Part “author”, in relation to a work, means the person who creates it.”

“In the case of a literary, dramatic, musical or artistic work which is computer-generated, the author shall be taken to be the person by whom the arrangements necessary for the creation of the work are undertaken.” [Copyright, Designs and Patents Act 1988, Section 9]

It is easy to see how the macaque can slip through UK law here, as its photo is not computer-generated. And in the US, the non-human is clearly defined and excluded. But my question is: how do you define computer-generated? At what point does copyright depend on autonomy, or on arrangements by human intervention?

Remember, in the US, the Office “will not register works produced by a machine or mere mechanical process that operates randomly, or automatically without any creative input or intervention from a human author.” (p. 55)

“Katsu pilots the drone remotely, but every movement is translated through the machine’s need to keep itself aloft and it adapts his directions.”

Where do you draw the line?

Why does it matter today at all?

It matters because copyright law is a gatekeeper and gateway. It makes it commercially viable for creators to produce work and make it available to others. It defines responsibilities. One question I ask: if it’s no longer worth it, will we be worse off for not having the work creators may otherwise have produced?

The market for work produced by or via drone is just beginning to hint at becoming mainstream.

The use of drones in photography, for example in hard-to-reach situations and useful functions like flood mapping, will be of great service. Other uses in sports such as alpine skiing, canoeing, or extreme sports are only likely to increase among amateur, professional and commercial users. Stick a GoPro on a drone and it can get footage from places without the need for an accompanying person.

What questions might it raise for artists & creators today?

Specifically on art and copyright: will this ruling affect what types of images are worth taking? Will it make some work commercially non-viable, or their value determined by the channels of distribution rather than the creator? Will this Wikimedia ruling affect the balance of power between creators and commercial channel providers, in terms of ownership and distribution? I believe it rather serves to highlight where the balance already is.

Have we lessons learned from the music and book industry that apply here? (Clue: they both start with vowels and control distribution.)

Will the decision now go to a UK court and become a clarified legal position?

David Slater reportedly faces an estimated £10,000 legal bill to take the matter to court, said the Telegraph. At very best, this situation is disrespectful to him and leaves a bitter aftermath, in the question of the power between artist and distributor.

At worst, we could be on the cusp of being left behind in a brave new world of ownership and control of art and knowledge. A world in which actions may be taken through our technology, the product of which no human is deemed to have ownership.

So how does that affect responsibility?

If it has been legally defined that there is no human copyright ownership of the product of an action by something non-human, where do you draw the line for human responsibility? Am I not responsible for the actions of anything non-human I own? If an animal I own causes a road traffic accident, am I responsible for its actions? If so, then why not for artwork it creates? If there are differences, why are there differences, and where does the line of responsibility get defined, and by whom?

Where are the boundaries of responsibility if we start to distinguish in law between ownership of the result of a task a human set up, but did not carry out? David Slater enabled everything for the photograph to be taken, but did not press the shutter.

I ask: “is the boundary of responsibility undermined by weakening the boundaries of ownership and definition of autonomy of action?”

I believe copyright, ownership, responsibility and non-human authorship is about more than this man vs macaque debate. Will we leave it at that and learn no more from this? If so, then the real monkey is definitely not the one in the picture.

What about considering wider impacts?

In the broader context, I believe the public should be asking questions to stay ahead of the game in terms of all legal rulings, and consider carefully the idea of non-human creation and ownership. We should ensure that State and commercial uses of drones are not set in place in ways that leave us playing catch-up later. We should be driving the thinking to help shape the society we want to see, and to shape the expectations of commercial and State use of drone technology.

What of the drones we cannot see, never hear, and yet which seem to be supported by our Governments? State surveillance piggybacks on commercial infrastructures and tools in other fields, such as communications and social media. We should stay ahead of how drones are increasingly used commercially (as in the Amazon pilot news) and we should demand much greater transparency of the existing drone use in our name, in security, surveillance and weaponry. [added 29 Aug 2014 > also see BT case in CW investigation].

Who controls government decisions and the ethics of drone or robot use? In all of these questions, it comes down to: who’s in a position of power? With power comes responsibility.

The ethics of use in war zones and in other military action, seen and unseen, are also something we should be asking to understand. To date, much of the public dismisses drone use as something which happens somewhere else and has nothing to do with us.

But these decisions do affect what is done in the name of our country, and that does, indirectly, reflect on us as its citizens. These decisions will shape future commercial uses, which will affect us as direct consumers or as indirect recipients of their impacts on wider society.

There’s lots to think about as drones develop into tools of art and applications in daily life. I know little of the legal aspects, what has been done already or is being considered today, or what will be in future. I just know I have lots of questions, as an everyday parent, considering what kind of society I hope my children, our future adult citizens, will inherit. Where do I ask to find out?

My questions are not so much about the technology or the law of it at all. They come down to its ethics, fairness and how this will shape the future. As a mother, that is the responsibility I bear for my children.

Will we see drones soon in ordinary life or in an everyday future?

In this Wired article, Kyle VanHemert states that part of Katsu’s aim with the drone is simply to raise questions about the transformative effect the machines might have on art. He plans for it to be open source soon. Some argue that tagging is not art but vandalism; you can see it in action via Motherboard’s video on YouTube here. If property marking becomes a blight on society, you can ask what purpose it serves. Others suggest drones could be used precisely to paint over graffiti, and be of practical use.

In Scotland it is a well-known joke that once the painters have finished repainting from one side of the Forth Road Bridge to the other, it’s time for them to start again. Perhaps those days are over?

Will we see them soon in everyday occupations, and will it make a difference to the average citizen? In commercial service, the mundane estate agent [no offence to those who are, you may be 007 or M in your spare time, I know] is reported to be one of the commercial market sectors looking at applications of the photographic potential. It could replace cameras on long poles.

“Unmanned drones can be used for a range of tasks including surveying repairs and capturing particularly good views from unusual angles.” [Skip Walker, stroudnewsandjournal.co.uk]

These uses are regulated in the UK and must have permission from the CAA.

So far though, I wonder if anyone I’ve met flying a hobby drone with a camera over our heads (veering wildly between tent pitches, and enthralling us all, watching it watching us) has requested permission as in point 2?

Regulation will no doubt become widely argued for and against in the public security and privacy debate, rightly or wrongly. With associated risks and benefits, drones have the potential to be of public service and entertainment, and to have uses we have not yet seen. How far off is the Jedi training remote game? How far off is the security training remote, which is not a game? How is it to be governed?

I have a niggling feeling that as long as State use of drones is less than fully transparent, the Government will not be in a rush to open the debate on the private and commercial uses.

Where does that leave my questions for my kids’ future?

Where is the future boundary in their use and who will set it?

The ethics of this ‘thinking’ technology in these everyday places must be considered today, because tomorrow you may walk into a retirement home and find a robot playing chess with your relative. How would you feel about the same robot running their bath?

Have you met Bob – the G4S robot in Birmingham – yet?

“While ‘Bob’ carries out his duties, he will also be gathering information about his surroundings and learning about how the environment changes over time”

“A similar robot, called ‘Werner’, will be deployed in a care home environment in Austria.”

How about robots in the home, which can read and ‘learn’ from your emotions?

I think this seemingly silly monkey-selfie case, though clearly anything but for the livelihood of David Slater, should raise a whole raft of questions that ordinary folk like me should be asking of our experts and politicians. Perhaps I am wrong, and the non-human author as animal and the non-human author as machine are clearly distinct and laid out already in legislation. But as the Compendium of U.S. Copyright Office Practices decision shows [open for comment, see footnote 2], at minimum the macaque-selfie shoot is not yet done in its repercussions. It goes beyond authorship.

Who decides what is creative input and intervention vs automatic or autonomous action? Where do you draw the line at non-human? Does Bob – the G4S robot in Birmingham – count?

We may be far off yet from AI that is legally considered to be ‘making its own decisions’, but when we reach the point where the owner of the equipment has no influence over, and no intervention in, what, when or where an image is shot, will we be at the point where there is no human author? Will we be at the point where there is no owner responsible for the action?

Especially, if in the words of Katsu,

“Eventually…the drone will develop a creative mind of its own.”

What may that mean for the responsibilities of drones & robots as security patrols, or as care workers? Is the boundary of responsibility undermined by weakening the boundaries of copyright, of ownership and autonomy of action?

If so, photographs being shot without a legally responsible owner is the least of my worries.

****

[1] Significant files ref: http://infojustice.org/archives/33164  Compendium of US Copyright Office practices – 3rd edition > full version: http://copyright.gov/comp3/docs/compendium-full.pdf

[2] Members of the public may provide feedback on the Compendium at any time before or after the Third Edition goes into effect. See www.copyright.gov/comp3/ for more information.

 

 

care.data should be like playing Chopin – or will it be all the right notes, but in the wrong order? [Part two]

How our data-sharing performance will be judged matters not just today, or in this electoral term, but for posterity. The current work-in-progress is not a dress rehearsal for a care.data quick talent show, but preparation for a lifetime performance at world standard.

How have we arrived where we are now, at a grand pause in the care.data performance? I looked at the past, reviewed through the Partridge Review meeting, in [part one here], the first half of this post from attending the HSCIC ‘Driving Positive Change’ meeting on July 21st. (Official minutes are online via HSCIC here.)

Looking forward, how do we want our data sharing to be? I believe we must not lose sight of classical values in the rush to be centre stage in the Brave New World of medical technology. [updated link August 3rd]* Our medical data sharing must be above and beyond the best model standards, to be acceptable technically, legally and ethically, worldwide. Exercised with discipline, training and precision, care.data should be the musical equivalent of Chopin.

Not only does HSCIC have a pivotal role to play in the symphony that the Government wishes research to play in the ‘health & wealth’ future of our economy, but it is currently alone on the world stage. Nowhere in the world has a comparable health data set over such a length of time as we do, and none has ever brought all its primary care records into a central repository to merge and link, as is planned with care.data. Sir Kingsley Manning said in the current July/August Pharma Times article that data sharing now has to manage its reputation, just like Big Pharma.

[image: reputation]
Pharma Times – July/Aug 2014 http://www.pharmatimes.com/DigitalOnlineArea/digitaleditionlogin.aspx

Countries around the world will be watching HSCIC and the companies and organisations involved in the management and use of our data. They will be assessing the involvement and reaction of England’s population to HSCIC’s performance. This performance will help shape what is acceptable; what works well, and its failings, will be learned from by other countries that will want to do the same in future.

Can we rise to the Challenge to be a world leader in Data Sharing?

If the UK Government wants England to be the world leader in research, we need to be exemplary not only in how we govern the holding, management and release of data, but also in our ethics model and expectations of each other in the data-sharing process.

How can we expect China [1], with whom the British Government recently agreed £14 billion in trade deals, [2] India, the country to which our GP support services are potentially poised to be outsourced through Steria, [3] or any other organisation… [Continue reading: care.data should be like playing Chopin – or will it be all the right notes, but in the wrong order? [Part two]]

Appendix F. For successful technology, reality must take precedence over public relations.

[image: Richard Feynman]
Richard Feynman via brainpickings.org bit.ly/1q1qWLt

June 6th 1986. Six months after the disaster, the Report to the Presidential Commission on the Space Shuttle Challenger Accident was released.

Just over twenty-eight years ago I, like fellow children and citizens around the world, had watched the recorded images from January 28th 1986. We were horrified to see one of the greatest technological wonders of the world break up shortly after launch and crash into the sea minutes later. The lives of Challenger’s seven crew were lost, amongst them the first ‘ordinary citizen’ and member of the Teacher in Space project, mother of two Christa McAuliffe.

As part of the follow-up audit and report, Richard Feynman’s personal statement was included as Appendix F: Personal observations on reliability of the Shuttle. You can read his full statement; below are just his conclusions and valuable lessons learned.

“If a reasonable launch schedule is to be maintained, engineering often cannot be done fast enough to keep up with the expectations of originally conservative certification criteria designed to guarantee a very safe vehicle. In these situations, subtly, and often with apparently logical arguments, the criteria are altered so that flights may still be certified in time.

They therefore fly in a relatively unsafe condition, with a chance of failure of the order of a percent (it is difficult to be more accurate).

Official management, on the other hand, claims to believe the probability of failure is a thousand times less. One reason for this may be an attempt to assure the government of NASA perfection and success in order to ensure the supply of funds. The other may be that they sincerely believed it to be true, demonstrating an almost incredible lack of communication between themselves and their working engineers.

In any event this has had very unfortunate consequences, the most serious of which is to encourage ordinary citizens to fly in such a dangerous machine, as if it had attained the safety of an ordinary airliner.

The astronauts, like test pilots, should know their risks, and we honor them for their courage. Who can doubt that McAuliffe was equally a person of great courage, who was closer to an awareness of the true risk than NASA management would have us believe?

Let us make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them. They must live in reality in comparing the costs and utility of the Shuttle to other methods of entering space. And they must be realistic in making contracts, in estimating costs, and the difficulty of the projects.

Only realistic flight schedules should be proposed, schedules that have a reasonable chance of being met.

If in this way the government would not support them, then so be it. NASA owes it to the citizens from whom it asks support to be frank, honest, and informative, so that these citizens can make the wisest decisions for the use of their limited resources. For a successful technology, reality must take precedence over public relations, for nature cannot be fooled.”

Richard Feynman, 1918 -1988

“The Challenger accident has frequently been used as a case study in the study of subjects such as engineering safety, the ethics of whistle-blowing, communications, group decision-making, and the dangers of groupthink. It is part of the required readings for engineers seeking a professional license in Canada and other countries.” [Wikipedia]

Feynman’s Appendix F: Personal Observations on Reliability of the Shuttle is well worth a read in full.

From a business-management point of view, lessons learned are integral to all projects, and there is no reason why they cannot apply across industries. But they are frequently forgotten or ignored in a project’s desire to look only ahead and achieve future deliverables on time.

Lessons learned can make a hugely important contribution to positive change and to shaping outcomes. Assessing what worked well and how it can be repeated is just as important as learning from what went wrong or what was missing.

Public relations efforts which ignore learning from the past, fail to acknowledge real issues, and gloss over reality doom a project to failure through false expectation. Whether due to naivety, arrogance, or leadership pressure, this can put a whole project in jeopardy and threaten its successful completion. Both internal and external stakeholder management are put at unnecessary risk.

In the words of Richard Feynman, “For successful technology, reality must take precedence over public relations.”

What is Care.data? Defined scope is vital for trust.

To date, it seems impossible to get a simple official line drawn around ‘what is care.data’. Scope creep is therefore inevitable and fair processing almost impossible. There is much misunderstanding: some see it exclusively as the one-time GP load to merge with HES; others confuse it with the Summary Care Record and its overlap, or with whether it will be used in read-only environments such as proactive care and out-of-hours, or by 111 and A&E services. The best unofficial summary is here, from a Hampshire GP, Dr. Bhatia.

Care.data is an umbrella initiative, which is planned over many years.

Care.data seems to be a vision. An ethereal concept of how all Secondary Uses (ref. p28) health and social care data will be extracted and made available to share in the cloud for all manner of customers. A global standard allowing extract, query and reporting for top-down control by the men behind the curtains, with intangible benefits for England’s inhabitants whose data it is. Each data set puts another brick in the path towards a perfect, all-knowing, care.data dream. And the data sets continue to be added to, with plans made for evermore future flows. (Community services make up 10 per cent of the NHS budget, and the standards that will mandate the national submission of the revised CIDS data are now not due until 2015.)

Whilst offering insight opportunities for top-down cost control, planning and ‘quality’ measures, right down to the low-level basics of invoice validation, it will not offer clinicians on the ground access to use data between hospitals for direct care. HES data is too clunky, too detailed with the wrong kinds of data, or too incomplete and inaccurate to benefit patients in the care of their individual consultants. At the Westminster Health Forum on 1st April, Prof Jonathan Kay told hospitals to do their own thing: to go away and make local hospital IT systems work. That is totally at odds with the ‘interoperability’ mantra of Beverley Bryant of NHS England earlier the same day. An audience question asked how we can ensure patients transfer successfully between hospitals without a set of standards. It is impossible to see good value for patients here.

Without a controlled scope I do not wish to release my children’s personal data for research purposes. But at the moment we have no choice. Our data is used in pseudonymous format, and we have no publicly communicated way to restrict that use. The patient leaflet, ‘Better information means better care’, certainly gives no indication that pseudonymous data use is obligatory, nor states clearly that only the identifiable data would be restricted if one objected.

Data extracted now offers no possibility to time-limit its use. I hope my children will have a long and happy lifetime, and can choose for themselves whether to be ‘a willing research patient’, as David Cameron stated in 2010 he would change the NHS Constitution to assume. We just don’t know to what purposes their data will be put in their lifetime.

Data held under an assumed opt-in should surely be expected to be used only for our care and nothing else, unless there is a proven patient need and benefit otherwise. All other secondary uses cannot be assumed without some sort of fair processing, but they already are.

The general public can now see for the first time the scope of how the HSCIC quango and its predecessors have been giving away our hospital records at arm’s length, with commercial re-use licences.

The scope of sharing and its security clearly depends on whether data is fully identifiable (red), truly anonymous and aggregated (green, Open Data), or so-called amber. This pseudonymous data is re-identifiable if you know what you’re doing, according to anyone who knows about these things, and is easy to re-identify when paired with other data. It’s illegal? Well, so was phone hacking, and we know that didn’t happen either, of course. Knowledge once leaked, is lost. The bigger the data, the bigger the possible loss, as Target will testify. So for those who fear it falling into the wrong hands, it’s a risk we simply have to trust is well secured. This scope of what can be legitimately shared, and for what purposes, must be reined in.
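The ‘amber’ risk is easy to see in a toy example. Here is a minimal sketch (all names, postcodes and values are invented; this models no real HSCIC or HES process): a pseudonymised record keeps no name or NHS number, yet matching its remaining quasi-identifiers against an auxiliary dataset can single a person out.

```python
# Toy illustration of pseudonymisation and linkage re-identification.
# All data here is invented; this models no real HSCIC process.
import hashlib

def pseudonymise(record: dict, secret: str = "pepper") -> dict:
    """Swap the direct identifier for a token; quasi-identifiers remain."""
    token = hashlib.sha256((secret + record["nhs_number"]).encode()).hexdigest()[:12]
    return {
        "token": token,                      # replaces the NHS number
        "postcode": record["postcode"],      # quasi-identifier
        "birth_year": record["birth_year"],  # quasi-identifier
        "diagnosis": record["diagnosis"],    # the sensitive attribute
    }

# A 'pseudonymous' hospital record: no name, no NHS number...
hospital = pseudonymise({
    "nhs_number": "9434765919",
    "postcode": "AB1 2CD",
    "birth_year": 1986,
    "diagnosis": "asthma",
})

# ...but an auxiliary dataset (electoral roll, marketing list, social
# media) pairs names with the same quasi-identifiers.
auxiliary = [
    {"name": "K. Smith", "postcode": "AB1 2CD", "birth_year": 1986},
    {"name": "J. Brown", "postcode": "EF3 4GH", "birth_year": 1952},
]

# Linking on postcode + birth year alone re-identifies the record.
matches = [
    p["name"] for p in auxiliary
    if (p["postcode"], p["birth_year"]) == (hospital["postcode"], hospital["birth_year"])
]
print(matches)  # → ['K. Smith']
```

A unique match hands the ‘anonymous’ diagnosis to a named person; the more columns a release contains, the fewer auxiliary fields are needed to make such a match unique.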

Otherwise, how can we possibly consent to something which may serve entirely different purposes down the line?

If we need different data for real uses of commissioning, various aspects of research and the commercial ‘health purposes,’ why then are they conflated in the one cauldron? The Caldicott 2 review questioned many of these uses of identifiable data, notably for invoice validation and risk stratification.

Parents should be able to support research without that meaning our kids’ health data is given freely for every kind of research, for eternity, and to commercial intermediaries or other government departments. Whilst I have no qualms about Public Health research, I do about pushing today’s boundaries of predictive medicine. Our NHS belongs to us all, free-at-the-point-of-service for all, not as some sort of patient-care trade deal.

Where is the clear definition of scope and purposes for either the existing HES data or future care.data? Data extractions demand fair processing.

Data is not just a set of statistics. It is the knowledge of our bodies, minds and lifestyle choices. Sometimes it will give others knowledge about us that we do not yet have ourselves.

Who am I to make a choice today which determines that my children have none forevermore? Why does the Government make that choice on our behalf, having originally decided not to tell us at all? It is very uncomfortable to feel it is Mother vs Big Brother on this, but that is how it feels. You have taken my children’s hospital health records and are using them, without my permission, for purposes I cannot control. That is not fair processing. It was not in the past, and it is not now. You want to do the same with their GP records, and planned not to ask us. And you still have not explained why many of us received no communications leaflet. Where is my trust now?

We need to be very careful to ensure that all the right steps are put in place to safeguard patient data for the vital purposes which need it: public health, ethical and approved research, and the planning and delivery of care. NHS England must surely step up publicly soon and explain what is going on. And ideally, commit to taking as long as necessary to get all the right steps in the right order. Autumn is awfully close, and nothing has yet changed.

The longer trust is eroded, the greater the chance of long-term damage to data quality, and to its use by those who genuinely need it. But it would be fatal to rush and fail again.

If we set the right framework now, we can build in a method by which every future change of scope triggers fresh communication and fair processing.

We need to be told transparently to what purposes our data is put today, so we can trust those who want to use it tomorrow. Each time purposes change, the right to revoke consent should be renewed. And not just going forward, but across all use of our records, historic and future.

How have we got here? The Secondary Uses Service (SUS) is the big data store from which Hospital Episode Statistics (HES) is a subset. HES was originally extracted and managed as an admin tool. From the early days of the Open Exeter system, GP patient data was used for our clinical care and its management. When did that change? Scope seems not so much to have crept as skipped along a path to it being OK to share the data, linked on demand even with Personal Demographics or QOF data, with pharma, all manner of research institutions and third-party commercial intermediaries, but no one thought to tell the public. Oops, says the ICO.

Without scope definition, there can be no fair processing. We don’t know who will access which data for what purposes. Future trust can only be built if we know what we have been signed up to, stays what we were signed up to, across all purposes, across all classes of data. Scope creep must be addressed for all patient data handling and will be vital if we are to trust care.data extraction.

***

 

care.data – 2. A mother’s journey in Oz: communication & choice

David Aaronovitch’s Times’ opinion article on March 27th stated data privacy fears have made health-data sharing “toxic” and that campaigners are nothing but a ‘man with a megaphone’, like the Wizard of Oz. My response, part two. Communications & Choice.

1939 – The Wizard of Oz – MGM

Honesty, clarity and real communication, not PR, is fundamental to a renewal of trust across these areas.

Today’s announcement via HSJ reveals that the HSCIC Chair had concerns over the impact of the care.data leaflet drop, and asked the Department of Health to intervene. One wonders, then, who made the decision to go ahead?

On care.data communications, the Times commentator imagined HSCIC thinking, “Stick out a leaflet, bish, bash, bosh.” The result seems to be more ding, dong. The balloon upped and left before anyone was ready to go, even though the ICO, GPs, representatives from the BMA and others, including the campaign group, had well-founded and serious concerns.

I spoke with HSCIC communications and managers directly last October, as well as my MP and the Department of Health, to flag how misleading I felt it was to tell patients ‘your name is not extracted’ when it is already held at HSCIC, which most of us did not know. Many of the same leaflet concerns were raised, far more significantly than by little ol’ me, by both the GPES advisory group in September and the ICO before the launch. So now, despite the £1-2M state-funded doormat-drop leaflet and cartoon, it’s all up in the air.

(Whilst I know that for HSCIC, with its own budget of £220M and control of a £1BN annual spend, this may be peanuts, what a waste of money. At a conservative estimate of £1M for the leaflet drop, at least 50 nurses could have been employed for a year on that. That makes me cross.) We still have no explanation of why so many leaflets were not delivered, what was done once it was known they had not been, nor any plans to put that right. It was our money spent. We deserve to know.

I received a reply to my October letter from the Secretary of State, assuring me that ‘patient identifiable data was not and will not be shared with third parties’. With subsequent information coming out about releases, that is, may I say, questionable at best. It has been shown that patient data at individual level has been shared, certainly with researchers. They are not my clinicians, and they are not the only third party who may have access. It is clearly documented by the CAG, and DAAG releases from 2013 have just been published in detail for the first time today.

Through the campaign groups’ and the ICO’s intervention, which demanded a national communications programme, and the subsequent ICO FOI release about the leaflet review and its shortcomings, we take a significant step towards transparency about why the leaflet failed to work for patients. It shows that all the issues we found after the event (junk mail vs letter, hard-to-reach groups, unclear language, the missing opt-out form, lack of internal communication, and the Information Commissioner’s concerns) were clearly known in advance, but ignored. Why it happened, who made the decision to go ahead anyway and what the follow-up will be, remains to be seen. With all the past experience and tools at the disposal of NHS England, it stretches my credulity to believe it was simply poorly executed. Let’s not forget, the original plan was not to tell us at all.

We need to stop hearing that we need a fix to communications. I’m trying to understand how, with everything at their disposal, they could have wanted or allowed such a thing to happen. It was no surprise that the leaflet drop was a disaster. HSCIC communications, its leaders, and now it seems the Department of Health, knew clearly. So why go ahead?

The point of the communication should have been to give us fair processing, and the leaflet said, ‘you have a choice.’ I have a duty to my children to safeguard their health, its provision in a safe State health service, and their autonomy for the future. As it stands, it seems impossible to choose all three.

Whilst the leaflet nominally gives us a choice, I struggle to see its value. There is some, but it is limited. The only true choice we have is before the extraction happens. A GP in Hampshire devised this flow chart to try to help his patients understand it. Anyone can object now and opt in later. But once opted in, there is no get-out clause.

If I don’t opt my children out now, they are in for life, whether they later want to exercise their Right to be Forgotten or not. If I change my mind later and want to opt out (after a huge breach becomes a media scandal, for example; or perhaps my child grows up to become a public figure, or contracts a rare condition and we worry about discrimination), it is impossible. Records will simply be re-labelled as pseudonymous. Really?

So if I share their data for secondary purposes by doing nothing, allowing it to be shared even with non-NHS intermediaries with ‘health purposes’ who sign up to care.data, it feels like I may as well flog it on eBay myself. Yet although I want to share it, under good governance, only for their care and its commissioning, that is impossible.

Surely we should be able to have their health records used only for their care and its direct management, in all forms? Pseudonymous is not anonymous. But we’ve been given a very limited choice. We can only restrict fully ‘identifiable’ data flows according to the leaflet.
The data that HSCIC already holds is simply given a new label, the HES ID instead of my NHS number, linked according to the bespoke request design, modified in ways I do not know, and then exchanged for cash with buyers ranging from commercial health analysts to medical researchers to intermediaries. The amendment to the Care Bill changes nothing, because as long as ‘health purposes’ are served, the customers are deemed acceptable.

What real kind of patient choice is that? Is my hospital data in pseudonymous, potentially re-identifiable form required from all, for all purposes, for all time, whether I like it or not? We weren’t given that choice in the only communication we were meant to have received (though no one in my area did), the leaflet ‘Better information means better care’.

Right now, the only options are to restrict the sharing of fully identifiable patient confidential data. The leaflet says this means you can restrict: 1) the flow between GP and HSCIC of the NHS number, DOB, postcode and ethnicity; and/or 2) flows out from the HSCIC, for anything other than commissioning, to the regional DSCRO (one of 11 regional data processing centres). The second option also prevents researchers, even with Regulation 5, Section 251 approval, from obtaining red, fully identifiable data.

However, the objection code is not yet operational, so right now, our fully identifiable hospital data may be released without our knowledge or consent. Other data, considered non-personal, diagnoses, GP practice code, other local IDs from our records can still be shared. And according to September meeting minutes, there is no need to respect an objection for pseudonymous data.

To restrict the identifiable flow for care.data from the GP record, we need the code 9Nu0 applied to our record. 9Nu4 restricts the identifiable HES data flow. But the NHS number is extracted along with anonymous and aggregated data to identify who opts out. Since that must be matched against HES data already at HSCIC to find the record we want restricted, I don’t see how it can work without everyone’s data landing, being matched and being pseudonymised. I stand ready to be corrected.

We cannot restrict pseudonymous, potentially identifiable data sharing from HES at all. Patients were not told, before HES was extracted, that it would have all these secondary uses, and now we are told: tough luck? Without fair processing, it is not even legal. The Health and Social Care Act, the Secretary of State’s direction under Section 251, and the waiving of the common law duty of confidentiality all still require us to be informed before the event.

There is no clarity on the options offered in the leaflet, and no mention that pseudonymous data is shared even if you opt out. That is not choice. The only publicly loud supporters of real choice are the campaigners, who provided an opt-out form that official channels still have not.

Six weeks into the six-month pause, there has been no public communication, by NHS England or the Secretary of State for Health, to give us any clue what is being done to improve the situation. This is not good communication. And knowing that many parents, including friends, have no idea about the initiative, I just feel this is wrong.

I’ve written to my MP for the second time. I found in the whirlwind of information and my frustration, that Twitter #caredata and #datasharing offers an informed group of interested individuals. Thank goodness for their support, insights & banter in this tumultuous journey trying to understand what is going on. Until the ‘pause’, HSCIC and NHS England staff would engage and answer questions, too. Now they seem to have gone very quiet.

Like Dorothy, after seeing behind the curtain of how political and state decisions are made and executed, I have been surprised that so much happens ‘about us, without us,’ and will now never be quite as naive. We all deserve the full story, as patients and citizens. According to Jeremy Hunt at frequent presentations, and Tim Kelsey at Strata and other events, we are on the cusp of a brave new world of health data use and its wide ranging impact in our future healthcare provision of personalised medicine. If they expect to use me in that, I want to know how. So right now, there is no way I’m going home, until we know how the story ends.

Now, all this is not very constructive. Not like me at all. But what is past cannot be brushed away without clear answers. That would effectively say, ‘we don’t care we wasted your state money. We don’t care we misled you. We don’t care what you think.’ Get out the broomstick and clear up what went wrong and why. Then we can start fresh and see if together we can find solutions which fit the needs.

We are more than a cohort, and we are not a commodity. We need change.

If we are to be Cameron’s ‘willing research patients’, then tell us precisely what that involves. Give me a definition with a limited scope. I support appropriate research use. Aside from the fact that we didn’t know about this either, research via CPRD, THIN or QResearch all takes a different approach from the commercial and apparently limitless dynamic of care.data. It is quite one thing for researchers to access data and contact us for trials; quite another to find that, without our knowledge, our data may have been exchanged for cash. And I want to know it has not been used in research abroad, nor in projects with which my ethics may fundamentally disagree.

Data is not just a collection of codes and academic algorithms. It is the detailed knowledge of the inner workings of our minds, bodies and lifestyles, which we entrusted to our medical guardians. It belongs to individual people who did not ask, nor sign up, to become part of Big Data. Treat my children’s data with the respect it deserves.

No number of animations, leaflets or letters with ‘improved communication’ is going to gloss over the fundamental fixes needed in handling patient data. Show us the flaw and what you have done to fix it. Along the lines of, ‘you said’, ‘we did’. Real communication.

And if you do decide to give us real choice, then make it statutory for life. Choice will only be worth having if we know that what we choose today, does not get transformed into something else tomorrow. It needs more than a magic wand to wave away the issues. Let’s hope the new care.data advisory group, can make it happen.

care.data – 1. A mother’s journey in Oz: transparency.

1939 The wizard of Oz MGM

David Aaronovitch’s Times’ article on March 27th stated data privacy fears have made health-data sharing “toxic” and that campaigners are nothing but a ‘man with a megaphone’, like the Wizard of Oz.

Mr. Aaronovitch chose the perfect fairy tale, but like Dorothy, it landed the wrong way round.

It is long overdue that the curtain of secrecy, behind which the mechanics of the Health and Social Care Information Centre have operated, was finally pulled away. Our medical records shared and sold for over 25 years? We had no idea, and even now we find out with whom and how they have been used only through the campaigners.

The group the article described as ‘not speaking for most of us’, MedConfidential, has in fact spoken with support from leading figures across a wide range of professional organisations, including before the Health Select Committee alongside the Chair of the BMA GP Committee on Feb 25th. They have spoken about patient choice and fair processing, technical security issues and good governance, to get the care.data scheme right and secure a good foundation on which to build safe and trusted patient data practices.

I should think not just ‘most of us’, but in fact all of us, want to get these things right. They need to be right for the informed public to support the system, not just come autumn, but for life. Otherwise they risk revolt, and more than just this system will lose support.

Yet six weeks into the six month delay, we see no publicly communicated changes.

The toxic ‘smoke and mirrors’ lack of transparency to date must change; this scheme is too important to hide away and get wrong. This sort of attitude is precisely why failed IT programmes have repeatedly cost the country billions over the past ten years, whether at the MOD, the BBC or the Department of Health. The NPfIT, via the now-named HSCIC, continues making the same mistakes at arm’s length from the DH, and whilst refusing to apologise, projects carry on regardless, wasting money, time, and public and professional trust.

Kingsley Manning, Chair of HSCIC said last week, “One of our key measures of success might have been that we were safely below the radar of public attention.” He may as well have said, “Pay no attention to the man behind the curtain!”

He stated that an “innocent lack of transparency” has fuelled suspicion that arrangements for organisations’ use of data were “unfairly tipped in favour of profit making”. Perhaps it is rather the HSCIC 2013-15 Roadmap which gives us fact, not suspicion. By 2015 HSCIC would ‘agree a plan for addressing the barriers to entry into the market for new commercial ventures’ using our data provided by the HSCIC, and:

“Help stimulate the market through dynamic relationships with commercial organisations,
especially those who expect to use its data and outputs to design new information-based services.”

Working with care.data is promised as a sweetener to commercial business, to ‘innovators of all kinds’  including Google for unproven State economic development and gain. Why should any commercial monkeys, even under the wings of ‘healthcare purposes’, carry off a piece of our most intimate personal data without asking our permission, when we go for healthcare at our most vulnerable and trusting?

Thank goodness for the privacy campaigners, the Freedom of Information requestors, and the experts and professionals who altruistically take the time and trouble to champion the patient and public interest. Otherwise, we would not have been informed of these plans at all.

The rights of fair processing and Data Protection appear to be trampled upon in the rush to implement the increased sharing of pseudonymous data, which is not anonymous yet not protected.

MedConfidential offers a simple method to enable the opt-out of identifiable data flows, which NHS England did not. A right to objection was offered by the Secretary of State for Health, to be upheld as ‘a constitutional rather than legal right’. NHS England’s unclear leaflet wording, and the absence of any form compared with the SCR opt-out, make the intent of the process hard to understand.

We need honesty, clarity and communication, not PR. Transparency is fundamental to a renewal of trust across these areas.

Don’t tell us one thing and say another to business and government. Talk to us without spin. Give us clarity of purpose, choice, good independent governance, defined scope and an ongoing communications plan. Let me understand why you need fully identifiable data, how it will be used and by whom, and how you will protect pseudonymous, re-identifiable records. Don’t appear to use technicalities to get what you want. Not only must our data protection be legal, it must be seen to be legal. Listen to the informed critics. Ensure ethics governs commercial decision-making. Address the risks as well as the benefits, and tell us your forward plans. Then perhaps you will have paved the pathway to properly use our world-class data in the world-class NHS, for the public good.

Oh, and please get rid of the monkeys.

care.data – Intro. A mother’s journey in Oz.

Mother’s Day seemed as good a day as any to reflect on how I safeguard my children’s future in a cloud-based digital world, and currently, on care.data. Ever since I first read last summer about the initiative to be implemented by the Health and Social Care Information Centre, I have followed it in as much depth as time has permitted. I began the journey as an NHS patient who believed my health records were used by my GP at my GP practice. In 2010 I had opted out of the Summary Care Record. I usually read forms to the end and tick the boxes, or not, to keep my data confidential.

Along the way, I have been surprised to learn our hospital records were used for anything other than our care and its delivery. I have been shocked to see how widely they have been distributed to third parties, in various formats. I have come to understand how our health data, entered at a whole range of different entry points (Prescription Service, Choose and Book, mental health and more), ends up stored in linkable silos under the umbrella of one organisation. And I have learned that the more I know, the more patients like me should know. So, feeling that this is missing in the current online debate, I have decided to share my learnings, from a patient’s point of view.

David Aaronovitch’s Times’ opinion article on March 27th stated that data privacy fears have made health-data sharing “toxic” and that campaigners are nothing but a ‘man with a megaphone’, like the Wizard of Oz. Whilst he is correct that there is a vocal minority, I believe that is simply because the majority have not been able to take the time, or had the interest, to get to grips with the subject in depth. I have, albeit as an ordinary lay person on the outside.

There has been little opportunity for discussion of our ordinary patient opinion. Yet it is all of our records, ordinary patients, parents and children, which are being handled as a commodity beyond our direct care, without past knowledge or consent. I think a lot about it, and have broken this into parts. Part one: Transparency, Part two: Communications and Choice. Part three looks at the simplest concrete risks the Times article believed, “have made for public disquiet, but when you examine them they behave like candyfloss”.

I’ve followed it for almost eight months now. Its highs and lows still need a brain, heart and courage. By standing up, I risk being labelled ‘selfish’, a consent fetishist, or scaremongering. I don’t believe it is any of those to seek facts, education and engagement.

So here’s my #caredata story so far.