Notes and thoughts from Full Fact’s event at Newspeak House in London on 27/3 to discuss fake news, the misinformation ecosystem, and how best to respond. The recording is here. The contributions and questions part of the evening began at 55:55.
What is fake news? Are there solutions?
1. Clickbait: celebrity-driven content designed to pull online visitors and traffic towards sites funded by advertising – the answer is to kill the business model
2. Mischief makers: deceptive content with hostile intent – bots and trolls with an agenda
3. Incorrectly held views: ‘vaccinations cause autism’ despite the evidence to the contrary. How can facts reach people who only believe what they want to believe?
Why does it matter? The scrutiny of people in power matters – to politicians, charities, think tanks – as well as the public.
It is fundamental to remember that we do in general believe the public has a sense of discernment; however, there is also a disconnect between an objective truth and some people’s perception of reality. Can this conflict be resolved? Is it necessary to do so? If yes, when is it necessary, and who decides that?
There is a role for independent tracing of unreliable information, its sources and its distribution patterns and identifying who continues to circulate fake news even when asked to desist.
Transparency about these processes is in the public interest.
Overall, there is too little public understanding of how technology and online tools affect behaviours and decision-making.
The Role of Media in Society
How do you define the media?
How can average news consumers distinguish between self-made, widely distributed content and content from established news sources?
What is the role of media in a democracy?
What is the mainstream media?
Does the media really represent what I want to understand? > Does the media play a role in failure of democracy if news is not representative of all views? > see Brexit, see Trump
What are news values and do we have common press ethics?
New problems in the current press model:
Failure of the traditional media organisations in fact checking; part of the problem is that credible media outlets are under intense pressure as they compete for a share of advertising money.
Journalism is under-resourced. Verification skills are lacking and the tools can be time-consuming: techniques like reverse image search and other forms of verification take effort.
Press releases built around numbers are harder to scrutinise, so how do we ensure misinformation does not slip through via poor journalism?
What about confirmation bias and reinforcement?
What about friends’ behaviours? Can and should we try to break these links if we are not getting a fair picture? The Facebook representative was keen to push responsibility for the bubble entirely to users’ choices. Is this fair given the opacity of the model?
Have we cracked the bubble of self-reinforcing stories being the only stories that mutual friends see?
Can we crack the echo chamber?
How do we start to change behaviours? Can we? Should we?
The risk is that if people start to feel nothing is trustworthy, they will trust nothing. This harms relations between citizens and state, organisations and consumers, professionals and the public, and between us all. Community is built on relationships. Relationships are built on trust. Trust is fundamental to a functioning society and economy.
Is it game over?
Will Moy assured the audience that there is no need to descend into blind panic and there is still discernment among the public.
Then, it was asked, is perhaps part of the problem that the Internet, in its current construct, is incapable of keeping this problem at bay? Is part of the solution re-architecting and re-engineering the web?
What about algorithms? Search engines started with word frequency and ostensibly neutral ranking decisions but are now much more nuanced and complex. We really must be able to see how these systems decide what is published. Search engines provide, but also restrict, our access to facts, and ‘no one gets past page 2 of search results’. Lack of algorithmic transparency is an issue, but it is unlikely to be solved due to commercial sensitivities.
Fake news creation can be lucrative. Management models that rely on user moderation or comments to provide balance can be gamed.
Are there appropriate responses to the grey area between trolling and deliberate deception through fake news that is damaging? In what context and background? Are all communities treated equally?
The question came from the audience whether the panel thought regulation would come from the select committee inquiry. The general response was that it was unlikely.
What are the solutions?
The questions I came away thinking about went unanswered, because I am not sure solutions exist as long as the current news model persists and is funded in the current way by the current players.
I believe one of the things that permits fake news is the growing imbalance of money between the big global news distributors and independent and public interest news sources.
This loss of balance reduces our ability to decide for ourselves what we believe and what matters to us.
The monetisation of news through its packaging in between advertising has surely contaminated the news content itself.
Think of a Facebook promoted post – you can narrow your audience to a very selective set of characteristics. The bubble that receives that news is already likely to be connected through similar interest pages and friends, and the story becomes self-reinforcing, showing up in friends’ timelines.
A modern online newsroom moves content around the webpage according to what is getting the most views, and trending-topic lists encourage viewers to see what other people are reading – again, self-reinforcing.
There is also a lack of transparency of power. Where we see a range of choices from which we may choose to digest a range of news, we often fail to see one conglomerate funder which manages them all.
The discussion didn’t address at all the fundamental shift in “what is news” which has taken place over the last twenty years. In part, I believe the responsibility for the credibility level of fake news in viewers lies with 24/7 news channels. They have shifted the balance of content from factual bulletins, to discussion and opinion. Now while the news channel is seen as a source of ‘news’ much of the time, the content is not factual, but opinion, and often that means the promotion and discussion of the opinions of their paymaster.
Most simply, how should I answer the question my ten-year-old asks: how do I know if something on the Internet is true or not?
Can we really say it is up to the public to each take on this role and where do we fit the needs of the vulnerable or children into that?
Is the term fake news the wrong approach and something to move away from? Can we move solutions away from target-fixation ‘stop fake news’ which is impossible online, but towards what the problems are that fake news cause?
Interference in democracy. Interference in purchasing power. Interference in decision making. Interference in our emotions.
These interferences with our autonomy are not something the web itself is responsible for, but the people behind the platforms must be accountable for how their technology works.
In the meantime, what can we do?
“if we ever want the spread of fake news to stop we have to take responsibility for calling out those who share fake news (real fake news, not just things that feel wrong), and start doing a bit of basic fact-checking ourselves.” [IB Times, Eliot Higgins is the founder of Bellingcat]
Not everyone has the time or capacity to do that. As long as today’s imbalance of money and power exists, truly independent organisations like Bellingcat and FullFact have untold value.
The billed Google and Twitter speakers were absent because they were invited to a meeting with the Home Secretary on 28/3. The speakers were Will Moy, Director of FullFact; Jenni Sargent, Managing Director of firstdraftnews; and Richard Allan, Facebook EMEA Policy Director. The event was chaired by Bill Thompson.