We must build a society that is resilient against disinformation

Traditional fact-checking is no match for the power of the crowd in the age of social media

In recent years, the internet has become the venue for a general collapse in trust. Trolling, fake news and “doing your own research” have become such a fixture of public discourse that it’s sometimes easy to imagine the online revolution has brought us nothing but new ways to be confused about the world.

Social media has played a particularly significant role in the spread of disinformation. Malicious state enterprises such as the notorious Russian “troll farm” are part of this, certainly. But there is a more powerful mechanism: the way it brings together people, whether flat-earthers or anti-vaxxers, who would find it difficult to meet like-minded folks in the real world.

Today, if you’re convinced our planet isn’t round, you don’t have to resort to standing on street corners with a sign, shouting at passersby. Instead, you have access to an online community of tens of thousands of individuals producing content that not only tells you you’re right, but also builds a web of pseudo-knowledge you can draw from any time you feel your beliefs are being challenged.

The same kinds of “counterfactual communities” arise around any topic that attracts enough general interest. I’ve witnessed this myself over the past decade while looking into war crimes in Syria, Covid-19 disinformation and now the Russian invasion of Ukraine.

Why do counterfactual communities form? A key factor is distrust in mainstream authority. For some, this is partly a reaction to the UK and US governments’ fabrications in the build-up to the 2003 invasion of Iraq. For others, it stems from a sense of injustice around the Israel-Palestine conflict. These are of course legitimate positions, and are not by themselves indicative of a tendency to believe in conspiracy theories. But a pervasive sense of distrust can make you more vulnerable to slipping down the rabbit hole.

Moral injury

One way of looking at this is that government deception or hypocrisy has caused a form of moral injury. As with the proverb “once bitten, twice shy”, that injury can result in a kneejerk rejection of anyone perceived as being on the side of the establishment.

This creates a problem for traditional approaches to combating disinformation, such as the top-down fact check, which might be provided by a mainstream media outlet or some other organisation. More often than not, this will be discredited, dismissed with: “They would say that, wouldn’t they?”

Fact-checking outfits may do good work, but they are missing a crucial component: the power of the crowd. Because, as well as counterfactual communities, we’ve seen what you might call truth-seeking communities emerge around specific issues. These are the internet users who want to inform themselves while guarding against manipulation by others or against being misled by their own preconceptions. Once established, these communities will not only share and propagate fact checks in a way that lends those checks credibility, but will often do the fact-checking themselves.

What’s important about these communities is that they react quickly to information being put out by various actors, including states. In 2017 the Russian ministry of defence published images on social media that it claimed showed evidence of US forces assisting Islamic State in the Middle East. Huge if true – except it was debunked almost immediately, when social media users realised the Russian MoD had used screenshots from a computer game.

I would go as far as to say that internet users who are heavily engaged with particular topics are our strongest defence against disinformation. At Bellingcat, a collective of researchers, investigators and citizen journalists I founded in 2014, we’ve seen this play out in real time during the Russian invasion of Ukraine.

Our investigation of the downing of Malaysia Airlines flight MH17 over eastern Ukraine helped create a community focused on the conflict there, one that uses open-source techniques to examine, verify and debunk all manner of information. In the weeks leading up to the Russian invasion, members started gathering videos and photographs of Russian troop movements that forewarned of the planned attack, and proactively debunked disinformation being spread by separatists, including a supposed IED attack staged with bodies that were shown to have been autopsied before they arrived at the scene.

Geolocate

After the invasion got under way, many of the same people helped collect and geolocate videos and photographs, including images of potential war crimes, that Bellingcat and its partners have been verifying and archiving for possible use in future accountability processes. These contributions, however disjointed and chaotic they may appear, have saved our verification team vast amounts of time.
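To give a flavour of what this kind of verification can involve at its most basic level: when an original image file is available, embedded EXIF metadata will sometimes record where a photo was taken (most social media platforms strip this data on upload, so it is the exception rather than the rule). The sketch below is purely illustrative rather than a description of Bellingcat’s own tooling; it assumes Python with the Pillow imaging library and a hypothetical file name, and simply reads any GPS coordinates stored in a photo.

    from PIL import Image
    from PIL.ExifTags import GPSTAGS

    def _to_degrees(dms):
        # EXIF stores coordinates as (degrees, minutes, seconds) rationals
        d, m, s = dms
        return float(d) + float(m) / 60.0 + float(s) / 3600.0

    def gps_from_photo(path):
        # Return (latitude, longitude) from a photo's EXIF data, or None if absent
        exif = Image.open(path).getexif()
        gps_ifd = exif.get_ifd(0x8825)  # 0x8825 is the standard GPS Info IFD tag
        if not gps_ifd:
            return None
        gps = {GPSTAGS.get(tag, tag): value for tag, value in gps_ifd.items()}
        if "GPSLatitude" not in gps or "GPSLongitude" not in gps:
            return None
        lat = _to_degrees(gps["GPSLatitude"])
        lon = _to_degrees(gps["GPSLongitude"])
        if gps.get("GPSLatitudeRef") == "S":
            lat = -lat
        if gps.get("GPSLongitudeRef") == "W":
            lon = -lon
        return lat, lon

    # Hypothetical example: print coordinates for a locally saved photo
    print(gps_from_photo("troop_convoy.jpg"))

In practice, because such metadata is so often missing, most of the geolocation described above is done the slow way: by matching landmarks, road layouts and shadows in the footage against satellite imagery and street-level photographs.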

But how do you grow and nurture what are essentially decentralised, self-organised, ad hoc groups like this? Our approach has been to engage with them, drawing on their most useful social media posts in our own publications (all thoroughly fact-checked by our team) and crediting them for their efforts. We also create guides and case studies so that anyone who is inspired to give it a go can learn how to do it themselves.

But there’s more to do than simply wait for crowds of investigators to emerge and hope they’re interested in the same things we are. We must take a broader approach. The answer lies in creating a society that is not only resilient against disinformation, but also has the tools to contribute actively to efforts towards transparency and accountability.

For example, the digital media literacy charity The Student View has been going into schools and showing 16- to 18-year-olds how to use investigative techniques to look into issues affecting them. In one case, students in Bradford used freedom of information requests to uncover an unusually large number of high-speed police chases in their area.

Teaching young people how to engage positively with the issues they face, and then expanding this work into online investigation, is not only empowering; it gives them skills they can use throughout their lives. This is not about turning every 16- to 18-year-old into a journalist, police officer or human rights investigator, but about giving them tools they can use to contribute, in however small a way, to the fight against disinformation. In their home towns, in conflicts such as Ukraine, and in the wider world – they really can make a difference.

Eliot Higgins is founder of the Bellingcat investigative journalism network and author of We Are Bellingcat: An Intelligence Agency for the People

Further reading

  • This Is Not Propaganda: Adventures in the War Against Reality by Peter Pomerantsev (Faber & Faber, £14.99)
  • The Misinformation Age: How False Beliefs Spread by Cailin O’Connor and James Owen Weatherall (Yale, £11.99)
  • A Field Guide to Lies and Statistics by Daniel Levitin (Viking, £14.99)