Decolonising the Internet.
Collective action against misinformation
Veronica Fanzio, University of Amsterdam
Decolonisation is not an individual choice; it demands collective, sustained, committed work. Let us feed these visions for a future of blue skies and open paths. Let us nourish each other with responsibility, care, affection, and patience.
Luiza Prado de O. Martins
In the digital realm, the truth takes about six times as long as falsehood to reach 1,500 people, according to the team led by Sinan Aral at the Massachusetts Institute of Technology. Far from novel, misinformation - used here as an umbrella term for fake news, conspiracy theories and manipulated facts - has been spreading since the dawn of the internet. Still, you have likely heard about misinformation and conspiracies more than usual lately. Since the COVID-19 outbreak, the spread of misinformation and conspiracies has exploded, seemingly abruptly, while we become increasingly aware of the multilayered impact it can have on our lives.
What are conspiracy theories?
Conspiracy theories are generally defined as attempts to explain the causes of impactful social and political events through claims of secret plots enacted by powerful actors. They carry ideological significance, associated with ‘aversion and distrust of powerful groups.’ Studies have even outlined a ‘conspiracy mindset,’ identified as one of the major causes of belief in misinformation and conspiracies, together with group attachment. However, conspiracy theories also serve specific purposes, much like myths. As such, they explain the world, distinguish between good and evil, and between ‘us’ and ‘them,’ breaking the world into simple pieces, while constituting a kind of manual for how to act in order to survive.
Many have debated the relationship between the growth of misinformation and the pandemic. The answer may lie in the myth metaphor: conspiracies are the easiest way to answer the countless questions raised by a global crisis. The more disoriented people are, the more easily conspiracy theories spread. They seem simpler, and therefore more understandable, than genetics or evolutionary biology, and at the same time propose a specific target to ‘deal’ with - one more visible than a microscopic virus.
While it is easy to point the finger at ‘the internet’ and conspiracy-lovers - and it is often right to do so - we cannot oversimplify. The spread of misinformation is both a profoundly complex phenomenon and a manifestation of a far larger issue: how we behave online.
What is so often neglected is that digital reality is not that different from our physical reality, simply because our actions have consequences in both, for ourselves and for others. ‘Navigating online’ literally means moving from one digital space to another. One outcome of this is the path the user traces - think of it as a trail of breadcrumbs left behind. These fragments are knowingly collected and analysed, mostly for advertising purposes, so that companies have enough data to profile a target audience with sometimes horrifying accuracy - leading scholars to speak of surveillance capitalism.
Although digital surveillance is as fascinating and worth discussing in greater detail as it is bottomless, it is crucial to mention that monetisation is one of the major forces behind the spread of some conspiracies. Freelon et al. argue that actors purposely spread fake news, polarising opinions and false rumours to promote specific ideological standpoints, attacking mainstream and institutional media outlets.
Individual users may be responsible too. Engaging with a post (liking, commenting or sharing) amplifies its reach, while ‘giving’ views to a YouTube video eases its way into the ‘recommended’ section, helping to widen a conspiracy’s audience. And while major companies have tried to deplatform controversial authors and confront misinformation by censoring damaging posts, users retain a powerful agency: they can republish debunked content on mainstream platforms and/or gather in less monitored online spaces. Moreover, according to a King’s College study, belief in conspiracy theories is also directly linked to social media usage and to relying only on information collected online.
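The amplification mechanism described above - each like, comment or share exposing a post to a new circle of users, some of whom share it in turn - can be illustrated with a toy model. The numbers below (average followers per user, the fraction of exposed users who re-share) are purely hypothetical assumptions, not platform data; the sketch only shows why reach compounds round after round once every share creates new potential sharers.

```python
def expected_reach(followers: float = 100, engage_rate: float = 0.03,
                   rounds: int = 5) -> float:
    """Toy cascade model (hypothetical parameters, not platform data).

    Each round, every share exposes `followers` users; a fraction
    `engage_rate` of those users re-shares, seeding the next round.
    Returns the total number of exposures after `rounds` rounds.
    """
    shares, reach = 1.0, 0.0  # start from a single seed post
    for _ in range(rounds):
        exposed = shares * followers   # everyone who sees this round's shares
        reach += exposed
        shares = exposed * engage_rate  # re-shares feed the next round
    return reach

# Next-round shares = followers * engage_rate * current shares, so any
# product above 1 makes reach grow geometrically: a modestly more
# "engaging" (e.g. more controversial) post snowballs far further.
print(expected_reach(engage_rate=0.03))  # growth factor 3 per round
print(expected_reach(engage_rate=0.05))  # growth factor 5 per round
```

In this sketch, raising the engagement rate from 3% to 5% does not add a little reach - it multiplies it severalfold over only five rounds, which is the intuition behind the snowball dynamic discussed below.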
Still, on the other hand, users do not have to be aware of the consequences of their online actions to contribute to the propagation of fake news [Read more in Bartosz Kołodziejczyk’s article in the 2nd issue]. And this is often the case. Just as the internet, by allowing (‘affording’) multilateral communication, has become a perfect environment for the dissemination of conspiracy theories, social media is not merely a more or less neutral carrier of content but a creation that actively influences our beliefs and decisions. Foremost, it has been designed that way, as discussed in the popular Netflix documentary ‘The Social Dilemma.’ To return to the issue of monetisation: on Facebook, the structure of information sharing is shaped by the business model, so the system favours ever more controversial content, which also fuels political polarisation. This does not mean that social media platforms are purely ‘evil’ and aim to do so; rather, it all becomes a game of power, obtained by manipulating public opinion.
Beliefs become navigational maps that govern users’ behaviour, leading in this case from one conspiracist source to another, in an information-gathering pattern that takes on the character of a snowball effect. Many stories of conspiracy spreading have been reconstructed and analysed until recognisable pioneering figures could be identified; ‘The New York Times’ piece on the infamous ‘Plandemic’ video provides an example. Such work is extremely helpful in mapping the networks behind the spread of misinformation, where the ‘investigation’ often leads to specific, recognisable individuals. But since a preacher without an audience would be labelled a fool, it must be stressed that the networks at play are mainly composed of larger groups of unnamed people - the audience that makes it possible for conspiracies to gain visibility, creating an interplay between top-down and bottom-up movements.
The internet is a system, or rather an ecosystem, and has to be approached with systemic solutions. Companies’ enforcement of deplatforming and individual fact-checking are certainly among the possible weapons against misinformation, but broader action is needed, beginning with investment in education and digital literacy. For instance, ‘OECD Policy Responses to Coronavirus: Combatting COVID-19 disinformation on online platforms’ proposes actions aimed at avoiding the fragmentation of reporting standards. These include governmental support for independent fact-checking organisations, the combination of technological solutions and human moderators, and the privileging of evidence-based approaches, conjointly with international cooperation.
The governing regulations have to be grounded in the digital ecosystem but substantially integrated with offline reality, since the two visibly intertwine and have real consequences for each other - as we can witness when conspiracists protest in the streets and people refuse to wear masks at all, often calling them ‘muzzles.’
These actions can be seen as the decolonisation of digital information, now plundered and abused by ideological enforcers and power-seekers who undermine democratic deliberation and feed distrust in mainstream institutions and news media outlets. There is a need for an ecosystem-based mindset centred on interconnection. Users’ impact has to be embraced and made more purposeful for a positive collectivity - not a flattening homogenisation, but a splendidly detoxified complexity and diversity.