Chapter 1 - Orchestrated or emergent? Understanding online disinformation as a complex system

Disinformation spreads through a complex network of often independent actors. Many are traffickers in conspiracy theories or hoaxes, unified by a suspicion of Western governments and mainstream media. Their narratives, which appeal both to leftists hostile to globalism and military intervention and to nationalists opposed to immigration, are frequently infiltrated and shaped by state-controlled trolls and by altered news items from outlets such as RT and Sputnik. The motivations for participating in the spread of disinformation are varied and must be taken into account.

Almost daily, new revelations expose the extent to which the Russian government used social media and other online tools to interfere with democratic processes in the United States, Britain and elsewhere. These discoveries illuminate a multi-dimensional strategy using high- and low-tech tactics to generate and spread disinformation. They also suggest a complex system in which these tactics resonate with and shape the activities of various types of distinct and independent actors.

Examining the spread of conspiracy theories surrounding terrorist attacks and mass shootings in the United States offers a lens on the complex dynamics of this disinformation space. For example, after the Boston Marathon bombings, an online rumour claimed that the event had been a ‘black ops’ operation perpetrated by the US government. After the October 2015 shooting at Umpqua Community College, online communities of Reddit and Twitter users theorised that the event (like Sandy Hook three years earlier) was a ‘hoax’ staged by the government to justify gun control legislation. Similarly, the October 2017 shooting in Las Vegas was seen by some as a ‘false flag’ event carried out by members of the ‘new world order’, a cabal of conspirators who pull the strings of world events.

These conspiracy theories are all somewhat distinct, but each reflects a pattern of claims about other man-made crisis events, and they all connect to a small number of shared underlying themes or narratives:

  • The US government and other Western or NATO-affiliated governments are untrustworthy and are unjustified aggressors in conflicts around the world;
  • These governments and other powerful people manipulate world events to ensure their power; and
  • ‘Mainstream’ and corporate media are untrustworthy. They assist governments and other powerful actors in hiding the truth from people. They are ‘fake news’.

Many of these narratives are explicitly connected to an ‘anti-globalist’ or nationalist worldview. The term globalism is a relative of globalisation, used to characterise transnational perspectives [2] and policies that integrate free trade and open borders [3]. In practice, the anti-globalist label pulls people from seemingly disparate parts of the political spectrum onto common ground: it connects left-leaning individuals who oppose globalisation and foreign military intervention by the US and other NATO governments with right-leaning individuals who oppose immigration and favour nationalist policies.

Tracking the spread of these conspiracy theories and their related narratives demonstrates how state-sponsored information operations interact with organic communities of online users to spread disinformation.

For example, on 5 November 2017, a mass shooting at a church in small-town Texas took the lives of more than 20 people. Within hours, officials and mainstream media identified a suspect, a 26-year-old man who had a record of domestic violence and had been discharged from the US Air Force. However, before that narrative developed, and continuing even after it had been established, an alternative narrative claimed that the suspect was really an Antifa terrorist [4]. To advance this narrative, online activists on the political right doctored screenshots of the shooter’s Facebook profile to include an Antifa flag, manufacturing purported evidence for the theory, and then used social media to spread that content. The theory soon began to propagate through the Twittersphere among alt-right accounts. Popular alt-right blogger Mike Cernovich tweeted that details of the shooter were consistent with the profile of an Antifa member. Alex Jones, a right-wing media personality known for spreading conspiracy theories, commented that the shooter wore all black (reflective of leftist activists). The theory also took root in alternative media, appearing on web sites like TheGatewayPundit, YourNewsWire and BeforeItsNews. The Russian government-funded news outlet RT (formerly Russia Today) also helped to spread the claim, sharing a Facebook post that noted the shooter’s Antifa connections, including content from the doctored Facebook profile.

This activity follows a now-established pattern of online activity after mass shooting events. Recent research suggests that some of the initial conversations around these theories take place in the less visible (and more anonymous) corners of the Internet, such as Reddit, 4chan, Discord and others [5]. These theories are then spread and amplified, sometimes strategically, on Twitter and Facebook. Additionally, a surrounding ecosystem of web sites takes shape around these conspiracy theory-building conversations, supporting them with additional speculation, discussion and various forms of evidence [6]. This ecosystem consists largely of alternative media that position themselves as challenging mainstream narratives. It includes several web sites and blogs that push conspiracy theories and pseudo-science claims (eg, InfoWars, 21stCenturyWire and SecretsOfTheFed). Significantly, many web sites in this ecosystem are news aggregators, remixing and republishing content found elsewhere in the ecosystem (eg, BeforeItsNews and YourNewsWire). For alternative narratives about shooting events in 2016, the system also contained a few explicitly nationalist and white supremacist web sites (DailyStormer) as well as some seemingly left-leaning activist web sites (ActivistPost). Web sites from the Russian-funded media outlets RT and Sputnik are integrated into this ecosystem as well, and Iran’s PressTV also appears.

An open question is how the different pieces of this dynamic system of seeding, amplifying and spreading these theories fit together. It is not yet clear how much of this activity is emergent and how much is orchestrated (and by whom and why). However, there appear to be distinct actors, driven by varied and overlapping motivations. Six categories of motivation are proposed as part of a preliminary conceptual framework.

Sincere ideology

One set of actors within this system is ideologically motivated. These actors, including individual social media users as well as small organisations that operate web sites, blogs and other feeds, are ‘true believers’ in the messages they spread. The messages are largely anti-globalist (ie, anti-imperialism and anti-globalisation on the left; pro-nationalism and anti-immigration on the right). They are also explicitly critical and distrustful of mainstream media. These actors may indeed be affected by political propaganda, though causation is difficult to establish. At times they act as amplifiers of political propaganda, seeded with messages that they repeat and amplify. But many sincerely ideologically motivated actors can also be seen to generate their own content, without the need for continued direct seeding or coordination of messages.

Political propaganda

The activities of the second group of actors in this system, which include the intentional production, sharing and amplification of disinformation, can be viewed as part of a political strategy. Unlike the ideologically motivated actors, these actors are not necessarily true believers in the messages they share. In their messaging, they mix false information with factual information, and they intentionally connect other stories and narratives, often ones that appeal to the ideologically motivated actors, to their own. These politically motivated actors are adapting old strategies of disinformation to the potential of the information age, leveraging the technological infrastructure of the Internet to spread their messages further, faster and at lower cost than ever before. Pomerantsev and Weiss [7] have written that the purpose of disinformation is not necessarily to convince, but to confuse: to create muddled thinking across society and to sow distrust in information and information providers. There is evidence that this strategy is at work within this system. Another goal of disinformation is to create and amplify division in (adversarial) democracies, and this is visible as well.

Financial incentives

Other actors within this system are financially motivated. For example, there are numerous web sites selling online advertisements and health products. Many are essentially aggregators of ‘alternative’ and ‘pseudo’ media, regurgitating ‘clickbait’ content designed to attract users. Others, like InfoWars, integrate original content with content borrowed from other sites in the ecosystem, including RT, and use their platforms to peddle an array of products (eg, nutritional supplements).

Reputation gains

Another set of actors, particularly within the social media sphere, appears to be motivated specifically by the reputational and attentional benefits inherent to those platforms. Social media is designed to be engaging, and part of that engagement involves a feedback loop of likes and follows. In the disinformation space, especially among the alt-right, there appears to exist a set of actors who are primarily (or at least significantly) motivated by attentional and perceived reputational gains. Mike Cernovich and Jack Posobiec are two high-profile examples, but there are many others among the ‘crowdsourced elite’ on Twitter and elsewhere who spread alternative narratives and other politicised disinformation and have consequently gained considerable online visibility.

The last two categories are more conceptual. While not yet backed by large volumes of empirical evidence, they are nevertheless theorised as part of this complex system.

Entertainment

It is likely that some participants in the disinformation space engage simply for entertainment value, or ‘for the Lulz’, as the now-waning Anonymous collective would say. That slogan was meant to describe a kind of mischievous entertainment unique to online activity. Another way to think of this category is as an extension of gaming practices into the real world: disinformation can provide a platform for working together with online team mates and an avenue for embarking on culture-hacking quests (to spread certain ideologies).

Empowerment

Disinformation can provide an opportunity for a disempowered individual or group to assert agency and power in the world through digital action. This category includes 4chan denizens who use memetic warfare [8], the generation and propagation of graphical memes, to effect political change across the globe. Like the digital volunteers who feel empowered by coming together online after disaster events to assist affected individuals, this set of actors is motivated by collectively working in an online team for a cause (eg, electing a favoured candidate). They are perhaps less motivated by the cause itself than by the emotional reward of having an impact.

These latter motivations and the associated sets of actors are significant. Preliminary research suggests that purposeful disinformation strategies are not just leveraging the power of social media platforms, but are resonating with the activities of the online crowds that form within those platforms. For example, Russia-based troll accounts impersonating US citizens infiltrated online communities of alt-right Twitter users, functioning to both seed and amplify their messages during the 2016 US election cycle. The trolls also embedded themselves within left-leaning Twitter communities that formed around issues such as #BlackLivesMatter, working to amplify existing divisions in the United States. On another front, Russia-connected information operations have targeted online activist communities that take shape around anti-war ideologies, using them to spread messages challenging US and NATO activities in Syria.

By focusing on explicit coordination by and collusion with state actors, and by ignoring or under-appreciating the roles and motivations of these independent actors, researchers, journalists and policy-makers risk over-simplifying this system, limiting the development of effective solutions and under-informing public awareness of the problem. Importantly, this misses the opportunity to help everyday users of these systems recognise the role they play in the disinformation phenomenon. In other words, the problem of disinformation cannot simply be attributed to the design of technological systems or the deliberate actions of government-funded trolls. Solutions must also take into account the people who interact with and are affected by this information, not merely as victims, but as agents in its creation, propagation and (hopefully) correction.
