2. Disinformation in Context
Definition and scope
After gaining notoriety on both sides of the Atlantic, the term ‘fake news’ has gradually been succeeded by the now prevailing ‘disinformation’, but a level of confusion around related terminology persists. Ambiguous definitions make it more difficult to find possible remedies. ‘Fake news’ insinuates that news producers and journalists should be held accountable for the pollution of the information space, and therefore are also implicitly responsible for tackling the problem. While ‘fake news’ scapegoats journalists, ‘information warfare’ alludes to offensive strategies and lends itself to analysis that is often less nuanced and specific. The term ‘foreign influence’, although at times accurate, also runs the risk of placing domestic purveyors of propaganda, such as political actors or domestic proxies of foreign interests, beyond scrutiny. The scope of foreign influence is also broader than that of disinformation: according to the US Department of Justice (DoJ), the former may include hacking, malicious cyber activity, identity theft and fraud.4 Although it is often used interchangeably with these other descriptions, the term ‘disinformation’ enables a more nuanced and holistic analysis of what has become a global problem, by focusing on communication vectors and processes.
According to the European Commission’s Action Plan against Disinformation,5 disinformation is defined as ‘verifiably false or misleading information that is created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm’. Harm can entail threats to democratic political and policymaking processes by undermining ‘the trust of citizens in democracy and democratic institutions’. The inclusion of intentionality in the description also differentiates the term from misinformation,6 which denotes false or misleading content shared without the intent to deceive.
Disinformation can be overt, presenting factually false content, but it can also take more subtle forms, such as the cherry-picking of statistics to mislead audiences and prime them in certain ways,7 or the use of re-contextualized8 or even tampered-with9 visual material. Narratives can be adjusted to take advantage of the existing information space by tapping into divisive issues.10
Disinformation’s shape-shifting nature and agility make it a useful vehicle for hybrid threats, or what the European Centre of Excellence for Countering Hybrid Threats (Hybrid CoE) defines as a ‘coordinated and synchronised action, that deliberately targets democratic states’ and institutions’ systemic vulnerabilities, through a wide range of means [political, economic, military, civil, and information]’.11 Coordinated and amplified disinformation can crowd out rational debate and sow confusion and discord,12 numbing decision-making capacities. Indeed, hybrid threats aim to exploit the target’s vulnerabilities and generate ambiguity to ‘hinder decision-making processes’.13
Big data and its Faustian deal
Technological developments such as the ‘datafication’14 of different aspects of life, the rise of smart homes and smart cities, the Internet of Things (IoT), accelerating artificial intelligence (AI) development, and growing internet and mobile phone penetration have vastly exacerbated the combined ripple effects of disinformation’s complexity and scale. Prevailing ambiguities in data governance, a tech sector far removed from public scrutiny, and a utopian vision of how humanity and the market would interact with the internet – encapsulated in the famous Declaration of the Independence of Cyberspace by John Perry Barlow15 – created cracks in the system and enabled privacy-encroaching surveillance systems to be developed and refined. As Zuboff has highlighted, there is a need to attend to the anti-democratic implications of allowing the concentration of privacy rights ‘among private and public surveillance actors’, at the very moment those same rights are summarily and habitually removed from citizens resigned to the ‘Faustian deal’ of exchanging the right to privacy for a simulacrum of an effective digital life.16
Governments need to act to reverse this trend, which will otherwise only exacerbate the problem of disinformation. That is why the answer is not more surveillance of the online space or more debunking initiatives, but the restoration of gatekeeping roles to responsible actors that have been, or can be, regulated sufficiently to fulfil them.
Key manifestations
Suspicions of Russia’s influence operations in relation to the Syrian war,17 the downing of flight MH17,18 the US 2016 national elections,19 the US midterm elections,20 and the Novichok attacks in the UK have largely been confirmed, and the accompanying disinformation narratives have, to a large extent, been debunked. However, the country remains the main source of disinformation in Europe.21 Other state actors, such as Iran,22 China23 or North Korea, have also employed disinformation, as has been established by both the US and the European Parliament.24
Meanwhile, state-level domestic propaganda has also grown in recent years. Alarmingly, research indicates that over 28 state actors around the world have manipulated social media to target domestic as well as foreign audiences.25 On both sides of the Atlantic, domestic actors such as politicians, commentators or the far right26 have also proved to be purveyors of disinformation, sometimes outperforming foreign actors in terms of reach. A case in point is research indicating that just two misleading claims made by UK politicians during the EU referendum campaign were cited in 10.2 times more tweets than were Brexit-related posts by Russian trolls.27
The objectives and vectors of disinformation vary just as much as the agents behind influence operations. Armed and civilian non-state actors have both deployed disinformation to serve their ideological or financial goals, with Islamic State of Iraq and Syria (ISIS)28 and a community of young Macedonians in Veles,29 respectively, being well-documented examples. These two instances showcase how multifaceted the problem of disinformation is, in terms of the different objectives pursued and the dissemination vectors used. While ISIS broadcast its radicalization messages on YouTube, the Macedonian actors took advantage of Google’s AdSense interface.30 The latter is part of a worldwide ad tech infrastructure that has only recently come under scrutiny: it uses online tracking,31 data-driven targeting and real-time bidding via ad exchanges to reward attention-grabbing clickbait, which has supported the monetization of ‘fake news’ content.32 Despite actions taken thus far, disinformation continues to be a profitable business.33
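To make the monetization mechanism concrete, the sketch below is a simplified illustration only: the data structure, field names and figures are hypothetical and do not reflect any real ad exchange’s API. It shows, under the common assumption that exchanges rank bids by expected revenue (bid multiplied by predicted click-through rate) in a second-price auction, why pages that reliably attract clicks – including clickbait – clear at higher prices and thus earn more for their publishers.

```python
# Illustrative sketch of real-time bidding, not any real ad exchange's interface.
from dataclasses import dataclass

@dataclass
class Bid:
    advertiser: str
    max_cpc: float        # maximum cost the advertiser will pay per click
    predicted_ctr: float  # exchange's estimated click probability on this page

def effective_cpm(bid: Bid) -> float:
    """Expected revenue per 1,000 impressions: the signal used to rank bids."""
    return bid.max_cpc * bid.predicted_ctr * 1000

def run_auction(bids: list[Bid]) -> tuple[Bid, float]:
    """Simplified second-price auction: the winner pays just above the runner-up's eCPM."""
    ranked = sorted(bids, key=effective_cpm, reverse=True)
    winner, runner_up = ranked[0], ranked[1]
    clearing_price = effective_cpm(runner_up) + 0.01  # price per 1,000 impressions
    return winner, clearing_price

# A page with a higher predicted CTR (e.g. attention-grabbing clickbait) wins
# the impression and clears at a higher price, which is what made the
# Veles-style 'fake news' sites profitable to run.
bids = [
    Bid("brand_campaign", max_cpc=0.40, predicted_ctr=0.010),
    Bid("performance_campaign", max_cpc=0.25, predicted_ctr=0.030),
]
winner, price = run_auction(bids)
print(winner.advertiser, round(price, 2))
```

Because the ranking signal rewards predicted engagement rather than accuracy, publishers optimizing purely for clicks are systematically favoured, which is the structural incentive the ad tech scrutiny described above seeks to address.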
The issue has become more complex due to the divergence in the motivations of individuals who receive, share and amplify disinformation. Internet and social media users may wilfully or unwittingly share false news in an attempt to signal their identity or values,34 rather than to influence their peers per se.
Disinformation has manifested itself first and foremost as a systemic issue, not solely an agent problem. Agents exploit the in-built vulnerabilities of the current digital ecosystem and the regulatory gaps in political environments that are already dislocated or prone to influence. Context is paramount in any response, and as Benkler et al. highlighted in their study of US media propaganda, ‘each country’s institutions, media ecosystems, and political culture will interact to influence the relative significance of the internet’s democratizing affordances relative to its authoritarian and nihilistic affordances.’35 Any attempt to move towards solving or containing the problem should be grounded in a common set of principles shared by the cooperating actors, and in a deep awareness of, and respect for, each system’s distinctive circumstances.