Cooperation between the US and EU is paramount in tackling disinformation. Their collaboration should adopt a broad-based strategy that:
- Sees the big picture: Joint responses should be multipronged, coordinated, integrated and sustainable, and their effectiveness continuously evaluated, as disinformation vectors and techniques evolve, adapt and migrate rapidly.
- Forms a united front: This means engaging all relevant actors: electoral commissions, media regulators, competition authorities, journalists, civil society, human rights and communication experts, technologists and legislators.
- Changes the narrative: Technological innovation is not an end in itself; it should be seen as a tool that serves the needs of society, not one left to define uncritically how society operates. As a tool, it can assist in tackling disinformation.
- Learns to learn: The Fourth Industrial Revolution is upon us, and it demands upskilling and training at all levels, all the way up to executives and heads of state, so that key decision-makers are in a position to deal with externalities such as vulnerabilities in the information space.
Recommendations for the EU Delegation in the US
The EU Delegation in Washington is uniquely placed to facilitate coordination between the EU and the US in their efforts to tackle disinformation. An immediate action could be convening a day-long workshop at which the European Commission would share best practices following the May 2019 European Parliament elections and liaise with US counterparts to discuss how to improve and expand the CoP, consider its adaptation to the US environment, refine its goals and build in an effective monitoring mechanism able to impose sanctions on transgressors. Even though the final report on the CoP's implementation will not be produced before the end of 2019, US counterparts, with the next presidential election coming up in 2020, would profit from the Commission's experience in tackling the distorting effects of disinformation ahead of elections.
The EU Delegation can serve as a conduit between the internal network on disinformation established within the Commission and US counterparts. It should also have a presence at the next Digital Assembly, co-organized by the European Commission and the Presidency of the Council of the European Union, as well as at its future incarnations. The EU Delegation should establish a working group bridging the two sides of the Atlantic and linking their long-term strategies for protecting democratic processes from disinformation, whether it originates from within or without. The working group should aim to reduce disinformation based on an agreed set of principles, and should set out the institutional capacity necessary to facilitate closer cooperation in future.
Recommendations for EU–US cooperation
Both the US and the EU are members of the G7's RRM. As such, the two allies should take the lead, build on the mechanism's analysis and work towards creating a multi-stakeholder forum that brings together professionals from different disciplines and from both the private and the public sector. More broadly, working with Hybrid CoE and the TCEI, the EU Delegation can assist in coordinating global efforts led by the EU and the US. What follows is a series of recommendations that both the EU and the US could follow.
Common short-term recommendations
Regulate digital intermediaries and broaden oversight
Despite authorities’ understandable wariness of reversing mergers and acquisitions, they should examine all possible options, including imposing a moratorium on political microtargeting and multivariate political ad testing, auditing platforms’ business models and algorithmic systems, or even breaking up Big Tech. The latter option was aired in the European Parliament with regard to Google, and even though the US has been predominantly averse to interventions of this scale, the FTC’s statements may be pointing to a change of direction.
Still, momentum for the break-up of Big Tech is unlikely to build in the home country of these companies, so EU partners should take the lead in reformulating the debate on a robust evidence base and in a way that could ease the political burden on US counterparts. A reorientation of the concept of monopoly is needed in a market increasingly controlled by ‘data power’. While big technology companies continue to diversify and expand into new markets, a first step could be restricting the markets they can enter, at least until meaningful oversight of their current operations is in place.
Oversight should extend beyond Big Tech to include increasingly popular platforms such as Instagram, Reddit, Telegram and messaging apps in general, in order to pre-empt the vulnerabilities created by the coming shift in how disinformation is disseminated. Any future regulators should also incorporate resilient monitoring workflows for newcomers in the digital information space. The practices of global companies based outside the US and the EU, such as WeChat, Weibo and TikTok, should also be monitored.
Internet Service Providers (ISPs) and big telecom companies should be included in any regulatory overhaul, if not to re-examine liability then to re-evaluate their responsibilities for protecting users’ data, the misuse of which can leave users exposed to disinformation campaigns. In the US, for example, Comcast provided data to NBC for its audience-targeting platform, and since the 2017 repeal of the FCC’s Broadband Consumer Privacy Rules, ISPs have been allowed to sell consumers’ information without consent. In March 2019 the FTC launched an inquiry into the privacy practices of ISPs, indicating that the US has started taking notice.
Urgent research into sociotechnical systems
Evidence-based research is the priority before moving on to statutory regulation, so that policymakers can ascertain the real impact of disinformation in the EU and the US based on real data and testimonies. Regrettably, clear and verifiable links between cause and effect are still lacking in disinformation research. The effectiveness of present approaches, such as prioritizing transparency, has to be properly audited too: without appropriate oversight and enforcement mechanisms, transparency can become a red herring.
Create state-level technology and digital intermediary regulators
New state regulators should be tasked with monitoring market penetration, conducting human rights audits of terms of service, platform structure and interactivity, reviewing content moderation updates, and tracking technological adoption more broadly. Regulators should draw on research, standardize the format of transparency reports and processes, and adopt a speculative outlook. They should launch inquiries into ad tech operations and development, tracking across platforms and devices, profiling by Big Tech and data brokers, and private strategic communication companies. Regulators should also have oversight of new technological tools created to tackle problems as they arise, and be able to access removed illegal content. State regulators should have a platform to convene on a regular basis, to exchange best practices and advise on federal or EU-wide policy.
Establish safeguards against conflicts of interest and regulatory capture
Protect policymaking processes by refining conflict-of-interest disclosure policies, by vigilance in assembling stakeholders that can advance the public interest, and by drawing on expertise that is verifiably independent of vested interests. Ensure balanced representation in decision-making forums that includes small and often under-represented stakeholders, and avoid regulatory capture by digital intermediaries or other actors. There should not be a ‘revolving door’ between policymaking and highly paid tech jobs. Capture of policymakers or other stakeholders can take various forms, from campaign contributions and lobbying to the long-term establishment of structural or financial dependencies.
Avoid rushed regulation
Adaptive principles and evidence-based regulation are appropriate for the fast-moving technology sector. Evaluate whether existing regulatory frameworks can be extended to digital intermediaries or whether new ones need to be created. Regulation should start with distribution, not content. Digital intermediaries may prefer regulation of content, since distribution is the key driver of engagement, which is precisely what they monetize at scale; a focus on regulating distribution, and therefore amplification, would also alleviate freedom of expression concerns. New regulatory frameworks should not be exploited to suppress political dissidents, journalists or freedom of expression, and appeal and recourse mechanisms against intermediary or regulatory overreach should be put in place.
Support media plurality and public-service journalism, and address media reform
Media pluralism is important in avoiding the ‘propaganda feedback loop’. Legacy media’s codes of practice and journalistic codes of ethics should be updated so that actors repeatedly shown to disseminate disinformation or misinformed claims eventually lose the credibility and amplification that legacy media provide. Media ownership has to be transparent across the board, and media plurality preserved from the negative externalities of mergers. On the journalism front, impartiality should not be conflated with the uncritical provision of a platform to false claims and repeat disinformation offenders: balanced reporting does not mean a free-for-all. In a highly polarized or polluted information environment, strategic silence needs to be employed selectively by journalists and editors on the basis of the public interest, independently of online engagement metrics and ratings. Participants in debates need to be selected on the basis of their credibility and demonstrable commitment to informed, evidence-based dialogue that advances democratic values, not their entertainment value. Media reform, including updating national regulators, competition law and oversight mechanisms, should be examined in cooperation with competition authorities, civil society and journalists. Local journalism should be properly funded.
Move data governance to the centre of the debate
This debate is all the more urgent as international tech companies expand into new markets. In October 2018, for instance, new US downloads of Chinese company ByteDance’s TikTok app surpassed those of Facebook, Instagram, YouTube and Snapchat. China is also chairing the International Telecommunication Union (ITU) until 2022, enhancing its leverage in setting global internet standards. The US and the EU have to strike a difficult balance between protecting (if not enhancing) citizens’ data privacy on the one hand and adopting innovation-enabling policies on the other. The Big Data & Digital Clearinghouse established by the European Data Protection Supervisor indicates that the EU is taking a multi-stakeholder approach. GDPR, although not perfect, has raised EU public awareness of the importance of personal data protections. The G20 Osaka Leaders’ Declaration also highlighted the important link between data and trust.
Common long-term recommendations
Reform political campaigning and electoral law
Both the US and EU member states should review the ethics of political campaigning and what constitutes legitimate and illegitimate influence, and should reform lobbying to avoid state capture by Big Tech. Digital intermediaries have already advised political campaigns. The links between domestic actors and agents of foreign influence need to be investigated. Statements from politicians could be fact-checked as well, to improve the level of political discourse and restore trust.
Invest in political security resilience
Disenfranchisement renders citizens more vulnerable to manipulation, so both the US and EU member states should put political engagement at the forefront of their efforts. Data and AI should be employed to meaningfully inform and engage citizens, for example by using Natural Language Processing to translate and coordinate debate across different countries and quickly process ideas, similar to the way CitizenLab did with the Youth4Climate movement.
Tailor-made media and digital literacy programmes
Each digital literacy initiative will demand clearly defined objectives, grounded in thoroughly researched conclusions about the information individuals need in order to make informed decisions with regard to both legacy and digital media. Individuals of different ages, backgrounds and nationalities may need different sets of skills as well as different training approaches, which is why a national and context-specific approach to the issue is more appropriate.
Digital literacy programmes should target not just young students but all age groups, as well as politicians themselves. Initiatives should also enhance public and policymakers’ awareness of how data is collected, processed and managed, as well as of the affordances and even false promises of emerging technologies such as AI, deepfakes and ‘neuropolitical’ consulting, among others. Following the example of Big Tobacco companies being ordered to pay for anti-smoking advertising, some have suggested that a similar model could be applied to the digital domain by compelling tech giants to raise awareness and invest substantially in digital literacy programmes.
Embrace technological innovation as a tool not a goal
Apart from its deployment in spotting disinformation campaigns, AI can facilitate the rapid exchange of information between the US and the EU, as well as international stakeholders. The US and the EU should invest resources in research into how emerging technologies can empower citizens and democracies. Governmental departments across the board should appoint AI experts to advise on how emerging technologies can enhance institutional capacity and contribute to societal resilience.
Convene an International Digital Assembly
The Tech Accord and the Digital Peace Campaign focus on cybersecurity, but their call for global engagement in setting cyber norms relates to the issue of disinformation too: the health of democratic discourse depends on it. The US and the EU could bridge the work of Hybrid CoE, the RRM, the TCEI, the Internet Governance Forum (IGF) and the High-Level Panel on Digital Cooperation. As the panel noted, the current digital cooperation architecture is complex but not necessarily efficient; its suggested ‘IGF Plus’ model for cooperation merits further attention. In the context of the ITU, the US and the EU should take the lead in establishing a long-term, open, international and interdisciplinary forum where developers, civil society, journalists, researchers and policymakers can develop a code of ethics for tech development and harness the affordances of technological innovation to support and promote democracy. Evidently, the work of the UN’s Group of Governmental Experts and the Open-Ended Working Group on cyber norms should also inform EU–US cooperation on disinformation. The two allies should lead in setting a roadmap with the ITU towards convening an International Digital Assembly that would address the most pressing issues of international digital governance, such as information pollution, manipulation and data governance. The cooperation frameworks recommended by the High-Level Panel on Digital Cooperation, which the ITU has committed to refine, could be used in this process.