A small number of platforms set global news agendas, culture and norms. But governance of those platforms is fragmented. Coordination and interoperability would strengthen states’ ability to deal with corporate power, while also reducing compliance burdens on industry.
Over the past decade, large platforms like Amazon, Facebook, Google, Instagram, TikTok and WeChat have become a ubiquitous presence in daily life. In the process, these largely Chinese and US digital platforms have renegotiated the relationship between people and the world around them.
Governments were initially slow on the uptake, but the governance of digital platforms and services is now a central priority. An array of government bodies, technology firms and civil society actors have contributed to a patchwork of principles, laws and best practices that attempt to reflect the new primacy of digital technologies in shaping our lives. Those groups' attention has most frequently been focused on the twin poles of data protection and online harm.
National governments are motivated by a diverse set of ambitions in relation to platforms. For some, the hegemony of those platforms over their citizens’ experience of the internet has challenged the social contract, opening a rift between citizen expectations and government capacity. For others, the spread of platforms has proved an unwelcome challenge to central power. Accordingly, national governance frameworks vary significantly and reflect the diversity of societal concerns, challenges and cultural and political approaches.
The US, for example, follows its free market and free speech traditions. Its hesitancy to regulate platforms has been a defining feature of their growth: US-based platforms have operated largely free of intermediary liability, able to set their own rules and bearing minimal legal responsibility for what their users do or say. This has created tension as those platforms spread beyond the US, particularly with European authorities that are pursuing a more vigilant, co-regulatory model with greater focus on balancing liberties, in contrast to the US focus on freedom of expression.1
In countries where platforms might be perceived to challenge state hegemony over information, regulatory and legislative responses have tended to be stricter. Governments like those of Nigeria and Singapore are increasingly enacting laws to exert greater state control over online space. Under such regimes, platforms can be required to proactively monitor and filter broadly defined categories of online content, make user data available to authorities indiscriminately and reduce user-level protections.
Fitting approaches to platform regulation into neat categories is an imperfect process. Paradoxically, countries with a poor track record on human rights have sometimes mandated platforms to carry out human rights audits, as seen in China. Elsewhere, comparatively liberal platform regulation may include employee liability or proactive content moderation requirements, as in New Zealand or India respectively. India in particular highlights the added difficulty of judging a regulatory approach in isolation from how it is used and applied domestically, and from the strength of the oversight and democratic protections that accompany it.
Whether one approach can or will win out over others, or whether diverse approaches can co-exist, remains to be seen. Efforts to find commonality across regulatory regimes – either from groups of countries or from international bodies like the OECD – are in their infancy. Outside of highly technical spaces such as standards-setting bodies, there is no single major international institution through which platform regulation is currently negotiated. The idea of harmonizing global regulatory approaches to the internet is controversial: the one-size-fits-all approaches that have defined the design of digital platforms to date have regularly failed to account for diverse local contexts, sometimes with catastrophic results. For instance, digital and social media platforms have been accused of high-profile failures in stewardship in Myanmar, Somalia and, most recently, during the Israel–Hamas conflict.
As well as playing an integral role in underpinning global business, communication and community, the web represents, for many people around the world, the most powerful tool for maintaining values such as freedom of expression and access to information, and for coordination on global challenges like climate change and sustainable development. But new digital jurisdictions have mapped poorly onto existing political and legal institutions, creating significant new challenges for sovereign nations seeking to protect their citizens, enforce their laws and set the fundamental norms of the societies they govern.
While the internet and its benefits should not be equated with or reduced to a handful of large digital platforms, such platforms constitute the main – and, sometimes, only – entry point to the digital space for many users across the world. Existing and upcoming national regulations on platforms will therefore have a direct impact on citizens' experience and access. By extension, they will also partially define how open the global internet will be in the future. Striking a balance between the substantial benefits of an open internet on the one hand and the push by countries to exercise their power online on the other is the policy challenge for future platform regulation. Addressing technology governance and devising the appropriate policy and regulatory responses will require global cooperation. The internet could still form the basis of such cooperation. But a jurisdictional, fragmented internet threatens to undermine this promise at a time when it may be needed most.
This research paper – produced in partnership by Chatham House and Global Partners Digital – examines the divergent approaches to platform regulation to date. These approaches range from limited and independent regulation such as in Canada, to much firmer regimes aimed at preserving social order, like that in Belarus. Some approaches threaten companies with fines and others place liability with their directors. Some focus on the protection and promotion of civil liberties. Meanwhile, others look to empower users through technical tools, or gloss over the empowerment or protection of users entirely and lay out lists of illegal content for platforms to tackle.
The paper takes stock of where we are today, lays out where we might aim to get to tomorrow, and considers how we might measure the distance between the two.