Appendix 1: Methodology
Identification of laws and proposals for analysis
The range of laws and policies around the world that apply to online platforms in some way is vast, including, among others, consumer protection regulations, competition law, media and broadcasting regulations and data protection legislation. For the purposes of this study, research focused on laws that relate specifically to the question of platforms’ intermediary liability for user-generated content: namely, on laws that impose legal requirements on how platforms should moderate content. Narrowing the scope of the mapping to laws that introduce such requirements allowed us to explore in greater depth how regulators are currently grappling with the novel issues that platforms pose compared to other types of businesses. More developed regulatory systems will also contain provisions affecting platforms elsewhere in their legal frameworks, but these fall outside the focus of this study.
To map out relevant laws and policies, researchers conducted an initial mapping of laws and proposals that might potentially be in scope as of October 2022, drawing on:
- Global Partners Digital’s existing body of research and monitoring of platform regulation laws around the world;
- Published mappings or reviews of global platform regulations or intermediary liability legislations;
- Feedback from regional and local experts; and
- Desk research using national legal gazettes and records of legislation.
In some cases, translation tools were used to assist with conducting the mapping, which included laws currently in force as well as draft legislation and proposals. (Voluntary codes of practice or self-regulatory initiatives were not included.)
Through this initial exercise, researchers identified 137 laws and proposals across 95 jurisdictions as potentially including requirements relating to how online platforms moderate online content. These examples were taken forward for further examination.
Of this initial group of 137 laws and proposals, 82 were excluded from further analysis as out of scope for the following reasons:
- Five laws were excluded because they had already been repealed and were no longer in force;
- 11 proposals were excluded because they had been stalled for some time or had been identified by local researchers as highly unlikely to pass into law due to lack of support;
- 21 laws and proposals were excluded because they contained only a simple intermediary liability clause exempting platforms from liability for user-generated content and/or content moderation decisions;
- 17 laws and proposals were excluded because their requirements on online platforms did not relate specifically to the moderation of online content (for example, laws which focused on access, data privacy or anti-monopoly);
- 17 laws and proposals were excluded because, while they did relate to management of content online, the requirements were applicable only to internet service providers, media and press outlets or regulators rather than online platforms themselves; and
- 11 were excluded because publicly available versions or reliable translations could not be found at the time of research, making it impossible to analyse whether they included requirements on platforms’ content moderation.
The remaining 55 laws and proposals (spanning 41 jurisdictions) were taken to be in scope and analysed in full. Where laws or proposals were amendments to or regulations under an existing law, these were investigated in tandem as one holistic regulatory framework. Of the 55 regulatory frameworks examined, 35 were already in force as of October 2022, 11 had been introduced as bills but not yet passed, and nine were draft proposals or frameworks. Appendix 2 contains a full list of regulations considered.
This list is intended to be a robust snapshot of laws and proposals placing requirements on how platforms should moderate content as of October 2022. It is by no means exhaustive, however, and there may be other laws and proposals with requirements on how platforms moderate online content that fall outside this dataset.
Analysis of laws and proposals
The laws and proposals identified for analysis vary considerably in scope, approach and implementation. Some are hundreds of pages long and specifically focused on online safety and platform regulation, whereas others are just clauses in a broader piece of legislation. To be able to conduct quantitative analysis and to compare approaches across the whole group of regulations, researchers developed a taxonomy for analysis that could be applied to each law or proposal through a series of yes/no questions. The taxonomy was designed to capture trends and variation across platform regulations in terms of:
- The types and sizes of platforms the law includes in its scope;
- The nature and categories of online content the law relates to;
- The way that the regime is enforced, including through penalties and an independent regulator;
- The model of intermediary liability applied to platforms for user-generated content;
- The types of duties the law introduces for platforms in relation to content moderation or other relevant processes; and
- The degree of protection the law provides for freedom of expression.
The taxonomy consisted of 29 yes/no questions drawn from a preliminary evidence review. Researchers grouped these questions into six broad themes: 1) scope and governance; 2) penalties and sanctions; 3) content-based duties; 4) business-based duties; 5) considerations for freedom of expression; and 6) protections for users. The full set of questions under each theme is set out below, followed by an illustrative sketch of how a regulation’s answers can be recorded for analysis.
Scope and governance
- Does the regulation differentiate between types of digital platforms? (For example, between video streaming services and social network platforms?)
- Does the regulation differentiate between sizes of digital platforms? (For example, as measured by annual revenue or number of users or employees?)
- Is the regulation enforced by an independent authority?
- Does the regulation require a multi-stakeholder approach to platform governance?
Penalties and sanctions
- Does the regulation impose fines?
- Does the regulation threaten platforms with restrictions or blocking for non-compliance?
- Does the regulation threaten prison sentences for platform employees for non-compliance with content moderation requirements?
Content-based duties
- Does the regulation require platforms to remove prohibited content when ordered to do so by a court?
- Does the regulation require platforms to remove prohibited content whenever it is notified of such content?
- Does the regulation require platforms to proactively monitor for prohibited content?
- Does the regulation require platforms to remove prohibited content within a specific timeframe?
- Does the regulation tackle content which is already designated as illegal under other legislation?
- Does the regulation designate new types of content as illegal?
- Does the regulation require platforms to remove or deal with content that is not illegal?
Business-based duties
- Does the regulation require platforms to register their services with authorities?
- Does the regulation require platforms to establish a local office or local contact?
- Does the regulation require platforms to carry out human rights risk assessments?
- Does the regulation require platforms to report regularly on the performance of their content moderation systems?
- Does the regulation require platforms to report regularly on advertising revenue?
- Does the regulation require platforms to submit to independent audit?
- Does the regulation require platforms to store data locally?
Considerations for freedom of expression
- Does the regulation explicitly mention freedom of expression?
- Does the regulation reference platforms’ responsibility to consider freedom of expression in their operations?
- Are there regulatory exemptions for journalistic, scientific or public interest content?
- Are there limitations on the powers of regulators in line with freedom of expression safeguards?
Protections for users
- Does the regulation require platforms to publish terms of service?
- Does the regulation require platforms to implement complaints mechanisms?
- Does the regulation require platforms to implement appeals mechanisms?
- Does the regulation require platforms to notify users of ongoing complaints or appeals?
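To make the structure of the taxonomy concrete, the following sketch shows one way a regulation’s answers to the 29 questions could be recorded as a binary vector for later comparison. This is a minimal illustration only: the question identifiers, data structure and example values are hypothetical and are not taken from the study’s actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class RegulationRecord:
    jurisdiction: str
    name: str
    in_force: bool                 # law already in force vs bill or draft proposal
    answers: dict[str, bool] = field(default_factory=dict)  # question id -> yes/no

    def as_vector(self, question_ids: list[str]) -> list[int]:
        """Return the answers as a 0/1 vector in a fixed question order
        (unanswered questions default to 'no')."""
        return [int(self.answers.get(q, False)) for q in question_ids]

# Illustrative question identifiers, shortened here; a full implementation
# would list identifiers for all 29 questions across the six themes.
QUESTIONS = [
    "scope_differentiates_platform_types",
    "scope_differentiates_platform_sizes",
    "enforced_by_independent_authority",
]

example = RegulationRecord(
    jurisdiction="Example jurisdiction",
    name="Example online safety law",
    in_force=True,
    answers={
        "scope_differentiates_platform_types": True,
        "enforced_by_independent_authority": False,
    },
)
print(example.as_vector(QUESTIONS))  # -> [1, 0, 0]
```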
These questions were designed to capture variation in the approaches examined in a quantifiable way and to make commonalities and differences across jurisdictions and regimes clear. However, this method had the following limitations:
- The nature of the questions themselves was informed by the researchers’ own experience and understanding of platform regulations. For example, the focus was informed more by expertise on freedom of expression standards and safeguards than by expertise on children’s rights or minority rights.
- Not all features or relevant details of each platform regulation are represented in the 29 questions. For example, the taxonomy did not capture whether the platform regulation includes requirements to trace the first sender of a message or to monitor private or encrypted communications.
- The binary nature of the 29 questions, while necessary in order to aggregate data, does not capture qualitative details that might also be relevant. For example, while the taxonomy shows whether a platform regulation differentiates between types or sizes of platforms, it does not capture or compare what those categories are across different legislative frameworks.
- The taxonomy does not differentiate between proposals not yet passed and laws already in force.
- In some cases, where official English translations were not available, translation tools were required to interpret relevant clauses, which could have introduced some errors in analysis.
- Only the text of each law was analysed, rather than any implementation or enforcement in practice.
- Due to the fast pace of change of this regulatory area, by the time of publication some of the draft or proposed laws may have been amended since the analysis was conducted. A small percentage of the responses in the dataset may therefore no longer be accurate.
Despite these caveats and limitations, this dataset still serves as a useful starting point of analysis for core elements of the 55 laws and proposals considered.
Thematic analysis of approaches to platform regulation
With 29 data points for each of the 55 laws and proposals (and thus 1,595 data points in total), data analysis tools were used to identify and draw out similarities and differences between regulations from the raw data and to provide insights for further analysis.
Two regulations that had matching answers to the 29 yes/no questions were deemed 100 per cent similar, while two regulations that differed on all 29 questions were deemed 0 per cent similar. Between these two extremes, most regulations had at least some answers in common. Regulations were mapped by measuring their similarity using a Jaccard similarity score across the 29 binary attributes, with the similarity score used as an edge weight in an open-source network visualisation package (Gephi) using a graph layout algorithm (ForceAtlas2). This allowed for an at-a-glance analysis of how similar or different the various regulations were: the further apart two regulations were on the map, the less similar they were.
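For illustration, the similarity step can be reproduced with a short script. The sketch below is an assumed reconstruction rather than the study’s actual code: it uses hypothetical regulation names and shortened answer vectors, computes pairwise Jaccard similarity, and writes a weighted edge list in a CSV format that Gephi can import before a force-directed layout such as ForceAtlas2 is applied.

```python
import csv
from itertools import combinations

def jaccard(a: list[int], b: list[int]) -> float:
    """Jaccard similarity of two equal-length 0/1 vectors: shared 'yes'
    answers divided by questions answered 'yes' by either regulation."""
    both = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    either = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return both / either if either else 1.0  # two all-'no' vectors treated as identical

# Hypothetical answer vectors keyed by regulation name; in the study each
# vector would hold 29 values, one per yes/no question.
vectors = {
    "Regulation A": [1, 0, 1, 1, 0],
    "Regulation B": [1, 1, 1, 0, 0],
    "Regulation C": [0, 0, 0, 1, 1],
}

# One weighted edge per pair of regulations; Gephi can import this CSV as an
# edge table, after which a force-directed layout pulls heavily weighted
# (more similar) regulations closer together on the map.
with open("similarity_edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Weight"])
    for (name_a, vec_a), (name_b, vec_b) in combinations(vectors.items(), 2):
        writer.writerow([name_a, name_b, round(jaccard(vec_a, vec_b), 3)])
```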