Digital platform regulation: Governing the ungovernable

How to regulate digital platforms, the last fortress for democratic and participatory civic engagement.

Expert comment

Yasmin Afina

Former Research Fellow, Digital Society Initiative

Digital platform regulation is in the spotlight again. Amid questions around competition, allegations of shortcomings in stewardship and lawsuits alleging negligence, calls by governments, the media and civil society to regulate digital platforms are louder than ever. But this is a discordant chorus: its volume should not mask the wide divergence in approaches to platform regulation playing out around the world.

Allegations have been levelled at platforms for their role in amplifying hateful content during the violence in Myanmar. More recently, a lawsuit has been filed in Kenya against Meta on the basis that its recommendation systems magnified hateful and violent posts in the context of the Ethiopian civil war.

Elon Musk’s Twitter takeover has served as a further catalyst for wider discussions, notably in the press, on regulatory frameworks surrounding digital platforms. Twitter’s compliance with the EU’s Digital Services Act, due to enter into force in 2024, has been called into question: the platform’s censorship of journalists, and anticompetitive practices mandated by the billionaire, have drawn criticism.


But the challenges posed by digital platform regulation cut both ways. Not all regulation is good regulation. In the absence of strong democratic institutions and appropriate checks and balances, regulations are likely to have a chilling effect on fundamental rights such as freedom of expression.

Concerns arise when governments leverage regulatory tools to reaffirm control over platforms without due process. In July 2022, the Economic Community of West African States (ECOWAS) Court ruled that Nigeria’s ban on Twitter was a violation of freedom of expression, access to information and press freedom.

In some parts of the world, the rise of digital platforms has renegotiated long-standing settlements on the right to freedom of expression, which is deeply rooted in the International Covenant on Civil and Political Rights (ICCPR) and regional human rights treaties.

Striking the right balance

These trends and developments attest to three realities: First, the regulatory landscape surrounding digital platforms is a crowded – and increasingly fragmented – space. Second, as opposed to ‘classic’ power relationships between the state and individuals, digital platforms have grown into major and influential players in setting the boundaries and norms of freedom of expression and other rights. The social contract is now a negotiation between three parties.


Third, the question of digital platform regulation carries high stakes for upholding democratic values and human rights, as platforms now serve as the ‘digital public square’. It may even be argued that, in some cases, digital platforms further the cause of democratic and timely participatory civic engagement.

Rapid and unfettered growth, together with a relative lack of licensing requirements (unlike other forms of media such as radio and television broadcasting, digital platforms are generally subject to none), have propelled platforms into a critical societal role, and in doing so their terms of service have reshaped norms around freedom of expression.

It is precisely because of the power they hold that digital platforms are under such scrutiny: the question in most governments is no longer whether to regulate, but how.

Regulatory trends

States and other governing bodies have been approaching the question of digital platform regulation differently, leading to a fragmented regulatory landscape across the globe. Even within geographic or linguistic regions, legislation is often disparate and disconnected.

In Europe, the European Union’s Digital Services Act requires clear terms of service and redress systems for users; the publication of transparency reports; and the appointment by each member state of an independent national regulator as digital services coordinator.

But neighbouring countries may choose not to align themselves with this approach. Belarus, Russia and Turkey have all put forward further restrictions on politicized content (e.g., content deemed offensive to public morality) and potential sanctions on individual platform employees or directors.

Globally recognized legal or normative frameworks, like international human rights, have yet to be translated into digital spaces. Industry attempts to set rules – Facebook’s Oversight Board, for instance – have been welcomed but do not appear to have deterred governments from taking matters further.


Whether regulatory harmonization is feasible, or desirable at all, remains debatable, particularly in light of countries’ respective priorities, norms and legislative landscapes. It remains to be seen how platforms will manage regulatory regimes as divergent as those in Ireland and Indonesia, or manage the expectations of lawmakers in the US, which most platforms call home.

Risk mitigation and solutions

There remains little international agreement on a model for regulating digital platforms, and it is unrealistic to hope for a ‘one-size-fits-all’ approach in light of diverse local democratic contexts. Local laws and customs, sociopolitical realities on the ground, the standing of human rights approaches, and the power relationships between the state, the people and other private entities all differ from country to country, and all inform national approaches to platforms.


Nevertheless, policymakers, legislators, the tech industry and other key stakeholders must take into account the following considerations to mitigate and reduce risks of human rights violations, and ultimately foster digital platforms and technologies made by all and for all:

  1. The right balance must be struck to provide an appropriate regulatory framework, taking into account: 1) human rights compliance, particularly with regard to freedom of expression, freedom of information and privacy; 2) the need for safeguards against risks of harm, in particular to vulnerable users and groups (e.g., children, minorities, etc.); and 3) the power relationships between powerful stakeholders (i.e., states, platforms, shareholders) on the one hand and the population on the other, which must be kept in check.
  2. Discussions and deliberations surrounding the regulation of digital platforms, as well as the establishment of regional and international standards, or soft law, should be inclusive and multistakeholder in nature. Governments should demonstrate greater political will to include civil society when shaping the regulatory landscape surrounding digital platforms.
  3. Stakeholders with significant resources should facilitate and pave the way for inclusive, multistakeholder discussions at both the national and international levels, in addition to leveraging those resources to improve general understanding of the dynamics and trends in a crowded and fast-paced space.
  4. Digital platforms, in particular those with significant presence and influence over the public, need to acknowledge the important role they play; take responsibility for their approach to content moderation while preserving and safeguarding human rights norms; and engage with users across different regions in a more equal, non-discriminatory manner. Processes established and/or led by the private sector to implement digital platform regulation must maintain a degree of democratic oversight – which must be inclusive and will inevitably lead to uncomfortable, but necessary, conversations.

A longer report on the regulatory landscape surrounding digital platform governance will be published later this year by the Digital Society Initiative. Contact us if you wish to engage with us on this topic, and/or would like to receive an update on the paper’s publication.