Kate Jones
From a human rights perspective, the Oversight Board’s decision is a strong one, and not at all surprising. The board decided Facebook was right to suspend the former president’s access to post content on Facebook and Instagram, but wrong to make that suspension indefinite.
It found that Donald Trump’s posts violated Facebook’s community standards because they amounted to praise or support of people engaged in violence. Applying a human rights assessment, it also found that Facebook’s suspension of Trump was a necessary and proportionate restriction of his right to freedom of expression.
However, the board also found Trump’s indefinite suspension was neither in conformity with a clear Facebook procedure nor consistent with the company’s commitment to respect human rights. The ruling requires Facebook to make a new decision on the future of Trump’s account, grounded in its rules.
While opinions on this result will differ, the Oversight Board’s push for clear and accessible rules, and for respect for human rights in their implementation, is a welcome addition to Facebook’s operations.
But the Oversight Board’s powers are limited to content moderation – Facebook declined to answer the board’s questions about amplification of Trump’s posts through the platform’s design decisions and algorithms. This limitation on the board’s role should be lifted. It is in content amplification, not just content moderation, that Facebook should face scrutiny and accountability for the sake of the human rights of its users.
Fundamentally, human rights is not a veneer that can mask or legitimize underlying power dynamics or public policy; those must still be assessed on their own terms.
The Trump/Facebook saga does highlight the vast power that Facebook and other major social media platforms hold over political discussion and persuasion. By granting or denying political figures a platform, and by amplifying or quietening their voices, Facebook has the power to shape politics, electorates, and democratic processes. Improving content moderation through the Oversight Board, although important, does little to constrain that power.
Facebook itself, unlike a government, has no accountability to the general public, and the Oversight Board must not distract us from the need for a full conversation about the extent to which Facebook’s power is appropriately held and properly wielded.
Emily Taylor
This decision marks a coming of age for Facebook’s content moderation process. For years, decisions to take down content or ban users have been opaque, made by a human workforce that Facebook and other platforms have been hesitant to acknowledge. The platforms have also worried that being seen to exercise an editorial function might jeopardize the legal protections that shield them from liability for user-generated content.
When the Oversight Board was first proposed, observers questioned whether a body funded by Facebook could properly exercise a legitimate appeals function. Now there is a reasoned ruling that partly supports the decision to de-platform a serving president but also takes issue with the indefinite nature of the ban.
Facebook specifically asked the Oversight Board to consider the challenges that arise when the person involved is a political leader. The board concluded that Trump’s ‘status as head of state with a high position of trust not only imbued his words with greater force and credibility but also created risks that his followers would understand they could act with impunity’. The storming of the US Capitol, and the role President Trump played in stirring up the violence, underlined that political leaders’ words can motivate others to take harmful actions.
Just as the events of January 6 remain shocking, it remains shocking that private platforms have exercised the power to curb the speech of a US president. It is equally shocking that the platforms sat back for the previous four years, acting only in the final days of the transition.
The board’s decision is an evolution in private-sector content moderation, with a diverse board giving a reasoned opinion on a Facebook decision. But to fully comply with the principles of open justice, board decisions should include more detail on the individuals who made them. At present, it appears all members of the board review each decision, but it is not clear which individuals were involved in its drafting, or whether they were free from conflicts of interest. If the process is to gain respect as truly independent oversight of the platform’s decisions, greater transparency over the identity of decision-makers will be needed.
Mark Zuckerberg has expressed concern about Facebook becoming an arbiter of truth or free speech, and, overall, the difficulty of having private companies manage the application of fundamental rights on their platforms has not been solved. Just because companies have the financial resources to do so does not mean they necessarily should.
Yet no other international governance or arbitration system has emerged to handle the complexities of platform power over speech. In that vacuum, the Oversight Board’s decision is a welcome step.