Dr Robin Niblett CMG
Ladies and gentlemen, welcome to this special session, co-hosted by Chatham House, the Global Internet Forum to Counter Terrorism and Microsoft. We’re delighted to welcome you to this session, which we’re running in co-ordination with, and at the same time as, the 75th UN General Assembly, and we’re going to be discussing the role of tech companies in countering terrorist use of the internet, an incredibly important and, sadly, very timely topic, but one that we have an excellent series of speakers engaging with us on, and I’ll introduce them in a minute.
But I do want to just let you know that this session is on the record. It is being livestreamed, and it is being recorded, so be aware that this is very much a public event, as I think it should be, given the importance of the topic we’re covering. After we’ve heard opening remarks from our keynote speaker and our panel, you’ll have a chance to ask questions. Just to let you know, we’re not using the ‘Chat’ function for questions. Please use the ‘Q&A’ function, which I’m sure you’re all expert enough to use by now, at the bottom of your screen. We will have the opportunity to unmute you to ask your questions, if you are able to; otherwise, I can pose the questions on your behalf to the panel when we get to that part of the programme.
Needless to say, this discussion takes place at a time when the threat of terrorism and violent extremism around the world continues not only to expand and proliferate, but to intensify and diversify as well, along with the capacity of these groups to take advantage of social media platforms and the digital world, whether to recruit, to communicate, to fundraise, to plan, or to promote. And, as we saw, tragically and terribly, in the course of 2019 with two attacks, but most visibly in the Christchurch attack, terrorist organisations and individuals have made full use of what they see as the fragmentation of various platforms, navigating between them to achieve their goals.
So, what we hope to do is to have an interesting conversation, particularly about how tech companies and others in government and civil society are coming together to try to counter those threats. We’ve got about an hour and a quarter from now, and I’m not going to try to do all the time zones, there are so many people on different ones. We hope to have an enlightening conversation from a range of different perspectives, but also to hear about solutions and to get a sense of whether a multisectoral community is coming together to address this challenge in a way that no individual sector, whether it is the tech companies, the governments, or individuals in civil society, can do alone. So, that’s really the objective of this meeting: what steps can be taken, whether intergovernmentally or institutionally, and what can we all do to co-operate better to create a safer environment than we have today, one that cannot be abused by terrorist actors from around the world?
Now, as I said, we’ve got a packed programme, and I’m going to try to keep a very tight eye on it, so that we don’t lose the flow. But, just so you know how we’re going to proceed: in a minute, I will be inviting Michele Coninsx, the Assistant Secretary-General and Executive Director of the Counter-Terrorism Committee Executive Directorate at the UN – I say ‘here’ because it’s UNGA week – to give us some opening keynote remarks. Then we’ll have a panel composed of Nick Rasmussen, who’s the Executive Director of the Global Internet Forum to Counter Terrorism, GIFCT, as I will shorthand it, and as I know he will later on, and I’ll introduce each of these people a little bit more when I turn to them. We’ve got Erin Saltman, who’s Head of Counter-Terrorism and Dangerous Organisations Policy, EMEA, at Facebook; Samir Saran, the President of the Observer Research Foundation, joining us from Delhi; and we have Fionnuala Ní Aoláin, who’s the UN Special Rapporteur for the Protection and Promotion of Human Rights While Countering Terrorism. So, that is the expert group that we have with us. We’ll have some closing remarks from Courtney Gregoire of Microsoft at the end, as well, but that’s the group we’ve brought together to have this conversation with you all now.
And so, to keep an eye on time, I’m going to invite Michele Coninsx to give us some opening remarks in a second. As I mentioned, she is Assistant Secretary-General at the UN, overseeing the Executive Directorate on countering terrorism, but, until she took up this role, she was President of Eurojust, which, as those of you who know the EU will know, is, in essence, the member state intergovernmental organisation overseeing judicial co-operation across the members of the EU. She was a former Prosecutor General at Eurojust, as well, and a Public Prosecutor in Belgium. So, she really brings a lot of national, supranational/regional, and now multilateral experience to this complex topic. Michele, we really appreciate you making time, and we know how incredibly busy these weeks are for those of you working at the UN. Delighted you can be part of this special session that we’re co-hosting today. Over to you.
Michele Coninsx
So, good afternoon, good morning, ladies and gentlemen. Thank you so much for this nice introduction, and thank you so much to Chatham House for organising this timely webinar on the responsibility of tech companies in responding to terrorists and violent extremists’ misuse of their platforms. I also wish to seize this opportunity to convey my deepest appreciation for the longstanding co-operation that Chatham House has with the organisation that I am leading, the Counter-Terrorism Committee Executive Directorate, known as CTED.
Let me start by telling you how relevant this topic is to our mandate. The Counter-Terrorism Committee’s comprehensive international framework to counter terrorist narratives and Security Council Resolution 2354, both stemming from 2017, stress that public-private partnerships are needed to counter incitement to commit acts of terrorism, and this includes countering terrorist narratives and online propaganda, using both disruptive and preventive measures, as well as counter-messaging. And among the stakeholders urged to engage in whole-of-society, human rights-compliant, gender-sensitive approaches, the tech sector was identified in particular as a partner for engaging on counter-narratives, alternative narratives, positive messaging, and strategic communication.
And since then, tech companies, civil society, and governments have been working together to prevent incitement and strengthen society’s resilience. This was the approach taken by CTED in 2017, when it launched Tech Against Terrorism, or TAT, an initiative aimed at engaging with the tech sector, including smaller start-up platforms, to understand the risks that it faces in these areas. TAT works to understand and anticipate terrorist use of all online technologies, including social media, filesharing, messaging, financial platforms, terrorist-operated websites, and decentralised file servers. It engages with high-risk platforms and mentors them, so that they are better equipped to design effective policies, processes, and systems to tackle terrorist abuse of their services.
And, over the years, CTED and TAT have supported the Global Internet Forum to Counter Terrorism at all stages of its development, and so we are delighted today to have its Executive Director with us, but also to see the formalisation of GIFCT this year, and we really look forward to supporting GIFCT, wherever possible, in its future work with its new full-time team. The Security Council has welcomed the approach taken by Tech Against Terrorism and GIFCT in several resolutions.
So, ladies and gentlemen, the internet continues to provide unprecedented global access to information. We all agree on that. However, I think it’s also fair to say that the world’s early idealism about its impact has somewhat faded. The public sees hate, abuse, and disinformation in content generated by certain users, and governments see terrorist recruitment or unwelcome dissent and opposition. And civil society organisations see the outsourcing of public functions and public responsibilities, such as the protection of freedom of expression, to unaccountable private actors.
And what responsibilities do companies have to ensure that their platforms do not interfere with rights guaranteed under international law? What standards should they apply to content moderation? Should states regulate commercial content moderation, and, if so, how? To mitigate threats to freedom of expression, the law expects states to demonstrate both transparency and accountability; should we expect the same of private actors? What do the processes of protection and remedy look like in the digital age? These and many other difficult questions present themselves not just in the context of counter-terrorism, but in all areas of our lives, which are increasingly plagued by the effects of misinformation and disinformation. I’m looking forward to the interventions of the panellists. I’m sure that a lot of those questions will be directly addressed.
The need to counter terrorist online content is, of course, not a new issue. Security Council Resolution 1373, a landmark resolution that saw the light of day after the 9/11 attacks, noted the use of new technologies by terrorists. Clearly, however, with the rise of ISIL (Da’esh) in 2014 and the foreign terrorist fighter phenomenon, the issue took on a new dimension, and it has remained high on the agenda of policymakers, counter-terrorism practitioners, the media, and civil society.
Extreme right-wing groups and individuals are also using ICT in an increasingly sophisticated manner to radicalise and recruit. You alluded to it, Mr Chair: they use mainstream social media platforms to target new audiences that are not yet part of their movement, exploiting the appeal of cultural pushback. They also use non-mainstream platforms for in-group communication and radicalisation, and extreme right-wing terrorists have adapted their operations to new online spaces and tools, often being early adopters.
For example, the perpetrators of some recent terrorist attacks, including the attacks in Christchurch, in El Paso, but also in Bærum, in Norway, announced their plans on 8chan or other similar online forums. And, in some cases, they even tried to livestream their attacks on major platforms, including gaming platforms, to maximise both the publicity and the impact. Notable among the efforts made to counter these activities is the Christchurch Call to eliminate terrorist and violent extremist content online, which was launched by New Zealand and France in May 2019, and is supported by many member states and major online service providers.
COVID-19, let’s also tackle that issue, has added to the complexity of the situation and the discussion. We have seen extreme right-wing groups seek to exploit the COVID-19 pandemic to spread misinformation, mal-information, and disinformation, again in an attempt to radicalise, recruit, and inspire plots, and recent research shows that the number of users joining Telegram channels associated with extreme right-wing racist ideologies has grown since the onset of the pandemic. And, in comparison with other terrorist groups, including ISIL and Al-Qaeda, extreme right-wing terrorist groups present relatively distinct organisational structures and patterns of violence, including significant roles for lone actors, who often discover their ideological justifications, tactical inspiration, and social support in online communities, and this reinforces the need for tailored responses that specifically address those characteristics.
And Tech Against Terrorism, which is now funded by a grant from Public Safety Canada, is also developing the so-called TCAP, the Terrorist Content Analytics Platform, which is intended to be the world’s most comprehensive analytics platform and database of verified terrorist content. The purpose of TCAP will be to facilitate the moderation of terrorist content by smaller platforms and provide a safe, secure, and effective working environment for academics and researchers. Civil society will be able to access the platform to verify content and challenge decisions, and TCAP is intended to be a collaborative content moderation platform that emphasises public-private partnership and transparency.
In addition to content relating to Al-Qaeda and ISIL, it will also include content relating to extreme right-wing terrorist groups listed by member states. It will also provide training sets for machine learning and classification. It will be launched in beta form this year and will be fully available for platforms, academics, civil society, and Data Scientists. Through this and other initiatives, Tech Against Terrorism will continue to seek ways to help platforms scale up their efforts to moderate terrorist content and to help academics and researchers to strengthen their analyses.
Finally, you alluded to it, Mr Chair: this year marks the 75th anniversary of the founding of the United Nations, and next week will be its first virtual General Assembly debate. Strengthening multilateralism and international co-operation will be an important theme for member states’ deliberations. Likewise, multisectoral partnership should be a key phrase for us in addressing the responsibility of tech companies in responding to terrorist and violent extremist misuse of their platforms. At CTED, we really look forward to further strengthening our partnership and co-operation in this very critical area, and we look forward to the upcoming debate. Thank you so much.
Dr Robin Niblett CMG
Thank you very much, Michele. Thank you for those opening remarks and for updating us, I think importantly, on some of the new steps being taken, in particular that reference to TCAP and content analytics, but also some of the other steps being taken at the UN and multilateral level. And I think you’ve put on the table some of the important questions about the dilemma of governments, as you said, in essence outsourcing the protection of freedom of expression to the private sector, and, therefore, under what standards, what regulatory frameworks, and what requirements for transparency and accountability. Obviously, these are some of the key questions that we will be addressing and posing in a minute. And, with your very effective set-up of the state of play from the UN’s perspective and your own, I can turn now to our panel, which I introduced briefly earlier on.
And we’re going to be kicking off with Nick Rasmussen, who is the inaugural Executive Director of the Global Internet Forum to Counter Terrorism. He is also a very experienced individual to be taking this role on. He was formerly Director of the National Counterterrorism Center for the US government, where he led over 1,000 people from across the intelligence community and the Federal Government in intelligence assessment of counter-terrorism and terrorist risks. He was also Senior Director for Counter-Terrorism in both the George W. Bush and Obama administrations, and worked prior to that in the State Department, in particular on the Middle East. So, again, as we found with Michele, we’ve got people with often a long history in this space.
Nick, I’m going to turn to you to share some initial thoughts on the purpose of the Global Internet Forum, and maybe you could say a word or two about why it was set up, and how it’s adapting itself currently. Obviously, it was a private sector initiative initially, but I think it is expanding beyond that now, and you’ll let us know, and that’ll help set the scene, I think, for the conversation going forward. But thanks for joining us. We look forward to hearing more about how you and your colleagues are tackling this very important challenge. Over to you.
Nicholas Rasmussen
Thank you, Robin, and thank you to Chatham House for organising this session, and Michele for your comments, as well, which are a terrific lead-in to the panel discussion. Just to set the frame very briefly with some factual refreshing of memories: GIFCT was founded in 2017 by YouTube, Facebook, Microsoft, and Twitter, and, at this point, seven additional member companies have joined. Those companies have come together to prevent terrorists and violent extremists from exploiting the digital environment, in many of the ways that Michele outlined in her remarks. Now, in doing so, those companies have also committed to a whole series of things, among them a very public affirmation of their respect for human rights, particularly freedom of expression and privacy.
But, as you acknowledged, Robin, in your introduction, it’s timely for Chatham House to be doing this session, organised in this way around this particular topic, because GIFCT is, as you alluded to, in the midst of a transition process. As I noted, it was founded in 2017 by the four founding companies, and the decision was made last year by those companies to transition the organisation into an independent, freestanding, non-governmental organisation. So, as of ten weeks ago, I took up the role of Executive Director of that organisation. We are very much in the process of building an organisation, but those companies made an important statement, in my mind, by seeking to institutionalise the work of GIFCT in an independent organisation.
Ten weeks into my role as Executive Director, I’ll just offer a few initial thoughts on what we’ll want to accomplish with the work that we’re doing at GIFCT. First, I’ll start with an observation, though, more than an objective, and that is that the complexity of the multisectoral or multi-stakeholder environment is worth pausing and reflecting on for a moment. The set of stakeholders who have a critical need and a critical reason to be involved in discussions of this sort ends up being nearly infinite, or seemingly infinite. And I’m certainly finding that out as I use Zoom technology and other technologies to meet with a whole series of individuals and organisations from across civil society, government, academia, and the private sector, all of whom very much have something to bring to the table in this effort to achieve GIFCT’s objectives. And, again, reminding ourselves, the objective is to constrain the ability of terrorists and violent extremists to exploit the online environment, and to do so in a manner that is consistent with and that advances human rights, including freedom of expression and privacy. So, even in setting that objective, you can see just how far and wide the virtual table needs to reach, in order to involve everybody who needs to be part of that discussion.
So, I’ll just offer, as I said, a few quick initial priorities that I’m seeking to make progress on with GIFCT, to advance the work that I think we all agree is so important, and the first is a very simple one: expanding the organisation’s reach through its membership. Michele’s remarks, harkening back to those, teed up very nicely that the problem of online activity tied to terrorism and violent extremism is not simply a social media problem. It is also a problem of other technologies and other platforms, including, again as was mentioned, gaming platforms, filesharing platforms, and a whole host of others. So, the bottom line is that we at GIFCT will want to involve a wider array of companies, platforms, and technologies in our work, because I think much of the surface area of the problem we’re talking about sits there, and not solely on social media platforms.
Secondly, in doing this work, it is my firm intention to be as inclusive as possible. GIFCT, in making this transition, has created a number of working groups to tackle particular subsets of the overall problem. GIFCT has also created an Independent Advisory Committee to assist in its work and to advise both me, as Executive Director, and the operating board of GIFCT, and both of those vehicles, the working group structure and the IAC structure, give us the opportunity to bring in many, many more voices and to be far more inclusive in this conversation, in keeping with the multi-stakeholder environment.
I also think it’s important, and I intend, to bring a greater degree of transparency over time to GIFCT’s work. Again, a critique often levelled at the technology sector, and certainly at GIFCT in its early years, is, “We don’t understand what’s going on in those conversations. We are not privy to how decisions are made. We are not privy to which information is looked at in which fashion.” The phrase ‘content cartel’ has been used by an individual or two out there in civil society. I think these are all worthwhile perspectives that need to be listened to, and so I will look for ways to bring a greater degree of transparency to what GIFCT is doing, and I think that is an important objective.
And, lastly, when talking about objectives, it may sound trite to say so, but I think it’s important that we focus on concrete results. There are plenty of opportunities to have conversations about these issues, but, in the end, we all want to find a way to shrink the size of the problem, so that we can turn our attention to other problems. There are certainly other problems in the online domain that deserve our attention, as well. So, the working group structure that GIFCT has created, the set of academic partners that GIFCT has working through the GNET process, the work we’re doing with Tech Against Terrorism, much of which was outlined by Michele in her remarks: all of that needs to lead to actual concrete solutions that individual companies can then take and apply in their own work.
In the end, content moderation activities involve decisions made by companies, using their tools and their capabilities, but it’s important, in this multi-stakeholder environment, to try to lead the way and show how this should and can be done. And, particularly with smaller, newer platforms, there’s an opportunity to bring along newer entries into the marketplace in this effort to moderate and eliminate this kind of content from the online environment. But, again, our measure of success here will be concrete results, and so, I look forward to the rest of the conversation with Samir, with Erin, and with other colleagues, and, Robin, I’ll just pitch it back to you. I’m happy to jump into the Q&A when we get to that point.
Dr Robin Niblett CMG
And thanks very much, Nick, and thanks for being bang on and disciplined with your time. I will take advantage of that with one specific issue, ‘cause you mentioned how important it is to get to concrete results. Could you say a word or two about the lessons learned from Christchurch through to the response to the Halle attack? ‘Cause I understand that one of the things GIFCT was able to do was come up with some fairly basic ways of labelling the data and sharing that information, and it struck me, as I was reading ahead of this session, that that was the kind of example, I imagine, of what you mean by concrete results. Could you say a word or two about that, because it might give us a framework for the rest of the conversation coming up?
Nicholas Rasmussen
Certainly, Robin, and I’d certainly welcome Courtney’s views on this, as well, because Microsoft has been a big part of the process on this particular point I’m going to make. GIFCT now operates a Content Incident Protocol that allows participating companies to share information across company lines in a real-time, crisis-driven environment. So, imagine the horrific events that we saw in Christchurch happening again. If that were to happen today, I believe the companies, at least the largest social media platforms, would be in a better position to share information in real time with each other about what was appearing on their platforms, having the capacity to reach across those company lines, even Engineer-to-Engineer, so that they could narrow the amount of time it would take to identify that harmful material, remove it, and prevent it from being further disseminated.
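To make the mechanics of such a protocol concrete, here is a minimal, purely illustrative sketch in Python of what a cross-company incident alert could carry. The class, field names, and identifiers are all hypothetical assumptions; GIFCT has not published an API for the Content Incident Protocol. The one load-bearing idea, reflected in the remarks above, is that platforms exchange fingerprints of the offending material, never the material itself.

```python
# Purely illustrative sketch of a cross-company "content incident" alert.
# All names and fields are hypothetical: the real GIFCT Content Incident
# Protocol is an organisational process, not a published API.
import hashlib
import json
import time
from dataclasses import asdict, dataclass, field

@dataclass
class IncidentAlert:
    incident_id: str          # shared identifier for the live incident
    declared_at: float        # epoch seconds when the protocol was triggered
    content_hashes: list = field(default_factory=list)  # fingerprints of known variants
    notes: str = ""           # free-text context for responders

    def add_variant(self, content: bytes) -> str:
        """Fingerprint a newly observed variant so other platforms can match it."""
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self.content_hashes:
            self.content_hashes.append(digest)
        return digest

    def to_json(self) -> str:
        """Serialise the alert for distribution to participating platforms."""
        return json.dumps(asdict(self))

# Example: declare an incident and share the fingerprint of one video variant.
alert = IncidentAlert(incident_id="cip-2020-001", declared_at=time.time())
alert.add_variant(b"...raw bytes of the offending upload...")
print(alert.to_json())
```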
And so, again, I’ve said this publicly on more than one occasion so far: it’s hard to say you’ll never have a horrific event like Christchurch again. But what I’m confident of now is that the participant companies in GIFCT are better equipped to narrow the amount of time it would take to take real concrete action to limit the reach of that material. That is a process that will mature and get better over time. Each time one of these incidents happens, it’s an opportunity for lessons learned and for refinement of processes, tightening the connections between companies, making sure that information is shared in a form that can be used in real time. All of these end up being technological issues, as much as issues of will and willingness, and I see Courtney nodding her head, because she lives inside this environment at Microsoft, and I want to give a shoutout to Microsoft for taking a lot of the lead in this, even though it’s not necessarily their platforms where this material shows up.
But I think that’s just one concrete example, Robin, this Content Incident Protocol, that would allow GIFCT to have a real-world tangible effect, and we all saw in Christchurch what happens when there isn’t that kind of, you know, capability to respond in real time.
Dr Robin Niblett CMG
Yeah, thanks for taking that on specifically there. Now, I don’t know if Courtney wants to come in at some point. I’m definitely bringing her in at the end, as she knows, but, if she wants to come in beforehand on any of these points, let me know, Courtney. Otherwise, I will turn to Erin now. Erin, who, as I said earlier, is Head of Counter-Terrorism and Dangerous Organisations Policy, EMEA, at Facebook, focusing in particular on Europe, the Middle East and Africa, which is a pretty big region, to put it mildly, in terms of coverage and, in particular, the seriousness of the challenges being taken on here. What’s interesting, just to give people a little more background, Erin, if I understand it rightly, is that your background is in radicalisation and the socio-political drivers of extremism, and you’ve published quite a bit on the nature of online extremism and what the motivations and drivers are. It’d be interesting to hear how a company like Facebook, which has obviously been absolutely at the heart of this whole debate about how social media platforms have been used for the propagation of hate speech and so on, is tackling this issue. And maybe you could say a word, as you go, about the Oversight Board that Facebook has established, which I think has had quite a bit of attention linked to it; it’d be interesting to know how you see that form of content curation connecting with the work of GIFCT, which obviously you’re members of, founding members of, as well. So, Erin, a delight to have you on this call, as well. Over to you.
Dr Erin Marie Saltman
Well, you’ve started with some easy primers there, but, yes, thank you so much to Chatham House, as well as to Microsoft, as the 2020 Tech Chair of GIFCT, for organising this and having me join this distinguished panel. I mean, you mentioned things like the Oversight Board, and I think that goes a long way to the tone of having organisations like GIFCT: the fact that companies, especially larger ones, as our membership base grows and grows, are having to lean into multisector infrastructure, lean into external experts, and humble ourselves, and not pretend that there’s one single algorithm that can tech our way out of global issues. You mentioned I manage the team that looks after Europe, the Middle East and Africa; I also look after some of our teams looking at Southeast Asia. If I go to different communities across that huge, diverse landscape and say, “What is terrorism? What is violent extremism?” there are thousands of answers to what that looks like, and where xenophobic, hate-based tensions turn violent has huge variation across the world. So, we also shouldn’t silo ourselves into thinking, when we say counter-terrorism online, that we mean just one or two groups or just one or two regions.
So, given this topic and my area of expertise, I’ll talk a little bit about how tech companies are having to think through risk mitigation and wider safety concerns, and really that means thinking through three things: what your internal human expert teams look like; then, on top of that, where we can bring in and scale up tooling, artificial intelligence, and algorithmic solves; and then, most importantly, where we have to supplement even that, because it will not be enough, and where we need to lean into these partnerships, which is where GIFCT really gets to the core of solution-finding. And while more people are connecting with friends, family, and loved ones online, we also know that means bad actors will continue to try to exploit and abuse digital platforms, as well.
So, Facebook and other social media platforms really do hold up a mirror to society, and, with over three billion people using Facebook apps monthly, that puts a big responsibility on us and other platforms to decide where to draw the line. And, as mentioned by others, we are often having to go into a space without much guidance, where, traditionally, it might have been a government’s decision what was and was not allowed in a particular space. So what can we do? To that first point about human expertise, we do have to continue to build out a diverse and global expert team. More tech solutions does not, in fact, mean fewer people. The opposite is usually true.
So, we have over 350 people at Facebook purely focused on what we call dangerous organisations, which encompasses terrorism, and that in-house expertise ranges from former law enforcement and national security officials to Data Scientists, Social Scientists, and Engineers, and they have to come from different parts of the world to help us make sure that we’re keeping a finger on the pulse of what violent extremism looks like. And this core group is supplemented by a further 35,000 people globally working on our safety, security, and operations teams, so that’s a huge effort that can triage 24/7. And most of this team did not come from Silicon Valley.
You mentioned that my background is in processes of radicalisation. If you had asked me five years ago if I thought I’d work for a tech company, it would have seemed maybe even laughable; these jobs just didn’t exist. So, it’s nice to see bigger tech companies building these out as core teams, and that’s not just Facebook, the other larger companies are doing the same. And obviously, we’re lucky, because, when we interface with smaller companies, there’s no way they have the resources for their 100th hire, and definitely not their 25th hire, to be a terrorism expert. It’s maybe going to be your first Policy Manager, or the first person from outside of Silicon Valley, if you’re lucky. And so, that’s the other reason we have to lean into things like GIFCT, to support knowledge sharing on that base level of how you ramp up expertise and stay sensitive to those shifts.
And when we talk about tooling, about what we can do more proactively, I do think that humans will always be needed for the nuance, for the ground truth, for the interaction with difficult subject matter, but what technology and the tools can do is really help us get to scale and speed, given just the vast amount of content and the speed at which it can sometimes trend. So, we do use things such as photo and video matching technologies, audio detection, logo detection, linguistic analysis, and things like strategic network disruptions. If you study processes of radicalisation, you realise that radicalisation is not an individualistic concept; it is a social construct. It happens within a social community, whether that’s online or offline, and so things like strategic network disruptions allow us to map an online network and remove it all at the same time, instead of relying on whack-a-mole approaches to one piece of content at a time.
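As a rough illustration of the ‘strategic network disruption’ idea just described, here is a toy sketch in Python. The interaction graph, the account names, and the use of a simple connected-component search are all assumptions made purely for illustration, not a description of Facebook’s actual systems; the point is only that a whole cluster is mapped and actioned together, rather than one account or post at a time.

```python
# Toy sketch of "strategic network disruption": instead of removing one
# flagged account at a time (whack-a-mole), map the surrounding network
# and action the whole connected cluster together. Hypothetical data.
from collections import deque

def connected_cluster(graph: dict, seed: str) -> set:
    """Breadth-first search from a flagged seed account over interaction edges."""
    seen, queue = {seed}, deque([seed])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

# Hypothetical interaction graph: accounts and who they amplify.
graph = {
    "acct_a": ["acct_b", "acct_c"],
    "acct_b": ["acct_a"],
    "acct_c": ["acct_a", "acct_d"],
    "acct_d": ["acct_c"],
    "acct_e": [],  # unrelated account, untouched by the disruption
}

flagged = "acct_a"  # account surfaced by human experts as violating policy
cluster = connected_cluster(graph, flagged)
print(f"Removing {len(cluster)} accounts together: {sorted(cluster)}")
```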
What this ends up looking like is a huge amount of proactive removal of content, not just what we would call terrorism, but also hate-based organisations. About 99% of what we remove for terrorism, we find ourselves, before it’s even flagged to us, and, looking at the numbers for just the second quarter of this year, we removed about four million pieces of content for hate-based organisations and 8.7 million pieces of content for terrorism. And, again, I know some people on this call are going to be very concerned about over-censorship, so it’s also important that, at that scale and with increased tooling, there are things like appeals functions, ways that false positives and false negatives can be flagged to us if we get it wrong, and reporting on all of this in our transparency reports.
But, again, radicalisation is not just siloed within one community, and it’s not just in one place. I doubt that anyone dialled into this call has just one app or one call function on their devices, whether that’s your phone or your computer, and so we’d be naïve to think that bad actors are just on one app or one platform, either. And this is why we knew we needed GIFCT; the research pointed to it, as well: this is transnational and cross-platform. And so, it’s only by working together with the private sector, governments, NGOs, and academics that we can hope to effectively combat the threat. Since its foundation, it’s been really inspiring to see a wider range of tech companies join GIFCT. We sometimes learn more from talking with smaller companies, which are having to make innovative fixes for problems they see that maybe we didn’t even think of as a bigger company, so it’s definitely not just a one-way dialogue of bigger companies dictating to smaller companies.
And we see, as well, that GIFCT was needed. It came out of dialogues with entities like UN CTED and the EU Internet Forum, and it’s important, now that we have this independent NGO, that we stick with our core pillars, which really are prevention, response, and learning. It goes back to where we can share knowledge, where we can get better action-oriented research from experts, and where we can share technology. On the knowledge sharing and mentorship of smaller companies, I think our partnership with Tech Against Terrorism, again a UN CTED-mandated NGO, has been crucial.
Unfortunately, we can’t have in-person workshops right now, but, with Tech Against Terrorism, we’ve been able to convene over 140 different tech companies, 40 NGOs, and 15 government bodies around the world over the last three years, in a way that has crucially meant we get a better idea of what violent extremism looks like on the ground, whether that’s neo-Nazism and white supremacy on the rise, or Buddhist extremism, or Islamist extremism in all its various forms.
And that action-oriented research is also needed. So, the Global Network on Extremism and Technology, GNET, has worked with UN CTED’s Global Research Network on having global academics come and give quickfire discussions that speak to tech companies, so that we can take that research and respond to it, instead of just seeing an overwhelming report that might or might not have action items for tech companies. And I know that the Observer Research Foundation is also part of that Global Academic Network, and it’s been crucial to get those different perspectives. And the biggest concern has also been around tech solutions, and so the Hash Sharing Consortium is one of the most technical sides of what GIFCT does. It really does house a huge body of hashes, or digital fingerprints, which help companies surface known terrorist-violating content, and it has a limited scope, compared to a lot of the other work that GIFCT does.
So, the Hash Sharing Consortium was scoped to only include a strict taxonomy for hashes, based on the United Nations Security Council’s consolidated list of terrorist individuals and entities. That is wordy, but accurate, and the only exception to that, post-Christchurch, has been the Content Incident Protocol. Now, even the UN list, like all lists, is slightly imperfect, but we do know we want to expand those efforts. When you look at the Content Incident Protocol, the two times it has been triggered since Christchurch, as you mentioned, were actually Halle, Germany, and Glendale, Arizona, and both of those were white supremacy-based attacks. We don’t see those groups very often on government lists, so we know that one of the biggest things we’re going to have to tackle is how to understand that adversarial shift without going into a space of over-censorship: keeping a defined parameter and leaning into those experts. That’s very tricky. It’s easy to get this one wrong.
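For readers unfamiliar with how hash sharing works in practice, the sketch below shows the basic mechanic in Python, under loudly stated assumptions: the function names and the in-memory database are invented for this example, and real consortium systems use perceptual hashes, which survive cropping and re-encoding, rather than the exact SHA-256 used here to keep the sketch self-contained. What moves between platforms is only the fingerprint, never the underlying content.

```python
# Simplified sketch of hash sharing: one platform contributes the digital
# fingerprint of confirmed violating content; another checks new uploads
# against the shared list. All names here are hypothetical.
import hashlib

shared_hash_db = set()  # stand-in for the consortium's shared database

def fingerprint(content: bytes) -> str:
    """Exact hash for simplicity; real systems use perceptual hashing."""
    return hashlib.sha256(content).hexdigest()

def contribute(content: bytes) -> None:
    """Platform A shares the hash of confirmed violating content only."""
    shared_hash_db.add(fingerprint(content))

def check_upload(content: bytes) -> bool:
    """Platform B surfaces a new upload for review if its hash is known."""
    return fingerprint(content) in shared_hash_db

contribute(b"bytes of a confirmed propaganda video")
print(check_upload(b"bytes of a confirmed propaganda video"))  # True -> route to human review
print(check_upload(b"bytes of an unrelated holiday video"))    # False -> no match
```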
And so, I think that’s really where we’re going to see the need for this. We need to lean into things like the knowledge-sharing platform that, again, Tech Against Terrorism has developed, and make sure resources are there for a wider range of tech companies. So, I’ll just finish by saying it’s really when each of our sectors recognises where it’s best placed, and we work together, combining our expertise, that we get real impact.
Dr Robin Niblett CMG
Thank you very much, Erin, and thanks for picking up, at the end, the issue of some very specific solutions. I want to make sure that I give our two remaining panellists a chance to get their thoughts over, so I won’t do a follow-up question now, but I think we’ll want to come back later to an issue you raised right at the beginning: what is the definition of terrorism? What is the definition of violent extremism? And, as I head over to Samir, I don’t know whether he might hit on some of these topics. As governments become more involved, what started out as a technical element has become multisectoral, and governments start to have a much bigger interest; I can see an element of tension emerging here.
In any case, I’m going to turn now to Samir Saran, who, as I said earlier, is President of the Observer Research Foundation in India, which hosts the annual Raisina Dialogue, one of the big global gatherings that takes place, or has taken place, each year in Delhi, and he also Chairs CyFy, India’s main annual cybersecurity gathering. Amongst the many things he works on, Samir specialises in internet governance and cybersecurity. He’s on the Global Commission on the Stability of Cyberspace and is a member of the Board of Microsoft’s Digital Peace Now initiative, amongst other things.
Samir, could you give us a perspective from whatever angle you want to take? I know you’ve got a few, but one angle that might be interesting to our participants and audience here is the number of people who’ve come online in India in the last three to five years, which must be transforming society, and the risks of extremism that come with it and its emergence in the digital space. So, either go global or take your India example; I’ll let you walk through whichever door you want, and I look forward to your remarks. Thanks for joining us.
Dr Samir Saran
Thank you, Robin, and let me congratulate you again on your 100 years. You are much younger than a 75-year-old UN, so I think Chatham House will have to help the next stage of the UN’s re-evolution to become a retrofitted global organisation for the digital age, and I look forward to partnering with you. And I’m delighted to be here with Nick and Courtney and Erin, all of whom I have worked with, both at the GIFCT and, of course, through ORF and our bilateral partnerships.
Let me try and respond to some of what I’ve heard, and certainly to your questions, Robin. Before this conversation began, I was doing some rough, you know, deep research. I went to Google and asked it, “How many Police Officers are there in a country like India?” It came up with a very old number, and that’s the advantage of Google, it never gives you what you want at that particular moment. It gave me a 2006 figure that says there are 144 Police Officers for every 100,000 Indians. You know, now start doing the maths. I’m too tired at this point, it’s 8 o’clock in the evening, but it means we are talking about a Police force in the millions that keeps a country like India safe in the real world. And, by the way, that classification says India has a weak Police force, and the global average should be around 300. These are 2006 numbers, so I’m just using this as an illustration.
Now, if digital India is a million times bigger than real India, then what is the human institutional capacity we need to build to keep that space safe? I think that’s the first question we truly need to ask ourselves: if the digital is far bigger than the real, if the digital planet is some factor bigger than the real planet, then how do we create a stability, safety and security net for this digital planet? And I can certainly tell you that all my partners here, who I really admire a lot, Microsoft, Facebook, and others, for their innovation and solutions, and, of course, all the troubles they also give to the world, I don’t admire them for that, I can certainly tell you that they don’t have that capacity. Take any sensible estimation of the kind of human capabilities required to moderate and intervene in any form of activity, including terror content, hate speech, radical content, propaganda, etc., and any undergrad will come to the conclusion that they’re understaffed, and they can keep telling us they have 10,000, 20,000, 40,000, 100,000. It doesn’t add up.
So, I think we need a new framework of institutional capacity for digital safe spaces, and I think there has to be a new compact between the private sector, academia, society, and governments to build this institution. I don’t think any single actor is going to get us there; it’ll have to be a whole-of-nation, whole-of-world effort and, of course, those who have the capacity, capability, resources, money, and finance will have to do more. Because they are getting more out of the digital space, they will have to invest more in keeping it safe. That’s the first part of my response. I don’t know whether that makes sense to you, Robin, but it’s one thing that became quite clear to me in my work over the last few years with many of these organisations.
The second thing that became quite clear to me was that we were supposed to drink the magic potion they served us. It was called algorithms. I don’t know what that means. I really don’t know what it means when they say algorithms are now able to track, remove, identify, and flag content. I think there has to be algorithmic accountability. There has to be some framework that tells us how this magic potion works. It is no longer going to cut ice simply to say that they have developed some sophisticated protocol that works in a certain way, because questions are going to be asked: “Why did it not work in instance A? Why did it work in instance B? Does it favour certain communities? Is it against certain identities? Does it support certain kinds of bad behaviour?”
So, for the sake of having a fairer, more transparent, and accountable framework, I think there has to be a deep discussion on how we will deploy technology to respond to challenges in the technology domain. And that is going to be the tip of the spear for responding to and countering violent extremism. I don’t think you will ever have enough human beings to do it. You will have to deploy sophisticated bots and AI and algorithms to respond to this challenge, but the framework and accountability of those technology tools must be established. So, there has to be accountability around the technology we use, and there has to be some degree of communication of that with the communities that you serve; otherwise trust will disappear from the relationship between the user and the technology company, and we need to ensure that trust is retained. That’s the second point I wanted to make.
The third point: I think there is still a hesitant recognition of the deep nexus between the processes that respond to violent extremism and processes that are deeply political and are part of national elections and democratic campaigns, and I think the lines have blurred deeply. Today, countering violent extremism is not just about thinking of ISIS. Countering hate speech is not just about dealing with fanatics in various parts of the world. These are deeply domestic political processes, and, once they become deeply political for a nation state, and if you are a big nation state, you will see nations respond to technology companies, sometimes favourably, sometimes adversely, depending on the rub of the green.
And therefore, we need these processes to become sensitive to the political element that is now embedded within what was earlier thought to be a law enforcement activity. This is no longer law enforcement, it is no longer a national security activity; it is actually a political activity. It is part of the political regimes being established around the world, it implicates them, and, therefore, it will be responded to in a very different fashion. So, are there folks in technology companies who are dealing with this? I mean, I know Courtney is, and Erin is, but do we have a cohort who are sensitive to the political nature of the work they do? And, therefore, do they propose solutions that take into account local sensitivities? I think the number of people able to traverse technology and politics is very thin. That cohort needs to be built up very quickly.
So, as I mentioned at your last Chatham House conference, which happened in the real world, the golden age of technology will only be sustainable if we have a golden age of the social sciences. Now, I’m not making a plug for ORF because we want to do more research, but all I’m saying is, we need far more political and social science research if we want to make technology work for us, and work more sustainably, and, therefore, investment in research, investment in education, investment in knowledge, and incubating knowledge frameworks around this is extremely important.
I will make two short points before I end, and I know, once you give a mic to an Indian, you can’t take it away easily, but let me make two short points. One, I think we are targeting the wrong demography. I have always said this, and I am now convinced, after having spent the last six months in solitude like Papillon, I have come to new discoveries. The first discovery is that the solution is down in the schools. By the time digital users reach college, we have lost them. We have to go much lower in the age demography to be able to create new cohorts, to build new narratives, to reject hate, and to reject what I am going to put forward as the most insidious idea: casual hate. Casual hate is what eventually becomes the biggest deterrent to decent speech, and casual hate begins very early, with casual identity debates. We, in schools, were taught civics; we were taught how to cross the road, how to treat our neighbours. Who teaches digital ethics to anyone today? And the only inputs that 12-year-olds get are from their mobile phones, not from their grandmothers, like I got. So, I think we have to go down to the schools. That’s one point.
The second, of course, is that we need to understand that there are shades of extremism, and we sometimes try to create false equivalences between them. There must be an understanding that people griping about domestic issues is not the same as people promoting terror and violence overseas. There is a big difference between the two. The act of going and destroying some other third country is different from fighting with your local community to decide whether you want a footbridge or an under-the-road bridge. That’s a different thing. So, we, algorithms and tech companies, sometimes create false equivalences between very different kinds of narratives. We must be far more nuanced, technology allows us to do that, and we should use the powers of technology to target what is dangerous, and not to curtail what is freedom of expression, what is political speech, what are important release valves for democracies. Sometimes, any form of agitation is targeted in much the same way. We must not draw false equivalences. We must be very careful to distinguish between violent extremism and sometimes extreme debate. We conflate the two very easily. That distinction is important to make.
Dr Robin Niblett CMG
Thank you very much, Samir, for those points, especially the last two: about schools, learning, and ethics, not from your grandma but from your mobile, and also the point at the end about what we target and what we don’t target. You yourself said that internet curation and content curation is becoming a political activity, so the dilemma here is, who’s in charge of determining this, and what might be right as a political activity in one state could be seen as completely wrong in a different state. And if you’re a company having to operate across borders, you’re going to face really complex challenges, which, obviously, everyone on this call is extremely aware of, given that the big tech companies are very much global companies.
Again, I’d love to do some quick follow-up questions, but I’m conscious of time. We’re running about six or seven minutes late, we’ve got about 20 to 25 minutes left, and we’ve got a few questions coming in. Can I remind you, if anyone has questions they want to put to the panel, please ping them into the Q&A line, and I’ll draw on them. I may unmute you, or ask you to be unmuted, if you’re willing. If you’d rather I asked the question, as a couple of people have requested, please say so and I will do so.
But we’re going to turn now to Fionnuala Ní Aoláin, and, Fionnuala, I’m really delighted you have joined us, because, in a way, I think you’re the right person, if you don’t mind me saying so, to go last: in your capacity as UN Special Rapporteur for the Protection and Promotion of Human Rights While Countering Terrorism, you sit at the crossroads of some of the dilemmas that Samir, Nick and Erin have noted for us already. Just so people know, she is a Professor at the University of Minnesota, holder of the Robina Chair in Law, Public Policy, and Society, and Director of the Human Rights Center at the University of Minnesota Law School, but she has also been involved with the international criminal tribunals dealing with international criminal activities, including the International Criminal Tribunal for the former Yugoslavia. So, you’re somebody who’s been involved at that hard end, some might say the 20th Century end, though it sadly continues in the 21st, of war crimes and so on, and you’re now applying a lot of that knowledge to this new digital world of extremism.
Fionnuala, where do you want to come in on this? I’m not going to try to lead you; I’m sure you can play off the first three sets of comments, and I might follow up with a question, depending on where you go. Over to you.
Professor Fionnuala Ní Aoláin
Great, thank you all, lovely to join you. It would be much nicer to be in Chatham House than on our video screens, but let’s look forward to that after COVID. I welcome the opportunity to express the views of the mandate on the human rights and rule of law dimensions of the issues of countering terrorism on the internet. And I do want to acknowledge the difficulties faced by social media and other internet platforms, bearing in mind the tendency of states to outsource public interest tasks and expect these platforms to proactively police online content and behaviour. In this regard, my mandate has fairly consistently documented the legal concerns that arise from what we see as the subcontracting of state responsibilities to private entities, and the human rights implications of those moves, as my mandate raised, for example, in the context of the EU draft regulation on content moderation, which we had major concerns with, even in its amended, though slightly better, form, and in other contexts, like domestic legislation, a good example being the German law, which, while not terrorism-specific, clearly reaches into this space.
So, I think states tend to impose very onerous obligations on platforms, making it impossible to meet those obligations without automation, and, in many cases, the nature of these obligations doesn’t leave a lot of space for human rights implementation or analysis. To say the obvious, platforms, as private companies, are not states, and they’re not formally bound by international human rights standards, although some platforms, and here I think Facebook is a good example, do positively say that they are guided by international human rights law, and there have been some encouraging steps in that regard to bring those standards onboard into community standards, or to reference them, and so achieve some kind of closer alignment.
But the voluntary stance, in my view, does not capture the totality of the legal human rights obligations in this space, including the human rights obligations of platforms. In the context of platforms regulating access to and use of their services, their role is threefold: they are the standard-setter, the enforcer, and the arbiter, with quasi-legislative, quasi-executive, and quasi-adjudicative roles, and most of that happens without meaningful external, including, I want to say, democratic, oversight. And this is particularly problematic, having in mind the level of influence and control that such platforms exercise over access to information, freedom of opinion and expression, freedom of assembly, and public interest discourse, including, most particularly now, as we see it in the context of health or election debates.
So, I want to say a couple of things. First, the power and influence that these platforms exercise has to be met by corresponding levels of corporate responsibility. Terms of service and community standards have to be based, in my view, on international human rights standards, and not just pay lip service to the application of Article 19 of the International Covenant on Civil and Political Rights, the provision of an almost universal treaty that sets out the protection of freedom of expression. Freedom of expression is involved, but there is sometimes the view that it’s only freedom of expression that’s implicated by access, and I think that patently underestimates the scale and scope of what access to these platforms means.
It also means, in my view, that terms of service or contracts of service have to be sufficiently clear and detailed to allow people to understand exactly what kind of conduct will get them suspended or banned, as opposed to having rudimentary public rules, which we do have in some cases, with the detailed guidance available only to company employees. States, in our view, also have a corresponding obligation to ensure that businesses operating in their jurisdiction respect human rights. A good example of this kind of gap is the EU draft regulation on terrorism content: there is not a single provision in that draft that requires companies to have terms of service that are human rights compliant. That would seem like a 101 place to start.
So, let me say two, or maybe three, other big things. I want to talk about definitions, Robin, an issue that you raised right at the outset and that is absolutely central to this regulatory space, because we say these words as if we all understand what they mean. It's like that Supreme Court test on pornography: I know it when I see it. But the problem with that is that both the global platforms and the rest of us face a particular challenge, which is the lack of an internationally accepted definition of terrorism. There is no multilateral treaty, although we do have suppression treaties, we do have UN Security Council Resolution 1566, and we have the definition of terrorism which has been crafted by the mandate I hold.
So, at least in the area of terrorism, we have guardrails around what we understand terrorist acts and terrorist content to be. Although, to be very clear, most national definitions of terrorism, which my mandate has documented meticulously over the last 20 years, are so wide and vague that, in many countries, much of what they target is conduct that is legitimately protected by international law. We could name some recent high-profile cases of security legislation, in a number of countries, that make the point particularly well. When we get to extremism or violent extremism, we are in a much more ambiguous space. We lack a definition and, increasingly, the mandate views this term 'extremism' as the nomenclature of preference for states, not least because it is delightfully malleable. All kinds of things can be extremist, including, simply, people who disagree with you.
So, I want to say that we have seen tech companies and platforms try to fill the gap. Facebook, for example, to its credit, has engaged in a sustained dialogue with us about the definition of terrorism it uses, and we think that definition has improved. But the challenge remains that, when we think of the scope of the regulatory space here, these platforms are, in fact, creating their own definitions of terrorism outside of the multilateral context, in closed spaces where experts and others have little consistent access, and civil society has virtually no meaningful access at all. So, I think that's a really problematic place.
The second problematic place is lists. There is a lot to say here, but just to note that there is a sustained critique, including by the UN Ombudsperson on the Da'esh and Al-Qaida list, of the processes by which those lists are constructed and of the due process rights of persons who end up on them. So, lists are not a panacea for this problem of definition either. There is a lot to say about how we define, and I think that's a conversation to be had, but loose and vague definitions of terrorism, extremism, and violent extremism are really anathema to the fundamental notion of free and open societies in which individuals get to participate, including online. And the mandate strongly suggests, in particular, that those platforms who come up with their own definitions focus on conduct and not on affiliation. It seems to me highly problematic to have private entities running around defining who is and who is not a terrorist group and, while we recognise that there are problems with state definitions, that subcontracting is problematic for a whole different set of reasons.
Two final things, and then I'm going to come to the end of my time. The first is the governance of counterterrorism, including the governance of this space. It's an issue that I have addressed in a number of my reports, and one that is particularly important in the context of the governance of content moderation. So, while acknowledging that there is a process underway with GIFCT, more broadly, my mandate has spoken to the proliferation of these governance entities in the counter-terrorism space, many of them sitting outside the multilateral system. These entities are engaged in standard setting and guidance, and one of the things I experience at the national level is going to states and telling them about their treaty obligations, only for them to tell me that, in fact, they are conforming to the soft standards produced by some such entity and not to their treaty obligations.
So, we have an inversion here that is particularly problematic, in terms of where global standards are coming from, who is engaged in the process of making them, and how accountable those standards are. Many of these entities, and I do want to include GIFCT in this, are not transparent, and I think Nick has acknowledged that there is a process of becoming more transparent, but they are, in many ways, human rights-light spaces. They are not like multilateral spaces, where human rights entities are present as of right, and I think, when they are engaged in standard setting and regulation in an area that has such a profound effect on the regulation of human life on this planet in the 21st Century, we should be asking really hard questions about the transparency, the governance, and the access to those spaces across the board.
Finally, I want to conclude with some words to the platforms themselves, which is maybe a more technical way to close. While platforms often have no choice but to use automated tools for content regulation, the mandate would stress the continuing importance of having humans in the loop. By that we mean an adequate number of well-trained, adequately paid, and not completely overworked human moderators, which, as we know, has been a challenge in a number of spaces.
The second is that we need improved transparency, and by this I mean, in particular, transparency about the ways in which many of our platforms co-operate, formally and, mostly, informally, with governments, which is particularly challenging in the area of terrorism. And, finally, accountability: we need a better understanding of the ways in which content moderation will be accountable, including through accessible and comprehensive appeals processes. So, I'm going to close with Voltaire and his famous line: 'I disapprove of what you say, but I will defend to the death your right to say it.'
I think it's unpopular to be in the space of defending unpopular views these days. In all of the things we've talked about, there is a hard space where we have deeply violent and deeply problematic people, but, at the same time, the value of expression to open, functional, transparent and engaged societies is just the life breath of what makes many of us want to live in places that are open and safe. And it concerns me deeply that, precisely because the language of terrorism and counter-terrorism is so slippery for so many states, when we narrow this space, we are narrowing it in a wholesale rather than a retail way. So, let me pause there.
Dr Robin Niblett CMG
Thank you very much, Fionnuala. I think you have touched on a number of important issues, including some that have come up in the questions. I don't know how many of you have been able to keep an eye on the questions coming in, but the one by Deborah Brown, who I believe is with Human Rights Watch, and who may have had to leave by now, I haven't checked, but in any case, it's a long question, but it is basically the issue you were raising there at the end, Fionnuala, about overly broad and discriminatory criteria for determining and then removing what is terrorist and extremist content, and what plans GIFCT has to deal with that.
Just conscious of time, rather than try to overly curate the questions, I think the simplest thing for me to do, and I definitely want to make time for Courtney Gregoire to come in, is to give each of the panellists a chance, and Nick in particular, who spoke a little while ago, to come in and address some of the topics that others have raised and that have come up in the Q&A, though I will also steer them in that direction if needed. And, Courtney, you will then have the last word, once I've gone through those four comments. It will probably take us five minutes beyond our theoretical closing time, I would imagine, but we've still got very much the bulk of the people who were on the call at the beginning with us, and I want to do this justice. Nick, why don't I turn to you first? I'll literally just go through each of you. Try to be as telegraphic as you can with your points, I know they're big and they're complex, but, as I say, Fionnuala has, in a way, encapsulated all the questions, so you can take them on. Why don't you go first, Nick, and walk through whichever door you want?
Nicholas Rasmussen
Thank you, Robin, and I will be very brief. I want to pull on one thread that Fionnuala was raising, because it was something I didn't lay out in my initial remarks, but it's very much on my mind and on GIFCT's mind, and that is dealing with these definitional and taxonomy questions, Fionnuala, that you raised. The nature of extremism and violent extremism and terrorism across the globe demands that we do so, because we are no longer in a world where the primary focus is on a narrow set of designated terrorist organisations that we are all comfortable defining as terrorist organisations because they have been linked to Al-Qaeda or the Islamic State. At the same time, tiptoeing into these definitional debates and discussions is fraught with all of the complexity that you outlined in your remarks and that you've been dealing with in your mandate.
So, I think we head into that space by necessity, but we head into it with a fair amount of humility about how much progress we're going to make, and I think it ends up, in the end, having to be a behaviour-based discussion. What can we agree on and define as behaviours that we do not want to see in the online environment? And we should look to be as precise and specific about those as we can, because you're right, all of the questions around affiliation are fraught with potential for abuse by undemocratic government regimes, just to speak of the government side of things. So, I don't have an answer to your set of questions on this, only that this is necessary work that is going to take us time, and that's why I also described the wide and deep stakeholder table that needs to be constructed, virtually, to involve all of the important voices that will contribute to that conversation. I'll stop there.
Dr Robin Niblett CMG
And I would just note that Adam Hadley has added a link there for folks who want to tap into it, which I think has some connection to the point made here. Just one brief follow-up question to you, Nick, and thank you for being brief, though I'm not inviting others not to be: now that you're a 501(c)(3), established, I think you said, ten weeks ago or so, and so now formally an independent organisation, what is going to be the composition of your board? Is it multisectoral? Do you bring governments onto the board, or do you keep governments out? To me, that's got to be one of the most interesting things. Who's going to be on your board? Well, not who…
Nicholas Rasmussen
At present…
Dr Robin Niblett CMG
…but how are you going to propose that?
Nicholas Rasmussen
At present, the board consists of the four founding companies. Courtney plays the role of Chairperson of that board for this year. In our first board discussion since my coming onboard, we've had preliminary conversations about, over time, looking to broaden the base of contributors to and sponsors of GIFCT's work, but it is a fact that, right now, this is an initiative that grew out of the large technology companies. I'm open-minded about how that can change over time and, to my mind, any time you have a broader set of supporters investing in your work, it's to the benefit of the organisation. Why don't I leave it there?
Dr Robin Niblett CMG
Okay, thank you. I know it's a complex question, but obviously, ultimately, a very important one. Erin, coming to you: the issue that keeps coming back is this idea that you can't subcontract too much to companies, because they don't have the political authority to be making public policy decisions. If you're covering EMEA, what strikes me as particularly complex, and it comes up in the questions you may have seen at the top of the list, is Facebook's decisions on content moderation and the removal of material to do with Palestine and Israel, and there was a question specifically about not stepping up on the incitement that was carried out, in many cases, by official bodies using Facebook in Myanmar. Are you going to have to be local or global? Because when I heard Fionnuala, she was saying that companies all need to stand by international human rights norms, but we know that a lot of governments do not abide by them, and there are some governments right now that actually try to change the definition of what the Universal Declaration of Human Rights, in essence, is. So, as a global company, are you having to be local, or can you be global? It's such a big question, I know, and I'm sure you've got a very short answer to it.
Dr Erin Marie Saltman
No problem at all, whatsoever. I don't want to shy away from these questions, because people raised Israel and Palestine, and they raised the conflict in Myanmar, in some of the questions. Oftentimes, working at Facebook or many of these larger companies, you feel like you have your finger on the pulse, especially for our teams. We are not the light, fluffy team that turns your face into a unicorn. We are the team that every day is looking at mass atrocity, that is looking at murder and attacks, that is looking at strategised, co-ordinated violence. We all, thankfully, have good therapy support, and we could get into that, but we have our finger on the pulse of atrocity every day, and we lean into where we do have good international guidance. We lean into UN human rights principles, and into organisational structures where we can find them, whether that's the EU, the UN, or other places. We look to smart regulation, we want smart regulation, but, as was pointed out, the existing regulation often does not answer our questions of minutiae when, at the end of the day, we are being asked, "Does this stay up or come down?" It is binary, it goes out and…
Dr Robin Niblett CMG
Just a basic question, which I should know the answer to: can a national government ask you to take something down because it breaks the law in their state, as they interpret it?
Dr Erin Marie Saltman
Yes. Yes, and I think one of the things that is most important is that, in things like transparency reports, we also have transparency about government requests for data and how much we've complied with them. So, there are cases where it will show we are not 100% compliant. In fact, I don't think there's 100% compliance with any one government, but it gives a read on what that looks like. And when it comes to defining terrorism, I think there's this interplay where we did, as was pointed out, put out a public definition of terrorism. Most companies don't say what their definition is and, if they use lists, they don't say what those lists are, and we know that both approaches have their issues, but by being transparent, at least we can have a better, harder conversation.
And I would say that our definition was put out there because it is based on behaviours, because it is based on things like knowing that violence was premeditated and ideologically motivated, but we don't say what that ideology has to be. That means that, when we see something like Halle, Germany, when we see something like Christchurch, when we see something like the Anders Breivik attack in Norway, we can call that terrorism. We don't have to call it something else; that is terrorism, by our definition. And so, we should be able to say that more transparency is definitely needed. For GIFCT, however, we are dealing with a wider group of companies, companies that, as I said, do not have a terrorism unit, and so we will have to lean into better international structures and guidance, because you cannot expect a small company to come to terms with this alone, and, even as bigger companies, we've gotten it wrong.
When we look at Myanmar, which was brought up, why didn't we intervene sooner? The ethnic violence in Myanmar is horrific, and we were too slow to prevent the misinformation and hate speech that surrounded that violence, and so, since 2018, we've had to put a huge effort towards things like product, engineering, and policy work: getting to goals around why people aren't reporting more and how we get there, making pages more transparent, getting better third-party fact-checkers, and again leaning into partnerships, not just thinking we should go it alone. So, this touches on things that we're all having to get to a better space around, and hopefully a lot of the smaller companies can learn not just from our successes, but from where we have not gotten it right in the past, so that those mistakes aren't replicated.
Dr Robin Niblett CMG
Just one more, and, as you can see, we're five minutes over; we're going to go at least five more minutes, and realistically it'll probably be half past by the time we finish. I do want to give time to Courtney, so just be aware, everyone, but I'm sure that, for 15 minutes, you'll all be fine. William Braniff, or Bill Braniff, I don't know him well enough to know which he prefers, asks a question which I think gets right to what you were just saying, and I'm just going to deal with the last part of it: "How is the GIFCT going to avoid getting sucked into the allure of tactical, incremental improvements, which often brief very well in front of legislators," and instead tackle the very important elements that we heard right at the beginning of the presentation, and that Fionnuala and others focused on in particular, of really making sure that you're not simply doing quick reactions after an attack has happened, but are part of changing the processes that drive extremism online? Because you could end up spending all of your effort responding and making sure that the atrocity isn't rebroadcast, but not have the time to really stop the feedstock of atrocity. Is that something, Erin, you could quickly say a word about, and maybe Nick, before I come to Samir and Fionnuala? Why don't you just go quickly first, Erin…
Dr Erin Marie Saltman
Sure.
Dr Robin Niblett CMG
…or not, yeah.
Dr Erin Marie Saltman
Sure, and I'm sure Courtney can speak to this as well. I think we have to constantly go back to our mission statement, and go back to those core values: what does it mean for a tech company to work in the prevention space? What does it mean to respond, and how do we learn from both of those aspects? The big change that happened post-Christchurch was that our mission statement went from saying terrorism alone to terrorism and violent extremism. But, as people on this call have noted, just adding the words violent extremism, when we know terrorism itself has no agreed-upon definition, makes a lot of other people very concerned.
So, I think, again, having partnerships at the table and working groups is going to be important, and we have working groups specifically on transparency, tech approaches, and crisis response. I have to say, some of the human rights groups in those working groups are going to ask, and are already asking, the hardest questions. I'm very grateful that they're there because, otherwise, we might just go full steam ahead and say, "Look at this bright, shiny new tech, press the green button," without knowing that it might create a huge amount of over-censorship. Without those voices at the table, that's the fear, and I'm glad that they are at the table as much as they can be.
Dr Robin Niblett CMG
And I'm just wondering, Nick, maybe I should let Courtney tackle some of that. Did you want to say something, or shall I hand it to Courtney to address at the end?
Nicholas Rasmussen
I'd be happy to have Courtney take it, 'cause it's something we've certainly discussed in our own conversations as well, so she and I are likely…
Dr Robin Niblett CMG
Courtney, do you want to come in now, or do you want to come in after Samir and Fionnuala? What would feel better in the flow for you?
Courtney Gregoire
I’m happy to give just a quick perspective here, and then…
Dr Robin Niblett CMG
Why don’t…
Courtney Gregoire
…I’ll come…
Dr Robin Niblett CMG
…you come in now? I think it’d go better with the flow, I agree. Over to you.
Courtney Gregoire
Well, you know, I think you asked a very good question, Robin, and I think it's echoed in this comment: what is the structure of the GIFCT, and how is it going to achieve pragmatic solutions, [inaudible – 83:16] solutions, and address the challenges in the conversation we've been having? It's really worth reflecting on the evolution of the organisation. Reflecting on a year ago at the UN General Assembly, when we made the announcement that it was time to move to an independent organisation, where were we taking our learnings from? We were taking them, of course, from the important pragmatic work that had been happening since 2017, but really, the Christchurch Call to Action helped define an appropriate structure that resonated for many of the stakeholders. It said: we have a mission statement, and now we need clearly defined roles and responsibilities to tackle this whole-of-society problem of violent extremists and terrorists exploiting the internet. Responsibilities aligned to governments, responsibilities aligned to civil society, and then responsibilities for the tech industry.
That helped us think about what we wanted in the new organisation, and so we stood up as an independent 501(c)(3) with an Executive Director. We do have an operating board that is clearly responsible for the financial accountability and the work, but also an independent advisory committee, in a structure that we think is really important. It has seven government representatives, two international organisations, and 12 representatives from civil society. We expect that place to be a hotbed of debate, we expect it to ensure that the healthy debate we're having today informs the mission and the priorities of the work, and then, as Erin just articulated, we structured it with the intent that the working groups would be multi-stakeholder and would practically address the core work of the organisation.
And so, our hope, I think, reflects a couple of themes that we've talked about. We need this to be multi-stakeholder in nature because of what we're trying to tackle, but I really liked how Samir leaned in and said, "We have to be multidisciplinary about how we attack this. We need to be informed." And so, we started with the mission of helping prevent the exploitation of the internet. We're sticking with our pillars, prevent, respond, and learn, and the only way you can do all of those things is by putting forward pragmatic solutions, being humble in how you respond, and learning constantly in an iterative process. So, I hope the structure works. It will be tested, and we want to learn from it, but we do believe we've created a space.
Just to respond on one piece: we think it's a space for pragmatic solutions that smaller platforms can use, and we hope it's a space for some healthy discussion. We do not expect to be a standards-setting body. We are not taking over that space; that is not the role of the Global Internet Forum to Counter Terrorism, and we want to ensure that we stay clearly in this space where roles and responsibilities are defined, to achieve our mission of helping prevent exploitation of the internet consistent, and not as an afterthought, with upholding human rights.
Dr Robin Niblett CMG
Thank you very much for those points, Courtney. I'm going to go to Samir, then to Fionnuala, and, as I see, Michelle is still with us. Michelle, if you want to – no, she's about to leave. That's fine, Michelle, don't worry, stay in the room as long as you can; I'll let Fionnuala have the last word. But to your point there, Courtney, if you are not taking the role of a standards-setting body, as you said, it's going to be incredibly important that the transparency element is there, so that people can glean and see which standards you are applying, because that way, at least you can say, "Hey, we're following as best we can, but this is the way it is." Samir, a closing thought from you, then Fionnuala, and then we wrap up. As I said, just about everyone has stayed on this call, fantastic, so keep going, Samir.
Dr Samir Saran
For the first short part, I think technology companies need to be humble, but they also need to truly acknowledge that they are today providing quasi-government services, public services, and that therefore, if not elected by the people, they will have to be far more accountable to the people. I once jokingly said that the next Mark Zuckerberg should be elected by the Facebook users, and I can see Erin smile because, you know, then we could create a coalition of votes there. But the point I'm trying to make is that, in earlier years, governments used to keep information close to their chest. They never used to share; transparency was not a big thing. Eventually, governments realised it is okay to be wrong, it is okay not to get everything right, it is okay to share, it is okay to inform folks, it is okay to be responsive to certain communities. And I think the open governance that successful governments around the world have adopted now needs to be the mantra for these groups, and they will not have the five decades that other democracies did in an analogue world. In the digital world, they'll have to make that transition in a much different way.
The second short point: let's be honest here. Most of the nations that we are talking about have also signed onto international human rights law, so it's not as if Facebook is doing something unique. All the countries they operate in have also signed those laws, and they respect those laws even while abusing them. Let's be honest that all of those who have signed, all of them, Robin, including the country you sit in, have actually been in breach of those laws. Now, having said that, I think just citing the law is not enough. What we really need to do, in the digital world, is build a coalition of countries who are able to live up to those laws, and let us not try to get everyone on the boat at the same time. Let us pick five, seven, ten influential countries, let us create digital norms and digital processes that these big, large digital countries can sign onto, and let us attract others to join the club. If we can create a D10 to counter the Chinese, we can also create a D10 to protect our people from violent extremism. So, we should use small coalitions to promote, and attract folks to sign onto, an agenda that is going to be most crucial to the 21st Century. That's the UN mandate, kind of thing: life after 75.
Dr Robin Niblett CMG
No, thank you, and at least that would then provide a framework for GIFCT and others, at least that would give them another point from which to act.
Dr Samir Saran
GIFCT and many such organisations are vital.
Dr Robin Niblett CMG
No, and this reminds us of the GATT and the process that was used back in those days, you know, when coalitions were of the likeminded. Fionnuala, you get the last word, over to you. I think people have been broadly in line with you, but what did you hear that you wanted to react to?
Professor Fionnuala Ní Aoláin
Well, I'll just say three quick things. Everyone is in favour of human rights; it's like motherhood and apple pie, no one is against it. But what really matters is actually creating the processes and the procedures that implement the rights. So, it's fine if we say we're all for it, but it will only count at the point at which these processes, including the work of the GIFCT and the work of all these platforms, actually materialise that commitment at every stage, and that's the test.
The second thing I would say, very clearly, is that, yes, we are absolutely all agreed that violent extremism and terrorism are a whole-of-society problem, but the abuse of extremism and terrorism discourse is also a whole-of-society problem. My mandate has documented that over 60% of counter-terrorism laws around the world are used against civil society actors. That's just bad counterterrorism. It is also conducive to the production of more violence, and it speaks to a parallel problem which, in this space, you cannot ignore: if states are abusing these terms to securitise, to limit, and to disable the exercise by persons of their most fundamental rights, then you have to watch for the space being exploited, and I think everyone has to keep that threat as much in mind as the threat that frames this discussion today.
And the final thing I would say is that human rights is losing. We are not winning this battle for human rights, and so it becomes increasingly important, in these spaces, that we actually put in place the procedures and the processes, and not just the kind of commitment that sounds good but doesn't actually materialise the protection of norms in this space.
Dr Robin Niblett CMG
Fionnuala, you have saved me from having to make a closing statement with that last one, because I agree with you. I'm concerned that what we take for granted, certainly in the UK, where I'm sitting, but in many free societies, those basic universal human rights, are being eroded right now, and deliberately so, in many cases, as a policy of a number of governments, which is therefore going to put global tech companies in an incredibly difficult position. And I go back to Samir's point: one needs to create almost a group that can provide some cover for those global companies to follow the correct best practice, based on those laws that were all agreed 75-plus years ago. So, I think your words were incredibly important there.
I want to say a very, very big thank you to everyone who was involved in pulling this complex meeting together: my colleagues at Chatham House behind the scenes, but also, obviously, GIFCT, the Global Internet Forum to Counter Terrorism, and thanks to Microsoft for helping us pull this conversation together in their chairing capacity of GIFCT. Behind the scenes, my colleague, Marjorie Buchser, who runs our Digital Society Initiative, where we are grappling with many of these issues, but, like others, trying to come up with solutions as well. We look forward to working with all of you, especially those of you who took the time to join this call and put some very good suggestions and questions into the Q&A. I'm sorry I didn't get to all of them, but I think I got to most. Let's see where we are, if not in a year's time, then certainly in six months' time; obviously, a lot will have taken place in the intervening period. But thank you very much, panellists, for your time, and thank you all for joining us. On behalf of Chatham House, goodbye and see you soon. All the best. Bye, bye. Thank you. Thank you.