Joyce Hakmeh
Great. Good evening, everyone, and a very warm welcome to all of you, and thank you for joining us for this panel discussion tonight, where we’ll be talking about The Global Fight Against Commercial Spyware. I’m Joyce Hakmeh, I’m the Deputy Director of the International Security Programme here at Chatham House and the Co-Editor of the Journal of Cyber Policy.
So, it used to be the case that only a few states had the resources to develop cyber tools powerful enough to pursue intelligence and law enforcement purposes. In the last decade, this has changed. We have witnessed a growing market of enterprises offering a wide range of products and services, from hacking as a service to hackers for hire, offering bespoke tools and services, to the sale of enabling capabilities, such as zero-day exploits. All these are available not only to states, but to non-state actors, too.
We have been seeing multiple cases where these tools have been used or misused, with few controls and, you know, an increasingly high risk of abuse. And they have led to the disproportionate targeting of Journalists, human rights defenders, political dissidents, Government Officials, and others in many countries around the world. And as the proliferation of these commercial hacking tools and services continues to grow, there are big concerns around the impact that this will have on countries’ national security, on citizens and, more broadly, on global cyber stability and the peaceful use of information and communication technologies.
Now, there have been solutions that were put in place to deal with this issue, ranging from regulatory frameworks, to export controls, to calls for transparency and accountability, for best practices at industry level, for more collaboration and information sharing. However, the problem continues to grow, begging the question, what else needs to happen and who should lead the charge?
So, to answer this, we have a wonderful panel with us today, bringing a mix of government, industry, human rights and policy perspectives. We have with us, Jonathan Ellison OBE, Director for National Resilience and Future Technology at the National Cyber Security Centre, and we have Miriam Howe, the Head of International Consulting on Digital Intelligence at BAE Systems, Sara Al-Sherif, Migrant Digital Justice Programme Manager at Open Rights Group, and our own James Shires, Senior Research Fellow in the Cyber Policy Team at Chatham House. So, we were meant to have Jen Ellis with us, today, but unfortunately, she had some issues that, you know, basically, meant that she couldn’t be with us tonight.
So, thank you all for joining, thank you speakers, and this session is on the record, and we will be having a moderated conversation with our speakers, and then we will turn to you for your questions. So, do please prepare them and think about them in advance. So, maybe I can start with you, Jonathan. I said in my introduction earlier on, sort of, the proliferation of these tools, there are, like, you know, big concerns for governments, including the UK. Can you just help us, sort of, like, explain what the landscape looks like, what these capabilities are, who is using them, and which capabilities are you worrying about the most?
Jonathan Ellison OBE
Yeah, of course. Thank you, and thank you for having me. I think I find it easiest to look at this as a spectrum, right? So, this is a pretty complex landscape, and so, if you think about it as a spectrum and at one end of that spectrum you’ve got companies that sell hacking as a service. And, you know, NSO Group are an obvious example of this that are in the news, that are well known, but there are many more. They’re not, obviously, the only company. These companies are principally selling to states, often providing advanced state-level capabilities. Capabilities that normally would’ve taken significant expense and significant time to have built.
So, those states that they’re selling to can then use them for a series of legitimate purposes. So, for example, to augment their own national security capabilities, but sometimes they’d be using them for things that we clearly would not think of as legitimate purposes. They might be using them to target Journalists, to target activists, dissidents, Politicians, Lawyers, a whole series of uses. And I think our assessment is that this is likely happening at scale and it’s providing a range of advanced capabilities to nations that otherwise wouldn’t have had access to these kinds of capabilities. And we’ve seen, over the last decade, at least 80 states buy these kinds of capabilities.
At the other end of the spectrum is commodity tooling. So, a criminal marketplace for exploits, for vulnerabilities, including things like those advanced zero-days that you mentioned in the introduction. That marketplace might include, like, malware as a service or ransomware as a service, but essentially, it’s lowering the barrier to entry for people who want to utilise these kinds of capabilities. So, if you imagine those are the ends of the spectrum, and then there’s a bunch of stuff in the middle, as well. So, hackers for hire, ‘cyber mercenaries’ it’s been called in, kind of, press articles. They might be badged as commercial, but they could be used for a range of purposes. For example, commercial espionage activity, or, you know, when you’re subject to litigation and, you know, you want to find out what the other side is up to.
But I think that spectrum is – we would say, I would say, transforming the landscape when it comes to cybersecurity and what’s going on in the cyber world. No matter where you are in that spectrum, what it’s doing is lowering the barrier to entry. So, it’s expanding the number of actors and it’s expanding the number, and the types, of targets and victims of these kinds of capabilities. Ultimately, it’s putting capabilities that would otherwise have been out of reach into the hands of people who are just willing to pay. So, it’s no longer about developing the technical proficiency over a long process to build them. If you’ve got the money, then you can buy it.
And I think our view on the trajectory is that there will be more companies doing this, because it’s all about the money. There will be more companies doing this. We’ll see expansion of the sector, expansion of the types of products and services, therefore, expansion of the type of victims that are suffering from these capabilities. And unless we take action globally, the misuse of them will continue.
Joyce Hakmeh
Great. Thank you very much, and, you know, I like, sort of, the spectrum because it helps us understand the problem a little bit better. And, you know, you made the point very clearly that it’s no longer about the skills, you know, that will give you the access, it’s about do you have the resources? Then you can have those tools. How expensive are we talking about? Like, you know – and because that will help us understand who can access these tools.
Jonathan Ellison OBE
So, it can range. So, recently, I think there was a thread on Twitter the other week highlighting that one particular company was advertising – they wanted to purchase zero-days for iOS and Android devices, and they were offering multiple millions to procure these exploits and capabilities. And they were advertising that they were doing that on behalf of a nation state, in order to procure it for a nation state. So, you can get a view of the kind of money that can be spent on the highest end capabilities. But equally, because this is a spectrum, ultimately, you know, you can buy the capability that you have the money to afford, and you will find something for that money. And so, then it depends on the type of target that you’re going after, how well defended are they? And that’s, kind of, proportionate to what you’re willing to pay for it.
The point is that, essentially, if you’ve got money of some kind, you’ll be able to afford something. It just depends on where you land on that spectrum, all the way from the multimillions for zero-days for the most popular devices, down to something far cheaper if you don’t have that kind of money.
Joyce Hakmeh
Thank you. Miriam, so, in your line of work, you do a lot of, sort of, threat intelligence and you do a lot of research about what’s happening. So, do you agree with the picture that Jonathan painted?
Miriam Howe
Yeah, I really do and I think it’s – I actually would build upon that even further and think of it as an ecosystem, that you’ve got this very complicated landscape where you’ve got different actors, buyers, sellers, middlemen. You’ve got tools, you’ve got services, you’ve got different incentives, different use cases, as Jonathan said, and different business models. And the different business models are working, which is why it’s proliferating. And I think it’s really important, as we think about, sort of, how to reduce and control the space, to be really clear about the differences between the different slices through it. So, Jonathan mentioned hackers for hire. Hackers for hire implies a service, an end-to-end service that can be provided, and those can be bespoke.
Joyce Hakmeh
Yeah.
Miriam Howe
Or those can be menu-based. So, it could be just Initial Access Brokers. It could be an end-to-end surveillance service. And that, sort of, implies, you know, people, people skills moving around, and that sort of business model that can be, you know, sold to states, but also sold to non-state actors. And we’ve certainly seen some articles in the press recently around, sort of, the usage by Private Investigators and that sort of thing. So, that’s one, sort of, slice through this ecosystem. And then, moving away from services, you’ve also got, sort of, more the products. So, the tools, the exploits that are, kind of, more commoditised and can be put on markets, you know, dark markets, GitHub even. So, there’s very little relationship there between the buyers and the sellers. And so again, the ways you control those two different things and the relationship between the buyers and the sellers are quite different.
So, it’s really important to understand the different pieces of the ecosystem and be quite exact about what service and product models are going on here. And I think it matters when it comes to really analysing and really coming up with, sort of, solutions – what set of rules, levers, etc., are going to work for different use cases. It possibly matters less when we’re trying to take big handfuls and say, “Well, what do we think is responsible? What do we think is legitimate?” Because ultimately, it comes back down to principles and values that can be defined in, sort of, slightly bigger handfuls, which I’m sure we’ll come onto later.
Joyce Hakmeh
Great. Thank you. If I can go back to you, Jonathan, because, you know, Miriam helpfully, sort of, explained the ecosystem and the range, and so did you. You talked about, you know, the, sort of, as-a-service model, right, crime as a service, and it’s not just spyware, it’s also ransomware and a bunch of different things that are, you know, on the dark web and that are being used. And you talked about how, you know, the, sort of, cost will range based on what it is that you want and how much you can afford.
From the UK’s perspective, what are the capabilities that you’re mostly concerned about? Because, you know, you talk about this as a national security concern, it will affect, you know, sort of, the global stability of cyberspace, etc. Can you give us scenarios about, you know, which capabilities you are worried about and the ways these capabilities can be used that worry you the most?
Jonathan Ellison OBE
Yeah, and it’s a good question. So, I think how I would characterise it is the thing that I’m worried about most is putting advanced nation state capabilities in the hands of far more people. So, there are – a lot of the kind of threat that’s out there can be dealt with, with some really quite basic cyber hygiene things. And, you know, you see reports from Microsoft and others around, you know, 98% of threats can be dealt with by the same seven different, you know, things that you can do about it. You know, we’ve got similar statistics in that similar kind of range. But the most advanced threats have, in the past, been limited to a small number of actors, and so, the thing that I am most worried about is that you are massively expanding that scope. You have the potential to, you know, we’ve seen these 80 states buy these kinds of capabilities, but we also see the trajectory being a not particularly good one.
You are expanding the scope of those actors out there that have the advanced capabilities that are really difficult to defend against. And so, it’s not a specific capability. It’s not like saying, “I am worried about, you know, mobile phones,” or “I’m worried about, you know, penetrating, you know, endpoint or networks.” It’s about putting those advanced capabilities, previously only in the hands of a very small number of nation states that would’ve taken a really long time and a lot of effort to build, in the hands of anyone who is willing to pay.
Joyce Hakmeh
Thank you, that’s very clear. On the point of cyber hygiene, which basically means if we are, like, you know, diligent enough and, you know, use things like two-factor authentication, etc., then we can protect ourselves to a very large extent. But we also know that a lot of these attacks happen, like, with zero clicks, so you don’t have to do anything and the malware is on your phone and can see everything that you do, etc. So, what do we do, like, as consumers of technology, how can we protect ourselves, and how is the NCSC thinking about this problem?
Jonathan Ellison OBE
So, I think you obviously do the basic cyber hygiene stuff – things like multifactor authentication and, you know, complex passwords, or three random words as your password, all the basic stuff to stop it being easy for people to get onto your devices, to stop being a victim of the easiest ways that people can get at you or get after you. But if somebody is utilising the most advanced capabilities against you, taking those kinds of actions isn’t going to protect you. And so, that’s why it’s not in the hands of an individual user to be able to prevent themselves being compromised by those kinds of capabilities. It’s about how nations and civil society and industry can work together to come up with a way to control the proliferation of these kinds of capabilities and do something about them at the other end of the spectrum, because an individual user can’t.
Joyce Hakmeh
Yeah. So, work on, like, minimising their existence in the first place and how they can be misused. Thank you very much. I want to go to you, Sara, just to help us understand, because Jonathan has been, you know, quite clear on the impact and what he’s worried about. And I talked a little bit about the human rights, sort of, aspect of this, right? And we’ve had high profile cases of people who have been targeted by these tools. Can you tell us a little bit more about that? What does the, sort of, landscape look like? Is it changing as more people are being targeted? Please share your insights.
Sara Al-Sherif
Thank you. Thank you so much, and yes, I wanted to add, like, when we talk about commercial spyware, we are talking about something so specific – it’s something, like, off the shelf, like software or a program. And the only way that anyone could afford it, it’s, like, usually state clients. So, most clients or users of this kind of technology are the governments and the states, and they import it and use it, like, under names like national security or against terrorists.
But, like, I want to mention a recent case – every iPhone user will be aware of it. Like, recently, if you have an iPhone, you will have received the notification, the security notification to update to, like, iOS 17, and this security update came because of the Egyptian Government, because they used, like what you said, a zero-day exploit in, like, iOS 16 to target one of the Politicians, who, like, had an ambition to be a candidate in the Presidential election.
So, the question is, like, yes, the Egyptian Government used this kind of technology against one of its civilians, against one of the Politicians, but this is, like, so costly. And to find, as Jonathan said, these kinds of exploits, like, to find this kind of zero-day, it costs millions and billions. So, how could the Government afford to buy this and to target someone with it? How did Egypt become the end user of this technology? Those are our most important questions.
And this is not the only case – this targeting has happened in Egypt many times. This is not the first case. In this case, like, they used Predator, by Intellexa, which is based in Ireland, and the company has, like, promoted itself as an ‘EU regulated company’. So, the question, again: an EU regulated company, and they promote themselves as EU regulated, so they, like, have carte blanche to export this technology anywhere, or to anyone, and, like, a claim of legitimacy for their exports.
Another thing – this is not the first case. We had it a few years ago with two other exiled opposition figures in Turkey. So, people like the concept of, like, having a safe haven in a different country and being safe. It’s not true nowadays, because you will be targeted if you are in opposition to any government or any institution, whoever you are, even if you are outside of your country. And we see this not only in authoritarian countries. When the Pegasus scandal happened, we discovered, like, with this report, that 22 European countries had already bought this kind of technology to use it.
So, it even compromises the safety of European countries and democratic countries, who are already, like, exporting this kind of technology – it’s coming back against them. We found that in 10 Downing Street, like, some of their devices were compromised with this kind of software. But again, even if a country or government is targeted with this kind of software, they have the capability and resources to protect themselves. But when you target civil society and individuals, they don’t have the resources to defend themselves or to protect themselves, or to have suitable solutions to secure themselves.
Joyce Hakmeh
I was actually just about to ask you this question, because, you know, we have now seen many cases documented, right, by Citizen Lab and a bunch of other organisations about what’s happening. So, do you think that human rights defenders are actually, you know, doing something about that, like, to protect themselves? You said they don’t have the resources, but I wonder whether they have been, sort of, using some other kinds of approaches to limit how much they can be…?
Sara Al-Sherif
I’ll stand with Jonathan on this, when he said, like, it’s not within the capability of individuals and civil society, because there is nothing they can do when they become the victims – they can’t protect themselves. Like, there is a kind of software and technology that is zero-click. So, whoever you are, you will be targeted if they want to. And the government, like, happily could pay millions to target their citizens. And, like, honestly, the point is, the problematic issue around the governments’ exports of this is, like, the relation between business and economics, which makes it so complex to divide them. So, like, sometimes the government is complicit with the business, exporting this kind of technology to countries with very troubled human rights records.
Joyce Hakmeh
Thank you very much.
Miriam Howe
Can I just jump in? I think it’s a really important and quite interesting distinction to talk about here, that the end users are states or non-state actors. Because I think that, you know, you have to decide which part of the problem you’re going to worry about today. And I think that, you know, there’s a lot of usage – particularly by non-state actors – that is clearly criminal and illegal, and there are no legitimate uses for it. So that becomes a law enforcement issue, as opposed to state usage, where, you know, there are legitimate uses and there are illegitimate uses, and that starts to become quite subjective. And it becomes a whole different set of, you know, conversations that involve, sort of, policy norms and government-to-government conversations.
So, I think it is really, like, important to, sort of, draw these distinctions between what’s clearly criminal and illegitimate, and those cases where we have to have a different, sort of, conversation about trying to define what we think is legitimate, what we think is irresponsible and why, and be really objective and explicit about that. And I think that one of the real values of some of the work by, you know, the Atlantic Council, and the stuff that’s come out from the NCSC and other governments, is that it has been really good at, sort of, being really explicit about why we think this is illegitimate. And I think that’s a very important part of the progression.
Joyce Hakmeh
Thank you. James, so we’ve heard about the capabilities, the impact, etc. Which regions, which countries are most involved in this? Like, is it across the board? We had, sort of, you know, Egypt. Is that, kind of, like, the trend, or is it everywhere?
Dr James Shires
Yeah. Let me pull down my map. No, it’s – as Jonathan said, right, there are 80 countries that they’ve noticed; there are likely to be more. So, this is a global problem. There are, however, clear centres of gravity to this problem. There are states who are particularly active in this area, and we can think about them in three different ways.
One is those state users who are particularly prone to human rights abuses, and these are the ones that you will have seen on the news and in the media, many of them in Latin America. For example, Mexican purchases of NSO Group software targeted Journalists, opposition Politicians, media and human rights defenders. This was connected to things like murders and arbitrary detention, as well. And there were multiple agencies in Mexico using this, right? So, this is not just a state issue or a single state user. There are different state agencies at play here, often competing between themselves.
Another good example is in the Middle East and in the Gulf. You have the terrible case of Jamal Khashoggi, who was murdered in the Saudi Consulate in Turkey, and many of his associates had this spyware on their devices. So, he was clearly being tracked as the case went along. So you have clear cases where certain countries are using them well beyond the pale, well beyond what is responsible or legitimate in international law.
You then have the other side of the coin, where there are countries who are producing the software, right? And these countries are not necessarily the same ones. In the Middle East you look across the Gulf to Israel, which, of course, is a democratic state with a very high-tech industry, and is producing many of these technologies for export around the world, notably in Africa, Latin America and the Middle East, as well. Other similarly democratic, similarly high-tech players are in India, where you have an extremely large and highly skilled IT industry, and a very small slice of that industry chooses to go down this far more problematic route.
And then, once you have the producers and you have the users, you have a, kind of, third country, somewhere in the middle, where they have permissive regulation, where you might just have companies able to set up there. They’re not necessarily looking to be based there or recruit people. They’re not interested in selling there, but it gives them the legal basis from which they can maybe sell into other markets.
Here you have Cyprus as being a very good example, companies using Cyprus as a base to sell into the EU. These are companies that also have complicated tax structures. NSO Group had a lot of different subsidiaries and some in tax havens in the Caribbean, as well. So, then you have these smaller countries as well. That’s, kind of, the global picture.
Joyce Hakmeh
Brilliant, and that’s very helpful, and, you know, this clearly illustrates how, you know, global the problem is, right? And so, now I would like to, sort of, shift gears and talk about the solutions. What can we do and what is being done? And I’ll stay with you, James. If you can just help us understand – I talked in my introductory remarks about, you know, there being some laws, there are export controls, there is, sort of, like, some non-binding guidance. Miriam talked about norms. What is out there, who’s doing it and how effective has it been?
Dr James Shires
Yeah. So, to start with, we could look at the industry itself and say, “What is the industry doing, what can they voluntarily take on?” And after multiple spyware scandals emerged, you do see some industry players begin to pay at least lip service to certain kinds of voluntary constraints, right? They put human rights conditions in their contracts. They might have misuse or abuse clauses where they will investigate cases that are reported in the media, and they will have ethics boards or ethics committees to scrutinise sales and ongoing issues in maybe more risky areas.
Now, I mean, a lot of this is frankly, window dressing, right? They are looking to make profit from these technologies. They don’t frankly care where they’re used or if they’re used in, really, human rights violating ways if they get their money at the end. But they do want to satisfy their state hosts, who might be more concerned about that and, of course, the wider public. So, they do this as part of a PR campaign. If that’s not going to work, we then have to move to the national level, right? This is voluntary regulation not working.
So, at the national level, you have a very long history of states trying to clamp down on the tools themselves through export control, so preventing companies selling these outside their borders. This is really difficult because it’s very hard to distinguish a particular tool for offensive purposes, for hacking into something, from something that could be used to protect networks or defend devices. And you don’t want to stop people taking legitimate code on a laptop from one country to another if they’re part of the same company, for example. So, things like the Wassenaar Arrangement, a multilateral agreement that regulates not only cyber exports but also many other weapons exports, are really controversial in this space, and they get a lot of pushback from legitimate cybersecurity companies.
So, you can look elsewhere, right? As Miriam said, this is a very complex ecosystem. It’s not just the tools themselves, you could try and sanction the companies. The US has done this relatively successfully with NSO Group, and it’s also a way you could put more pressure on the legal entities themselves. You could also look at the people doing this because, actually, you know, there’s not a massive pool of people who are both skilled enough and willing to go down this path, and they’re in high demand.
So, in the classic case of the UAE and what’s called Project Raven, you had former US Government intelligence employees going to the UAE and helping them set up their equivalent intelligence and spyware systems. This probably would not have been possible without that help. So, maybe if the US Government had put more constraints on its own employees, stopping them from going to these other countries, taking greater salaries and helping them, then we might not have had as bad a problem in the first place.
And then finally, you look at the global level and you say, what can civil society and industry do to counter this, right? One of the interesting things about NSO Group is that it’s very much put big tech on the other side of the fence. You have lawsuits from WhatsApp, you have Microsoft really pushing heavily on it. You have Google TAG – the Google Threat Analysis Group – seeing spyware as one of its main threats. So, big tech are really saying, “We don’t like spyware. We’re going to try and combat it.” You have academic institutions like Citizen Lab, media voluntary associations, things like the reporting around the Pegasus scandal. And these are crucial to raising public awareness and pointing to solutions. I’ll stop there.
Joyce Hakmeh
Brilliant, great, that’s very helpful, James. I want to go back to you, Jonathan, to talk about what the UK is doing. So, James talked about, you know, a number of solutions – you know, targeting the tools or the people or the companies, etc. What is the UK doing at the national level and perhaps with allies? And I just wanted to note here something that I, kind of, found interesting in the NCSC assessment, where you say, “Oversight of the commercial intrusion cyber sector will almost certainly lack international consensus, be difficult to enforce and subject to political and commercial influence. However, it is likely that many commercial cyber companies will be incentivised to vet and limit their customer bases, should effective international oversight on the development and sale of commercial cyber capability emerge.”
So that’s quite an interesting, sort of, like, position to take. So, if you can just tell us a little bit more how you’re thinking about it, and importantly, how will your thinking adapt to how the landscape is evolving?
Jonathan Ellison OBE
Yeah. I mean, I think I agree very much with much of what James has just said, that there’s no silver bullet. There’s no one thing that will really fix this problem. There’s no quick solution to it either – it’s a global problem and it needs, you know, a set of global answers to be able to resolve it. James talked a bit about export control, and I agree with the limitations of that. I mean, traditionally, export control is about something physical to see, something that you can, kind of, examine at a border in terms of it being exported.
Obviously, you can transfer software and lines of code without that kind of inspection. And whether or not a state has the export control laws in place, obviously, it’s fairly easy to circumvent them when you’re talking about code and you’re talking about something that’s digital. So, export control is one element of this answer, but absolutely is not going to resolve this or solve this.
And I really agree with the point – ‘cause I’m just adding to the complexity – about, kind of, not making it more difficult to defend ourselves. So, I think there is something really important about vulnerability research, about, you know, penetration testing, about the kinds of techniques and tools and skills and companies that really help a nation to defend itself. And it’s really important that the solutions we find, and the way in which we try and, you know, regulate this or work internationally, don’t make it harder for us to defend ourselves against the very thing that we’re trying to protect against. So, that kind of complexity is the, kind of, background for it – really difficult.
In terms of international work, at the Summit for Democracy, early this year, at the very start of 2023, 11 nations signed up to a joint statement. That was trying to shift the conversation towards the kind of behaviours expected, the free, open, peaceful, secure cyberspace. And that talked a bit about industry and government and civil society needing to work together on this. So, trying to move that conversation away from export and away from export control, towards how international partners work to create those kinds of norms of what is and isn’t acceptable. Which, kind of, picks up on that point from the assessment around norms of behaviour being a really important component of this.
Then the UK worked with the French earlier this year, I think at the UK-French Summit, on how we could work together globally to, kind of, lead some of this conversation. And the Paris Peace Forum in November will bring this topic to life again with a much wider stakeholder group. The UK will host a multi-stakeholder event in the first quarter of next year to discuss that broader coalition, because it really does need to be a broader coalition in terms of that collective action. So, what is domestic best practice? What are the concrete actions that one can take? How do we agree what responsible behaviour is? So, what does responsible use look like, what do responsible controls on industry within the different territories, within the jurisdictions, look like, and how do we work together?
The final point I just wanted to pick up on, James mentioned the, kind of, West Coast reaction to some of this. There is a really important thing here about secure software and secure by design. So, one of the ways in which you tackle this problem is by making vulnerabilities more scarce. The more you can crack down on the number of vulnerabilities in software in the first place, the harder it is to find the zero-day, the harder it is to find something that you can then go off and exploit. So, I think the, kind of, long-term answer to this has to be around secure software design and how we work collectively together with industry.
And this is very much the secure by design principles that ourselves, CISA, our American equivalents, and others have worked on for a number of years. How we embed those as we build the next generation of platforms and software, so that we minimise the problem to the extent we can, and can then use some of these other solutions around, you know, domestic controls, around the companies, around norms of behaviour, around responsible use, to, kind of, try and put as much of a control as you can over this, kind of, issue.
Joyce Hakmeh
Thank you very much. What you’re describing is, obviously, as is often the case with security issues, a journey, right, that requires consistent effort over a long period of time. So, there’s that, kind of, mind shift that you described and how we can, sort of, move towards that normative framing. How do you, sort of, square that with the urgency that you talked about at the beginning? Because we need to, kind of, solve this before it gets to the point where it affects our national security.
Jonathan Ellison OBE
Absolutely. I mean, there are no quick wins. If there were, we would be very much lobbying for them. So, I think convening, you know, the UK trying to take a global leadership role in bringing parties together to talk about this early next year, building on the set of conversations we had this week, and keeping it on the agenda for the Summit for Democracy and others, is the only way in which we’re going to get that, kind of, global consensus. So, would I like us to do this more quickly? Absolutely. I think it’s quite difficult to see what else and how else we can act even more quickly than we are doing at the moment, but I’m very open to suggestions and ideas.
Miriam Howe
Well, I mean, I think that actually that is the answer. I think we have to recognise that this is a continuum. You know, something that has been around for a while is, sort of, suddenly growing. And I think it’s really important to recognise the continuous activity that’s been going on for a while and recognise that these efforts are still relevant. So, for example, the secure by design piece, irrespective of this particular debate, has been something that has been, kind of, growing in urgency for quite some time, to, sort of, counter the situation of essentially insecure infrastructure that has evolved. So, I think it’s actually important to recognise a lot of these threads. A lot of the solutions are continuous, and it’s not about a cliff edge, it’s actually about, sort of, surging activity that’s been going on for a while.
I mean, I think that, you know, talking about the role of industry, and, you know, obviously, this is a really good whole-of-nation, sort of, activity to be thinking about, the role of big tech in this is very interesting. You know, one of the things that I think works is the whole threat intelligence picture. So, you know, you saw Microsoft and Google TAG working with Citizen Lab and really doing the investigations and exposures that led to, essentially, some companies folding because of, you know, the, sort of, incrimination from that.
So, we have a threat intelligence team. You know, we don’t follow all of it. We particularly follow things like cracked versions of Cobalt Strike. We particularly follow things like hackers for hire, and I think having teams that are really looking at the detail matters, because it is a lot of detail that creates the picture. And it picks up on one of the things that James was saying, that it’s a bit of a moving picture. So, particularly with hackers for hire, you get a certain amount of pressure and a certain amount of exposure, and you know what? The website disappears and, you know, some of the people end up in a different company. But it’s the detailed threat intelligence that is tracking the TTPs, the people that are creating those maps and saying, “Do you know what? That organisation over there is, sort of, a reappearance of the organisation that disappeared over there.” So that detailed tracking of what’s happening, as well as the investigations, is, I think, a really important part of what will work.
And I think it works in a couple of different ways. One is that it imposes costs, you know, and once you surface things, once you surface TTPs and capabilities, ultimately that can end up burning them, and so they have to go and develop new capabilities, and that makes the job harder. So that’s a win. The second part of it is actually the naming and shaming. It does have an effect. It’s not a silver bullet, but you can see the behaviours of both buyers and sellers reacting to attributions and exposures, which mean that, for example, things like Pegasus are a bit more of a risky investment than they were before. So, it doesn’t change everything, but actually, the naming and shaming and the highlighting and the exposés do have the normative effect that we’d been looking for, as well as imposing costs by making the job a bit more difficult. So, every one of these things is not a silver bullet, but they all contribute to trying to control the space a little bit.
Joyce Hakmeh
And help us understand the scope of the problem, right?
Miriam Howe
Exactly.
Joyce Hakmeh
Yeah.
Miriam Howe
And bring that clarity that we need to what is actually going on here.
Joyce Hakmeh
Yeah, yeah.
Miriam Howe
And are we talking about tools and markets? Are we talking about services and trying to identify which bits of those we think are illegitimate and which bits of it we need to worry about?
Joyce Hakmeh
Great. I will go to Sara now. I had another question, but just in the interests of time, I’ll go to Sara. So, we heard the solutions, right, and what should happen. What is your take on what’s happening at the moment in terms of solutions? Do they work from your perspective?
Sara Al-Sherif
I’ll borrow a quotation from the UN’s former Special Rapporteur for Freedom of Expression, who said of the structures to control the use of surveillance technology, “It’s not broken. It hardly exists.” And we had a, kind of, wake-up call after Pegasus, and we still have power when we have a government willing to tackle this kind of issue. Like, the company behind Pegasus is on the verge of bankruptcy. So, most of the time the problem is a lack of willingness to do this stuff, because of the complex relationship between business and the political situation.
And while I was happy to see the statement of 11 countries saying they are against commercial spyware, we need to see this in numbers. We need to see how many licences these countries have revoked for marketing and using this technology, and whether they reconsider their partnerships with companies and businesses who are selling this stuff and abusing it.
And while we are talking about cybersecurity by design, we need to see strong and binding rules to prevent human rights violations from the use of this kind of technology. And while we say there is a legitimate use for this kind of surveillance technology, international law says it should be narrow and tied to specific, tailored incidents, used only when all other options have been exhausted, because it affects people’s rights. It’s not only individuals that are affected. Sometimes you’re applying mass surveillance over a very big population, exploiting people and violating their rights to privacy, freedom of expression, association and much more.
And one of the things I want to highlight is how complex it is to identify the companies, their construction and their exports, and to follow them, especially for Researchers who are trying to document violations of human rights, because of the secrecy of the companies’ structures, sometimes using the loopholes in EU, US or UK regulations. So, sometimes you find a company operating in the EU, but they have a [inaudible – 42:22] company in a different state, so they get around the human rights due diligence reports.
So, I think if countries or governments were more willing to look into these kinds of steps, we could see progress, like what we saw with Pegasus. Because after all that work, there were public recommendations, and they said they would put some laws or legislation in place, but we didn’t see it afterwards. It was there in that moment, but we didn’t see it. So, I think we need binding rules. And as Jonathan said about voluntary agreements, they have their limitations too, because they are voluntary in the end. There is no obligation for any country to be bound by these rules.
Joyce Hakmeh
One of the things we didn’t talk about is the UN Cybercrime Treaty that is being negotiated, and the extent to which this binding instrument, if it comes to light, could help address some of the problems. But just before I turn to the floor, a question to you, James. I think it was the same Special Rapporteur that called for a moratorium, is that the word, on all these spy tools. Do you see a world where this could happen?
Dr James Shires
No. So, the short answer is no, because although the Special Rapporteur said, you know, “This is out of control, right? They – we need to have a moratorium. We need to stop this happening because the rules in place are not sufficient, right? We can’t guarantee it’s going to happen in a human rights compliant way, in an international law compliant way. So it should not happen at all.”
That doesn’t recognise the economic reality of the situation, which is that companies are incentivised to operate in this sphere because they make money. And they make money because, especially at the high end, they sell capabilities to states who want them, right? And this is where we have to step back a bit and say, “We’re talking about one tool,” ‘cause spyware is usually mobile device spyware, which is an intelligence-gathering tool targeting individuals, and it finds out whatever’s on your phone.
But this is not used as a single tool by states. It’s used as part of a broader intelligence apparatus, right, both physical, in-person, but also digital, as well. And so states are trying to upgrade their intelligence capabilities to what they see as the peak of the market, right? They look at states like the UK and think, “We want some of that.” And the way they see they can do it is by buying into it rather than developing it, necessarily, themselves. So that’s the reality of the high-end part of this market and why it exists, and that’s why a moratorium is unlikely to happen as long as there is an interest from states in owning and having access to these capabilities.
Joyce Hakmeh
And also, of course, there’s this point that technology itself is not bad or good, it’s how you use it, right? And we’ve heard repeatedly about all the legitimate uses that these tools can serve.
Sara Al-Sherif
Yeah, I want to add something to what James said. If, with these limitations, we push countries to use hackers for hire instead of buying commercial spyware, that is a success, because the capability of hackers for hire is less than that of commercial spyware. Commercial spyware is easier to use and more capable. So, if you narrow the space for selling this kind of technology to authoritarian countries, or to any country without doing human rights due diligence or where there is abuse, and push them to use hackers for hire, that is a kind of success.
Joyce Hakmeh
Okay, turn to the floor. So, we have the questions, please. Start with you first. If you can quickly introduce yourself and the question.
Dr Missy Nadeau
Of course, yeah, Dr Missy Nadeau, member of Chatham House, contractor with the US Military and NHS High Security Service. When I hear you talk about secure design, I have to think of the film Jurassic Park, where they were so sure they could contain the dinosaurs. What I want to throw out to the panel is AI and how will AI complicate the landscape of managing commercial spyware?
Joyce Hakmeh
Yeah, we’ll take another one and maybe we’ll take one – well, no, let’s do just another one and then we’ll go to the online questions.
Hugo Barker
Thank you. Hugo Barker, Chatham House member. I think it was an interesting discussion, and there may be an elephant in the room that we haven’t discussed, in that what is the difference between commercial spyware and states’ spyware if states are buying commercial spyware? Would it be better if a state developed it by itself, and does that then bring into question the entire idea of spyware in the first place? Is it right for states to have this capability? Should we be monitoring all spyware generally? Why do we need this divide between commercial and state-level spyware? Thank you.
Joyce Hakmeh
Okay. Can I start with you, Jonathan, and then maybe go to the panel?
Jonathan Ellison OBE
Yeah.
Joyce Hakmeh
So we had the question.
Jonathan Ellison OBE
Pick up the AI one first.
Joyce Hakmeh
Yeah. On this one and yeah, the AI, thank you.
Jonathan Ellison OBE
This is one of the first meetings that I’ve been to for a while where AI hadn’t come up in the first, like, 46 minutes. So, there’s a timer as to when AI appears in the conversation. I mean, in terms of what does the current generation of AI mean? It essentially, again, continues to lower the barrier to entry, right, and it makes things easier, because it enables things to propagate in a way that they didn’t before. It enables more people to get access to build code or build malware or whatever it is that does the bad things. But also, I think the thing that I’m, kind of, worrying about is some of the companies that are building these kinds of capabilities starting to utilise AI as a way in which they can make their attacks more effective in the first place.
So, there’s something about proliferation in the first place, right? We talked about that spectrum with criminal activity at one end and states at the other end. You know, AI, or generative AI certainly, enables, or likely enables, one end of the market to get better access to capabilities they otherwise wouldn’t have got. So, therefore, it adds to the proliferation conversation. But the companies that are building these kinds of things will start to look at AI as a mechanism by which they can make their tools more effective and at lower cost. Because if you can create brilliant phishing emails through AI tools, then you might not need the really expensive zero-days or accesses or whatever in order to deliver your tool in the first place. So, fundamentally, it’s going to make the problem even more difficult and even more complicated.
Miriam Howe
Yes. Just to add to that, I was actually chatting to some of our team about the opportunities for using AI to enable detection, to improve the benefits of detection and reduce the time to detect. So, obviously, AI, not specifically just for this, but for cybersecurity in general, does offer benefits in detection.
I think just building on Jonathan’s point about it also being an aid to the attackers, and, you know, you probably see this already. I think one of the interesting, sort of, nexus areas here is that we’re trying to make vendors of cyber capabilities more responsible. That includes predictability, you know, contained effects, you know, precision, etc., etc. If vendors aren’t concerned about those principles and then you chuck AI into the mix, which, if you’re not really careful about how you use it, could potentially have quite unpredictable effects in itself, it really compounds the lack of precision, the collateral damage that we see from quite irresponsible sale of those tools. So, I think that’s quite worrying in itself. I probably hadn’t put those two things together until you mentioned it, so thanks.
I think just going back to your point about why we worry about whether it’s commercial or not, I think that’s a really, really good point, because actually some of the tools that we’re worrying about aren’t sold. They don’t have licences, they’re openly available, they’re open source, that sort of thing. But I think one of the reasons it’s interesting is because if it’s, sort of, state developed, then you don’t have to worry about a certain set of levers to do with the sale and responsible selling behaviour. You get much more into, “Well, if it’s state developed, then responsible sale isn’t a thing. The thing you have to focus on, therefore, is what is responsible usage?” So, it, sort of, divides up the problem into which conversation you need to have with which, sort of, set of entities. So, I think it’s a useful distinction.
Sara Al-Sherif
I wanted to answer the question about why commercial spyware, not state spyware. First, if a state develops its own surveillance technology or spyware, it’s far, far more expensive, and not every country has the capability to build this kind of tailored, sophisticated, targeted spyware. But I agree with your point. Yes, we have to take a step back and say this kind of surveillance technology should be limited overall. Whether it comes from a commercial vendor, a state, or hackers for hire, it should be used only in very, very narrow cases. And there should be oversight, legislation and a system in place to control it. Because it harms everyone, and even the country exporting this kind of technology finds it comes back after them. We can see it in espionage cases, where the technology a country developed was used against it in many cases.
Dr James Shires
Yeah, I completely agree, and just on that point, this is where the mercenaries analogy is useful, right? There are a lot of problems with it, and I wouldn’t want to call these companies mercenaries in most senses, it’s too close to conflict or war, but here’s where it works well. It’s because we want to regulate mercenaries in a different way to how we want to regulate states, right? The idea of private military companies being involved in conflict creates a whole different set of dynamics and incentives for why conflict might happen or perpetuate, and this is exactly the same for commercial spyware. So, there are different dynamics at play, because these companies have a profit incentive as opposed to a national security incentive, however broadly or inappropriately defined.
Joyce Hakmeh
Jonathan?
Jonathan Ellison OBE
And you just said it at the end “the economic incentive.” I think that’s the fundamental difference here, is that it’s market driven, right? That is what these capabilities are. They’re going to people who are willing to pay. States don’t have that same driver, right? And so, this becomes a market problem about how do you interfere with the markets and change the incentives and change the levers? As opposed to the state problem, which is about responsible use and all the things, Miriam, that you were just talking about.
Joyce Hakmeh
Great. So, I have a couple of questions online and probably we’ll have time for more questions from the audience. So, the first question is about Pegasus and whether “Pegasus and similar tools are being used for insider trading?” And we have another question on whether there are any “legal means of redress in the UK or elsewhere for individuals who fear their safety or human rights are at risk from spyware operators?” James, would you like to go first?
Dr James Shires
Sure. I think that first question is great, right? This is the B2B version of the spyware problem, right? It’s companies selling spyware to other companies. Now, commercial espionage is not new, it’s happened for a long time, it’s still happening, it’s happening in the digital era, it’s happening with spyware, right? So, this is a problem that we haven’t really seen yet, because it hasn’t had the human rights consequences openly so far, but it will be there. And often, the way that I’ve looked at it in my research is that you actually have chains of companies operating on behalf of either state or political actors who then also have this B2B dynamic. So, law firms operating on behalf of reputation management companies working for Politicians might then say, “We want to know more about this person who’s leaked this information.” They might then go to a spyware company, right? So, there’s a whole chain of actors, not just the industry themselves, who are also commercial, who also have their own market, as well. So that’s where I think the insider trading point comes in, because it’s much more about B2B rather than business to state.
In terms of legal redress, yes. So, there are national laws, and many of these actions violate international law, as well. There are legal challenges and lawsuits in the UK. There have been some in Israel, as well. I mentioned the commercial ones in the US, and some of these are going to the European Court of Human Rights, as well. And there are also investigations by the European Parliament. So, there are legal challenges, but they are slow and they are complicated. They do not provide redress in the timescales in which commercial spyware works.
Joyce Hakmeh
Thank you. Anything you’d like to add?
Sara Al-Sherif
I wanted to add, actually, on the second question, about how the UK thinks around this. I wanted to talk about exports, because part of the problem is the lack of legislation on exports. We have dual-use export controls, but they don’t cover everything. With communications interception, if it’s for mobiles or phones, it needs a licence or approval, but if it’s for networks, it doesn’t. And if companies want to export facial recognition technology, which has a lot of flaws, there is no need for any licence for this kind of stuff. So, we have a very big grey area here, and this technology needs to be covered.
Miriam Howe
Just briefly on James’s point around the supply chains. It’s really, really interesting when you look at, sort of, where the end customer is, and then you have a Lawyer, and then you have a Private Investigator, and then you have a hacking company, and then you have your victims. So, I think that only really becomes clear once you really, really, kind of, research and map that out, and that’s quite labour intensive. But one of the things that comes hand-in-hand with promoting responsible behaviour is, sort of, reducing that deniability and increasing the transparency. And, sort of, the behaviours that would be discussed would be things like knowing your seller and knowing your buyer, and it’s not okay to not know, it’s not okay to not check. Particularly if there are, sort of, laws and, you know, criminalisation of some of the activities and usages, which any part of that chain is expected to follow.
Jonathan Ellison OBE
I don’t think I’ve got anything to add to that.
Joyce Hakmeh
Great. Any more questions from the floor? Again? Okay.
Hugo Barker
Sorry, another question, a different question. I don’t know how much everybody’s aware of the Online Safety Bill in the UK and this, kind of, talk about getting rid of end-to-end encryption and the issues around that, and particularly, the protection of political dissidents and the issues that can come with that. Are we not allowing, kind of, state-mandated spyware to be created to allow for these, kind of, policies to be enacted?
Dr James Shires
So, I can start. So, there’s a thing called client-side scanning. But to step back a bit, right, these are parts of the same debate, right? States want access to intelligence or information, right? They either get it through interception of communications, which is very difficult, if not impossible, with encryption and end-to-end encryption, or they get it through access to devices. And this circumvents the encryption, because you get the stored communications rather than those in transit.
So, if you’re an intelligence agency or a law enforcement agency, you’re balancing out these two options for gaining intelligence in terms of their legal permissibility, their resource cost, and whether you can buy them from providers. And so, you know, if you lean more heavily as a state on the provision of encrypted services, you probably move the balance in favour of using more targeted solutions instead, right? If you don’t allow encryption at all, then you might have many more routes for intercept and you might not need these kinds of targeted solutions. So, they’re part of the same debate.
To go to the second point, on whether companies are mandated to do this thing called client-side scanning, i.e., put particular bits of code on devices to detect things like child sexual abuse material or terrorist images, there’s a big debate, right? One half of the debate will say, “That is spyware,” and the other half will say, “No, that’s a legitimate, privacy-protecting way to follow laws.” And the answer is going to be somewhere in between, because it doesn’t take the data, it doesn’t give read access to law enforcement or anything, it just raises a flag. But how it does that is really controversial and a live issue at the moment.
Sara Al-Sherif
And I think that question is so good, because you touched on something everyone has been debating, and it’s going to affect everyone, because it concerns the encryption of our messages and communications. And again, when we’re talking about solutions to commercial spyware, we should talk about how much the government is willing to do. Because we were on the verge, again, of forcing the tech companies to install spyware on users’ devices to scan their messages. And the way to do this is so controversial, because with the high fines over the tech companies if they don’t operate according to the law, the easier way is to program the AI to just filter everything.
And this is, again, another problem. We have a problem, and instead of having a solution, we create a mess around it in trying to solve it. So, you’re going to try to program the AI to filter the photos, and it comes with lots of flaws and bias. It’s going to block content that isn’t violating anything and allow the content you already wanted to block. So, technology is not the absolute solution for everything. If the Online Safety Bill is about the exploitation of children and use of the internet, it needs collective solutions.
Joyce Hakmeh
Right, and I think that’s a good point to end this conversation on, about the, sort of, multitude of solutions that we need, and that it’s not always about one solution as a silver bullet. This has been a fascinating discussion. I hope you enjoyed it. I think we covered a lot of ground around the, kind of, scale of the problem, but also the solutions. Some optimism there that there are already things that are working, and some things that aren’t. We need to be more ambitious, we need to avoid double standards and be clear on what we are trying to achieve, respect this ‘no harm’ principle, and think about security by design and, kind of, continue to work on that.
So, thank you very much for joining us tonight. Thank you to the audience online and in-person, and just to say that Chatham House is doing work on this, so watch this space, as this won’t be the last time we talk about this issue. So, thank you very much, and please join me in thanking our panel [applause].