Dr Robin Niblett CMG
Ladies and gentlemen, welcome to Chatham House. I’m Robin Niblett, the Director of the Institute. I know many of you here, but not all of you. Delighted that you could join us today on this quiet news day in the UK, to talk, which we love doing in the UK, about things that are global and international. And I’ve got a great group with us today to tackle our news of the day, Technology and Diplomacy in the Digital Age, and it’s a subject that is very dear to our heart. Chatham House has been actively involved in technology-related issues, through all of its areas of research, for a number of years. But we’ve tried to, sort of, capture the fact that technology is so interwoven with all aspects of policy, through the Digital Society Initiative that we set up about a year and a half ago, formally run by Marjorie Buchser, my colleague who’s at the front. And we’ve done a whole series of technology-related events, across the whole spectrum of dimensions of policy that it affects, but I’m not sure we’ve done one specifically on technology and diplomacy.
And we wanted to take this opportunity, not least because I think I can say my friend, Casper Klynge, is here, who was kind enough to host me in Silicon Valley, I don’t know quite where we were in your residence. It was a very nice day we had, that I do remember, of which the Valley, and Silicon Valley, has plenty. But Casper is one of the first Technology Ambassadors in the world, and therefore responsible for that new area of diplomacy that many people now call TechPlomacy, I think I’ll get that right, and it’s a fascinating evolution that the Danish Government have undertaken, where he, as an Ambassador, is based in Silicon Valley, but with staff reporting to him in Copenhagen and also reporting to him in Beijing, I believe. Yeah, so I don’t know if you’ve got any other offices as well. You’ll say more about that maybe in a minute, but I think it really captures the way in which some of the smaller states, by population size, are really seizing the opportunity of technology in their diplomacy, and in the way it’s changing diplomacy. And we’ve been looking for an opportunity to have him here at Chatham House, for one of our public members’ events, and having lured him here, we thought it would be good to get a group together.
And so I’m thrilled that Liane Saunders, who’s the Director of Strategy at the Foreign Office and somebody who’s therefore intimately involved in and knowledgeable about all of the dimensions of British strategic approaches to its foreign policy, including digital diplomacy and digital relations, has also been able to join us in this quiet period that you have at the moment. And what’s good is, Liane has worked in a lot of the traditional areas of British diplomacy: counterproliferation, managing crisis centres in the Middle East, so you’ve been very much at the coalface of what it is to have to be a Diplomat in the digital age. And therefore, we thought it would be fantastic to have her join us as well for this panel.
Ambassador Tiina Intelmann, who is the Ambassador of Estonia, here in London. Estonia always gets rolled out for these types of events, as she will know extremely well, because under Toomas Ilves, the former President, Estonia has been one of the pioneers of integrating technology into domestic politics and how you run the country entirely. And we thought it would be particularly interesting, therefore, to have you here, for this panel, to look at how the international dimension, the diplomacy, connects as well, and how Estonia is thinking about that dimension.
And then, Ashish Jaiman, who’s a technology innovator now with Microsoft, where he leads a number of very interesting initiatives. The one that’s definitely got him on this panel is Director of Technology and Operations for the Defending Democracy Program, and the mere fact that a technology company as ubiquitous, large and successful as Microsoft would have somebody in your role, Ashish, tells us a lot. And we thought, obviously, having business and those developing the technology very much at the table would be important, ‘cause otherwise, as the saying goes, you would be on the menu. So, it’s better to have you here as well. We really look forward to your viewpoints and how companies are trying to tackle this dimension.
So, this meeting’s on the record. We’ll have a conversation. We’ll open it up for comments and questions from all of you here. Thank you for joining us today. And I suppose the simple thing, Casper, is to start with you, and I did want to say just one extra element. We have an essay competition, actually, that we’re doing, aren’t we, on being Technology Ambassador for a day, as part of our World Today Essay Competition this year for students. The deadline is the 20th of December, so if any of our student members are here who haven’t thought about applying for that, the information is all on our website under our World Today page. But I suppose that gives me an entry question to you, which is, you know, what does it take to be a Technology Ambassador? Why did your government do this? What is it that you think you bring, differently, by being based the way you are there, not as an Ambassador to the country, but as somebody responsible for this whole area outside your country, from there? And just tell us a little bit about that to start with.
Casper Klynge
Yeah, well, thanks very much for organising this, to all of you, and can I disrupt it a little bit by asking the audience a question? Because I think, could you raise your hands if you’re more concerned about technology today than you were two years ago?
Dr Robin Niblett CMG
Anyone who’s got an arm you can put it up, really.
Casper Klynge
And, you know, we had an event in Copenhagen a few weeks ago, actually, with the President of Microsoft at the IT University, where they’re educating Software Engineers and Programmers, and I did the same, sort of, poll. And I think, basically, everybody raised their hands, and you’re reminded that these are the students that will probably, you know, want to join a Microsoft, a Google, or a Facebook in the next couple of years. And I think this shows that the debate about technology is different to what it was a couple of years ago. I would probably argue that it’s more mature, less naïve, and very much one of the reasons why it is so is because of some of the scandals we’ve seen, Cambridge Analytica, leaks of personal data, etc. And I’m sure if you go on the streets of London or in Copenhagen or even in Palo Alto, and ask people, so, what has the biggest impact on your life? Is it what happens on Instagram or on Snapchat or is it what happens in country X or country Y?
A lot of people, perhaps some of them a little bit younger than we are, would probably answer, “Well, it’s actually what happens on some of the social media platforms.” So, why did we do this a couple of years ago? Well, we did it for that exact reason that, you know, if we look at where the impact on the Danish society, or Europe, or our citizens, where’s that coming from today? It’s also coming from technology and I would probably argue, it’s very much coming from technology, and the big technology companies are playing a significant role, not replacing nation states, but becoming very powerful, and not only from a financial point of view, but in fact, also, trying to influence policies, regulations. If you look at Microsoft, you know, actually pushing countries to look at a Digital Geneva Convention, so basically fulfilling a role, which is very similar to what nation states did in the past.
And then you can say is that a good thing or a bad thing? Well, I don’t think that’s the interesting question. I think the interesting question is, how do we relate to that as a country? And we decided that the best way of trying to influence those companies, or influence where technology’s heading, is also to create a diplomatic posting that focuses entirely on technology and the big technology companies. Now, we also work with governments, that goes without saying, but part of what we do is really directed at the big technology companies. And it’s both trying to, sort of, gather information to inform our decision-makers on what we do internationally, how do we regulate, how do we create tech policies, how do we influence, let’s say, content moderation by the big technology companies? But it’s also to push out views on behalf of the Danish Government, so, in that sense, I think my job is no different from that of most embassies or most Ambassadors all over the world.
And the only thing I would say, and then I’ll stop talking, how has it been? And I thought it was the Chatham House Rule, but apparently, this is public, but I can still say, it’s been a very sobering experience. And I normally make a very inappropriate joke by saying that I was working in Afghanistan in Helmand province, together with the Brits, a couple of years ago, and I think doing business with the insurgency and with the Taliban was a pretty good preparation for dealing with some of the big technology companies. This is a joke, just to make it perfectly clear. Although they dress more or less the same, in these circumstances, they don’t like foreign intervention either, let me put that out there as well.
No, it’s been a sobering experience in the fact that I think we are standing at a crossroads, where we have to demand from the big technology companies that they take societal responsibility. No technology is neutral, and these companies, because of the power they wield, will have to engage with government, civil society, media, and I think, by the way, their employees are going to hold them to account even more in the future. And I think that is why we need, you know, diplomacy and why we need more international collaboration also in the digital age.
Dr Robin Niblett CMG
But you said, and just one follow-up point here, you made a very interesting comment that digital companies are more like nation states. You needed to be, therefore, in the place where these companies playing that role emerge. But even in these last two years, and certainly in the last year, and you yourself mentioned this Digital Geneva Convention, surely these companies are evolving incredibly rapidly into being like any other company. And by that I mean where they’re now building their offices and building them up is in Washington, or inside the Beltway, because in the end, they are discovering surely that they can’t change policy. They may be asking for policy, like many fossil fuel companies are asking for more clarity on a carbon tax. But are they really that different, as companies? Or is it just that at the moment, the speed at which they’ve emerged, and the fact that regulation hasn’t been able to keep up, mean that if we were to be here in two years’ time or three years’ time, actually, running digital regulation, digital diplomacy, is going to have to be done in Washington or Brussels or New York or UN organisations or Beijing, or wherever? Is this like an early outpost because government has not kept up, but you expect those companies to, sort of, slip into a similar profile as other very large multinational companies? Or is there something fundamentally different about this business that means Denmark will still have an Embassy, you know, in Silicon Valley, in two years’ time, three years’ time?
Casper Klynge
Yeah. Well, first of all, we’re not replacing traditional diplomacy. I think that’s important to say. We’re supplementing it by focusing on a specific area. And I think what you’re basically saying, Robin, is, of course, at the end of the day, you know, policy reigns supreme. You know, it’s still important what governments think.
Dr Robin Niblett CMG
And governments do it.
Casper Klynge
Yeah, we’re trying to do it. I’m sure we’re going to talk a little bit about the pace of technology development and whether we’ll be able, as governments or international organisations, to keep up with that pace. But you know, the question you asked, are the multinational companies: Microsoft, Google, Alibaba, etc., are they different than, you know, the ExxonMobils or the Shells of the last couple of decades? And I think they are, in one way, because if you look at a company like Google, you know, what is Google? Is that an ads company? Is it a search engine? Do they do data-driven healthcare, autonomous vehicles? And the answer is, of course, all of the above, and that is where I think these companies are different. They transcend everything, the cross-fertilisation between the different business areas is enormous and of course, the impact on our societies is, for that reason, different than I think some of the older companies were. And that is why I think we need to treat them not as nation states, but as an entity with an impact on global affairs, on geopolitical affairs, and we have to, again, hold them to account, have a dialogue with them, a loving dialogue, but also sometimes a critical dialogue.
Dr Robin Niblett CMG
That’s very helpful. I think that helps capture that moment. As you said, it’s not just the timing of the last two years, but this whole of society, whole of economy, whole of life impact that they’re having.
Liane, let me turn to you. The British Government, I think, has been quite forward leaning in the digital space. I remember there was a Cybersecurity Conference held here, gosh, six or seven years ago. We had Chinese, Indians, countries from all over the world there. Where does – you know, is digital, for the UK Government, more about how you’re going to be doing diplomacy, or is it as much a subject in itself, in this kind of whole-of-society way that we heard Casper describe just now?
Dr Liane Saunders
So, I think it’s not just digital. So, in a way, digital is the old world, you know, and we’re already through digital, and technology and emerging technology has to be the, kind of, new world. And for all of the reasons that Casper has said, the connectivity that you have, you know, whether it’s through companies or through states’ use of – and sort of different sectors’ use of technology, means that it is a, sort of, all-pervading aspect of our lives. And just like every other all-pervading aspect of our life at a systems level, that’s the sort of thing that Diplomats get involved in, particularly when they’re engaging in, you know, kind of, who sets the rules, what those rules are, and how that enables citizens to go about their lives, feeling safe and secure in what they’re doing. And to be able to develop the, sort of, prosperity and the sustainable development of the world, you know, all of those will be tech enabled in some way.
Now, they may be tech enabled in good ways, bad ways; as Casper said, in a way, you have to be, sort of, agnostic when you approach this. I think from the British perspective, I mean, one of the things that, you know, we, sort of, have become aware of is that, in one of my old domains of counterproliferation, it was very easy to distinguish between what was dual use – so, something that could be used for, sort of, military or military power type aspects, in security-related ways, by both legitimate and non-legitimate actors – and something that was, you know, able to be, kind of, anodyne and therefore we didn’t need to worry about it.
The point about many of the emerging technologies is that they are all multiuse, and I think, you know, being aware of that, is really where we come into the picture. Because I think, as Casper said, you know, too often we, sort of, treat these domains as separate, we treat it as a commercial domain, and we treat, you know, what we’re doing in government as the government domain. But actually, we need to be able to have a conversation across those domains.
Now we can’t start that conversation at the point where we realise that something needs to be regulated. We need to start that conversation, right back as companies are developing technologies, as they’re innovating. Not to stop or suppress that innovation, but really to shape it and make sure that when it’s being developed, the companies are really thinking about, in that societal way, the potential aspects and the potential applications that that technology may have. And I think that’s important for a range of reasons.
I was at a conference recently and a technology specialist admitted that when they had set up the early social media platforms, they had been more focused on the criminal damage that might be done through those platforms than anything to do with malign state actors and the way that a state might seek to manipulate or use disinformation or develop deepfakes.
Now, I suspect that if a government had been involved in that conversation right from the outset, we would have probably thought about that, because actually, from a national security level, that’s, sort of, one of the things that we always have in our mind. And it’s not saying government is better at doing this; patently we’re not Technologists, but it is about making sure that we’re applying all of our skills to be able to have the right, sort of, conversation, so that we’re aware of the risks, and we’re thinking, right from the outset of a technology, about how to manage the risks.
Obviously, from the UK perspective as well, you know, because tech is enabling, the way that we manage our diplomatic service is changing. This isn’t just about, you know, every Ambassador overseas having a Twitter account. This is about, you know, how do we actually make our operation, our global operation sustainable? How do we use technology to support that? How do we use it to reach our citizens? We’re using a lot more machine learning enabled technology to support our consular services, for instance. So there are lots of ways that we can actually work more effectively and more positively, as a diplomatic service, as well as thinking about the diplomacy of tech in the world.
Dr Robin Niblett CMG
Could you just say one thing, you know, the UK is renowned for being in the middle of lots of clubs, the international clubs. Maybe one less club, but we’ll see, but you know, UN, WTO, G20, the UN Group of Governmental Experts, etc., etc. Do you think this is a space where the UK Government sees that, kind of, rule writing opportunity? Is that – would it be fair to say that the British Government has identified technology, I’m going to use that word rather than digital, as a space in which it wants to play a role, as a rule writer? Or is it thinking about it more as UK Plc, what the UK needs to do for its own interests, for its own, you know, development of its companies, development of the technologies? Is this going to be a focal area for the UK, in your opinion? Has it been already, for rule writing?
Dr Liane Saunders
Yeah, so, I mean, I think it’s an important area and it’s certainly an area that we have applied ourselves to, and partly because it’s one of the areas where, because the technology is changing so fast, there aren’t always rules and there aren’t always standards, and I think that’s also one of the things to think about is, it’s not just the, sort of, the rules you set for how technology will be used. It’s actually the standards you set when you are designing the ethics, for example, behind AI systems, and that sort of thing. And it’s really important that that’s a broad conversation.
So, I would hope that the UK is a diplomatic service that prides itself on its ability, not just to be a rule writer in its own right, but to broker a broad community and a broad consensus around the, sort of, rules and the, sort of, standards that need to be written. And, you know, on the eve of our election, I don’t think I’m saying anything, kind of, controversial, in terms of that being a broad area, you know, kind of, across any potential government that we might have.
Dr Robin Niblett CMG
I think of technology as being one of the toughest areas to broker consensus around in the coming years, but we’ll come to that maybe in a minute, and certainly in the questions, I would imagine.
Tiina, if I can come to you, as I said, Estonia being at the cutting edge of rolling out digital connectivity and services at the heart of its government operations. When you look at the diplomatic angle, you know, what do you see? I mean, Estonia has also been the target of some pretty active disruptive activities by governments, trying to turn its strength into a vulnerability, but is that the main focus of the Estonian Government, how to protect itself, how to protect its system? Or are you also a forward-leaning country, trying to take your experience into writing rules or providing best practice for others? Where does this subject hit you?
HE Ms Tiina Intelmann
Usually, when I’m invited to speak, I have to speak about eGovernance in Estonia and then, of course, you know, very proudly say that 99% of public services are online and that there are only three things that you cannot do online in Estonia: sell real estate, get married, and get divorced. And also, I always say that we conduct elections online, and feel free to ask me why we’re not afraid to do that. And you’re absolutely right, most of the activities that we have undertaken, over the past 20 years, have been to make our state consumer friendly, but also, to reinforce our state.
You may remember that for a big part of the 20th century, we existed [inaudible – 21:16], but de facto not so much. So, the result of that is that we opened a data embassy, which means that the whole Estonian state is backed up in Luxembourg and is being constantly, so to say, kept up to date. It doesn’t mean that we are preparing for the Estonian state to disappear, but it’s always good to have a backup.
And the second thing, also, that comes to the point of reinforcing our own state, is that we have launched a programme of eResidency, and at first we thought that it would be fantastic. We offer our digital platforms to citizens of the world who want to conduct business on our platform, and we might make money. But when I look at what is happening with our Embassy right now, we have seen eResidents come in to collect their documents and to be fingerprinted, and we are identifying our new honorary consuls from amongst these people, who are tech entrepreneurs, and who will finally become our Ambassadors, so to say, right? Making us, probably, redundant. I don’t know.
But it doesn’t really stop there. Because while we were doing our eGovernance activities, we started seeing people coming in and asking, you know, “Can you explain to us in more detail what you’re doing?” And as a result of that, we now – you have DFID, we have the eGovernance Academy, and mostly, our development cooperation is geared towards explaining to people how eGovernance works. Why is it useful? To save money. And what is it useful for? For instance, you know, cutting corruption, because, you know, you increase transparency. And it also helps with the SDGs, and it also helps with all kinds of other things.
So, we now have an MoU with the African Union. We are working together with UNDP and we’re also working together with separate countries, you know, who are interested in getting our experience. But it really doesn’t stop there, because, you know, when we saw that using digital signatures saves us 2% of GDP annually in Estonia, we started thinking that, you know, making use of digital solutions should also become the norm elsewhere, so that we could have digital signatures across the board; with Finland, for instance, we have it already. And as a result of all of that, and our realisation that, you know, you are not strong when you’re doing it alone, we became much more forceful, together with the UK, in pushing these issues in the European Union, in the Council of Europe, and elsewhere. And of course, you already mentioned the fact that we were attacked.
It was a massive cyberattack in 2007, which made it a real necessity for us to also start pushing cybersecurity issues globally. We now have a NATO Cybersecurity Centre in Estonia, but we also have to have these discussions even at the United Nations. Even if we don’t agree, we have to have this discussion: what kind of activities in the cybersphere are permitted and what kind of activities are not permitted? So, those are very important areas and we try to be on top of these issues and we try to be leaders in this field.
Dr Robin Niblett CMG
And fascinating to hear the way, as a country, you’ve evolved this process. Interesting about the Academy as well, the idea of trying to export it. Can I just flip the question to you for one more point, which is, who provides the technology within Estonia? ‘Cause obviously, the big discussion here in the UK and many other countries right now is, you know, what is the nationality of the companies, or their residence, or however you want to define their nationality, in terms of providing this? And given that you’ve got 99%, as you said, of social and economic activity taking place online, including all the things you’ve described, are these global companies, or mainly Estonian ones? How do you manage that element? Who provides this stuff?
HE Ms Tiina Intelmann
You know, a very funny thing happened. In 1991, when we restored our independence, this idea of making a new sort of governance in Estonia emerged, because we realised that we were quite poor. We were not able to run our country the way the UK is run. So, there were attempts to link with the big US tech companies, to see how they could help us, and actually, this idea was not very attractive. It was not too attractive for the big tech companies. So, the government thought, you know, why don’t we allocate money and try to promote our own entrepreneurial activity? And it actually happened. So, the bulk of these services have been developed by Estonian companies, who, from there, have now become global as well. Yeah.
Dr Robin Niblett CMG
So, you’re saying you were doing that principally out of need and you weren’t getting the input from the others. But I presume now, with hindsight, as you look at it, that would feel like the right decision? And in a sense, are you now inviting – now that you’ve got a more successful economy and it’s, you know, proved its resilience, do you now have companies like Microsoft, or others, knocking at the door saying, “We’d like to come and provide that service?” And you say yes or no.
HE Ms Tiina Intelmann
No, we don’t say no, because the economy’s open. But the fact is that Estonia may be one of the very good examples where the government has had the lead and the businesses have followed, and now the businesses have grown out of what the government wanted them to do and they have spread their wings. So, in a way, yeah, it was a blessing.
Dr Robin Niblett CMG
Which, I mean, we’ll come back to in a minute, but it sounds like almost the polar opposite of the world that Casper is sitting in, in Silicon Valley. I mean, I can’t think of a more opposite way of describing it, than companies that have not emerged out of what government wanted them to do. They’ve emerged out of whatever the public need was, and now you’re trying to reverse engineer the government around it. Very interesting.
Ashish, perfect time, of course, to come over to you. You may or may not want to take on the mantle of all tech companies in your comments, or even, entirely, Microsoft. But I’m just struck that there is somebody at Microsoft with the title that you have, you know, Director, Defending Democracy Program. I mean, it says a lot that the company has that. Can you just say a word or two about, I suppose, like we’ve heard it, what’s your job? And – but which democracies are you defending? How – you know, how? Is it demand led? Are you offering? Obviously, we can see companies want to get ahead of this particular debate, as much as they can, and Microsoft maybe has the benefit of not being in the acronym, GAFA, to the best of my knowledge. So, somehow Microsoft has managed to slip away from that frontline. But go ahead.
Ashish Jaiman
So, everything is true, right? As Casper said, you know, we realised that we are a big tech company, which has some social responsibility as well. Our President, Brad Smith, is a pioneer in his thinking about saying, you know, “We’re doing business, but business runs on trust,” right? So how do you gain the trust of the market, right? So there are many things, many initiatives in Microsoft, and one of them came out of the very fact that in 2016, we saw something in the US, right, which was, like, yes, we’ve seen it in other countries, but in the US it was the first experience for the country to see that there were mechanisms to disrupt the basic principles of democracy. So, our team was founded on that same principle, with Brad saying, “Being the world’s biggest software company, we have a social responsibility and we should do something.”
Because it’s true both ways, right? One is, yes, it is a do-good mission. It’s a positive mission from Microsoft because we are, you know, very big, both in terms of revenue, but we have a lot of employees, as well. But also, you know, democracy’s good for business as well, right? Because, you know, if you think about it, in democratic countries, decisions are not made by, you know, a few individuals, and in the long run, innovation is fostered in democracies more than in non-democratic countries. So, if you think about both of those things, our team was founded on a very basic principle. We call ourselves Defending Democracy, but our idea was actually very simple, which was, okay, can we bring cybersecurity capabilities to the key institutions, both the institutions that run elections, but also the institutions who want to get into office, like political campaigns, political parties, and help them with the right kind of cybersecurity infrastructure, tools, knowledge, you know, skillset, to make sure that their process is not disrupted by nation states, right? And that’s how we were formed.
But that was our principle, and then we said – when he looked back he said, “Alright, you know, this is a big mission – a global mission for a ten-person team, so what do we do?” So he said, “Alright, you know, we’ll distil it down to three things, right?” We’ll provide campaigns and political parties and the like with cybersecurity awareness, knowledge, skillset, so that they can improve their security posture. We’ll also work with the electoral authorities, and every country has a different mechanism to run the process of voting, with, again, cybersecurity principles, with a focus on election integrity.
And the third pillar of our mission is what we call disinformation defence, because we think that that is also very important, both from a perspective of, you know, providing capabilities to the institutions, but also awareness in the democratic countries themselves, those populations and civil societies. So we work with think tanks, and others. So that’s how we have structured our team.
Dr Robin Niblett CMG
So, just one question on that point, because I’ve written so many questions down, based on what you’ve just said, maybe others will have them as well. You fight disinformation, who decides what is disinformation, if you see what I’m saying, within the company? I mean, we’re sitting in the middle of an election here, I’ll simply say that, and there’s been some debate about what some of the parties have put out. Is it disinformation or is it aggressive campaigning?
Ashish Jaiman
Yeah, so again, our role is not to define disinformation, but to provide tools if someone wants to figure out whether something is miscontextualised information, or disinformation, or maybe malinformation; then we can help, both from an awareness perspective, as well as a tooling perspective, for the key constituencies of stakeholders, right? Again, in all of our mission of defending democracy, we are not here to put our thumb on the scale, we’re not here to decide, you know, whatever democracies do; we’re just providing the technical solutions if they want to use them, especially around cybersecurity and now, on disinformation, around AI.
Dr Robin Niblett CMG
One last question. I’m going to open it up, ‘cause I’m conscious there’s so much to say and we’ve only got 25 minutes or so to go, and we all have plenty of chances to come back maybe on each other’s points as well. Would another part of Microsoft, or would you, work with non-democracies?
Ashish Jaiman
So, as a business, right, Microsoft has a presence across the globe, right, you know.
Dr Robin Niblett CMG
I’m thinking of the latest headline about China moving people out of the United States?
Ashish Jaiman
And a lot of countries, right? Our focus again is to make sure that we provide cybersecurity tooling, as well as AI knowledge, for democracies who want to leverage it and also fight external influences in their processes.
Dr Robin Niblett CMG
Clear answer and it was on the record. So this is good. Right, questions, thoughts, comments, and as we’ve got a panel, I’m going to take two or three, you know, at a go and give them an opportunity to come back and have a bit of a conversation. This person here, yeah, person at the back, there. Yeah, think I’ll start here. Microphone’s coming to you.
Member
I just wanted to say thank you so much for your remarks. That was very insightful. I worked in Silicon Valley in public policy for the last three years, prior to becoming a student here in London. And at the beginning of that very short early part of my career, I saw a big shift in how the tech companies that we represented were treated. You touched on it, but I think the Atlantic was the first to call it the techlash, and how tech companies went from being seen as the purveyors of, like, solutions for the future and prosperity, to mistrust and something that we should look out for. The public narrative now is, kind of, that we don’t trust tech companies, for the reasons that were described; they’re purveyors of inequality. And then, from the government side, as we’ve seen in hearings in Washington DC, many of our legislators around the world just don’t understand how these tech companies work. So, how do those two elements of the techlash, if you will, shape your work?
Dr Robin Niblett CMG
Okay and let’s hold the techlash thought. At the back and then I’ll come here, first right at the back. Yeah.
Sam Alvis
Thanks a lot. Sam Alvis, and I work on counter-extremism policy at the Institute for Global Change. I had a question: if your resources were limited and you’re a country like the UK, where do you put your efforts, if you were seeking to write those – the rules for tech diplomacy in the future? Is it influencing the Saudis for their G20 Presidency? Is it von der Leyen at the EU? Where is the biggest power broker, when it comes to these rules?
Dr Robin Niblett CMG
Very interesting question, and yeah, do you want to pass it back, and this gentleman there, as well, had a question.
Frank Gilly
Frank Gilly, Member of Chatham House. The questions of disinformation, trolling, and defence of democracy seem to be all interrelated. Only yesterday we had a shocking example of trolls trying to besmirch the shocking case of a boy who had been left lying on a hospital floor for hours, with trolls pretending it was all fake news, when in fact it was true. How can we do something to prevent that kind of shameful misinformation?
Dr Robin Niblett CMG
Right, and we’ve got a first menu of questions, and I’ll look around. So I think I’ll just pass these questions around. I think, Casper, maybe start with you and give everyone a chance to go around. You don’t have to answer all of them, but the techlash one, obviously, I think you almost described that as being part of the context that drew you into creating this Embassy as well. That, and/or obviously the trolling, and the resources, where you put them?
Casper Klynge
Yeah, but I think you’ve been in Silicon Valley, you’ve lived the dream, but I think you’ve also experienced that it’s not only a dream, there is also a little bit of nightmare involved in the situation today. You know, if I look back two and a half years ago, when I arrived together with my team in Palo Alto, I think there is a different discussion today with the big technology companies. So I’ll give them credit for the fact that many of them are recognising that they have a responsibility. You know, also, the word ‘regulation’ or the word ‘governance’ doesn’t put a lot of people off anymore. I think that was the case two and a half years ago, and I think that’s a good development.
Now my cynical – I’m a Diplomat, so I’ll have to do this anyway, my cynical assessment of this is that without external pressure, nothing will happen by default. So without governments teaming up, without civil society beginning to have an interest in this, without academia beginning to focus, like you’ve done, Robin, here at Chatham House, on these issues, and without employees beginning to hold the companies to account, I don’t think we will naturally see companies, even companies that are a product of western liberal values, take the necessary responsibility. And I think some of them are doing it deliberately, others are doing it because they’ve been surprised by the power and influence that their platforms have developed. And, you know, let me go on record by saying that I don’t think, you know, you’ll find evil people among the C-suite in many of these companies. There might be a few here and there, but let’s not go there, but I think many of them are struggling with very complex issues.
So actually answering a little bit your question as well in the back, sir, what’s the deduction of that? What’s the consequence? What’s the lesson learned for our diplomacy and for over the last couple of years? Well, two points. First of all, it’s not about Denmark. It’s not about, you know, that we are a hyper data [inaudible – 38:52] country. That doesn’t really matter, because I think all countries are going to be impacted in different ways. If you live in Europe, different set of challenges. If you live in Africa, very different set of challenges. Fragility will pop up in different ways, and by the way, the digital divide, on a global scale, I think is something we need to pay a lot of attention to, because that will be the root causes of migration, of extremism, of terrorism, long-term.
But the deduction of that, what do we do about it? Well, I mean, Robin and I spoke about this on a podcast some months ago, we need to begin looking at coalition building and alliance building in a different way. It’s not only about NATO, countries working together, it’s also about saying, we have to bring in the private sector. We might not always like what they’re doing, but I think we will struggle with finding the right policies, finding the right piece of regulation, unless we bring in the private sector in those conversations. Both to hold them to account, but also to learn and, you know, have an educational experience.
And I’ll be very frank, I think one of the reasons why we’re out there is also because the Danish Government recognised that we don’t fully understand where technology is today, and I think the gap between our policies, our regulations, our understanding where the world is, unfortunately, that gap is increasing each year. So, we need to come together. We need to work with responsible actors that are willing to do what is good for humanity, not only in the sales lines, but also in reality.
Dr Robin Niblett CMG
And just bringing in Liane, and picking up on Casper’s point here in answering this question, there are a number of governments that would take a very different view about how one should regulate and manage the digital space and technology, that feel it should be very much government led and that the, kind of, multi-stakeholder approach of having business at the table, and so on, is not the way forward. That actually, this is a space that needs to be regulated in a more top-down way from the outset and the get-go. How – you know, is that your interpretation as well? And if it is, how can one play a collective role, or are we risking a future in which we’ve got two different types of regulated digital spaces? One in which it’s very multi-stakeholder, the companies involved at the front end, and the other one in which it’s much more top down, and actually, the standards diverge, the approaches diverge, even the markets diverge. You know what I’m talking about here, obviously, with the China dimension, as well.
Dr Liane Saunders
Yeah, so, I mean, I think it depends what technologies you’re talking about, what stage of development they’re at, and the community that understands them and how that operates. And in part, that’s also an answer to, you know, kind of, where do you put your diplomatic network? Actually, one of the reasons why the UK has a Global Diplomatic Network and why we, kind of, try and reach as many places as we do, but also, as many different regimes and technical standards bodies as we do, is because increasingly, you know, different types of technology will have either an effect on the development of rules in a particular space, or we need to think about the, kind of, ethics of a technology within that space. So, you know, two very different examples would be in telecoms where, you know, we obviously do have to think about these things, and where there is already a very well developed structure and system. So it’s about how do you use that structure and system to talk fluently about the, sort of, technologies that are going to impact on that sector?
Another area would be something in the cultural space. So, UNESCO is designing AI ethics. Well, it’s obviously vital that that’s being done, and as that, kind of, develops, that it’s being done, you know, in a body that covers the, sort of, whole scientific and cultural reach. Now, you know, neither of those bodies is necessarily, sort of, attention grabbing, on a world or a daily basis. But they are really important, because if you don’t build the right rules, and you don’t have the right people in the room to have the conversation, you won’t get the right outcome. And I think that’s the point. Diplomats are always about having the right people in the right room to have the conversation, and there might be a series of conversations, and that means that it doesn’t have to be a bifurcation of only government makes the rules, or, you know, a multi-stakeholder approach. You can have a combination of those things. The question is, have you had the right influences, at the right point in the decision-making process, to inform the outcome that you have?
Dr Robin Niblett CMG
Tiina.
HE Ms Tiina Intelmann
I just think that at the moment we’re dealing with a little bit of a shock, where we realise that tech companies have done something that has changed the world and that we have not been thinking about regulating it. And now the tendency is, oh, let’s just get together and regulate. Whereas we think that, you know, if we approach the issue this way, the digital space is yet another place where humans operate, so a lot of the rules that we have for operating here, or in society elsewhere, you know, in a physical space, should apply. ‘Cause otherwise we just go and overregulate.
Also, the question of disinformation and trolling and defence of democracy, it’s a very serious question, and in Estonia, we’re getting that a lot, probably because of what happened to us during the 20th century, as I already told you. But we have come to the realisation that, actually, you cannot control information, and that, more and more, people have to use their own common sense and be aware of the dangers. Because how can you control it? How can you tell people what is true and what is not true if there are so many different sources of information?
Dr Robin Niblett CMG
That’s such an important point, one we’re all grappling with here. Ashish, do you want to come in on this, and maybe, if you want to, say something as well about the EU role in this? Because that was a little bit one of the questions: the EU has set itself up, in a way, as a regulator, and it’s ironic, ‘cause it doesn’t generally have the companies, but it has therefore almost stepped into the role of trying to set some global standards, the most obvious being the GDPR, the General Data Protection Regulation, but there are likely to be others coming down the pike, and there are AI conventions on ethics and so on. So, yeah, do you want to pick up some of these points and maybe address how a company in the US sees a group like the EU doing the regulation?
Ashish Jaiman
Yeah, so I’ll come to that, but quickly, you know, going back to the regulation part, I agree with you saying, you know, the regulations that we have in our normal human life, actually, most of them can be very well translated into digital life as well. So, you know, whenever we see a problem, the first answer is, hey, let’s have a regulation to curb it. Disinformation could be an example, like deepfakes, right? Impersonation of someone and falsifying information has, like, all kinds of regulations there already, right? Financial regulators in a country, like the FTC in the US, would have all kinds of regulations around, you know, impersonation, impersonating a business, doing false advertising, and all that. So you can actually bring a lot of that into regulations on digital life, as well.
So that’s one thing. The other thing I want to add to it is, companies are realising that, you know, they’re moving very fast and regulators are actually trying to catch up. And there may be a day where you will say, “Alright, you know, let’s just bring in regulations,” right, and then a lot of regulators sit in a room and all of a sudden we have to start thinking about it. So, what most of the companies are doing is getting in front of it, right, by creating some kind of internal principles, policies, and regulations as well. And I’ll give you a very good example in Microsoft, where we published six principles of AI ethics, essentially, right? And they’re very, very simple to understand, as well, right? Which is fairness, so any tool that Microsoft creates, any AI system that is being created, has to treat all people fairly, right? So, simple principles like fairness, inclusiveness, reliability, transparency, privacy, and accountability, for any tool that comes out of Microsoft. So, essentially, trying to get in front of that regulation train that may be coming, right? And it’s not just us, right, you know, Google has created an AI Ethics Board, and there are other companies who are doing the same thing.
So, my call here is that regulations may not be the answer, right? Like, a hard kind of regulation sometimes may challenge the innovation that is going on. And I’m not saying that regulation is not needed; I think the multi-stakeholderism of, you know, everyone sitting at the table, bringing the right people in, and having common sense, simple regulations can suffice, also aligning it with existing normal regulations we already have, bringing them into the digital sphere as well.
Dr Robin Niblett CMG
Okay, a question front, question at the back, three. Yeah, and I’ll go here. Right here, first at the front and the back and at the front again. Thank you.
Hilde Rapp
Hilde Rapp, a Member of Chatham House and the Centre for International Peacebuilding. All of you have mentioned, implicitly or explicitly, the national and international security implications of AI, and I believe that both Denmark and Estonia, because they’re small countries, have made this an all-of-society endeavour, so that actually everybody in the country is responsible for the ethical rules that you mentioned, Ashish. And you’ve put a lot of your effort into education, as a way of getting young people, and I’m aware I’m terribly old in this room, to, kind of, appreciate how to think, how to act ethically, and how to use these new tools wisely. And I think you’ve been talking a lot about how, in your previous work, there was a strategic role for diplomacy in understanding how we use weapons, or technology rather, both for defence and for enabling development. And I think all these things are part of the same scheme, and unless we connect the pieces and have an ethical perspective on all these aspects, regulation, in a sense, will be sitting on top of something that should be a top-down and bottom-up effort.
Dr Robin Niblett CMG
Okay, right at the back.
Raluca Ursu
Thank you so much for your discussion. I’m Raluca Ursu, Member of Chatham House. My question is, how do you tackle uncertainty when regulators wouldn’t want to sit down with you and have this conversation to move forward? Are we talking about, you know, potential [inaudible – 50:31] coming up soon enough, if these things are not regulated? So how do you tackle this uncertainty, in terms of what we can do now, soon, and fast, rather than long-term?
Dr Robin Niblett CMG
So, how do we – just so we’re totally clear on this end, how do we deal with the uncertainty in the short-term around technology regulation, is that what you’re saying, given the geopolitical competition that’s going on at the moment? It’s very hard to move fast. Okay, right at the front.
Trisha de Borchgrave
Thanks very much. Trisha de Borchgrave, I’m a Freelance Writer. Are we ever going to – I mean, in terms of, you know, information is going to be what it is and we have to figure out ways of deciding whether we agree with it or not, but are we anywhere nearer holding a social media platform accountable for what is posted on that platform? Or is this going to be just, you know, massive class action suits down the pipeline?
Dr Robin Niblett CMG
Okay, we’ve got some very specific questions there. I’m going to run a little over, just warning you all here, I’m going to run a bit over 2 o’clock, if you guys are alright here, because normally when we have a panel, we try and do it more like an hour and ten minutes or so, ‘cause we can’t get it all done in an hour. So, we’ve got three questions there. Actually, shall we start at the other end? No, you can talk about social media platforms, without having to be one, so that helps you as well. If you want to take that question. LinkedIn, oh yes, of course.
Ashish Jaiman
We do have a social media platform, but, you know, it is a very self-regulated platform, and the goodness of LinkedIn being a professional network is that it’s typically self-regulated and people don’t show their biases on our platform as easily as they would on others. But to your point, right, in some jurisdictions, in some countries, there have been some regulations on social media platforms and tech companies. But I’ll tie it back to one of the questions that I heard: it’s good to be in a democracy, from that perspective, because tech companies would be involved in those discussions of regulation. Because in democratic environments, the whole democracy fosters that kind of discussion, right? So it’s very hard and, again, it goes back to my initial point, which is, why is democracy good for business? Be it social media platforms, big tech, you know, or any other kind of business, there will be a voice that can be heard.
So, I think yes, there will be some instances where the regulations would come with a heavy hand, but I think mostly, tech companies and social media companies would have a hearing before regulation comes down. That’s my personal thought.
Dr Robin Niblett CMG
Making yourself adapt, is almost what you’re saying, in that case. Yeah, yeah, please go on.
HE Ms Tiina Intelmann
And I just wanted to come back to the security implications of all that we’re seeing right now, and there’s a very fashionable word now, which is ‘resilience’, resilience of societies, and that actually involves everything. That involves people using their own brains. It involves also looking at what is happening in cyberspace. It involves something that we call cyberhygiene in Estonia, increasingly, and also teaching that at schools, you know, what to do and what not to do. But it also involves, for instance, something that we have in Estonia, a cyberleague, which means that people who are normally doing other jobs, in the run-up to elections, also voluntarily monitor what is happening in the space and what information is there. So, resilience is a very, very important issue.
Dr Robin Niblett CMG
And where would you come down on the regulation of social media platforms and where – how is Estonia managing that challenge?
HE Ms Tiina Intelmann
Well, probably there has to be some regulation, but it’s clearly beyond what Estonia can do. So…
Dr Robin Niblett CMG
Yeah, but your voice, again, I’m trying to think of your voice as a, kind of, leader in this space, are you advocating a particular approach?
HE Ms Tiina Intelmann
I think the overall approach is, as I already explained, that we should not overregulate things.
Dr Robin Niblett CMG
Yeah, okay.
HE Ms Tiina Intelmann
And the temptation is there, be it domestically, be it internationally now, because it’s like, all of a sudden, we woke up and we discovered something, so we feel the urgency of doing something, and of course the normal thing is, let’s just regulate everything. But it’s not going to be the solution.
Dr Robin Niblett CMG
And one observation I’d make, of course: the places where social media platforms are heavily regulated, or not allowed to exist, are places like Iran, you know, a few places that are not the democracies we’ve been talking about. Casper and then, yeah, and then I’ll bring Liane in.
Casper Klynge
The past two questions, also, I want to address the security aspect of it. On regulation, we’re seeing more and more CEOs or company representatives advocating for more regulation, and on the surface we welcome that. I think it’s fantastic, but I’ll share a small state secret from the frustrated life of a Danish Tech Ambassador, doing business with these companies, and that is, when you dig in or deep dive into what do they actually mean by regulation? Then it becomes slightly more fluffy. And let me give you a very concrete example. I mean, the issues that we bring to the companies, on behalf of Danish authorities, are not small, tedious issues. It’s about terrorism, it’s about child abuse, and you know, it’s about illegal content. And the reply I normally get, when I’m sitting in front of somebody from the Executive Group, is, you know, “We remove 99% of all illegal content.” And I’ve learnt the lesson now that we shouldn’t congratulate them, but that we should say, “Let’s focus on the 1%.” So, what is that in absolute numbers? Where’s it coming from? What do you do as a company to try and mitigate that? What do we need to do as governments to do the same? And then I can tell you that that conversation finishes rather quickly, because there is no specific desire for digging into those details.
So my reply, when you see adverts in the Washington Post advocating for more regulation: it’s fantastic, but are you willing to be transparent enough for governments or international organisations to do the regulation which is necessary? Because you cannot regulate if you’re half blind. So, I think with regulation, or the need, or the desire to do more regulation, we need to see more transparency on the side of the technology companies. Now, some are doing it. Others are less so.
Now, I just want to come back and perhaps address it from a slightly different point of view, not so much the educational aspect, but why is this, you know, security policy, in the most traditional sense? Well, one of the reasons is that, you know, the big data-driven software companies are now becoming, you know, operators in this sphere of the military-industrial complex. So it’s not only about Lockheed Martin, it’s not only about [inaudible – 57:27], it’s also about, you know, the software companies, because not only with autonomous weapon systems, but in general, these weapon systems are going to increasingly be driven by machine learning, by algorithms, by artificial intelligence. So, in my view, even if you look at it from a very old school security policy point of view, we do need to have a conversation with the companies, as we would with any other producer of weapon systems today. And that, I think, is altering, you know, both the battlefield, but certainly also how governments need to look at it. And by the way, that’s one of the reasons why, I think last week, here in London, you had the leaders’ meeting in NATO, and you know, we actually put a non-paper forward in NATO, based around our diplomacy initiatives, saying that the Alliance will need to focus more on disruptive and emerging technologies.
And I have to say, it was only one and a half sentences in the communiqué out of the summit, so we’re a little bit disappointed, but it’s there. Go and find it. It actually talks about emerging technologies. And why did we do that? Because whether you are in the European Union, in the United Nations, or working inside NATO or ASEAN, we need to focus on the consequences of new technologies, both looking at the opportunities, let me be very clear on that, and also trying to mitigate some of the risks that come with the digital age.
Dr Robin Niblett CMG
Yeah, and some of these points.
Dr Liane Saunders
Yeah, in part just following on from that, because we’ve talked a little bit about some of the big, well-known technology companies. But actually, the whole point about this is that emerging technologies are disruptive by their very nature, and it won’t always be the big, familiar companies. In different domains, synthetic biology for instance, there will be new actors, and they may well be small actors. So, on the question that the lady asked at the back about uncertainty, I don’t think we will reach a time where we are in certainty, because the nature of the technologies, and the way that they potentially integrate with one another and have a catalytic or cumulative effect, will mean that we have to be comfortable living with uncertainty. And that goes to Tiina’s point about resilience, because you need to be resilient against a range of scenarios. At the heart of it, I think that really means educating citizens to be critical thinkers, and that’s something that within democracies we value very much. But it needs to be taught in a new sense, and we can enable critical thinking with technology; some of the examples that Ashish has given, about tools that can support that critical thinking, are really important. But I think we need to recognise that we will be facing a range of technologies. We won’t be able to regulate everything. Any regulation needs to be proportionate, because ultimately we need citizens to be able to go about their lives, living comfortably and not worrying at every second about what a technology is going to do in the negative, but actually being able to concentrate on the very many positives that these technologies also have to offer.
Dr Robin Niblett CMG
Let’s do one last round of questions. I’ve got a couple of last ones, and we’ll be able to close up. Lots of hands going up, as always happens at the end. I’m going to take the four I’ve seen, one, two, three, four. So, please at the front, first, and we’ll be quick on our answers, in case people have to go to something else, yeah.
Libby Cash
Hi, Libby Cash, student at the LSE and student at Vassar Chatham. So, my question’s actually for Tiina. In your opinion, is using common sense, citizens using their brains, as you put it, enough to actually decipher between disinformation and deep fakes and reality? And is this something that can actually be taught, when the technology is becoming more and more advanced? Thank you.
Dr Robin Niblett CMG
That’s a good question. I mean, you can use your intelligence, but when your intelligence is fooled by a deep fake, yeah, that makes it incredibly difficult. I think there was a question just behind, yeah, two rows behind.
Muhammad Zaheer
Muhammad Zaheer, Public Policy at King’s. My question is, without a proper arbitrator of what actually is misinformation, do you think the tools that you have in place are effective, especially when you think about governments using misinformation on their own citizens as well? Do you think the tools can be misused to clamp down on whistle-blowers or the opposition, as well? Thank you.
Dr Robin Niblett CMG
Great. There were two over here. Yeah, go from the back and then there was – yeah, then the lady here, yeah.
Matthew Houlihan
Thanks. Matthew Houlihan, I work at Cisco on government affairs. So this might be a naïve question in what feels like an era of declining multilateralism, but do we have the right international institutions to deal with all of the issues that we’ve been talking about today?
Dr Robin Niblett CMG
Okay, let’s – a nice big one. Good, I’m glad we had that one on the agenda, and there’s a young lady here at the front. Near the front, second row from the front.
Javin
Hi there, I’m Javin, also a student at the LSE. My question is on the multi-stakeholder approach, which seems to be the preferred method. By bringing tech businesses into the policymaking process more and more, do states risk losing their sovereignty to the interests of these same tech firms?
Dr Robin Niblett CMG
Yeah, multi-stakeholder, it’s such a soft term, but within it there will be power balances, as you just noted. Look, a lot of good questions here. I think I will go in reverse order, so I’ll give Casper a chance to have the last word. Ashish, pick up any of these you want. I mean, deep fakes, I know, is something you work on specifically; I’m wondering whether you had a thought about whether common sense is going to be enough on this?
Ashish Jaiman
Yeah, so…
Dr Robin Niblett CMG
And resilience, yeah.
Ashish Jaiman
…I have some thoughts on deep fakes and synthetic media, and one of my pet peeves with the word ‘deep fakes’ is that it has a negative connotation, right? The word itself implies something malicious. Whereas synthetic media has so many good use cases, right? Art and expression: you’ve seen it in Hollywood, and giving an arts major student the ability to create a Hollywood-like movie, to put a point out there in a three-minute video, is powerful, right? Activism: synthetic media can be used there too, in service of the principles of democracy. Accessibility, which is so close to my heart: giving voice to a voice-impaired person through synthetic voice, or, for people who are visually impaired, the phone can talk to them about how the surroundings look. It’s all synthetic, right? At the end of the day, this is deep fake too: deep learning technology used for good. So, that’s where I want to go: yes, every technology is a beautiful tool, but it can be used as a weapon as well. And that 5% weaponization of a technology gets all the coverage; we’re not talking about the 95% of good use cases.
But I want to go back to your point: it’s very hard to detect. We are reaching a point where, from a purely technical perspective, it is becoming virtually impossible even for machines to detect, and that race is going on. In three months it will become even harder to detect a deep fake, right? So it goes back to the basic idea of, hey, could there be some right countermeasures? And that could very well be citizen awareness, or education, literacy about what a deep fake means, as well as labelling of content, right? There is a regulation in China, which actually comes into force, I think, next January, where you have to put an attribution on synthetic media. So, anyway, that’s where I want to leave my comments.
Dr Robin Niblett CMG
There’s so many good questions, so little time. So, if each of you could pick up one point in particular, and then we’ll close up, that would be great, because I know you’ve all got to go on to another thing here at Chatham House, with a lot of the young people in the audience as well. So, yeah.
HE Ms Tiina Intelmann
And me?
Dr Robin Niblett CMG
Yes, please, yeah.
HE Ms Tiina Intelmann
Okay. No, I fully agree that it may not be enough to use your brain or common sense. But if you look at it this way, propaganda has always been there, lies have always been there. Now that the lies have moved into the digital space, it’s clear we cannot take all content down. We can take down the content that pertains to child pornography, for instance. We can do a lot of things. But at each and every point, an individual in a democratic society also has to take some responsibility. I think that we are super-used to being totally pampered now, and the world is a very dangerous place to live in. What can you do?
Dr Robin Niblett CMG
I was about to say, maybe a little.
Casper Klynge
On this happy note.
Dr Robin Niblett CMG
I know, exactly, that’s why you’re going to have the happy note. I’ll make the one cynical comment, though: is a deep fake a political leader telling a lie? Another time. They can get called out down the line if they didn’t actually say it, do you know what I’m saying? But never mind, that’s a broad comment before the end.
Dr Liane Saunders
So, I’ll just make a brief comment that tries to take in both the multi-stakeholder and the multilateral. I mean, the answer is, no, it’s not fully capable of dealing with all of these things. Has the multilateral system ever been fully capable of dealing with the threats it faces? The answer to that is no, as well. So, we have to do the best job that we can.
One of the ways that we can do that, through multilateral approach and multi-stakeholder approach is to make sure that, as Robin said, you’ve got the right balance of power in the room, and that means that it isn’t just the big tech companies you need in the room. You need those smaller actors. You need those voices that aren’t heard so much, and that often requires the convening power of governments to bring that together. But it certainly shouldn’t be all within the hands of government or, indeed, within the hands of any one sector.
Dr Robin Niblett CMG
Yeah, and that brings in the point I was going to come to later, about including the least developed economies, which was your point earlier, I think, and it’s where the UK Government will also try and engage a bit. So, a very important point. Last word, Casper, to you.
Casper Klynge
Do we have the institutions that are necessary for dealing with these issues? Probably not, and I think one of the problems is that you have vertical discussions about these different technologies. So, if we talk about cybersecurity, it happens in two different working groups in New York right now. If you talk about mobile technology, it’s a different institution in Geneva. Can we create something that would encompass all of it? Probably not either. So, as a small state, what is the reaction to that? It is to create a small multi-stakeholder approach, where we basically try to create an alliance on responsible technology. Are we going to succeed in that? Well, I don’t know. But if we do, I think it could hopefully be a beacon for pushing companies, and I’m not talking about Microsoft here, but other companies, to take more responsibility, but also pushing governments to look at this issue as something so fundamental, so transformational, that if we don’t begin looking at it in a more systematic way, we might lose all opportunities. So, basically answering your question: will we lose a bit of our governance power and oomph by doing this? Probably. But by not doing it, I think we’re going to lose it altogether.
Dr Robin Niblett CMG
So, a very interesting panel. I took a few words away, just because there was so much here. Obviously resilience, and the fact that we’ve lived with this kind of problem in the past; it’s just a different version of the problem, maybe more scaled up. But if we’re more aware, that’s actually progress in itself, I would argue, and the people on this panel have been helping with that awareness. Educated citizens, transparency, but also in the way systems are designed, not simply once they’re presented, which brings us to the multi-stakeholder approach. It’s not just multi-stakeholder responses, but somehow companies having to let regulators and governments and civil society into the process of creation. I mean, they’re incredibly secretive organisations, in terms of the monetisation side, so it’s about trying to break down that particular barrier.
And the last point, applying the rules we already have, as you said, on copyright, etc. In the end, it’s about enforcing rules. You don’t always have to design new ones; there are a lot of rules out there, if we can apply them to these new systems. So, with that, even though we could have gone on for a long time, thank you very much for coming, and thanks for a great panel [applause].