Bronwen Maddox
Great. Everyone, welcome, great to see you here. I’m Bronwen Maddox, I’m Director of Chatham House and I’m delighted that we have here Nick Clegg, Sir Nick Clegg, President of Global Affairs at Meta. Thank you all very much for coming. I’m going to mention just one bit and it’s not exactly housekeeping, but the QR code you can see flashing up will take you straight to the new issue of The World Today, which we’ve brought out again in its new, better than ever form, and you’ve also got copies there on the side. Please do pick one up, and it touches, I suspect, on some of the issues we’re going to talk about today.
Well, we have a clean hour, or maybe 59 minutes, and we’re going to be roughly half-half my conversation with Nick Clegg and your questions. So, please do get them coming, start thinking of them, and we have a title that could span an enormous range of things, Can Democracy Survive the Pace of Technology? Possibly a slightly defensive title. We’ll come onto that. And we’ve been talking a lot, within Chatham House in the past week, about the ways that this is coming up. I’m thinking of two pieces written in the past week, one by William Hague, who indeed, we were just discussing upstairs, put his name with Tony Blair some time ago to our piece talking about the future and the optimistic potential of technology.
But he said in The Times, “Look, this is” – or “I acknowledge all these other wars and battles going on, and Ukraine and Sudan” and so on, “but this is one of the defining battles of the 21st Century between governments and technology companies and it matters for how we are going to organise countries, organise democracy.” On the other hand, I can say it’s almost the other side, we had Ambrose Evans-Pritchard arguing in The Telegraph apropos a European gathering in Italy, saying, “Europe is really losing the race in new technology. It is so obsessed with regulation that it is going to choke off any potential of this new technology if it even knows that that potential is there.” So, this is the kind of waterfront that we are arguing over.
So, let me start by asking you within this, we’re in this big year of elections.
Sir Nick Clegg
Hmmm.
Bronwen Maddox
Every organisation in our line of work and many newspapers made great play from November to January onwards, saying, “This is the year of elections.” What have you been doing at Meta to handle this year of elections right round the world?
Sir Nick Clegg
So, yeah, it is by far the most consequential year when it comes to – I think – I can’t remember what the figure is, sort of, four billion people around the world are, sort of, eligible to vote this year. I don’t think that’s ever happened in the democratic era, and you’ve already had a number of elections: India, Indonesia, EU, UK obviously, France and so on. It’s without precedent the number of elections that have taken place, and as you will remember, and no doubt William said this himself, there were lots of breathless predictions towards the end of last year and the beginning of this year that this would be the year that democracy would be upended by AI, or the latest manifestation of AI. AI, of course, has been around for a long time since the, sort of – certainly spoken about since the 1950s, but, sort of, gent…
Bronwen Maddox
Oh, tripped up or compromised or…
Sir Nick Clegg
Yeah…
Bronwen Maddox
…it…
Sir Nick Clegg
…and I think what – so, you asked what a company like Meta’s been doing. We’ve obviously got…
Bronwen Maddox
And even specifically…
Sir Nick Clegg
Yea…
Bronwen Maddox
…what Meta’s been doing.
Sir Nick Clegg
Yeah, yeah, yeah, so we have around 40,000 people or so who work on how elections and, generally, how, sort of, safety/integrity plays out on Facebook, Instagram, WhatsApp, Messenger and so on. And of course, I think we’ve had around 200 elections where all of those apps have been playing a role of one description or another since 2016. And so, we’ve, sort of, developed a whole bunch of muscles over the years.
We now have, I think, the world’s largest network of independent factcheckers, around 100 of them, working in, I think, 60 languages or so. We have an Independent Oversight Board, the first of its kind. Every 12 weeks, along with our financial results, we publish all the stuff about what’s the bad stuff we find, what have we done well, what have we not done well. And it’s not just marking our own homework – I think we have EY auditing it and so on.
And what that picture shows, and what our experience shows so far this year, especially related to AI and democracy, is first – I think the most surprising thing so far – this could change and it could change overnight, it could change from one moment to the next. In fact, it almost certainly will change as the technology evolves, but so far, it is striking how little effect AI has had at a systemic level. Of course, it’s been used, for good reasons or bad, by campaigners for fundraising, for running political ads. I remember Imran Khan ran a, sort of, AI avatar from prison to communicate. So, it’s been used, but we haven’t, and I don’t think anyone has – MIT published a quite interesting review confirming this earlier this week – I don’t think anyone has identified that AI has played a systemic role in any of the elections.
Secondly, where AI has played a role, it’s tended to be as a, sort of, technological accelerant of problems we already know about, most particularly the creation and dissemination of misinformation. And that’s terrifically important for these Big Tech platforms because the way we design our policies, we can argue whether we draw the line in the right place, you know, what it – in terms of what is allowed, what’s not allowed, and what’s demoted, and what’s not demoted and so on. But the way we design it is to be technologically agnostic.
In other words, whether a piece of misinformation is produced by a human being or by machine is irrelevant. Our systems should, they don’t always, they’re certainly not perfect, our systems should identify them, particularly if they’re going viral, regardless of the genesis of it. And I think that’s quite an important, sort of, second observation.
And then, the third observation I’d make is – ‘cause this is often overlooked. Well, I totally understand this, I don’t criticise it, but quite a lot of the commentary tends to always dwell on the negative, because it’s more exciting to talk about the, sort of, spooky, apocalyptic side of new technologies rather than the upside. But it’s worth remembering, and I’m sure this is relevant to most of the big platforms these days, how much the tech companies regard AI as a sword and a shield, so…
Bronwen Maddox
What do you mean?
Sir Nick Clegg
Well, what I mean is people dwell on how bad people might use AI for bad purposes. It’s also the most powerful tool to identify bad content and stop it disseminating. I’ll give you an example. So, the prevalence of hate speech on Facebook – that is, the amount of hate speech identified as a percentage of the total amount of content on Facebook, and this, again, is not just me plucking this figure out, this is an audited statistic – has been reduced by over 50% over the last two or three years, for one reason only: because of improvements in AI. So, the prevalence is now about 0.02%. That means if you’re scrolling endlessly, I don’t recommend this, but if you’re scrolling endlessly on your feed, out of the 10,000 bits of content you might see, you might find two bits of hate speech.
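The arithmetic behind that prevalence figure can be sketched in a few lines. This is a minimal illustration only: the 0.02% figure is the audited statistic quoted above, and the 10,000-item scroll is the hypothetical from the talk; the variable names are invented for the sketch.

```python
# Prevalence: hate-speech items seen as a share of all content viewed.
prevalence = 0.02 / 100          # the quoted 0.02%, expressed as a fraction
items_scrolled = 10_000          # the hypothetical endless scroll

expected_hate_items = prevalence * items_scrolled
print(expected_hate_items)       # roughly 2 items out of 10,000
```

On the same arithmetic, a halving of prevalence from, say, 0.04% would mean going from roughly four such items per 10,000 scrolled down to two.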
But my point is that – I wish you could reduce it to zero. I don’t think you ever can, ‘cause remember, hate speech is almost always legal speech. It’s just speech that we have just…
Bronwen Maddox
I’m going to come onto this point, but yeah.
Sir Nick Clegg
Yeah, yeah, but my point is I think what is often overlooked is that the advance in keeping people safe online – trying to defend against, particularly, foreign interference in elections, the spread of misinformation and so on – is itself a battle fought through AI, and invariably the greatest advances are through AI. And certainly in my, sort of, day-to-day job, a lot of the, sort of, work that the teams of Engineers do is to constantly improve, because you’re dealing with scale.
I mean, WhatsApp alone, I think there are now about 100 billion messages on WhatsApp every day worldwide and then, of course, you have to multiply that by Messenger and Instagram and Facebook. It is vast. As I say, we employ 40,000 people to do this stuff, but even if you employed 40 million, you still can’t, by hand, curate an experience which is, by definition, user, you know…
Bronwen Maddox
Alright, so…
Sir Nick Clegg
…user generated. And that’s why AI is so important as a weapon against bad stuff, as much as it, of course, can be used by bad people for bad purposes, as well.
Bronwen Maddox
No, this is an important point and I find it absolutely persuasive as far as the scale of this sifting and judgment goes, but it takes us immediately, inevitably, to this question of how you judge and who should judge what is legal but harmful…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…and what should be removed…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…and what should be allowed.
Sir Nick Clegg
Yeah, so the way we…
Bronwen Maddox
And how…
Sir Nick Clegg
The way…
Bronwen Maddox
…you know, how does Meta do it?
Sir Nick Clegg
Yeah. So, the way we do it, and if anyone has better ideas, tell me, because it’s literally just a non-stop process of one bunch of people saying, “You’re censoring too much” and another bunch of people saying, “You’re not removing enough.” Particularly in America, where literally half the country thinks the former and the other half thinks the latter. And the way we try and pick our way through this, because again, you’re asking a private company, an engineering company, to make what are highly consequential ethical and societal judgments about content that legislators have decided not to make illegal, right? So, you’re literally just – you know, and I sometimes, when the legislators shout at us, I say, “Well, pass a law then and we’ll follow your law. But as long as you don’t pass…
Bronwen Maddox
Yeah, so just…
Sir Nick Clegg
…how we do it…”
Bronwen Maddox
Well, just…
Sir Nick Clegg
Yeah.
Bronwen Maddox
Even before the how, because you’ve taken us to the even more interesting point of, do you think you should be doing it at all?
Sir Nick Clegg
Well, we can’t – we unavoidably have to. Unless you do an Elon Musk and you say anyone can say anything, but that’s not the way we run Facebook and Instagram. And remember, advertisers pay for our lunch. You know, if you’re selling soap powder or a car, you don’t want your ads next to bad content. So, when I read people confidently saying, “Oh, these platforms have got a commercial incentive to spoon-feed ghastly stuff” – no, we have an incentive to do exactly the reverse, because the advertisers don’t want their ads next to the bad stuff.
Whether we succeed in finding it is a different matter, but we have a commercial incentive, probably, to be even more censorious, to make it a complete, sort of, Walt Disney experience. But at the same time, we want to make it not such an insipid experience, but a place where people can express themselves, which is rooted – particularly for an American company – in First Amendment principles of free expression.
But to your query…
Bronwen Maddox
And the…
Sir Nick Clegg
…about how…
Bronwen Maddox
…how…
Sir Nick Clegg
…we do it.
Bronwen Maddox
…question, yeah.
Sir Nick Clegg
So, how we try and square that circle, if you like, is firstly, we do it very, very openly. So, we have 20 or 21 categories of content that we publish under something called the Community Standards, where we say, very openly – whether it’s bullying and harassment, hate speech, IP fraud, anyway, a whole list of things – “We don’t want this on our platform.” And that Community Standards is a, sort of, living and breathing set of standards. We consult every two weeks with an international panel of scholars, of free expression experts and so on, and we adapt it.
Then we’ve set up this, sort of, independent, sort of, almost court, this Oversight Board – we’re the only company to do this – so that a user can go to the Oversight Board and say, “Facebook removed my post of my cat, saying it was a demonic symbol inciting hatred. It wasn’t, it was just my cat.” And then, the Oversight Board can say to us, “You idiots, it was just a cat. Restore that post,” and we will do that. And we are duty bound to restore or, indeed, take down the content that they instruct us to. And then, as I think I referred to earlier, along with our financial reporting, we disgorge huge amounts of data under each of those 20/21 categories, showing every 12 weeks how effective our systems – the automated AI systems, but also the human beings I referred to earlier – are at identifying it and either removing it or demoting it or labelling it, and how successful we are at doing that ourselves before users report it.
And that’s – all of this great paraphernalia is expensive. I mean, we’ve spent about US $20 billion in recent years on that work, about five billion in the – you know, in the last year. It…
Bronwen Maddox
I’m thinking, we were referring to it upstairs, these are numbers that Rachel Reeves will be very covetous of. We might get…
Sir Nick Clegg
Possibly, yeah.
Bronwen Maddox
…can come onto that.
Sir Nick Clegg
Don’t give her…
Bronwen Maddox
It…
Sir Nick Clegg
…a suggestion. I mean, you know…
Bronwen Maddox
Also, a point we’ll come onto.
Sir Nick Clegg
Hmmm.
Bronwen Maddox
So, that’s how you do it. And you referred to it – Facebook and Instagram and Meta is a US company. The First Amendment is an absolutely core US value. How do you go out and apply this to parts of the world that really don’t share that value at all?
Sir Nick Clegg
Yeah, I think my own view is that that’s one of the most difficult conundrums we face. Because, in many ways, all these companies and these technologies and these platforms were born at the, sort of, apogee of globalisation, and yet we live in a world in which politics is deglobalising. So, underlying all of this, you’ve got these two, I think, very powerful forces. One is the globalisation of technology – these are seamless, borderless technologies which allow people to communicate, you know, at the swipe of a screen, from one part of the world to the other. And yet you’ve got a politics, particularly in the post-2008 era, almost everywhere – whether it’s Modi, Trump, Orbán, or whether it’s Brexit – where the principal political motivation, in recent years, has been about recapturing sovereignty, asserting political control. And that’s, of course, why people like William Hague write this thunderous prose saying, “Oh, you know, that’s the great battle of our times,” ‘cause it’s the conflict between political sovereignty and globalised technologies.
And I think one of the things that – one of the dilemmas that the tech platforms face, and certainly one of the dilemmas I face in the various, kind of, responsibilities I have at the company, is that the community standards that I talked to you about earlier, and I’m exaggerating for effect ‘cause – but we’re almost treating the world as if it’s flat. As if you can apply exactly the same standards everywhere. But of course, as you quite rightly alluded to, people have different attitudes towards different kinds of cont – I mean, to give you a…
Bronwen Maddox
And of course, what hate speech is…
Sir Nick Clegg
Well, yes, indeed…
Bronwen Maddox
…too, towards…
Sir Nick Clegg
…but I can…
Bronwen Maddox
…what harm is.
Sir Nick Clegg
I will give you a meaningful, but perhaps slightly comic, example. Every time I meet any Ministers and Politicians from Scandinavia, one of the first things they berate me about is why we’re such a censorious American company that doesn’t allow them to post pictures of their topless Baltic summer holidays, because nudity’s not allowed and so on and so forth. You could argue that American culture tends to be slightly more tolerant of violent content. So, you’ve got these – in India, the great focus there is about intercommunal violence. In the EU you’ve now got legislation in place, the Digital Services Act. Here in the UK, you’ve got a, sort of, version of it, the Online…
Bronwen Maddox
Online Safety.
Sir Nick Clegg
…Harms Act.
Bronwen Maddox
Which then became the Online Safety…
Sir Nick Clegg
But all of…
Bronwen Maddox
…Act, yeah.
Sir Nick Clegg
…those acts, of course, because guess what? The Parliaments find when they grapple with it, it’s bloody difficult. So, they end up actually leaving huge amounts of space for the regulators to fill the gaps and then you have all the different voices and lobby groups and others saying, you know, in – sort of, saying to the European Commission, or OFCOM in this country, “You should interpret these adjectives and adverbs in the legislation in this way or that way.”
It’s a non-stop debate, but it’s one, to your point, where I – my own feeling is, my own – it’s just probably my view rather than Meta view, if I can put it like that, my own view is that the world is generally becoming more fragmented in these matters, and the platforms will need to be responsive to that.
By the way, in a context where the internet itself may, of course, become ever more splintered and fragmented…
Bronwen Maddox
Hmmm, the…
Sir Nick Clegg
It already is.
Bronwen Maddox
The…
Sir Nick Clegg
We don’t have a global internet, it doesn’t exist. We have a Chinese internet, you’ve got a non-Chinese internet, Russia’s going in one direction, Turkey and others are trying to go in that direction. It – I think that trend towards fragmentation is most likely, I wish it were otherwise, but I think it’s most likely to accelerate.
Bronwen Maddox
Really interesting point. So, just come on, a technical point, what you’ve done about open-source. Just making this available to the world, where there’s an instant battle of, “This is great for the world to understand how you put yourself together,” against the, kind of, people saying, “This is really good for terrorists who want to get hold of it.”
Sir Nick Clegg
Yeah, so…
Bronwen Maddox
What…?
Sir Nick Clegg
…this is quite a vigorous debate in Silicon Valley and elsewhere. So, as I’m sure many of you know, generative AI starts with what are called foundation models. It’s like the, sort of, great hunk of clay or the great, sort of, unchiselled hunk of marble upon which you then build all the AI applications and use cases, and you can customise it. And that’s an incredibly expensive thing to build, I mean, extraordinarily. And remember, by market cap, we are actually very small compared to Microsoft, Amazon, Google and Apple, so they’re far, far bigger companies. But I think Meta is spending, I might probably need to check this, around US $40 billion just this year on building new AI data capacity, and that will continue, in fact, will almost certainly increase over time.
Why is it so expensive? ‘Cause you need a huge amount of data and you need a huge amount of compute capacity. You know, that’s why NVIDIA’s doing so well and everybody’s buying their GPUs and so on. So, there are only a very, very small number of companies, a handful in the West Coast, there’s one or two slightly smaller operators in – and there’s, basically, none in the UK. There’s one in France, Mistral, there’s ALEPH ALPHA, I think, in Germany, and there’s one or two in China.
And then, the question is, if you’re spending all of those billions, you want to recoup the money, of course you do. These are businesses, they’re not charities. What most of those companies are doing, it’s a totally legitimate business choice, is they say, “Oh, well, in that case, we’re going to ask people to pay for” what they call in the jargon, “API access,” so, sort of, a proprietary model. So, you, basically, pay a licence fee of some description to have access to those foundation models or variants of them.
What we have decided – well, I’ll come to that in a minute – partly because we feel we can afford to, because it’s not crucial to our commercial business model, is we, basically, give it away for free. Literally anyone can use it. The smaller versions you can run on your laptop and so on and so forth, and crucially, by the way, if you use LLaMA, that’s our large language model, the various generations of it, unlike the proprietary API models, none of the data goes back to us.
So, one of the great advantages, in my view, of open-source is it gives people complete sovereignty – particularly governments, for instance, who want to use it for sensitive purposes in the NHS, or intelligence or defence or whatever. It means that they can, basically, take it from us, and versions of LLaMA have been downloaded around 350 million times – by Researchers, Developers, Entrepreneurs. It’s hugely, hugely useful for that reason.
Now, the – my answer to your question about is it, sort of – is it a, sort of, gift to ‘bad people’? I think it’s worth reflecting on what’s happened in the internet over the last 20 years, or in the cybersecurity world, or in encryption. All the advances have been open-source and the whole of the cybersecurity industry is based on open-source technology. Encryption protocols are open-source. The internet itself, I mean, if you can remember, go back to the, you know, the whole battle between Microsoft and Linux, Linux won out. Open-source always wins out.
It not only wins out because it democratises the technology – which I personally think has got to be better in the long run than having a tiny number of very, very big companies in California, basically, that own the plumbing, the infrastructure, to the whole of the way in which we’re going to interact with the online world in the future – it also appears, over time, to be safer, because it allows everyone to prod holes and examine it, you know.
So, we, for instance, subjected LLaMA, I think it was last summer, to a hackathon in, I’ve forgotten the name of it, in Las Vegas. You know, all these people, sort of, trying to poke holes in it, and then – so, you’re not relying on just one company to patch and mend and play, sort of, Whac-a-Mole with the errors in its own, sort of, secret system. It’s completely open and that’s why so many of the standards and protocols and so on of the cybersecurity industry have been constantly strengthened through open-sourcing.
For what it’s worth, I think sitting here in London, it would be mad for UK Plc not to advocate open-sourcing, ‘cause there’s no way that the United Kingdom is going to come up with its own foundation models at the pace in which they’re being developed elsewhere. And it – and as a matter of sovereignty, I would’ve thought it’s crucial for countries like the United Kingdom to be able to take the work done by these private sector companies and then make it their own. You can’t do that if you’re…
Bronwen Maddox
So, this is an absolutely essential point. Again, I find that one very persuasive of – and I wanted to ask you what you thought governments could best do. No, I don’t – not – I’ll come onto regulation in a moment, but on the investment side, there’s a lot of it, talk about what, you know, what – the benefits that this can bring. The amounts of money that government has at its disposal do not compare with the…
Sir Nick Clegg
No.
Bronwen Maddox
…discretionary money you’ve just been talking about. And there’s much in the UK papers about Rachel Reeves paring money off the Edinburgh…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…supercomputer and so on, in order to spend on train drivers. This is not a political statement, but what is it that you think governments can best do?
Sir Nick Clegg
It’s a good question. I think thankfully, we’re beyond the point that worried me most. Actually, I mean, one of the – I remember saying this to the then Prime Minister, Rishi Sunak, and Oliver Dowden, who – they’d organised this rain-soaked get together in Bletchley Park, what was it, a year or so ago, and I was there on behalf of our company and so on and so forth. And I remember saying to him, like, “You – do you think Mrs Miggins, who’s, sort of, living round the corner in, sort of, 56 Orchard Close, is now more keen on AI or more terrified of it after two days of wall-to-wall coverage of the, sort of, terrors of – I can tell you. She’s a lot more worried than before.”
You have to make the case. Yes, of course the safety work is important, but if all you talk about is some distant apocalypse, which might happen – but I’ll tell you, this technology is way more stupid and rudimentary than people think it is. These things so-called hallucinate. Hallucinate means they give very stupid answers because they’re, basically, guessing machines. They’re ingesting vast amounts of data and then they predict the next what’s called ‘token’ in response to a prompt. Certainly the large language models – they will now become multimodal, they will also be able to operate – they’ll start seeing, if I can put it like that.
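The “guessing machine” idea – predicting the next token from patterns in previously ingested text – can be illustrated with a deliberately tiny sketch. This is a toy bigram counter, nothing like the scale or architecture of a real large language model; the corpus, function names and variable names here are all invented for illustration.

```python
from collections import Counter, defaultdict

def train_bigrams(tokens):
    """Count, for each token, which tokens follow it in the training text."""
    followers = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        followers[prev][nxt] += 1
    return followers

def predict_next(followers, token):
    """Guess the most frequently seen follower of `token`, if any."""
    if token not in followers:
        return None
    return followers[token].most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept".split()
model = train_bigrams(corpus)
print(predict_next(model, "the"))   # "cat" – it followed "the" twice, "mat" once
```

Real models predict a probability distribution over tens of thousands of tokens using learned neural representations rather than raw counts, but the basic move – guess the next token given what came before – is the same, which is also why they can confidently produce a plausible-looking wrong answer.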
So, of course, it’s evolving fast, but I do – so, I think there is a, sort of, bully pulpit point about saying that – you know, if you speak to anyone in medical diagnostics, you were saying this to me earlier…
Bronwen Maddox
Now, I was saying ear – before that…
Sir Nick Clegg
…they…
Bronwen Maddox
…that’s, sort of, the [inaudible – 28:26] are…
Sir Nick Clegg
I mean, if any of you in here – I mean…
Bronwen Maddox
…I feel most excited are in…
Sir Nick Clegg
Of course it is.
Bronwen Maddox
…medical diagnostics.
Sir Nick Clegg
Of course it is. We’re identifying, as a world, new antibiotics for the first time in 60 years because of advances in AI. One of my teams at Meta, called Data for Good, has released open datasets on forest coverage around the world, which has materially advanced the science of climate change and of forestry. And just think of the advances in terms of personalised education. It doesn’t mean you should ignore the downsides – you should always be mindful of them – but for so long, the political debate, and clearly the press-driven debate, ‘cause it’s always more interesting to slap on the front page of a tabloid a picture of a scary robot with glaring red eyes saying, “They’re coming to get you.” It’s going to, of course it is.
But I think there’s a political act of leadership where you have to explain. You can’t wish this technology away. Of course, you need to try and make it safe, but you want to try and harness it for the public good. And if even half, or a fraction, of some of the breathless predictions from the Analysts about the effect of AI on productivity materialise, then for a country which has laboured – including, by the way, during the time I was in government – under what the Economists call the ‘productivity puzzle’, that even if we produced jobs and all the rest of it, we just weren’t improving our productivity, we should be grabbing hold of that.
Bronwen Maddox
So…
Sir Nick Clegg
But this is the final point, if I can make.
Bronwen Maddox
Yeah.
Sir Nick Clegg
Because what government can do, and actually, particularly perhaps in a country with such centralised public services as we have, is deploy AI. We’re not going to invent AI any more effectively than some of these Silicon Valley companies, but there’s no golden rule that says because you’ve invented a technology, you’re going to be better at deploying it. And if anyone wants any evidence of that…
Bronwen Maddox
No, the brain’s…
Sir Nick Clegg
Does anyone…?
Bronwen Maddox
…watching.
Sir Nick Clegg
Has anyone here got an American bank account? I mean, the American banking system is like – it’s stuck in the 70s. You still have to have a chequebook, you still have to ring a human being.
Bronwen Maddox
Well, it’s one of…
Sir Nick Clegg
It’s just…
Bronwen Maddox
It’s one of the facets of a federal, kind of, country when the federalism comes back to bite it. But on this point…
Sir Nick Clegg
But do you know what’s funny…
Bronwen Maddox
…one…
Sir Nick Clegg
…about deployment?
Bronwen Maddox
Yeah.
Sir Nick Clegg
There’s a huge – and I think, to be fair…
Bronwen Maddox
No, no, and Britain is a very good example of what you’re talking about of, often, you know, inventing things and then not making…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…things of them. But one of the people who has put himself in that bully pulpit you were talking about is Tony Blair.
Sir Nick Clegg
Yeah.
Bronwen Maddox
Who’s been saying, really very loudly, a lot recently, “Look, this is the answer to Britain’s productivity problem,” and saying to government, the new government…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…“This is part of the answer to your budgetary problems. This is how to get more…
Sir Nick Clegg
Yeah.
Bronwen Maddox
…out of the public sector.” Now, obviously, you know the workings and non-workings of the UK Government quite well. Do you think he’s right?
Sir Nick Clegg
Yeah, I think he’s probably right, in a sense that you have to – so, particularly if you’re running a government, there’s only so many hours in the day and politics is about choice, it’s about what you emphasise and what you don’t. You can’t do everything, and particularly if you’re a Prime Minister, or you’re senior in government, you have to be super unambiguous.
Bronwen Maddox
But just to bring it down to the granular, non-metaphorical level, what is it that it can do that the British Government, Civil Servants and all this are now doing? What can it – how can it improve?
Sir Nick Clegg
AI?
Bronwen Maddox
Yes, in…
Sir Nick Clegg
Well, I think how we interact as citizens with the state can be completely transformed by AI. Whether it’s pension chatbots that you can actually get hold of and that can immediately give you an answer. Whether it’s how your child is actually taught, through tools which give personalised education rather than just the same standard rote learning to all 30 kids in the school. Whether it’s the improvement and the speeding up of diagnostics. Whether it’s urban planning and traffic planning in our congested cities. Anything which requires processing, digestion, prediction based on large amounts of data will, of course, be affected by AI.
And I suspect, by the way – you know, this is a bit further afield – well, actually, in many ways already, I mean, certainly in the services I know about, there is nothing that you will find on Facebook and Instagram which is not already, in some shape or form, sliced or diced by what are called AI classifiers, which, you know, rank and decide what you see, in what order and so on. I think in future, pretty much everything – except perhaps pure messaging apps where you’re literally just sending your text, but even then – I think we’ll interact with businesses through AI-powered entities. I think we’ll be able to book our holidays through AI entities. You could literally say what you like, have a discussion with them and they’ll come up with a proposal. I just think it’s going to be transformative in so many different ways. Doesn’t mean there aren’t downsides.
But in answer to your question, every single department in Whitehall should be working out how to deploy it to good effect. And to your question about what Britain can do, I think one of the very live issues right at the moment, which refers perhaps to one of the things that, did you say Evans-Pritchard was writing…
Bronwen Maddox
Hmmm hmm.
Sir Nick Clegg
…about at The Telegraph? As I explained, these AI models are nothing without data. I mean, the models just don’t work. It’s like a car without electricity or fuel. It just won’t move, right? They’re very data hungry, and there’s a whole, much more, sort of, futuristic debate from some very, very clever AI Data Scientists, who say in the long run, actually, you’ve got to change the paradigm and move beyond today’s generative AI to something which is much, much lighter on data.
Bronwen Maddox
Interesting.
Sir Nick Clegg
Something that almost thinks – mimics the human brain and so on.
Bronwen Maddox
And they’re – yeah.
Sir Nick Clegg
Because there are some, literally and metaphorically, some sustainability questions about the current…
Bronwen Maddox
Yeah, it is, and therefore on storage, therefore on energy, and so on.
Sir Nick Clegg
All of that.
Bronwen Maddox
All that.
Sir Nick Clegg
Exactly, all of that. But right now, the advances are – and here’s the strange thing in Europe. I mean, I have to say, and I say this as someone who was an MEP for five years, who was, and remains, a sort of, ardent critic of the self-harm of Brexit. I even worked as a European Official in the European Commission for five years. The EU seems to be, at the moment, sort of, moving towards the most perverse decision, or set of decisions, taken through a, I won’t bore you with it, a rather…
Bronwen Maddox
Well, which ones…
Sir Nick Clegg
…comm…
Bronwen Maddox
…are you thinking of?
Sir Nick Clegg
…common…
Bronwen Maddox
I mean, I’m thinking of the – well, among others, of the Google tax…
Sir Nick Clegg
Yeah, no, that’s a…
Bronwen Maddox
…evolved thing.
Sir Nick Clegg
…tax thing.
Bronwen Maddox
And it’s a…
Sir Nick Clegg
And that’s a…
Bronwen Maddox
…tax thing, so slightly separate.
Sir Nick Clegg
That’s a – no, that’s, sort of, a slightly more plain vanilla tax thing and I won’t…
Bronwen Maddox
Yeah, so what are you…
Sir Nick Clegg
No…
Bronwen Maddox
…thinking of?
Sir Nick Clegg
…it’s this. It’s about the use of data. If you want European AI foundation models, you inescapably have to train those models on EU data. Otherwise, you can’t, right? Otherwise, all you end up doing is using hand-me-down AI models from California. It’s the same in the UK. If we want AI models, if we want AI avatars, chatbots, you name it, which understand our language, our history, our landmarks, our very different English idiom compared to American idiom, you have to train them on UK data. And on nothing spooky – public data, data that’s out there on the internet, right? That’s what all the companies like us are seeking to do.
The European Union appears to have said that we’re not allowed to do that. So, in an act of regulatory sovereignty, they will end up, a) I suspect, just not having a bunch of products produced by these companies rolled out in Europe, even though they’ll be rolled out everywhere else, but secondly, they’ll be in the weird situation of relying on AI models that are not Europeanised. It is completely beyond me how it is in Europe’s interest not to allow these AI companies, notwithstanding the fact that they’re American, to actually produce, if I can put it this way, customised, Europeanised AI foundation models. And that’s where the UK, which is still, basically, abiding by the same legislation, GDPR, could, if it wanted to, take a more nimble approach.
Bronwen Maddox
Really interesting point. On that, let’s go to questions. I’ve loads more, but there’s going to be loads and loads and loads here, and there’s loads online. So, let me – alright, I’m going to take them in pairs, because there’s loads of them. So, let me start right here and on the aisle there. Let me take those two first.
Digby Walker
Is there a microphone situation?
Bronwen Maddox
Yes, there is a ‘microphone situation,’ because we have lots of people online and they won’t hear you without one.
Sir Nick Clegg
‘Microphone situation.’
Digby Walker
So, Nick, that was really fascinating. Digby Walker…
Bronwen Maddox
If you’d like to say…
Digby Walker
…from The Carbon Trust.
Bronwen Maddox
…who you are.
Digby Walker
Digby Walker from The Carbon Trust, but…
Bronwen Maddox
There you go.
Digby Walker
…asking on behalf of myself. Taking us back briefly to the question, which was, “Can Democracy Survive the Pace of Technology?”
Bronwen Maddox
Thank you.
Digby Walker
Do you think AI and technology are more of a positive for democracies or more of a threat, and likewise more of a threat or a positive for authoritarian regimes – that, sort of, contrast between the threats and positives of technology for different types of regime?
Sir Nick Clegg
I definitely think technologies are used by authoritarian jurisdictions to surveil. I mean, just look at the Chinese internet – it’s quite, quite different. It’s a totally different paradigm: you build, basically, a wall round the internet and you heavily surveil people inside it. That’s why the Chinese internet is quite, quite different to ours. So, clearly all technologies can be used that way – the same could be true of camera technology, of course.
I think for democracies, my own spectacularly unfashionable view – and I feel I have a little experience now, after six years or so in Silicon Valley and poring over the research on this – is that people tend to somewhat exaggerate the role that technology plays in how people vote and in political behaviour.
I’ll tell you the two sets of people you generally shouldn’t listen to about technology: the most ardent, fanatical advocates of technology and the most ardent, fanatical critics of it. They both reduce human conduct and behaviour to a, sort of, techno-determinism. You know, “Oh, it’s the algorithm that made you think this.” No, it isn’t. We are much more complex animals than that.
And interestingly, there’s the research that’s been done – and I can share the link if you like. There’s a research programme which I helped set up called US 2020. We worked with, I think it was about 11 universities across the US, to allow academics, in a privacy-protected way, of course, to run very sizeable experiments during the 2020 US election on how people used Facebook and what effect it had on their voting behaviour. And if I can just share, ‘cause it’s really interesting, there’s two examples.
So, I may get some of these stats wrong, but they took large samples, 150/200,000 people who volunteered, and they said, “You’re now going to use Facebook up to the point at which you vote in 2020, for Biden or for Trump. And you’re going to use Facebook, but you will have no virality, no distribution at all.” And then they told another 150/200,000, “You will have no algorithmic ranking at all. It’ll be completely chronological.” And I’m probably simplifying, because the academics have published this in peer-reviewed papers, in Nature and Science and so on. We had no role over it – all we did was build the systems so they could run these experiments.
They basically concluded it made almost no difference at all. You know, in the same way, the research has shown that this common refrain about filter bubbles doesn’t really, kind of, exist – the idea that there’s a causal link between the use of social media and polarisation. Actually, polarisation has gone down in a lot of countries where social media use has gone up, or polarisation has gone up amongst the sections of the population that are least active online.
So, I just think when you break it down – and I don’t want to be dismissive or complacent about all this, otherwise we wouldn’t be spending US$20 billion and we wouldn’t have 40,000 people working on this. There are heavy responsibilities that the platforms have, but I just think there’s this tendency, always, to – here’s the final observation on this.
The weirdest thing I have found about the debate around AI in the last couple of years is that we all immediately start anthropomorphising something we ourselves call artificial. We confer on AI, sort of, humanlike and often all-knowing, omniscient, slightly demonic powers. They’re machines, they’re machines, they’re not people. They don’t have the intelligence we have – they’re artificial. I find it so frustrating that we really, as humans, can’t resist projecting ourselves onto these machines.
And yes, of course, they play a role in all of our lives, and the evidence is they don’t actually, in democracies, have nearly that, sort of, determining effect. In the same way, when I was in politics, I used to go, “Ah, why is the newspaper saying this about me and that about me?” I found after a while that it didn’t matter how many times the Daily Mail screamed at me, there were still, amazingly, Daily Mail readers who voted for the Lib Dems. I couldn’t work it out myself, but guess what? It meant that the newspaper really maybe wasn’t as important as I thought it was. I just think we sometimes, kind of, try and reduce how people think about themselves, their family, what’s important for them, the choices they want to make, in a somewhat, sort of, simplistic way, and we certainly do that with technology.
Bronwen Maddox
I’m going to go onto this other question, but there’s a lot online, which I’m going to play back to you, on – pushing at that point, and whether – not disagreeing with the point you’ve made that the – that this is technology, not human, but whether it amplifies some of the worst characteristics of humans. I’ll come back to that. Question here.
Latika Bourke
Hi. Is this on?
Bronwen Maddox
Yes.
Latika Bourke
Yes, okay. My name’s Latika Bourke. I’m a Writer-at-Large in The Nightly in Australia. The Australian Government has just announced it wants to ban social media for under 16s.
Sir Nick Clegg
Hmmm hmm.
Latika Bourke
And Meta’s response to…
Sir Nick Clegg
It’s under 14. Oh, no…
Latika Bourke
Teenagers, yeah.
Sir Nick Clegg
Yeah.
Latika Bourke
Meta’s response so far has been to say that you believe the entire App Store should make apps downloadable only pending parental consent. If you’re not successful in outsourcing that obligation, will you enforce this ban? Can it be enforced, and do you think it’s even desirable?
Sir Nick Clegg
Right, for those who don’t know – I think, but you might want to correct me – the Prime Minister of Australia has announced in outline, without any detail, that he wants to pass legislation so that no child below a certain age, I don’t think he’s specified which, but I thought I’d heard 14, maybe, would be able to use social media at all. And I think between the ages of 14 and 16 they would need to have parental consent to use social media.
I mean, look, at the end of the day, if a democratic government passes legislation on that, we will, of course, abide by it. I would only say two things to make it actually work. Firstly, as anyone who’s got teenagers will know, no teen uses one app. I don’t know what the statistic is here or in Australia, but in the US, and I couldn’t believe this, many, many teens use up to 40 apps. And teens are incredibly versatile at using apps and messaging apps to, kind of, communicate with different people, show a different side of their identity. Some of it’s very, you know, photographic and public, some of it’s a bit more intimate and so on.
So, you have to have a system which captures that whole ecosystem, and that’s the point about app stores that I think you were referring to. And it’s not just Meta – lots of people are saying this – there are only two chokepoints in the modern internet: the operating systems upon which everything else is built, which are iOS, owned by Apple, and Android, owned by Google. So, we’re not – I think you said ‘shuffling’ – we’re not shuffling off anything. We’re saying, if, from a parental point of view, or indeed from a government’s point of view, you really want to have control over who uses apps or not, the only way you can practically do it is through that, sort of, chokepoint: the point at which you actually download the apps from the app store.
And so, that’s why I think there is an increasing number of people who say this. There’s no point asking – of course, big companies like us will do it, but there is a multitude of different apps now, some small, some, you know, with different policies. I mean, X and Facebook are completely different now, and TikTok and Snap. If you ask each company to play Whac-A-Mole with these things, it’s going to be a nightmare for parents, ‘cause they’re going to have to do it on each single app. So, that’s the first point.
And then, the second point, related to that, is make it comprehensive, because if you don’t, it will become a fragmented thing – companies will do different things. It won’t actually capture the full online experience of young people. So, I think, you know, inasmuch as the debate unfolds in Australia and elsewhere, if you’re going to make a big move like that, rather than just literally decreeing it, you’ve got to make it workable. It’s got to cover all the apps that young people use, not just some of them, and, in our view, and no-one’s come up with a better answer, you’ve got to be practical about how you make that happen for all apps. And the only way is through the app stores. There’s no other way.
Latika Bourke
Do you think it’s desirable?
Sir Nick Clegg
I’m not really going to – look, the last thing you want from me – look, no-one cares whether Meta thinks it’s desirable or not.
Bronwen Maddox
Hmmm hmm, hmmm hmm?
Sir Nick Clegg
I think it’s – I think if that is what society…
Bronwen Maddox
It’s a bit…
Sir Nick Clegg
…decides and…
Bronwen Maddox
…disingenuous. I think yeah, people do care, hmmm.
Sir Nick Clegg
No, no, I – cour – I massively care, of course I massively care. Otherwise, well, over the last several years…
Bronwen Maddox
No, no, I was going to say people do care what Meta thinks, you know.
Sir Nick Clegg
Yes, I know.
Bronwen Maddox
I think they do.
Sir Nick Clegg
I was giving you the answer.
Bronwen Maddox
Alright.
Sir Nick Clegg
And of course I care. By the way, as a father of three teens – well, they’re a bit older now – of course I want, and we want, teens who use our apps to do it in a fruitful way, a positive way. Clearly, that doesn’t happen in a minority of cases. But all the evidence suggests that for the overwhelming majority of the time, for the overwhelming majority of young people, it’s a positive experience in terms of developing friendships, developing their own identity and so on and so forth.
We’ve introduced around 50 tools, over the last few years, to give parents more control over the time that their kids spend and who they’re friending. We default under 16-year-olds into a much more restrictive setting, so they can’t be contacted or communicated with by people they’re not connected with, etc., etc.
And guess what? One of the things we do find, I’ll be very open about this, is that even when we build these controls, parents don’t use them. So, we have a behavioural issue, which is that we, as an engineering company, might build these things and then we say at events like this, “Oh, we’ve given parents choices to restrict the amount of time kids are on.” If parents don’t use it, of course, it’s not – so, I’m actually – I think – I actually think there’s more we can and should and will do. I hope we’ll be making – look out for it, I think we’ll be making some very significant announcements fairly soon to try and really make these controls simpler, easier, particularly reassuring for parents.
And if governments want to take those measures, fine. It’s totally their, sort of, right to do so, but make it workable. Make it workable in a way that matches the actual lived experience of teens. I mean, you know this – I’m an ex-Politician, and the number of Politicians who talk about apps they’ve never used, don’t even know how to find them on the App Store, have no idea how teens actually live their lives – you’ve got to go with the grain of how teens actually live an online life. That’s why: make it comprehensive, cover all the apps, and administer the checks, the controls, the age verification through the app stores.
Bronwen Maddox
Okay. Just take some more questions. I’m going to take one on the aisle there.
Carole Walker
Thank you. Carole Walker from Times Radio. You talked about how AI is used in how you screen out material that shouldn’t be on platforms like yours. I would just like to understand a bit more about how this works, because clearly, when you come to, as you’ve talked about, legal but harmful material, the dividing line between protecting vulnerable people and protecting free speech is a very difficult one. Did you at some stage talk about having ‘100 factcheckers’? That didn’t sound like very many to me. So, I’d just quite like to understand, does an AI bot just determine this post falls the wrong side of the line and it’s out? Does it always go through a human, and how narrowly can you set the definitions and the terms for the bots that you use?
Sir Nick Clegg
So, the Community Standards, and I hate to suggest this to you, but have a look, it’s – they’re very readable. I mean, they’re online. Have a look at…
Carole Walker
But that’s a category.
Sir Nick Clegg
No, it’s not a category. That literally lists the categories of content and defines them – defines hate speech, defines bullying and harassment. And then, as I say, it’s reinforced with a lot of data about how much of that content we identify and remove, and the percentage of that content that we remove because our systems, which I’ll come to in a minute, have identified it, rather than users having reported it to us. ‘Cause those are the two main routes: either we find it or someone complains and we respond to it.
And in answer to your question, in the jargon, we train things which are called ‘classifiers’, and the more that you can train a classifier on a pattern which repeats itself, the more effective it becomes. ‘Cause they’re pattern recognition systems, which is, in effect, what AI has always been. So, for instance, terrorist-related content was a huge issue, quite rightly, you know, ten years ago or maybe less. I remember in government, I used to beat up tech companies: “You’re not doing enough to take down ISIS and Daesh related content.” I think, I need to check, but I think now 99.8% of all ISIS and Daesh related content is removed from Facebook by our systems before anyone reports it. And the reason for that is because we’ve been able to train these classifiers on the kind of repeat content that is ISIS and Daesh related.
The difficulty, to your point, is when you get edge cases, when you get things which – you know, is this someone expressing just horror, or inciting? You know, I don’t know, a video of the Police versus a mob, and someone posts, “Oh, they get what they deserve.” Are they saying that about the Police or are they saying it about the rioters? How are you supposed to understand that? Our systems are not always going to understand that. So, you have a very sophisticated, but never perfect, process of triage, where the overwhelming majority of content, given the amount and the speed with which it is being generated, is triaged by systems. And we often get ferociously criticised for taking down content, or demoting content, because we haven’t thought about the context enough, which is why we built this Oversight Board – and no-one else has done that – so that people can then go to the Oversight Board and they can tell us we were wrong.
And then, you get a, sort of, group of edge cases, which will then be referred to human Content Moderators, who will then look at it. And it’s, you know, painstaking and difficult work, ‘cause at great speed they’re having to look at lots of often really, really unsavoury content as well. The factcheckers – well, 100 factcheckers is a lot, actually, around the world, and I’ll tell you why. It’s very important to remember this. There’s no point saying to a factchecker – I think the factcheckers in the UK are Full Fact and Reuters, is that right, and others? There’s no point saying to them, “Find everything that you think might be partly false or missing context and false and flag it,” because the vast majority of content like that, no-one, or only a tiny handful, actually sees.
So, the way we design the system is that we have, again, automated systems which will try and predict whether a piece of content is likely to go viral or not and is going to enjoy significant distribution. And then there’s, basically, a queue of content from which these independent factcheckers can decide what to take out and look at. This is probably more detail than you want, but it’s important. They can either just say something’s completely false, in which case it will be demoted and, basically, a filter is put over it, saying, “An independent factchecker has found it was false.” Or they can say, “It’s missing context” or “It’s partly false” and so on.
But the system is built to try and capture that. And one of the things that, again, is difficult about this debate is that quite a lot of it sometimes gets a little bit focused on one person finding one piece of egregious content and saying, “Isn’t it terrible that that’s online?” rather than asking the much more important question: how many human beings have actually seen it? And I’ve even…
Carole Walker
I mean, there’s certain things, I have…
Bronwen Maddox
Could – can…?
Carole Walker
…to say, “Swallow a bottle of bleach and it’s going to kill your COVID.” I mean, does that get screened out by AI, or does the stuff immediately get taken off, and if that person happens to be the President?
Sir Nick Clegg
Well…
Carole Walker
I mean, does it have to go to a human?
Sir Nick Clegg
Well, it wasn’t Meta’s responsibility that Donald Trump said that.
Carole Walker
No, I am…
Sir Nick Clegg
Yeah, so…
Carole Walker
…aware of that.
Sir Nick Clegg
…I know we’re con…
Carole Walker
I’m giving…
Sir Nick Clegg
I know a lot – we’re blamed for a lot.
Carole Walker
I’m giving this as a…
Sir Nick Clegg
But then…
Carole Walker
I’m giving you a…
Sir Nick Clegg
…he said it on television, he said it on radio. He’s repeated it in every single newspaper. You’ve got, you know, you’ve got to be – and by the way, that is one of the reasons, incidentally, why – controversially, but I staunchly believe this was the right thing to do – we decided not to ask the independent factcheckers that I referred to earlier to factcheck the content and the speech of Politicians. Because at the end of the day, a tech company should not be standing between what Politicians in democracies say and the voters who hear them, the good, the bad and the ugly. That is not our role.
Now, some people, particularly on the left in the United States, say, “That’s outrageous. You’re letting Donald Trump, or anyone, off the hook.” No, we’re not going to start becoming a referee of political speech by political candidates. That is not our role. People always complain about the excessive power of these platforms. Just imagine if we started literally editing the – you know, what the…
Bronwen Maddox
But this is exactly the point, that people do want you, to some extent, to be a referee between things that can accentuate…
Sir Nick Clegg
I get the…
Bronwen Maddox
…essentially…
Sir Nick Clegg
My impression is…
Bronwen Maddox
…violence. We’ve got some references here to Facebook’s role in amplifying violence against the Rohingya in Myanmar in 2017 and so on.
Sir Nick Clegg
Hmmm.
Bronwen Maddox
But it is…
Sir Nick Clegg
Right, so…
Bronwen Maddox
So, now – and I’ve got a lot of questions online saying, “Are you not dodging that point about being a referee…
Sir Nick Clegg
I hope I haven’t dodged.
Bronwen Maddox
…a referee of the information?”
Sir Nick Clegg
I’ve tried to be very, very clear that in that huge swathe of content which legislators have decided to keep legal, but which we don’t want on our platform – our advertisers don’t want it on our platform, our users don’t want it, and we don’t want it – we take a very, very different stance to other platforms. X, under Musk, has basically just said anyone – you know, so, for instance, in the riots in the United Kingdom recently, some of the main propagators, Tommy Robinson, the EDL, Andrew Tate, they were all banned on Facebook and Instagram years before, but were allowed to run amok on X or indeed, on Telegram.
So, I can only speak for our platform: we are super open about where we draw the line. Of course I accept – and I live and breathe it every, sort of, day in the office – that people always say, “You’re not going far enough,” and other people say, “You’re going too far.” We can only be as open as possible about how we adjudicate on that. Of course, one person’s hate speech will be another person’s claim to the right to free expression. That is an age-old debate. But we consult widely, and we are super open about why we take the decisions.
We hold ourselves to account through all the data we publish, increasingly making ourselves more open to research – objective research, I mean, not Meta research, research done by academics as per US 2020. To Digby’s question, the US 2020 project is a good model which I want to replicate elsewhere. So, for instance, we’ve just recently entered into an arrangement with the Center for Open Science in the United States to, again, make unprecedented amounts of data available to them so they can do independent research into how teens use our apps. So, I think those are all the things that we can do.
Does it absolve one of responsibility? No, particularly not of events in the past. You mentioned Myanmar – the company is now, thankfully, completely different since then. We’ve built up all these different guardrails since then. But unless someone has got a better answer, in a space where governments and parliaments have decided not to act, I don’t quite know what other principles we can proceed on…
Bronwen Maddox
So, you’re obviously…
Sir Nick Clegg
…other than the one of transparency and accountability.
Bronwen Maddox
…as is entirely reasonable, speaking for Meta, but do you think the world would be better and indeed, more stable, if X and Telegram and others took that line?
Sir Nick Clegg
Look, they need to decide, and governments need to decide, what they want to do in response. I mean, you know, one of the complicated things – which was my point to the lady who was asking about Australia just earlier – is you’ve got to remember, these apps don’t operate in isolation. So, stuff that gets posted here gets shared there. A comment there gets screenshot and, you know, shared elsewhere. And one of the problems, actually, we found in the UK riots recently: we had to block a whole bunch of content and links to Telegram groups which were, basically, being used for the instigation of those riots.
Because this content sloshes from one platform to the other, and certainly in the time that I’ve been in my job, I have seen, interestingly enough, the different apps operating in increasingly different ways. And I think one of the challenges for policymakers is that the debate you keep hearing is about “social media” as if it were one lumpen thing. It’s not. In the same way, it makes no sense for me to say to you, Carole, “Oh, it’s like the media.” I mean, the FT is different to the Daily Mail, Channel 4 is different to BBC Radio Lincolnshire. People use them differently; they have different policies. I think you increasingly have to look at these apps in different ways, and if you’re going to take measures, as per the one in Australia, you’ve got to make sure that it can actually be administered to cover them all.
Bronwen Maddox
Okay. Zoe Kleinman online, I was about to ask your question. I think we’ve just about covered it. Do we have one over there? Yes.
Alex Krasodomski
[Inaudible – 58:46] at Chatham House. Cool, thanks very much.
Bronwen Maddox
But…
Alex Krasodomski
I just wanted to finish on a question about…
Bronwen Maddox
So, it’s Alex Krasodomski from Chatham House.
Alex Krasodomski
Yeah, thank you. I look after…
Bronwen Maddox
One of our Digital Techno…
Alex Krasodomski
…some of our technology work. I wanted to finish on a question around values. I’m really struck, looking at what we’ve seen in Silicon Valley over the course of this electoral period in the US, that a number of voices at platforms that I think previously had tried to come across as pretty politically neutral, tried to stay out of politics, have changed their tune in quite a big way. You’ve obviously mentioned X and Musk. There’s a lot of investment and so on. Have you seen that change happen at Meta? Have your values changed over the last ten years? Is it still the company that was once, sort of, heralded as “The vanguard of democracy sweeping the Middle East?” And now, what are we left with today?
Sir Nick Clegg
Yeah, there’s been a significant change and it’s in the other direction, and it’s user led. I mean, look, I won’t ask for a show of hands, but people who go to X want to go to X to start yelling at each other about current affairs and politics. It’s not actually why people join Instagram and Facebook. You’ve got to understand, again, to my point, how different they are. X is tiny, it’s tiny, it’s for elites, it’s for people like us. Honestly, it’s not used by the vast majority of normal people around the world. It really isn’t. It’s smaller than Snap, I think, in size, alright? So, it’s a tiny, elite, news-obsessed, politics-obsessed app.
The vast, vast, vast majority of people join Facebook and Instagram for much more playful reasons. You know, they want to exchange stuff with their family and friends, or about their holidays or about their daughter’s soccer team, or they want to join a group. And what we keep hearing from our users, over and over again, is, “Can you just get the politics out of my feed? I just don’t want it.” And anyway, it’s organically reduced.
I think the total amount of politics and current affairs on our services now, as a proportion of the total, is less than 3%. So, it’s been declining and declining and declining, which is why you have this very difficult debate between platforms like us and Publishers and the press, ‘cause they’re saying, “Oh, you’re taking our hard-worked, sort of, content so that you can engage users.” No, we don’t want it, we don’t need it.
So, our users don’t come for that, and I think what we have done as a result is we’ve removed the boosting, if you like, the amplification, of current affairs and politics. We have made our rules on politics and campaigning much more stringent. You can’t now use our services to so-called micro-target political ads anymore, ‘cause we’ve removed all the categories that you could formerly use. We now have a political ads library, where if you run an ad, your ad will stay there for seven years. You’ll see who paid for it, who they were trying to address, what the content was – way more transparent for Researchers and so on than any political advertising on television, radio and so on and so forth.
So, politics, if anything, in answer to your question, Alex – whilst, of course, X has moved and it’s obviously, for Musk, become a sort of one man, hyper-partisan, ideological hobbyhorse – for us, as Mark Zuckerberg said recently, he doesn’t want to get drawn into this. If you’ve got four billion people around the world using your services, you’ve got people with lots of different persuasions. But the key thing is, as I said earlier, they don’t really tend to use our services to yell at each other about what the latest headlines are. It’s just not the nature of the product. So, I think there’s been – I wouldn’t call it de-politicisation, but there’s certainly been a tendency for the amount of political content to diminish over time on our services.
Bronwen Maddox
May I, on that note, just bring us back to the UK? We’ve got one micro-minute. There’s a question from Ade Clulow, asking, “What about the recent riots in the UK, which gave us the example of social media platforms finding language acceptable even though the words turned out to break the law of the land?” What is your comment on that?
Sir Nick Clegg
Don’t quite know what he’s referring to, but as I said, all I do know is that a number of my teams round the world respond to crises, and there are crises happening around the world all the time. We have an internal crisis protocol and we set up a team instantly, the moment those riots started. I think we were in touch with the Police and law enforcement within an hour of hearing about them. We took down a number of pages, a number of entities. Our fact-checkers published a number of fact-checked articles, as I say. The main protagonists were banned on our platform anyway, as I said. We actually severed some of the links to other apps and platforms.
If the question is about how public Facebook content might be cited in criminal cases against individuals, I wouldn’t…
Bronwen Maddox
There was a woman then charged for hate speech which platforms had published.
Sir Nick Clegg
Right, and without knowing…
Bronwen Maddox
Yeah, that’s fine.
Sir Nick Clegg
…how it was cited in court, it’s just super difficult for me to know.
Bronwen Maddox
We’re going to have to stop there, in any case. Thank you. There were some terrific questions online, and I’m sorry I could barely do justice to them, including to the debate that is happening within the platform – you answered it, but the people online would’ve liked a lot more – about whether content moderation is driven by revenue. We had one from Peter Price that it was almost irresistible to ask: “What are your best and worst experiences in Silicon Valley?” but we couldn’t get there. And a really interesting debate, from Alan Houmann and others, about whether “posting on social media should be anonymous or not, whether people could be held to account,” while on the other hand, “it’s a protection that many need.” And that debate is going on as we speak. There was a forest of hands up here, and I’m sorry – this is human moderation of this event, with all the failings that implies, but the biggest constraint being time.
Thank you for coming. Can you join me in thanking Nick Clegg?
Sir Nick Clegg
Right, thank you [applause].