Esther Naylor
Hello, and welcome to today’s event, COVID-19, Trends in Technology: The Perils of ‘Tech-Solutionism’. Technology has been central throughout the COVID-19 pandemic and the development of national track and trace apps and mass data collection for virus monitoring have highlighted a multitude of new realities and challenges for society worldwide. The pandemic has undoubtedly accelerated digital transformation.
Over a year into the pandemic, around 48 countries have developed their own contact tracing apps. There have been roughly 13 COVID-related apps released in the US and 11 in India. By the end of 2020, the UK COVID-19 app had been downloaded 20.9 million times, with a population estimated to be well over 66 million. In Singapore, the TraceTogether app was downloaded by four million people, with a population shy of six million. However, there has often been mixed evidence as to the effectiveness of the apps themselves, and more often than not privacy and security concerns have dominated the narrative. These apps are a treasure trove of personal data and have been accused of ushering in a surveillance state without limits. More recently, Google and Apple have blocked a planned update to the UK’s NHS COVID app on the App and Play Stores, citing privacy violations. Whose responsibility is it anyway to uphold and safeguard individual rights and freedoms?
But there’s a new kid on the block: vaccine passport apps, or digital health certificates. Proof of immunisation status is the latest tech solution at the forefront of discussions on whether or not we need an app for that. Both the UK and US governments have publicly stated that there is no intention to make these apps mandatory. New York has recently released the Excelsior Pass for residents, which verifies a negative COVID status and a vaccine status. In Israel, there is a Green Pass, an app that’s needed to gain entry to things like gyms, weddings and indoor events. The EU has also endorsed the idea of a digital green certificate to be rolled out in the summer of 2021, the details of which are left to member states to decide. While the perceived failure of contact tracing apps is a natural reason to dismiss things like digital vaccine passports, this new tech solution has revived the same privacy and ethical concerns around centralised and decentralised data collection, around using data for the public good without further entrenching inequalities or limiting rights and freedoms, and around the disproportionate power that some technology companies wield over elected governments.
If anything, this past year has been a huge test of trust in governments, technology and society itself. Today’s event will reflect on responsible and accountable technology policy and governance a year into the pandemic. My name is Esther Naylor, and I’m a Research Analyst for the International Security Programme and on the Editorial Team for the Journal of Cyber Policy. Today, we’re joined by four excellent panellists: Malavika Jayaram, the Executive Director of Digital Asia Hub, Carly Kind, the Director of the Ada Lovelace Institute, Dr Rachele Hendricks-Sturrup, the Health Policy Counsel and Lead for the Future of Privacy Forum, and Emily Taylor, Associate Fellow of the International Security Programme and also the Editor of the Journal of Cyber Policy.
So, before we jump into today’s event, I just have a few housekeeping notes for all of our participants. This event is being held on the record, so please feel free to tweet about it, using the hashtag #CHEvents. I will also ask all attendees to put your questions in the ‘Q&A’ function. We will be asking attendees to ask their questions live, so if you would prefer your question to be read out, please indicate so in the Q&A. This event is also being held in collaboration with the International Security Programme, and it builds on a research paper entitled, The COVID-19 Pandemic: Trends in Technology, Transformations in Governance and Society. This paper examines some of the risks that have been highlighted and aggravated as society has transitioned into a more virtual way of living, and it’s part of broader work undertaken by the International Security Programme on Trends in Technology: What Does the Future Hold?
So, I’d like to turn to the panellists now, and start by asking a question to each of you. What, if any, positive or innovative developments have you seen with regard to tech or policy [inaudible – 06:56] in the pandemic? And Emily, I’d like to turn to you first.
Emily Taylor
Well, thank you very much for that introduction, Esther, and for inviting me to be part of the panel today. I think with any crisis like the pandemic, the positive aspects, I suppose, are that it can be a great galvanising force to eliminate bureaucratic and other hurdles and institutional inertia, and foster creativity, and there’s been a huge uptake in the use of technology. But the example I would highlight, and it’s probably in contrast to the app, is the rollout of the vaccine in the United Kingdom. There, technology is being used in a very low-key way to support efforts to roll out the vaccine, but the whole programme has been rooted in local communities, in the GP lists. They know already who’s on their lists, who is the most vulnerable, so you’re using that local expertise and supplementing it with a mixture of centralised and decentralised, local and national, so that as a country we are getting statistics every single day about the vaccine programme and how it’s being rolled out. But essentially, this is an intervention that is based in pre-existing expertise, institutions and human systems.
Esther Naylor
Thank you for that, Emily. Rachele, can I turn to you and ask, you know, about any, kind of, innovative positive developments that you’re aware of?
Dr Rachele Hendricks-Sturrup
Absolutely. Well, where I live in the United States, which I can give as an example, obviously, digital contact tracing apps and vaccine passports are equally emerging. A lot of these initiatives are rolled out at the state level; there’s really no federal-level tech solution, in that regard, here in the United States. At the start of the pandemic here, or at least at the start of the shutdowns, Google and Apple released an API – an application programming interface – that would allow state authorities to create their own digital contact tracing apps, and that would allow Apple phones to talk with Android phones, so that digital contact tracing could occur by measuring proximity from one of those types of phones to another. There were significant, you know, limitations to that particular solution rollout, namely, what we saw here was a lack of trust in these apps, in that not everyone felt comfortable downloading them. And for those who did feel comfortable, it wasn’t effective enough, in terms of enough people using the app so that the app could function appropriately, or the way that it was intended. So, although we saw these solutions here, it wasn’t necessarily a popular solution, or rather a solution that was embraced largely by the public.
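For readers unfamiliar with how that API works, here is a minimal sketch of the general decentralised exposure-notification idea it implements: phones broadcast short-lived random identifiers over Bluetooth, remember the identifiers they hear, and match locally against keys that diagnosed users consent to publish. The names and parameters below are illustrative assumptions, not the real Google/Apple protocol.

```python
# Hedged sketch of decentralised exposure notification (not the real protocol).
import hashlib
import hmac
import os

def ephemeral_ids(daily_key: bytes, intervals: int = 144) -> list[bytes]:
    """Derive rotating broadcast identifiers from a daily key (e.g. one per 10 minutes)."""
    return [hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
            for i in range(intervals)]

class Phone:
    def __init__(self):
        self.daily_key = os.urandom(16)   # stays on the device unless the user consents
        self.broadcast = ephemeral_ids(self.daily_key)
        self.heard: set[bytes] = set()    # identifiers observed from nearby phones

    def hear(self, eph_id: bytes):
        self.heard.add(eph_id)

    def exposure_check(self, published_keys: list[bytes]) -> bool:
        """Matching happens on-device: re-derive IDs from keys diagnosed users published."""
        return any(self.heard & set(ephemeral_ids(k)) for k in published_keys)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast[42])                 # the two phones were in proximity
# Alice tests positive and consents to publish her daily key:
print(bob.exposure_check([alice.daily_key]))  # True – matched locally
```

The privacy property being traded off here is that the matching happens on the handset, so no central server ever learns who met whom.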
But that’s really what we’ve seen here, and today we’re seeing vaccine passports emerge as well. A lot of people here, and from what I’ve seen abroad as well, have questions about the use of these types of passports and how they might differ from typical initiatives where an individual must present their vaccine records for access to things like public education, whereas these apps now concern access to day-to-day services and activities. So, there are a lot of ethical and equity and trust issues and questions coming up here.
Esther Naylor
Thanks for that, Rachele, and we’ll catch up on, you know, contact tracing apps and vaccine passports, and I think that you’ve, kind of, foreshadowed some of the issues that we will be talking about, these ethical and privacy concerns. Malavika, can I turn to you, in terms of, you know, any innovative developments that you’ve seen?
Malavika Jayaram
Hi, thanks for having me. I want to, sort of, preface it by first saying that when we think of tech solutionism, we tend to think of the big tech platforms, the big providers, the GAFAs, but vaccines are technology too, and we don’t tend to think of them in quite the same breath. So, for me, I think one of the few positive things to come out of the pandemic is the international collaboration to develop vaccines in the first place, the relaxing of certain kinds of intellectual property norms, the sharing of knowledge. I think that’s incredible, the speed and skill with which solutions have been found.
I think another really great trend is the move towards decentralisation and privacy preserving technology, and not, you know – I’ve always said that the road to privacy hell is paved with good intentions. I think no more so than in the context of COVID, but I think to the extent that you have new protocols like the DP3T efforts, and you have other kinds of efforts to really bring in privacy preserving tools into the mainstream and have them adopted not just from the fringe, but into big tech platforms, I think that’s great.
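As a rough illustration of the DP3T idea mentioned here, hedged as a simplification of the published design rather than the full specification: each day’s secret key is derived by hashing the previous day’s, so a diagnosed user can reveal the key for the start of their contagious window and others can roll it forward, while earlier days remain unrecoverable.

```python
# Simplified sketch of DP3T-style daily key rotation (not the full spec).
import hashlib

def next_day_key(sk: bytes) -> bytes:
    """Day t's secret key is simply the SHA-256 hash of day t-1's key."""
    return hashlib.sha256(sk).digest()

sk = b"\x00" * 32                # day 0 secret key (random in practice)
chain = [sk]
for _ in range(13):              # two weeks of daily keys
    sk = next_day_key(sk)
    chain.append(sk)

# On diagnosis, the user uploads only chain[7] (start of the contagious window);
# anyone can derive chain[8..13] forward, but cannot recover chain[0..6].
rolled = chain[7]
for day in range(8, 14):
    rolled = next_day_key(rolled)
    assert rolled == chain[day]
print("forward derivation matches; earlier days stay hidden")
```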
And I think, for me, the other really good thing, even though it isn’t a solved issue, the fact that it’s exposing invisible marginalised populations and exposing structural inequality, even though we haven’t fixed it. I think the fact that entire populations that have remained under the radar, invisible, like migrants in Singapore or people who are undocumented, the fact that the pandemic is surfacing the need to take care of them, even if it’s from a cynical point of view that to take care of us we need to take care of them. I think that conversation is sorely needed and very, very past its due date, yeah.
Esther Naylor
Thank you, and I hope that we will revisit that later on in the discussion, that question of, you know, what do we mean by tech solutionism and which tech solutions are we going to favour and what are the, kind of, you know, unintended consequences of those?
So, turning to you now, Carly, and I’d like to hear from you on any, kind of, you know, innovative developments that are coming out of the pandemic.
Carly Kind
I would just add to what’s been said, building in particular on Emily’s point. I think technology has been most successful during the pandemic when it has built on existing structures, existing communities and processes, and where it has really been the answer to a specific problem, rather than coming ahead of the problem itself, you know, and that is the idea of tech solutionism.
I’ll give two examples. One, I think, is the use of WhatsApp to support mutual assistance groups. That was really an organic thing that grew, certainly in the UK and I imagine elsewhere, out of the need for local communities to bond together and provide support, and WhatsApp has been incredibly helpful in facilitating that community intervention.
The second is around the use of some predictive analytics to identify vulnerable populations here in the UK. We conducted a study, over the past year, of a London local authority using a predictive analytics system, and one of the ways in which it was incredibly beneficial was around COVID and the way in which it was able to surface vulnerable people and populations that weren’t otherwise visible to the state. But there you had an existing piece of technology – an immense amount of work had already gone into integrating it into the local authority, and they were already using it for other purposes – and they were able to repurpose it in the context of the crisis to some benefit. And I think that, you know, it really benefited from being embedded in an existing process and existing local knowledge as well. So, those are two examples I’d give of positive uses of tech.
Esther Naylor
Thank you, those are all very interesting developments, and I think that we’ll catch up on this, kind of, theme that’s emerged from your remarks, on technology as a solution that enhances existing infrastructures, but also on where technology is shedding light on inequalities.
Now, as I mentioned earlier, this event is drawing from a research paper that looks at trends in technology throughout the pandemic, and that has, you know, focused on transformations in society and in governance. Our colleague Emily Taylor was an author, and Emily is an Associate Fellow with the International Security Programme and also the CEO of Oxford Information Labs. Your chapter, Emily, focused on the UK’s experience with the COVID-19 app, and you touched quite profoundly on the dynamics between big tech and government. I’d like to hear from you on why this experience was so controversial, and how the issues that you raised in your paper are relevant today. We almost don’t hear about the contact tracing apps themselves; it seems that we’re now, kind of, focusing on new developments. So it would be great to hear from you on this.
Emily Taylor
Thank you very much, Esther, and yes, I mean, trying to figure out what’s happened with the app, I revisited it a couple of weeks ago, just to follow up and see what had happened, and almost nobody knew anything. And it wasn’t just, you know, friends and family, it was people who were public health experts, central to the fight against COVID, or people working on the technical side. And I think it’s quite a reflection that something that was going to be world-beating, in the government’s announcement just a year ago, seems to have plummeted and disappeared without trace. But I don’t think that that’s completely a UK experience. As Rachele said in her opening remarks, there’s been, you know, generally across the board a, sort of, underwhelming feeling about the effectiveness of the contact tracing apps.
And I came to the subject having, you know, cared very deeply about personal privacy and the, sort of, the evolution of the, sort of, tech economies that we’ve seen, which are really based on some pretty extensive privacy intrusions. So, you know, I come from that, sort of, background. But what really struck me and made me feel uncomfortable, as I talked to people in preparation for the paper, particularly those in public health, was that they were adamant that the only real way to get an effective app was, apart from a, sort of, very high uptake, which Rachele referred to, you also need centralised data storage, and it’s only going to be that that gives you the whole national view of the spread of the epidemic, allows you to identify super spreaders and, sort of, make the – you know, look at the big patterns.
And actually, looking at the original design of the UK app, it was, you know, fairly privacy respecting. In the best sort of way of privacy by design, it showed a lot of restraint in not collecting data. And so it seemed like the government was working with health experts, with tech people, working also with the Information Commissioner here, and, sort of, doing the right things.
But why did it all go wrong? Essentially, Google and Apple imposed their own policy solution on not just the UK Government but other elected governments. And while we can all have a really interesting argument about the merits or demerits of centralised or decentralised data collection, ultimately, you’ve got a democratically elected government being forced to adopt a policy solution that is essentially determined by two unelected companies that, between them, represent 99.75% of the global market for mobile operating systems, and with that, the app stores. So, it’s like, “Well, good luck distributing your app without us.” And so you see the companies deploying really awesome lobbying power and going in, playing hardball with Politicians, often just before the launch of these apps, and essentially getting their own way. And so, I think that this throws up some very uncomfortable questions about the relative power dynamics and the fact that we don’t really have an internationally applicable set of rules to govern, you know, the way that we handle data internationally. Thank you.
Esther Naylor
Thanks, Emily, and I think that your last point there has really highlighted the crux of one of these issues: the lack of international norms and standards. It’s all very well, you know, Google and Apple offering a solution to governments in a time of crisis, where often governments don’t have the capacity to create an app that’s downloaded by millions and millions of people, that works perfectly and that has the, kind of, technological capabilities. And so, you’ve touched on this experience, and now I’m going to turn to Carly Kind, who’s the Director of the Ada Lovelace Institute. Before I do, my colleagues from the Members’ Team are going to put a poll on your screen about whether or not you would use a vaccine passport app. So, please take the time to answer that question, and we’ll turn to the results before the Q&A starts.
Carly, the Ada Lovelace Institute is a research institute and a deliberative body, with a remit to ensure data and AI work for people and society. You’ve been working on this issue, and I would be very interested to hear: is this something to be immediately dismissed, this, you know, potential app to help open up society? Or is this something where we can learn lessons from other solutions?
Carly Kind
Thanks. Sorry, thanks, Esther. So, on vaccine passports, I would start by separating out, for the purpose of this conversation, whether or not the system is digital versus the underlying objective, to establish a system of permissions based on vaccine status. And I separate out the technology from the underlying system not only because I think that is the better way to develop policy generally, but also because, to our knowledge and in the conversations we’re having with government here and elsewhere, government is also taking the same approach. That is, unlike with the contact tracing app, this is not an app-led approach or an app-led policy, I would say. This is a policy which is asking, “What is needed in terms of infrastructure generally?” and then, “To what extent should that infrastructure be digital?” and the related questions with that.
And so, I think there are, kind of, two sets of moral, legal and practical challenges. If you begin just with the question of vaccine passports, digital or non-digital, there’s a range of different considerations to be had. We outlined them in numerous reports we’ve put out so far, but to briefly canvass them here: there’s essentially the question of whether it’s defensible, from a public health perspective, to attenuate access to places and travel on the basis of vaccine status. We don’t know for sure at this stage the extent to which the vaccine affects transmission of the virus, although of course we know it does affect transmission to some extent. As that evidence emerges, the public health case gets stronger for some type of privileges based on vaccine status, but at the moment, you know, there are differing views as to the extent to which we can rely on vaccine status as a useful way to distinguish who is transmitting the virus and who isn’t, and of course that’s complicated by new variants, mutations, etc.
And the second question then becomes, you know, how do you set up a system of social filtering and structure based on one’s access to a vaccine, which, in this country and in most countries at the moment, has not yet been available to everybody? Is it ethical, is it permissible, to give people certain privileges off the back of something they have no control over, their access to a vaccine? Unless and until the vaccine becomes widely available to everyone, there will be questions there around fairness, discrimination and ethics, I would argue.
Then, I think there’s a question of, kind of, misuse and abuse. How might such a system be used to marginalise certain groups? How might it be used to exclude certain populations from employment, for example? You know, at the moment there is a very real concern that young people are the last in line for the vaccine, and yet they’re being asked to work in the venues that would potentially be controlled by access to a vaccine passport. So why should young people put themselves at risk of infection to staff the same venues that they can’t attend, because they don’t yet have access to a vaccine? So, there’s a range of really complex social questions. And all of those arise before you even get to the question of privacy and surveillance, which of course is real as well in any, kind of, data-driven infrastructure, and in particular here, where we’re talking about one which is essentially akin to an identity system.
So, you have to get past all of those various questions before you then ask, “And should it be an app?” And of course, there are, you know, inclinations towards digitising such a system for convenience purposes, but I think all governments that we’ve spoken to so far are aware that you would need a paper-based system to go alongside an app-based system, at least because between 10 and 20% of people are excluded from digital technologies to begin with. And then, I think, you have a range of other very real concerns about the challenges of establishing, essentially, digital identity infrastructure: how you do maintenance around that, how you standardise it, how you make apps interoperable and how you ensure that privacy and security are respected.
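One way to picture the credentialing layer being described here, offered purely as a sketch under assumed design choices rather than any official scheme: an issuing health authority signs a minimal credential, and a venue or border checkpoint verifies the signature offline against the issuer’s public key. The field names and the choice of Ed25519 are illustrative; the example uses the third-party Python ‘cryptography’ package.

```python
# Hedged sketch of an offline-verifiable vaccine credential (illustrative only).
# Requires: pip install cryptography
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()   # held by the health authority
issuer_public = issuer_key.public_key()     # distributed to verifiers in advance

credential = json.dumps({
    "subject": "holder-pseudonym-1234",     # ideally a pseudonym, not a stable identity
    "vaccine": "example-vaccine",
    "dose": 2,
    "valid_until": "2021-12-31",
}, sort_keys=True).encode()

signature = issuer_key.sign(credential)

# A venue or border checkpoint verifies offline, without calling a server:
try:
    issuer_public.verify(signature, credential)
    print("credential accepted")
except InvalidSignature:
    print("credential rejected")
```

Note that a signature of this kind only proves issuance, not identity binding or revocation; those are exactly the maintenance and interoperability questions raised above.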
The only other kind of distinction I would make, which you didn’t exactly make in your poll, although perhaps you were trying to get at it, is that there is a real difference, I think, between vaccine passports for international travel versus domestic vaccine passports for access to leisure venues or to workplaces. Each of those three, kind of, buckets raises very different issues. Employment, importantly, raises a lot of questions around employment discrimination. International travel, on the other hand – you can imagine there’s a lower threshold, in terms of concerns, for applying restrictions in that space, given that we already accept a, kind of, degree of rights infringement at borders and in travelling, and it’s also a kind of privilege rather than a necessity to be travelling internationally, although I’m sure we could have debates about that as well.
I would say, my view is that an international system of some sort is most likely to emerge, and that it will be digitised, but that the challenge of interoperability and recognising credentials from other countries will be a serious one, and I would anticipate that we won’t have such a system ready for this pandemic. I think now will be the time when we begin to build an international vaccine certification system for the next pandemic, but I can’t imagine such an impressive piece of digital infrastructure being in place to deal with the next year or two. You can see I have a lot to say about this issue, so I’ll stop there, but I’m happy to answer any questions.
Esther Naylor
Thanks, Carly, and I think that what’s really emerging is the difference between, I guess, contact tracing and a vaccine passport – both are, you know, public health phenomena. Contact tracing was a well-established practice before an app came along, and it seems like there is now more concerted thinking about the issue at hand first, and then about how it could potentially be digitised to make it a bit more effective. I liked your separation of the, kind of, different issues, and there are lots of ethical issues in there in and of themselves, whether it’s access to the vaccine or the epidemiological questions around the immune status that the vaccine confers. And I think that sometimes we do rush – especially, policymakers do want to have a solution without getting all the necessary stakeholders in the room.
So, on that note, I would like to turn to Rachele, who is a Health Scientist and Health Policy Counsel at the Future of Privacy Forum. Her work involves using mixed methods and research to explore and address ethical, legal and social issues and implementation barriers at the forefront of health policy and 21st Century Cures Act innovation. And so, Rachele, we’ve heard a lot about these, kind of, complex social and ethical and legal issues, which arise even before you get to tech solutionism. But I was wondering if you could tell us a bit about some of the best practices that we can borrow from the health field when we’re thinking about developing tech solutions. Concepts such as privacy and confidentiality have been in the health sector far longer than, you know, technology has. So how do you square that need for continuous data sharing with doing it in a way that doesn’t, kind of, further entrench these inequalities?
Dr Rachele Hendricks-Sturrup
Sure, thank you. So, you know, building on what Emily and Carly just mentioned, there are a lot of things to consider. Obviously, without careful attention to important social risk factors, such as the risk of discrimination and the risk of implementing passport programmes differently amongst groups, in ways that could be bias-driven or otherwise, the erosion of equity, fairness, privacy and personal freedoms is likely to occur. And we can’t necessarily assume that once these tech solutions arrive at the table, or even become broadly available, they will be implemented equally and equitably, and that’s especially true in countries where we’ve seen various racial tensions, like the United States, for example, both historically and today. So, that’s something that we certainly have to think about when assuming, or attempting to understand, whether or not tech solutions will in fact be the solution they’re purported to be.
Also, if these solutions are implemented to help address COVID today, they could certainly create infrastructure that perpetuates human error or bias at a large and potentially irreversible scale in the future, which will go even further in terms of exposing social discrimination issues, and that will obviously further drive social inequities during and after COVID-19. One thing that’s really important, at least in my opinion and that of several others, to consider is that when we create these infrastructures today, we tend to be short-sighted in terms of how they might affect others in the future, our future generations, especially as these generations become increasingly diverse and as travel opens up more broadly. Obviously, a lot of us hope to resume international travel, where vaccine records are not uncommon, but the way in which vaccination information is presented will be different following this pandemic.
So, there are programmes that are more analogue or manual that have worked in the past around vaccine passports. But with them going digital, and for those arguing for a more international infrastructure around these passport regimes, we have to, again, think about how their implementation might affect the rights and privacy of individuals and certain groups that are at high risk of discrimination and bias and so forth. And then, also, although privacy and ethical standards and principles regarding the implementation of these tech solutions could address COVID-19 today, these privacy laws, regulations and standards are all still under development.
Here in the United States, for example, we’re seeing these laws emerge in a very piecemeal fashion. So, again, we have to recognise that as these tech solutions roll out, we’re implementing the solution before we’ve had a chance to address some of the legal issues and some of the ethical issues first. There’s a lot of distrust, in general, of the companies. In terms of privacy, there’s a lot of distrust in the companies that are developing or co-developing these solutions, and yet we are allowing them to come to the table to create these broad-stroke solutions that will affect everyone internationally, and we’re relying on them to do this before we’ve even had a chance to reconcile the concerns that we’ve had around privacy and the erosion of democracy and fairness. So, all of that being said, we have to understand the extent to which harm can occur after potentially prematurely implementing these solutions, before we’ve had a chance to adjust our legal infrastructure and programmes, and before we’ve had a chance to fully address some of the pre-existing structural issues that existed prior to COVID and that really and truly underlie a lot of the social inequities we’re seeing today in the aftermath of the onset of COVID-19.
Esther Naylor
Thank you, Rachele, and just before we turn to Malavika, I think it’s important to remember that a lot of these social inequities existed pre-pandemic; it’s just that the pandemic has put a magnifying glass and a spotlight on them. And while we are developing these solutions, be it, you know, data collection or a contact tracing app, there have been huge criticisms that there are no time limitations for the collection of this data or for the surveillance. And so, I would actually like to turn to Malavika and hear, you know, the perspective from where you sit in Singapore: how has tech solutionism disproportionately affected marginalised populations in the context of the Global South?
Malavika Jayaram
Thanks, Esther. I live in Singapore, but I think Singapore isn’t the only country I’d like to talk about. Singapore was interesting because COVID revealed two Singapores, in a way. You had the elite, privileged, mobile, entitled part of Singapore, that functioned like nothing had changed in their lives, but you had entire dormitories of migrant workers, labourers from Bangladesh, from India, from different countries in the region, who were living in such cramped quarters that the spread of COVID was accelerated just by them sharing a restroom between, you know, 20 or 30 people, eating in the same room, living in really tiny dorms with bunkbeds.
So, I think until that happened, and those were, sort of, the predominant clusters, people didn’t even realise there were these dormitories in Singapore. It was like, “Oh, which neighbourhood is that?” and, “I never drive by, so it must not exist.” Right, so there’s that, kind of, redlining that becomes more apparent. But, I mean, the wonderful thing is people, sort of, came out in huge numbers to actually contribute and donate towards people who were still trying to send money home. Singapore tried to provide benefits to workers, so that they could still send wages, even though they weren’t earning. So, you know, there were these, sort of, positive side effects as well.
But I think in the rest of Asia, the pandemic ended up being a little bit of a tipping point along the trajectory of surveillance infrastructure that had been building and corralling more and more users over time. And I think this gave the, sort of, final excuse – you know, previously, it was national security that was, sort of, the catchall, you know, magic wand that allowed you to be privacy invasive, and here was something which said, “Well, it’s a public health crisis. Surely, you don’t care about your individual privacy against this collective public good that can emanate from sharing data?” And for all kinds of institutions that were trying to link different forms of ID, that were trying to digitise people as part of this utopian ideal of digital transformation, of financial inclusion, social inclusion, I think this was, kind of, the last sort of nail in the human rights coffin.
It was, sort of – you can argue perhaps about other things, but it’s really hard to argue against health being a legitimate need to curb human rights. But it had effects on everything, including protests and dissent. I mean, Hong Kong protests were stopped practically overnight, once you had the pandemic raging through. You had a weird situation for a while where you could be in trouble if you wore a mask, because you were supposed to have your face capable of being identified, but you were also in trouble if you didn’t wear a mask, under a different, sort of, authority and under the health perspective. So, I think it’s been this, kind of, slippery slope towards creeping surveillance, and the health excuse was, sort of, a tipping point to enable these kinds of infrastructures, which previously had been under legal challenge. I think a lot of the challenge got muted; a lot of the protest died down.
But I think there was this other thing, which is really blatant, which is that we’ve had trickles of arguments that Asians aren’t private, right? Chinese aren’t private, Indians aren’t private, we ask people their salaries, we ask them nosy questions, if they’re married and have kids. You know, why they are gay, and do they think they could go for therapy and will they change? We think it’s perfectly legitimate to ask incredibly creepy questions, but I think this was yet another example of saying, “Well, we have communitarian values, we have Confucian values, we don’t function at the level of the individual, that’s a Western Enlightenment construct, we’re all about communities and villages. Therefore, we’re happy to give up our rights, we think it’s part of our duty, and that’s as much of a duty as any, kind of, individual conception of human rights.” So I think these kinds of communitarian arguments have become even stronger during COVID.
And I think the final thing I’ll say is that a lot of these countries have very few resources to deal with a crisis of this magnitude. They have – we have some of the largest populations in the world, yet some of the smallest resources. And so, it seems a, sort of, recipe for disaster when governments then reach out willingly to the private sector to say, “You have to help us,” whether it’s partnering on solutions, whether it’s government saying, “We will share all of the data of our citizens with you to help train machine learning algorithms, in exchange for you running these platforms, in exchange for you running the backend of our infrastructure.” So, I think that sort of public/private partnership has got a [inaudible – 40:12] and we’ve already seen a huge push towards artificial intelligence and automation, very much in the, “Here is the solution, what’s the problem?” kind of approach.
And I think this is exactly the kind of crisis where something like AI and automation, which by definition is contactless and has very little human, sort of, fingerprint on it, is perfect – a crisis where you’re trying to say not having humans do things is fantastic, outsourcing everything to a platform or to technology is wonderful. And I think we’re seeing a lot of that, against public sentiment that this is a slippery slope. And in Singapore, there were all kinds of promises about TraceTogether never being used for anything other than public health needs, but we’ve already seen, even before the crisis has ended, let alone in a post-pandemic world, that they’ve said, “We’ll use it to solve crimes.” And the outcry has not been around scope creep; the outcry has been about the government lying to people about its use. People are saying, “We’d be fine with it if you just told us upfront. We don’t think it’s creepy, it’s fine. We just wish you’d said so.” So, those are the trends that I’m seeing in the region, and I really worry about how much worse it will be following the pandemic, when none of these infrastructures will be dismantled and none of the data will be deleted.
Esther Naylor
Thank you, Malavika, and you touched on some very poignant points there, in terms of, you know, cultural norms around privacy and ethical debates. We are talking about tech solutionism, and we’re broadcasting from Chatham House, which is based in London, but there are lots of different perspectives that need to be taken into account, especially when, you know, the UN estimates that the majority of the world isn’t online and doesn’t have access to these solutions. And I think that’s something really important: when we’re developing these solutions, who are we developing them for? Who has the power?
And so, just before we turn to the question-and-answer session, I would like to remind everybody: please feel free to put your questions in the ‘Chat’. You may ask those questions live, and if you wish for your question to be read out by myself, please indicate that when you’re posing it. So, we had a poll before Carly spoke about vaccine passports, and the question was, “If required for access to a venue, workplace, travel destination, would you use a digital vaccine passport app?” – 92% said yes, and 8% said no. And I think something that’s clear, especially from your remarks, Carly, is that this is not a black and white issue, in the sense that there is a lot more nuance needed around whether or not people will download these apps, and some people won’t necessarily have a choice – they may be required by their employer. So I would like to use my position as Chair to ask one or two quick questions, and then I’ll turn to the audience Q&A.
These kinds of solutions are here to stay, and there is a debate about regulation and about how we manage this relationship between big tech companies and government. In the US, we see, you know, Senate hearings, and in the EU we see increasing desire for regulation. And I just wondered if any of our panellists would like to comment on who isn’t being included in this conversation? We’ve heard earlier that technology isn’t just the big tech companies. Does anybody want to jump in here? Emily, would you like to jump in?
Emily Taylor
Thank you very much for the question, Esther, and I think that this also brings in some of the issues highlighted by Bella Wilkinson’s question as well. In an emergency like the global pandemic, there’s a need to act quickly, and we’ve seen, with our own national government, and I’m sure it’s been repeated elsewhere, lots of examples of failing, as well as some examples of getting it right. And, you know, I think that in an ideal world, you would, as a thoughtful policymaker, consult with and listen to all of the affected communities. But as the other speakers on the panel have highlighted, one of the things that has emerged in the pandemic is how visible it has made societal inequality and marginalised communities. And so, you know, one would hope that that lesson will stick, and that it will provoke and motivate policymakers to really use technology to reach out – or actually use people to reach out – to those communities and make sure that their interests are taken into account.
I say this in the context of needing to act quickly in an emergency, and necessarily, I think, policy interventions, all interventions, have been compromised. But I think that the big human rights risks occur after the immediate emergency has, sort of, gone away, and that’s where you need to make sure that you are sunsetting emergency powers and not keeping them going for too long. So, I’ll leave it at that; I’m sure the other panellists will have a lot to say on that.
Esther Naylor
Thanks, Emily, and I think you mentioned Bella Wilkinson. Bella, would you like to ask your question? Would you like to unmute and ask your question?
Bella Wilkinson
Hi, is my audio working okay?
Esther Naylor
Yes, it is, yes.
Bella Wilkinson
Great, thank you so much, everyone, for your incredibly insightful comments. I would like to pose this question equally to all panellists; it would be great to, kind of, get local and national perspectives on this. As I put in the ‘Chat’, I wanted to ask: how should national and/or international policymakers provide more, and more meaningful, space and platforms for historically marginalised communities to, kind of, co-create these tech-based solutions to COVID-19, and more specifically, to address the lived and long-term consequences for their communities in particular? Thank you so much in advance.
Esther Naylor
Thank you for that question, Bella. And before the panellists jump into answering it, I’d also like to read out a question on the role of human rights organisations. My colleague, Joyce Hakmeh, has asked: “Human rights organisations have raised serious concerns about neglected human rights crises around the world that have the potential to undermine already precarious global security, as governments continue to use COVID as a cover to push an authoritarian agenda. How much are tech solutions contributing to this deteriorating situation? And do the speakers think that the consequences emanating from it are irreversible?” So, do I have any takers to answer either of those questions first?
Dr Rachele Hendricks-Sturrup
Sure, I can – well, I think that question and the one before it are both a bit multifaceted. So, as I also mentioned, we still need to figure out how we can resolve trust issues and privacy issues with the companies creating this technology. In addition to that, I think it’s also important to acknowledge the privacy standards or privacy policies that they have created around this technology, for example, the use of a decentralised tech platform rather than something centralised, and I believe the companies are not budging on changing that. Decentralisation of this information is privacy by design, so to speak. So, acknowledging the privacy-by-design aspects of the technology is equally as important as acknowledging some of the issues around the erosion of fairness and equity and privacy and so forth.
So, I think it’s important to bring all the facts to the table and really ask ourselves what it is that we prioritise: privacy through decentralisation, or centralising the information for various purposes, like public health research and public health surveillance, in the event that a certain population would prioritise that over individual privacy rights. So, there’s still some reconciliation that needs to be had at that level, across different countries, including the US and the UK, Singapore and others – and, all of that being said, an understanding of how cultural norms and expectations play a role.
As Malavika mentioned, in Singapore and other Asian communities there’s a sense of togetherness that is prioritised over the sense of individuality that we see more as a Western construct. So, I think we have to take that and understand that there’s a story behind that culture, a reason behind that culture. And in countries that are more westernised, like the United States, we have seen an unequal distribution of power, to the point that it has resulted in violence and harm against certain groups. So, prioritising the individual is at times lifesaving.
And so, for that reason, it’s important to bring groups that have a certain interest, like, for example, LGBTQIA groups, who also share the notion of togetherness and creating safe spaces to come together, and therefore, they can’t always social distance or physically distance from one another. Bringing people like them to the table, cultural – or groups that share a culture that has been marginalised, groups that share a particular skin colour that have been marginalised or disenfranchised or even killed. So, bringing them to the table to ask them, “Okay, let’s talk about this technology. If anyone has built this technology, let’s talk about your experience with that and what you did to build this technology in a way that makes sense for you in your day-to-day and the day-to-day of those you care about, their safety as well as yours. Let’s talk about principles that we need to have and consider as we implement this technology.”
And at the Future of Privacy Forum, that is something that we do and something that we’ve done with regard to digital contact tracing. We do have principles that will be shared or disseminated soon, after having these conversations. But creating that safe space to have the conversations, creating financial support programmes to have the conversations, and also building the technology with an actual representation of stakeholders present – there’s a need to really invest in that, to make sure that, in addition to having the conversations with the right people, we’re building infrastructure that is by the people and for the people as well.
Esther Naylor
Yeah, thank you, and I think that the point that you’ve touched on there really drills down to this whole concept of trust, and trust in institutions. The reason why we’re talking about the, kind of, ethical and privacy concerns is that, as an individual user, you’re hoping that your tech solution is secure, so that it’s not vulnerable to a cyberattack, and you’re also hoping that the data is only shared with the relevant partners. And so, when we’re thinking about who needs to be involved in these questions of privacy by design, it’s a question of whose privacy we are preserving and how we are doing that.
I would like to give the panellists an opportunity to react to the questions that I posed earlier. Malavika, would you like to jump in? Or would anybody like to jump in on the question about human rights organisations and human rights crises around the world? Yeah?
Malavika Jayaram
I could say something very, very quickly, which is that funding sources are really tight, especially in the Global South. There’s a line of funding from companies to do work on silly things like fact-checking, which all the literature has shown really does very little. It’s like a tiny drop in the ocean against design changes that platforms could make to tackle things like misinformation. So, I think it’s really hard to get funding to interrogate tech platforms and solutions, when countries are really resistant to being called out for privacy violations. And there are so many new norms that actually prevent foreign funding of local organisations, and that see it as interference in local political affairs. So, I think being really creative in how to support these organisations is a really big priority.
Esther Naylor
Thank you, and with five minutes left, thank you everyone who has asked a question in the ‘Chat’. I wanted to pose a final question to all of our panellists. We know, from the discussion today, that some of these solutions have been very useful in the pandemic, and we know the things that they’ve brought out, especially in terms of inequalities, power structures and the equitable use of these technology solutions. We know that most of these solutions are here to stay, whether it’s, you know, a digital vaccine passport or the use of facial recognition technology, as is increasingly the case. So I would like the panellists to reflect: do you have a single piece of advice, in maybe a sentence or so, that you could give to governments or developers of these technology solutions? What’s the, kind of, key message that you would give to the developers? So, Carly, can I start with you, if that’s okay?
Carly Kind
Sure, I definitely can’t do it in one sentence, but you’ve been warned. I suppose there’s a really good maxim in the disability and access community, which is “nothing about us without us”, and I think that should really apply to the use of technology for public policy purposes. You know, building on the answer that Rachele gave before, I think having the broadest representation – not only in terms of demographic diversity, but in terms of disciplinary diversity – in the room where decisions about technology policy are made is so important.
One of the things we saw around the crisis was this, you know, quite trite saying, “Oh, we’re following the science, we’re following the science.” And certainly, for the first six months, the pandemic response in the UK was very driven by public health people and, kind of, technical design people, Computer Scientists, people who were looking at the problem purely from that angle. And over time, what we’ve seen is that it’s very important to understand social science, sociology, behavioural science, and the ways in which people integrate with technology, to understand if tech is going to work.
And I’ve already spoken too long, but just one quick example. You know, one of the things we were saying around the contact tracing app last April, in our conversations with those developing it, was: we don’t know that people will listen to a notification that tells them to stay home and self-isolate. We don’t yet know how people react when they’re told by an app to do something. And in particular, if there are financial consequences to self-isolation, there is a high likelihood that people will neglect an app notification that tells them to self-isolate. And it took a great many months for the app developers to build payments for self-isolation into the app, to incentivise people to self-isolate. And so that, kind of, behavioural integration, social integration of technology, is so important to think through from the start, because you might end up undermining the entire purpose of the tech to begin with.
Esther Naylor
Thank you, and Emily, over to you. What one piece of advice would you either give a government or even a tech company in developing these solutions?
Emily Taylor
Well, I think the one piece of advice I’d like to give, building on what Carly’s saying, is really think through the perverse consequences of what you’re doing. I think, you know, your poll shows that there’s a very high level of tolerance for some quite severe measures in response to an emergency. But what happens when your measures, like a vaccine passport or whatever, create a financial incentive to lie and fake the results? Then you’ve got the worst-case scenario, which is a sense of security with no security at all, and a system that is really leaky and has holes and is also privacy infringing for the majority. So, actually, I think the pandemic has caused me to reflect on and question a lot of the things that I thought were the most important in life, and I’ve been really drawn to the differences between communitarianism and individualism that Malavika mentioned in her remarks. So, I could go on, but I’ll leave it there. Thank you.
Esther Naylor
Thank you, and Rachele, over to you.
Dr Rachele Hendricks-Sturrup
Yeah, a big question. So, several answers, although I’ll try to compress them. From an implementation standpoint, I think we have to understand the technical limitations – no system is perfect – and, to the extent those imperfections manifest themselves, who is or will be at risk of harm as a result of those imperfections in the system, due to flawed technical capabilities or flawed decision-making behind the implementation of those technical programmes or aspects. We also need to think about bias around implementation, and bias in the interpretation of the information that’s being received and processed by the individuals, or even entities, that are manning these systems on a day-to-day basis, from an operational standpoint. Thinking about all of that is extremely important, and I would urge policymakers to be intentional about ensuring that the benefits can be maximised and that the risks don’t go as far as a person or a population being significantly harmed by the implementation of whatever technical solution is coming about during and after COVID-19. As we have seen, governmental entities and others can quickly change their minds about how they want to collect or use data, so we need to have safeguards in place to ensure that we’re intentional about protecting populations, as well as individuals, from any harm that might ensue from those decisions.
Esther Naylor
Thank you, and Malavika, a final word, over to you.
Malavika Jayaram
Yeah, I think I would really urge privacy by design and by default, because I think cascading the choice and consent down to unsuspecting individuals, who have different levels of understanding and literacy, will mean that it’s a false choice: under duress, they’re going to pick something that isn’t in their best interests. And the other thing I would say is not to link it to identification and the biometrification and gamification of a lot of other surveillance infrastructures, and to actually treat people as people – to give them a vaccine because they’re human and not because they have a particular number associated with a particular system. This is an excuse for allegedly making people visible, but that visibility comes at great risk to people who are protected by being under the radar, and piercing that, sort of, veil has very disproportionate consequences for marginalised populations – something to keep in mind when we seemingly include people.
Esther Naylor
Thank you, Malavika. And I’m going to conclude this event here with the note that perhaps some of the things our panellists have discussed could be addressed at an international or a regional level, within the development of international norms and standards, especially with regard to data and privacy. And I think the point is about trying to reduce the erosion of trust between individuals, tech companies and governments, and about making sure that where we are using tech solutions, they are for the benefit of everybody rather than further entrenching inequalities.
I’d like to say a special thank you to our panellists and our Chatham House members, as well as DXC Technology, who is the funder for part of our Trends in Technology work, and we’ll be continuing the conversation on Trends in Technology. So, thank you everybody, and I hope that you have a good rest of today. Thank you.