This paper incorporates a case study of the UK’s COVID-19 track-and-trace app, and what its history reveals about the power dynamics between big tech and elected governments. The app’s story reflects trends in both the tech market and in public health in the UK: the consolidation of the mobile operating system and app store markets; the centralization of the UK’s public health provision and the impact of successive budget cuts in the decade since 2010; and the pitfalls of what has been termed ‘tech-solutionism’ in the face of complex public health and policy problems.
Whatever the merits of the competing design architectures of the first version of the UK’s contact-tracing app and the Google–Apple model that was to replace it, the failure of the UK’s first app was due to the imposition of a policy decision on a democratically elected government by two unelected, unaccountable tech companies, raising important questions about the legitimacy of the resulting policy. In essence, Apple and Google withheld access to essential technologies until the UK agreed to align its data storage model with that advocated by the tech companies.
This paper also considers the resources dedicated to the app in the context of the UK’s wider public health response. Was investment of £11.8m in the first app’s development worthwhile, or was policy ‘led by technology, rather than the other way around’?
Big tech and public health before COVID-19
Normalization of surveillance, market concentration and political influence
‘Google knows more about you and me than the KGB, Stasi or Gestapo ever dreamed of.’ So said the German business daily Handelsblatt about Google Street View in 2010, three years before Edward Snowden revealed the extent of big tech’s data-processing activities. In the private sphere, the free-to-use platforms Google, Facebook, Twitter and, more recently, TikTok have normalized exploitative levels of data collection permitted in their terms of service, termed ‘surveillance capitalism’ by Shoshana Zuboff and ‘extractive industries’ by John Naughton.
After a short period of extreme openness and innovation, the online marketplace is now in the hands of a few ‘privately controlled industrial behemoths’. Consolidation is evident, both at the application level and within the deeper layers of the internet’s architecture, where the same familiar names – Google, Amazon, Facebook, Apple, Microsoft – provide critical infrastructure on which all other services depend.
The mobile telephony environment is even more tightly consolidated, with two operating systems accounting for 99.75 per cent of the global market: Google’s Android and Apple’s iOS, each with its own app store. Apple’s App Store is the only means for an app developer to distribute software on iOS devices; and while Google does permit other app stores on Android, Google Play is dominant. Apple’s conduct in relation to the App Store has raised antitrust concerns on both sides of the Atlantic, including allegations that it has denied third parties access to key technologies in order to gain competitive advantage.
Commercial success has brought political influence to match. In a sector initially shielded from regulation, later attempts by regulators to rein in the market power of big tech have had limited success, and may have had the perverse consequence of entrenching existing market power.
The EU’s flagship privacy regulation, the General Data Protection Regulation 2016/679 (GDPR), was ‘one of the most lobbied pieces of European legislation in European Union history’. While the GDPR has required enterprises to make substantial adjustments in the way they handle personal data, it has barely impacted the core business model of targeted advertising, enabled by the storage and processing of enormous data troves.
The competitive advantage to be gained from the algorithmic manipulation of big data has engendered a culture of secrecy. A lack of transparency on the part of tech companies makes their processing techniques difficult to assess, critique or regulate, whether by governments, academics, civil society or even by other parts of the tech industry. As a result, ‘dominant platforms exploit their gatekeeper power to dictate terms and extract concessions that no one would reasonably consent to in a competitive market’. A policy vacuum has been created by Western governments ‘declining to regulate or ducking contemporary challenges’.
Whatever else the past two decades have brought us, they have not delivered a blueprint for sound technological governance.
UK public health 2010–20: centralization, defunding, and marginalization of local expertise
Track and trace – done by humans – is a basic task of public health authorities, long deployed to mitigate outbreaks of infectious diseases. Contact tracing is a skilled job and requires local knowledge. Through interviews, a public health official can help people piece together their movements over a relevant period, jogging their memory while looking out for anomalies or ‘red flags’ (such as ‘… and then I went to visit my mother, who’s in a care home’). The contacts thus identified are then followed up by the team.
A local public health team ‘has deep knowledge of the characteristics of [their] patch that make its health inequalities so stark and its residents so vulnerable’.
A decade of austerity in the UK, from 2010, led to substantial cuts in public health provision. In England, for example, the elimination of the regional health authorities left most local public health teams having to coordinate directly with hundreds of local authorities, in the absence of the eight to ten much larger regional bodies that had previously mediated between central and local government on policy and service provision. A ‘huge disconnect’ thus developed between public health and different branches of government. Functions such as environmental health, community and neighbourhood teams, and youth services workers were lost, ‘the kind of staff […] used during 2009 swine flu to work closely with the NHS’. In place of local public health teams emerged a centralized provision, often outsourced to private companies.
COVID-19 trends
The outbreak of a new coronavirus, SARS-CoV-2 or the COVID-19 virus, thought to have originated in Wuhan, China, rapidly developed into a global pandemic during the first half of 2020. As the disease spread remorselessly throughout the world, several national governments announced that a track-and-trace app would form part of their public health response to the pandemic: these included Singapore, South Korea, Germany, Switzerland and Ireland. In brief, such apps use Bluetooth Low Energy technology to detect and identify the phones of other app users nearby, collecting data about each encounter (how close, and for how long). If an app user develops symptoms of COVID-19, the app notifies all those who have come into contact with that user during a predefined period. Individuals are then able to self-isolate or take other measures to safeguard their health.
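The core mechanism can be sketched in a few lines of code. The following is a minimal, illustrative sketch only: the identifiers and the exposure thresholds (2 metres, 15 minutes) are invented for the example, not any country’s actual parameters.

```python
# Minimal sketch of the contact-tracing flow described above.
# All names and thresholds are illustrative assumptions.
import secrets
from dataclasses import dataclass, field

@dataclass
class Encounter:
    other_id: str    # rotating identifier broadcast by the other phone
    minutes: float   # how long the two devices stayed in range
    metres: float    # estimated distance (from Bluetooth signal strength)

@dataclass
class Phone:
    broadcast_id: str = field(default_factory=lambda: secrets.token_hex(8))
    log: list = field(default_factory=list)

    def record(self, other: "Phone", minutes: float, metres: float) -> None:
        self.log.append(Encounter(other.broadcast_id, minutes, metres))

def at_risk(e: Encounter, max_metres: float = 2.0, min_minutes: float = 15.0) -> bool:
    # Illustrative exposure rule: close and sustained contact only.
    return e.metres <= max_metres and e.minutes >= min_minutes

def ids_to_notify(phone: Phone) -> set:
    # When this user reports symptoms, these identifiers are alerted.
    return {e.other_id for e in phone.log if at_risk(e)}

alice, bob, carol = Phone(), Phone(), Phone()
alice.record(bob, minutes=30, metres=1.5)    # risky encounter
alice.record(carol, minutes=2, metres=5.0)   # brief, distant: ignored
assert ids_to_notify(alice) == {bob.broadcast_id}
```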
The development of any health app raises considerations of human rights, technical and practical challenges, and cybersecurity issues.
Human rights: the three key threats
Data relating to an individual’s health is protected by Article 8 of the European Convention on Human Rights, and is a special category of personal data under the GDPR, attracting higher levels of protection than other data. The serious cross-border threat to public health of an ongoing pandemic is a justifiable ground for some limitations on individuals’ right to privacy, but such limitations may not remain justifiable once the current pandemic has been brought under control.
The law encourages the anonymization of data, and, appropriately handled, this can be a way of reducing the risks associated with data processing. Yet numerous studies have shown the ease with which data can be deanonymized.
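A classic illustration is the linkage attack, in which records stripped of names are re-identified by joining them with a public dataset on shared quasi-identifiers such as postcode, birth year and sex. The sketch below uses invented data and is purely illustrative:

```python
# Illustrative linkage attack: records stripped of names can still be
# re-identified by joining on quasi-identifiers. All data are invented.

anonymised_health_records = [
    {"postcode": "SW1A 1AA", "birth_year": 1958, "sex": "F", "diagnosis": "..."},
    {"postcode": "M1 2AB",   "birth_year": 1990, "sex": "M", "diagnosis": "..."},
]

public_register = [  # e.g. an electoral roll or a leaked customer list
    {"name": "Jane Doe",   "postcode": "SW1A 1AA", "birth_year": 1958, "sex": "F"},
    {"name": "John Smith", "postcode": "M1 2AB",   "birth_year": 1990, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def key(record: dict) -> tuple:
    return tuple(record[q] for q in QUASI_IDENTIFIERS)

names = {key(r): r["name"] for r in public_register}
for record in anonymised_health_records:
    match = names.get(key(record))
    if match:
        print(f"Re-identified {match}: {record['diagnosis']}")
```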
Human rights experts advise that a COVID app would raise three key risks of interference with Article 8:
- Centralized data collection. Should the data generated by the app’s use be stored centrally, in a single database, or should it be decentralized, with the majority of data processing and storage occurring at the level of the user’s handset? A centralized data collection system would require substantial safeguards to avoid potential abuse by the data controller (whether government or private sector). Technical and human rights experts favoured a decentralized model for apps: one in which the data is stored on the user’s device. This avoids the privacy risks of a centralized system, while delivering essential health information to individuals. (A simplified sketch contrasting the two data flows follows this list.)
- Mandatory app use. Matrix Chambers considered that a combination of mandatory use and a centralized design would produce a ‘wholly unprecedented level of granular data about the social network of the majority of the population’. Some employers were already reported to be insisting that staff use the app, a predictable development that would engage a state’s human rights obligations to individuals.
- Immunity passports. If the app were used to generate immunity passports ‘on the basis of […] location or immigration status, it might give rise to stigmatisation and indirect discrimination’. Discrimination on the basis of age or race could occur where mass statistical data fails to take adequate account of personal characteristics.
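The practical difference between the two architectures lies in where exposure matching happens and, therefore, what the central operator can see. The sketch below is a simplified illustration, with invented identifiers and no cryptography; it is not a description of any deployed system.

```python
# Contrast sketch: where exposure matching happens under each model.

# Centralized model: phones upload their encounter logs; the server
# holds the social graph and decides who to notify.
class CentralServer:
    def __init__(self):
        self.encounter_logs = {}  # user_id -> list of contact ids seen

    def upload_log(self, user_id: str, contacts: list) -> None:
        self.encounter_logs[user_id] = contacts

    def notify_on_diagnosis(self, infected_id: str) -> list:
        # Server-side matching: the operator can see everyone's contacts.
        return self.encounter_logs.get(infected_id, [])

# Decentralized model: the server only relays the identifiers of
# confirmed cases; each phone matches against its own local log.
class RelayServer:
    def __init__(self):
        self.infected_ids = set()

    def publish_diagnosis(self, rotating_ids: list) -> None:
        self.infected_ids.update(rotating_ids)

def phone_checks_itself(local_log: list, relay: RelayServer) -> bool:
    # On-device matching: the server never learns who met whom.
    return any(contact in relay.infected_ids for contact in local_log)

central = CentralServer()
central.upload_log("alice", ["bob", "carol"])
assert central.notify_on_diagnosis("alice") == ["bob", "carol"]

relay = RelayServer()
relay.publish_diagnosis(["id-123"])
assert phone_checks_itself(["id-123", "id-456"], relay)
```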
The public health advantages of centralized data storage
The primary function of a COVID app is to inform individuals about their potential exposure to the virus. Either a centralized or decentralized model would achieve this objective. In addition to the primary function, the app could potentially serve as a public health intervention to suppress the pandemic, offering health officials a view of the entire country’s level of infection, identifying virus ‘hotspots’ and enabling the swift mobilization of resources. According to one epidemiologist: ‘One of the advantages is that it’s easier to audit the system and adapt it more quickly as scientific evidence accumulates.’ To perform this function, the app would require centralized storage of data and would need to be downloaded by a substantial proportion of the population. In interviews conducted as part of the research for this paper, public health experts described a centralized model as prioritizing the collective good (control of the pandemic) over an individualistic/libertarian approach – a tension that is also apparent in other contexts such as the wearing of masks in public places.
Technical and practical challenges – getting the app to work
Any successful app would need to provide ‘proximity event logging’, detecting other devices running the app via Bluetooth at frequent enough intervals to measure the duration of encounters between people and at a near-enough range to capture encounters at risk of transmitting the virus, without draining a device’s battery. The app must work while a device is locked.
Working with Bluetooth creates technical challenges, particularly in detecting proximity within the 2–4 metre range. Bluetooth has a history of security breaches that have been comprehensively reported and studied. Security-conscious smartphone users are often advised to turn off Bluetooth when it is not needed.
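The underlying difficulty is that Bluetooth signal strength (RSSI) is only a rough proxy for distance. One common approach is the log-distance path-loss model, sketched below; the calibration constants are illustrative assumptions, and in practice they vary by handset, body position and environment, which is why a few decibels of noise can move an estimate across the epidemiologically significant 2-metre threshold.

```python
import math  # not strictly needed here, but typical for path-loss work

def estimate_distance_m(rssi_dbm: float,
                        measured_power_dbm: float = -59.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Log-distance path-loss estimate. measured_power_dbm is the
    expected RSSI at 1 metre; both parameters vary by device and
    environment, which is the root of the calibration problem."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# A few decibels of fluctuation moves the estimate across the 2 m
# threshold that matters epidemiologically:
for rssi in (-65, -69, -73):
    print(rssi, "dBm ->", round(estimate_distance_m(rssi), 1), "m")
# -65 dBm -> ~2.0 m, -69 dBm -> ~3.2 m, -73 dBm -> ~5.0 m
```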
To work, COVID-19 apps need users to keep Bluetooth running – particularly when they are in public places – which holds the potential to expose users to attack or surveillance. Cooperation with the two biggest mobile operating system platforms, Google’s Android and Apple’s iOS, was essential so that the app would be authorized for inclusion in their respective app stores.
Cybersecurity challenges
Any contact-tracing system would require high levels of competence and planning to mitigate the risk of unauthorized access, particularly if the data were to be stored centrally. Cybersecurity risk mitigation should also seek to reduce the impact of eavesdropping or fake exposure events. A bad actor could target specific populations by using a powerful antenna (for example, outside a police station or healthcare facility) and then submitting, via the bad actor’s app, a false report of infection. This could result in the needless quarantining of key workers.
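One commonly proposed mitigation against replayed or relayed beacons is to rotate identifiers frequently and to accept a logged sighting only if it falls within (or near) the identifier’s validity window, so that a captured broadcast quickly becomes useless. The sketch below is illustrative only; the rotation period and clock-skew tolerance are assumptions, not any deployed system’s actual parameters.

```python
from datetime import datetime, timedelta

ROTATION = timedelta(minutes=15)   # identifiers change frequently (assumed)
TOLERANCE = timedelta(hours=2)     # clock-skew allowance (assumed)

def plausible_sighting(id_valid_from: datetime, seen_at: datetime) -> bool:
    """Accept a logged sighting only if it falls within (or near) the
    identifier's validity window; stale rebroadcasts are rejected."""
    window_start = id_valid_from - TOLERANCE
    window_end = id_valid_from + ROTATION + TOLERANCE
    return window_start <= seen_at <= window_end

issued = datetime(2020, 9, 24, 12, 0)
assert plausible_sighting(issued, issued + timedelta(minutes=10))   # genuine
assert not plausible_sighting(issued, issued + timedelta(days=1))   # replayed
```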
Case study: Google–Apple and the UK app
The ‘Google–Apple’ model
In April 2020, after several national governments had already deployed their own track-and-trace apps, Apple and Google entered the market. The two companies announced that they were jointly developing functionality, to be built into their mobile operating systems, on which governments could build their apps. Because the design was decentralized, central servers would hold no information about who might have caught the coronavirus from whom.
Neither Apple nor Google was prepared to permit apps to run Bluetooth contact-monitoring technology in the background of their operating systems in a way that could allow governments to collect an anonymized overview of the contacts taking place. The companies stated that this would set an undesirable precedent, allowing governments to track their populations for potentially malicious purposes. ‘If [public health authorities] create an app, it must meet specific criteria around privacy, security and data control.’ These criteria were set by Apple and Google.
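The decentralized scheme the two companies proposed can be summarized as follows: each phone generates a secret daily key, broadcasts only short-lived rolling identifiers derived from it, and, on a positive diagnosis, publishes the daily key alone; every other phone then re-derives the identifiers locally and checks its own log. The sketch below is heavily simplified (the real protocol uses HKDF and AES key derivation rather than a bare hash) and is illustrative only.

```python
# Heavily simplified sketch of the decentralized scheme: real
# implementations use HKDF/AES key derivation; SHA-256 stands in here.
import hashlib
import secrets

def rolling_id(daily_key: bytes, interval: int) -> bytes:
    # Derive the short-lived identifier broadcast during one interval.
    return hashlib.sha256(daily_key + interval.to_bytes(4, "big")).digest()[:16]

# Each phone keeps its daily keys secret and broadcasts only rolling IDs.
infected_daily_key = secrets.token_bytes(16)
broadcast = rolling_id(infected_daily_key, interval=37)

# A nearby phone logs the opaque ID without learning who sent it.
my_sightings = {broadcast}

# After a positive test, the daily key alone is published. Every phone
# re-derives that day's rolling IDs locally and checks its own log;
# the server never sees anyone's contact history.
published_keys = [infected_daily_key]
exposed = any(
    rolling_id(k, i) in my_sightings
    for k in published_keys
    for i in range(144)  # one day of 10-minute intervals
)
assert exposed
```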
The UK’s COVID-19 app, version 1.0
Since early March 2020, the UK government had been developing its own app. The UK initially opted to use a centralized data storage model for epidemiological reasons, while incorporating numerous privacy and cybersecurity protections. The first version of the app was developed by the National Health Service’s specialist unit for technology, digital and data (NHSX), in close consultation with cybersecurity experts at the National Cyber Security Centre (NCSC) and the Information Commissioner’s Office (ICO).
Shortly after the UK app launched, it was reported to be failing to detect iPhones when devices were locked. The UK government undermined its own arguments about the safeguards protecting the identity of individuals in a centralized system after leaked advice to ministers suggested that they could be given the ability to deanonymize the data gathered by the app.
Despite extensive negotiations with a number of governments, Apple was not willing to shift its position on allowing Bluetooth to operate in the background for apps not using its decentralized infrastructure. In response, several countries that had originally pursued a centralized model – among them Germany, Italy, Denmark and Singapore – made the decision to switch to the Google–Apple model.
The political influence of the tech companies became apparent as they teamed up with privacy campaigners and often ‘play[ed] hardball with politicians’. Those familiar with the development of the UK app describe how the US tech giants worked behind the scenes to persuade elected decision-makers across several countries to reject ‘home-grown’ apps in favour of the Google–Apple model – in some cases, just prior to the planned launch.
Having tried and failed to craft its own app, the UK announced that it was shifting to the Google–Apple decentralized model, combined with a QR code-based check-in to pubs and other public venues. Version 2.0 of the UK app was launched on 24 September 2020. By the end of October it had been downloaded 19 million times. Shortly after the launch, there were reports of ghost ‘possible exposure’ messages, which security researchers attributed to competing risk algorithms being run in both the Google–Apple back end and the NHSX-developed front end.
What can we learn from the UK app story?
Tech platforms impose policy on governments
The story of the UK’s track-and-trace app demonstrates the influence exerted by Google and Apple over elected policymakers. In June 2020 Health Secretary Matt Hancock accused Apple of being ‘intransigent’ and of not doing enough to work with ‘democratically elected governments’, adding that ‘… Apple wouldn’t make the change to allow [the UK app] to work on Apple’.
However it was accomplished, the outcome was that two companies withheld access to essential technologies on the basis of their own preferred policy solution: decentralized data storage. While this may have been the option that human rights activists and technologists would have championed, it does not deliver the epidemiological benefits of the initial NHSX app design. It raises questions over the legitimacy of the policy outcome, as the tech companies embedded in the technical solution an individualistic ideology in preference to one that prioritized collective public health.
The episode highlights the power imbalances between elected governments and private sector corporations, and the significant differences in levels of accountability and transparency between the public and private sectors. It underlines the realpolitik of corporate power over that of democratically elected governments, and the companies’ willingness to block access to essential technologies and to deploy soft power in the form of lobbying. It is ironic that Google, itself a voracious collector of centralized data even where this is unnecessary to perform the relevant contract or service, could participate in barring democratic governments from adopting a centralized architecture for a health app during a pandemic, on the grounds of privacy – a case of ‘do as I say, not as I do’.
The lack of international technical standards for COVID-19 apps
Healthcare interventions typically need to conform to the highest standards of safety and efficacy, and are covered by international human rights laws. COVID-19 smartphone apps constitute a healthcare intervention, and yet, despite the pandemic’s global reach, countries have developed their apps independently. There are no internationally agreed technical standards, at once privacy-respecting and secure by design, to guide the development of track-and-trace apps in the UK and elsewhere. Such standards could potentially offer interoperability when individuals travel overseas, and at the same time protect against overreach by governments, some of which are reported to be using a COVID-19 app to record data including names, addresses, sex, gender, age, location, disease symptoms and test results.
Was the UK app really a threat to privacy and security?
By the time the UK announced its planned transition to the Google–Apple model, the development of the original app had cost approximately £11.8m.
The UK’s decision to pursue a centralized model for its app was criticized on human rights grounds. However, the public health professionals interviewed for this paper were unanimous in their opinion that centralized data is essential for epidemiological purposes. While the UK government admitted that it had failed to fulfil its GDPR requirements – by deploying the app without a Data Protection Impact Assessment – its other choices in creating the app showed a high level of respect for individual privacy and security by design, contrasting with the ‘surveillance capitalism’ of big tech.
In theory, the Google–Apple app should not collect location data. In practice, concerns have been raised about Google’s collection of data associated with app use through the software that powers its app distribution service, Google Play. Ireland’s track-and-trace app, for example, does not collect location data, yet Android users appear to suffer a degradation in service unless they enable data sharing at a low level of the platform.
Additionally, there are concerns over the issues that may arise in the longer term if Google and Apple monetize these services in the future. This infrastructure could, in theory, allow the development of large-scale, multinational contact maps enabling the capture of significant amounts of network information. It is unclear how domestic or international regulation could prevent this. Moreover, third parties – for example, health insurers – might in future produce their own apps that use the Google–Apple back end (through the published API) but collect additional data with user consent. Although, under the decentralized model proposed by Google and Apple, identifiable data are not uploaded to a central server, there do appear to be methods for harvesting data; the Irish app, for example, has used such methods to report on its effectiveness, and invites users to give consent for the collection of some data.
Was the app the correct public health response?
In June 2020 Australia’s deputy chief medical officer Nick Coatsworth criticized the Google–Apple app model thus: ‘It fundamentally changes the locus of control and takes out the middle person and the middle person is the contact tracer, the people who have kept us safe.’ Dr Coatsworth’s remarks emphasize the critical role of the human contact tracer.
A feature of the UK coronavirus response to date is how little it has leveraged the expertise and resources of its local public health teams. Even when the UK switched to a human-first track-and-trace response in June 2020, it bypassed local public health teams, preferring to recruit centrally through initiatives contracted out to private service providers Sitel and Serco. There have been repeated criticisms that the UK’s human track-and-trace efforts failed to make efficient use of the available skills and resources. One of the early recruits as a Tier 2 contact tracer, Dr Nick Cavill (who holds a PhD in public health), was interviewed as part of the research for this paper. Dr Cavill completed his training in April 2020, but by the time of our interview, in September, had not received a single assignment. It is unknown whether Dr Cavill’s experience of the track-and-trace effort during this period reflected a lack of testing, a lack of budget or other factors.
Apps are better than humans at ‘remembering’, but humans are better at understanding the significance of details, such as whether two people in contact were wearing masks, were behind protective screens, or were separated by the thin walls of adjoining flats and never in contact at all.
As a whole, the UK’s pandemic response has been criticized for being too centralized. The erosion of the public health function over the past decade, coupled with the poor use of the skilled resources that remain in place at the local level, has created a gap that even the best app could only partly fill.
Conclusion: who should make the policy decisions – tech or government?
The COVID-19 app in the UK has become emblematic of a troubling power imbalance between technology firms and elected governments. Google and Apple withheld access to essential technologies and to their app stores, and deployed their lobbying power to impose an ideology that championed individual rights over collective public health. Both sets of priorities have merit: a successful solution should be respectful of individual rights and robust from a cybersecurity perspective, while also effectively serving essential epidemiological goals.
The episode also highlights double standards in the accountability of government versus that of the private sector. Privacy advocates rightly called for accountability and transparency from the UK government over its plans to develop a COVID-19 track-and-trace app, and they were given it – with publication of the source code and of extensive detail on how the GDPR’s data minimization principle would be respected, how information security risks would be mitigated, and how the data protection authorities would be involved in the design. When the UK failed in its obligation to conduct a Data Protection Impact Assessment, it was rightly held to account.
In the case of Google at least, the big tech track record on data privacy is poor. Google’s terms of service expose an exploitative level of data processing, and the impact on individuals’ privacy is amplified when combined with the platform’s many popular services – resulting in a firehose of data from search queries, mapping, DNS queries, Gmail and video consumption on YouTube, not to mention the incidental collection of location, device details and IP addresses.
The story of the UK app can also be seen as an example of ‘tech-solutionism’, rather than a response to a well-evidenced need. The insistence on a decentralized solution placed the app primarily in the hands of individuals, rather than in those of local public health teams for whom it could have served as an additional resource.
The app story has highlighted the impact of concentration in the mobile operating platforms; the pivotal role of the app stores in enabling access to markets; the lobbying power of big tech companies, which themselves lack accountability, and their ability to withhold access to essential technologies until their preferred policy solution was adopted. Simply put, governments had no choice but to comply.