Human Rationality and Nuclear Deterrence
Introduction
Like any human belief system, nuclear deterrence depends on a nest of assumptions. One core assumption is that decision-makers are able to rank their preferences rationally and act accordingly. In this schema, it follows that the likely catastrophic consequences of nuclear weapons will induce decision-makers to act – for the most part – with greater caution, the closer they come to the nuclear brink. This allows reasonable predictions to be made about how decision-makers will behave, even in nuclear crises. In this way, cooler heads on both the US and Soviet sides are said to have prevailed throughout the Cold War, with the effect that a nuclear conflagration was avoided during various crises.12
Today, there are dynamics at work that differ from those that characterized the Cold War. A growing multipolarity in international security competition and the introduction of new strategic technologies challenge previous understandings about nuclear deterrence and strategic stability. Even some strong supporters of nuclear weapons as a deterrent capability have lately come to express doubts about their efficacy.13 Moreover, new scientific findings are increasingly calling into question some previous understandings about utilitarian human rationality – with implications that may ultimately be profound for the practice of nuclear deterrence in whatever form it takes. This chapter introduces some of these rationality-related issues.
What is nuclear deterrence?
Deterrence has been defined in various ways, but at its root it means seeking to induce caution in others by threats of pain – in this case, through the use of nuclear weapons. Deterrence and nuclear weapons are not synonymous: there are various means to deter. Looking back, the early years of the Cold War saw an extraordinary period of intellectual activity concerned with the relationship between the two superpowers, and with the specific goal of avoiding nuclear war.14 Indeed, nuclear deterrence engendered an uneasy stability between the two superpowers, although neither the US nor the Soviet Union fully accepted the notion of mutually assured destruction, and the US never ceased exploring technology, such as missile defences, that might eventually transcend it.
Utilitarian rationality-based models were never fully dominant in nuclear deterrence policymaking, even in the West. But nuclear deterrence became strongly associated with them – and, in particular, game-theoretic approaches like those pioneered from the late 1940s by analysts at the RAND Corporation.15 However, it is important to recognize that the theoretical underpinnings for nuclear deterrence did not precede or even accompany the invention of nuclear weapons. Instead, they emerged as a response to the real-world existential threat of nuclear warfare.16 (The US monopoly on nuclear weapons lasted only four years; in 1949 the Soviet Union demonstrated its own nuclear capability.) These theoretical approaches added a level of intellectual respectability and rigour to considerations about strategy forced on decision-makers by the new nuclear reality. The paradox was that each superpower prepared for the use of nuclear weapons against the other, even as there was widespread acknowledgment that nuclear warfare was highly risky and best avoided due to its catastrophic consequences. Theory and paradox have been in tension ever since in policy debates about nuclear weapons.
Mitigating the existential threat of impending thermonuclear war to bring about a relatively stable ‘balance of terror’ between the superpowers was in itself a major achievement. It depended on accepting that an opponent will act rationally, and on each side ensuring that it is never the rational choice for its opponent to act in a way that would prompt nuclear retaliation. In the terminology of game theory, acting rationally means maximizing a utility payoff. At the policy level, however, the fear of events in a nuclear crisis escalating out of control (as almost happened in the 1962 Cuban missile crisis), the possibility of misperception, and the effects of bureaucratic and domestic political concerns in shaping the understandings of nuclear-armed rivals meant that it was difficult to be so sanguine. A chief concern about nuclear deterrence and the balance of terror during the Cold War was therefore the risk of nuclear miscalculation.
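To make the utility-payoff logic concrete, the sketch below sets out a toy two-player ‘deterrence game’ in Python. The payoff numbers and the retaliation probabilities are purely illustrative assumptions, not estimates drawn from the literature; the point is only to show how a sufficiently credible threat of retaliation makes restraint the expected-utility-maximizing choice.

```python
# Illustrative only: a toy 2x2 'deterrence game' with made-up payoffs.
# Row player = challenger, column player = defender. Payoffs: (challenger, defender).
payoffs = {
    ("attack",  "retaliate"): (-100, -100),  # mutual catastrophe
    ("attack",  "back down"): (  10,  -10),  # challenger gains if the threat is hollow
    ("refrain", "retaliate"): (   0,    0),  # never arises, defined for completeness
    ("refrain", "back down"): (   0,    0),  # status quo
}

def challenger_best_response(p_retaliate: float) -> str:
    """Pick the challenger's expected-utility-maximizing move, given a
    believed probability that the defender retaliates."""
    eu_attack = (p_retaliate * payoffs[("attack", "retaliate")][0]
                 + (1 - p_retaliate) * payoffs[("attack", "back down")][0])
    eu_refrain = payoffs[("refrain", "back down")][0]
    return "attack" if eu_attack > eu_refrain else "refrain"

# With these assumed numbers, any believed retaliation probability above ~0.09
# makes refraining the rational choice - deterrence by punishment in miniature.
print(challenger_best_response(0.05))  # -> attack
print(challenger_best_response(0.5))   # -> refrain
```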
Nevertheless, the logic was clear that the effect of nuclear deterrence should be nuclear non-use, even as nuclear deterrence depends on the credible threat of actual use. The longer nuclear weapons use was avoided, the more it cemented the idea that nuclear deterrence was sustainable, especially in an era in which nuclear disarmament did not seem a convincing alternative. An emergent non-proliferation norm further buttressed this apparently stable relationship between nuclear deterrence and non-use.17 In the process, retaining nuclear weapons became self-rationalizing, even when conditions changed after the end of the Cold War in ways that created greater uncertainty about whether stable nuclear deterrence is enduringly viable.
Tricky assumptions
The basic concept of deterrence has been sliced in various ways, for purposes ranging from deterrence by denial to extended deterrence intended to discourage adversaries from coercing or attacking allies.18 At the heart of all these permutations of nuclear deterrence is the proposition that nuclear weapons will either deny an aggressor their objective, or will punish them in a way that makes the cost of their behaviour unacceptable. This near-certainty of denial or punishment means that a potential aggressor, if rational, will not take such actions in the first place. A vital element of deterrence is thus a party’s ability to convince their adversary that they will act on their commitment to use nuclear weapons if necessary. One important part of signalling this resolve is visibly making preparations for nuclear weapons use. Another is devising the means to convince an adversary that beyond a certain point it may not be in one’s hands to prevent nuclear use from occurring – the ‘threat that leaves something to chance’19 – which is supposed to be a further inducement to caution. The assumption of rationality also offers the possibility of coercing others to behave in certain ways through nuclear threats.
Challenges to the nuclear deterrence security framework take various forms. One notable problem is increasing multipolarity. Although there were eventually other nuclear powers, the US and the Soviet Union, as peer strategic competitors during the Cold War, were for the most part principally concerned about each other. As the geopolitical balance has altered, and the number of nuclear-armed states has increased to nine,20 so nuclear strategies have had to adjust. Instead of an assumed dyadic confrontation, there are now strategic triangles and even chains of nuclear-armed states21 in which crisis escalation and signalling may be considerably more complex to manage.22
A second challenge arguably arises from the threat to nuclear stability posed by some advanced technological capabilities. Stability in this sense can be defined as the absence of incentives to use nuclear weapons first (crisis stability), and the absence of incentives to build up a nuclear force (arms-race stability).23 Advancing technologies of concern include highly precise low-yield nuclear weapons, offensive cyber capabilities, autonomous weapons and artificial intelligence-based decision systems, hypersonic glide vehicles, anti-satellite weapons, and missile defences. Each of these capabilities, or the responses to their use, could blur the line between the use of conventional and nuclear weapons. This in turn could break down the distinction between nuclear and non-nuclear warfare.24 While some of these new strategic technologies might be countered or deterred in ways similar to those employed during the Cold War and after, others, such as offensive cyber operations, cannot. Cumulatively, the use of these technologies could generate significant additional ambiguity in a crisis.
Questioning common rationality in crisis
Both multipolarity and the emergence of certain new strategic capabilities complicate and undermine the theory and practice of nuclear deterrence. A further problem has gradually come to the fore as science has increased understanding of human decision-making psychology. It is that the utilitarian rationality assumption possesses significant shortcomings, a critique that in recent decades has begun to be applied to nuclear debates.
Schelling and others were convinced that deterrence could work in the short and the long run because the catastrophic effects of a deterrence failure would induce each party to be very cautious in their actions. Each would take great care to avoid using nuclear weapons for any reason, except as a last resort or unless the adversary crossed a clear line. One person’s definition of careful is not necessarily the same as another’s, however. Each individual has a perception of risk that is at least partially subjective.
New understandings about rationality and the way people actually tend to behave in stressful or crisis situations indicate that: (a) people often do not have fixed or even stable preferences; (b) they are subject to cognitive biases or constraints that shade their thinking, without them necessarily being aware of this, especially in complex or crisis situations; and (c) humans have a poor intuitive grasp of probability. Taken together, these findings raise questions about the assumption that, in a crisis, decision-makers can depend on sharing a common rationality with the other side.
Problems with preferences
Emotions tend to specify a range of options for action in a given context.25 How these rank in preference is frequently not rational in utilitarian terms. For instance, in humans and other primates, perceived unfairness is a powerful driver for behaviour that, in utilitarian terms, is not in one’s best interest.26 This is demonstrable even in simple games such as the ‘ultimatum game’, in which human test subjects will punish others for low monetary offers, even when in absolute terms they themselves stand to lose from the deal.27
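The contrast can be illustrated with a minimal sketch of the ultimatum game. The ‘fair-offer’ rejection threshold below is an assumed, illustrative figure rather than an experimental result; the point is the gap between the utility-maximizing prediction and commonly observed behaviour.

```python
# Illustrative sketch of the ultimatum game: a proposer offers a split of a pot;
# the responder either accepts (both get the split) or rejects (both get nothing).
POT = 100

def payoff_maximizing_responder(offer: int) -> bool:
    """A purely utility-maximizing responder accepts any positive offer:
    something is always better than nothing."""
    return offer > 0

def fairness_sensitive_responder(offer: int, threshold: float = 0.3) -> bool:
    """Responders in experiments frequently reject offers they perceive as unfair,
    even though rejection leaves them worse off in absolute terms.
    The 30 per cent threshold here is an assumption for illustration."""
    return offer >= threshold * POT

offer = 10  # a 'low' offer: the proposer keeps 90, the responder gets 10
print(payoff_maximizing_responder(offer))   # True  - the rational-actor prediction
print(fairness_sensitive_responder(offer))  # False - punishes unfairness at a cost to self
```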
A second issue with the idea of stable, underlying preferences arises from psychologists having shown that these preferences often depend on the way in which a situation is framed.28 People appear to have an inbuilt bias whereby they tend to be risk-averse when there is something to gain, and more risk-seeking when they fear they have something to lose. Daniel Kahneman, whose book Thinking, Fast and Slow draws on joint work undertaken with Amos Tversky, showed that this frequently leads people to be inconsistent in their preferences and decision-making.29
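As a rough illustration of this asymmetry, the sketch below applies a prospect-theory-style value function (the parameter values echo commonly cited estimates but are used here purely as assumptions) to show how the same objective stakes can produce a cautious choice when framed as a gain and a gamble when framed as a loss.

```python
# Illustrative prospect-theory-style value function. The parameters (0.88, 2.25)
# echo commonly cited estimates but are treated here as assumptions.

def value(x: float, alpha: float = 0.88, loss_aversion: float = 2.25) -> float:
    """Concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -loss_aversion * ((-x) ** alpha)

def prefers_gamble(sure_amount: float, gamble_amount: float, p: float = 0.5) -> bool:
    """Compare a sure outcome against a p-chance of a larger outcome (else nothing)."""
    return p * value(gamble_amount) > value(sure_amount)

# Gain frame: a sure gain of 500 vs a 50% chance of 1,000 -> take the sure thing.
print(prefers_gamble(500, 1000))    # False: risk-averse over gains
# Loss frame: a sure loss of 500 vs a 50% chance of losing 1,000 -> gamble.
print(prefers_gamble(-500, -1000))  # True: risk-seeking over losses
```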
One upshot of this is that decision-makers in a nuclear crisis may well act less predictably than the assumption of their rationality suggests. The issue is not that the people involved in decision-making are unaware of the potentially catastrophic consequences, nor that they choose to act irrationally. It is that, in some situations, humans are not equipped to think in a rational way, although nuclear deterrence theory assumes that they will. One study considering the role of emotions as being potentially at odds with nuclear deterrence theory observed that emotions can ‘create an overriding bias against objective facts or interfere with support mechanisms of decision-making such as one’s working memory, or the lessons we draw from past experience’.30
Cognitive bias
The kinds of issue described above relate to the fact that humans have a range of innate cognitive heuristics or biases. Psychologists have identified two decision-making systems – one that is intuitive and ‘fast’ but often imprecise (termed ‘system 1 thinking’), and one that is more considered, ‘slower’, and more akin – or amenable – to utilitarian rationality (‘system 2 thinking’). Some situations, particularly stressful ones, favour the fast, intuitive system over the slower, more deliberative one. Scholars have found instances in the Cold War when reasoned human judgment (system 2 thinking) averted close calls of nuclear use, but also instances of very great psychological pressure in crisis that came close to resulting in nuclear use, such as when the US Navy inadvertently depth-charged Soviet nuclear-armed submarines during the 1962 Cuban missile crisis.31
In addition, humans exhibit a range of biases that can affect the ways in which they observe, collect, process and evaluate information, and which can make them less aware of what is really going on around them.32 A common theme with many of these biases is that they tend to lead to interpretations of events that support existing desires and beliefs, both in individuals and groups. Thus, it cannot necessarily be assumed that adversaries in a nuclear crisis even share enough of an outlook to enable reasonable predictions about the preferences and behaviour of the other side. This would appear to be a particular danger in the context of very isolated nuclear decision-making elites – in North Korea for instance – in whose perception nuclear use might be preferable to other outcomes such as losing power.
Probability
A third problem with the assumption of rationality in nuclear deterrence theory is demonstrated by mounting evidence that human minds are poorly equipped to understand certain aspects of probability. Yet assessments of probability are important for ranking preferences rationally. In particular, humans tend to misjudge randomness and non-linearity,33 and are cognitively biased to treat unlikely events as essentially impossible.34
This is a problem not only for decision-makers in crisis situations, but for the broader policy discourse around the risks of nuclear weapons and nuclear deterrence. Nuclear weapons have existed for 75 years. Yet they have not been detonated in anger since August 1945, when the US dropped atomic bombs on the Japanese cities of Hiroshima and Nagasaki. Instead, nuclear weapons have assumed core and enduring functions within the strategies of the world’s most powerful states and their allies to deter their adversaries. Some proponents of nuclear deterrence point to this situation of nuclear possession without detonation as proof of the concept’s continued efficacy,35 when, in fact, there is no definitive evidence to support this confidence.
Inductive reasoning about the past is of limited benefit in considering ‘black swans’ (extremely low-probability/exceedingly high-consequence events), which underlie many things in reality, including the risk of nuclear war. Rather, it is conceivable that the world has simply been lucky to avoid nuclear use, whether inadvertently or deliberately caused. This is something that, as scholars have observed, nuclear policymakers seem to find hard to accept.36 Nonetheless, it would be rational to adopt a greater degree of openness towards rigorous probabilistic analysis,37 and doing so may lend greater urgency to efforts by the possessors of nuclear weapons to finally begin to transition away from these weapons as a basis for their security.
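The weakness of inductive reasoning here can be illustrated with a simple piece of arithmetic. The annual probabilities below are assumptions chosen for illustration, not estimates of actual nuclear risk; the point is that even a small yearly chance of nuclear use compounds into a substantial cumulative risk over the 75 years of the nuclear era, which is why decades of non-use are weak evidence that deterrence is fail-safe.

```python
# Illustrative arithmetic only: the annual probabilities are assumptions, not estimates.
# The point is how small per-year risks compound over decades.

def cumulative_risk(annual_probability: float, years: int) -> float:
    """Probability of at least one occurrence over the period,
    assuming independent, identically likely years."""
    return 1 - (1 - annual_probability) ** years

for p in (0.001, 0.005, 0.01):
    print(f"annual risk {p:.1%} -> {cumulative_risk(p, 75):.0%} over 75 years")
# annual risk 0.1% -> 7% over 75 years
# annual risk 0.5% -> 31% over 75 years
# annual risk 1.0% -> 53% over 75 years
```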
Concluding thought
This brief essay has introduced some issues with assumptions of rationality as they relate to nuclear deterrence. The broader point is that, when it comes to future nuclear crises, we simply do not know whether a common rationality will hold among the decision-makers involved to prevent nuclear weapon use – whether as a deliberate choice in certain cases, or as a plausible inadvertent outcome. The deductive logic of nuclear deterrence theory is not backed by enough empirical evidence from nuclear crisis or war to say with a high level of confidence that nuclear deterrence will lead to non-use in all cases. Consequently, there is an inherent risk of nuclear use in nuclear deterrence as a practice – a risk that, as noted above, is intrinsic to deterrence by its own logic, since deterrence rests on the credible threat of actual use. Given this risk, a crucial question to be asked is whether nuclear deterrence is still worth it, whatever its efficacy is perceived to have been in the past.
12 For instance, see Allison, G. T. (2012), ‘The Cuban Missile Crisis at 50’, Foreign Affairs, 91(4), July/August 2012, pp. 11–16, https://www.jstor.org/stable/23218035 (accessed 14 Aug. 2019).
13 For instance, see Krepinevich Jr, A. F. (2019), ‘The Eroding Balance of Terror: The Decline of Deterrence’, Foreign Affairs, 98(1), January – February 2019, https://www.foreignaffairs.com/articles/2018-12-11/eroding-balance-terror (accessed 14 Aug. 2019).
14 See Brodie, B. (1966), Escalation and the Nuclear Option, Princeton, NJ: Princeton University Press; Snyder, G. H. (1961), Deterrence and Defense: Towards a Theory of National Security, Princeton, NJ: Princeton University Press; and the work of Schelling, T., especially Schelling, T. (1960), The Strategy of Conflict, Cambridge, MA: Harvard University Press, and Schelling, T. (1966), Arms and Influence, New Haven and London: Yale University Press.
15 See Poundstone, W. (1992), Prisoner’s Dilemma, New York: Anchor Books. For background on game theoretic approaches see Williams, J. D. (1982), The Compleat Strategyst: Being a Primer on the Theory of Games of Strategy, New York: Dover. (This book is a reprint of a RAND Corporation publication originating from 1954.)
16 Harrington, A. I. (2016), ‘Power, violence, and nuclear weapons’, Critical Studies on Security, 4(1), p. 92.
17 Freedman, L. (2013), ‘Disarmament and Other Nuclear Norms’, The Washington Quarterly, 36(2), pp. 93–108.
18 For a discussion, see Mazarr et al. (2018), What Deters and Why: Exploring Requirements for Effective Deterrence of Interstate Aggression, pp. 7–10.
19 Schelling, T. (1960), The Strategy of Conflict, p. 187.
20 China, France, India, North Korea, Pakistan, Russia, the UK and the US, together with Israel, which neither denies nor confirms possession of nuclear weapons capability. Of these nine states, the US, the UK, France, China and Russia are parties to the NPT; India, Pakistan and Israel have never been parties to the treaty; and North Korea withdrew in 2003.
21 Einhorn, R. and Sidhu, W. P. S. (2017), The Strategic Chain: Linking Pakistan, India, China, and the United States, Arms Control and Non-Proliferation Series Paper 14, Washington, DC: The Brookings Institution, https://www.brookings.edu/wp-content/uploads/2017/03/acnpi_201703_strategic_chain.pdf (accessed 14 Aug. 2019).
22 Krepon, M. (2015), ‘Can Deterrence Ever Be Stable?’, Survival, 57(3), pp. 111–32, doi:10.1080/00396338.2015.1046228 (accessed 14 Aug. 2019).
23 See Acton, J. M. (2013), ‘Reclaiming Strategic Stability’, Carnegie Endowment for International Peace, 5 February 2013, https://carnegieendowment.org/2013/02/05/reclaiming-strategic-stability-pub-51032 (accessed 27 Nov. 2019). Acton notes this definition derives in turn from Schelling (1960).
24 Tannenwald, N. (2018), ‘The great unravelling: The future of the nuclear normative order’ in Tannenwald, N. and Acton, J. M. (2018), Meeting the Challenges of the New Nuclear Age: Emerging Risks and Declining Norms in the Age of Technological Innovation and Changing Nuclear Doctrines, Cambridge, MA: American Academy of Arts & Sciences, p. 13, https://www.amacad.org/sites/default/files/publication/downloads/New-Nuclear-Age_Emerging-Risks.pdf (accessed 14 Aug. 2019).
25 Thayer, B. A. (2007), ‘Thinking about Nuclear Deterrence Theory: Why Evolutionary Psychology Undermines Its Rational Actor Assumptions’, Comparative Strategy, 26(4), pp. 311–23, doi: 10.1080/01495930701598573 (accessed 14 Aug. 2019).
26 Proctor, D., Williamson, R. A., de Waal, F. B. M. and Brosnan, S. F. (2013), ‘Chimpanzees play the ultimatum game’, Proceedings of the National Academy of Sciences, 110(6), pp. 2070–5, https://doi.org/10.1073/pnas.1220806110 (accessed 14 Aug. 2019).
27 For a description, see Borrie and Thornton (2008), The Value of Diversity in Multilateral Disarmament Work, p. 31.
28 For example, as shown by the Wason Selection Task. See Borrie and Thornton (2008), The Value of Diversity in Multilateral Disarmament Work, p. 32.
29 Kahneman, D. (2011), Thinking, Fast and Slow, London: Allen Lane, pp. 367–8.
30 Thayer (2007), ‘Thinking about Nuclear Deterrence Theory: Why Evolutionary Psychology Undermines Its Rational Actor Assumptions’, p. 317.
31 Lewis, P., Williams, H., Pelopidas, B. and Aghlani, S. (2014), Too Close for Comfort: Cases of Near Nuclear Use and Options for Policy, Chatham House Report, London: Royal Institute of International Affairs, pp. 8–9.
32 Fine, C. (2006), A Mind of Its Own: How Your Brain Distorts and Deceives, London: Icon.
33 Taleb (2004), Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets.
34 Borrie and Thornton (2008), The Value of Diversity in Multilateral Disarmament Work, p. 43.
35 For instance, see Tertrais, B. (2011), In Defense of Deterrence: The Relevance, Morality and Cost-Effectiveness of Nuclear Weapons, Paris: Institut Français des Relations Internationales, p. 26, https://www.ifri.org/sites/default/files/atoms/files/pp39tertrais.pdf (accessed 14 Aug. 2019).
36 Pelopidas, B. (2017), ‘The unbearable lightness of luck: Three sources of overconfidence in the manageability of nuclear crises’, European Journal of International Security, 2(2), pp. 240–62, doi: https://doi.org/10.1017/eis.2017.6 (accessed 14 Aug. 2019).
37 See Box 1 in Borrie (2014), Risk, ‘normal accidents’, and nuclear weapons.