Overconfidence in nuclear weapons policy refers to the tendency of policymakers and decision-makers to overestimate their own country’s military capabilities while underestimating the consequences of decisions such as not engaging in arms-control efforts. This phenomenon can lead to reckless policies and strategies that heighten the risk of nuclear conflict. Overconfidence can manifest in a variety of ways, from underestimating the potential for miscalculation to discounting the risk of accidental use of nuclear weapons. The dangers of overconfidence in nuclear weapons policy have been demonstrated in historical examples of near nuclear use, such as the Cuban missile crisis, and remain a significant concern in contemporary global affairs.
Possible examples of overconfidence in nuclear weapons policy include believing that ‘tactical’ nuclear weapons are usable and that nuclear escalation could be controlled during a crisis. It can also mean assuming that adversaries share the same perspective or assess situations in the same way. This was made evident in Kofman’s work, which highlighted how drastically Russian perspectives on military strategy differ from those of the US, citing the example of the US concept of anti-access and area denial (A2/AD), which has been used to analyse a ‘Russian doctrine or strategy for warfighting that frankly does not exist’.
One of the most notable instances of overconfidence in nuclear weapons policy occurred during the Cuban missile crisis. The Soviet Union and the US engaged in a high-stakes game of brinkmanship that nearly resulted in a nuclear war. Each side initially believed that its nuclear arsenal could deter the other from launching a first strike. However, this confidence rested on incomplete intelligence about the other’s intentions and capabilities, as well as misunderstandings over the potential for miscalculation. Soviet and US leaders only reached a turning point in the crisis when each became increasingly concerned about being unable to control escalation. The crisis highlighted the dangers of overconfidence founded on incomplete intelligence and a lack of control, and underscored the need for cautious and deliberate decision-making in nuclear policy.
The risk of overconfidence in nuclear weapons policy extends beyond intentional actions. Accidental or unintentional use of nuclear weapons is also a significant concern. Overconfidence in the safety and reliability of nuclear weapons could lead to complacency and a failure to take necessary precautions. This, in turn, could increase the risk of accidental nuclear war and of errors in nuclear weapons policy. Experts consulted also noted how overconfidence can extend to nuclear signalling and the interpretation of that signalling, which can lead to unnecessary escalations and potential conflicts. Nuclear systems are highly complex and tightly coupled, which introduces an additional layer of risk. In such systems, accidents can result from inherent complexity: the interactions between components are so interdependent that errors or failures in one area can cascade and lead to catastrophic outcomes in others. To visualize this concept, James Reason likens the levels of a complex system to slices of ‘Swiss cheese’. Within each level (or ‘slice’) of the system there are inherent weaknesses or vulnerabilities, represented by the holes in the cheese. When the holes in every slice align, the safeguards at each level of the system fail simultaneously, leading to a system-wide accident.
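To make this intuition concrete, the sketch below is a minimal Monte Carlo illustration of the ‘Swiss cheese’ idea, not drawn from the source: each safeguard layer fails with a small probability, and an accident requires every layer to fail at once. The number of layers, the failure probabilities and the ‘stress’ factor standing in for tight coupling are all hypothetical values chosen purely for illustration.

```python
import random

# Minimal illustrative sketch of the 'Swiss cheese' model (hypothetical numbers).
# An accident occurs only when every safeguard layer fails in the same trial.
# Tight coupling is approximated by a shared 'crisis' condition that raises all
# layers' failure probabilities together, so failures become correlated.

LAYERS = 4          # hypothetical number of safeguard layers ('slices')
BASE_P = 0.05       # hypothetical per-layer failure probability ('holes')
TRIALS = 500_000

def accident_rate(tightly_coupled: bool) -> float:
    accidents = 0
    for _ in range(TRIALS):
        # Under tight coupling, a common stressor occasionally degrades every
        # safeguard at once (here, all failure probabilities are tripled).
        stress = 3.0 if tightly_coupled and random.random() < 0.2 else 1.0
        if all(random.random() < BASE_P * stress for _ in range(LAYERS)):
            accidents += 1
    return accidents / TRIALS

print("independent layers:    ", accident_rate(tightly_coupled=False))
print("tightly coupled layers:", accident_rate(tightly_coupled=True))
# With fully independent layers the joint failure probability is roughly
# BASE_P ** LAYERS; correlated (coupled) failures make it markedly higher.
```

Under these illustrative numbers the coupled case produces accidents roughly an order of magnitude more often than the independent case, which is the point of the metaphor: safeguards that can fail together provide far less protection than their number suggests.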
Tools and solutions
In his seminal work Expert Political Judgment, Philip Tetlock shows that experts are often highly overconfident – and wrong – in their efforts to predict future events. In policymaking, this overconfidence can cause decision-makers to enact unrealistic plans or take excessive risks, leading to failure – for example, an infrastructure project going massively over budget, or a government being blindsided by unexpected opposition to a new policy. Similar manifestations of overconfidence, termed the ‘planning fallacy’, have been observed on numerous occasions in international security, including in the Russian invasion of Ukraine in 2022, in which the Russian government expected to achieve its goals far more easily than proved to be the case. When actors realize their mistake, they often continue on the wrong course due to the ‘sunk-cost fallacy’ – i.e. the desire to ensure that their investment of resources and human lives has not been for nothing – making them dangerous actors in the international system. As the theory of loss aversion shows, when individuals perceive themselves to be in the domain of losses (for example, losing an armed conflict), they become more risk-seeking in their behaviour. In conflicts involving nuclear weapons states, such behaviour risks a conflict escalating to the nuclear level.
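To illustrate the loss-aversion point, the short sketch below applies the standard prospect-theory value function (Tversky and Kahneman’s formulation, using their published parameter estimates) to a stylized choice between a sure loss and a gamble. The scenario and numbers are hypothetical and serve only to show why a decision-maker ‘in the domain of losses’ may prefer the riskier option.

```python
# A sketch of risk-seeking in the domain of losses, using the standard
# prospect-theory value function. The parameter values (alpha, beta, lam) are
# Tversky and Kahneman's published estimates; the choice itself is a
# hypothetical illustration, not an example taken from the source.

def value(x: float, alpha: float = 0.88, beta: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a gain or loss x relative to a reference point."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# Choice: accept a sure loss of 50, or gamble on a 50% chance of losing 100
# (and a 50% chance of losing nothing).
sure_loss = value(-50)
gamble = 0.5 * value(-100) + 0.5 * value(0)

print(f"sure loss of 50:        {sure_loss:.1f}")
print(f"gamble (50% lose 100):  {gamble:.1f}")
# The gamble carries a less negative subjective value than the sure loss, so
# the riskier option is preferred -- the risk-seeking pattern described above.
```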
In the field of nuclear decision-making itself, where the scope for learning from mistakes in calibration is limited, overconfidence can have catastrophic consequences. As identified in expert interviews for this paper, overconfidence may be an issue for those in key positions within nuclear decision-making structures. However, it is important to note that the level of overconfidence can vary widely among individuals, regardless of their profession. Norbert Schwarz’s findings suggest that when individuals feel powerful – or are simply reminded of an earlier time when they felt powerful – they demonstrate higher levels of trust in their intuition. This tendency could bias individuals towards acting in line with their intuition or their moral or religious convictions, rather than following considered, slow and cross-institutional deliberation. One illustrative example comes from the Cuban missile crisis, when President Kennedy’s military advisers continually pushed for a full-scale invasion of Cuba, despite the potential for nuclear confrontation with the Soviet Union and despite calls for restraint from US defence secretary Robert McNamara. The advisers’ attitude arose from a belief that the US armed forces could overpower the Cuban military, and from an underestimation of the number of Soviet troops stationed in Cuba.
Training and drills for individuals involved in decision-making – including policymakers at various levels – can include behavioural exercises that both highlight the complexity of nuclear deterrence and challenge the belief that a nuclear confrontation could be contained. Training exercises are already common in both civilian and military institutions responsible for nuclear weapons strategy. However, additional exercises that seek to challenge the thinking behind overconfidence are rarely incorporated into these training regimens. Such exercises could help address several problems raised during expert consultations for this paper, including the following:
- Set-piece exercises, such as NATO’s annual Steadfast Noon exercise, do not reliably replicate the uncertainty of real-life scenarios. Such exercises can therefore feel artificial, even if they fulfil other important training functions.
- Training simulations involving nuclear weapons may desensitize military personnel to their use, potentially leading to overconfidence in the ability to use a small number of tactical nuclear weapons without escalating the conflict into an all-out nuclear war.
- The way arms control and nuclear weapons history are taught in many institutions may increase desensitization. For instance, nuclear weapons history is frequently taught as a series of geopolitical decisions, treaties and strategic calculations. While these aspects are crucial to understanding, the human consequences of nuclear testing and use are often overlooked.
The behavioural exercises discussed below could help stakeholders – ranging from policymakers to nuclear weapons operators – develop better rules of thumb for nuclear decision-making. Such basic rules are particularly relevant in moments of heightened tension or crisis, when there is little time to study crisis manuals. These exercises may also enable stakeholders to reconsider long-held assumptions in ‘cooler’, lower-stakes moments.
Premortems
‘Premortems’ involve groups imagining that a plan has failed – for example, a nuclear weapon being used accidentally. The group is then tasked with working backwards to identify potential causes of the failure. The aim of the exercise is to bring to the fore previously unconscious knowledge about weaknesses in the group’s assumptions. Once weaknesses have been identified, steps can be taken to minimize the likelihood of these causes being triggered. Gary Klein has produced a practical guide to holding a project premortem. Although the guide is intended for a corporate audience, the set-up of the premortem is replicable in any team setting.
Calibration
Fundamental to overconfidence is a miscalibration between expectations and outcomes. To correct this miscalibration, stakeholders can be asked to judge whether each of a series of statements is true or false, and to state their degree of confidence in each answer. If key stakeholders are shown to be overconfident, a ‘recalibration’ can occur whereby individuals adjust their confidence in a wide array of assumptions, including their assumptions surrounding nuclear risk, their confidence in the information they receive from their intelligence agencies, and their confidence in how an adversary will perceive their actions.
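As a concrete illustration of how such an exercise can be scored, the sketch below uses a small set of hypothetical responses – each pairing a stated confidence with whether the answer was correct – to compare stated confidence against actual accuracy and to compute a Brier score. The data are invented for illustration and are not drawn from the expert consultations.

```python
from collections import defaultdict

# Minimal scoring sketch for a true/false calibration exercise.
# Each response pairs a stated confidence with whether the answer was correct.
# The responses below are hypothetical, for illustration only.
responses = [
    (0.9, True), (0.9, False), (0.9, False), (0.9, True),
    (0.8, True), (0.8, False), (0.7, True), (0.7, False),
    (0.6, True), (0.6, False),
]

# Group answers by stated confidence and compare with observed accuracy.
buckets = defaultdict(list)
for confidence, correct in responses:
    buckets[confidence].append(correct)

for confidence in sorted(buckets, reverse=True):
    outcomes = buckets[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated confidence {confidence:.0%} -> actual accuracy {accuracy:.0%}")

# Brier score: mean squared gap between stated confidence and the outcome
# (lower is better; 0 would mean full confidence in every answer and every
# answer correct).
brier = sum((c - int(correct)) ** 2 for c, correct in responses) / len(responses)
print(f"Brier score: {brier:.3f}")
```

In the invented data above, stated confidence exceeds observed accuracy in every bucket, which is the signature of overconfidence; the size of that gap gives a concrete starting point for the ‘recalibration’ described above.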