There have been numerous incidents of errors throughout the history of nuclear weapons. These include cases of nuclear weapons lost during the Cold War, such as the 1966 incident in which four hydrogen bombs fell near the Spanish town of Palomares following the mid-air collision of a US bomber with a tanker aircraft. (One of those bombs was recovered from the Mediterranean seabed only after a lengthy search.) More recent incidents, such as the collision in 2009 of British and French submarines carrying nuclear weapons while on patrol in the Atlantic, have highlighted the fact that incidents involving nuclear weapons have not ceased over time, despite the introduction of more sophisticated systems and technologies and the evolution of standards and policies. While the full extent of past incidents is unclear, multiple publicly known accidents and errors could have resulted in accidental nuclear use, and in others inadvertent escalation leading to nuclear weapons use was a real possibility.
Nuclear weapons operate within a complex and multilayered system. As such, errors can arise for a range of reasons, including technical malfunctions, procedural failures, and human and system failures. Rigorous policies and procedures for nuclear safety and security have been developed over recent decades. However, the risk of human error, whether through cognitive biases or lapses in performance, can never be fully eliminated. In the words of Lieutenant General James Kowalski of the US Air Force (USAF), ‘… the greatest risk to my force is an accident. The greatest risk to my force is doing something stupid.’
Yet the human element of nuclear safety and security has historically been a crucial factor in the prevention of escalation and accidental nuclear use. Multiple cases of near nuclear use have been averted through good human judgement, often through individuals acting against protocol to avert nuclear catastrophe, as in the 1983 Soviet nuclear false alarm incident. During a period of heightened Cold War tensions, Stanislav Petrov, the commander on duty in an early-warning satellite system control centre, interpreted an incoming US nuclear launch warning detected by satellite sensors as a false alarm. Petrov’s good judgement – or, as he described it, a ‘funny feeling in his gut’ – may have prevented further escalation up the chain of command. This perhaps averted an unintentional nuclear exchange, demonstrating the crucial role of human judgement in crisis decision-making.
However, decision-making can also be negatively affected by individual human judgement and biases, which are shaped by a complex range of factors, including an individual’s background, existing beliefs, culture and religion, as well as organizational biases more generally. One former senior official interviewed as part of this project highlighted that those responsible for handling nuclear weapons tend to come from a similar demographic background and therefore share similar beliefs and preconceptions. In crisis situations, personnel who lack contextual background and perspective on their work may misinterpret events and lose situational awareness. Particularly in hierarchical settings such as the military, unquestioning obedience to standard operating procedures and protocols, as well as overconfidence in early-warning systems and other technologies, can inhibit the exercise of discretionary judgement in a crisis, creating the potential for inadvertent escalation up the chain of command.
Errors can also result from human performance-related factors among personnel carrying out routine duties related to the handling of nuclear weapons. The day-to-day, largely repetitive duties of personnel can lead to a lack of vigilance over safety and security protocols. In 2013, several misconduct cases were reported in the US, including that of an officer caught sleeping with a door open, in violation of safety and security procedures. The monotonous duties of individuals handling nuclear weapons can result in sloppiness, as personnel find ways to overcome boredom. Throughout the history of nuclear weapons, there have also been numerous reports of alcohol consumption and drug misuse among personnel, despite the development of rigorous screening processes.
Long periods in confined and often stressful conditions, such as on lengthy nuclear submarine patrols, can lead to cases of depression and sleep deprivation among personnel that can impair decision-making. The unusual shift hours of individuals handling nuclear weapons and related control systems can disrupt the natural functioning of human circadian rhythms, and have been correlated with a greater incidence of accidents. Studies from the airline industry, where most pilot accidents in aircraft simulators occur between the hours of 3 am and 5 am, offer useful parallels for nuclear weapons operators.
One interviewee for this paper indicated that bigger errors often result from a series of low-level errors that become normalized over time, such as during routine tasks that are repetitive or overly cumbersome. The accumulation of these low-level errors can result in a more serious error. Other errors can result from the misapplication of standard procedures or direct violations of safety and security codes. Organizational culture is an important determinant of the conduct of individuals in the course of their duties. However, where procedures become too rigorous, such as through stringent inspections and overly punitive measures for mistakes, a culture of fear can develop. Personnel may then become unwilling to report errors for fear of repercussions for their career and reputation, which could lead to commanders covering up incidents. For instance, in 2014 the USAF reported that 34 officers responsible for launching nuclear missiles had been suspended for cheating in proficiency tests.
However, given the sensitivity of nuclear safety and security, the full scope of accidents across nuclear weapons states remains unknown. While information surrounding certain incidents has become publicly available, little is known about past accidents in most nuclear weapons states. This is due, in part, to concerns that incidents might reveal weaknesses in a state’s nuclear weapons capabilities, and to a desire to prevent unwanted external scrutiny of the general safety of nuclear weapons. Where information is released, the seriousness of accidents is often downplayed by governments, or the full details are not made publicly available. The US has gone some way towards acknowledging and addressing nuclear weapons incidents with a degree of transparency. For instance, in 2007, following the reported mishandling of six nuclear warheads, the USAF introduced policy changes for the handling and delivery systems of nuclear weapons, including the labelling of nuclear weapons in storage hangars with placards.
The secrecy with which nuclear weapons states treat past incidents inhibits both accountability and organizational learning. Organizations develop their own sets of shared biases that can impair their ability to learn from past incidents and the subsequent development of effective safety and security measures. The lack of transparency of most nuclear weapons states on past incidents poses challenges for assessing the extent to which incidents were met with effective policy and procedural change, as well as for understanding wider political attitudes towards nuclear weapons safety policy. Failing to provide a full account of an incident risks the development of dangerous historical narratives that prevent learning. Greater transparency at higher levels can therefore lead to swifter policy changes in response to incidents, thereby mitigating the risks arising from future incidents.
Tools and solutions
Many of the potential errors in the nuclear weapons field are a consequence of human behaviour going wrong. This is because the execution of nuclear weapons policy involves two extremes. On the one hand, weapons operators must remain vigilant even though the overwhelming majority will never be called on to perform the services for which they have been trained. On the other, when they are called on to act, the time frame for action is short and the consequences unimaginably high. By targeting problematic behaviours, behavioural insights can contribute to reducing the possibility of errors in both situations.
In his work The Design of Everyday Things, Donald Norman identifies two different kinds of errors: mistakes and slips. Whereas mistakes are the result of an agent having the incorrect goal, slips are the consequences of incorrect actions in the pursuit of correctly identified goals. Mistakes can be further subdivided into two main categories:
- Rule-based mistakes, where the incorrect course of action is taken to resolve a correctly identified issue; and
- Knowledge-based mistakes, where the agent misdiagnoses the problem due to erroneous or incomplete information.
Slips, which are more common than mistakes, can also be broken down into two categories:
- Action-based slips, where the wrong action is performed despite the correct course of action being known; and
- Memory-based slips, where memory fails, meaning an intended action is not carried out or its results are not evaluated.
Table 2 presents examples of, and potential solutions to, the four kinds of mistakes and slips applicable to nuclear safety protocols.