Perceptions, beliefs, culture, religion and education are complex factors informing human judgment and thus nuclear decision-making. Crisis decision-making is influenced by behavioural and cognitive factors such as biases, noise, perception and intuition.
Nuclear decision-making depends on many interconnected and interdependent layers of structures. These include sociocultural and historical factors, relations between states, regional crisis/conflict, threat assessments (e.g. data readings), intelligence-gathering and analysis, reconnaissance and surveillance capabilities, reliable strategic communications architecture, command and control structures for authorized use, and sensor capabilities to detect a nuclear weapons launch. Nuclear weapons are themselves complicated structures of complex design, connected to other complex systems. So far, there is no agreement within the expert community on whether such a high degree of complexity in systems design creates a risky situation, or whether it helps states achieve the strategic objective of deterring an adversarial attack through uncertainty. This underlines that each nuclear weapon state has a different level of risk appetite and risk perception, and that it is therefore hard to estimate general responses and reactions for all types of situations.
Nuclear decision-making is also complex because it involves multiple actors. The chief decision-makers, such as presidents, prime ministers and ministers of defence, receive the highest attention in this regard. Yet, nuclear decision-making also crucially involves duty officers, operators, defence and intelligence units, and military personnel, among others. Officers and operators read through and analyse data and therefore have the power to escalate a situation to higher echelons of command or to de-escalate it in times of crisis. Individual factors such as religion, education, culture and upbringing all have an impact on the decision-making process.
Historical cases such as the Able Archer-83 NATO exercise and the 1983 Soviet nuclear false alarm incident, analysed in detail in the following sections, illustrate the value of focusing on both high-level decision-makers and duty officers to address complexity in decision-making.
While the events and actors involved in each historical case are unique, there are several common themes within these examples. The decision-making process is informed by evidence and facts but also relies on other complex factors, including perception and human judgment, and their influencing factors (including beliefs, culture, religion, education and upbringing). Moreover, complex individual and behavioural factors, such as cognitive biases, intuition and gut feeling, all have a determining role in decision-making.
Biases
Nuclear weapons are increasing in salience in all nuclear weapon states. As new risks of inadvertent, accidental or deliberate use of nuclear weapons emerge, the role of humans – be it that of a duty officer or of a high-level decision-maker – is increasingly important. Emerging technologies and associated threats (such as automation, cyberthreats, ‘deep fakes’ and AI) raise concerns over reliable and credible chains of command. An often neglected but important element in decision-making is the possibility for decision-makers to make errors due to bias, faulty memories or a flawed understanding of historical incidents.
Operators’ obedience to standard operating procedures and protocol, their trust in early-warning systems, or their judgment of the adversary’s willingness to conduct a nuclear first strike are all elements that can be influenced by behavioural and psychological factors, along with evidence and facts.
A common cognitive bias at play in times of crisis is confirmation bias: the psychological predisposition to interpret new evidence in ways that align more closely with ‘existing beliefs, expectations, or a hypothesis in hand’. A related bias is anchoring: in a crisis, even though the initial information or intelligence might not provide a holistic picture, decision-makers, including officers and intelligence analysts, may ‘anchor’ their decisions on ‘the first piece of information they receive’. Thus, depending on the type of information received, there is a risk of misinterpretation and faulty decision-making.
Another type of bias is conformity bias, which presents itself in hierarchical workplaces where staff are more likely to follow the orders of higher authorities without questioning the validity or morality of the action involved. In the 1960s, the Milgram experiments showed a surprising level of obedience to authority, even when participants believed they were harming other people.
Noise
Human error might result not only from judgments that are biased but also from judgments that are ‘noisy’. In the latter case, similar groups of experts presented with the same information often come to different conclusions, owing to preconceived notions or to confounding factors that interfere with decision-making (e.g. the time of day at which a decision is taken). An example might be two judges delivering very different sentences (ranging from one month to several years) to defendants who have committed the same crime. Similarly, doctors may differ in their diagnoses of patients with the same disease. The system (e.g. the judicial system, the medical system) is noisy, and this explains variability and inconsistency in decisions and judgments. While bias occurs systematically and consistently, noise occurs randomly. Both are types of judgment error, and a decision can be both noisy and biased. Noise should be considered as an influencing factor in everyday decision-making.
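To make the distinction concrete, the minimal sketch below is purely illustrative and not drawn from the source: the 0–10 threat scale, the ‘true’ value and the bias and noise figures are all invented. It simulates a panel of assessors judging the same situation; bias shows up as a consistent offset in the average judgment, while noise shows up as scatter that remains even if that offset were corrected.

```python
import random

# Illustrative only: a toy simulation of the bias/noise distinction described
# above. The scenario, scale and numbers are invented for this sketch.

random.seed(0)

TRUE_THREAT = 5.0    # the 'correct' judgment on a hypothetical 0-10 scale
SHARED_BIAS = 1.2    # systematic tendency of all assessors to overestimate
NOISE_SPREAD = 0.8   # random, occasion-to-occasion variability (std dev)

def one_judgment() -> float:
    """One assessor's judgment = truth + shared bias + random noise."""
    return TRUE_THREAT + SHARED_BIAS + random.gauss(0, NOISE_SPREAD)

judgments = [one_judgment() for _ in range(1_000)]
average = sum(judgments) / len(judgments)
spread = (sum((j - average) ** 2 for j in judgments) / len(judgments)) ** 0.5

print(f"bias  (average error vs truth): {average - TRUE_THREAT:+.2f}")
print(f"noise (scatter of judgments):   {spread:.2f}")
```

In this toy setting, correcting bias and reducing noise are distinct interventions, mirroring the distinction drawn above.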
Decision-makers could be ‘presented with the same question and asked a very precise question’, and any variability in their responses and judgments would then be due to system noise. In nuclear decision-making, instances where expert judgment may vary fall into this category, for instance: Under what conditions would you alert higher echelons about a problem in the nuclear command centre? Or: When is it allowed/appropriate to use, or threaten to use, nuclear weapons? In order to mitigate noise, social psychologists advise that ‘noise audits’ be conducted at the organizational level. Incorporating standard operating procedures, checklists and ‘decision hygiene’ into the decision-making process may contribute to better-informed decisions on a day-to-day basis.
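As a purely illustrative aside, and not something specified in the source beyond the reference to ‘noise audits’, the sketch below shows the basic arithmetic such an audit relies on: give several assessors identical cases and measure how widely their judgments spread. Every scenario, assessor label and rating here is invented.

```python
from statistics import mean, stdev

# Hypothetical 'noise audit' sketch: several assessors rate the same alert
# scenarios on a 0-10 escalation scale. Scenarios and ratings are invented.

ratings = {
    "ambiguous sensor reading":    {"officer_1": 3, "officer_2": 7, "officer_3": 5},
    "confirmed multiple launches": {"officer_1": 9, "officer_2": 9, "officer_3": 8},
}

for scenario, by_officer in ratings.items():
    values = list(by_officer.values())
    print(
        f"{scenario}: mean rating {mean(values):.1f}, "
        f"spread (std dev) {stdev(values):.1f}"
    )
```

A wide spread of ratings for the same case would flag system noise that checklists, shared criteria or other elements of ‘decision hygiene’ might reduce.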
Social psychology scholars indicate that ‘[j]udgments are both less noisy and less biased when those who make them are well trained, are more intelligent, and have the right cognitive style. In other words: good judgments depend on what you know, how well you think, and how you think’.
Bias and noise do not necessarily drive more aggressive or escalatory behaviours: in fact, they are built into human cognition. Recognizing biases and system noise in nuclear decision-making and in nuclear weapons policy is difficult, and it takes effort and practice for decision-makers to make a habit of recognizing them.
Throughout history, near-miss incidents have often been averted by human judgment, though it is uncertain whether such judgment will suffice to prevent all future catastrophes. The incorporation of emerging technologies and of changing systems engineering practices adds to this complexity.
Perception
Another layer in nuclear decision-making is the role of human judgment, including the various risk perceptions and assumptions of the parties involved. Perceptions can be based on past experiences and informed by beliefs, culture and religion, as well as by facts and evidence. Every individual involved in the nuclear decision-making process and in nuclear weapons policy, at whatever level, naturally holds an individual understanding of the adversary that makes risk perception ‘at least partially subjective’. As personal judgment and subjectivity are shaped by facts, opinions and feelings, and are unique to every individual, they cannot be eliminated. Yet acknowledging one’s subjective thoughts, understanding the fallibility of one’s beliefs and perceptions, and employing critical thinking in a routine manner will be highly beneficial in times of crisis decision-making.
There have been several notable cases where misperception came into play and affected high-level policymaking. In 1976, under the direction of George H. W. Bush, the US Central Intelligence Agency (CIA) undertook a comparative intelligence analysis experiment, comprising two groups (Team A and Team B) with competing views of Soviet strategic intentions. The aim of the experiment was to better analyse Soviet capabilities and intentions, and to assess whether the Soviet Union’s missile capabilities could overtake those of the US. While the results of the comparative analysis are controversial due to the clashing views of Team A (composed of CIA analysts) and Team B (an outside panel of experts), the experiment highlighted the influence of perception on different groups, depending on their individual backgrounds and positions in bureaucratic or organizational politics. Overall, bureaucratic politics and the teams’ varying assumptions, preferences and positions of power, along with their clashing policy goals, had an impact on their competing analyses.
Perception of the adversary also played a role in Soviet decision-making. For instance, in the 1980s there was a growing fear within the Soviet decision-making apparatus that the US and/or its NATO allies could use a large-scale military exercise as cover for launching a surprise nuclear missile attack. Hence, the Soviet leadership focused in the first instance on ‘detecting and pre-empting’ such an attack through Operation RYaN, an intelligence-gathering operation conducted by the KGB (the security agency of the Soviet Union) together with the GRU (the Soviet military intelligence service) to prepare the Soviet Union to detect and pre-empt a first nuclear strike by the US or its NATO allies. Part of that intelligence-gathering, for example, required KGB operatives to monitor blood banks in the UK and report back if that country’s government was requesting an increase in blood supplies, paying high prices to purchase blood, or opening new blood-donor reception centres. The Soviet assumption was that increased blood supply – along with activity around places where nuclear weapons were made or stored, or around places to which government officials would be evacuated – might signal readiness for a surprise attack. This perception is likely to have had a significant impact on the Soviet Union’s strategic vision and operational planning, potentially affecting the reading of large-scale military exercises, including the misperception of NATO’s Able Archer-83 exercise as preparation for a surprise first strike.
It should also be noted that decision-makers and the expert nuclear policy community can have a somewhat distorted perception (at least discursively) that actors operate with all the information necessary to make a fully informed decision. Moreover, there is an assumption that this information exists. In reality, ambiguity and uncertainty play a key role in nuclear strategies and thus contribute to incomplete information. Therefore, even a well-informed decision does not imply infallibility. Being aware of the limits of information and data, and making decisions using this awareness, may help decision-makers and experts to perceive the problems differently.
Intuition and ‘gut feeling’
Decision-making is based not only on facts and evidence, but also on intuition, sometimes referred to colloquially in the literature as ‘gut feeling’. In nuclear decision-making, it seems that both intuition and information play a role, although the specific contribution of intuition to the process has not yet been studied in detail in the nuclear literature. In fact, mainstream nuclear theories rest on the assumption that decision-makers are rational actors, despite examples of policymakers who themselves refer to intuition. Former US president Ronald Reagan, for instance, often referred to his ‘gut feeling’ in his diary entries from 1981 to 1989.
In both the 1983 nuclear false alarm incident and Able Archer-83, as the following sections will analyse in detail, individuals responsible for decision-making referenced how their gut instinct and intuition played a part in their judgments. The Soviet officer responsible for averting nuclear conflict in the false alarm incident, Lieutenant Colonel Stanislav Petrov, famously referred to a ‘funny feeling in his gut’ as a crucial part of his decision. Similarly, one of the critical reports produced in the aftermath of Able Archer-83 highlighted how a key actor, Lieutenant General Leonard Perroots, ‘acted correctly out of instinct, not informed guidance’.
Intuition emerges from past experiences, behavioural factors, beliefs, biases and so forth. It brings a different perspective to the individual’s decision-making process. When they have opted to act on the basis of their intuition or gut feeling, people generally cannot say why they took one decision over another – it is an intuitive ‘judgment call’ that requires them to have knowledge without being able to pinpoint exactly what they know. Decisions based on intuition ‘could be right or wrong’, and thus trusting gut feelings requires special attention. Although ‘judgment heuristics are quite useful, they could lead to severe and systematic errors’. In general, experts seem to have better, more accurate intuition than laypeople, because their judgment is based on experience and on the recognition of cues and patterns. However, prudence is needed, as ‘intuitions do not all arise from true expertise’.
How humans think has been a point of discussion in many fields, notably including psychology and behavioural economics, for several decades. Dual-process theory in social and cognitive psychology addresses this question through the lenses of intuition and reasoning. There are, however, several competing models for explaining the relationship between intuitive (automatic) and deliberate (controlled) thinking. This is mainly because experts look at the issue from different angles: while Daniel Kahneman explains dual-process thinking from a cognitive psychology perspective with a focus on judgment and decision-making, Jonathan Haidt and Joshua Greene expound it from a moral psychology perspective.
Borrowing from Kahneman’s explanation, the human brain has both a fast-thinking (system 1) and a slow-thinking (system 2) mode. While system 1 is quick and impulsive, system 2 is deliberate and relies on logic and analytical thinking. In medical science, similar considerations exist, with the two hemispheres of the human brain providing insights into the link between intuition and reason. While the left hemisphere is responsible for analytical processes and logic, the right hemisphere has the role of recognizing visual patterns, hunches and non-verbal cues. System 1 is responsible for the emotional response and the gut feeling mentioned above, and it ‘will interact with the gut as well as, for example, cardiovascular, respiratory, and hormonal systems’.
In day-to-day decision-making, intuition and reason are not distinct processes; they are inherently mutually reinforcing. Information informs intuition, and vice versa. A meta-analysis on the subject found that the two processes lie on a continuum. Moreover, both processes are fallible. For instance, a common source of error in system 1 is biased output (e.g. biased decisions), while errors in system 2 may occur because the decision-maker cannot derive a solution from their existing knowledge, for example as a consequence of a lack of training or expertise.
However, decision-making in times of crisis, and especially nuclear decision-making, is not business as usual, mainly because decision-making when there is time for deliberation differs from decision-making under pressure.
In nuclear decision-making under high levels of uncertainty, for instance due to clashing intelligence reports, system malfunctions or human error, an analysis based purely on information is hard to achieve. At times of high uncertainty, duty officers need to assess the situation quickly and decide whether to report it to higher echelons. At the strategic level, such decisions may include the possibility of ordering nuclear weapons use to escalate a conventional conflict to the nuclear level.
When making decisions under stress, the human mind works both cognitively and emotionally, racing to search for ideas and solutions – it is, so to speak, in a different mindset. However, it is not only the brain that responds in this situation, but also the gut. Research on the microbiome reveals the connection between the gut and the brain (referred to in the literature as the gut–brain axis). Although much remains to be discovered, the link between the central nervous system and the enteric nervous system signals ‘direct and indirect pathways between cognitive and emotional centres in the brain with peripheral intestinal functions’. Psychological stressors (relating either to chronic or acute stress) may also have an impact on this unconscious process.
The decision-maker should have a critical mindset in order to realize that their preconceptions and prejudices might affect their instincts, or that they may not have the full information at hand to help them decide. In times of crisis decision-making, such awareness would help them to assess the situation with a different mindset. This could, for instance, be achieved through training the mind and assessing potential pathways (as well as alternative situations and scenarios) ahead of a crisis. This type of training could be conducted with individuals and collectively (e.g. in teams, departments, within governments, or on an intergovernmental basis). While individual training could address challenges that are pertinent to personal characteristics, collective training could help address cognitive biases such as groupthink.
In the context of medical surgery, Rahul Jandial, a neurosurgeon, follows a similar training practice: