Previous cases of near nuclear use provide insights into the human judgment processes in decision-making, highlighting the role of uncertainty and complexity in determining the outcome of critical nuclear decisions.
To illustrate uncertainty and complexity within the nuclear decision-making process, this section focuses on three case studies: the 1983 Soviet nuclear false alarm incident (also known as the Serpukhov-15 or Petrov incident), the Able Archer-83 exercise, and the 1995 Norwegian rocket launch incident (also known as the Black Brant scare).
The authors have chosen these particular cases to analyse in further detail for two reasons: they allow a comparison of the impact of the security environment on nuclear decision-making; and they enable an understanding of the role of uncertainty and complexity under consistent security conditions.
Both Able Archer-83 and the nuclear false alarm incident took place in 1983, providing an opportunity to compare two incidents where security concerns were similar, in a context of elevated security tensions. On the other hand, the study of the 1995 Norwegian rocket launch incident allows the researchers to judge whether – and to what extent – the degree of tension in the security environment matters, for instance in causing misperception, escalation or de-escalation.
Whereas Able Archer-83 provides insights into the misperceptions of chief decision-makers, especially in relation to the military exercises and training of forces that take place at times of elevated tensions, the 1983 nuclear false alarm incident captures the role of a duty officer (Lieutenant Colonel Stanislav Petrov) as a decision-maker, who trusted his ‘gut feeling’ and decided to relay information about incoming intercontinental ballistic missiles (ICBMs) to higher echelons as a false alarm. This action de-escalated a situation that could have led to the issuing of a preliminary command, which would in turn have unlocked the command and control chain to make it ready for a launch order from a decision-maker. Although Petrov trusted his intuition, guided by the experience that Soviet early-warning systems were patchy at the time, it is likely that he made a definitive choice without being consciously aware of it as he interpreted the data suggesting the presence of oncoming missiles. As the psychologist Daniel Kahneman has described, the interpretation that comes to mind in such moments can dominate the situation, leaving the decision-maker unaware of the uncertainty and ambiguity at play.
Lastly, the 1995 Norwegian rocket launch incident highlights the importance of information sharing and situational awareness in preventing misunderstandings, even in an amicable international security environment. Information sharing diminishes misperception and aids de-escalation.
These cases not only highlight the importance of leadership and human judgment but also provide insights into cognitive biases (i.e. over- and/or underconfidence in nuclear command and control systems) and their impact on decision-making. Trust is also a common thread within these cases, especially in terms of reliance on humans as gatekeepers to prevent future catastrophes. These incidents showcase the importance of maintaining an amicable security environment to ensure that uncertainty does not reach unacceptable levels. The lessons learned from them can feed into arms control negotiations and discussions at the NPT review process.
Case study 1: the 1983 Soviet nuclear false alarm
Introduction
The events of 26 September 1983 occurred in an atmosphere of extremely high tensions, and the incident is widely regarded as one of several low points in US–Soviet relations during the course of the Cold War. In the early hours of the morning, the Soviet ‘Oko’ [Eye] missile early-warning satellite system, the control centre of which was located at Serpukhov-15, southwest of Moscow, detected a suspected inbound attack from the US. This case study considers the decision-making process that unfolded in response to this warning; the context in which the decisions were made, including the role of uncertainty and complexity; and the lessons that this event provides for nuclear scholars and policymakers.
Distrust and uncertainty were at some of their highest ever levels during this period of modern history, as a result both of a ratcheting-up of anti-Soviet rhetoric by the US and deep mistrust of the US on the part of the Soviets, which had been punctuated intermittently by a series of events outlined below. As a consequence the margin for miscommunication, misinterpretation or error was perhaps at its slimmest, and both US and Soviet nuclear arsenals were on near hair-trigger alert, a policy by which nuclear weapons are maintained in a ready-to-launch status to enable rapid deployment. It is for these reasons that this false alarm, in which the Soviet early-warning satellite system mistakenly identified an inbound nuclear attack from the US, could very easily have resulted in retaliation in the form of nuclear war.
Security environment
The early 1980s was a particularly tense and unsettled period within the Cold War. The Reagan presidency, which began in 1981, was characterized by a more robust and confrontational rhetoric on the part of the US towards the Soviet Union than the presidency of Jimmy Carter which preceded it. One of the first indications of this more hawkish approach came rather swiftly in 1981 when Reagan prioritized the modernization of the US nuclear arsenal, in combination with an accelerated general military build-up. It was Reagan’s belief that this development of military capabilities was required in order for the US to be able to bargain with the Soviet Union from a position of strength, leading to a 34 per cent increase in US defence spending between 1981 and 1986.
It is hardly surprising that this posturing did not go unnoticed by the Soviets, who had become increasingly concerned about the prospect of the US developing the capability to carry out a decapitating nuclear first strike. In early 1981, the then chairman of the KGB, Yuri Andropov, stated that the prospect of such an attack was sufficiently significant for Operation RYaN to be initiated in order to counter this threat. This operation consisted of extensive intelligence-gathering on possible signs of an impending first-strike attack by the US, reports of which were often inflated and inaccurate, which then increased the appetite for even more intelligence. This contributed towards the perpetuation of a cycle of fear and mistrust that continued apace from 1981 and became more acute as Andropov ascended to become leader of the Soviet Union in 1982.
Further contributing factors to the degradation of the US–Soviet relationship in the early 1980s included the failed initial efforts in 1982 towards the bilateral Strategic Arms Reduction Treaty (START), President Reagan’s subsequent public designation on 8 March 1983 of the USSR as an ‘evil empire’, and the announcement on 23 March of the US pursuit of the space-based anti-ballistic missile Strategic Defense Initiative (SDI – also known as the ‘Star Wars’ programme). In combination, these strategic developments contributed incrementally to the worsening of US–Soviet relations to depths that had only previously been witnessed during the Cuban missile crisis.
An additional factor that contributed to heightened tensions between the US and the Soviet Union was the shooting down on 1 September 1983 by Soviet forces of a Korean Air Lines Boeing 747, acting under the presumption that it was a US reconnaissance flight, demonstrating and further exacerbating the strained relations between the superpowers.
The Korean Air Lines Flight 007 incident is a good illustration of decision-making amid uncertainty. The civilian Korean Air Lines flight, which was en route from New York to Seoul via Anchorage in Alaska, entered Soviet airspace due to an error made while flying in autopilot. As a result, the airliner went off course and headed towards the Kamchatka Peninsula in the far east of the Soviet Union. On the same night, the Soviet Union was tracking a US Air Force (USAF) surveillance jet plane (an RC-135), which was waiting to monitor a scheduled Soviet ballistic missile test. Both planes had similar features (e.g. four engines under the wings); confusion occurred when both planes crossed paths in the radar readings, and the system ‘somehow lost the RC-135 and picked up the 747, now unexpectedly heading directly for Kamchatka’. The USAF jet returned to its base, as the expected missile test did not take place. The ground controllers tried to determine the type of the plane based on its number of engines, but the weather conditions were poor. A Soviet fighter pilot, Lieutenant Colonel Gennadi Osipovich, tailed the Korean airliner and reported that the plane had ‘flickering flashing lights’, but this did not raise alarms on the ground. Osipovich received the order to fire. All 269 of the civilians on board the aircraft – including Larry McDonald, a member of the US Congress – died in the incident, which President Reagan called a ‘massacre’. This incident was a tragic culmination of the fear, distrust and paranoia that had been engendered through an extended period of escalation.
In the case of the September 1983 false alarm, mistrust between the US and the Soviet Union meant that the warning of an incoming attack came at a moment when an unprovoked attack had been considered and feared for some time, lending the threat a greater degree of credibility than it would perhaps have been accorded at other stages of the Cold War – an example of confirmation bias at play. Only months later, these events were also to contribute towards the overarching security environment in the case of Able Archer-83, as will be covered below.
Timeline and decision-making
The 1983 nuclear false alarm incident occurred in relative isolation from the external world, and in a compressed time frame. Other than among the Soviet leadership, no details of the incident were made public until the 1990s (when former colonel general Yuri Votintsev made reference to it in his memoirs). Given that information on the incident was strictly classified, the following account has been woven together from the fullest information available – primarily from published interviews with Lieutenant Colonel Petrov, whose role in the decision-making process cannot be overstated.
In the early hours of 26 September 1983, Petrov sat at the control console of the Oko early-warning satellite system at Serpukhov-15, the military hamlet that housed the system’s control centre. Suddenly, a launch warning began to flash, indicating at first that a single inbound ICBM travelling from the US had been detected by the satellite sensors. Shortly afterwards, several more ICBMs were registered, giving a total of five registered launches.
As the commander on duty, responsibility fell to Petrov to verify the accuracy of these warnings and relay his assessment to his superiors. What is considered to be the most reliable and comprehensive account of the standard operating procedures in these circumstances was recorded by David Hoffman, who noted that: ‘Petrov was situated at a critical point in the chain of command, overseeing a staff that monitored incoming signals from the satellites. He reported to supervisors at warning-system headquarters; they, in turn, reported to the general staff, which would consult with Soviet leader Yuri Andropov on the possibility of launching a retaliatory attack.’ Petrov was therefore obliged by protocol to inform his supervisors. However, before doing so, he first attempted to verify the unexpected warning.
As a first port of call, Petrov checked the computer readings from additional satellites within the Oko constellation, which matched those of the initial warning. The warnings appeared to be further corroborated by the figure ‘3’ that had appeared on the command console, indicating reliability of the highest order. Paradoxically, it was this detail that began to sow doubt in Petrov’s mind as to the veracity of the satellite warning: as was later noted, in such conditions ‘the system technically could not give the highest degree of reliability’. At this juncture, Petrov sought to scrutinize the satellite data further by cross-referencing it with other sources. Unfortunately, poor weather stymied the possibility of visually verifying the satellite information.
As a result, the decision as to whether to confirm the threat as legitimate or to inform his supervisors that this was a false alarm fell to Petrov. The window of opportunity available to him was limited, which undoubtedly contributed to the pressure under which his decision was made. While Petrov admitted later in his life that he believed that the odds of the threat being genuine or false were ‘50/50’, he ultimately made the decision to report the incident as a false alarm. A number of considerations influenced this final decision, ranging from a lack of corroborating radar or telescopic data to assumptions that missiles would not be launched from only one base, nor would they be so few in number in the case of a first strike, as well as an impossible-to-ignore feeling in his gut that the US would not launch a sudden attack on the Soviet Union in this manner. All of these indicate how system 1 thinking (in other words, fast and intuitive thinking), along with system 2 logical considerations, was at play within Petrov’s decision-making.
Following the relaying of the nuclear false alarm message, it was then a matter of waiting to be proven either correct or incorrect. Fortunately, Petrov’s assessment proved correct in this instance, thus arresting the chain of crisis escalation at a relatively early point in the decision-making process: he incorporated all the available information at his disposal and reported his judgment along with the facts, in spite of the standard operating procedures. Later assessments indicated that the false alarm was caused by sunlight reflecting off high-altitude clouds, which the satellites’ infrared sensors registered as missile launches. As a result of Petrov’s actions, it is generally understood that the details of the suspected attack were not discussed with senior Soviet officials at the time of the incident, though some divergences exist on this.
The decision-making process throughout the 1983 nuclear false alarm, outlined above, makes clear that there are moments (or critical nodes) at which different decisions could have been reached, thus affecting the trajectory of the crisis. The possible alternative outcomes and important nodes are outlined below, to provide a better picture of the complexity inherent in nuclear decision-making.
Critical nodes and alternative pathways to decision-making
1. The functioning of the early-warning system
Several elements combine to make the eventual outcome even more fortuitous than it appears at first sight. Firstly, minor conjecture exists as to whether the launch reports from the satellites immediately bypassed Petrov and were automatically escalated to his superiors for their immediate consideration. Hoffman’s account, cited above, places Petrov at a critical point in the chain of command, with warnings passing through him and his staff to warning-system headquarters.
Meanwhile, Eric Schlosser notes that: ‘[t]he Soviet general staff was alerted, and it was Petrov’s job to advise them whether the missile attack was real’, while Forden, Podvig and Postol have suggested that it was ‘possible that these warnings were automatically sent on to the Soviet General Staff’.
In an interview with TIME magazine in 2015, Petrov remarked that: ‘We built the system to rule out the possibility of false alarms’, which points to the systems engineering concept – and best-practice measure – of ensuring security by design. Nevertheless, nuclear command and control architectures are composed of complex systems that interact with each other in a non-linear manner, and they can still fail to operate as intended, just as occurred in this incident.
Moreover, the past experiences of both the General Staff and Petrov with failures of the early-warning systems seem to have had an effect on their judgment. The known accuracy of the Oko satellite system, and the fact that ‘it had been rushed into service’, would have set a baseline for how far these systems could be trusted. By the time of the incident, as historians have indicated, 12 Oko satellites had already failed in a system that had been put into service only the previous year, which justified doubt over its reliability.
Petrov’s retrospective thinking on the subject also shows personal, societal, political, economic and cultural considerations within Soviet decision-making, specifically in that the leadership did not overtly acknowledge the incident at the time. As Petrov noted in a 2004 interview with the Moscow News: ‘If I was to be decorated for that incident, someone would have had to take the rap – above all, those who had developed the [ballistic missile early-warning system], including our renowned academicians who had received billions and billions in funding.’
This incident also highlights the role of human–machine interaction and the need to maintain oversight of both human and machine, given the fallibilities inherent in each.
2. The Soviet nuclear posture
US–Soviet relations appeared to be at a low ebb at the time of the false alarm incident. Despite this, it is believed that the Soviet posture was to launch under attack, rather than to launch on warning. This is critical, given the earlier consideration of the uncertainty that surrounds whether the warning reports were escalated to the General Staff automatically when the satellites began to register suspicious activity. Had the Soviets adopted a launch-on-warning posture, in light of the perceived level of threat from the US, this would have narrowed – or perhaps removed entirely – the already limited decision-making window, estimated at eight to 15 minutes, within which checks could be carried out.
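The significance of posture can be made concrete with some back-of-the-envelope arithmetic. The sketch below (in Python) uses purely illustrative durations – assumptions chosen to be consistent with the eight- to 15-minute estimate above, not official figures – to show why launch on warning compresses verification time so severely, whereas launch under attack removes the need to decide on warning data alone.

```python
# Back-of-the-envelope decision-window arithmetic. All durations are
# illustrative assumptions, not official figures.

ICBM_FLIGHT_MIN = 28        # assumed missile flight time to target
DETECTION_DELAY_MIN = 5     # assumed time for satellites/radars to register launch
EXECUTION_TIME_MIN = 10     # assumed time to relay and execute a launch order

# Launch on warning: retaliation must be ordered before incoming missiles
# arrive, so all verification and judgment must fit in what remains.
window = ICBM_FLIGHT_MIN - DETECTION_DELAY_MIN - EXECUTION_TIME_MIN
print(f"launch-on-warning verification window: ~{window} minutes")  # ~13

# Launch under attack: the decision can await confirmed detonations, so a
# satellite warning alone - as on 26 September 1983 - need not force a choice.
```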
3. Individual decision-makers’ characteristics
The happy chance that Petrov was the officer in command on the night of the incident has been well documented, and it is difficult to overstate the centrality of his thought process and character to the avoidance of escalation. The combination of Petrov’s education and intuition was instrumental in the handling of the crisis.
Petrov’s background as a scientist was extremely important in determining his response to what was placed in front of him that night. It just so happened that Petrov was not originally scheduled to be on shift that evening, but was in fact covering for a sick colleague. It is suggested that Petrov’s scientific background led him to scrutinize the data from the satellites more critically than one of his colleagues might have done. It is likely that Petrov’s assessment of the situation was informed by his engineering background and his aptitude for diagnosing computer malfunctions. Petrov had been ‘de-bugging the main computer for several weeks’. Thus his experience and expertise most likely led him to recognize the patterns and question the information with which he was being presented. The US nuclear security expert Bruce Blair, who had the opportunity to interact with Petrov on occasion, believed that Petrov ‘had not been trained and conditioned to respond to warnings by checking boxes and accepting computers’ assessments as final’.
Particularly when considering the relative isolation in which this incident took place, and the short time frame in which it played out, it is impossible to discount the personal characteristics of those involved – specifically Petrov – in determining the outcome of the incident.
Lessons learned
The 1983 nuclear false alarm incident provides several lessons as a cautionary tale for future decision-makers.
The first of these lessons concerns the oversight of digital systems, such as computers and AI – a concern likely to be of even greater relevance in the coming years. Neither a computer nor a human is infallible, and both have the capacity to make mistakes or misinterpretations. The incident also highlights the possibility of unforeseen chains of events, such as the satellites mistaking reflected sunlight for the launch of ICBMs.
In future, human oversight and understanding of how these systems work will be imperative as states aim to automate a greater number of the processes involved in nuclear command, control and communication. Just as Petrov, through his background as an engineer, was able to cast a critical eye over the data that had been gathered by the satellites, he was also able to bring wider nuance to this calculation – for example, the strategic understanding that a first strike of only five ICBMs would be highly illogical. While several factors, such as the number of incoming missiles, could be programmed into the algorithms within early-warning systems, a significant challenge lies in the limits of programmers’ ability to anticipate every possible outcome, and in ensuring robustness, so that systems trained in one environment can still function in others.
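As an illustration of both the promise and the limits of encoding such factors, the sketch below expresses the plausibility checks that Petrov applied mentally as rules that a hypothetical early-warning filter might apply. All field names and thresholds here are invented for illustration; they are not the logic of any real system.

```python
# A purely illustrative plausibility filter (hypothetical field names and
# thresholds). It encodes, as rules, the checks Petrov applied mentally
# on 26 September 1983.

def assess_warning(warning: dict) -> str:
    reasons = []
    if not warning.get('radar_corroboration'):
        reasons.append('no corroborating radar or telescopic data')
    if warning.get('launch_sites', 0) <= 1:
        reasons.append('a first strike from a single base is atypical')
    if warning.get('missile_count', 0) < 10:  # hypothetical threshold
        reasons.append('too few missiles for a credible first strike')
    if reasons:
        return 'probable false alarm: ' + '; '.join(reasons)
    return 'escalate to command'

# The readings described above: five ICBMs, one base, no radar confirmation.
print(assess_warning({'missile_count': 5,
                      'launch_sites': 1,
                      'radar_corroboration': False}))
```

A rule set of this kind would have flagged the 1983 readings only because its authors happened to anticipate the right pattern; a sensor artefact or attack profile falling outside the rules would pass unchallenged, which is why the human oversight discussed here remains essential.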
Greater automation is inevitable in the coming years. The lesson from the false alarm incident, however, is that human oversight over these processes must be retained, and that decision-makers must encourage duty officers to think critically and creatively about the systems they interact with, and about the way they interpret information and system calculations, in order to improve their understanding of the limits of both human and machine. Recognizing that the possibility of failure is inescapable, the human mind (or the human–machine interface, in the future) could prepare alternative options and solutions ahead of time.
A second, related, lesson that can be drawn from this incident is the value in having a breadth of actors within the decision-making process at different levels and with complementary expertise. As was highlighted, the fact that Petrov was perhaps an outlier within the environment, having not had as substantial a record of combat history as his colleagues and being an individual with a scientific background, may have proved beneficial to the decision-making process. Conversely, the presence of differing perspectives alone may not be enough for the decision-making process to receive adequate scrutiny, and could perhaps be considered as an undesirable impediment to swift decision-making in times of crisis, when windows of opportunity are small. However, it does seem that this case study illustrates how the incorporation of civilian perspectives and/or scientific rigour, as well as individual habits and perspectives more broadly, can play an important consultative role within the decision-making process on the authorization of a nuclear launch or retaliation.
A final lesson from the false alarm case centres on lines of communication during such crises and the need for alternative means by which reliable information can be gathered when relations between countries are at their lowest ebb. As noted in the earlier section on the security environment in advance of the incident, both rhetoric and actions on both sides had been deteriorating at a significant rate in the preceding years. Thus, when the events of that day began to unfold, the possibility – however extreme – that the US had indeed launched a limited first strike was not wholly discountable. Faced with such a major decision, and without the means to further verify the accuracy of the satellite reports until it was perhaps too late, the Soviet authorities had to act on very patchy, incomplete information. Although the Soviet Union and the US had established a communications hotline in the wake of the Cuban missile crisis, it was not used in this incident. Engaging in military-to-military communication or using established hotlines could reduce tensions and clarify miscommunications in times of crisis.
Case study 2: The Able Archer-83 exercise
Introduction
The declassification by the US government in the early 2010s of several Cold War-era documents has shed new light on the events surrounding a NATO nuclear-preparedness exercise, codenamed Able Archer-83, which began on 2 November 1983 and which has served to highlight just how close the US and the Soviet Union came to a nuclear confrontation as the exercise progressed. Able Archer-83 was designed to test NATO’s operating procedures should a conventional war in Europe escalate to the potential use of nuclear weapons. This incident elevated tensions between the US and the Soviet Union to a point where a nuclear attack from either side could easily have been provoked.
Security environment
As highlighted above in relation to the 1983 nuclear false alarm incident, by the time the Able Archer-83 exercise took place in November 1983 relations between the Soviet Union and the US were particularly tense. Detente between the two superpowers had largely subsided, and both sides’ leaderships had begun to adopt distinctly more hostile rhetoric towards one another. Reagan’s renowned speech of March 1983, in which he dubbed the Soviet Union an ‘evil empire’, was followed within weeks by the announcement of the SDI (see above), a proposed missile defence system capable of intercepting Soviet ICBMs and a signal of the growing emphasis on strategic defence in US nuclear policy.
By 1983, suspicions within the Soviet Union in relation to US actions had similarly reached an unusually high point, even in the context of the Cold War. In particular, the Soviets’ Operation RYaN had contributed significantly to the atmosphere of mistrust at the time of Able Archer-83. Operation RYaN was an intelligence operation intended to help plan and prepare for a defensive, pre-emptive first strike, based on the premise that intelligence could be gathered on a range of social, economic and political indicators to signal that the West (notably the US) was preparing for nuclear war by means of its own all-out nuclear first strike.
KGB documents dating from 1983 – but not released until some years later – had warned of ‘indirect indications of preparation’ for nuclear war by NATO nations, a misperception reinforced by the Soviet Union’s detection of a spike in classified communications between London and Washington. The nature of this Soviet intelligence collection operation, and its founding assumption that a US/NATO first strike was inevitable, highlights the role of perceptions and the state of mind within the Soviet Union – which was mirrored within the US leadership. NATO’s war gaming exercise was thus conducted in a security environment fraught with hostility and suspicion that exacerbated the repercussions of the ensuing misinterpretations.
Timeline and decision-making
The main phase of NATO’s Able Archer-83 war gaming exercise took place from 7 to 11 November 1983 and postulated a hypothetical scenario in which Warsaw Pact forces outnumbered those of the US and NATO. The exercise itself differed in several ways from previous iterations of the annual exercise: these distinctions are likely to have contributed to the Soviet Union’s misperceptions and misunderstandings and to have given rise to its heightened response. One of the most significant differences was NATO’s inclusion in Able Archer-83 of rehearsals for the launching of nuclear weapons. This exercise was also more ‘provocative’ in nature than its predecessors, as it uniquely involved differently coded messaging formats, a higher state of alert than previous iterations, and the incorporation of ‘live mobilization exercises from some US military forces in Europe’. Other non-routine elements included long radio silences, a shift of command to an Alternate War Headquarters, and reports of ‘nuclear strikes’ on open radio frequencies that could have been interpreted as real. In the context of Operation RYaN, with the KGB and GRU paying heightened attention and reacting with a sense of alarm to changes in routine procedures, the specific features of Able Archer-83 led the Soviet Union’s authorities to consider that differences had been deliberately introduced to cover for a real first strike against the Soviet Union.
Despite the solely preparatory nature of the Able Archer-83 exercise, an emergency flash telegram was sent on either 8 or 9 November by Soviet intelligence officers to KGB residencies in Western Europe to inform them that NATO forces had been placed on high alert and to ask intelligence officers to seek out further information suggesting US/NATO preparation for a first strike. In response, the Soviet Union moved its ICBMs with nuclear warheads to their launch sites, deployed submarines carrying nuclear ballistic missiles under the Arctic ice cap, increased the number of reconnaissance flights and heightened the readiness of Soviet air units in Eastern Europe. There is contention over whether the Soviet leadership did in fact consider that an attack was imminent, owing to the general absence of mentions of Able Archer-83 in Soviet leaders’ memoirs; however, the scale and nature of the Soviet response suggest that Soviet leaders were involved in the decision to heighten the state of alert.
The Soviet Union’s heightened response to Able Archer-83, while not recorded by the US’s early-warning system (it is not known why), was observed by Lieutenant General Leonard Perroots, the assistant chief of staff for intelligence at the USAF in Europe, who reported the atypical Soviet state of alert to General Billy Minter, the commander-in-chief of the USAF in Europe. When Minter asked Perroots whether the USAF should increase its real force generation, Perroots advised that there was insufficient evidence to justify doing so, and that the situation should instead be closely monitored in case of any changes. As a result, neither the US nor NATO decided to increase real force generation, and the Soviet Union lowered the state of alert of its missiles and forces. Able Archer-83 concluded on 11 November without a military confrontation between the two superpowers.
Critical nodes and alternative pathways to decision-making
Throughout the trajectory of Able Archer-83, there were several moments when different decisions or circumstances would have led to significantly different pathways and potentially to more escalatory outcomes. These critical nodes highlight how the events of November 1983 could easily have escalated into a nuclear stand-off between the US/NATO and the Soviet Union. This section will explore the various decision-making scenarios that, had different decisions been made or alarms been raised, could have led to a confrontational and nuclear outcome.
1. The Soviet Union’s response to Able Archer-83
In response to the perceived threat of the Able Archer-83 exercise, the Soviet Union began preparations for a possible use of nuclear weapons and placed the Soviet 4th Air Army into a heightened state of readiness.
Several factors contributed to the Soviet interpretation of Able Archer-83 as a cover for a first strike, including: a) the non-routine elements of the exercise; b) the particularly hostile relations between the US and the Soviet Union in November 1983; and c) the possible confirmation bias within the Soviet authorities as their intelligence analysts were seeking evidence to suggest that the West was preparing for a first strike.
If NATO had signalled to the Soviet Union that it was planning to conduct a non-routine exercise, and had given advance warning of the integration of rehearsals for the launch of nuclear weapons, it is likely that the misinterpretations and heightened state of nuclear alert engendered by Able Archer-83 would have been avoided. Had NATO refrained from conducting an exercise of this scale and nature during a time of heightened tensions, the misinterpretations (and the escalated responses) would have been averted altogether.
Alternative pathways in this incident could have brought about a Soviet conventional weapons attack or a pre-emptive first nuclear strike in response to the perceived threat of the Able Archer-83 exercise. However, a greater number of steps would have had to be involved for any sort of weapons release to have materialized. Had the evidence suggesting that Able Archer-83 was a veil for a real attack been more convincing and incontrovertible, it is possible that the Soviet Union might have launched a conventional attack against the West. A conventional attack conducted in the tense security climate of 1983, with nuclear weapons on heightened alert, could also have escalated into a nuclear stand-off between the two superpowers. This alternative scenario would have been more likely had senior members of the US leadership – including President Reagan and Vice-President Bush – participated in the Able Archer-83 exercise, as had originally been planned. The decision not to include the president and vice-president had been taken by the US national security advisor, Robert McFarlane: it is likely to have contributed to alleviating the sense of alarm within the Soviet decision-making structure, and it possibly averted a more rapid and escalatory response from the Soviet Union.
2. The role of individual decision-makers
The role played by Lieutenant General Perroots in de-escalating the Able Archer-83 crisis offers a thought-provoking point of departure for an ensuing (and perhaps likely) alternative pathway that could have resulted in a nuclear confrontation between the US/NATO and the Soviet Union. Perroots’s individual decision-making characteristics played a significant role in the Western allies’ decision to leave the alert posture of their forces unchanged; his decision against escalating US forces in response to the Soviet force escalation was, as the 1990 report of the President’s Foreign Intelligence Advisory Board highlighted, largely based on ‘instinct, not informed guidance’, and underlines the influence of gut instinct in determining the trajectory of nuclear decision-making processes. The decision did not follow US standard procedure at the time, which would have required Lieutenant General Perroots and General Minter to alert their superiors in order for the US to decide whether to increase its state of alert. Had a different officer been on duty, or had a different commander received Perroots’s recommendation to do nothing, the US might have increased its own real alert posture. Indeed, Perroots stated in his end-of-tour report addendum: ‘If I had known then what I later found out I am uncertain what advice I would have given.’ Perroots was referring to a full understanding of the scale of the Soviet alert: this statement highlights the fortuity and serendipity of the decision taken. Not only does his decision point to the role of gut instinct in nuclear decision-making, it also signals the role of luck which, according to Benoît Pelopidas, ‘seems to have constantly escaped the learning process’ in nuclear weapons policy. In the case of Able Archer-83, Perroots’s decision to trust his instinct in preference to carrying out standard operating procedures may largely be attributed to luck, which can be argued to have played a significant role in averting nuclear crises historically.
The personal nature of this decision also highlights the influence that individual actors had in de-escalating the Able Archer-83 crisis. The role of the commander-in-chief, General Minter, who not only asked Perroots for his recommendation as to which pathway to follow, but also followed this recommendation, has been largely overlooked by scholars. Theoretically, General Minter could have ignored Perroots’s recommendation and chosen to increase the US real force posture. This would have further escalated tensions and introduced the potential outcome of a nuclear first strike from either side. What is most significant about this alternative scenario is that had standard procedures been followed, whereby senior US and NATO officials would have been alerted to the Soviet escalation, the decision-making pathway would have been completely different, and the US and the Soviet Union would potentially have found themselves in an escalatory stand-off in Eastern Europe, with both sides’ nuclear missiles on a hair-trigger alert.
3. US indications and warning system failure
Throughout the escalatory Soviet response to Able Archer-83, the US indications and warning (I&W) systems ‘sounded no alarm bells’ despite the rapid escalation of Soviet forces and missile deployment. The I&W systems constitute a ‘network of intelligence production facilities with analytical resources’ that both produce and disseminate intelligence products within and across commands. While the reasons why the US’s I&W systems failed to signal the heightened state of alert on the part of the Soviet Union have not been explored in detail, Lieutenant General Perroots attributed the error to an electronic miscommunication whereby ‘electrically reported GAMMA material’, or communications intelligence products, was not adequately distributed to those whose need to see the material was the greatest. It is worth noting that Perroots has stated that this error was rectified after Able Archer-83, as it presented a significant failure in the US intelligence cycle.
Despite rigorous testing and planning, technical errors do occur in complex systems, and the cause of such an error may not be human-related. As systems engineers argue, ‘complexity is leading to important system properties (such as safety) not being related to the failure of individual system components but rather to the interactions among the components that have not failed or […] malfunctioned’. This type of incident exemplifies the ‘unknown unknowns’ (see Chapter 2). Preparing for the ‘unknown unknowns’ and embracing uncertainty requires the establishment ahead of time of resilience measures, such as investing in updating and replacing redundant systems, and in the training of staff.
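The sketch below illustrates this property in miniature, loosely modelled on the distribution error described above: two rules that are each correct in isolation interact to deliver a critical report to no one. All names, rules and data here are hypothetical.

```python
# Illustrative sketch: an emergent failure from the interaction of two
# individually 'correct' components. Everything here is hypothetical.

from dataclasses import dataclass

@dataclass
class Report:
    classification: str   # e.g. 'GAMMA' (highly sensitive material)
    indicator: str        # the warning topic the report concerns

def clearance_filter(report, clearances):
    # Component 1, correct in isolation: GAMMA material may only go to
    # recipients holding a GAMMA clearance.
    return {r for r, levels in clearances.items()
            if report.classification != 'GAMMA' or 'GAMMA' in levels}

def topic_router(report, subscriptions):
    # Component 2, also correct in isolation: route each warning topic
    # to the watch positions subscribed to it.
    return {r for r, topics in subscriptions.items()
            if report.indicator in topics}

clearances = {'hq_analyst': ['GAMMA'], 'europe_watch_officer': []}
subscriptions = {'hq_analyst': ['naval_movements'],
                 'europe_watch_officer': ['soviet_alert_posture']}

# The officer who most needs this report is subscribed to the topic but
# lacks the clearance, so the intersection of two sound rules is empty.
report = Report('GAMMA', 'soviet_alert_posture')
recipients = clearance_filter(report, clearances) & topic_router(report, subscriptions)
print(recipients or 'report delivered to no one')  # -> report delivered to no one
```

Neither component has failed, yet the system as a whole silently withholds the warning – the kind of interaction-driven property that component-level testing does not surface.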
As previously stated, the Soviet Union began to escalate its real alert posture after nightfall of the first day of Able Archer-83. However, despite the startling and atypical nature of the Soviet Union’s force escalation and missile deployment, the US I&W system did not raise any alarms. The change in Soviet posture did raise strong concerns within the UK government – specifically on the part of the cabinet secretary, Sir Robert Armstrong, who was alarmed by the adoption of such a military posture by the Soviet Union during a major Soviet national holiday (7 November, the commemoration of the October Revolution of 1917). Armstrong warned that the escalation was unlikely to have arisen from routine Soviet procedure, due to its timing, and that it could instead be a reflection of genuine fear within the Soviet Union that the West was preparing for a first strike. Armstrong’s analysis can be seen as evidence that his perception of the Soviet leadership’s behaviour was based on patterns and past experience, and highlights the role of human cognition in the decision-making process.
Lessons learned
There are several lessons to be learned from the Able Archer-83 case that are valuable in informing future decision-making in the nuclear policy arena.
One of the key lessons learned from the events of Able Archer-83 is the danger of conducting large-scale military exercises in times of heightened tension, often created by hostile rhetoric from leadership. For example, Reagan’s ‘evil empire’ speech of March 1983 and his subsequent labelling in September of the shooting down of Korean Air Lines Flight 007 as ‘an act of barbarism’ are likely to have contributed to the antagonism between the US and Soviet leaderships that provided the hostile context for Able Archer-83 to be misinterpreted. This rhetoric is also likely to have increased the ‘risks of miscalculation, escalation and propensity for considering nuclear response’. It was clear from the events described above that leaders and decision-makers needed to be more conscious of the impact of their rhetoric on heightening tensions, a lesson seemingly learned by Reagan, who initiated a remarkable policy shift, demonstrated by his call for the total elimination of nuclear weapons in early 1984. Thus, one of the key lessons from the Able Archer-83 exercise was the necessity to communicate intent to the adversary ahead of time. This has become even more important in an era when the media is far more pervasive and all types of information are liable to ‘go viral’ on social media platforms. Today, all NATO exercises are declared and strategically communicated to other parties to reduce chances of a misunderstanding.
Furthermore, Able Archer-83, and specifically the role played by Operation RYaN, highlights how confirmation bias can influence intelligence operations and thus the ensuing military or policy responses. Operation RYaN placed the Soviet Union in a defensive posture premised on an inevitable pre-emptive strike by the US; any non-routine elements, whether misinterpreted or not, fed the hypothesis that a first strike was likely and further altered perceptions, increasing the propensity for misinterpretation and miscalculation in the context of a large-scale military operation.
Effective and timely communication, as well as clear messaging on nuclear command and control exercises, is essential for avoiding crises. According to the Soviet defence minister, Dmitry Ustinov, NATO’s military exercises were ‘becoming increasingly difficult to distinguish from a real deployment of armed forces for aggression’. Effective, open and genuine communication channels and regular NATO messaging could have served as a means of mitigating the Soviet misinterpretation of this exercise as a veil for a first-strike attack against the Soviet Union. In 2013 it was revealed that in early 1984, in response to concerns surrounding the inadequate advance notification of NATO exercises, the UK Foreign and Commonwealth Office and Ministry of Defence had drafted a joint paper for discussion with the US that proposed that ‘NATO should inform the Soviet Union on a routine basis of proposed NATO exercise activity involving nuclear play’. Reagan also began to take action to improve communication with the Soviet Union in the aftermath of Able Archer-83, delivering a speech calling for increased dialogue on 16 January 1984.
The documentation now available on Able Archer-83 provides a unique opportunity for scholars to analyse how leaders and decision-makers ‘might not have learned as much from the Cuban missile crisis […] as they should have’. During the course of the Able Archer-83 exercise, and despite the increasingly escalatory posture adopted by the Soviet Union in response, there was again no use of the crisis communication mechanisms that had been established as a result of the Cuban missile crisis, including the hotline between the US and the Soviet Union. Neither the US nor NATO communicated that the exercise was taking place, despite the non-routine elements that risked misinterpretation; nor was the West warned by the Soviet Union of the escalating tensions and the heightened alert status of the latter’s own forces.
Despite the measurable gains in the nuclear decision-making process that have been achieved by using I&W systems, as well as increasingly automated technologies and communication systems in the field of military intelligence, there remains scope for error. The necessity for human supervision is demonstrated by the failure of the US I&W system to accurately provide a timely signal of the heightened state of alert of Soviet forces and missiles. With the increased automation of early-warning systems and of the means by which these warnings are distributed, this lesson is even more critical today.
Finally, the Able Archer-83 exercise also provides lessons on the value of declassified archival material in building an understanding of nuclear decision-making and the likelihood of inadvertent nuclear war. It provides a clear example of the dangers of allowing information about nuclear near-miss incidents to remain secret, as such material can offer valuable further lessons for nuclear policy. Indeed, much of the material related to Able Archer-83 and the decision-making process on both the US/NATO and Soviet Union sides remains classified, significantly hindering decision-makers’ ability to learn from the miscommunications and misinterpretations that took place. Several critical documents, including Lieutenant General Perroots’s end-of-tour addendum and the 1990 report by the President’s Foreign Intelligence Advisory Board, were declassified in 2015 and have since shed valuable light on this nuclear near-miss incident.
Case study 3: the 1995 Norwegian rocket launch
Introduction
The Norwegian rocket launch incident took place on 25 January 1995. Norwegian and US scientists launched a Black Brant XII four-stage sounding rocket from the Norwegian island of Andøya. The rocket was designed to assist with the scientific study of the aurora borealis (Northern Lights) by collecting data on atmospheric conditions at various altitudes. As the launch was a scientific endeavour, it was not covered under the 1988 Ballistic Missile Launch Notification Agreement between the Soviet Union and the US. Thus, the details of the rocket launch were communicated in advance to Norway’s neighbouring states, including Russia, by the Norwegian foreign ministry by means of a letter of notification – however, it is not known whether this information ever reached the relevant Russian authorities.
Upon the launch of the rocket from the Andøya Rocket Range, its radar signature resembled that of a Trident II submarine-launched ballistic missile (SLBM), and it had a higher boost range than previous Norwegian rockets. Thus, the Russian early-warning radar misidentified the rocket as a nuclear-tipped ballistic missile. Several scholars argue that Russian president Boris Yeltsin was notified of the launch ‘within minutes’ and was presented with the Cheget, a connected transmission system in the form of a portable ‘nuclear briefcase’. In fact, it is not at all clear at what stage (i.e. immediately, on the same day, or after the fact) Yeltsin became involved in this incident (see below). This incident attests to the importance of hotline communications – both internal and external – at all levels of decision-making to prevent an inadvertent escalation.
Security environment
This incident occurred in a post-Cold War security environment, a comparatively stable period during which US–Russia relations were relatively amicable. The previous decade in particular had seen both superpowers’ nuclear arsenals reach their peaks, and, as reflected in the two preceding case studies from 1983, tensions had run at unprecedented heights between the US and the Soviet Union. In contrast, the Norwegian rocket launch incident took place in 1995, a few years after the dissolution of the Soviet Union in late December 1991 and the handover of power, and control over the Soviet Union’s nuclear arsenal, to the president of Russia. The decade 1985–95 is even reported to have marked ‘the biggest reduction in the global nuclear stockpile’, partly in concert with the end of the Cold War.
In addition, in the year leading up to the incident, Russia concluded three major bilateral arms-control frameworks. First, in January 1994, Yeltsin and US president Bill Clinton concluded negotiations for a bilateral agreement on mutual de-targeting, which was implemented on 30 May 1994. Then, in February 1994, Russia and the UK announced the conclusion of an agreement whereby the UK would also de-target its nuclear weapons. Finally, in September 1994, China and Russia issued a declaration pledging that they ‘would not be the first to use nuclear weapons against each other and would not target their strategic nuclear weapons at each other’.
Timeline and decision-making
The Norwegian rocket launch incident was reportedly the first time in history when a Russian or Soviet leader had activated the Cheget, the transmission system that would enable the launch of a nuclear attack in response to an alert. President Yeltsin was reportedly ‘notified within minutes of the launch and presented with one of three briefcases used to relay the authorization of a nuclear launch’. Even more than 25 years later, uncertainties persist as to how the Russian authorities responded to the incident in such a short time frame, owing to a scarcity of official documentation. Several experts have indicated, however, that there exist different versions of this incident, one of which suggests that the activation of the Cheget was staged on the day following the launch, specifically for President Yeltsin to display the readiness of his armed forces.
Upon its launch, the rocket’s radar signature resembled that of a US Navy Trident II SLBM. As a result, ‘Russia’s missile warning system [abbreviated in Russian as SRPN] quickly identified the rocket as a nuclear-tipped ballistic missile’. This information was relayed by radar operators at the Olenegorsk early-warning station in the Russian Arctic. According to information which was subsequently leaked, the rocket was of a ‘much larger design than previous versions used by Norway, and it also used the initial stage of a retired US tactical missile […], giving it a much higher boost range’.
The uncertainty surrounding the incident led Russia also to consider the possibility of a surprise attack, for instance in the form of an electromagnetic pulse attack designed to blind and disable Russian radars. Such an attack on the Russian early-warning system could have been the precursor to a full surprise nuclear attack.
An attack on the Kola Peninsula, which hosts Russian nuclear submarines, was also considered as a possibility. In 1993, a US Navy nuclear-powered attack submarine had collided in the Arctic Ocean with a Russian Delta-class submarine, ‘which is normally equipped with 16 ocean-spanning nuclear-tipped missiles’. This could have been a reason why the Olenegorsk radar operators were minded to identify the 1995 rocket launch as a threat and to relay the information to the relevant officers beyond the radar station.
Nevertheless, on this occasion the Russian authorities ultimately decided not to launch a nuclear attack against the US. Primary open-source information relating to the Russian decision-making process at the time of the incident is unfortunately very limited. Using such information as is available, the next section captures the critical nodes and alternative pathways.
Critical nodes and alternative pathways to decision-making
1. The letter of notification from the Norwegian government
Prior to the incident, on 21 December 1994 the Norwegian Ministry of Foreign Affairs had sent letters of notification to neighbouring countries, including Russia, outlining Norway’s intention to launch the scientific research rocket in the period between 15 January and 10 February 1995. The letter provided the location of the rocket’s launch site and the coordinates for its predicted impact areas. Whether or not the letter was received by the relevant Russian authorities is highly contested. On the one hand, there are claims that ‘due to an error at the Russian Foreign Ministry, the alert was never given to the Russian General Staff, or any part of the Russian military’. On the other hand, it was also claimed – notably by the US senator Pat Roberts – that the letter ‘got lost in the mail’, so that neither the radar operators at the Olenegorsk early-warning station nor President Yeltsin were in possession of the information it contained.
2. The Olenegorsk early-warning station
In the 1990s, Russia’s early-warning systems comprised a series of radars and a constellation of satellites, providing uninterrupted coverage of US continental missile fields. At the time of the launch of the Norwegian Black Brant XII rocket, two early-warning satellites – Cosmos-2217 and Cosmos-2261 – provided coverage from highly elliptical orbits. During this incident, the early-warning satellites functioned correctly, yet the initial assessment of the information by human operators was erroneous, since the operators of the early-warning system were not in possession of the information supplied by Norway on the rocket’s launch and intended trajectory.
Early-warning systems are a critical node in many incidents, as they are the first line of defence. In this instance, however, the early-warning system seems to have functioned as intended, in that it alerted the station staff to the rocket launch. Not all incidents are linked to technical errors: some can be attributed to human/operator error (for example, obtaining false readings, or poorly assessing the available data).
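Had the Norwegian notification reached the operators, it could in principle have been cross-checked against the radar track before the launch was classified as hostile. The sketch below illustrates such a cross-check using details drawn from the letter described above (a launch window of 15 January to 10 February 1995, and an approximate launch-site location near Andøya); the function, data structures and tolerances are hypothetical.

```python
# Illustrative cross-check of a detected launch against pre-notified launch
# windows. The registry entry reflects the Norwegian letter described above;
# tolerances and the detection time of day are illustrative assumptions.

from datetime import datetime

notified_launches = [
    {
        'source': 'Norwegian MFA letter of 21 December 1994',
        'window': (datetime(1995, 1, 15), datetime(1995, 2, 10)),
        'launch_site': (69.3, 16.0),   # Andøya Rocket Range (approx. lat/lon)
        'site_tolerance_deg': 1.0,
    },
]

def matching_notification(detection_time, origin):
    """Return the notification matching a detected launch, if any."""
    lat, lon = origin
    for n in notified_launches:
        start, end = n['window']
        site_lat, site_lon = n['launch_site']
        if (start <= detection_time <= end
                and abs(lat - site_lat) <= n['site_tolerance_deg']
                and abs(lon - site_lon) <= n['site_tolerance_deg']):
            return n
    return None

# The 25 January 1995 detection, originating near Andøya:
match = matching_notification(datetime(1995, 1, 25, 7, 0), (69.3, 16.0))
print('pre-notified scientific launch' if match else 'unnotified launch: escalate')
```

The point of the sketch is not the mechanics but the dependency: such a check is only as good as the distribution of the notification itself, which in this case never reached the people operating the radars.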
3. High-level decision-making by Cheget holders
The final critical node pertains to the deliberations made by the three Cheget briefcase holders: the Russian president, the minister of defence and the chief of the General Staff.
There is disagreement among scholars as to whether all three briefcases were needed to issue a nuclear launch order or whether a single briefcase would have been sufficient. To date, moreover, it remains unknown whether launch authority rests solely with the Russian president. Current knowledge of Russian command, control and communication systems derives largely from what is known of their Soviet predecessors.
One analysis argues that for the Russian president to issue a strategic retaliatory launch, the Russian early-warning systems first need to transmit a ‘missile attack signal’. Such a signal needs to be verified by early-warning radars. Once the signal is verified, the same analysis indicates that the president – with the advice of the ministry of defence and the chief of the General Staff – would decide on the course of action to be followed. Thus, there seems to be a fail-safe mechanism embedded into the Russian command and control for cases of retaliatory launch. In the case of delivering a first strike, it is argued that ‘the supreme commander and the minister of defence would order this signal to be generated. This arrangement enables the military leadership to prevent a situation in which the decision to deliver a first strike is made by the supreme commander alone.’
Yet the Russian constitution and the current federal Law on Defence confer ultimate authority over all nuclear-related matters on the Supreme Commander-in-Chief (i.e. the president). It is therefore unclear whether the launch arrangements in today’s Russia have changed. Even if the president has the power to override the system in one way or another, there is a chance – however small in today’s security environment – that the Russian military may not abide by such an order. Retired colonel Valery Yarynich, who had served in the Soviet Strategic Rocket Forces, pointed out in 2003 that: ‘The widely held opinion that the Cheget is the same “nuclear button” with which the president can launch strategic missiles is erroneous. The launch of a missile is impossible without the military, starting with the crews at the command posts of the General Staff. The authorization of the president is no more than the permission and order to launch.’
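The account above can be summarized schematically. The sketch below models the retaliatory-launch flow as described by the cited analysis and by Yarynich – a verified attack signal as precondition, presidential authorization as ‘permission and order’, and military execution as a separate, necessary step. It is a simplified model of those published accounts, not a description of the actual Russian system.

```python
# Schematic model of the retaliatory-launch flow described in the text.
# A simplification of published accounts; not the actual Russian system.

def retaliatory_launch_possible(attack_signal: bool,
                                radar_verified: bool,
                                presidential_order: bool,
                                military_execution: bool) -> bool:
    # Fail-safe 1: no decision is even presented unless the early-warning
    # satellites transmit a 'missile attack signal' that the early-warning
    # radars then verify.
    if not (attack_signal and radar_verified):
        return False
    # Fail-safe 2: per Yarynich, presidential authorization is 'no more than
    # the permission and order to launch'; execution still requires the
    # crews at the command posts of the General Staff.
    return presidential_order and military_execution

# In January 1995 the chain stopped at reassessment of the track: no
# authorization was issued, so no launch could follow.
print(retaliatory_launch_possible(True, True, False, False))  # -> False
```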
Lessons learned
The 1995 Norwegian rocket launch incident provides key observations and lessons, even though it may not in itself be considered to be a ‘threshold raiser’.
The first lesson is that communications, both external and internal, are key to reducing uncertainty and to better navigating complexity, even when the security environment is amicable. In 1987, the US and the Soviet Union agreed to each set up a Nuclear Risk Reduction Center in their respective capital cities (the US centre was later renamed the National and Nuclear Risk Reduction Center – NNRRC) in order to ensure a ‘secure, rapid, and reliable means of communication’. These centres have aimed to exchange notifications with other countries on arms-control-related matters, including ballistic missile launches and international cyber incidents. However, no such system was set up between the Soviet Union and Norway, and no such system exists today between Russia and Norway.