The Underlying Fragility of Decision-Making in Nuclear Policy

Alex Levy
11 min read · Apr 14, 2020

The Cold War* is regarded as one of the most stressful periods in the history of mankind**. Tensions peaked in 1962 with the Cuban Missile Crisis, and the movie Thirteen Days gives an in-depth perspective on how American leaders prevailed in those challenging moments. One can argue that the situation was indeed managed well, since World War III was avoided. This essay, however, explores the underlying fragility of decision-making in nuclear policy.

The structure of the essay is as follows. It starts by defining what exactly is meant by the underlying fragility of decision-making. The analysis works through one scene from Thirteen Days that shows why Western democracies, in this case the United States, had a fragile system in place for what remains the most destructive aspect of warfare. This part of the essay is supported by ideas from Thinking, Fast and Slow by Daniel Kahneman. Lastly, the essay concludes with an overview of the lessons the Cuban Missile Crisis gave the world and the paradox of the underlying fragility of decision-making.

Underlying Fragility

Nick Bostrom, a distinguished philosopher best known for his Simulation Hypothesis[1], wrote a paper called The Vulnerable World Hypothesis[2], arguing that the world may be one technological invention away from complete annihilation. Bostrom frames technological invention metaphorically: imagine humanity drawing white, gray, and black balls from an urn, each color representing the danger of the technology invented. Humanity, he says, has so far extracted ‘white balls’, beneficial technologies such as water purification, and some ‘gray balls’, technologies with mixed blessings.

He then argues that humanity has yet to extract a black ball from the urn, a technology capable of destroying human civilization. By the logic of this premise, one could think of nuclear bombs as ‘gray balls’ extracted from the urn. What is striking about this particular gray ball is how close it has come, accidentally, to eradicating humanity. Before turning to examples, it is time to define the fragility of decision-making. From now on, it will be referred to as the incidents, blunders, and the human factor[3] in nuclear policy.
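To make the metaphor slightly more concrete, here is a minimal, purely illustrative Python sketch. It is not from Bostrom's paper, and the proportions in it are invented for demonstration; it simply treats each invention as a weighted draw from the urn, and its only point is that even a vanishingly small per-draw chance of a black ball accumulates as humanity keeps drawing.

```python
import random

# A minimal, purely illustrative sketch of Bostrom's urn metaphor.
# The proportions below are invented for demonstration only; they do not
# come from Bostrom's paper or from this essay.
URN = {"white": 0.90, "gray": 0.0999, "black": 0.0001}

def draw_invention(rng: random.Random) -> str:
    """Draw one technology ("ball") from the urn, weighted by the assumed proportions."""
    colors, weights = zip(*URN.items())
    return rng.choices(colors, weights=weights, k=1)[0]

def simulate(draws: int, seed: int = 0) -> dict:
    """Count how many draws of each color occur over a given number of inventions."""
    rng = random.Random(seed)
    counts = {color: 0 for color in URN}
    for _ in range(draws):
        counts[draw_invention(rng)] += 1
    return counts

if __name__ == "__main__":
    # Even a tiny per-draw probability of a black ball adds up as humanity keeps drawing.
    print(simulate(draws=10_000))
```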

Baum et al. (2018) offer a historical account of how many times the world has been on the brink of extinction due to incidents[4] in the nuclear side of warfare. One example[5] sheds light on the fragility of decision-making during the Cuban Missile Crisis. Around October 27th, while the quarantine of Cuba was in place, the US Navy tried to force a Soviet submarine (B-59) to the surface using low-risk depth charges[6]. Unknown to the US Navy, the submarine was carrying nuclear weapons. The crew thought war had started, as B-59 had had no contact with Soviet officials in Moscow for two days[7]; the commanding officer, Valentin Savitsky, gave orders to assemble a nuclear torpedo. It was ready to launch when Second Captain Vasili Arkhipov called off the order[8][9].

Some say this was by far one of the most dangerous situations of the entire Cold War[10]. All of the factors, both inside and outside B-59, stacked the odds against peace. Vadim Orlov[11], who served aboard B-59 as a radio operator, said that temperatures inside reached 60 degrees. In addition, the crew had been inside the submarine for three weeks; they had no contact with Soviet officials, and all of the information they had, which Orlov had to distill, came from US radio transmissions[12]. Further, every indication they received suggested that war had started, as charges exploded above them and the US Navy used sonar, a kind of ‘passive torture’, to rattle them.

This case illustrates the fragility of decision-making: the fate of the world was put in the hands of a single person. Luckily for mankind, he was ‘cool headed[13]’. Nevertheless, this is evidently a scenario that cannot be allowed to depend on a single person, and the likelihood that this person would call off arming a weapon on rational grounds was very slim (indeed, inside B-59 we have an example of someone whose decisions ran on emotion: Captain Valentin Savitsky). As Aaron Tovish put it in a Bulletin of the Atomic Scientists article on the Okinawa missiles of October, “…the role of accident and miscalculation…” is substantial in perhaps the most dangerous aspect of warfare. Another dimension of the fragility of decision-making comes from inside the White House, in the form of a heuristic called the availability bias[14].

The availability bias occurs when someone judges an event as likely, or settles on an option, simply because it is easier to recall or reflect upon. In other words, the more one is exposed to a single piece of evidence, the more likely one is to choose it over other, better options (Kahneman, 2011). This bias is related to the familiarity heuristic, whereby people often choose what is familiar to them rather than another option that would yield better results (Begg, Anas, & Farinacci, 1992). A predisposition toward memorable, familiar options can be seen in how President John F. Kennedy chose his cabinet and political advisors. This essay focuses on Kenneth O’Donnell, a close friend of Kennedy who became his political consultant.

The essay won’t discuss O’Donnell’s role in the Cuban Missile Crisis, as many point out that he had no such role at all[15], even though the movie Thirteen Days portrays him as a central figure, tying together Robert Kennedy, the Joint Chiefs of Staff, and the rest of the cabinet. Rather, the focus here is on the underlying fragility of choosing a close friend as a consultant. While on paper familiarity seems like a dangerous bias to have, it carries a big advantage: it lets one make decisions in a comfortable manner. Connecting this with President Kennedy’s choice of political consultants, it implies he had enough confidence to open up to O’Donnell about what he was thinking.

Two big questions arise at this point: 1) When seeking an advisor, who was better suited to guide President Kennedy: someone who had known him for a long time and understood how he thought, or someone who could handle external evidence swiftly yet barely knew him? 2) What should have been regarded as more important during this crisis: the comfort to make decisions, or truthful statements backed by evidence? This essay will not offer a conclusive answer to either question. However, to put everything on the table, according to Begg, Anas, and Farinacci (1992), the familiarity heuristic “…does not provide evidence for truth…” but rather yields “…the attribution of truth to statements that look familiar…”.

Even though most of the literature suggests that both the familiarity and availability heuristics produce worse results than allocating more attention to a choice (i.e. System 2), this essay remains inconclusive about which option was best, given that System 1 decides more quickly, even though it often latches onto the wrong evidence. This sets up a paradox[16]: during a crisis, when decisions have to be made faster, which system has the upper hand? In the midst of an existential crisis, would rigidity offer better outcomes than plasticity[17]? There is an abundance of literature on this dilemma[18][19].

Ultimately, the Cuban Missile Crisis sheds light on the underlying fragility of decision-making. The scope of the problem was evidently enormous. Yet, even with the odds stacked against humanity, decisions were made, and they did not result in a nuclear apocalypse. In spite of this, how should we think about the role that human intuitions, emotions, and heuristics play in nuclear policy? Will humanity extract a white or gray ball from the urn and turn it into a black one through heuristics and faulty decision-making? What if there is no black ball in the urn? Are we willing to test how far humanity can go with failures in decision-making? Or, worse and more paradoxical still: have these blunders saved humanity from extracting a black ball from the urn? Have miscalculations actually turned a black ball into a white or gray one?

The movie Thirteen Days is a thriller-like historical account of what happened during the Cuban Missile Crisis, and it gives plenty to reflect on about the human side of decision-making. Even humanity’s best minds, such as President Kennedy and his cabinet, are prone to blunders, and to this day humanity owes a great debt to a variable impossible to get a hold of: chance. How long will nuclear policy rely on mistakes breaking the right way in make-or-break situations?

Thirteen Days has a remarkable and frequently overlooked scene that can help answer this question: the Joint Chiefs of Staff propose to John F. Kennedy an air strike to remove the missiles followed by an all-out invasion of Cuba. General Curtis LeMay is confident the Soviets won’t retaliate. Needless to say, LeMay has lost hope that nuclear war can be prevented, so he wants to take advantage of his country’s position in order to win it. Afterwards, John F. Kennedy[20] tells Kenny O’Donnell[21]: “I’ll tell you one thing, Kenny. Those brass hats have one big advantage. That is, if we do what they want us to do, none of us are going to be alive to tell them they were wrong.” What a profound thing to say.

*This essay is inspired by something said by Erwin Schrödinger (1887–1961): “The task is, not so much to see what no one has yet seen; but to think what nobody has yet thought, about that which everybody sees.”

**This essay is also inspired by a conversation between Fred Kaplan and Sam Harris in the “Making Sense Podcast”. Thank you, Sam. See the episode here: https://www.youtube.com/watch?v=Y_rH0L9r7wM

[1] In this paper Bostrom argues that reality may in fact be an artificial simulation.

[2] See Bostrom, Nick (2019) https://nickbostrom.com/papers/vulnerable.pdf

[3] This concept includes intuitions, emotions and biases (e.g. when choosing a cabinet, picking people close to the President as advisors rather than judging by expertise). Kahneman and Tversky refer to this bias as the availability bias. This heuristic is covered in depth throughout the essay.

[4] Not all incidents happened during the Cold War.

[5] Due to the page limit of the essay, only one example is discussed. However, many other incidents during the Cuban Missile Crisis are key to understanding the fragility of decision-making, such as the Okinawa missiles of October and the Duluth–Volk Field bear incident. See Baum et al. (2018).

[6] The crew of B-59 had no way of knowing these charges were intended merely as a warning.

[7] Hurricane Ella is said to be responsible for some faulty communications during October.

[8] Vasili Arkhipov had previously served as an officer aboard another submarine and had witnessed an accident with its nuclear reactor. This could explain why he was inclined not to use nuclear weapons, having seen eight men die of radiation poisoning. The exposure he suffered in that accident is said to have contributed to his own death years later. See https://www.pbs.org/video/secrets-dead-man-who-saved-world-full-episode/ for more.

[9] Ryurik Ketov, the commander of another submarine, acknowledged in an interview that submarine commanders had permission to act (i.e. fire a nuclear torpedo) without direct authorization from Moscow. He said: “For the first time in life, a commander of a submarine had a nuclear weapon and had the authority to fire the missile on his command.” See https://www.pbs.org/video/secrets-dead-man-who-saved-world-full-episode/ for more.

[10] One of them was former Secretary of Defense Robert McNamara, who later acknowledged they had less control of the situation than they thought: “We came very close. Closer than we knew at the time.” See https://www.theguardian.com/world/2002/may/19/theobserver

[11] See https://youtu.be/SMqyvxAEeU8 for a press conference with him.

[12] B-59’s radio operator, Vadim Orlov, told a US national archivist that, given the silence from Moscow, the crew listened to American radio broadcasts, specifically from Miami, and that these stations were describing a total mobilization for an invasion of Cuba. Ryurik Ketov reported that “everything [he] knew and everything [he] did was by listening to Kennedy on the radio”. Further, he stated: “We could only guess what was going on by watching what the Americans were doing. . . is there a war? What’s happening?” See https://www.pbs.org/video/secrets-dead-man-who-saved-world-full-episode/ for more.

[13] As Ryurik Ketov put it during an interview for the show “Secrets of the Dead”. See https://www.pbs.org/video/secrets-dead-man-who-saved-world-full-episode/ for more.

[14] See Kahneman, Thinking, Fast and Slow (2011).

[15] Kennedy historian Arthur Schlesinger Jr. told the Boston Globe: “Kenny was an admirable man, but he had nothing to do with the Cuban missile crisis.” See https://archive.seattletimes.com/archive/?date=20010204&slug=odonnell

[16] Needless to say, much more could be covered on this subject; this essay offers only a starting point.

[17] Rigidity and plasticity can here be read as stand-ins for “expertise” and “familiarity”.

[18] Some examples include Soltwisch, B. W. (2015), “The paradox of organizational rigidity: A contingency model for information processing during times of opportunity and threat,” and Sarkar, S., & Osiyevskyy, O. (2018), “Organizational change and rigidity during crisis: A review of the paradox.”

[19] This scenario is fragile because of the aforementioned paradox.

[20] Portrayed by Bruce Greenwood.

[21] Played by Kevin Costner.

Works Cited

Baddeley, Michelle. “Herding, social influences and behavioural bias in scientific research: Simple awareness of the hidden pressures and beliefs that influence our thinking can help to preserve objectivity.” EMBO reports vol. 16,8 (2015): 902–5. doi:10.15252/embr.201540637

Baum, Seth, Robert de Neufville, and Anthony Barrett. “A model for the probability of nuclear war.” Global Catastrophic Risk Institute Working Paper (2018): 18–1.

Begg, Ian Maynard, Ann Anas, and Suzanne Farinacci. “Dissociation of processes in belief: Source recollection, statement familiarity, and the illusion of truth.” Journal of Experimental Psychology: General 121.4 (1992): 446.

Bostrom, Nick. “The vulnerable world hypothesis.” Global Policy (2019).

Donaldson, Roger, director. Thirteen Days. 2000. Film.

Dube‐Rioux, Laurette, and J. Edward Russo. “An availability bias in professional judgment.” Journal of Behavioral Decision Making 1.4 (1988): 223–237.

Finucane, Melissa L., et al. “The affect heuristic in judgments of risks and benefits.” Journal of behavioral decision making13.1 (2000): 1–17.

Kahneman, Daniel, et al., eds. Judgment under uncertainty: Heuristics and biases. Cambridge university press, 1982.

Kahneman, Daniel. Thinking, fast and slow. Macmillan, 2011.

Kaplan, Fred. “Rethinking Nuclear Policy: Taking Stock of the Stockpile.” Foreign Affairs, vol. 95, no. 5, 2016, pp. 18–25., www.jstor.org/stable/43946952 Accessed 21 Mar. 2020.

Kerns, Margi. The man who saved the world. Secrets of the dead. PBS (2012). Retrieved from https://www.pbs.org/video/secrets-dead-man-who-saved-world-full-episode/

Leonard, Mark, and Rob Blackhurst. “I don’t think anybody thought much about whether Agent Orange was against the rules of war.” The Observer, 19 May 2002.

Ringle, Ken. ‘Thirteen Days’ embellishes crisis roles. The Washington Post. (2001) Retrieved from https://archive.seattletimes.com/archive/?date=20010204&slug=odonnell

Sarkar, Soumodip, and Oleksiy Osiyevskyy. “Organizational change and rigidity during crisis: A review of the paradox.” European Management Journal 36.1 (2018): 47–58.

Scharre, Paul. Adversarial Risk: Normal Accidents in Competitive Environments. Center for a New American Security, 2016, pp. 34–37, Autonomous Weapons and Operational Risk: Ethical Autonomy Project, www.jstor.org/stable/resrep06321.9. Accessed 22 Mar. 2020.

Tertrais, Bruno. ““On The Brink” — Really? Revisiting Nuclear Close Calls Since 1945.” The Washington Quarterly 40.2 (2017): 51–66.

Tovish, Aaron. “The Okinawa Missiles of October.” Bulletin of Atomic Scientists (2015). https://thebulletin.org/2015/10/the-okinawa-missiles-of-october/

