Army of None: Autonomous Weapons and the Future of War by Paul Scharre
active measures, Air France Flight 447, algorithmic trading, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, brain emulation, Brian Krebs, cognitive bias, computer vision, cuban missile crisis, dark matter, DARPA: Urban Challenge, DevOps, drone strike, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, facts on the ground, fault tolerance, Flash crash, Freestyle chess, friendly fire, IFF: identification friend or foe, ImageNet competition, Internet of things, Johann Wolfgang von Goethe, John Markoff, Kevin Kelly, Loebner Prize, loose coupling, Mark Zuckerberg, moral hazard, mutually assured destruction, Nate Silver, pattern recognition, Rodney Brooks, Rubik’s Cube, self-driving car, sensor fusion, South China Sea, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Ballmer, Steve Wozniak, Stuxnet, superintelligent machines, Tesla Model S, The Signal and the Noise by Nate Silver, theory of mind, Turing test, universal basic income, Valery Gerasimov, Wall-E, William Langewiesche, Y2K, zero day
Introduction: The Power Over Life and Death 1 shot down a commercial airliner: Thom Patterson, “The downing of Flight 007: 30 years later, a Cold War tragedy still seems surreal,” CNN.com, August 31, 2013, http://www.cnn.com/2013/08/31/us/kal-fight-007-anniversary/index.html. 1 Stanislav Petrov: David Hoffman, “‘I Had a Funny Feeling in My Gut,’” Washington Post, February 10, 1999, http://www.washingtonpost.com/wp-srv/inatl/longterm/coldwar/shatter021099b.htm. 1 red backlit screen: Pavel Aksenov, “Stanislav Petrov: The Man Who May Have Saved the World,” BBC.com, September 26, 2013, http://www.bbc.com/news/world-europe-24280831. 2 five altogether: Ibid. 2 Petrov had a funny feeling: Hoffman, “‘I Had a Funny Feeling in My Gut.’” 2 Petrov put the odds: Aksenov, “Stanislav Petrov: The Man Who May Have Saved the World.” 5 Sixteen nations already have armed drones: The United States, United Kingdom, Israel, China, Nigeria, Iran, Iraq, Jordan, Egypt, United Arab Emirates, Saudi Arabia, Kazakhstan, Turkmenistan, Pakistan, Myanmar, Turkey.
Two hundred and sixty-nine people had been killed, including an American congressman. Fearing retaliation, the Soviet Union was on alert. The Soviet Union deployed a satellite early warning system called Oko to watch for U.S. missile launches. Just after midnight on September 26, the system issued a grave report: the United States had launched a nuclear missile at the Soviet Union. Lieutenant Colonel Stanislav Petrov was on duty that night in bunker Serpukhov-15 outside Moscow, and it was his responsibility to report the missile launch up the chain of command to his superiors. In the bunker, sirens blared and a giant red backlit screen flashed “launch,” warning him of the detected missile, but still Petrov was uncertain. Oko was new, and he worried that the launch might be an error, a bug in the system.
It isn’t hard to imagine future weapons that could outperform humans in discriminating between a person holding a rifle and one holding a rake. Yet computers still fall far short of humans in understanding context and interpreting meaning. AI programs today can identify objects in images, but can’t draw these individual threads together to understand the big picture. Some decisions in war are straightforward. Sometimes the enemy is easily identified and the shot is clear. Some decisions, however, like the one Stanislav Petrov faced, require understanding the broader context. Some situations, like the one my sniper team encountered, require moral judgment. Sometimes doing the right thing entails breaking the rules—what’s legal and what’s right aren’t always the same.

THE DEBATE

Humanity faces a fundamental question: should machines be allowed to make life-and-death decisions in war? Should it be legal? Is it right?
1983: Reagan, Andropov, and a World on the Brink by Taylor Downing
active measures, anti-communist, Ayatollah Khomeini, Berlin Wall, cuban missile crisis, Donald Trump, Fall of the Berlin Wall, full employment, kremlinology, Mikhail Gorbachev, mutually assured destruction, nuclear paranoia, nuclear winter, RAND corporation, Robert Hanssen: Double agent, Ronald Reagan, Ronald Reagan: Tear down this wall, Stanislav Petrov, Vladimir Vetrov: Farewell Dossier, Yom Kippur War
An American Minuteman missile launched from the Midwest would take around thirty-five minutes to reach its target in the Soviet Union, travelling at about four miles per second. Serpukhov-15 could provide around twelve minutes’ additional warning time of an attack over the radars stationed along the northern fringes of the Soviet Union. These extra minutes would be vital in giving the leadership time to respond appropriately and to get to a shelter.4 At 7 p.m. on Monday, 26 September 1983, Lieutenant-Colonel Stanislav Petrov arrived at the compound’s command centre, having been asked at the last minute to take over a shift because another officer had reported in sick. At the centre he spent an hour talking with colleagues whose shift was finishing, receiving the latest reports and updates from them. Then Petrov ordered his team of twelve specialist officers to salute the Soviet flag and take up their positions in the control room.
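As a back-of-the-envelope check, the flight-time and speed figures quoted in the passage above are mutually consistent. A minimal sketch, using only the round numbers given in the text:

```python
# Sanity check of the quoted figures: ~35 minutes of flight at
# ~4 miles per second should cover an intercontinental distance.
flight_time_s = 35 * 60      # thirty-five minutes, in seconds
speed_mi_per_s = 4           # about four miles per second

range_mi = flight_time_s * speed_mi_per_s
print(range_mi)  # 8400 miles, consistent with a Midwest-to-USSR trajectory
```

The satellite system's twelve extra minutes thus bought back roughly a third of the total flight time compared with relying on the northern radar line alone.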
By the time the first journalists got to Petrov he was living a wretched life in a ghastly tower block in a Moscow suburb. But on the black and white television set in his tiny apartment there was a small statue of a globe resting inside an open hand which Petrov proudly pointed out to visitors. It was inscribed with a few words from Kofi Annan, the Secretary General of the United Nations: ‘To Stanislav Petrov, The Man Who Saved the World’. By the autumn of 1983, the two superpowers possessed about 18,400 nuclear warheads. They were on the tips of giant intercontinental missiles held in silos spread right across North America and the Soviet Union. Submarines carrying nuclear missiles crossed every ocean, and heavy bombers fully armed with nuclear bombs were constantly on stand-by. In addition there were smaller tactical nuclear weapons lined up for use in a conventional war along every front line of the Cold War.
Such an attack would prove to be an ‘annihilating retaliatory nuclear strike’.7 This meant that Washington and many other major cities would be hit in the first wave of a Soviet nuclear assault. In one form of logic, when time was crucial and every second counted, this made good sense. But it also made the possibility of miscalculation even greater. Only six weeks before, the Soviet early warning system had malfunctioned by interpreting reflections of the sun on clouds in the Midwest of the United States as a sign that missiles had been launched. On that occasion the cool head of Stanislav Petrov had defused the situation, but there was no guarantee that the officer on duty would respond in the same way to another early warning. The entire Soviet nuclear launch system was resting on a knife edge. Either man or machine could interpret a situation wrongly and the consequence would be nuclear war by miscalculation. When on 8 November NATO changed its top secret codes during Able Archer 83, the Soviet leaders felt confident that this could no longer be a war game.
Who Rules the World? by Noam Chomsky
"Robert Solow", Albert Einstein, anti-communist, Ayatollah Khomeini, Berlin Wall, Bretton Woods, British Empire, capital controls, corporate governance, corporate personhood, cuban missile crisis, deindustrialization, Donald Trump, Doomsday Clock, Edward Snowden, en.wikipedia.org, facts on the ground, failed state, Fall of the Berlin Wall, Howard Zinn, illegal immigration, Intergovernmental Panel on Climate Change (IPCC), invisible hand, liberation theology, Malacca Straits, Martin Wolf, Mikhail Gorbachev, Monroe Doctrine, Nelson Mandela, nuclear winter, Occupy movement, oil shale / tar sands, one-state solution, Plutonomy: Buying Luxury, Explaining Global Imbalances, precariat, Ralph Waldo Emerson, Ronald Reagan, South China Sea, Stanislav Petrov, structural adjustment programs, The Wealth of Nations by Adam Smith, Thorstein Veblen, too big to fail, trade route, union organizing, uranium enrichment, wage slave, WikiLeaks, working-age population
The exercises “almost became a prelude to a preventative nuclear strike,” according to an account in the Journal of Strategic Studies.10 It was even more dangerous than that, as we learned in the fall of 2013, when the BBC reported that right in the midst of these world-threatening developments, Russia’s early-warning systems detected an incoming missile strike from the United States, sending its nuclear system onto the highest-level alert. The protocol for the Soviet military was to retaliate with a nuclear attack of its own. Fortunately, the officer on duty, Stanislav Petrov, decided to disobey orders and not report the warnings to his superiors. He received an official reprimand. And thanks to his dereliction of duty, we’re still alive to talk about it.11 The security of the population was no more a high priority for Reagan administration planners than for their predecessors. And so it continues to the present, even putting aside the numerous near-catastrophic nuclear accidents that have occurred over the years, many reviewed in Eric Schlosser’s chilling study Command and Control.12 In other words, it is hard to contest General Butler’s conclusions.
The Pentagon’s risky measures included sending U.S. strategic bombers over the North Pole to test Soviet radar, and naval exercises in wartime approaches to the USSR where U.S. warships had previously not entered. Additional secret operations simulated surprise naval attacks on Soviet targets.”11 We now know that the world was saved from likely nuclear destruction in those frightening days by the decision of a Russian officer, Stanislav Petrov, not to transmit to higher authorities the report of automated detection systems that the USSR was under missile attack. Accordingly, Petrov takes his place alongside Russian submarine commander Vasili Arkhipov, who, at a dangerous moment of the 1962 Cuban Missile Crisis, refused to authorize the launching of nuclear torpedoes when the subs were under attack by U.S. destroyers enforcing a quarantine.
Fischer, “A Cold War Conundrum: The 1983 Soviet War Scare,” Center for the Study of Intelligence, 7 July 2008, https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/a-cold-war-conundrum/source.htm; Dmitry Dima Adamsky, “The 1983 Nuclear Crisis—Lessons for Deterrence Theory and Practice,” Journal of Strategic Studies 36, no. 1 (2013): 4–41. 11. Pavel Aksenov, “Stanislav Petrov: The Man Who May Have Saved the World,” BBC News Europe, 26 September 2013, http://www.bbc.com/news/world-europe-24280831. 12. Eric Schlosser, Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety (New York: Penguin, 2013). 13. President Bill Clinton, Speech before the UN General Assembly, 27 September 1993, http://www.state.gov/p/io/potusunga/207375.htm; Secretary of Defense William Cohen, Annual Report to the President and Congress: 1999 (Washington, DC: Department of Defense, 1999), http://history.defense.gov/Portals/70/Documents/annual_reports/1999_DoD_AR.pdf. 14.
Hello World: Being Human in the Age of Algorithms by Hannah Fry
23andMe, 3D printing, Air France Flight 447, Airbnb, airport security, augmented reality, autonomous vehicles, Brixton riot, chief data officer, computer vision, crowdsourcing, DARPA: Urban Challenge, Douglas Hofstadter, Elon Musk, Firefox, Google Chrome, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, John Markoff, Mark Zuckerberg, meta analysis, meta-analysis, pattern recognition, Peter Thiel, RAND corporation, ransomware, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, Shai Danziger, Silicon Valley, Silicon Valley startup, Snapchat, speech recognition, Stanislav Petrov, statistical model, Stephen Hawking, Steven Levy, Tesla Model S, The Wisdom of Crowds, Thomas Bayes, Watson beat the top human players on Jeopardy!, web of trust, William Langewiesche
Or perhaps, as in some machine-learning techniques, following the logical process inside the algorithm just isn’t possible. There will be times when we have to hand over control to the unknown, even while knowing that the algorithm is capable of making mistakes. Times when we are forced to weigh up our own judgement against that of the machine. When, if we decide to trust our instincts instead of its calculations, we’re going to need rather a lot of courage in our convictions.

When to over-rule

Stanislav Petrov was a Russian military officer in charge of monitoring the nuclear early warning system protecting Soviet airspace. His job was to alert his superiors immediately if the computer indicated any sign of an American attack.34 Petrov was on duty on 26 September 1983 when, shortly after midnight, the sirens began to howl. This was the alert that everyone dreaded. Soviet satellites had detected an enemy missile headed for Russian territory.
Stanley, Pitfalls of Artificial Intelligence Decision-making. 30. Ibid. 31. Ibid. 32. Ibid. 33. Ibid. 34. Kristine Phillips, ‘The former Soviet officer who trusted his gut – and averted a global nuclear catastrophe’, Washington Post, 18 Sept. 2017, https://www.washingtonpost.com/news/retropolis/wp/2017/09/18/the-former-soviet-officer-who-trusted-his-gut-and-averted-a-global-nuclear-catastrophe/?utm_term=.6546e0f06cce. 35. Pavel Aksenov, ‘Stanislav Petrov: the man who may have saved the world’, BBC News, 26 Sept. 2013, http://www.bbc.co.uk/news/world-europe-24280831. 36. Ibid. 37. Stephen Flanagan, Re: Accident at Smiler Rollercoaster, Alton Towers, 2 June 2015: Expert’s Report, prepared at the request of the Health and Safety Executive, Oct. 2015, http://www.chiark.greenend.org.uk/~ijackson/2016/Expert%20witness%20report%20from%20Steven%20Flanagan.pdf. 38.
The Dead Hand: The Untold Story of the Cold War Arms Race and Its Dangerous Legacy by David Hoffman
active measures, anti-communist, banking crisis, Berlin Wall, Chuck Templeton: OpenTable:, crony capitalism, cuban missile crisis, failed state, joint-stock company, Kickstarter, Mikhail Gorbachev, mutually assured destruction, nuclear winter, Robert Hanssen: Double agent, rolodex, Ronald Reagan, Ronald Reagan: Tear down this wall, Silicon Valley, standardized shipping container, Stanislav Petrov, Thomas L Friedman, undersea cable, uranium enrichment, Vladimir Vetrov: Farewell Dossier, zero-sum game
Nunn and Lugar, interviews with author after visit to the facility, Aug. 31, 2007. 10 The Cooperative Threat Reduction programs were a mere .07 percent of the Defense Department's overall budget request for fiscal year 2009, 3.86 percent of the Energy Department's request and .8 percent of the State Department's request. See Bunn, p. 116. 11 Valentin Yevstigneev, interview, Feb. 10, 2005. Yevstigneev's comment repeated the claim made in an article published May 23, 2001, in the Russian newspaper Nezavisimaya Gazeta. Stanislav Petrov, the general in charge of chemical weapons, was a coauthor. The piece claimed the Sverdlovsk anthrax outbreak was the result of "subversive activity" against the Soviet Union. Stanislav Petrov et al., "Biologicheskaya Diversia Na Urale" [Biological Sabotage in the Urals], NG, May 23, 2001. 12 The closed military facilities are: the Scientific-Research Institute of Microbiology of the Ministry of Defense of the Russian Federation, Kirov, which is the main biological weapons facility of the military; the Virology Center of the Scientific-Research Institute of Microbiology of the Ministry of Defense, Sergiev Posad; and the Department of Military Epidemiology of the Scientific Research Institute of Microbiology of the Ministry of Defense, Yekaterinburg. 13 When the United States and Russia signed the Chemical Weapons Convention in 1997 they promised to destroy stocks of chemical weapons by 2012.
In public, he was an official of the state, and loyal to the official story. But he also gave the pathologists a private signal to hide and protect their evidence. Nikiforov later died of a heart attack. "We are certain that he knew the truth," Grinberg said.9 But the people of the Soviet Union and the outside world did not. II. Night Watch for Nuclear War The shift change began at 7 P.M. on September 26, 1983. Stanislav Petrov, a lieutenant colonel, arrived at Serpukhov-15, south of Moscow, a top-secret missile attack early-warning station, which received signals from satellites. Petrov changed from street clothes into the soft uniform of the military space troops of the Soviet Union. Over the next hour, he and a dozen other specialists asked questions of the outgoing officers. Then his men lined up two rows deep and reported for duty to Petrov.
Little is known about the disposition in Russia of the thousands of tactical nuclear weapons removed from Eastern Europe and the former Soviet republics after the 1991 Bush-Gorbachev initiative. They may be in storage or deployed; they have never been covered by any treaty, nor any verification regime, and the loss of just one could be catastrophic.2 Another step would be to take the remaining strategic nuclear weapons off launch-ready alert. When Stanislav Petrov faced the false alarm in 1983, launch decisions had to be made in just minutes. Today, Russia is no longer the ideological or military threat the Soviet Union once was; nor does the United States pose such a threat to Russia. Americans invested much time and effort to assist Russia's leap to capitalism in the 1990s--should we aim our missiles now at the very stock markets in Moscow we helped design?
The AI Economy: Work, Wealth and Welfare in the Robot Age by Roger Bootle
"Robert Solow", 3D printing, agricultural Revolution, AI winter, Albert Einstein, anti-work, autonomous vehicles, basic income, Ben Bernanke: helicopter money, Bernie Sanders, blockchain, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chris Urmson, computer age, conceptual framework, corporate governance, correlation does not imply causation, creative destruction, David Ricardo: comparative advantage, deindustrialization, deskilling, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, facts on the ground, financial intermediation, full employment, future of work, income inequality, income per capita, industrial robot, Internet of things, invention of the wheel, Isaac Newton, James Watt: steam engine, Jeff Bezos, job automation, job satisfaction, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Joseph Schumpeter, Kevin Kelly, license plate recognition, Marc Andreessen, Mark Zuckerberg, market bubble, mega-rich, natural language processing, Network effects, new economy, Nicholas Carr, Paul Samuelson, Peter Thiel, positional goods, quantitative easing, RAND corporation, Ray Kurzweil, Richard Florida, ride hailing / ride sharing, rising living standards, road to serfdom, Robert Gordon, Robert Shiller, Robert Shiller, Second Machine Age, secular stagnation, self-driving car, Silicon Valley, Simon Kuznets, Skype, social intelligence, spinning jenny, Stanislav Petrov, Stephen Hawking, Steven Pinker, technological singularity, The Future of Employment, The Wealth of Nations by Adam Smith, Thomas Malthus, trade route, universal basic income, US Airways Flight 1549, Vernor Vinge, Watson beat the top human players on Jeopardy!, We wanted flying cars, instead we got 140 characters, wealth creators, winner-take-all economy, Y2K, Yogi Berra
At a time when tensions were already high because of the shooting down by the Soviet Union of a Korean aircraft, killing 269 passengers, a Soviet early-warning system reported the launch by the USA of five missiles at the Soviet Union. The Soviet officer responsible for this early-warning system, Stanislav Petrov, had minutes to decide what to do. The protocol required that he should report this as a nuclear attack. Instead, he relied on his gut instinct. If America was really launching a nuclear attack, he reasoned, why had it sent only five missiles? So, he decided that it was a false alarm and took no action. He was right. It turned out that a Soviet satellite had mistaken the sun’s reflections off cloud tops for flares from rocket engines. It is now widely acknowledged that Petrov’s judgment saved the world from nuclear catastrophe. For a story about saving the world from a real disaster, this one has a bitter ending. Stanislav Petrov was sacked for disobeying orders and lived the rest of his life in drab obscurity.
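Petrov's "why only five missiles?" intuition is, in effect, base-rate reasoning: a first strike was so improbable to begin with that even a confident-looking alarm should not move the needle much. A minimal Bayesian sketch; every probability below is invented for illustration, since none of the sources quote numbers:

```python
# Toy Bayes'-rule illustration of Petrov's reasoning.
# All three inputs are hypothetical, chosen only to show the shape
# of the argument, not to model the actual 1983 incident.
p_attack = 1e-6                  # prior: a US first strike on any given night
p_alarm_given_attack = 0.99      # a real attack would almost surely trigger the alarm
p_alarm_given_no_attack = 1e-3   # a brand-new satellite system might false-alarm

# P(attack | alarm) by Bayes' rule
numerator = p_alarm_given_attack * p_attack
denominator = numerator + p_alarm_given_no_attack * (1 - p_attack)
posterior = numerator / denominator
print(round(posterior, 4))  # ~0.001: even after the alarm, attack remains unlikely
```

With these (assumed) numbers, the alarm raises the probability of an attack a thousandfold, yet it still ends up around one in a thousand, because false alarms from a new system swamp the tiny prior. That is the logic behind trusting a hunch over a flashing screen.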
Surviving AI: The Promise and Peril of Artificial Intelligence by Calum Chace
"Robert Solow", 3D printing, Ada Lovelace, AI winter, Airbnb, artificial general intelligence, augmented reality, barriers to entry, basic income, bitcoin, blockchain, brain emulation, Buckminster Fuller, cloud computing, computer age, computer vision, correlation does not imply causation, credit crunch, cryptocurrency, cuban missile crisis, dematerialisation, discovery of the americas, disintermediation, don't be evil, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, everywhere but in the productivity statistics, Flash crash, friendly AI, Google Glasses, hedonic treadmill, industrial robot, Internet of things, invention of agriculture, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, life extension, low skilled workers, Mahatma Gandhi, means of production, mutually assured destruction, Nicholas Carr, pattern recognition, peer-to-peer, peer-to-peer model, Peter Thiel, Ray Kurzweil, Rodney Brooks, Second Machine Age, self-driving car, Silicon Valley, Silicon Valley ideology, Skype, South Sea Bubble, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Jobs, strong AI, technological singularity, The Future of Employment, theory of mind, Turing machine, Turing test, universal basic income, Vernor Vinge, wage slave, Wall-E, zero-sum game
Most people are aware that the world came close to this annihilation during the Cuban missile crisis in 1962; fewer know that we have also come close to a similar fate another four times since then, in 1979, 1980, 1983 and 1995. (52) In 1962 and 1983 we were saved by individual Soviet military officers who decided not to follow prescribed procedure. Today, while the world hangs on every utterance of Justin Bieber and the Kardashian family, relatively few of us even know the names of Vasili Arkhipov and Stanislav Petrov, two men who quite literally saved the world. Perhaps this survival illustrates our ingenuity. There was an ingenious logic in the repellent but effective doctrine of mutually assured destruction (MAD). More likely we have simply been lucky. We have time to rise to the challenge of superintelligence – probably a few decades. However, it would be unwise to rely on that period of grace: a sudden breakthrough in machine learning or cognitive neuroscience could telescope the timing dramatically, and it is worth bearing in mind the powerful effect of exponential growth in the computing resource which underpins AI research and a lot of research in other fields too.

9.7 – It’s time to talk

What we need now is a serious, reasoned debate about superintelligence – a debate which avoids the twin perils of complacency and despair.
Click Here to Kill Everybody: Security and Survival in a Hyper-Connected World by Bruce Schneier
23andMe, 3D printing, autonomous vehicles, barriers to entry, bitcoin, blockchain, Brian Krebs, business process, cloud computing, cognitive bias, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, cuban missile crisis, Daniel Kahneman / Amos Tversky, David Heinemeier Hansson, Donald Trump, drone strike, Edward Snowden, Elon Musk, fault tolerance, Firefox, Flash crash, George Akerlof, industrial robot, information asymmetry, Internet of things, invention of radio, job automation, job satisfaction, John Markoff, Kevin Kelly, license plate recognition, loose coupling, market design, medical malpractice, Minecraft, MITM: man-in-the-middle, move fast and break things, move fast and break things, national security letter, Network effects, pattern recognition, profit maximization, Ralph Nader, RAND corporation, ransomware, Rodney Brooks, Ross Ulbricht, security theater, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart transportation, Snapchat, Stanislav Petrov, Stephen Hawking, Stuxnet, The Market for Lemons, too big to fail, Uber for X, Unsafe at Any Speed, uranium enrichment, Valery Gerasimov, web application, WikiLeaks, zero day
Future of Life Institute (1 Feb 2016), “Accidental nuclear war: A timeline,” https://futureoflife.org/background/nuclear-close-calls-a-timeline. 95 The Cuban Missile Crisis is probably: Benjamin Schwarz (1 Jan 2013), “The real Cuban missile crisis,” Atlantic, https://www.theatlantic.com/magazine/archive/2013/01/the-real-cuban-missile-crisis/309190. 95 although the 1983 false alarm is a close second: Sewell Chan (18 Sep 2017), “Stanislav Petrov, Soviet officer who helped avert nuclear war,” New York Times, https://www.nytimes.com/2017/09/18/world/europe/stanislav-petrov-nuclear-war-dead.html. 95 although much less damaging than: Laura Geggel (9 Feb 2016), “The odds of dying,” Live Science, https://www.livescience.com/3780-odds-dying.html. 95 But instead of regarding it as: As amazing as it seems today, immediately after 9/11, people actually believed that terrorist attacks of that magnitude would happen every few months.
Because We Say So by Noam Chomsky
Affordable Care Act / Obamacare, American Legislative Exchange Council, Chelsea Manning, cuban missile crisis, David Brooks, drone strike, Edward Snowden, Intergovernmental Panel on Climate Change (IPCC), Julian Assange, Malacca Straits, Martin Wolf, means of production, Monroe Doctrine, Nelson Mandela, Occupy movement, oil shale / tar sands, Powell Memorandum, Ralph Waldo Emerson, RAND corporation, Slavoj Žižek, Stanislav Petrov, Thorstein Veblen, too big to fail, uranium enrichment, WikiLeaks
The NATO exercise “almost became a prelude to a preventative [Russian] nuclear strike,” according to an account last year by Dmitry Adamsky in the JOURNAL OF STRATEGIC STUDIES. Nor was this the only close call. In September 1983, Russia’s early-warning systems registered an incoming missile strike from the United States and sent the highest-level alert. The Soviet military protocol was to retaliate with a nuclear attack of its own. The Soviet officer on duty, Stanislav Petrov, intuiting a false alarm, decided not to report the warnings to his superiors. Thanks to his dereliction of duty, we’re alive to talk about the incident. Security of the population was no more a high priority for Reagan planners than for their predecessors. Such heedlessness continues to the present, even putting aside the numerous near-catastrophic accidents, reviewed in a chilling new book, COMMAND AND CONTROL: NUCLEAR WEAPONS, THE DAMASCUS ACCIDENT, AND THE ILLUSION OF SAFETY, by Eric Schlosser.
On the Future: Prospects for Humanity by Martin J. Rees
23andMe, 3D printing, air freight, Alfred Russel Wallace, Asilomar, autonomous vehicles, Benoit Mandelbrot, blockchain, cryptocurrency, cuban missile crisis, dark matter, decarbonisation, demographic transition, distributed ledger, double helix, effective altruism, Elon Musk, en.wikipedia.org, global village, Hyperloop, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Jeff Bezos, job automation, Johannes Kepler, John Conway, life extension, mandelbrot fractal, mass immigration, megacity, nuclear winter, pattern recognition, quantitative hedge fund, Ray Kurzweil, Rodney Brooks, Search for Extraterrestrial Intelligence, sharing economy, Silicon Valley, smart grid, speech recognition, Stanford marshmallow experiment, Stanislav Petrov, stem cell, Stephen Hawking, Steven Pinker, Stuxnet, supervolcano, technological singularity, the scientific method, Tunguska event, uranium enrichment, Walter Mischel, Yogi Berra
Arkhipov held out against such action—and thereby avoided triggering a nuclear exchange that could have escalated catastrophically. Post-Cuba assessments suggest that the annual risk of thermonuclear destruction during the Cold War was about ten thousand times higher than the mean death rate from asteroid impact. And indeed, there were other ‘near misses’ when catastrophe was only avoided by a thread. In 1983 Stanislav Petrov, a Russian Air Force officer, was monitoring a screen when an ‘alert’ indicated that five Minuteman intercontinental ballistic missiles had been launched by the United States towards the Soviet Union. Petrov’s instructions, when this happened, were to alert his superior (who could, within minutes, trigger nuclear retaliation). He decided, on no more than a hunch, to ignore what he’d seen on the screen, guessing it was a malfunction in the early warning system.
The Precipice: Existential Risk and the Future of Humanity by Toby Ord
3D printing, agricultural Revolution, Albert Einstein, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, availability heuristic, Columbian Exchange, computer vision, cosmological constant, cuban missile crisis, decarbonisation, defense in depth, delayed gratification, demographic transition, Doomsday Clock, Drosophila, effective altruism, Elon Musk, Ernest Rutherford, global pandemic, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, James Watt: steam engine, Mark Zuckerberg, mass immigration, meta analysis, meta-analysis, Mikhail Gorbachev, mutually assured destruction, Nash equilibrium, Norbert Wiener, nuclear winter, p-value, Peter Singer: altruism, planetary scale, race to the bottom, RAND corporation, Ronald Reagan, self-driving car, Stanislav Petrov, Stephen Hawking, Steven Pinker, Stewart Brand, supervolcano, survivorship bias, the scientific method, uranium enrichment
When Soviet Premier Brezhnev found out, he asked President Carter “What kind of mechanism is it which allows a possibility of such incidents?”27 Autumn Equinox Incident: September 26, 1983 Shortly after midnight, in a period of heightened tensions, the screens at the command bunker for the Soviet satellite-based early-warning system showed five ICBMs launching from the United States.28 The duty officer, Stanislav Petrov, had instructions to report any detected launch to his superiors, who had a policy of immediate nuclear retaliatory strike. For five tense minutes he considered the case, then despite his remaining uncertainty, reported it to his commanders as a false alarm. He reasoned that a US first strike with just the five missiles shown was too unlikely and noted that the missiles’ vapor trails could not be identified.
“Accelerated Modern Human-Induced Species Losses: Entering the Sixth Mass Extinction.” Science Advances, 1(5), e1400253. Cederman, L.-E. (2003). “Modeling the Size of Wars: From Billiard Balls to Sandpiles.” The American Political Science Review, 97(1), 135–50. Challinor, A. J., et al. (2014). “A Meta-Analysis of Crop Yield under Climate Change and Adaptation.” Nature Climate Change, 4(4), 287–91. Chan, S. (September 18, 2017). “Stanislav Petrov, Soviet Officer Who Helped Avert Nuclear War, Is Dead at 77.” The New York Times. Chapman, C. R. (2004). “The Hazard of Near-Earth Asteroid Impacts on Earth.” Earth And Planetary Science Letters, 222(1), 1–15. Charney, J. G., et al. (1979). “Carbon Dioxide and Climate: A Scientific Assessment.” National Academy of Sciences. Chen, Z.-Q., and Benton, M. J. (2012). “The Timing and Pattern of Biotic Recovery Following the End-Permian Mass Extinction.”
On Power and Ideology by Noam Chomsky
anti-communist, Ayatollah Khomeini, Berlin Wall, British Empire, cuban missile crisis, feminist movement, imperial preference, land reform, Mikhail Gorbachev, Monroe Doctrine, RAND corporation, Ronald Reagan, Stanislav Petrov, union organizing
A very detailed recent study based on extensive U.S. and Russian intelligence records concludes that “the War Scare Was for Real,” and that U.S. intelligence may have underestimated Russian concerns and the threat of a Russian preventive nuclear strike. In September 2013, the BBC reported that during this dangerous period, Russia’s early-warning systems detected an incoming missile strike from the United States, sending the highest-level alert. The protocol for the Soviet military was to retaliate with a nuclear attack of its own. The officer on duty, Stanislav Petrov, decided to disobey orders and not report the warnings to his superiors. Thanks to his dereliction of duty, we are alive to reflect on the black swan we prefer not to see. Other studies reveal a shocking array of close calls, even apart from the “most dangerous moment in history” during the Cuban missile crisis of 1962. An enormous gap in these lectures, not appreciated at the time, was that another and even more ominous threat was inexorably advancing: environmental catastrophe.
Doing Good Better: How Effective Altruism Can Help You Make a Difference by William MacAskill
barriers to entry, basic income, Black Swan, Branko Milanovic, Cal Newport, Capital in the Twenty-First Century by Thomas Piketty, carbon footprint, clean water, corporate social responsibility, correlation does not imply causation, Daniel Kahneman / Amos Tversky, David Brooks, effective altruism, en.wikipedia.org, end world poverty, experimental subject, follow your passion, food miles, immigration reform, income inequality, index fund, Intergovernmental Panel on Climate Change (IPCC), Isaac Newton, job automation, job satisfaction, Lean Startup, M-Pesa, mass immigration, meta analysis, meta-analysis, microcredit, Nate Silver, Peter Singer: altruism, purchasing power parity, quantitative trading / quantitative ﬁnance, randomized controlled trial, self-driving car, Skype, Stanislav Petrov, Steve Jobs, Steve Wozniak, Steven Pinker, The Future of Employment, The Wealth of Nations by Adam Smith, universal basic income, women in the workforce
To Toby Ord, Peter Singer, and Stanislav Petrov, without whom this book would not have been written.
The Doomsday Calculation: How an Equation That Predicts the Future Is Transforming Everything We Know About Life and the Universe by William Poundstone
Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, Arthur Eddington, Bayesian statistics, Benoit Mandelbrot, Berlin Wall, bitcoin, Black Swan, conceptual framework, cosmic microwave background, cosmological constant, cosmological principle, cuban missile crisis, dark matter, digital map, discounted cash flows, Donald Trump, Doomsday Clock, double helix, Elon Musk, Gerolamo Cardano, index fund, Isaac Newton, Jaron Lanier, Jeff Bezos, John Markoff, John von Neumann, mandelbrot fractal, Mark Zuckerberg, Mars Rover, Peter Thiel, Pierre-Simon Laplace, probability theory / Blaise Pascal / Pierre de Fermat, RAND corporation, random walk, Richard Feynman, ride hailing / ride sharing, Rodney Brooks, Ronald Reagan, Ronald Reagan: Tear down this wall, Sam Altman, Schrödinger's Cat, Search for Extraterrestrial Intelligence, self-driving car, Silicon Valley, Skype, Stanislav Petrov, Stephen Hawking, strong AI, Thomas Bayes, Thomas Malthus, time value of money, Turing test
One honors Vasili Arkhipov, who was almost literally Brandon Carter’s philosophic nuclear submariner. During the height of the Cuban missile crisis one Russian submarine remained submerged and out of radio contact with Moscow. The sub’s captain assumed that war had broken out and decided to launch nuclear torpedoes. To do so he needed the authorization of two ranking officers. One agreed. The other was Vasili Arkhipov. His veto prevented World War III. The other room commemorates Stanislav Petrov, who chose not to initiate a nuclear attack when his computer screen showed five US missiles approaching the Soviet Union in 1983. Petrov reasoned that the United States would not strike Russia with a mere five missiles; therefore it must be a computer glitch. It was. The Future of Humanity Institute is an expression of a global movement. There are comparable think tanks on both sides of the Atlantic.
Robot Rules: Regulating Artificial Intelligence by Jacob Turner
Ada Lovelace, Affordable Care Act / Obamacare, AI winter, algorithmic trading, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, autonomous vehicles, Basel III, bitcoin, blockchain, brain emulation, Clapham omnibus, cognitive dissonance, corporate governance, corporate social responsibility, correlation does not imply causation, crowdsourcing, distributed ledger, don't be evil, Donald Trump, easy for humans, difficult for computers, effective altruism, Elon Musk, financial exclusion, financial innovation, friendly fire, future of work, hive mind, Internet of things, iterative process, job automation, John Markoff, John von Neumann, Loebner Prize, medical malpractice, Nate Silver, natural language processing, nudge unit, obamacare, off grid, pattern recognition, Peace of Westphalia, race to the bottom, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stanislav Petrov, Stephen Hawking, Steve Wozniak, strong AI, technological singularity, Tesla Model S, The Coming Technological Singularity, The Future of Employment, The Signal and the Noise by Nate Silver, Turing test, Vernor Vinge
Setting the limits of what roles AI should be allowed to fulfil is an emotive topic: many people fear delegating tasks and functions to an unpredictable entity which they cannot fully understand. These issues raise fundamental questions about humanity’s relationship with AI: Why do we harbour concerns about giving up control? Can we strike a balance between AI effectiveness and human oversight? Will fools rush in where AIs fear to tread? 4.2 Why Might We Need Laws of Limitation? In September 2017, Stanislav Petrov died alone and destitute in an unremarkable Moscow suburb. His inauspicious death belied the pivotal role he played one night in 1983 when he was the duty officer in a secret command centre tasked with detecting nuclear attacks on the USSR by America. Petrov’s computer screen showed five intercontinental ballistic missiles heading towards the USSR. The standard protocol was to launch a retaliatory strike before the American missiles landed: thereby triggering the world’s first—and potentially last—nuclear conflict.
The Cold War by Robert Cowley
anti-communist, Berlin Wall, British Empire, cuban missile crisis, defense in depth, Dissolution of the Soviet Union, Doomsday Clock, friendly fire, Henry Ford's grandson gave labor union leader Walter Reuther a tour of the company’s new, automated factory…, means of production, Mikhail Gorbachev, mutually assured destruction, RAND corporation, refrigerator car, Ronald Reagan, South China Sea, Stanislav Petrov, transcontinental railway
There Goes Brussels … WILLIAMSON MURRAY Let us play a counterfactual game, and suppose for a moment that the one-sided crisis of 1983 had gotten out of control. What if, for example, on the night of September 26, a Soviet officer in a bunker outside Moscow had not had doubts about what he was seeing on a computer screen—first one incoming missile, and then another, five in all. Had Lieutenant Colonel Stanislav Petrov followed regulations, he would have telephoned his superiors to warn them that the Soviet Union was only minutes away from a nuclear attack. Petrov hesitated, convinced that something had gone awry in the computer system. The minutes passed. Nothing did happen. That night one man's hunch may have averted World War III. “The terrible ifs accumulate,” as Winston Churchill said of the opening moves of World War I.
Superintelligence: Paths, Dangers, Strategies by Nick Bostrom
agricultural Revolution, AI winter, Albert Einstein, algorithmic trading, anthropic principle, anti-communist, artificial general intelligence, autonomous vehicles, barriers to entry, Bayesian statistics, bioinformatics, brain emulation, cloud computing, combinatorial explosion, computer vision, cosmological constant, dark matter, DARPA: Urban Challenge, data acquisition, delayed gratification, demographic transition, different worldview, Donald Knuth, Douglas Hofstadter, Drosophila, Elon Musk, en.wikipedia.org, endogenous growth, epigenetics, fear of failure, Flash crash, Flynn Effect, friendly AI, Gödel, Escher, Bach, income inequality, industrial robot, informal economy, information retrieval, interchangeable parts, iterative process, job automation, John Markoff, John von Neumann, knowledge worker, longitudinal study, Menlo Park, meta analysis, meta-analysis, mutually assured destruction, Nash equilibrium, Netflix Prize, new economy, Norbert Wiener, NP-complete, nuclear winter, optical character recognition, pattern recognition, performance metric, phenotype, prediction markets, price stability, principal–agent problem, race to the bottom, random walk, Ray Kurzweil, recommendation engine, reversible computing, social graph, speech recognition, Stanislav Petrov, statistical model, stem cell, Stephen Hawking, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, The Coming Technological Singularity, The Nature of the Firm, Thomas Kuhn: the structure of scientific revolutions, transaction costs, Turing machine, Vernor Vinge, Watson beat the top human players on Jeopardy!, World Values Survey, zero-sum game
On November 9, 1979, a computer problem led NORAD (North American Aerospace Defense Command) to make a false report of an incoming full-scale Soviet attack on the United States. The USA made emergency retaliation preparations before data from early-warning radar systems showed that no attack had been launched (McLean and Stewart 1979). On September 26, 1983, the malfunctioning Soviet Oko nuclear early-warning system reported an incoming US missile strike. The report was correctly identified as a false alarm by the duty officer at the command center, Stanislav Petrov: a decision that has been credited with preventing thermonuclear war (Lebedev 2004). It appears that a war would probably have fallen short of causing human extinction, even if it had been fought with the combined arsenals held by all the nuclear powers at the height of the Cold War, though it would have ruined civilization and caused unimaginable death and suffering (Gaddis 1982; Parrington 1997).
The Last Empire: The Final Days of the Soviet Union by Serhii Plokhy
affirmative action, Anton Chekhov, Berlin Wall, bilateral investment treaty, cuban missile crisis, Dissolution of the Soviet Union, Fall of the Berlin Wall, Francis Fukuyama: the end of history, land reform, Mikhail Gorbachev, mutually assured destruction, Potemkin village, RAND corporation, Ronald Reagan, Sinatra Doctrine, Stanislav Petrov, Transnistria
In the Soviet Union, the death of Leonid Brezhnev in 1982 unleashed a succession crisis in the Kremlin. International tensions rose, threatening for the first time since the early 1960s to turn the Cold War into a hot one.3 On September 1, 1983, near Sakhalin Island, the Soviets shot down a South Korean airliner with 269 people aboard, including a sitting member of the US Congress. They then awaited American retaliation. Later that month, Lieutenant Colonel Stanislav Petrov of the Soviet Air Defense Forces Command near Moscow saw a blip on his radar screen indicating a missile headed toward the USSR. Then he saw what appeared to be four more missiles headed in the same direction. Suspecting a computer malfunction, he did not report the image to his superiors. Had he done so, nuclear war between the two powers might well have become a reality. It turned out that a rare alignment of sunlight and clouds had caused a glitch in the Soviet early-warning system.
Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety by Eric Schlosser
Albert Einstein, anti-communist, Berlin Wall, cuban missile crisis, Fall of the Berlin Wall, Haight Ashbury, impulse control, interchangeable parts, Isaac Newton, life extension, Mikhail Gorbachev, mutually assured destruction, nuclear winter, packet switching, RAND corporation, Ronald Reagan, Stanislav Petrov, Stewart Brand, too big to fail, uranium enrichment, William Langewiesche
The Kremlin denied that it had anything to do with the tragedy—until the United States released audio recordings of Soviet pilots being ordered to shoot down the plane. President Reagan called the attack “an act of barbarism” and a “crime against humanity [that] must never be forgotten.” A few weeks later alarms went off in an air defense bunker south of Moscow. A Soviet early-warning satellite had detected five Minuteman missiles approaching from the United States. The commanding officer on duty, Lieutenant Colonel Stanislav Petrov, tried to make sense of the warning. An American first strike would surely involve more than five missiles—but perhaps this was merely the first wave. The Soviet general staff was alerted, and it was Petrov’s job to advise them whether the missile attack was real. Any retaliation would have to be ordered soon. Petrov decided it was a false alarm. An investigation later found that the missile launches spotted by the Soviet satellite were actually rays of sunlight reflected off clouds.
Gorbachev by William Taubman
active measures, affirmative action, Albert Einstein, anti-communist, Berlin Wall, British Empire, card file, conceptual framework, Deng Xiaoping, Donald Trump, Fall of the Berlin Wall, fear of failure, haute couture, indoor plumbing, means of production, Mikhail Gorbachev, Neil Kinnock, Potemkin village, RAND corporation, Ronald Reagan, Ronald Reagan: Tear down this wall, Saturday Night Live, Stanislav Petrov, trade liberalization, young professional
The message to Moscow, says former member of the British Joint Intelligence Committee Gordon Barrass, was “We can run rings around you.”33 In response to this challenge, KGB agents overseas kept their eyes peeled for signs of an impending attack, such as lights on late at night in Western defense ministries, or hospitals collecting more than the usual amounts of blood. Then, on the night of September 26, 1983, in a secret underground bunker outside Moscow, alarms indicated that American missiles were speeding toward Moscow. The duty officer had seven minutes to alert Andropov, who was on a dialysis machine in a suburban sanatorium. Fortunately, Lieutenant Colonel Stanislav Petrov concluded that the alarm was false. Meanwhile, however, the Americans and British were preparing to conduct the “Able Archer 83” war game, in which NATO’s supreme commander would request permission to use nuclear weapons and receive it. When Able Archer began in early November, the chief of the Soviet General Staff took cover in his command bunker under Moscow and ordered a “heightened alert” for some land-based Soviet forces.