
I'm Drew Breunig and I obsess about technology, media, language, and culture. I live in New York, studied anthropology, and work at PlaceIQ.

Posts tagged books

On Nuclear Weapons and Complex Systems

After reading Eric Schlosser’s Command and Control, an exploration of nuclear safety and implementation, it becomes impossible to think of a nuclear weapon as a singular thing. These bombs live in a complicated system, a network of people, technologies, and politics which manages them: determining how they’re built, tested, stored, and activated. One cannot truly discuss a nuclear weapon without understanding these complexities, for the bomb is inseparable from this network. It is such a complicated device to design and utilize that it can only be born and managed within a network of support. Schlosser’s work is a fantastic introduction and case study for actor-network theory.

It’s striking that nearly¹ all the threatening accidents detailed in the book aren’t due to faults within the bomb itself. The accidents are the result of the uncontrollable, complex systems which are necessary to manage the bombs. The causes, taken on their own, are almost always mundane: a release lever is instinctually grabbed by a pilot during turbulence to stabilize himself, a socket wrench loses its socket, or an aircraft’s air conditioning stops working. One time a technician loaded the wrong magnetic tape and triggered a simulation of a massive Soviet attack:

As the computer screens at NORAD filled with Soviet missiles, a Threat Assessment Conference was called. Although the pattern of the attack seemed to fit with the Pentagon’s assumptions about Soviet war plans, its timing made little sense. Tensions between the superpowers weren’t particularly high, and nothing in the news seemed to warrant a “bolt from the blue” attack on the United States. Duty officers at NORAD contacted the radar and ground stations whose sensors were relaying information about the launches. None of them had detected signs of any missiles. The NORAD computers seemed to be providing an erroneous—but highly realistic—account of a Soviet surprise attack.

As a precaution, the Klaxons were sounded at SAC bases nationwide. Bomber crews ran to their planes, and missile crews were put on heightened alert. Fighter-interceptors took off to look for signs of a Soviet attack. The National Emergency Airborne Command Post left Andrews Air Force Base without President Carter on board. And air traffic controllers throughout the country prepared to clear America’s airspace for military flights, warning every commercial airliner that it might soon have to land.

As the minutes passed without the arrival of Soviet warheads, it became clear that the United States wasn’t under attack. The cause of the false alarm was soon discovered. A technician had put the wrong tape into one of NORAD’s computers. The tape was part of a training exercise—a war game that simulated a Soviet attack on the United States. The computer had transmitted realistic details of the war game to SAC headquarters, the Pentagon, and Site R.

This genre of accident – the computer error that threatens to launch a giant volley of weapons, rather than a single warhead failing – should be especially frightening to anyone who’s ever managed a website, server, or any complex code base. And it happened several times:

At about two thirty in the morning on June 3, 1980, Zbigniew Brzezinski, the president’s national security adviser, was awakened by a phone call from a staff member, General William E. Odom. Soviet submarines have launched 220 missiles at the United States, Odom said. This time a surprise attack wasn’t implausible. The Soviet Union had recently invaded Afghanistan, confirming every brutal stereotype promoted by the Committee on the Present Danger. The United States was leading a boycott of the upcoming Moscow Olympics, and relations between the two superpowers were at their lowest point since the Cuban Missile Crisis. Brzezinski told Odom to call him back with confirmation of the Soviet attack and its intended targets. The United States would have to retaliate immediately; once the details of the attack were clear, Brzezinski would notify the president. Odom called back and said that 2,200 missiles were heading toward the United States— almost every long-range missile in the Soviet arsenal. As Brzezinski prepared to phone the White House, Odom called again. The computers at NORAD said that Soviet missiles had been launched, but the early-warning radars and satellites hadn’t detected any. It was a false alarm. Brzezinski had allowed his wife to sleep through the whole episode, preferring that she not be awake when the warheads struck Washington.

SAC bomber crews had run to their planes and started the engines. Missile crews had been told to open their safes. The airborne command post of the Pacific Command had taken off. And then the duty officer at the Pentagon’s National Military Command Center ended the Threat Assessment Conference, confident that no Soviet missiles had been launched. Once again, NORAD’s computers and its early-warning sensors were saying different things. The problem was clearly in one of the computers, but it would be hard to find. A few days later NORAD computers warned SAC headquarters and the Pentagon for a third time that the United States was being attacked. Klaxons sounded, bomber crews ran to their planes— and another Threat Assessment Conference declared another false alarm.

This time technicians found the problem: a defective computer chip in a communications device. NORAD had dedicated lines that connected the computers inside Cheyenne Mountain to their counterparts at SAC headquarters, the Pentagon, and Site R. Day and night, NORAD sent test messages to ensure that those lines were working. The test message was a warning of a missile attack—with zeros always inserted in the space showing the number of missiles that had been launched. The faulty computer chip had randomly put the number 2 in that space, suggesting that 2 missiles, 220 missiles, or 2,200 missiles had been launched. The defective chip was replaced, at a cost of forty-six cents. And a new test message was written for NORAD’s dedicated lines. It did not mention any missiles.
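
The failure mode here will be familiar to anyone who has designed a messaging protocol: the test message was in-band, differing from a real warning only in the value of a single field. Here is a minimal sketch in Python of why that design was fragile; the “WARNING COUNT=” format and the missiles_reported helper are invented for illustration and bear no relation to the actual NORAD systems:

    # Toy illustration of an in-band test message. The format is invented
    # for this sketch; it only demonstrates why one corrupted digit sufficed.

    def missiles_reported(message: str) -> int:
        """Parse the missile count out of a warning message."""
        # Assumed format: "WARNING COUNT=0000"
        return int(message.split("COUNT=")[1])

    test = "WARNING COUNT=0000"        # all zeros means "this is only a test"
    corrupted = "WARNING COUNT=2200"   # the faulty chip wrote 2s into the field

    print(missiles_reported(test))       # 0    -> recognized as a heartbeat
    print(missiles_reported(corrupted))  # 2200 -> indistinguishable from a
                                         #         real attack warning

The fix Schlosser describes, a new test message that “did not mention any missiles,” makes test traffic structurally distinct from a real alert, rather than distinguishable only by the value of one field.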

The effects of mundane accidents are ridiculously amplified when they occur within a network which manages non-mundane things:

After studying a wide range of “trivial events in nontrivial systems,” Perrow concluded that human error wasn’t responsible for these accidents. The real problem lay deeply embedded within the technological systems, and it was impossible to solve: “Our ability to organize does not match the inherent hazards of some of our organized activities.” What appeared to be the rare exception, an anomaly, a one-in-a-million accident, was actually to be expected. It was normal.

The complexity of such a system was bound to bring surprises. “No one dreamed that when X failed, Y would also be out of order,” Perrow gave as an example, “and the two failures would interact so as to both start a fire and silence the fire alarm.”

Such effects are well known to those who study complexities or knowledge practices. Upon finishing Command and Control I immediately went to the bookshelf and pulled down one of these texts, John Law and Annemarie Mol’s Complexities. From the introduction:

The process of scaling up poses many problems. Large-scale technologies usually grow out of laboratory experiments, but the process of translation is tricky because laboratory experiments are simplificatory devices: they seek to tame the many erratically changing variables that exist in the wild world, keeping some stable and simply excluding others from the argument. This often works well in the laboratory: if one does an experiment in a test tube, it is not unreasonable to assume that the air in the lab will absorb any heat that is produced. Calculation is greatly simplified by choosing to neglect a variable such as “heat.” However, it works less well when what was confined to a test tube is scaled up to become a power plant. What happens now to all that excess heat? Where does it go? And where do radioactive waste products go?

So there is scaling, and there are unpredictabilities, erratic forms of behavior. These do not fit the schemes of most sciences very well either because the latter prefer to treat with only a few variables, not too many. The problem is that what was not predictable tends to occur anyway. So how should this be handled?

The answer – one answer – is that such chaotic events are tamed by theories of chance. In being reduced to a probability and framed as a risk they are turned into something that, however erratic, is also calculable. The risk of an explosion in the factory on the edge of your town (an explosion that will take your town with it) is, say, 0.000000003 percent per annum. Now go calculate whether this is a good enough reason to be anxious!

This sort of reductionism via probability is all over Schlosser’s book. Bomb safety is measured and justified by standards expressed as odds:

For example, it proposed that the odds of a hydrogen bomb exploding accidentally—from all causes, while in storage, during the entire life of the weapon—should be one in ten million.
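
To get a feel for what such a standard implies at scale, here is a rough back-of-the-envelope calculation; the stockpile size below is my own illustrative assumption, not a figure from the book:

    # Back-of-the-envelope: expected accidental detonations across an arsenal.
    # The per-weapon odds are the proposed standard quoted above; the stockpile
    # size is an illustrative assumption, not a number from Schlosser.

    p_lifetime = 1 / 10_000_000   # odds of one bomb detonating accidentally,
                                  # over its entire service life
    stockpile = 30_000            # assumed number of weapons (illustrative)

    expected = p_lifetime * stockpile
    print(f"{expected:.4f} expected accidental detonations")  # 0.0030

Change the assumed stockpile and the expected value scales linearly, but the result is only as meaningful as the lab-derived odds behind it.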

These odds, derived in laboratories, were used to allocate money for bomb safety. The factors which cannot reasonably be measured in the laboratory remain a problem. The mundane accidents caused by the messy world, the same world excluded from the lab so that variables could be isolated, will always affect the network. Hence, as Schlosser writes:

The probabilities remained unknown. What were the odds of a screwdriver, used to repair an alarm system, launching the warhead off a missile, the odds of a rubber seat cushion bringing down a B-52?

Command and Control is strongly recommended.


  1. The only issues I can recall contained solely within the weapons themselves are an unexplained noise emanating from a thermonuclear warhead (which never posed any danger) and strips of boron disintegrating within the cores of a line of weapons, rendering them useless. Neither of these, oddly, was dangerous (at least relative to any of the other mishaps). 

Will Apple launch a sort of GarageBand for e-books? “That’s what we believe you’re about to see,” MacInnis told Ars (and our other sources agree). “Publishing something to ePub is very similar to publishing web content. Remember iWeb? That iWeb code didn’t just get flushed down the toilet—I think you’ll see some of [that code] repurposed.”

Ars Technica on Apple’s textbook event this week.

My hopes for this announcement, if it is a “GarageBand for eBooks”:

  1. It’s free
  2. It’s not just for text books
  3. It is able to tie into a boilerplate Newsstand app, allowing publishers to more easily lay out periodical content in a standard way (the divergence between Newsstand apps is unnecessary and frustrating)
  4. Textbooks are published into a textbook ecosystem, with unified notes, book collections for courses managed by instructors, and interactive quizzes built into books allowing students to take short exams and instructors to grade easily from an iPad.

I think 2 will eventually pan out, but 3 is wishful thinking. 4 would be a game changer, but would require massive commitments from major universities to catch on. Teachers I know would like such an ecosystem (I would have liked it as a student), but it may be a hindrance to adoption and is probably best saved for later.

We have expressed our strong disagreement and the seriousness of our disagreement by temporarily ceasing the sale of all Macmillan titles. We want you to know that ultimately, however, we will have to capitulate and accept Macmillan’s terms because Macmillan has a monopoly over their own titles, and we will want to offer them to you even at prices we believe are needlessly high for e-books. Amazon customers will at that point decide for themselves whether they believe it’s reasonable to pay $14.99 for a bestselling e-book. We don’t believe that all of the major publishers will take the same route as Macmillan. And we know for sure that many independent presses and self-published authors will see this as an opportunity to provide attractively priced e-books as an alternative.

Amazon caves to Macmillan’s pricing demands.

Earlier I said Amazon’s real leverage is their existing volume of physical book sales: publishers don’t want to be cut out of that loop. I was wrong about that…

Now that they’ve caved, it seems Amazon’s physical book business is more of a curse than a gift: when Apple was dictating pricing models to the record labels, Apple wasn’t reliant on CD sales that labels could have blocked. If the labels fought back on pricing, Apple had the luxury of not accepting their terms while they waited for their iPod install base to soar and create digital music profits the labels could no longer ignore.

Amazon is very much reliant on each major publisher. Kindle sales can’t be fueled by books digitized at home (the way the first iPods were filled with ripped CDs); Amazon depends on the publishers for content.

Just before Apple announced the iPad and the agency deal for ebooks, Amazon pre-empted by announcing an option for publishing ebooks in which they would graciously reduce their cut from 70% to 30%, ‘same as Apple’. From a distance this looks competitive, but the devil is in the small print; to get the 30% rate, you have to agree that Amazon is a publisher, license your rights to Amazon to publish through the Kindle platform, guarantee that you will not allow other ebook editions to sell for less than the Kindle price, and let Amazon set that price, with a ceiling of $9.99. In other words, Amazon chooses how much to pay you, while using your books to undercut any possible rivals (including the paper editions you still sell). It shouldn’t surprise anyone that the major publishers don’t think very highly of this offer.

Amazon, Macmillan: an outsider’s guide to the fight (via marco)

Worth noting is Amazon’s leverage: they’re an enormous bookseller. In these trying times, publishers loathe the punishment of being kicked out of Amazon’s store.

Lessig Calls Google Book Settlement A “Path To Insanity”  

Hear, hear! Maybe this will make it on BoingBoing.

My electronic library has about a 50% crossover with my physical library, so that I can read the book on my electronic reader, “loan” the book without endangering my physical copy, or eventually rid myself of the paper copy if it is a book I do not have strong feelings about. I do not buy DRM’d ebooks that are priced at more than a few dollars, but would pay up to $10 for a clean file if it was a new release. I do not pretend that uploading or downloading unpurchased electronic books is morally correct, but I do think it is more of a grey area than some of your readers may.

From “Confessions of a Book Pirate.”

The nice thing about book pirates is that they’re a hell of a lot more articulate than the usual Pirate Bay frat-boy. (Via The Millions)

How, where, can I ask writers who are unhappy with the Settlement to speak up - to stand up and be counted? We don’t have to agree on every detail, but I think there are a lot of us who see it as urgently important to let it be known that writers support the principle of copyright, and want the Copyright Office, the judges, the publishers, and the libraries to know that we intend to keep control of our work, in print or out, printed or electronic, believing that the people who do the work, rather than any corporation, should have the major voice in how it’s used and who profits from it.

Ursula K. Le Guin. Funny, didn’t see this quote up on BoingBoing. What a lovely last line. (Via io9)

Amazon Launches 70% Kindle Royalty Option 

Under this option, Amazon will pay authors and publishers a royalty of 70% of the list price of Kindle books, which is a far higher per-copy royalty than most authors receive on physical book sales (including the standard Kindle book royalties).

Ala Ebtekar: Ascension. I can’t get enough of this work.
