This post is not going to be about “NASA screwed up, how come after 40 years we still have a space ‘program’ and not a space industry, NASA is drifting off focus and no longer has a clearly defined mission, etc.” I will leave it to someone else to write that column, because Rand Simberg (or our own Dale Amon) could do it a lot better than I could anyway.
What I do want to talk about is how the way information is organized and presented can make a difference in how it is received – and how bureaucracy can sometimes stand in the way of effective data organization and promote cluttered thinking. When we lost the Challenger in ’86, it should have been clear that it was unsafe to launch the shuttle on that cold January morning. NASA had plenty of data to suggest that it was not prudent to launch that day – the problem was that the data was never refined into a conclusive answer, but rather was shrouded by poor communication and bureaucratic ass-covering.
Edward Tufte, professor emeritus at Yale and author of several brilliant texts on graphic design and the visual display of quantitative data, has made the Challenger accident a centerpiece of his traveling seminar. His exegesis of the Challenger disaster is available in his book Visual Explanations (Graphics Press, 2001).
In hindsight, it was quickly determined what caused the Challenger to fail: the poor cold-weather performance of the rubber O-rings in the field joints that held sections of the solid rocket boosters together. In a memorable session of the Rogers Commission (the group that investigated the Challenger disaster), the late Richard Feynman, Nobel Prize-winning physicist, conducted a dramatic experiment. He affixed a C-clamp to a sample of O-ring material, dropped it into his glass of ice water, and then removed the clamp, revealing that the O-ring rubber lacked resiliency when cooled to 32 degrees Fahrenheit. (See photo below.) Engineers at Morton Thiokol, the Utah-based firm that made the solid rocket motors, had been concerned about the cold-weather performance of the seals – so much so that they took the unprecedented step of issuing a “no-launch warning” to NASA the day before the doomed flight. As Tufte puts it:
… the exact cause of the accident was intensely debated during the evening before the launch: will the rubber O-rings fail catastrophically tomorrow because of the cold weather? These discussions concluded at midnight with the decision to go ahead. That morning, the Challenger blew up 73 seconds after its rockets were ignited.
At this point, some would be tempted to say: “See? As usual, the engineers were right, the bureaucrats were wrong!” But the story is more complicated. Tufte argues that the Thiokol engineers failed to present a compelling “no-launch” case to NASA because (1) their analysis failed to make crystal-clear the relationship between O-ring performance and temperature, and (2) their presentation to NASA had other shortcomings, including contradictory advice in some places.
What kind of analysis/presentation might have saved the Challenger? In Visual Explanations, Tufte argues that a single graphic (had such a thing existed) could have given NASA all the data they needed to make a decision. (You can see the graphic here – it is shown on the cover of one of his booklets.)
The x-axis (horizontal) shows temperature at launch time; the y-axis shows the level of damage to the O-rings. Each dot represents a previous space shuttle launch in the years 1981-85. The forecast temperature for Cape Canaveral that infamous morning was below freezing, in the 26-29 degree range; the previous coldest shuttle launch was at 53 degrees. As Tufte points out, 29 degrees is an extreme outlier – 5.7 standard deviations below the mean temperature of the previous launches. And, of course, the relationship of resiliency to temperature is quadratic, not linear. In other words, the Challenger mission was at substantial risk.
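That outlier arithmetic is easy to check. Below is a minimal sketch using the launch temperatures commonly cited for the pre-Challenger flights with recorded O-ring data – treat the numbers as illustrative and consult the primary sources for the exact record. It computes how many standard deviations the 29-degree forecast sat below the historical mean:

```python
from statistics import mean, stdev

# Launch-day temperatures (deg F) commonly cited for the 1981-85
# shuttle flights with O-ring data -- illustrative, not authoritative.
prior_launch_temps = [66, 70, 69, 68, 67, 72, 73, 70, 57, 63, 70, 78,
                      67, 53, 67, 75, 70, 81, 76, 79, 75, 76, 58]

forecast = 29  # low end of the 26-29 degree range forecast that morning

mu = mean(prior_launch_temps)
sigma = stdev(prior_launch_temps)          # sample standard deviation
z = (forecast - mu) / sigma                # how far outside prior experience?

print(f"mean = {mu:.1f}F, sd = {sigma:.1f}F, z-score of 29F = {z:.1f}")
```

With this dataset the z-score lands in the neighborhood of Tufte’s 5.7 figure: whatever the exact damage-temperature relationship, the temperature alone put the launch far outside all prior experience.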
So the problem with the Challenger was not that the NASA bureaucrats lacked the needed data. Nor was the problem that they simply disregarded the advice of the engineers for political reasons. They had the data; what they lacked was the capacity to quickly and accurately refine the data into a clear no-launch decision.
This is the same problem that presents itself over and over in bureaucratic decision-making, especially in intelligence / antiterrorism efforts. Muhammad and Malvo’s “snipermobile,” the modified Chevy Caprice, was spotted – and even stopped by police – near the scene of several shootings before authorities put two and two together. Authorities received tips from thousands of disparate sources. Our intelligence agencies receive a ton of information – chatter, noise, whatever you want to call it – from sources all over the globe. The challenge for police and intelligence agencies is to refine that desultory information into a meaningful conclusion.
We know that markets do this task – refining enormous amounts of information into concise signals – exceptionally well. John Poindexter took a lot of heat for proposing a “terror futures market” in which participants could bet on events such as the Saudi government falling. Politicians, journalists and cartoonists derided the idea, but many bloggers and other commentators (such as Reason’s Ron Bailey) rose to its defense. There are limits to the applicability of markets like this, but potential benefits as well. (For example, studies have shown that weather futures markets actually outperform the US National Weather Service in predicting certain weather patterns, and you had better believe that campaign managers these days pay as much attention to political futures markets as they do to polls.)
Would something like this work for space shuttle launches? If there were a market in which we could trade, say, the probability of a failed space shuttle mission, would that have helped? The best-case scenario would run something like this: analysts independently peg the relationship of temperature to risk; temperature forecasts for launch day are issued; demand for (and the price of) ‘mission failure’ bets in the futures market accelerates; NASA reads this as a signal to postpone the launch.
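For the curious, the price-discovery step in that scenario can be sketched with a logarithmic market scoring rule (LMSR), a mechanism commonly used to run small prediction markets. Everything here – the liquidity parameter, the trade sizes, the 50/50 opening price – is illustrative, not a description of any market that actually existed:

```python
import math

B = 100.0  # LMSR liquidity parameter: larger means prices move more slowly

def price_fail(q_fail, q_ok, liq=B):
    """Instantaneous price (= implied probability) of the 'failure' contract."""
    e_f = math.exp(q_fail / liq)
    return e_f / (e_f + math.exp(q_ok / liq))

q_fail, q_ok = 0.0, 0.0   # unseeded market opens at an implied 50/50
print(f"opening price: {price_fail(q_fail, q_ok):.2f}")

# Traders who believe cold weather raises risk keep buying 'failure' shares,
# and each purchase pushes the implied probability of failure upward.
for shares in (20, 30, 50):
    q_fail += shares
    print(f"after buying {shares:>2.0f} failure shares: "
          f"price = {price_fail(q_fail, q_ok):.2f}")
```

The point of the sketch is the shape of the signal: many dispersed traders, each acting on partial information, collapse their views into one number that is hard for a launch director to ignore. (A real market maker would seed the opening price with a historical base rate rather than 50/50.)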
Building a market like that does not just happen overnight; investment bankers know the difficulties inherent in ‘making a market’. You would need a lot of knowledgeable players before the market could achieve stability and begin to provide robust answers. There’s no way to know whether a ‘shuttle futures market’ could have helped NASA in 1986, but it is hard to see how it could have hurt, either. Could the power of the free market help protect NASA’s future shuttle missions?
“I believe that has some significance for our problem.”
- Feynman’s testimony at Rogers Commission panel, Feb. 1986
Feynman, Richard. What Do You Care What Other People Think? Norton, 1988.
Tufte, Edward. Visual Explanations: Images and Quantities, Evidence and Narrative. Graphics Press, 2001.
Siems, Thomas F. 10 Myths About Financial Derivatives. Cato Institute Policy Analysis, September 11, 1997.