ET on Columbia Evidence (2003)

Except for a few loose ends below, this thread has been replaced by PowerPoint Does Rocket Science (and some thoughts about the upcoming Discovery flight)

-- Edward Tufte


Postings of Boeing reports

Here are the postings for the Boeing reports and my earlier comments.

The 3 reports by Boeing engineers about potential debris damage to the left wing have been posted by NASA at many locations. Here is a good general location for many items; see "Boeing Debris Impact Assessment Charts" at

http://www.nasa.gov/columbia/foia/index.html

NASA also posted the 3 Boeing Reports at their Media Resources site for the Columbia: http://www.nasa.gov/columbia/media/index.html

In addition, NASA posted the Boeing reports in the Columbia Accident Investigation Board (CAIB) STS-107 Investigation File. The Washington Post, the New York Times, the Houston Chronicle, and others have published excerpts from the 3 reports along with analysis and interviews.

These reports, which repay study, were done during the flight of the Columbia.

The 3 reports have the following good features:

- The names of the engineers producing the reports are given (unlike the Challenger pre-launch analysis).
- The reports make clear quantitative links among possible causes and effects (unlike the Challenger analysis).
- Most of the tables and graphics have scales of measurement.
- The analysis is multivariate.
- Assumptions of the analysis are fairly clear (although perhaps there are hidden assumptions that experts can reveal).
- It is easy enough for the alert reader to see that the results are assumption-sensitive.
- Quantitative estimates by means of contour lines are given for debris-velocity over the wing surface.
- There is an excellent diagram showing individual tiles on the wing along with the forecasted tile loss due to debris impact.

The 3 reports have the following weaknesses:

- It now appears that the conclusions were incorrect.
- The results appear sensitive to input assumptions about incidence angle, incidence location, the number and velocity of impacts, and the weight of the debris (assumed to be lightweight foam at 2.4 pounds/cubic foot)--and that multivariate sensitivity is not carefully examined.
- In the video of the debris impact, the debris pieces look larger than the estimated sizes (20" by 10" by 6"; and 20" by 16" by 6") used in the 3 reports. The important video is at http://www.spaceref.com/Columbia/post.launch.video.html The video also shows a fine shower of debris coming off the wing after impact; that spray does not immediately suggest foam chips.
- In the reports, assumptions tend to be evaluated generally in the direction of how they might reduce the seriousness of the threat (after-the-fact arguments of the form "this is a conservative estimate" replace careful quantitative estimates of robustness and uncertainty).
- An important table has 2 empty cells; threat assessments are missing in those 2 cells. (The Washington Post discussed this point.)
- The good diagram showing forecasted tile loss provides only point estimates; there is no cloud of error around those estimates.
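The multivariate sensitivity that the reports leave unexamined can be sketched in a few lines. The damage model below is purely hypothetical (it is not the Crater algorithm; the scale factor and all input ranges here are invented for illustration); the point is the method: jitter density, velocity, incidence angle, and debris size simultaneously and report the spread of outcomes rather than a single point estimate.

```python
import math
import random
import statistics

random.seed(42)

def damage_depth(density, velocity, angle_deg, length_ft):
    """Hypothetical penetration model (NOT the Crater algorithm):
    depth scales with a mass proxy times normal-velocity squared."""
    normal_v = velocity * math.sin(math.radians(angle_deg))
    mass_proxy = density * length_ft           # crude stand-in for debris mass
    return 0.001 * mass_proxy * normal_v ** 2  # arbitrary scale factor

# Point estimate at the assumptions cited in the reports
# (2.4 lb/ft^3 foam, a 20-inch piece; velocity and angle are invented here)
baseline = damage_depth(density=2.4, velocity=700, angle_deg=15, length_ft=20 / 12)

# Monte Carlo: vary every assumption at once instead of one at a time
samples = [
    damage_depth(
        density=random.uniform(2.0, 3.0),    # lb/ft^3
        velocity=random.uniform(500, 900),   # ft/s
        angle_deg=random.uniform(5, 25),     # degrees
        length_ft=random.uniform(15, 25) / 12,
    )
    for _ in range(10_000)
]

q = statistics.quantiles(samples, n=20)      # 5%, 10%, ..., 95% cut points
print(f"baseline estimate: {baseline:.1f}")
print(f"5th-95th percentile range: {q[0]:.1f} to {q[-1]:.1f}")
```

A 5th-to-95th percentile range spanning anywhere near an order of magnitude would be a loud signal that the point estimate alone is not decision-grade.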

The 3 reports have the following analytical design characteristics:

- They appear to be PowerPoint slides.
- Some tables are difficult to read because of the grid prisons surrounding the entries in the spreadsheet, and it is difficult to make comparisons of numbers across the table.
- Bulleted lists are used throughout, with up to 5 levels of hierarchy on a single page of 10 or 12 lines. Consequently the reasoning is broken up into stupefying fragments both within and between the many slides.
- Although an oral presentation accompanied the 3 reports originally, the reports were also circulated as stand-alone slides in e-mail attachments by NASA engineers concerned about the possible damage to Columbia's wing.

The fundamental nature of the Columbia analysis might be called statistical engineering: the content is engineering but the logic is exactly the logic of statistics and econometrics (issues of estimation, thin data, model sensitivity and robustness, multivariate data, error assessment). The Columbia analysis needs some high-level statistical reasoning and use of techniques from standard statistical tool-kits. The Columbia analysis would have been a perfect problem for the great applied statistician Cuthbert Daniel.

-- Edward Tufte


Response to ET on Columbia Evidence: Analysis of Key Slide

What is striking about the Boeing analysis is the absence of context and predictive scenarios.

The Goal of the Analysis: communicate to flight control the extent and meaning of "possible tile damage".

What should have been included immediately in any report are answers to the following questions:

1. how frequently do tiles become damaged? Does this happen on every flight, or is this an infrequent incident? Where is the control chart to display "tile failures" chronologically on a launch basis?

2. what other special causes might make this incident different from other tile damage incidents (if others existed)? For example, would low temperature possibly create greater damage scenarios? Was the Columbia, as the oldest operating shuttle, more prone to damage?

3. finally, how variable are the results of the analysis? I roll two dice, and I can get anywhere from a 2 to a 12 - all with varying probabilities. How variable was the "tile damage" analysis?

From (3), causal logic should have been presented to explain consequences of tile damage. Does tile damage necessarily imply loss of hull integrity?

Proper analysis would have included context, scenarios, probabilities, and causal logic. I saw none of these in the report.
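The control chart asked for in question 1 is simple to construct once per-flight counts exist. A minimal sketch using a c-chart for counts of events per flight; the hit counts below are invented for illustration (real per-flight damage counts were later shown in the CAIB hearing charts):

```python
import math

# Hypothetical lower-surface debris hits per flight, in launch order
# (illustrative numbers only)
hits = [14, 23, 31, 18, 40, 27, 112, 21, 33, 19, 25, 30]

c_bar = sum(hits) / len(hits)                 # center line
ucl = c_bar + 3 * math.sqrt(c_bar)            # upper 3-sigma control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))  # lower limit, floored at zero

for flight, c in enumerate(hits, start=1):
    flag = "  <-- beyond control limit: special cause?" if c > ucl else ""
    print(f"flight {flight:2d}: {c:4d}{flag}")
print(f"center {c_bar:.1f}, limits [{lcl:.1f}, {ucl:.1f}]")
```

Plotted chronologically, such a chart answers "is this flight's damage ordinary variation or something new?" at a glance, which is exactly the context the Boeing slides lacked.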

Michael Round

-- Michael Round (email)


Response to ET on Columbia Evidence: Analysis of Key Slide

Regarding the first topic that Michael Round suggested should have been covered in a report on the foam strike during the STS-107 ascent:

A presentation was given at the CAIB public hearing on April 7 on the topic of the external tank cryoinsulation. The transcript is available at http://www.caib.us/events/public_hearings/20030407/present_et.html. Streaming video is also available.

A series of slides in the presentation are labeled "History of Foam Loss". http://www.caib.us/events/public_hearings/20030407/

A graph was presented repeatedly that shows the "hits" to the lower surface of the orbiter. I think that the graph is based on damage to the tiles assessed after the return of the vehicle. Hits were graphed versus external tank number rather than chronologically. Additional slides show specific examples of foam loss, with photographs of external tanks as they fall away from the orbiter. The following excerpt of the transcript refers to the graph of the lower-surface hits.

"DR. RIDE: Can I just ask a question on your numbering system? STS 26 was return to flight?

MR. FOSTER: No, that was 26R. I do have to apologize here. What I did was sorted these data by ET number; and as you're well aware, the numbering system was really messed up. So this is not in chronological order. Case in point: 27R is return to flight, and 27 was way before. So although on this chart those data would be together, you know, chronologically they're far apart."

[Link updated February 2005]

-- Wiley Holcombe (email)


Response to ET on Columbia Evidence: Analysis of Key Slide

I am confused by the date on the bottom of the slide. Discussion is centered about January 21st, but the slide is labeled 2/21/03.

Is this due to some sort of PP feature associated with the date of releasing the presentation or is it a typo?

-- Bil Kleb (email)


Response to ET on Columbia Evidence: Analysis of Key Slide

My money's on yet another of PowerPoint's "features": you can insert an automatic date, so that every time you print or save the document, you get a new date. That way all you have to do to "update" your presentation is to open and print it -- rather than going to the trouble of adding fresh content.

-- Scott Zetlan (email)


Response to ET on Columbia Evidence: Analysis of Key Slide

Richard Feynman has some things to say about bullet lists, illogical analysis of shuttle risk, and the Challenger--as pointed out by Peter Lindberg (who runs a very thoughtful weblog): http://www.tesugen.com/2003/07/02.html

Feynman on bullets during the Challenger investigation: "Then we learned about 'bullets'; little black circles in front of phrases that were supposed to summarize things. There was one after another of these little goddamn bullets in our briefing books and on the slides."

-- Edward Tufte


Response to ET on Columbia Evidence: Analysis of Key Slide

Skimming the CAIB report available at http://www.caib.us/news/report/default.html, I came across this paragraph relating to the problems involved in NASA presentations:

The Mission Management Team Chair's position in the hierarchy governed what information she would or would not receive. Information was lost as it traveled up the hierarchy. A demoralized Debris Assessment Team did not include a slide about the need for better imagery in their presentation to the Mission Evaluation Room. Their presentation included the Crater analysis, which they reported as incomplete and uncertain. However, the Mission Evaluation Room manager perceived the Boeing analysis as rigorous and quantitative. The choice of headings, arrangement of information, and size of bullets on the key chart served to highlight what management already believed. The uncertainties and assumptions that signaled danger dropped out of the information chain when the Mission Evaluation Room manager condensed the Debris Assessment Team's formal presentation to an informal verbal brief at the Mission Management Team meeting.

August 2003, Columbia Accident Investigation Board Report, Volume I, Chapter 8 (page 201).

-- Bob Leedom (email)


Columbia evidence: key slide

To bring this up to date: My analysis PowerPoint Does Rocket Science eventually appeared in the final report of the Columbia Accident Investigation Board (CAIB), the New York Times, and elsewhere.

Then, more recently, during the last 18 months or so, I've visited the plant making the external tank (43 acres under one roof), reviewed material on probability risk assessments for the return to flight, reviewed presentations on foam damage leaked to a newspaper, and given many one-day courses on analytical design and engineering presentations to a total of 1,000 NASA and Boeing employees at various NASA sites. I had a nice tour of the Neutral Buoyancy Lab (a huge swimming pool used to simulate weightlessness and to work on full-size models of the shuttle and the space station, with many SCUBA divers watching over the tests) where astronauts and cosmonauts practice space walks. There I showed them my Advanced Open Water dive certification card (hint hint) but they were distinctly unimpressed. Maybe if I had NITROX certification . . . In most of this, I'm simply a curious tourist lucky enough to see some interesting problems in evidence presentation as well as some very expensive scenery.

My policy view on the shuttle is that it requires too much money to make uncertain and small reductions in risk on the margin. Therefore, to continue the flights to service the station requires accepting the risk at the level of the empirical history of the shuttle, which is 1 major loss for every 57 flights.
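The 1-in-57 figure comes from 2 vehicle losses in 113 flights. Under the strong assumption that each flight is an independent draw at that empirical rate (which the sketch below takes for granted; the future flight counts are arbitrary examples), the cumulative risk over a remaining manifest follows directly:

```python
# Empirical shuttle loss rate: 2 losses (Challenger, Columbia) in 113 flights
losses, flights = 2, 113
rate = losses / flights
print(f"point estimate: 1 loss per {flights / losses:.1f} flights")

# Chance of at least one more loss over N future flights, assuming
# independent flights at the historical rate (a strong assumption)
for n in (10, 20, 30):
    p_at_least_one = 1 - (1 - rate) ** n
    print(f"P(>=1 loss in {n} flights) = {p_at_least_one:.1%}")
```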

-- Edward Tufte


Response to ET on Columbia Evidence: Analysis of Key Slide

A Kindly Contributor, Ravi Narasimhan, provided this link to new material from a follow-up panel of the Columbia Accident Investigation Board (CAIB). The panel was reporting on the analyses recently used for the Return to Flight; the panel makes clear that the level of intellectual quality in engineering and the administration of engineering is insufficiently high.

http://www.spaceflightnow.com/shuttle/sts114/050817rtftg/

Here are the parts of the report relevant to PowerPoint:

"Another lack of rigor cited by the panel - one that also was cited by the CAIB - is the widespread use of PowerPoint presentations in lieu of actual engineering data and analyses.

"Several members of the Task Group noted, as had CAIB before them, that many of the engineering packages brought before formal control boards were documented only in PowerPoint presentations," the panel members wrote. "In some instances, requirements are defined in presentations, approved with a cover letter and never transferred to formal documentation. Similarly, in many instances when data was requested by the Task Group, a PowerPoint presentation would be delivered without supporting engineering documentation. It appears that many young engineers do not understand the need for, or know how to prepare, formal engineering documents such as reports, white papers, or analyses."

Years ago Richard Feynman had expressed a concern about NASA presentations in his work on the Challenger accident in 1986: "Then we learned about 'bullets'--little black circles in front of phrases that were supposed to summarize things. There was one after another of these little goddam bullets in our briefing books and on slides." [Richard P. Feynman, "What do you care what other people think?" (New York, 1988), pp. 126-127.]

I deeply regret my lack of success in seeking to loosen the grip of PowerPoint and the pitch culture on serious engineering work during the last 18 months of teaching and consulting at NASA. Perhaps some of my work was regarded as being about the design of bullets rather than about the intellectual quality of engineering analysis.

-- Edward Tufte


Response to ET on Columbia Evidence: Analysis of Key Slide

Here is the exact quote:

"We also observed that instead of concise engineering reports, decisions and their associated rationale are often contained solely within Microsoft PowerPoint charts or emails. The CAIB report (Vol. I, pp. 182 and 191) criticized the use of PowerPoint as an engineering tool, and other professional organizations have also noted the increased use of this presentation software as a substitute for technical reports and other meaningful documentation. PowerPoint (and similar products by other vendors), as a method to provide talking points and present limited data to assembled groups, has its place in the engineering community; however, these presentations should never be allowed to replace, or even supplement, formal documentation. Several members of the Task Group noted, as had CAIB before them, that many of the engineering packages brought before formal control boards were documented only in PowerPoint presentations. In some instances, requirements are defined in presentations, approved with a cover letter, and never transferred to formal documentation. Similarly, in many instances when data was requested by the Task Group, a PowerPoint presentation would be delivered without supporting engineering documentation. It appears that many young engineers do not understand the need for, or know how to prepare, formal engineering documents such as reports, white papers, or analyses."

On page 190, pdf file of the Final Report, Return to Flight Task Force posted at

http://www.returntoflight.org/

-- Edward Tufte