Analog gauges and the user interface

I attended the course in Boston yesterday and enjoyed it very much. It made me think of the following story, which seems related to the overall theme and might spur some discussion or comments here.

In 1985 I attended an OOPSLA (Object-Oriented Programming, Systems, Languages, and Applications) conference. Alan Kay (PARC/Smalltalk/Apple/Macintosh/...) gave a presentation. Alan told the following true story:

He once flew down to Mexico on vacation, to some lonely place on the California peninsula for surfing and the like. A pilot was supposed to come in a week to pick him up at a rural landing strip. Alan got there on time, waited, and eventually the plane, an older DC-3, came. When Alan entered the plane he noticed that almost all the instruments had been unscrewed from the panels, pulled out, and twisted around into various positions, so that they were basically standing (or waving) on their cable hoses like flowers on their stems. He got worried and considered exiting the plane, but decided to stay. The pilot, a younger fellow, seemed trustworthy.

When the plane had reached cruising altitude and speed, Alan suddenly "got it" with regard to the instruments. As long as everything was operating correctly, all the needles on the instruments were pointing in the same direction! It was very easy to spot whether anything out of the ordinary was going on, and what it might be.

This story has stuck with me as a superb example of adapting the technology to what we people are good at, as opposed to the other way around, which is too often the case.
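The pattern in the story — rotate each gauge so its nominal reading points straight up, then scan for any needle that deviates — can be stated in a few lines of code. A minimal sketch; the gauge names, nominal values, ranges, and the 90-degree sweep are illustrative assumptions, not details from the story:

```python
def needle_angle(value, nominal, span, sweep=90.0):
    """Needle angle in degrees for a gauge whose mount has been
    rotated so that the nominal reading points straight up (0 deg).
    `sweep` is the assumed deflection for a full-span deviation."""
    return (value - nominal) / span * sweep

# Hypothetical panel: (current value, nominal value, full span)
panel = {
    "oil_pressure": (60.0, 60.0, 100.0),
    "oil_temp":     (90.0, 90.0, 150.0),
    "rpm":          (2200.0, 2200.0, 3000.0),
}

angles = {name: needle_angle(v, n, s) for name, (v, n, s) in panel.items()}
# Anything not pointing straight up stands out immediately:
anomalies = [name for name, a in angles.items() if abs(a) > 5.0]
```

The point of the rotation is that the anomaly check becomes preattentive: instead of reading each dial against its own scale, the eye only has to spot the one needle breaking the common direction.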



-- Harald Skardal (email)

Response to Alan Kay and User Interfaces

Alan replied to this story on the Squeak list, but I will, through the magic of cut-and-paste, reproduce it here...

Hi Folks --

I like the designation of this as a "true story". Actually, it was one of the main jokes at PARC about UI and we were pretty sure that it was only partly true. We heard the story from elsewhere and I have always told this joke as happening to someone else not to me. (It was actually a flight from Mexico City to the Yucatan peninsula, etc.) So here are some interesting restructured memories in the retelling. However, it's wonderful that the gist and the point of the joke were perfectly remembered. This is why allegories, etc., were heavily used before writing and printing. The stories always get changed, but usually the gist remains intact.



-- David A. Smith (email)

Response to Alan Kay and User Interfaces

For a look at some of the things that we (Alan, David Reed, Andreas Raab, and I) are working on, check out these papers: <>. Another paper with some more recent work on UI is at: <>


-- David A. Smith (email)

Response to Alan Kay and User Interfaces

At Lime Rock, the dashboards of some Porsche racing cars had their street gauges rotated so that under optimal conditions while racing the pointers were all straight up. A driver explained that it was easier to see anomalies at a glance. I have some pictures around of the altered Porsche dashboard.

Once in a 2-day Porsche driving school at Road Atlanta, I drove a 911 around the track but found that my mind was so concentrated by the view through the windshield at 125 mph, and by hoping that I hadn't missed a gear, that I was unable to perform a careful human-factors assessment of gauge readings. I vaguely recall that the stability management system that corrected cars in trouble somehow revealed its operation (maybe by a light on the dashboard or by a change in the feel of the car), although I was more interested in other matters at those points in time and space. I concluded that part of the value of the stability management system was in warning the driver that the computer thought the car was in trouble; the remaining value was that the system actually did something by applying differential braking to one wheel.

-- Edward Tufte

Response to Alan Kay and User Interfaces

The Diamond Reo line of trucks (REO stood for Ransom E. Olds, the founder of Oldsmobile) did something similar to this in the 1940s. When operating correctly, all the gauges pointed to 3 o'clock. As they were lined up side-by-side on the dashboard, the pointers conformed to a single straight line.

-- Stephen Fleming (email)

The "Nuclear Navy" has several examples of system design, although it might not be called that. After digital equipment became reliable enough for use on submarines (i.e., it can handle being depth-charged), system designers found that people have an easier time reading the current temperature, pressure, fluid level, etc. from a digital meter showing the number on a 7-segment display, because, among other things, you don't have to interpolate between increments. But in a situation where parameters are changing rapidly, it's easier to get a sense of the severity of the situation from an analog display, by looking at how fast the needle is dropping or rising. So nuclear-powered submarines use a lot of "Standard Digital Meters" - digital meters with the current parameter value on a 7-segment display in the middle, along with a semicircular arc of LEDs around the edge of the meter face. As a parameter rises and falls, additional LEDs on this circular bar graph light up or go out (like the individual segments on a visualizer for a stereo equalizer). This way you can quickly check a current steady-state value, and also quickly get a sense of how fast fluid level or pressure or temperature is dropping (such as during a system fluid leak).

Plenty of other design examples in submarines. In general, indicating lights for valves use a green "O" shape to indicate that they're open. Green is "good", at least for isolation valves in a coolant system. Amber bars are used to indicate when a valve is fully shut. On a submarine's Ship's Control Panel (SCP), there are indicators for most of the major valves that could let seawater into the submarine. So just prior to diving the submarine, you check for a "straight board" - making sure there's a completely unbroken row of valve position indicators that all show shut.
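The "straight board" is, in effect, a single all-or-nothing predicate over the hull openings, evaluated by eye as one unbroken row of shut indicators. A minimal sketch with made-up valve names:

```python
# Hypothetical indicator states; the valve names are illustrative only.
hull_openings = {
    "main_induction":  "shut",
    "snorkel_exhaust": "shut",
    "trash_disposal":  "shut",
}

def straight_board(indicators):
    """True only when every hull-opening indicator shows shut.
    The visual version lets the eye evaluate this predicate in a
    single glance: any break in the row means 'do not dive'."""
    return all(state == "shut" for state in indicators.values())
```

The design choice mirrors the DC-3 story: encode "everything is safe" as a single uniform visual condition, so the failure case is the one that draws attention.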

Warning lights are generally yellow or amber, alarm lights are red, and usually are associated with a siren. Green lights give info that doesn't generally indicate a problem. (Although you do have exceptions, like blue lights in odd places.)

Even on certain valves that are opened and shut by rotating the handle 90 degrees, the handle is usually set up so that when the valve is open, the handle is parallel to the piping the valve is connected to. When the valve is shut, the handle is perpendicular to the direction of piping.

Every little bit of time saved helps during flooding, fires, nuclear reactor coolant leaks, steam piping ruptures, yadda yadda yadda.

-- Adam Jenkins (email)
