The US Patent and Trademark Office creates a really stupid interface

The US Patent and Trademark Office spawns another disaster: one of the worst interfaces ever designed.

Here's a critique from the Coronado Group of the new USPTO Data Visualization Center Patents Dashboard:

Can someone, perhaps an Inspector General, find out the cost and the contracting company that did this? And the USPTO employees responsible? Direct responsibility begins with USPTO Director David Kappos, who enthusiastically endorses the dashboard.

Dashboard design is nothing special and does not deserve some special category or money-pit visualization contractor. Ten years ago I posted this essay on this board dealing with executive decision support systems:


Edward Tufte

(1) See Peter Drucker's book, The Essential Drucker, for a thoughtful chapter on "the information executives need today." That is, you should start by considering the intellectual problems that the displays are supposed to help with. The point of information displays is to assist thinking; therefore, ask first of all: What are the thinking tasks that the displays are supposed to help with?

(2) It is essential to build systematic checks of data quality into the display and analysis system. For example, good checks of the data on revenue recognition must be made, given the strong incentives for premature recognition. Beware, in management data, of what statisticians call "sampling to please"--selecting, sorting, fudging, choosing data so as to please management. Sampling to please occurs, for example, when the outflow from a polluting factory into the Hudson River is measured by dipping the sampling test-tube into the cleaner rather than the dirtier effluent.
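The effluent example can be turned into an automated check. Here is a minimal sketch in Python (the function name, the readings, and the above-median rule are my own illustration, not from the post): it flags data that was sampled only from the favorable end of the distribution.

```python
import statistics

def check_sampling_coverage(all_locations, sampled_locations):
    """Flag possible 'sampling to please': samples drawn only from
    favorable spots.

    all_locations: dict mapping each sampling point to its typical reading
    sampled_locations: list of points actually sampled for the report
    Returns True if at least some samples come from the dirty (above-median)
    end of the distribution, False if every sample is from the clean end.
    """
    median = statistics.median(all_locations.values())
    sampled = [all_locations[loc] for loc in sampled_locations]
    return any(reading > median for reading in sampled)

# Hypothetical effluent readings by position along the outflow pipe.
points = {"clean_edge": 2.1, "mid": 5.0, "dirty_center": 9.8}
print(check_sampling_coverage(points, ["clean_edge"]))                  # False: suspicious
print(check_sampling_coverage(points, ["clean_edge", "dirty_center"]))  # True: covers both ends
```

The same pattern applies to revenue-recognition data: verify that the records being summarized were drawn from the whole population, not from a convenient subset.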

(3) For information displays for management, avoid heavy-breathing metaphors such as the mission control center, the strategic air command, the cockpit, the dashboard, or Star Trek. As Peter Drucker once said, good management is boring. If you want excitement, don't go to a good management information system.

Simple designs showing high-resolution, well-labelled information in tables and graphics will do just fine. One model might be the medical interface in Visual Explanations (pages 110-111) and the articles by Seth Powsner and me cited there. A model for tables might be the supertable, shown in The Visual Display of Quantitative Information, p. 179. More generally, see chapter 9 of The Visual Display of Quantitative Information. The displays should often be accompanied by annotation, details from the field, and other supplements.

Sparklines show high-resolution data and also work to reduce the recency bias prevalent in data analysis and decision-making. Sparklines are ideal for executive decision support systems. See my threads on sparklines:

and on the implementation of sparklines
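A text-only sparkline can be sketched in a few lines of Python. This is my own illustration, not an implementation from the threads above; it scales a numeric series onto the eight Unicode block characters to produce a word-sized graphic.

```python
BARS = "▁▂▃▄▅▆▇█"  # Unicode block elements, shortest to tallest

def sparkline(values):
    """Render a numeric series as a word-sized bar graphic."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid division by zero for a flat series
    return "".join(
        BARS[round((v - lo) / span * (len(BARS) - 1))] for v in values
    )

# A series of eight readings over time:
print(sparkline([1, 2, 4, 8, 5, 3, 2, 6]))  # → ▁▂▄█▅▃▂▆
```

Because the whole history of a measure sits in a single word-sized graphic, a row of sparklines lets a reader compare the full trajectory of many indicators at once, rather than only the most recent values.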

(4) For understanding a process and for designing a display for understanding a process, a good way to learn about what is going on is to watch the actual data collection involved in describing the process. Watch the observations being made and recorded; chances are you will learn a lot about the meaning and quality of the numbers and about the actual process itself. Talk to the people making the actual measurements; maybe you'll learn something.

(5) Measurement itself (and the apparent review of the numbers) can govern a process. For example, in printing my books, I ask that, during the press run, the density of the black ink be measured in 6 or 8 different positions on every 3000th sheet printed. These pulled sheets are then inspected shortly after the run and before the next run. The idea is to try to ensure that the color of the black type is uniform and at the right level of blackness in 3 ways: (1) across the 8 pages printed up on each sheet of paper, called a "form", (2) over the 40,000 sheets printed of that form, and (3) over the many forms making up the entire book. We sometimes review these pulled sheets the next day to check these density readings and to yell at the printer if there is a problem. But mainly the mere fact that the printers are making these measurements keeps the process in control, as does the fact that someone might review the measurements.
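The press-run check above amounts to a simple tolerance test on each pulled sheet. A minimal sketch in Python (the target density and tolerance are made-up numbers for illustration, not actual print specifications):

```python
def density_in_control(readings, target=1.8, tolerance=0.1):
    """Check ink-density readings pulled from one sheet.

    readings: densities measured at several positions on the sheet
    Returns (ok, out_of_range), where out_of_range lists any readings
    that drift beyond the tolerance band around the target.
    """
    out_of_range = [r for r in readings if abs(r - target) > tolerance]
    return (not out_of_range, out_of_range)

# Eight positions on one pulled sheet, one density reading per position.
ok, bad = density_in_control([1.78, 1.82, 1.80, 1.79, 1.95, 1.81, 1.77, 1.83])
print(ok, bad)  # one position (1.95) drifts above tolerance
```

Run on every 3000th sheet, a check like this catches drift within a form, across a run, and across the forms of a whole book, which is exactly the three-way uniformity the passage describes.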

Note that this example is mainly just common sense in workaday action; no jargon about an Executive Decision Protocol Monitoring Support Dashboard System is needed. In fact, such jargon would be an impediment to thinking.

(6) My own encounter with a real business trying to improve management data and the display of that data was in consulting for Bose. At one point it appeared to me that too many resources were devoted to collecting data. It is worth thinking about why employees are filling out forms for management busybody bureaucrats rather than doing something real, useful, productive. The story of this work is told in Michael H. Martin, "The Man Who Makes Sense of Numbers," Fortune, October 27, 1997, pp. 273-276; and in James Surowiecki, "Sermon on the Mountain: How Edward Tufte led Bose out of the land of chartjunk," Metropolis, January 1999, pp. 44-46. Both accounts make me appear excessively heroic. These articles are posted in the NEW section at

(7) Most of all, the right evidence needs to be located, measured, and displayed. And different evidence might be needed next quarter or next year.

-- Edward Tufte, August 27, 2001

-- Edward Tufte

I filed a FOIA request for the contractor of this dashboard, as well as the costs and original bid: patents-dashboard/866/

Hopefully, learning how it came about can lead other agencies to make more informed decisions in the future. That request's page is automatically updated as the request is processed.

-- Michael Morisy (email)

I don't think a FOIA request is needed to determine the contractor. Some time spent on the Federal IT Dashboard site indicates that the patents dashboard was the first deliverable of a six-year, $146M project to reengineer US patents data processing. The $1.2M dashboard phase 1 had three vendors: DESIGN FOR CONTEXT LLC, APPLIED TECHNICAL SYSTEMS, INC. and KNOWLYSIS LLC. The 12-month contracts all had 10/26/2011 completion dates, which would seem to align with the September 7th announcement. The last vendor seems to have been extended an extra year into 2012.

I believe that the patents dashboard is going to be the least of our worries about this project. The "USPTO Patents End-to-End: Software Engineering" (PE2E-SE) project is the 2nd attempt to fix the patents backlog. The first attempt (Patent File Wrapper) was cancelled. The PE2E-SE project has a mission statement that is a 300+ word stream-of-consciousness of IT-speak. Phrases like agile engineering, cloud environment and high-value targets bounce around the pages of this project.

-- Tom K (email)