Analytical design and human factors
November 7, 2002 | J. D. McCubbin
Please elaborate on the differences between your analytical information presentation principles and the human factors design approach. I thought the latter was to present information in a way that was within human capabilities to perceive and process. If true, wouldn't that design imperative complement the content drives and cognitive needs which you stress underlie your universal principles? But I understood you to say your approach and human factors' were opposites in an answer at UCLA. Thank you.
Topics: 3-Star Threads, Science
Analytical Design
The purpose of analytical displays of information is to assist thinking about evidence. Consequently, in designing an analytical graphic, the first question should be: What are the evidence-thinking tasks that this display is supposed to serve?
Analytical graphics should be constructed to serve the fundamental cognitive tasks in reasoning about evidence: describing the data, making comparisons, understanding causality, and assessing the credibility of data and analysis. Thus the logic of design replicates the logic of analysis; design reasoning emulates evidence reasoning.
Converting principles of thinking into principles of design helps answer the most difficult question of all in the theory of analytical design: Where do principles of analytical design come from? The deep principles of analytical design are derived from cognitive tasks of analytical reasoning. This is appropriate, for the purpose of analytical displays is to assist evidence-thinking.
All this might have something to do with the field of human factors. But in practice, nearly all the great analytical designs have come from those possessed by the content: people who have learned something important and want to tell the world about what they have learned. That is, content-driven and thinking-driven, and not at all driven by bureaucratic externalities of marketing, human factors, commercial art, focus groups, or ISO standards.
In working on 4 books on analytical design, I have often turned to the human factors literature, and then left in despair, finding few examples or ideas (beyond the common-sensical) that were useful in my own work. This contrasts with the work of scientists, artists, art historians, and architects, whose work overflows with ideas about evidence, seeing, and the craft of making analytical displays.
I believe that work about analytical displays should be self-exemplifying; that is, the work should show us amazing displays of evidence. My despair about human factors began many years ago upon going through volumes and volumes of the journal Human Factors, where evidence was reported using statistical graphics of wretched quality, with thinner data and worse designs than even those in corporate annual reports. Also, the methodological quality of the research was poor, and so nothing was credible. The findings seemed entirely context-dependent, univariate (design and seeing are profoundly multivariate), and without scope: what did it matter if some students in freshman psychology in Iowa preferred one clunky font compared to another clunky font in an experiment conducted by a teaching assistant? Later, while consulting, I saw this naive dust-bowl empiricism fail again and again for nearly a decade in trying to design a competent PC OS interface. (And with the Mac interface sitting there, smiling, all the time. Apple's superb interface guidelines seemed to me a retrospective account of the beautiful hands-on craft of a few brilliant designers, not a reason to have experimental psychologists attempt to design OS/2 and Windows.)
At any rate, if this was the scientific practice and the design craft of applied psychology, I concluded the field did not have much to contribute to my own work on analytical design.
I happily fled to the classics of science, art, and architecture.
Here are some more thoughts (borrowed from another answer I gave on software design) on this issue.
This book by Ron Baecker and Aaron Marcus seems to me to be a notable contribution on how design arrangements might help programming: Baecker, R.M., Marcus, A., Human Factors and Typography for More Readable Programs, ACM Press, 1990.
I believe the interface should be designed FIRST, by people who deeply understand the specific content and the specific analytic tasks that the interface screens are supposed to help with. Screen after screen should be specified in intense detail by content experts, completely independently and without reference to how those screens might be created.
Only then do we turn to the technical implementation, which becomes simply a by-product of the interface screens and interface activities. The interface design and the content design should drive the entire development process. Thus the lead managers for development of a project management program, for example, would be people who actually manage projects and who teach courses in project management. Too often, the available software drives the design, rather than the content/analysis needs of the user.
There are a lot of software solutions around desperately looking for some kind of problem to solve; that is, inside-out design. But better tools for users will more likely be the product of outside-in design, which makes the content-substance and analytical tasks of the user the driving priority. Doing good outside-in design probably requires a thorough-going independence in specifying the interface; that is, the interface should be content-specified by people completely independent of the software development process. If not, the content-specification will be governed and distorted by the needs of the already-existing software.
Content-driven design requires a radical shift in power and control. The Vice-President for Programming reports to the Senior Vice President for Content!
An experiment-based argument for keeping graphical designs as simple and spare as possible:

Stefano Baldassi, Nicola Megna, and David C. Burr, "Visual Clutter Causes High-Magnitude Errors."
Ironically, and a bit humorously, there is substantial use of data-imprisoning grids and moiré effects in the paper's data graphics.
It is not just about spare design; it is also about content resolution. "Simple designs, complex information" is the short form.
If possible, design architectures should be spare, straightforward, conventional, and self-effacing; AND the evidence presented should be rich, complex, multivariate, causal. The idea is to maximize the viewer's content-reasoning time, and to minimize the design-figuring-out time and other impediments to content reasoning.
This theme runs through the first 3 books, with increasing elaborations over time: the "data-ink ratio" in The Visual Display of Quantitative Information (1983, 2001); micro/macro designs, and layering and separation, in Envisioning Information (1990); and the "smallest effective difference" in Visual Explanations (1997). Models include good maps, aerial photographs, and the day-to-day practices of scientific research that involve a flood of data.
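As a toy illustration of the data-ink ratio (data-ink divided by the total ink used to print the graphic): the element names and ink weights below are invented for a hypothetical bar chart, not drawn from VDQI, purely to make the arithmetic concrete.

```python
def data_ink_ratio(elements):
    """Compute data-ink / total ink.

    elements: dict mapping element name -> (ink_units, is_data),
    where ink_units is an invented weight for how much ink the
    element consumes, and is_data marks ink that carries data.
    """
    total = sum(ink for ink, _ in elements.values())
    data = sum(ink for ink, is_data in elements.values() if is_data)
    return data / total

# Hypothetical chart before erasing non-data-ink:
before = {
    "bars":       (10, True),   # the data itself
    "axis lines": (2,  False),
    "grid":       (6,  False),  # heavy data-imprisoning grid
    "background": (4,  False),  # shaded panel: chartjunk
}

# Same chart after erasing the grid and background:
after = {
    "bars":       (10, True),
    "axis lines": (2,  False),
}

print(round(data_ink_ratio(before), 2))  # 0.45
print(round(data_ink_ratio(after), 2))   # 0.83
```

Erasing the grid and shaded background raises the ratio from 0.45 to 0.83 without touching the data, which is the whole point: the same information, more of the ink doing analytical work.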
The study above seems to deal with only a small part of the approach of VDQI: the "non-data-ink" side. It's more complicated than that.
As mentioned in an earlier answer, I have a deep skepticism of such experiments, no matter whether the results favor or disfavor my theories. One source of my skepticism is the inability of human factors researchers to produce decent statistical displays for their own research. Another is the barefoot empiricism and the naivete of such research, which sometimes appears to be a mechanistic parody of work in behavioral science.
I do however greatly admire Bill Cleveland’s work on graphical perception. His work is closely tied to assisting the analytical tasks of serious researchers when they look at their data. To my knowledge, his work is the most analytically aware of any of the experimental work.
Jonathan Ive is Apple's lead designer for the iMac, iPod, and iPhone.
From Rory Cellan-Jones, BBC dot.life, “Listening to Mr. iPhone:”
Sacha Greif has an excellent essay on current digital design trends. Whether or not you agree with his conclusions, he presents 20 years of evidence thoughtfully and well.
http://sachagreif.com/flat-pixels/