Why producing good software is difficult



Charles Mann, "Why software is so bad"

Charles Mann, an excellent science reporter, has written a long article, "Why Software is So Bad," in the MIT Technology Review, June 30, 2002.

http://www.technologyreview.com/Infotech/12887/

Mann points out, for example: "As Microsoft's online Knowledge Base blandly explained, the special backup floppy disks created by Windows XP Home 'do not work with Windows XP Home.'"

-- Edward Tufte


Response to Why Software is So Bad

What a depressing article, but accurate nonetheless. And at the heart of the decrepit state of current commonly-used software lies (yet another) data display problem.

Code gets longer over time, as more and more people edit the same sections to enhance functionality (or fix bugs). In many companies, two different programmers may work on the same section at the same time. Good communication between coders is essential to turning out a high-quality product. Programmers have to know not to obliterate changes others have made without good reason and clear warning. Programmers generally are loath to erase code from a section for fear of breaking something else.

As a result, this year's programmers frequently must wade through thousands upon thousands of lines of code, each one representing another step in an increasingly complex algorithm. Computer screens can't present more than a small fraction of that code at once; about 65 lines is my maximum, even with a high-resolution screen. Thus, most of the code lies out of reach of the eye -- out of sight, out of mind.

Now imagine a team of developers, about two dozen, all working on a project containing nearly 1 million lines of code. The standard line-by-line representation of the code just isn't good enough to let everyone understand the scope of impact of any single change without extraordinarily meticulous planning and control, which cause costly delays in the development process. Small wonder that software companies, fighting for their existence in an increasingly feature-guzzling environment, find themselves releasing software built to ever-decreasing standards.
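To put rough numbers on the display problem, using the figures in this post (the counts are of course only illustrative):

    \frac{1{,}000{,}000\ \text{lines}}{65\ \text{lines per screen}} \approx 15{,}000\ \text{screenfuls}

At any moment a programmer can see well under 0.01% of the project; everything else is out of sight.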

-- Scott Zetlan (email)


Response to Why Software is So Bad

Societies have invested more than a trillion dollars in software and have grotesquely enriched minimally competent software producers whose marketing skills far exceed their programming skills. Despite this enormous long-run investment in software, economists were unable to detect overall gains in economic productivity from information technology until perhaps the mid-1990s or later; the economist Robert Solow once remarked that computers showed up everywhere except in productivity statistics.

Quality may sometimes be the happy by-product of competition. The lack of competition for the PC operating system and key applications has reduced the quality and the possibilities for the user interface. There is no need on our interface for a visible OS, visible applications, or for turning the OS and browsers and e-mail programs into marketing experiences. None of this stuff appeared on the original graphical user interface designed by Xerox PARC. That interface consisted almost entirely of documents--which are, after all, what users care about. Vigorous competition might well have led to distinctly better PC interfaces--without computer administrative debris, without operating system imperialism, without unwanted marketing experiences--compared to what we have now on Windows and Mac.

Today nearly all PC software "competition" is merely between the old release and the new release of the same damn product. It is hard to imagine a more perversely sluggish incentive system for quality. Indeed, under such a system, the optimal economic strategy for market leaders may well be the production and distribution of buggy software, for the real money is in the updates and later releases.

One of Philip Greenspun's points in his introductory programming course at MIT is that the one-semester course can enable students to create the programming equivalent of the amazon, eBay, or photo.net websites. So why is it so hard to get it right the first time? Or at least by the Release 8.06th time? See Software Engineering for Internet Applications (MIT 6.171) at http://philip.greenspun.com/teaching/one-term-web

-- Edward Tufte


Response to Why Software is So Bad

I'm not entirely convinced that strong competition breeds good software. As an example, look at Microsoft's Internet Explorer vs. AOL's Netscape browser: as competition increased in the mid-90s, the quality of both browsers steadily declined. Bug reports became more frequent, security vulnerabilities increased, and the speed and accuracy of rendering both declined substantially. All the while, resource requirements (CPU, memory, hard disk space) increased with every new release.

Maybe software decay is more closely related to market growth: the faster the market grows, the faster the software goes downhill. Contrast the browser example with a more stable market -- graphic design/desktop publishing, where QuarkXPress vs. Adobe PageMaker (it now has a new name) bred increasingly good features and greater ease of use. Both programs eventually included a control panel where many attributes of any item could be changed without requiring a separate window.

-- Scott Zetlan (email)


Response to Why Software is So Bad

Scott,

I'm not sure I agree with your analysis. I think the browser wars were an anomaly; it all happened within a few short years; Netscape as a company went through many huge changes; Internet Explorer became one of Microsoft's best products at the time.

Let me offer a counter-example: on-line stores. What is now Yahoo! Stores first began as a small start-up. Competition propelled them to improve their product relentlessly, and they specifically used sophisticated tools to do so. The end result was wonderful, and intense competition was a major factor. One of the founders gave a good talk on the subject once, and you can find a transcript here:

http://www.paulgraham.com/paulgraham/avg.html

The article is really about programming languages, but it is somewhat related to this discussion.

As a side note, Adobe now makes InDesign, which is not a replacement for PageMaker but is often mistaken for one. Its target market is graphic design professionals, and the forums I frequent are full of frustrated ex-Quark users who love InDesign. I love it, though I never used Quark. In particular, its typographic capabilities are substantially better than those of any other program out there.

-- David Person (email)


Response to Why Software is So Bad

I see parallels between the Microsoft examples and the Adobe InDesign and Quark comparisons.

I'm a long-time Quark user and my design partner also uses Quark exclusively. I own InDesign (version one) but never really used it, simply because for film/digital file output Quark is still the known quantity at the vendor end. In contrast, many magazines are now requesting only PDFs for advertising files, so it won't matter which program is used to do the layout.

But the point is that many programs I have used over the years were better than the ultimate 'winner' in the category. And the winner was the one that dominated through market penetration gained by business strategy, not by product superiority.

I liked MacWrite better than MS Word because it was simple and didn't try to be more than a word processing program. I liked Aldus Persuasion better than MS PPT years ago. And I might like InDesign better once I sit down and use it (but Quark still dominates service vendors in my mind today).

Lastly, Beta was always better than VHS. I think software is bad because it can be and still become the standard.

-- Alison Fraser (email)


Response to Why Software is So Bad

At the beginning of this thread I mentioned Robert Solow's famous remark that computers show up everywhere, except in the productivity statistics. Solow comments on this, and on economic matters, in a recent interview at http://minneapolisfed.org/pubs/region/02-09/solow.cfm. As is to be expected from Solow: bright, direct, funny.

-- Edward Tufte


Response to Why Software is So Bad

I'm a bit late responding, but you can find Mann's article "Why Software is so bad" at http://www.thesmallworlds.com/press/WhySoftwareIsSoBad.pdf. I attended Tufte's fascinating talk in Westwood last night and was browsing the site today and came across this thread.

-- Bill Sharpe (email)


Response to Why Software is So Bad

1. The article on "Why Software is So Bad" contains at least one error (if it were software, we would say it is "buggy"). Mann says that the Ariane failure was due to a buffer overflow. In fact it was due to an arithmetic overflow. The overflow occurred in a piece of code that was no longer used, but was left in the system on the theory that "if it ain't broke, don't fix it". (A code sketch of this failure mode follows these three points.)

2. Technology has drastically improved productivity in certain areas.

For example, I have access to much better information about investing than I used to have. Ordering books works much better than it used to. I am no longer at the mercy of the local bookshop to order books. The number of people required to process banking transactions has declined enormously. The cost of share trading is down 80%+. Banking services are available 24 hours a day via electronic channels. Etc.

3. In economics there is a principle that something that customers cannot assess will not be produced or will be low quality. In the case of software people can easily assess features, but not quality or security. So we get feature rich but buggy and insecure software. Economics is not as useless as a lot of people think, although it is ridden with unrealistic models and feeble mathematics.
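To make the distinction in point 1 concrete, here is a minimal C sketch of an arithmetic (narrowing-conversion) overflow of the Ariane kind. The original flight code was Ada, and the variable name and value below are invented for illustration; the point is that no buffer is overrun -- a number simply fails to fit in the type it is stored into.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* Invented reading: a 64-bit floating-point value far larger than
           the old flight profile ever produced. */
        double horizontal_bias = 100000.0;

        /* Narrowing conversion to a 16-bit signed integer (maximum 32767).
           The value does not fit, so in C the behavior is undefined; this is
           an arithmetic overflow, not a buffer overflow -- no memory outside
           any array is touched. */
        int16_t stored = (int16_t)horizontal_bias;

        printf("original %.1f, stored as %d\n", horizontal_bias, (int)stored);
        return 0;
    }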

Tim

-- Tim Josling (email)


IT productivity: Show me the numbers

For those interested in the statistics and economics of IT expenditure and productivity see: www.strassmann.com/

-- Andrew Leonard (email)


IT productivity: Show me the numbers

Never had any trouble with this link: http://www.strassmann.com/

Paul Strassmann's central thesis is that a general correlation between IT expenditure and higher corporate profits does not exist.

Links to Strassmann's books on this page: http://www.infoeconomics.com/

-- Andrew Leonard (email)


Response to Why Software is So Bad

Some very interesting discussions at Strassmann's site; for example, the Harvard Business Review sequence on IT.

-- Edward Tufte


Response to Why Software is So Bad

Tim Josling stated, "In economics there is a principle that something that customers cannot assess will not be produced or will be low quality. In the case of software, people can easily assess features, but not quality or security. So we get feature rich but buggy and insecure software."

I'd like to suggest a comparison between the "free software" built by the GNU/Linux community and the software built by proprietary software vendors such as Microsoft. While "free software" is not perfect, I find it to be much more reliable than proprietary software. On those occasions when a problem does arise, the end user can often get directly in touch with the author, who will take it as a point of pride to fix it as soon as possible.

For those who don't know, GNU and Linux are "free software", i.e., they're licensed under something called the General Public License (GPL) - a.k.a. "copyleft". This licensing concept was invented by Richard Stallman, the founder of the GNU project, who explains that the "free" in "free software" means "free as in freedom", not "free as in free beer". In other words, someone can charge money for "free software", but their software must come with the freedom to examine how it works, and to change how it works if needed.

Source code is the version of a program in the language the programmer wrote it in. Source code to proprietary systems is kept secret by the company that develops it. Thus, as noted by Josling above, customers cannot assess quality or security. And just as in government, when things are done behind closed doors, bad results are common. This is usually either because the company considers the code "good enough" to ship even when it's known to have flaws, or because the company's interests are at odds with the customer's interests and the company takes advantage of the secrecy of the source code to hide some feature that the customer would never put up with if they knew it was there.

This is how "spyware" manages to exist. For those who don't know, "spyware" is code which does some task for the user but simultaneously, and silently, reports information back to the vendor. That information can be relatively innocuous - e.g., how often the user actually uses that application or some competitor's application - or it can be patently offensive - e.g., searching the user's disk for personal information (credit card nos., financial records, tax records, whatever). If you want a real eye-opener, read the Microsoft Windows XP EULA (End User License Agreement) sometime and discover just how much of your right to privacy you've given away simply by installing the operating system!

In contrast to proprietary software vendors, software developed by the GNU project and the Linux community is, by definition, licensed under the GPL which requires that anyone who receives a GPL'd program must be able to get the source code and have the right to modify and distribute the source code. The programmers working in this development model are writing code because they love programming, not because the company they work for is trying to rush it out the door to make as much money as possible from it.

Because the source code is available for all to inspect, and the programmers working on it care about their reputation within the community of programmers, they'd be personally embarrassed to release source code of the quality that is regularly released by companies that write proprietary software.

Furthermore, even those who are not programmers get a significant benefit from the source code's availability. Besides the obvious fact that anyone who chooses to improve it can do so, there's also the openness of the development process. If someone were to try to hide "spyware" in GPL'd code, it would be quickly discovered by those in the programming community. So the likelihood of "free software" containing "spyware" becomes negligible - a great benefit to non-programmers and programmers alike.

So let me suggest that one possible answer to Prof. Tufte's question, "Why is software so bad?" is that proprietary software companies are allowed to keep secrets.

Mark Rosenthal
<mbr@arlsoft.com>
Arlington Software Enterprises
(Linux Systems Consulting)

-- Mark Rosenthal (email)


Response to Why Software is So Bad

Thanks for the insights.

Having muddled through engineering school in the days of punched-card programming, I've long been baffled by the latitude allowed software companies.

If the maker of a mechanical product were to publicly announce, "our NEXT version will not-work LESS than THIS one!"--they would not only be ridiculed, they would be sued. -And they would be expected to replace the "bad" with the "less bad".

For free.

'Splain to me, if you would: why do software folk get to operate by a different set of rules?

Also: the early, admittedly-buggy releases are called "Beta" releases. Shouldn't they be the "VHS" releases?

Cheers, Bill Leebens.

-- Bill Leebens (email)


Response to Why Software is So Bad

Tim Josling writes: <snip> For example, I have access to much better information about investing than I used to have. Ordering books works much better than it used to. I am no longer at the mercy of the local bookshop to order books. The number of people required to process banking transactions has declined enormously. The cost of share trading is down 80%+. Banking services are available 24 hours a day via electronic channels. Etc. </snip>

While I agree that access to information in the networked age is a huge benefit to many of us, it's hard to tie it back to productivity. I see plenty of people in my workplace (at a large university) for whom the computer is just an expensive and unreliable typewriter.

I think the poster who commented at length on the virtue of "free" software vs proprietary has a point, but I would extend it further. One thing that would open up real competition is open file formats, allowing people to move between competing applications with minimal switching costs. As it stands now, the application's publisher "owns" your files since the only way you can access them is through their application.

I think eliminating or lowering switching costs between similar applications, and returning ownership of data to the person who created it, will go a long way toward improving software through market forces: less chrome and more reliability.

We saw movement away from openness in web content during the browser wars, also alluded to in this thread, as browser makers tried to capture the imagination of web designers with more and more features (anyone remember the bgsound tag that let a web designer play music as the page was displayed? Think it made the latest round of standards?). The smoke has cleared and we're seeing less of that and more inclusion.

It would be refreshing if computers and software, rather than the people who use them, were seen as the commodities.

-- paul beard (email)


Response to Why Software is So Bad

In "The trouble with computers" Thomas Landauer goes into the productivity question in some detail. His hypothesis is that abysmal usability is the main reason software hasn't made us more productive, and he gives many good reasons.

Why do so many people believe that computers have improved productivity when the statistics show they have not?

The illusion perhaps comes from visible benefits and invisible costs.

This point is illustrated by a study of electronic point-of-sale systems in a supermarket, described in Landauer's book. Barcode scanning was introduced to a supermarket, and management in the supermarket thought that, as a result, productivity had increased. They even had people to spare to help shoppers pack their bags.

The truth? The time saved at tills was compensated for by expenditure on support, maintenance, and time wasted when the machines went wrong.

The people who helped shoppers pack their bags were, in fact, extra employees hired at around the same time as the scanning was introduced.

You can see the same mechanism quite often. In my last job management thought it would be a good idea to have a "skills database" to help match people to projects. They were thinking of the ease with which their matching work could be done, but not thinking of the huge effort it would require to set up, populate, AND MAINTAIN such a database with current information.

Their enthusiasm was not dampened by the fact that every previous attempt to create a database of skills had fizzled out due to failure to maintain current information.

-- Matthew Leitch (email)


Response to Why Software is So Bad

Regarding the above mention of fixes and updates, an issue that has been ignored is that these are in many ways the last carrot software producers have to induce consumers to buy software at market price.

With the proliferation of digital copying devices (and hackers who enjoy breaking serial-key algorithms and putting software onto peer-to-peer networks), much software can be obtained for free. One of the tied-in services that producers can offer, and that they have much more control over, is updates and patches. In order to get these, you must register and usually interact with the company. While this is both a stick (e.g., pirated Windows XP freezing up when it tries to update) and a carrot, many people who find it necessary to use software will gladly pay for the assurance that their work and files are protected from unfortunate incidents.

If producers shipped a perfect product, what incentive would consumers have to buy legal copies, when illegal ones are free?

Obviously, there are reasons, but still something to bear in mind.

-- Daniel Egan (email)


Response to Why Software is So Bad

Small anecdote (sorry, I don't have references and can't confirm) ...

In the "motivation" portion of a project management training class recently, I heard of a software development company whose director tried to improve the quality of their products by offering a $50 reward for every bug found by someone in the QA department, and $50 for every bug fixed by someone in software engineering. What he created was a cottage industry within his company whereby the engineers would knowingly put out code with bugs so that QA could find it and engineering could fix it.

-- Bruce Hensley (email)


Response to Why Software is So Bad

For an extraordinary departure from the normal software information design (also known as user-interface design), see http://www.mackie.com/products/tracktion/index.html. I am not a music professional by practice...only a part-time hobbyist. In a software genre that is arguably the worst offender in "computer debris", this application is stunning. Much music-sequencing software is littered with screens that attempt to replicate on-screen, some quite faithfully, the appearance of the physical machine that the software replaces. This includes "dials" that are difficult to "turn" with a mouse. The result is always disaster for the user...despite the fact that the software in question is "industry standard".

Please note I have no interest, financial or otherwise, in this software application. I am merely a user, who loves software and has seen thousands of applications in my software development career. Before Dr. Tufte's site closes to new posts I thought some software information design specialists might find this an interesting application to study.

Ken Florian

-- Ken Florian (email)


Response to Why Software is So Bad

We need software as a commodity. Creating software is an art. Therein lies the problem.

Programming is an intensely creative process. Not only the software design -- I mean the actual coding itself. There's frequently no clearly proper way to do something -- finding the right way to do it requires creativity. It requires a person who's something of an artist. There simply aren't enough artists around to make consistently good software.

Until we figure out how to make programming a non-creative task -- make it like brick-laying or prefab construction -- we'll be stuck with bad software. The guys who are creative enough to do great work end up becoming designers, leaving legions of less capable individuals to do the construction.

Could the pyramids have been built if all the laborers needed to be architects too? Until we figure out how to turn programmers into laborers -- how to commoditize them -- I think we'll be stuck with bad software.

-- James Acres (email)


Response to Why Software is So Bad

Philip Greenspun, in his Philip and Alex's Guide to Web Publishing, successfully argues the point that the information industry is managed by those who are essentially ignorant of programming practice. Taking Mr. Acres's contribution to heart, what is really needed is to reform the perception of what is good and what is bad. Craftsmanship is irrelevant.

-- Ed Mikula (email)


Response to Why Software is So Bad

Why is Software So Bad? Why is Art So Bad? Why are Books so poorly written? Why is Music so awful?

These are curious questions. I enjoy so much good music, I read great books, I see transcendent art, and I marvel at your words from an unknown distance thanks to software from many, many dedicated programmers.

And if the software that brought your words to me this evening never appears to increase productivity, that's just fine with me. I enjoyed them just the same.

-- John J. Barton (email)


Response to Why Software is So Bad

We could blame both consumers and producers. Many customers who complain about poor software quality are also the ones demanding many features they will barely use (or at least, that most of their users won't ever touch). Rarely have I seen a customer put quality on their list of purchase criteria, but they will have long laundry lists of interfaces, import facilities, security protocols to integrate with, etc. Quality is typically measured in proofs of concept prior to purchase, and customers typically won't throw out a vendor unless the level of bugs prevents the planned project from being delivered.

Likewise, when we design GUI-type programs, we often begin with a certain scenario in mind for the new task to address. Users often come along and attempt to apply the program to a related but unplanned-for scenario (I bought a bus, but why can't I race it?) and then wonder why they hit bugs.

Finally, studies have been performed on even simple applications and have found how quickly bugs creep into simple software packages. Even dedicated senior programmers have a typical rate of 2-4 bugs per 1,000 lines of code, so a million-line program is typically developed with 2,000-4,000 defects (I have worked for multiple major vendors, not just SAS). Some software vendors are much more willing to release software with minimal testing and others will allow much more time prior to release; still, testing can only catch so many of the issues (maybe 95% at a good company), so you are still stuck with many defects in a complex program.
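A back-of-envelope formula for the arithmetic above: with an injection rate r of defects per thousand lines, a program of L thousand lines, and a fraction c of defects caught before release, the shipped product carries roughly

    D_{\text{shipped}} \approx r \cdot L \cdot (1 - c), \qquad \text{e.g. } 3 \cdot 1000 \cdot (1 - 0.95) = 150,

so even a good testing organization ships a million-line program with defects numbering in the low hundreds.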

Bottom line: software is one of the least purpose-specific products on the market (compare with a car, screwdriver, skateboard, bicycle, etc.) and, as a result, will continue to be very buggy for some time to come. Think of very specific-purpose systems like TiVo and you will see software of much higher quality, due to limited usage patterns. Use a skateboard as a hammer or a car as a bulldozer and watch how long they meet your need.

-- Stephen McDaniel @ SAS (email)


Response to Why Software is So Bad

Stephen McDaniel provides an excellent summary of why software is bad. However, this is changing, and a lot of software today is becoming quite good.

Google Search, Flickr, Delicious, RSS/XML feeds, Google Maps, and other applications and platforms are becoming, or already are, good.

They focus on solving individual problems. With APIs or standard protocols, today's software enables third parties to extend it beyond the creator's ability, imagination, or energy.

Another change is taking place in software development: test-driven development. Tests are written before the code they exercise. This has three big advantages: bugs are reduced, better development documentation is produced (via the tests), and feature creep is possibly stymied.

The latter advantage is more theoretical than applied. However, a truly test-driven application requires scenarios that exercise its various features. The developers need to design and write such scenarios for each new feature. Since good developers are 'lazy', they will ask: is this necessary? Does this feature improve the software? Why?
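A minimal sketch of the test-first idea in C. The function parse_price(), its contract (price in cents, or -1 for rejected input), and the test values are all invented for this illustration; the assertions were written before the implementation, and the implementation exists only to make them pass.

    #include <assert.h>
    #include <ctype.h>
    #include <stdio.h>

    /* Hypothetical feature: parse a price string such as "1.99" into cents. */
    static int parse_price(const char *s)
    {
        int dollars = 0, i = 0;
        if (s == NULL || !isdigit((unsigned char)s[0]))
            return -1;
        for (; isdigit((unsigned char)s[i]); i++)
            dollars = dollars * 10 + (s[i] - '0');
        if (s[i] == '\0')
            return dollars * 100;
        if (s[i] != '.' || !isdigit((unsigned char)s[i + 1]) ||
            !isdigit((unsigned char)s[i + 2]) || s[i + 3] != '\0')
            return -1;
        return dollars * 100 + (s[i + 1] - '0') * 10 + (s[i + 2] - '0');
    }

    /* These scenarios were written first; they document what the feature
       must do, and they fail until parse_price() actually does it. */
    static void test_parse_price(void)
    {
        assert(parse_price("1.99") == 199);  /* ordinary case            */
        assert(parse_price("0")    == 0);    /* boundary: zero, no cents */
        assert(parse_price("abc")  == -1);   /* garbage is rejected      */
        assert(parse_price("1.9")  == -1);   /* malformed cents rejected */
    }

    int main(void)
    {
        test_parse_price();
        puts("all tests pass");
        return 0;
    }

Writing the scenario down first forces exactly the question raised above: is this feature necessary, what precisely must it do, and is it worth the code it will cost?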

I feel optimistic about software, at its core, improving over the coming years. Whether the GUI improves is another matter.

-- J. Weir (email)


Response to Why Software is So Bad

In tune with the current political environment, the new version of MS Excel automatically generates lies by default, because that's what the focus groups liked.

-- Alexey Merz (email)


Blame the language

A large portion of the blame lies in the languages chosen to write software. C/C++ are horrible languages for writing good, reliable software. It can be done, but they certainly do not promote it. Even popular languages that are purported to have these properties do not - for example, Java (this is quite a controversial statement).

Beyond reliability, a requirement for product goodness is the ability to experiment - the ability to give a feature a quick implementation to see if it has potential. In C/C++ the codebase is often a huge mound of built-up hacks - implementing an unforeseen feature requires tons of effort.

It has been shown that programmers spend about 5x more time reading code than writing it. While this number seems fairly arbitrary, the point is that readability is very important. Programmers would get much more done if the code were clear enough that they could spend less time deciphering it and more time thinking and coding.

Another desirable feature, most present in functional languages, is the ability to express things in the purest, most correct way. Mathematical concepts become near-direct transliterations. This is part of that clarity, and it has the side benefit of making it possible to prove properties of programs.

Basically, programming language designers have to keep many things in mind at once when designing a good language. The problem is, most only really think about a few aspects of their language and take everything else from past languages. This leaves languages on slow, evolution-style improvement cycles. Until true care and consideration are put into every aspect, we will remain on this course.

So, most languages aren't very good overall, yet certain languages have overwhelming popularity. These languages are often good at certain things but not at others. Commercial software projects go the route of tradition and pick the most popular language. Rarely is this the right choice for the task, yet it certainly makes the manager's job easier - hiring is easier, and the choice is very justifiable.

The correct language is certainly not a substitute for programmer skill, yet it does allow for fewer programmers and higher quality.

I do not mean that a good all-round language is impossible. It just doesn't exist yet.

-- Michael Sloan (email)


Response to Why Software is So Bad

Software is bad by what metric? Compared to 10 years ago, we have software functionality that is incredibly advanced. The ubiquity of video on YouTube, of Google search, of software that makes automobiles drive and airplanes fly, is all due to rapid advances in capability and technology. Of course there is a lot of badly engineered and crappy software. The math is a lot harder - as Mann notes. But I think the truly distinctive feature of programming is cultural. Programming is, currently, a mixture of craft and engineering, and academic computer scientists are, by and large, disdainful of both. For a long time, I accepted the standard critique of software as bloated, buggy, lacking elegance and rigor, etc. But I've become dubious of this critique, which more and more reminds me of the Modern Architecture complaints about buildings that were comfortable and usable. Some more remarks can be found here.

-- victor yodaiken (email)


Response to Why Software is So Bad

Michael Sloan's comment that

C/C++ are horrible languages for writing good, reliable software. It can be done, but they certainly do not promote it.

(with which I thoroughly agree) prompts me to ask him, or anyone else, whether they can explain the C/C++ revolution that overwhelmed programming in the early 1990s. If you read expert opinion in Byte or similar publications from immediately before the revolution, you will find a general consensus that C/C++ were not the way ahead, and that reliable software could be much more easily written in Pascal or Modula-2. (Of course, you can also write thoroughly unstructured, unmaintainable programs in Pascal if you really want to, but it's not as easy or tempting as it is in C.) Yet quite suddenly C/C++ won the war, and I have never understood why.

-- Athel Cornish-Bowden (email)


Response to Why Software is So Bad

Athel, I don't know for sure but probably the following things contributed to C and C++:

  • C is easy to learn, especially if you already know Pascal. After you know C, C++'s basics are easy to learn.
  • Compilers were easily available for most operating systems, often for free.
  • Most operating systems at that time were written in C, or in a mix of assembler and C (with the exception of the Macintosh?), and so it was easiest for programs to use C to access the OS. The most famous OS associated with C is of course Unix.
  • From there, free programs and libraries proliferated.
  • C compilers tended to create faster-executing code than the interpreters of the time, and also than some of the more interesting 4GL/object-oriented languages.

I disagree that the C and C++ languages contribute to "bad software". Bad software for the user is a result of lack of vision or interest on the part of the programmer or vendor in creating "good software" for the user. Some other languages may provide helpful tools for doing various tasks, but in the end it is up to the programmer/vendor to shape the experience of the final program.

It is also not difficult to write correct and bug-free software in C or C++ if you think clearly and carefully about your code, and have a good understanding about how it will be compiled and executed. These traits should be required for any kind of programming, regardless of language.

-- Reed H (email)


Response to Why Software is So Bad

My own experience in software engineering and development boils down to the old quote: "You may choose two of the three options, no more: 1 - on time, 2 - on budget, 3 - correct."

Unfortunately, most companies prefer to choose 1 and 2. There do exist companies that choose 1 and 3, except they generally aren't in the market of selling software to general consumers. The folks who write the control software for the space shuttle come to mind.

-- Matt (email)


Response to Why Software is So Bad

What do people mean by saying C and C++ are "bad" languages? Ritchie and Thompson are among the most brilliant computer programmers of the last 30 years. Their accomplishments are stunning in depth and breadth. The comments of Byte magazine writers on their work have exactly the same force as complaints in Architectural Digest about John Roebling or in Reader's Digest about Seymour Cray. Every major post-1980s operating system is written in C. The internet is running on C code. Why do people use that ugly reinforced concrete instead of mud and sticks or glass bricks?

-- victor yodaiken (email)


Response to Why Software is So Bad

Here's a probable explanation of the phenomenon: http://www.jwz.org/doc/worse-is-better.html. Some of the specifics of this essay I disagree with, but the overall idea makes sense. I also disagree with the implied conclusion that 'worse is better' is really justifiable.

Reed H - "C is easy to learn, especially if you already know Pascal. After you know C, C++'s basics are easy to learn."

I disagree. C is easy to learn if you already know ASM (it is heaven-sent if you had to do ASM before, especially if it's along the lines of x86 - ARM isn't too bad), or a C-inspired language, which describes a large portion of the available languages. Also, C is easy to learn if you have a decent platform. I tried to learn it when I was 12 (programming for 4 years by then, mostly BASICs), but I had problems getting real programs going - I lost motivation because I learned pointers and all that but could only write console programs (and Win32 was hell - I wanted to do DirectX or OpenGL). An eventual switch to Linux solved this problem.

Essentially, C was just shorthand for common ASM idioms. At that task it does very well.

Reed H - "Bad software for the user is a result of lack of vision or interest on the part of the programmer or vendor in creating "good software" for the user. Some other languages may provide helpful tools for doing various tasks, but in the end it is up to the programmer/vendor to shape the experience of the final program."

It is true that the product quality is the responsibility of the makers. The question, however, is "Why Software is So Bad". We are explaining why these makers tend to consistently fail in creating quality software. My explanation is that the languages they use require a disproportionate amount of time for boilerplate (repetition), bugfixing, and reading the longwinded expressions of relatively uncomplicated concepts (like a bad, wasteful graphic). The conventional answer to this is the addition of more and more coders. This tends to have quite diminishing returns. This is particularly true with the primitive code versioning and collaboration systems (if any at all) of some companies.

Reed H - "It is also not difficult to write correct and bug-free software in C or C++ if you think clearly and carefully about your code, and have a good understanding about how it will be compiled and executed. These traits should be required for any kind of programming, regardless of language."

HAH! It's difficult to write bug-free software in much, much better languages than C/C++. Unless extravagant measures are taken, there will always be corner cases. C/C++ tend to multiply these issues with a slew of hard-to-find, rarely occurring bugs, not to mention the ease of introducing bugs in general (which later occupy programmer time for fixing).
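A hypothetical sketch of the "rarely occurring" kind of bug meant here: the function below compiles cleanly, passes casual testing, and writes one byte past its buffer only when a name is exactly 16 characters long.

    #include <stdio.h>
    #include <string.h>

    /* Off-by-one: the test should be '<', not '<='. Every name shorter than
       the buffer works; a 16-character name plus its terminating '\0' is 17
       bytes and silently overruns buf -- undefined behavior that may corrupt
       the stack or may appear to do nothing at all. */
    static void save_name(const char *name)
    {
        char buf[16];
        if (strlen(name) <= sizeof buf) {   /* bug lives here */
            strcpy(buf, name);
            printf("saved: %s\n", buf);
        } else {
            printf("rejected: name too long\n");
        }
    }

    int main(void)
    {
        save_name("Ada Lovelace");      /* 12 characters: fine             */
        save_name("exactly16chars!!");  /* 16 characters: one-byte overrun */
        return 0;
    }

Because the program usually appears to run correctly anyway, this class of defect can hide for years, which is what makes it so expensive.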

I agree about thinking clearly about your code, although the part about compilation is rather difficult with optimizing compilers such as most C implementations. The thing is, C/C++ obscure clear thinking about code. Concepts that should take one small line (and do, in languages like Haskell) take up 10. It is difficult to truly reason about your program unless you know the implementation of all the functions you deal with, and the compiler on top of that. This obscures thinking about what you are doing: the actual design, algorithms, and interface to your code.

Matt - I agree with that assessment. I think that decent languages would allow achievement of option 3 (correctness) without much extra effort - languages that don't get in your way, and that allow expression of libraries and programs in the most correct, natural way.

Victor - I was not criticizing Ritchie or Thompson (though to tell you the truth I had to google to find out who they were). The C language is indeed impressive - I could not have made it. Most could not. It has allowed us to vastly speed up ASM coding. The problem is that it is overused. As you said, operating systems are written in it, and the internet relies upon it (one might argue that the high-level internet runs on Ruby, Python, PHP, etc.). These are great applications of C - pedal-to-the-metal applications attempting to squeeze out every last drop of performance. The thing is, this is the great minority of software. The problem is all those other software projects which use languages like C/C++.

Perhaps I shouldn't have focused so much on C - there are plenty of languages with the same problems. Basically all languages, in fact. Just C and languages inspired by it particularly demonstrate them, and are particularly popular.

-- mgsloan (email)


Response to Why Software is So Bad

Victor Yodaiken tells us that "The internet is running on C code". Maybe so, and maybe that's why every single link in the "Recent blog entries" section of his blog (linked in his earlier post) yields the response "Sorry, no posts matched your criteria". I'm sorry too, and I'm not trying to be rude, but I do think that error-free code is easier to produce with a language other than C.

-- Athel Cornish-Bowden (email)


Response to Why Software is So Bad

A good discussion of programming languages was in the July-August issue of American Scientist.

http://www.americanscientist.org/template/AssetDetail/assetid/51982?&print=yes

Brian Hayes likens discussions of computer languages and computing concepts to the big-endian/little-endian war in Gulliver's Travels:

"In 1726 Jonathan Swift told of a dispute between the Little-Endians of Lilliput and the Big-Endians of Blefuscu; 41,000 perished in a war fought to decide which end of a boiled egg to crack."

He finishes with "I do believe there are real differences among programming languages - better ones and worse ones - and I rank Lisp among the better. When you get to the bottom of it, however, I write programs in Lisp for the same reason I write prose in English - not because it's the best language, but because it's the language I know best."

One area of software development that has interested me since I wrote an undergraduate thesis on it is "literate programming", a concept introduced by Donald Knuth in 1983. The website http://www.literateprogramming.com gives you access to the original papers and to developments since then, which unfortunately haven't amounted to much.

-- Andrew Nicholls (email)


Response to Why Software is So Bad

I wouldn't regard literate programming as a step forward. I've tried the various systems, and my opinion always comes back to this: it's nice for documentation if the documentation is the primary effort rather than the actual code. That is the case for books detailing algorithms (Knuth's primary focus), but for real-world development it doesn't help much.

Documentation's problem is that it's an interpretation of what a piece of code "should" be doing, not necessarily what it is doing. It can lead you astray. Not that I'm advocating the removal of documentation, but it's another part of development that is prone to bugs.

-- Matt (email)


Response to Why Software is So Bad

I have experienced problems with technical types and non-technical types not being able to communicate effectively with each other. There isn't a common vocabulary, and in the end programmers create software for themselves. A big part of the problem is that software is an abstract thing; making the leap to a real product is extremely difficult. I don't think this situation is going to improve any time soon.

-- Doug Cleary (email)


Response to Why Software is So Bad

As a programmer, something I find interesting is that corporate financial statements must adhere to Generally Accepted Accounting Principles (GAAP) and now Sarbanes-Oxley, yet companies that write code for "mission critical" applications have only their own standards, if any, to adhere to. In theory both should be unnecessary, as the market should weed out "bad software" and bad stocks because the consumer would refuse to buy either, but in reality that doesn't happen. The ability to carefully evaluate both software and corporate financial statements requires a level of knowledge that most people don't have, don't want, and aren't going to acquire. In theory, proper adherence to standards could improve the situation, but just as with Sarbanes-Oxley, someone would have to pay for the extra work. In the end there's no free lunch.

Let's not forget that a manufacturer isn't making the best piece of software possible, they're making the best software possible within a given budget and time-frame that the consumer will purchase and allow them to make a profit. Airplane engines are built to strict standards, certified and often amazingly reliable assuming they are properly maintained, but they are devilishly expensive. That said, customers are willing to pay for both the engine and maintenance, in other words to pay for the reliability.

Ultimately software is buggy because as consumers we want it cheap, pretty and we want it now, and we're much more willing to live with the bugs than we are to wait longer, sacrifice functionality and pay more. The software manufacturers are just giving us what we as consumers want, and we're voting not with words but with dollars.

-- Jack Slowinski (email)


Response to Why Software is So Bad

Jack Slowinski writes: "As a programmer, something I find interesting is that corporate financial statements must adhere to Generally Accepted Accounting Principles (GAAP) and now Sarbanes-Oxley, yet companies that write code for 'mission critical' applications have only their own standards, if any, to adhere to." Actually, there are all sorts of standards for safety-critical and avionics software and for software used in critical transportation systems. It is not clear that those standards actually produce better code, but they do exist. Google, e.g., "DO178B" and "IEC software standard" and so on.

-- victor yodaiken (email)


Response to Why Software is So Bad (part 1: methodologies)

Before addressing the question, I feel obliged to present credentials to justify my comments.

I've been actively developing software for over 25 years, in academic, non-academic research, and commercial environments, and have been through almost as many projects over that time period.

I've used a variety of methodologies, ranging from the traditional linear "waterfall" method to totally unstructured development.

I should also address the definition of "bad" that I'm using to characterize software. Depending on whose assessment you read, anywhere from 50% to 75% of all software projects fail, either partially or completely. I think that referring to these as "bad" is reasonable, and furthermore that generalizing as far as to say that "more software is bad than not" is legitimate: I can't think of another discipline in which that high a failure rate wouldn't be scandalous, excepting advanced R&D and other experimental situations.

I believe that the single factor most responsible for the poor quality of most software is that the predominant software project methodology continues to be based on the waterfall model, where you first obtain requirements, then go though various levels of analysis, design, and review, and finally (after weeks to months of analysis and formal design) get down to implementation.

The fact that this model is so prevalent is surprising to anyone who studied Decision Support Systems (DSS), which existed as far back as the late 70s and early 80s. One of the fundamental premises of good DSS design was a rapid-prototyping approach, the reasoning behind which has been confirmed in almost every project I've ever undertaken that did not follow a rapid-prototyping model.

The simple fact is that most clients, customers, and end-users really don't know what they want. They have many things that they think that they want from an abstract, theoretical point of view, but which turn out not to be as useful practically as the theory suggested. They also don't know about things that they would find useful, if only they were aware of them -- but, lacking adequate knowledge of the state of the art in software (and particularly human-interface) design, they can't even conceive of them.

These problems are not necessarily symptomatic of ignorance or fickleness on the part of users: the world changes continuously, and it's not uncommon for requirements to morph -- sometimes to a significant degree -- between the original problem statement and the delivery of a system developed under the waterfall model. By the time that the new requirements are known, the system design and specs have probably been locked down, and development might have progressed in directions that are incompatible with the changes. The system is now doomed to failure -- or to excessive, and possibly unacceptable, cost and time overruns... a "lady or tiger" situation.

If you base a system design on what the customer says s/he wants up front, the system is almost inevitably going to fail to satisfy the customer. The only way to circumvent this reliably is to implement a quick (and, if necessary, dirty) prototype and put it in front of the user -- and then sit with the user and take notes while the flaws and limitations of the system are enumerated, and new functionality inspired by the insights gained from playing with the prototype is described.

-- Jon Pastor (email)


Response to Why Software is So Bad

Jon Pastor: I think you are mixing up "so many software projects fail" with "software is so bad". These are related, but the second is an assertion about software projects that do not fail. My observation is that by many objective indications much of this software is outstandingly good. My children can use Skype on Windows, talking through a Linux-based wireless router that pumps out IP packets through a chain of other complex programs, and have live video discussions with their friends in other states. That's tens of millions of lines of code that generally work and make something amazing seem so effortless that we take it for granted. The ugliness that people see in many software products seems to me to often reflect not the software itself but things like the bureaucratic environments for which the software is designed. This is like asking why strip malls are so bad or why giant hog farms are so ugly - the engineer or architect is only partially responsible in such cases. But the two questions are related by the difficulty people have appreciating the complexity and value of something invisible.

-- victor yodaiken (email)


The problem seems to be with really big systems, the "tar pit," as Fred Brooks famously called it in his great book The Mythical Man-Month. Also, the original article and the discussion that started this thread were about the tradeoff between resources committed and benefits gained.

And perhaps the big gains in computing should be substantially attributed to hardware improvements, not software.

See Scott Rosenberg's recent Washington Post article, which reports:

"In my view, we lost our way," Vista's manager, Jim Allchin, wrote in an e-mail (later posted online) to Microsoft founder Bill Gates and chief executive Steve Ballmer. "I would buy a Mac today [2003] if I was not working at Microsoft."

-- Edward Tufte


Response to Why Software is So Bad

A link update for the article “Why Software Is So Bad,” by Charles Mann, posted at the beginning of this thread: http://www.technologyreview.com/Infotech/12887/

-- John Galada (email)


Response to Why Software is So Bad

Why is software so bad? Or, alternatively, why is good software so difficult to make?

Fred Brooks' influential 1986 paper, "No Silver Bullet", explains why software production is inherently difficult, and why no significant productivity improvements are to be expected anytime soon. His main argument is that problems have an 'essential complexity' and an 'accidental complexity'. Accidental complexity is due to the imperfect nature of our tools, our representation methods, our programming languages. Essential complexity is part of the problem itself, and we can't do much about it. Brooks argues that, as of 1986, accidental complexity accounted for less than 10% of the software development effort, and since all we can improve is related to this accidental complexity, we can't hope for significant overall gains. This prediction seems to have been confirmed by the last 21 years of developments.
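Using the 10% figure quoted above, the bound follows from an Amdahl-style argument: even if better tools eliminated the accidental effort entirely, total effort could shrink by at most

    \frac{1}{1 - f_{\text{accidental}}} = \frac{1}{1 - 0.1} \approx 1.11,

i.e. roughly an 11% gain -- nowhere near the order-of-magnitude improvement a "silver bullet" would have to deliver.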

-- Laszlo Kozma (email)


Response to Why Software is So Bad

The article focuses on Microsoft's way of producing software, which is no longer the norm in the industry, as it was ten years ago. For small systems, a single designer and implementer might do a good job, but for larger projects the open-source model seems to give better-quality results. If the source code is open, there is an opportunity for review that greatly reduces the number of defects. As a writer of open-source software myself, I can say that by making my source code public, I received hundreds of helpful reports and suggestions that I could never have come up with alone. As Eric Raymond puts it in "The Cathedral & the Bazaar", "given enough eyeballs, all bugs are shallow".

-- Laszlo Kozma (email)


Wonderful thread!

I think the fundamental disconnect is the one noted by a contributor above: the users don't code and the coders don't use. And I don't think the current trends will allow the twain to meet.

Back in the mid-80s to mid-90s, when PCs and then client-server came in, super-users at the departmental level could write meaningful and significant applications. Central IT hated these because they'd get calls for support for products they hadn't written, but the users had products that did exactly what they needed, because they were written by people who had to eat their own dog food.

Sadly, support is a critical issue, though, especially as we've moved back to the centralized development paradigm. Let's face it: is the Internet anything more than 3270 green-screen block-mode processing with some drop-down controls and Ajax to make it a bit more palatable?

So IT (often on a different continent) has to fix stuff when it blows up, which it will do for a variety of reasons aside from buggy software. They want standardized code so they know where to look for the problem and can get it running again, because each minute of downtime is $100K in lost revenue. That means ERP, COTS packages, and (maybe, if you believe the hype) SOA coarse-grained service libraries.

As a consumer, I would much rather have Amazon display the correct amount of inventory on hand for my Days Of The New CD (which they don't always get right) than give users five different ways to upload their homespun scans of the cover art because they used J2EE instead of .NET.

I agree with contributors who say software is pretty good for what it needs to do. YouTube, MySpace, and Facebook all seem to keep people happy despite the occasional 404. MS Word or StarOffice still can't improve atrocious prose.

Non-technical people who want to generate blogs can do so if they are motivated. In fact, I think one of the strongest benefits of the Web is that users are becoming coders (in a limited sense) again, and as they take whatever level of control of the technology they want, they accept the limitations and get on with their overall objective.

I don't know if coding will die as an art form, but it certainly seems to me to have less room to innovate for the user experience. In 2000 we used to charge $150/hr for shopping carts and other e-commerce tools... now they give them away for $14.95/month including email accounts, credit card processing, and links to international shipping with UPS!

We recently had a demo of a WebSphere product that helps create marketing campaigns on websites. First, can you think of all the custom coding apps that this product will make redundant? Secondly, our developers seemed to think this product was aimed at them. IBM isn't aiming at $65/hr US coders, nor even $30/hr offshore coders... this is going to be used by $10/hr shipping clerks at the Xingang warehouse who need to offload excess inventory.

In five years I think people will regard software the same way they regard their cellphone interface... they won't even notice it.

-- Gordon Fuller (email)


The answer is simple: Because you cannot fool Nature but you can fool people.

-- Yehoshafat Shafee Give'on (email)


Hi Prof. Tufte,

I was in your seminar yesterday, where you talked about hiding operating systems and applications versus documents, just as your post near the top of this thread does. I am a software developer myself, so I thought about this for a while and wanted to try to add some comments to this thread.

When it comes to operating systems, there is actually a way to hide them. In Unix and Linux, for example, the OS is a smaller kernel of code upon which different "desktop environments" can be placed. These desktop environments look and behave very differently and have names like KDE, GNOME, Xfce, and XPde, all running on top of the X Window System. It is possible to design and build a desktop environment however you want. Unfortunately the Windows family of operating systems doesn't support interchangeable desktop environments the way the Linux/Unix family does, but at least you can see it is possible, and the technology has been around for some time. I suppose that over time market forces will decide which approach wins.

I also thought about the argument for launching documents directly from the desktop versus launching applications and then opening documents inside the application. The reason different software applications exist, instead of one "super" application, is that it would be insanely complex and prohibitively expensive to produce and maintain a "super" application capable of creating and editing any type of document.

A good analogy would be automobiles. Why isn't there a "super" automobile that can tow a trailer, run an Indy 500 race, push an airplane back from the airport gate, move a mountain of dirt around an empty lot, grade an empty lot, pick up and haul away the trash, haul and pour cement, etc.? Automobiles have niches just like software applications do, and different jobs require different amounts of torque, speed, cargo space, center of gravity, towing capacity, etc. It would be virtually impossible to build a "super" automobile that did all these tasks well, so we have to live with needing different types of automobiles for different types of tasks.

Software applications work the same way. The closest things I can think of to applications that "do everything" are the most advanced desktop publishing apps like Adobe Illustrator or Serif PagePlus. Apps like these are capable of simple drawing, displaying and processing text, displaying and editing graphics, etc. They don't do each of these tasks particularly well, but a jack of all trades is a master of none, right?

Hope this helps.

Thank You.

Alex L.

-- Alex Lancaster (email)


The following link points to a very interesting story about quality software engineering. Appropriately for this forum, it involves the space shuttle:

http://www.fastcompany.com/magazine/06/writestuff.html

The software group was actually praised by R. Feynman during his shuttle disaster investigation.

-- Vassilis Golfinopoulos (email)


Here is an interesting post called "Kill The Settings, Build Opinionated Software". The author uses Jason Fried of 37signals and Apple as powerful examples of staying true to your vision and not building in too many settings -- if you consider too many options part of "bad" software (I do).

http://flyosity.com/iphone/kill-the-settings-build-opinionated-software.php

-- Laurel Segel (email)



