Wildfire maps and media coverage

As I write this, a dense plume of smoke passes overhead from the "Paradise" wildfire in San Diego County. The fire is burning four or five miles away and I am in its path, but it will have to burn through a substantial section of Escondido before it reaches me.

Discovering where the firelines are at present, and the direction the fires are burning, is a hit-or-miss business. In part, this is the nature of the beast -- the fog of war, to mobilize a metaphor. But I am amazed at the paucity of mapping resources being used by local media. Television news anchors are using hand-held Thomas Guides and pointing to areas where fires are burning. Web and newspaper maps are vague at best.

And this is the center of GIS technology! There's no reason at all that there shouldn't be real-time updates of a GIS map showing firelines, areas that have recently burned, wind directions, areas at imminent risk, etc.

(By contrast, the mapping technology deployed by CNN and Fox during the Iraq war was state-of-the-art.)

After the crisis has passed, I will be writing to ESRI (the ArcView company), the geography department at San Diego State, and county and regional authorities, urging them to develop an appropriate web-based graphic system.

But it strikes me that this is a measure of the immaturity of the internet and associated technologies.

-- Mark Hineline (email)

Considerably more detail is needed -- down to street scale.

There are three to five million people potentially in the path of these fires, from the Mexican border to San Bernardino. One of the fires jumped four interstates, one of them ten lanes wide. It should be possible to check from minute to minute to see where the fireline is and estimate how fast it is moving.

-- Mark Hineline (email)

A local television station is probably spending more money on scriptwriters for airhead news happy-chat than on GIS systems. And the talent would much rather be seen holding the Thomas map than not be seen describing a good map.

-- Edward Tufte

There's quite a bit of data here, but still not what you are looking for.

-- Jeffrey Berg (email)

How are firefighting crews tracking/mapping the fires? Are emergency crews maintaining real-time street-level GIS data, and if so, can it be tapped into by the public?

News services can provide real-time mapped data of rush-hour traffic to let people know if they will be delayed 15 minutes getting to the gym; why not provide a similar public service for wildfires, floods, earthquakes, and other hazards that might save someone's life?

Investments in mapping (a rather dull item for spending in the eyes of politicians and the public during non-emergency times) always seem to pay for themselves sooner or later.

-- Mark Kasinskas (email)

The technology is there -- GIS is already used for fighting wildfires. It's well covered in ESRI's newsletter and the other GIS and geospatial magazines. I just don't think most municipalities would want to invest the time, money, and manpower for it. What's the value in letting everyone sit in their homes refreshing a GIS map in their browser? Face it: if the fire's heading your way, you'll be evacuated by the authorities, not IM'ed or blogged about your need to get moving.

This system would also sit idle for 9 months out of the year. On the occasions it is needed, all the info and access should be focused so that the firefighters can get to the info as soon as possible. Would you want your firefighters getting a 404 error because SlashDotters are watching the firelines too?

Honestly, most municipalities would get a lot more value out of an underground-services web GIS.

-- Marc Pfister (email)

"Would you want your firefighters getting a 404 error because SlashDotters are watching the firelines too?"

Unlikely, considering all of the state, federal, county, .us, .gov, .edu, and NOAA sites that seem to mirror one another and then link to the others as places to go for "more information" on the subject. What a Google dud! If they all carried the same GIS data, it seems there would be 30 to 40 pages available to hit for data!

Funding is a bit unfocused isn't it?

-- Jeffrey Berg (email)

"Face it, if the fire's heading your way, you'll be evacuated by the authorities, not IM'ed or blogged about your need to get moving."

This is a level of helplessness that, frankly, is not necessary. In the case of hurricanes, there is a three- to five-day build-up of caution, warnings, maps of possible landfall, and so on. That much information about probabilities, that far in advance, may or may not be helpful. But (a) it shows that such a system is possible, (b) that system, too, sits idle nine months of the year, and (c) it gives residents information on which to make quasi-rational judgments. Perhaps more important, the ability to monitor risk would permit a larger evacuation window. In yesterday's fires, the immediate evacuation of hundreds of people all at once was a danger in itself.

The best source of information about the proximity and movement of fire fronts yesterday was the local NPR affiliate, which allowed callers to provide information. Some of it was right and some was wrong, which was predictable. Wouldn't it have been helpful if the announcer had had a GIS map, refreshing every minute or so, against which to check the callers' information?

-- Mark Hineline (email)

Like I said:

Going to ESRI's homepage to file a bug report, what do I find as the first item: Maps of the California Wildfires.

Under Live Maps they link to Geomac Wildland Fire Support.

And if you go to that page, you get a message about how map generation will be slow because of heavy user demand. If you try to view a map, then you find out their image server has been disabled. All I got was a big blank map.

It doesn't matter how many government sites mirror the spatial data when your server has been slashdotted to smoldering metal and plastic.

I'm not saying the internet isn't a good way to get emergency data out, but if you want dynamic GIS content that depends on server-side applications, you're asking for trouble. Remember, with something like this you have global-scale rubbernecking.

If you had a clean system that used static pages updated every five minutes or so and minimized graphics, then you'd be talking. But "web-based graphic systems" are just asking for server troubles.

-- Marc Pfister (email)
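
Marc's static-page scheme can be sketched in a few lines. This is only an illustration under assumed inputs: the `(name, status)` pairs and the `render_snapshot`/`publish` names are hypothetical stand-ins for whatever the GIS export would actually provide, not any real agency format.

```python
import datetime

def render_snapshot(perimeters, generated_at):
    """Render fire-perimeter summaries into a small static HTML page.

    `perimeters` is a list of (name, status) tuples. The output is
    plain HTML with no server-side rendering at request time, so it
    can be served from any static host under heavy load.
    """
    rows = "\n".join(
        f"<tr><td>{name}</td><td>{status}</td></tr>"
        for name, status in perimeters
    )
    return (
        "<html><body>"
        f"<p>Fire status as of {generated_at:%Y-%m-%d %H:%M} "
        "(regenerated every 5 minutes)</p>"
        f"<table>{rows}</table>"
        "</body></html>"
    )

def publish(path, perimeters):
    """Write the snapshot to the web server's document root.

    Meant to be invoked every five minutes, e.g. from a cron job,
    so the pages users fetch are always static files.
    """
    with open(path, "w") as f:
        f.write(render_snapshot(perimeters, datetime.datetime.now()))
```

The point of the design is that the expensive work (querying the GIS, rendering) happens once per interval, regardless of how many readers hit the page.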

The Geomac is an interesting example of -- well, of the worst kind of GIS map. I clicked fire perimeter as an active layer. I suppose that the darker green edge of the darker green value is the perimeter. But there is no key, so I am left to suppose. The graphic, at the resolution I am looking for, is unacceptable.

I am not sure that this proves the point you are pushing, but it certainly makes a point.

-- Mark Hineline (email)

I can't even view any data layers at this point.

If you click on the Legend text you should get a legend for the layers you're looking at, but the legend is a server-side generated image, and the image doesn't load when I try to get it.

There's a Tuftian lesson to be learned here: the high information density conveyed by a graphic also implies high bandwidth costs and, if the graphic is generated dynamically, high server loads.

There's probably a good solution for this -- I could see something like an RSS or other XML broadcast sent over SMS, with your GPS-enabled smart phone filtering the broadcast by location and letting you know about problems near you.

-- Marc Pfister (email)
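
The location-filtering idea above can be sketched as a great-circle distance test against each broadcast alert. The alert-dict fields (`lat`, `lon`, `text`) and the 25 km default radius are hypothetical choices for illustration, not any real feed format.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def nearby_alerts(alerts, my_lat, my_lon, radius_km=25.0):
    """Keep only the broadcast alerts within radius_km of the handset.

    `alerts` is a list of dicts with 'lat', 'lon', and 'text' keys --
    a stand-in for whatever fields the broadcast would carry.
    """
    return [
        a for a in alerts
        if haversine_km(my_lat, my_lon, a["lat"], a["lon"]) <= radius_km
    ]
```

Because the filtering runs on the handset, the broadcaster can push one undifferentiated feed to everyone and never see per-user load.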

Australia faces the threat of bush fires every summer, and the one thing of importance to note is the speed at which it can all occur. Contributor Hineline states that it is possible to provide information about hurricanes days in advance; why not the same for bushfires?

A good example is the 'Ash Wednesday' fires of February 16, 1983 in South Australia and Victoria. These fires occurred in the space of an afternoon, leaving 70 people dead, 2,545 buildings destroyed, and 3,900 square km burnt. Our national capital, Canberra, lost 500 buildings in an afternoon on January 18, 2003.

All the technology in the world will not help when things depend on temperature, wind speed, humidity, and any number of other factors. What is safe now can be gone in five minutes.

One lesson from the 1983 fires is that the power grid can contribute to the starting of fires, due to cable clash in high winds and other factors. A protocol now in place is that when the fire risk is extreme, the grid in an area may be turned off. Technology is not much use then. When fires take hold, people rely on radio and the spoken word; many stations suspend all other services and provide nothing but continuous fire updates directly from the fire control centres.

Quite a good animation of one of the 1983 fires can be found at -

-- Andrew Nicholls (email)

The fires may have spawned the worst info-graphic ever:

-- Marc Hudgins (email)

This seems to be a move in the right direction:


-- Mark Hineline (email)

As a SoCal fire evacuee, I can tell you that all the mapping applications on the web have proven to be worthless, as have all the news services, both local and national (every goddamned one of 'em). What has proven to be the best is text postings by local residents who stayed on the mountain, visiting all the areas of concern and getting good information from fire officials and others, then relaying it to us through their web site. What we need to know is what has really burned, what's currently actually on fire, and which way the wind is really blowing. Tell us the names of the places; we'll know where they are. Making maps just takes too long. Steve Sprague, Crestline, California.

-- Steve Sprague (email)


Your frustration at not receiving accurate and timely information is understandable. One would think satellite-based photography could be useful for putting such activity within a large geographical area into some kind of meaningful context. Most of the photos I've seen are of a wall of fire. When Isabel blew by here, the Doppler system missed the boundary of the "eye" by about 10 miles. So while the instant information was useful in general, it was a little confusing to those who were in that 10-mile block of geography.

-- Gene Prescott (email)

I cannot argue with Steve's take -- or experience -- with information about the fire -- or its dearth. But Crestline is not Escondido. I've gone half mad trying to find named streets on print maps because, this being the land of endless development, the maps around my house are obsolete the day before I buy them.

Steve's criticism of the way things are is apt. This is not the way things have to be. The point is to harness a robust technology to make just-in-time emergency cartography fast and accurate.

-- Mark Hineline (email)

The news agencies and general media are not legally responsible for the safety of the people, or for that matter for ensuring the accuracy of their data. (Although a news source that prides itself on accurate data is desirable to everyone.) The job of keeping the People safe, for the People, is owned by the government. After all, the government has the right to hijack broadcasts for its own Emergency Broadcast usage. We have Homeland Security budgets, FEMA, NOAA.

The government is plainly failing, with horribly convoluted web sites and not nearly enough bandwidth.

We can't ask the likes of Time Warner to protect us and in the same breath complain about corporate replacement of the democratic process.

I'd start pointing at the government, FEMA, NOAA, and all the other satellite agencies and committees with web pages claiming to be the controllers of the data at hand.

-- Jeffrey Berg (email)

Here's an interesting implementation of web mapping that would work well for emergency data:

Starcus RaveGeo Demo

RaveGeo is a multiresolution, compressed streamed vector format from Idevio. It uses a client-side Java viewer. This combination should result in lower server loads and bandwidth requirements. The demo is incredibly responsive (left mouse button to pan, right mouse button to zoom) and the detail increases as higher resolution data is streamed to the viewer.

-- Marc Pfister (email)
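
RaveGeo's actual encoding is proprietary, but the multiresolution idea can be illustrated with a deliberately crude sketch: serve a coarsely thinned perimeter at low zoom and stream more vertices as the user zooms in. The `decimate` function and its index-skipping rule are assumptions for illustration only; real formats simplify by geometric error, not by vertex index.

```python
def decimate(points, zoom, max_zoom=4):
    """Crude level-of-detail thinning of a perimeter polyline.

    At low zoom, keep every 2**(max_zoom - zoom)-th vertex; at
    max_zoom, keep them all. The final vertex is always kept so
    the line stays connected end to end.
    """
    step = 2 ** max(0, max_zoom - zoom)
    kept = points[::step]
    if kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept
```

A viewer built this way fetches the zoom-0 vertices first (a tiny payload) and then requests only the vertices missing at the next level, which is why a streamed vector format can feel responsive on a loaded server where a dynamically rendered raster map would stall.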

Another mapping problem in a crisis:

[link updated February 2005]

-- Edward Tufte

Hi Mark,

I was involved in a project in Australia several years ago where we set up a wiki to record information about a major bushfire event. We looked at all the data sources and found that the problem is not the tools used to present the data; it is capturing and distributing the necessary data in the first place.

Satellite data is useless for operational purposes because either (1) the turnaround time to get high-resolution data captured and processed is several days at best, or (2) the data is so coarse that it is useless -- e.g., an image with a 1 km cell size cannot tell you much about a fire front that existed 6 hours ago.

Fire front location data can be captured using infrared cameras mounted in aircraft. The processing of this data is complex (involving digital terrain models, aircraft attitude compensation, etc), and doing it inflight adds more complexity and constraints. Then you have the issue of flying in somewhat dangerous conditions, given the updrafts and water bombing traffic.

Also, having the aircraft with these capabilities sitting idle most of the year, within range of the actual fire, is not looked on favourably by those with the money.

Any web site that can provide the detail you are looking for will get swamped. A site created in Australia ( was getting a million hits a day during the Canberra fires, and the data it displays is usually at least 6 hours old and very coarse.

If the fire ops staff can't get detailed data from sensors, let alone process it and distribute it to the media, then there's not a lot the media can do.

If anyone knows of automated data capture tools being used to track wildfires I'd be interested to know more.


-- Andrew Hallam (email)

It's been a few weeks since the California wildfires left the media's attention here in New England. Did any accurate GIS maps or data for the burn areas ever materialize, even after-the-fact?

-- Mark Kasinskas (email)

Yes, the website I referenced above ( has a number of GIS maps, interactive maps, and a discussion of many of the issues raised in this thread (I have no connection with that site). It is a fascinating site, with information graphics that range from very fine to preposterously dopey.

-- Mark Hineline (email)

ESRI has a page with maps and other items related to the SoCal burn areas at A number of perimeter and progression maps of the local fires, created by ESRI and others, can be found at

-- Steve Sprague (email)

I'm revisiting this discussion five years later because, due to the recent outbreak of fires up here in Northern California, I suddenly found myself making my own internet wildfire maps, out of frustration that there was no decent, simple information available to the public.

The maps are running at:

ENPLAN Wildfire Map

In the discussion above, someone mentioned that there should be up-to-the-minute information. This proved impossible. The only data we as a private company have access to are the wildfire perimeters at GeoMAC and the MODIS fire detections from UMD. The perimeters were generally 24 hours old. The MODIS data updates four times a day but can still be up to 6 hours old. I know from agency contacts that more recent IR mapping data exists, but it was not available for public use.

At one point we were also trying to map evacuated areas, but they changed too often and were often described in terms vague enough to make showing them a liability if they were not accurate. One evacuation notice used geographic features that were labeled only on one brand of printed maps and were difficult to find for anyone using any other maps or GIS data. The areas could also vary between the fire crews calling for the evacuation and the Sheriff's office actually implementing it. It was too much of a challenge to try to pin down such a swarm of information in our GIS. Hopefully at some point in the future there will be a standardized way to describe and serialize data like this.

We also ran into the server loading issues, both on our own server and from the agency data sources. MODIS and GeoMAC both went offline several times. Our own servers were bogging down, even though our maps rely on Google's infrastructure for all map rendering. We fixed our server problems by offloading our data files onto Amazon's S3 server infrastructure, though even that system, which is supposedly geo-redundant, went down for half a day.

It seems that at this point there is at least some useful data out there that can be put on maps, and the server loads can be distributed to a point. For future improvement, the data needs to be more timely, and more of it needs to be generated in standardized formats that can be easily aggregated or mapped.

-- Marc Pfister (email)
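
One candidate for such a standardized format is GeoJSON, which appeared around the time of this post. A minimal sketch of aggregating point detections from multiple sources into one FeatureCollection follows; the detection-dict fields (`lat`, `lon`, `time`, `source`) are hypothetical stand-ins, since real MODIS exports carry many more attributes.

```python
import json

def to_feature_collection(detections):
    """Merge fire detections from multiple sources into one GeoJSON
    FeatureCollection.

    Each detection is a dict with 'lat', 'lon', 'time', and 'source'
    keys. Note that GeoJSON coordinates are ordered [longitude,
    latitude], the reverse of the usual spoken order.
    """
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "geometry": {
                    "type": "Point",
                    "coordinates": [d["lon"], d["lat"]],
                },
                "properties": {"time": d["time"], "source": d["source"]},
            }
            for d in detections
        ],
    }
```

Because every source emits the same structure, a downstream map (or another aggregator) can concatenate the `features` lists without caring who produced each point, which is exactly the aggregation property argued for above.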
