this reference made me think of how I could make a petabyte more understandable. In digital data terms a petabyte is a lot of data: 1 PB = 1,000,000,000,000,000 B = 10^15 bytes. Assuming a byte is 8 bits, a petabyte is 8 x 10^15 bits. According to this paper, Google processes more than 20 petabytes of data per day using its MapReduce program. According to Kevin Kelly of the New York Times, this reference, "the entire works of humankind, from the beginning of recorded history, in all languages" would amount to 50 petabytes of data. These figures are difficult to grasp because they are abstract, so I tried to find a way of understanding a petabyte in terms of an individual human being. From the paper you refer to here we can estimate that the human retina communicates with the brain at a rate of about 10 million bits per second, or 10^7 bits per second. This sounds pretty impressive. How long would it take a human eye-brain system to move a petabyte of data (assuming you could keep your eyes permanently open, so that you get your full 10 million bits per second)? By my calculations a year is 3.15 x 10^7 seconds, which gives a total of 3.15 x 10^14 bits per year from retina to brain. Dividing 8 x 10^15 by 3.15 x 10^14 we get about 25 years. That is a long time to keep your eyes open! If we take a normal human life to be the biblical standard of Psalm 90, "The days of our years are threescore years and ten", then a normal human moves about 2.8 petabytes of retinal data in a lifetime. We could also define a brand new unit, the PetaBlife, with the symbol ℘, which is the number of standard human lifetimes required for a human retina to make a petabyte of data. Matt Reed -- Matt R (email)
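A quick back-of-the-envelope sketch of this arithmetic, taking the cited paper's figure of 10 million bits per second (10^7) at face value; all other numbers are plain unit conversions:

```python
# Back-of-the-envelope check of the petabyte / retina arithmetic.
# Assumes the retina-to-brain rate of ~10 million bits per second
# from the paper cited in the post.

PETABYTE_BITS = 8 * 10**15        # 1 PB = 10^15 bytes = 8 x 10^15 bits
RETINA_BPS = 10**7                # ~10 million bits per second
SECONDS_PER_YEAR = 3.15 * 10**7   # ~365 days of seconds

bits_per_year = RETINA_BPS * SECONDS_PER_YEAR       # ~3.15 x 10^14 bits
years_per_petabyte = PETABYTE_BITS / bits_per_year  # ~25.4 years
petabytes_per_life = 70 / years_per_petabyte        # ~2.8 PB in 70 years

print(f"{years_per_petabyte:.1f} years to move one petabyte")
print(f"{petabytes_per_life:.2f} petabytes in a 70-year life")
```

At this rate one petabyte takes about a quarter-century of continuously open eyes.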
 Yesterday's NYT had an article about processing visual information and risk analysis within extreme conditions: www.nytimes.com/2009/07/28/health/research/28brain.htm. J.D. McCubbin -- J. D. McCubbin (email)
 34 gigabytes per day per person? -- Edward Tufte
 Dear ET, I have been reading up on the evolution of eyes and vision. I stumbled across the work of Prof Russell Fernald, who is at Stanford University (http://www.stanford.edu/group/fernaldlab). From a paper by him published in Current Opinion in Neurobiology 10(4): 444-50 in 2000, the following profound statement made a big impression on me: "Light has probably been the most profound selective force to act during biological evolution. The 10^15 sunrises and sunsets that have taken place since life began have led to the evolution of eyes which use light for vision and for other purposes including navigation and timing." Best wishes, Matt -- Matt R (email)
 Dear Professor Tufte, I eagerly look forward to your analysis of Apple's new iPhone 4, particularly the Retina Display they are branding. I can't help but wonder who at Apple has been following this discussion thread, which you started several years ago. Some people who have apparently held and used an iPhone 4 are making comments like these: The resolution of the "retina display" is as impressive as Apple boasts. Text renders like high quality print. and It's mentioned briefly in Apple's promotional video about the design of the iPhone 4, but they're using a new production process that effectively fuses the LCD and touchscreen -- there is no longer any air between the two. One result of this is that the iPhone 4 should be impervious to this dust-under-the-glass issue. More importantly, though, is that it looks better. The effect is that the pixels appear to be painted on the surface of the phone; instead of looking at pixels under glass, it's like looking at pixels on glass. Combined with the incredibly high pixel density, the overall effect is like "live print". What might text and sparklines look like on a "Retina Display"? I can't wait for your own hands-on review of the iPhone 4 and also the iPad. Thank you! -Eddie -- Eddie (email)
 There is a growing debate about the resolution of the new iPhone and how it compares to the eye. Here are some highlights: Raymond Soneira, on Wired: 1. The resolution of the retina is in angular measure; the accepted value is 50 cycles per degree. A cycle is a line pair, which is two pixels, so the angular resolution of the eye is 0.6 arc minutes per pixel. 2. So, if you hold an iPhone at the typical 12 inches from your eyes, it would need to be 477 pixels per inch to be a retina-limited display. At 8 inches it would need to be 716 ppi. You have to hold it out 18 inches before the requirement falls to 318 ppi. The iPhone 4 resolution is 326 ppi. Phil Plait, on Discover: Let me make this clear: if you have perfect eyesight, then at one foot away the iPhone 4's pixels are resolved. The picture will look pixellated. If you have average eyesight, the picture will look just fine. SOURCES http://blogs.discovermagazine.com/badastronomy/2010/06/10/resolving-the-iphone-resolution/ http://www.wired.com/gadgetlab/2010/06/iphone-4-retina-2/ -- Tchad (email)
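Soneira's numbers follow from simple trigonometry: 50 cycles per degree with two pixels per cycle means each pixel should subtend no more than 0.01 degrees (0.6 arc minutes). A short sketch of that calculation:

```python
import math

# Angular-resolution arithmetic behind Soneira's figures:
# 50 cycles/degree, two pixels per cycle -> 100 pixels/degree,
# i.e. each pixel subtends 0.6 arc minutes (0.01 degrees).

CYCLES_PER_DEGREE = 50
PIXELS_PER_DEGREE = 2 * CYCLES_PER_DEGREE   # a cycle is a line pair
PIXEL_ANGLE_DEG = 1 / PIXELS_PER_DEGREE     # 0.01 degrees per pixel

def retina_limit_ppi(distance_inches: float) -> float:
    """Pixels per inch needed for a retina-limited display at this distance."""
    pixel_size = distance_inches * math.tan(math.radians(PIXEL_ANGLE_DEG))
    return 1 / pixel_size

for d in (8, 12, 18):
    print(f"{d:2d} inches -> {retina_limit_ppi(d):.0f} ppi")
```

This reproduces the 716, 477, and 318 ppi thresholds quoted above; whether 50 cycles per degree is the right figure for the eye is the point under debate.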
 Jeff Hawkins's brilliant new model of the neocortex Jeff Hawkins presents a coherent, derived-from-first-principles model of neocortical thinking that is just stunning. Jeff Hawkins: Advances in Modeling Neocortex and Its Impact on Machine Intelligence -- Niels Olson (email)
 Dear ET, Alfred Lukyanovich Yarbus (1914-1986) was a Russian psychologist who made a number of seminal studies of eye movements. Many of his most interesting results were published in a book, translated into English and published in New York in 1967 as Eye Movements and Vision. This book is now out of print, but you can find PDF copies to download. I first saw some of Yarbus' data about 13 years ago as scratchy black and white scans from the book. One of the most compelling of Yarbus' experiments was an eye-tracking study in which he asked subjects to look at a reproduction of a Russian oil painting, An Unexpected Visitor, painted by Ilya Repin in 1884. Yarbus asked the subjects to look at the same picture in a number of different ways, including: [1] examine the painting freely; [2] estimate the material circumstances of the family; [3] assess the ages of the characters; [4] determine the activities of the family prior to the visitor's arrival; [5] remember the characters' clothes; and [6] surmise how long the visitor had been away from the family. What is brilliant is that the eye-tracking traces recorded by Yarbus showed that the subjects visually interrogate the picture in a completely different way depending on what they want to get from it. Cabinet Magazine (Issue 30, The Underground, Summer 2008) has a piece by Sasha Archibald called Ways of Seeing that takes the original eye-tracking traces from Yarbus' book and superimposes them on a colour reproduction of the painting. This is the first time I have seen this done. The originals in the book by Yarbus are disembodied eye-tracking traces laid out near to, but not overlaying, the reproduction of the Repin painting. These new overlays by Archibald are worth comparing. Here are (left) the original image, (middle) free examination, and (right) what the subject did when asked to estimate the material circumstances of the family. Best wishes, Matt yarbus-visitor.jpg -- Matt R (email)
 Example of industrial supplier using retinal tracking A major industrial supplier asked me to participate in a study of their web interface. After an interview, they sat me before a monitor and handed me about six different objects to find on their website. One object was a small plastic pipe fitting, and I remember a couple of fasteners. A tiny web cam atop the monitor tracked my eye movements as I negotiated the site and found the products. They were testing frames, as I recall. Judging by my compensation, this testing is expensive, but the quality of their website shows. -- Jon Gross (email)