Errata & FAQ to Through the Digital Lens
by Charles Maurer
Is Photoshop necessary?
For fixing up snapshots, no. Many applications will do that just fine, including quite a nice piece of (defunct, alas) freeware called PixelNhance.
If I have Photoshop, do I really need all those other products?
No. You can do excellent work without them; you may just spend more time at it and/or run into some limitations.
Is there any simpler alternative to Photoshop that's comparable in power?
I wish there were. For scientific uses the open-source ImageJ is well worth a look, but it is not appropriate for ordinary photography. For general use, the only approximation to Photoshop is GIMP, which I find even more awkward to use and does not work with Photoshop plug-ins. Photoshop Elements is less confusing than Photoshop CS but the commands are still a jumble and the user interface of Elements 3 feels more like Windows than Macintosh. Elements 3 also ignores the system's ColorSync setting and uses Microsoft's sRGB profile instead of Apple's, which can be a nuisance if you work in different programs. (See Colour & Computers). I would love to see a photo editor comparable to Create but it does not exist.
Do I need Photoshop CS?
Probably not. Photoshop Elements ought to do most people just fine.
FAQ: Colour & Computers
How can you recommend ignoring ICC colour profiles when they are so important to the graphic arts? Everybody knows that you can't get good colour without them.
I am not recommending this for firms in the graphic arts, I am recommending it for photographers who are having trouble balancing colour and who will be content with pleasing colour rather than the ultimate. The ICC say the same thing themselves on their web site. After explaining that profiling has the potential to be better they point out, "But it is not trivial and you are unlikely to be fully successful if you do not invest some time, effort and money." They suggest, "As an alternative to profiling you may want to set up an sRGB workflow."
How can you say that most commercial printers assume sRGB when other profiles work better for offset printing?
I am not talking about offset printing, I am talking about photographs and commercial photo printers.
How important is a wide gamut of colour in a photograph?
Very important if you are comparing two photos side-by-side but less important otherwise. The eye interprets scenes through relative values, not absolute values.
What is colour temperature?
The more heat you pour into an object, the more heat will radiate out from it. Radiated heat is electromagnetic radiation, which is the stuff of light waves, so if you get an object hot enough, it will glow like a light bulb. Indeed, that's how a light bulb works: a wire is heated until it glows. Incandescence is a mixture of light waves of different frequencies. The exact mixture varies systematically with the temperature of the object. The colour temperature of a lamp is the temperature that corresponds to the mixture of light that it emits.
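That mixture can be computed from Planck's law for black-body radiation. As an illustrative sketch (the formula and constants are standard physics, not anything from these articles), here is how the balance of blue to red light shifts with temperature:

```python
import math

# Planck's law: spectral radiance of an ideal (black-body) radiator.
# Illustration only: it shows why hotter objects emit a bluer mixture.
H = 6.626e-34   # Planck's constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann's constant, J/K

def radiance(wavelength_m, temp_k):
    """Spectral radiance B(lambda, T) of a black body."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.exp(H * C / (wavelength_m * K * temp_k)) - 1.0
    return a / b

def blue_to_red_ratio(temp_k, blue=450e-9, red=650e-9):
    """Relative strength of blue vs. red light in the emitted mixture."""
    return radiance(blue, temp_k) / radiance(red, temp_k)

# A household bulb (~2800 K) emits far less blue relative to red than
# daylight (~5500 K) does: the mixture shifts with temperature.
print(round(blue_to_red_ratio(2800), 3))
print(round(blue_to_red_ratio(5500), 3))
```

At 2800 K the blue/red ratio comes out well below one; near 5500 K the two are roughly balanced, which is why daylight film and daylight white-balance settings differ from tungsten ones.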
What is the significance of colour temperature?
Lamps of different colour temperatures emit different mixtures of wavelengths, so they will cause different proportions of wavelengths to be reflected off dyes and pigments into the eye. The mixture of wavelengths is what determines how we see colour. Thus, colour temperature directly influences how we see colour.
What is "equivalent colour temperature"?
Only objects that are heated until they glow have a colour temperature—the sun, incandescent lamps, fires and the like. Fluorescent tubes do not, nor do mercury-vapour or sodium-vapour lamps. However, you can still hold a colour-temperature meter to a fluorescent tube and get a reading. That reading is the "equivalent colour temperature."
How equivalent is "equivalent"?
Not very. No fluorescent tube emits all of the frequencies found in incandescence. If a pigment reflects a lot of energy at a certain frequency, and a fluorescent tube emits none at that frequency, then a colour requiring that frequency may look bright beneath an incandescent lamp but dark under a fluorescent tube.
That said, some fluorescent lamps are used by the graphic-arts industry for comparing colours. These have been especially designed for the purpose, to come close to incandescence. Although they aren't perfect, they are more practical than incandescent lamps would be, because the graphic-arts standard for judging colour approximates sunlight, and incandescent lamps that approximate sunlight are hot and short-lived.
How can you compare the colour accuracy of two lamps?
If they have different colour temperatures, you cannot. If they have the same equivalent colour temperature, you can compare them by their Colour Rendering Index (CRI).
What is the Colour Rendering Index?
Take two lamps that measure the same with a colour-temperature meter. Let one of the lamps meet a formal standard and compare the other to it. To compare them, measure the lamps' reflectance off eight colour patches, using a meter that is calibrated to "see" colours as the Standard Observer would see them. The CRI is a calculation of the similarity of these readings, expressed as a percentage. A CRI of 93 purports to indicate that the lamp in question has an accuracy of 93%.
This is a little like measuring the distance from Toronto to Vancouver to the nearest kilometre but the number does give a rough-and-ready idea of whether a lamp will tend to make colours look reasonable or bizarre. In the graphic-arts business, a CRI of 90 is usually deemed acceptable for comparing colours.
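For the curious, the arithmetic can be sketched in a few lines. Under the standard method (CIE 13.3), each patch's colour difference ΔE between the test lamp and the reference becomes a score of 100 − 4.6·ΔE, and the general CRI is the average of the eight scores. The patch readings below are invented for illustration:

```python
# A sketch of how the CRI number is assembled. The scoring formula
# (R_i = 100 - 4.6*dE) is the standard one; the dE values are made up.

def cri(colour_differences):
    """General CRI (Ra): mean of the eight special colour-rendering indices."""
    scores = [100.0 - 4.6 * de for de in colour_differences]
    return sum(scores) / len(scores)

# Hypothetical colour differences for the eight standard patches:
sample_dE = [1.0, 2.5, 1.5, 3.0, 2.0, 1.0, 2.5, 1.5]
print(round(cri(sample_dE)))   # → 91
```

A lamp identical to the reference on every patch would score a CRI of exactly 100; larger differences pull the average down.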
Do you know anything more about the native resolution of any inkjet printers?
Yes, I have since experimented some more with an Epson Stylus Pro 9600 using seven Ultrachrome inks. This is a 44" printer but I would expect it to be similar to smaller printers using the same inks (e.g., 7600, 2200). With my striped test pattern it gave best results at 288 dpi. That appears to be the printer's real resolution. The cleanest stripes were at the printer's "resolution" settings of 1440 dpi or 2880 dpi, with 720 dpi almost as good. With a photograph fed to the printer at 288 dpi, I could see no difference between 720 dpi and 1440 dpi, but 2880 dpi showed a hint of grain-like noise in smooth areas. Unidirectional (slow) printing was a little cleaner than bidirectional (fast). Images optimized for the Epson and images optimized for the Olympus print came out comparably sharp and comparably detailed.
How can you say that detail doesn't matter?
I don't. I say that (1) detail you cannot see does not matter, (2) detail you can hardly see hardly matters, and (3) sharp edges matter more to the eye than fine lines.
Aren't you alone in saying that resolution is not so important?
Not at all. It is a commonplace in optical engineering and a truism in visual perception.
Why do so many people think resolution is so important?
It's easy to understand and it's easy to test, so lots of people talk about it.
How can fine resolution be unimportant if I can see obvious differences in detail between photos?
Most obvious differences are actually differences of contrast and/or edge sharpness, not of resolution. If you compare test photos of the same subject shot on a tripod, then you may see differences in fine detail that can be attributed to resolution, but they are not likely to be obvious unless you make side-by-side comparisons of large enlargements from close up.
If you make a Bayer sensor without a blurring filter, will it be as sharp as a Foveon?
No. Even if the optical image is perfect, Bayer interpolation will blur it. This interpolation is based on averaging, so whenever black is next to white, grey will be interpolated between them.
That is why it is important not to enlarge Foveon images using an interpolating algorithm that takes a running average. Enlarging Foveon images by taking a running average will create blur.
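To see the effect in miniature, consider a toy one-dimensional example (my own illustration, not anyone's actual demosaicing or resampling code):

```python
# Averaging neighbours across a black/white edge manufactures grey pixels.
def running_average(row):
    """Interpolate each pixel as the mean of itself and its neighbours."""
    out = []
    for i in range(len(row)):
        neighbours = row[max(0, i - 1): i + 2]
        out.append(sum(neighbours) / len(neighbours))
    return out

edge = [0, 0, 0, 255, 255, 255]   # a perfectly sharp black/white edge
print(running_average(edge))      # grey values appear at the boundary
```

The sharp transition from 0 to 255 becomes a ramp through 85 and 170: grey that was never in the scene, which the eye reads as blur.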
Foveon claim their sensor is comparable to a 6.3MP Bayer. So do other reviewers. How can you compare it to a 13.8MP Bayer?
Foveon compare resolution alone, as do most reviewers, but I compare overall image quality. Moreover, to equate size, other comparisons enlarge the Foveon's image with an application that interpolates a running average, thereby adding blur.
If Foveon sensors are so good, why are Bayer the standard of the industry?
Because Bayer sensors have been around for longer. Bayer sensors are also likely to remain the standard, because manufacturers have invested a lot of money in factories and marketing. That has nothing to do with the sensors' qualities or their quality.
Are you saying that Foveon sensors are better than Bayer?
No, I am saying that they are better in some ways, less good in other ways, and comparable overall, if you compare sensors with a similar number of pixels (complete pixels, not cells).
How can you define "pixel" as you do when the entire industry defines it differently?
Engineers look at an image as a mosaic of voltages, so their pixels are the smallest elements that emit a voltage. The camera industry have adopted this definition. It is appropriate for electrical engineering but has no sensible application to colour photographs. We see a photograph as a visual image, not as a matrix of voltages. To the eye, the elements forming a photograph are the smallest spots of the correct colour. This usage of "pixel" comes from basic English and is common in the field of perception.
How can you say that no photograph has more than four megapixels?
I don't, I say that (a) four megapixels hold all the information that you can see in a photograph at normal viewing distances but (b) a picture will often need more than four megapixels of ink. Two distinct dimensions are involved, (a) the visual information and (b) the presentation of this information. Think of a football play on television: (a) it's over in 5 seconds then (b) it's shown on an instant replay that is slowed down to take 10 seconds. The play contains five seconds of blocking and tackling but in the replay, interpolated time expands that to ten seconds. Adding that time adds no information—no additional player is being tackled—and "seconds" measures both dimensions.
If you crop a picture, don't you want more pixels?
Yes, to enlarge a cropped picture you will want more pixels—but you will also want greater edge sharpness, because the transitions at edges will be magnified, and you will want less noise, because noise will be magnified as well. In short, the requirements for a cropped photo are much as they are for an uncropped photo.
How can you say that lens quality doesn't matter?
I don't, I say that it matters less than most people believe. Differences among modern lenses are small enough that they can rarely be seen without a side-by-side comparison of identical test pictures shot with the camera on a tripod. It is true that experienced, perceptive photographers may notice some specific circumstances that seem to bring out flaws in a lens, and some of these flaws may matter to exacting pros. That's why I was so discriminating when I reviewed the Sigma lenses in the fourth article. However, most such differences are small enough to disappear into the woodwork. They loom large only when you look for them.
How can you say that colour reproduction is not important?
I do not say it is unimportant, I say there is no practical way to assess it. For an explanation, see the second article in this series, Colour & Computers.
If the choice of metering modes is not important, why do so many cameras offer it?
It is a hold-over from film. People have not fully adapted their expectations and practices to the new technology.
How can you compensate for a soft lens?
Softness is optical blur. If this blur can be characterised mathematically, then a computer can subtract some of it from the image. For example, if you know that you moved the camera leftward during an exposure so that each point exposed two pixels, then the computer can subtract some of the exposure from each pixel and add it to the pixel on the right. The same holds for two-dimensional blur, such as poor focus or generalized softness. The computer cannot know exactly how much information to move, so its compensation can never be perfect, but if guided judiciously by a human being, the results can be remarkably good. "Smart Sharpening" in Photoshop CS2 can do this, as can the Photoshop plug-ins Focus Magic and FocusFixer.
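As a toy sketch of the two-pixel example above (my own code, not the algorithm any of these products actually uses):

```python
# If the camera smeared each point across two pixels, every recorded
# pixel is the average of two true ones:
#   blurred[i] = (sharp[i] + sharp[i + 1]) / 2
# Knowing that, a computer can unwind the averaging step by step.

def blur_two_pixel(sharp):
    """Simulate a two-pixel horizontal motion blur."""
    return [(sharp[i] + sharp[i + 1]) / 2 for i in range(len(sharp) - 1)]

def deblur_two_pixel(blurred, first_pixel):
    """Recover the scene, given the true value of the first pixel."""
    sharp = [first_pixel]
    for b in blurred:
        sharp.append(2 * b - sharp[-1])   # move the smeared exposure back
    return sharp

scene = [10, 10, 200, 200, 10]
smeared = blur_two_pixel(scene)              # [10.0, 105.0, 200.0, 105.0]
print(deblur_two_pixel(smeared, scene[0]))   # the original values return
```

In real photographs the true value of that first pixel is unknown and the blur is never so tidy, which is why such compensation can only ever approximate.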
Note that this is not "sharpening" as the term is usually used. "Sharpening" in the usual sense distorts the edges of lines to make them contrastier. A modest amount of this contrast at edges will usually make an enlargement look sharper but it cannot compensate for optical blur.
How are Sigma lenses compared to others?
I do not know but when I examine photos taken with other lenses, I often see similar problems.
Does the camera still lack a suitable wide-angle lens?
No, Sigma now supply a good 10-20 mm.
Do digital cameras need different lenses?
They can work with lenses designed for film but image sensors have different optical characteristics than film, so lenses that are optimized for image sensors are different.
What should I look for in a point-and-shoot?
The same kinds of things as in a fancier camera, except for the manual controls. The only additional feature I look for in a point-and-shoot is a lens that retracts into the body, so that I can carry the camera in a pocket.
It has recently become possible to buy a point-and-shoot that compensates for camera shake by moving either the sensor or some element of the lens. This feature is so desirable that, if it is within range financially, I cannot see buying a model that lacks it. I can specifically recommend the line of Panasonic cameras with optically stabilized Leica lenses. We bought the DMC-FX7, because we liked its size and LCD, but I would expect them all to work similarly. Panasonic's latest generation of image processor ("Venus II") is remarkably competent. It even removes colour fringing.
What is the difference between optical zoom and digital zoom?
Optical zoom enlarges the image coming into your camera; digital zoom merely crops the image once it's inside the camera. Digital zoom is the same as cropping in your computer, with this difference: you cannot go back and change your mind. Thus, optical zoom is worthwhile but digital zoom is not.
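A toy sketch (mine, not any camera's firmware) makes the point concrete: "digital zoom" is just a crop plus an enlargement, so no new detail is created and the cropped-away pixels are lost for good.

```python
def digital_zoom_2x(image):
    """Crop the central half of a 1-D 'image', then double each pixel."""
    n = len(image)
    cropped = image[n // 4: n - n // 4]   # keep only the middle half
    enlarged = []
    for px in cropped:
        enlarged += [px, px]              # pixel replication adds no detail
    return enlarged

row = [1, 2, 3, 4, 5, 6, 7, 8]
print(digital_zoom_2x(row))   # → [3, 3, 4, 4, 5, 5, 6, 6]
```

The output has the same pixel count as the input but half the information; pixels 1, 2, 7 and 8 are gone forever, which is exactly what happens when you crop in your computer.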
Did Sigma pay you to write this article?
Nobody paid me a thing and I paid the normal prices for all of my equipment. I wrote these articles to raise money for Doctors without Borders. If you found that these articles helped to save you money, please send them a portion of what you saved.
How large are the different image sensors?
Here are the common sizes:
Also, many cameras use sensors from 14mm x 21mm to 16mm x 24mm, and some cameras use sensors the size of 35mm film (24mm x 36mm).