Wednesday, May 30, 2007
All About Image Sensors
At the heart of every digital camera is an electronic marvel
Photosite Size And Color
Pixel sizes are measured in microns, and they vary from sensor to sensor. A micron is one-millionth of a meter, so there are more than 25,000 in an inch. The period at the end of this sentence is roughly 500 microns wide, and a typical photosite ranges from about 2 to 10 microns. Fitting more and larger photosites on a sensor requires a larger chip, while smaller photosites can be squeezed into a smaller one. So why not just fill a large sensor with the smallest photosites possible and maximize resolution?
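The arithmetic above can be sketched in a few lines. This is a back-of-the-envelope illustration, not data from any particular camera; the function name and the pitches are hypothetical:

```python
MICRONS_PER_METER = 1_000_000
MICRONS_PER_INCH = 25_400  # 25.4 mm per inch, hence "more than 25,000"

def photosites_per_mm(pitch_microns):
    """How many photosites of a given pitch fit along 1 mm of sensor."""
    return 1000 / pitch_microns

# A 2-micron pitch packs 500 photosites into each millimeter of sensor
# width; a 10-micron pitch fits only 100 in the same space.
print(photosites_per_mm(2))   # 500.0
print(photosites_per_mm(10))  # 100.0
```

This is why small-pitch photosites are so tempting to sensor designers: resolution climbs quickly as pitch shrinks, at the cost of the noise and color problems described next.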
Bigger photosites typically mean lower noise and greater dynamic range. A smaller site, like a smaller bucket, holds less charge. Weighed against the electrical noise a sensor produces, the larger charge capacity of a big photosite yields a better signal-to-noise ratio, though that photosite also needs room to dissipate its heat and to keep its charge from commingling with its neighbors'. That spacing comes at a cost: well-spaced photosites decrease the fill factor, which increases the potential for aliasing, moiré and other artifacts. Add to this the fact that the tiniest photosites have difficulty rendering colors accurately and, according to Foveon's Turner, you're left with a general rule: “Bigger pixel, better sensitivity, better signal-to-noise performance.”
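The "bigger pixel, better signal-to-noise" rule can be made concrete with a simplified model. The sketch below assumes photon shot noise is the only noise source and uses a made-up, uniform photon flux; real sensors add read noise, dark current and more. Under those assumptions, signal grows linearly with photosite area while noise grows with the square root of the signal, so SNR scales with the square root of area:

```python
import math

def shot_noise_snr(area_um2, photons_per_um2=1000):
    """Idealized SNR under photon shot noise alone (hypothetical flux).

    Collected signal is proportional to photosite area; shot noise is the
    square root of the signal, so SNR grows as sqrt(area).
    """
    signal = area_um2 * photons_per_um2  # photons collected
    noise = math.sqrt(signal)            # shot-noise standard deviation
    return signal / noise

# Doubling the pitch quadruples the area and doubles the SNR in this model.
small = shot_noise_snr(2 * 2)  # 2-micron photosite
large = shot_noise_snr(4 * 4)  # 4-micron photosite
print(large / small)  # 2.0
```

The same square-root relationship is why a modest increase in pixel pitch can produce a visible drop in image noise.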
Blooming also degrades the signal-to-noise ratio, and it happens more easily when photosites are densely packed. Both CCD and CMOS sensors fill a pixel well with electrical charge, and in bright-light situations the excess charge can spill out of the well into adjacent pixels. CMOS has a natural immunity to blooming thanks to the larger pathways between photosites, while CCDs require more engineering to minimize it: anti-blooming drains are added to bleed off overflow charge and protect nearby pixels from contamination.
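A toy model makes the overflow behavior clear. The full-well capacity below (50,000 electrons) and the function itself are hypothetical, chosen only to illustrate the idea of a well that clips at capacity, with the excess either spilling (blooming) or being discarded by a drain:

```python
def expose(incoming_charge, full_well=50_000, drain=True):
    """Toy pixel well: returns (stored charge, spilled charge).

    Charge beyond the full-well capacity is either removed by an
    anti-blooming drain or spills toward adjacent pixels (blooming).
    """
    if incoming_charge <= full_well:
        return incoming_charge, 0
    overflow = incoming_charge - full_well
    # With a drain the overflow is safely discarded; without one it
    # contaminates neighboring pixels.
    return full_well, (0 if drain else overflow)

print(expose(80_000, drain=False))  # (50000, 30000): 30,000 e- bloom out
print(expose(80_000, drain=True))   # (50000, 0): drain absorbs the excess
```

The drain trades a little highlight information (the overflow is thrown away) for clean neighbors, which is why it counts as extra engineering rather than a free fix.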
The relationship between neighboring pixels also matters in terms of how pixels work together to render the color in a given image. In even the most expensive professional digital cameras, CCD and CMOS sensors record light only as grayscale luminance. To turn those readings into full-color images, a color filter array (CFA) is applied over the sensor.
The most popular CFA is known as the Bayer pattern, which is essentially a checkerboard of red, green and blue filters. Each filter passes only its own color, so the grayscale reading of an individual photosite is reassigned that color during processing to deliver an accurate color value at that pixel. The three primary colors, however, aren't distributed evenly in a Bayer-pattern CFA: there are two green filters for every red or blue filter, meaning that any given square of four pixels on the sensor contains two green-sensitive pixels, one red-sensitive and one blue-sensitive. This is because the human eye is most sensitive to green light, particularly with respect to sharpness.
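The checkerboard described above can be generated directly. This sketch assumes one common phase of the Bayer pattern (RGGB, with red in the top-left corner); cameras ship the pattern in any of four phases, and the helper name is hypothetical:

```python
def bayer_filter_color(row, col):
    """Color of the Bayer filter at a sensor position (RGGB phase).

    Greens sit on one diagonal of each 2x2 block, giving two green
    filters for every red or blue one.
    """
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# One 2x2 tile of the mosaic: two greens, one red, one blue.
for r in range(2):
    print(" ".join(bayer_filter_color(r, c) for c in range(2)))
# R G
# G B
```

Tiling that 2x2 block across the whole sensor reproduces the full mosaic, which is why any four-pixel square contains exactly two greens.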
With a Bayer-pattern CFA, the sensor is effectively interpolating color data. During processing, the camera estimates the two missing color values at each pixel from the readings of neighboring pixels, a step commonly called demosaicing. The only photographic sensors collecting all colors of light at every pixel are made by Foveon, and they're currently available only in the Sigma SD-series cameras.
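The simplest form of this interpolation is bilinear averaging. The sketch below shows one step of it: estimating the missing green value at a photosite that sits under a red filter, whose four edge neighbors are all green in the Bayer layout. Real cameras use far more sophisticated, edge-aware algorithms, and the raw values here are invented for illustration:

```python
def green_at_red(raw, r, c):
    """Estimate the missing green value at a red photosite by averaging
    its four green neighbors (simple bilinear demosaicing)."""
    return (raw[r - 1][c] + raw[r + 1][c] + raw[r][c - 1] + raw[r][c + 1]) / 4

# A 3x3 patch of raw grayscale readings; the center pixel sits under a
# red filter, so its four edge neighbors are green-filtered.
raw = [
    [10, 80, 10],
    [90, 50, 70],
    [10, 60, 10],
]
print(green_at_red(raw, 1, 1))  # 75.0
```

Repeating estimates like this for every pixel and every missing channel is what turns the grayscale mosaic into a full-color image, and it is exactly the step a Foveon-style sensor avoids by capturing all three colors at each site.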