
Monday, March 3, 2008

Turbo'd Image Sensors

At the heart of it all, the tiniest technology makes every picture possible




ABOVE: A CCD image sensor requires much more in-camera circuitry than a CMOS sensor. Each additional processing element draws power and also generates heat, which can contribute to increased noise, particularly in long exposures. As more pixels are crammed onto a sensor, these issues become ever more important.

Inside CMOS
As the latecomer to the imaging industry, CMOS has been fighting an uphill battle for years. That underdog status has become an advantage, however: thanks to all the research-and-development dollars poured into the technology, CMOS now has surpassed CCD in digital SLR adoption. Technological innovation was necessary to overcome the inherent challenges of the CMOS design.

Unlike CCDs, CMOS sensors do much of the heavy lifting on the chip itself. Image processing effectively begins before the charge ever leaves the sensor, and each photosite can be read directly rather than through the CCD's serial “bucket brigade,” which translates into faster transfer and a more responsive camera once the millions of pixels on a chip are accounted for. The additional electronics on a CMOS sensor mean faster, more flexible information transfer, but they also generate more heat and raise the likelihood of pixel failure. That's where noise reduction steps in to compensate for CMOS' higher inherent noise.
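The contrast between the two readout schemes can be sketched in a few lines of Python. This is a toy model for illustration only, not actual sensor driver code; the function names and the use of plain numbers for charge packets are assumptions made for clarity:

```python
# Toy model contrasting CCD "bucket brigade" readout with CMOS
# per-pixel addressing. Charges are plain numbers here; real
# sensors move analog charge packets and voltages.

def ccd_readout(sensor):
    """CCD: rows shift toward a single readout register, so each
    charge passes through its neighbors on the way off the chip."""
    rows = [row[:] for row in sensor]  # work on a copy
    out = []
    while rows:
        readout_register = rows.pop()            # bottom row shifts out
        while readout_register:
            out.append(readout_register.pop(0))  # serial transfer, one charge at a time
    return out

def cmos_readout(sensor):
    """CMOS: each photosite has its own amplifier and can be
    addressed directly, which is why readout is faster."""
    return [charge for row in sensor for charge in row]

sensor = [[1, 2, 3],
          [4, 5, 6],
          [7, 8, 9]]
print(ccd_readout(sensor))   # [7, 8, 9, 4, 5, 6, 1, 2, 3]
print(cmos_readout(sensor))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The point of the sketch: in the CCD model every charge must be marched through its neighbors to a single output, while the CMOS model reaches each pixel directly.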

Almost eight years ago, Canon pioneered the use of CMOS in digital SLRs with the introduction of the EOS D30. At the time, CCD was the clear front-runner for photographic applications, but since then, more and more CMOS chips have been incorporated into cameras, including top-of-the-line pro models. This industry adoption is evidence of advances both in the image quality delivered by CMOS and in the practicalities of manufacturing and working with these chips on a grand scale.

CMOS sensors consume power far more efficiently than CCDs. A CMOS sensor uses a fraction of a CCD's power because it effectively powers down between captures, switching on only to capture and transfer the charge. Power savings are nice, but ultimately it's the flexibility of CMOS' on-chip electronics that has made the technology appealing to camera makers. Additional image processing, initially included to compensate for increased noise and pixel failure, now gives CMOS an advantage in noise at high ISO settings and in long exposures.
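The power-down-between-captures behavior is simple duty-cycle arithmetic. The figures below are hypothetical illustrations, not measured specifications for any real sensor:

```python
# Duty-cycle power sketch (hypothetical numbers, not real specs):
# a sensor that draws full power only while actively capturing or
# reading out uses far less power on average than one that must
# stay fully powered the whole time.

active_power_mw = 500  # draw during capture/readout (hypothetical)
idle_power_mw = 20     # draw while waiting between frames (hypothetical)
duty_cycle = 0.05      # fraction of time spent actively capturing

avg = duty_cycle * active_power_mw + (1 - duty_cycle) * idle_power_mw
print(f"average draw: {avg:.0f} mW vs {active_power_mw} mW always-on")
```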

Sensor Support Systems
Though the pictures they produce appear seamless, the pixels on a CCD or CMOS sensor definitely are not. Many photographers imagine their sensors are built like a checkerboard, with edge-to-edge adjoining pixels. In truth, it's more like a checkerboard where only the red spaces are actively receptive to light. Because of all the necessary electronics incorporated onto a chip, particularly with CMOS sensors, there's a surprising amount of dead space. The fraction of a sensor's area covered by light-sensitive pixels is called its fill factor. Packing pixels more densely raises the fill factor, but also increases heat, interference and noise; packing them less densely wastes light and raises the chance of nasty artifacts and patterns. No matter how densely they're packed, though, a chip's pixels never will be seamless. That's where technology steps in, filling the voids between the pixels with millions of microscopic lenses.
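Fill factor is just a ratio of areas. As a rough illustration (the dimensions below are hypothetical examples, not specs for any real sensor):

```python
# Fill factor: the fraction of each pixel's footprint that actually
# collects light. All dimensions below are hypothetical.

def fill_factor(pixel_pitch_um, active_width_um):
    """Ratio of the light-sensitive area to the full pixel area,
    assuming square pixels and a square photosensitive region."""
    return (active_width_um ** 2) / (pixel_pitch_um ** 2)

# A 6-micron pixel whose photodiode occupies a 4.2-micron square:
ff = fill_factor(6.0, 4.2)
print(f"fill factor = {ff:.0%}")  # fill factor = 49%
```

In this made-up example, roughly half of every pixel's footprint is electronics rather than photodiode, which is why the microlenses described next matter so much.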

Microlenticular arrays, or microlenses, improve sensor performance and image quality not only by filling the gaps between pixels, but also by accommodating the depth of each photosite. Photographic film is effectively flat, but the photosites of an electronic image sensor are bucket-like in shape, so light must strike them almost directly from above to avoid missing the pixel well and bouncing off its side. Microlenses collimate the incoming light, sending it straight down into the pixel wells, even those way out at the edge of the sensor. Some sensors also utilize microlens shifting to further increase image quality at the edges of the frame. Ultimately, a sensor may devote as little as 50 percent of its real estate to light-sensitive pixels; microlenses can bring the effective figure up to 90 percent or more.
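The payoff of microlenses can be stated as a light-gathering gain. Using the figures cited above (50 percent of incoming light collected without microlenses, 90 percent with them), a quick back-of-the-envelope calculation:

```python
import math

# Effective light gain when microlenses raise the fraction of
# incoming light that reaches the photosites. Percentages are
# the round figures from the article, not measurements.
bare_fill = 0.50       # light collected without microlenses
with_microlens = 0.90  # effective collection with microlenses

gain = with_microlens / bare_fill
stops = math.log2(gain)  # photographers think in stops (factors of 2)
print(f"{gain:.1f}x the light, about {stops:.2f} stops")  # 1.8x the light, about 0.85 stops
```

Nearly a full stop of extra sensitivity from passive optics alone, with no added electronics, heat, or noise.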

Both technologies will remain vital in high-quality imaging applications for the foreseeable future, and to understand why is to understand the inherent differences between the two.



 
