
Tuesday, August 14, 2012

The Quest For Ultimate Image Quality

The complex relationship between cameras, sensors and noise—bigger resolution, large sensors and huge pixels won’t necessarily lead to improved image quality


Lens diffraction also limits total image resolution regardless of megapixels, and it becomes most noticeable at the narrowest apertures. There's a limit to how small an aperture can be before the light passing through it spreads out and the blur from neighboring points begins to overlap; each point of light is rendered not as a point, but as a diffraction pattern known as the Airy disk. For that reason, even a perfectly made lens with a perfectly circular aperture has a fundamental resolution limit. Because diffraction builds gradually as the aperture narrows, while other aberrations improve as a lens is stopped down, intermediate apertures are often the sharpest on a lens.
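To put rough numbers on that limit, the sketch below estimates Airy disk diameters at common apertures. It assumes green light at 550nm and the standard first-minimum approximation of 2.44 x wavelength x ƒ-number; real lenses add their own aberrations on top of this floor.

```python
# Back-of-the-envelope Airy disk sizes, assuming green light at 550nm.
# The first-minimum diameter of the Airy pattern is roughly
# 2.44 * wavelength * f-number.

WAVELENGTH_MM = 0.00055  # 550nm green light, expressed in millimeters


def airy_disk_diameter_um(f_number, wavelength_mm=WAVELENGTH_MM):
    """Approximate Airy disk diameter in microns for a given f-number."""
    return 2.44 * wavelength_mm * f_number * 1000.0  # convert mm to microns


for f_number in (2.8, 5.6, 8, 11, 16, 22, 32):
    disk = airy_disk_diameter_um(f_number)
    print(f"f/{f_number:<4} -> Airy disk ~{disk:5.1f} microns")
```

By ƒ/22 the disk is on the order of 30 microns, several times the pixel pitch of a typical high-resolution sensor, which is why extra megapixels stop paying off at very small apertures.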

Diffraction interacts with sensor size, as well. Because sensor size affects the angle of view, larger sensors require longer focal lengths and smaller apertures to achieve the same framing and depth of field as smaller sensors. A 100mm lens at ƒ/2.0 on an APS-C camera with its 1.6x crop factor, for instance, is matched on a full-frame sensor by a 160mm focal length at roughly ƒ/3.2 to produce the same framing and depth of field. Because the Airy disk grows as the aperture narrows, and occupies a larger fraction of the frame on a smaller sensor, it can become larger than the circle of confusion (the largest blur spot a viewer still perceives as sharp), which reduces apparent sharpness. In fact, the smaller the aperture (ƒ/32, ƒ/22), the larger the Airy disk and, hence, the lower the achievable resolution. Lens limitations aren't often brought up in discussions of sensor image quality, but diffraction is a limiting factor on sub-full-frame cameras in particular, and it's at least partially responsible for the perceived image-quality gap between full-frame and sub-full-frame sensors.
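The crop-factor arithmetic in that example is simple enough to sketch out. The snippet below assumes the same 1.6x APS-C crop factor and multiplies both the focal length and the ƒ-number by it to find the full-frame equivalent.

```python
# A sketch of the equivalence arithmetic described above,
# assuming a 1.6x APS-C crop factor.


def full_frame_equivalent(focal_length_mm, f_number, crop_factor=1.6):
    """Return the full-frame focal length and f-number that match the
    smaller sensor's angle of view and depth of field."""
    return focal_length_mm * crop_factor, f_number * crop_factor


focal_eq, aperture_eq = full_frame_equivalent(100, 2.0)
print(f"100mm f/2.0 on APS-C ~ {focal_eq:.0f}mm f/{aperture_eq:.1f} on full frame")
# -> 100mm f/2.0 on APS-C ~ 160mm f/3.2 on full frame
```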


Nikon D800
The Final Resolution
In summation, more pixels on a sensor will indeed lead to higher image quality if all other factors are ignored. It's also true that a large sensor with smaller pixels will generally show less noise than a smaller sensor with larger pixels, for two reasons: 1) the larger sensor has more total area with which to gather light; and 2) its image doesn't have to be enlarged as much as the smaller sensor's to produce a comparable print. At the pixel level, however, smaller pixels are noisier than larger ones because each captures less light relative to the noise generated, a relationship expressed mathematically as the signal-to-noise ratio. On the other hand, smaller pixels allow more of them to fit on a sensor, which yields finer detail for larger prints and output files.
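That signal-to-noise relationship can be illustrated with a deliberately simplified model. The sketch below assumes photon shot noise is the only noise source and that the light collected scales with pixel area, with an arbitrary photon density standing in for a real exposure; actual sensors add read noise, dark current and other effects.

```python
# A simplified shot-noise model, assuming photon noise dominates and photon
# capture scales with pixel area. The photon density is arbitrary and only
# serves to compare pixel sizes against each other.
import math


def shot_noise_snr(pixel_pitch_um, photons_per_square_micron=1000):
    """Approximate signal-to-noise ratio for a single pixel of the given pitch."""
    signal = photons_per_square_micron * pixel_pitch_um ** 2  # photons collected
    noise = math.sqrt(signal)                                 # photon shot noise
    return signal / noise


for pitch in (2.0, 4.0, 6.0, 8.0):
    print(f"{pitch:.1f} micron pixel -> SNR ~ {shot_noise_snr(pitch):.0f}:1")
```

Doubling the pixel pitch quadruples the light collected and doubles the per-pixel signal-to-noise ratio under this model, which is the essence of why larger pixels look cleaner.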

That said, more pixels on the sensor (megapixels) isn't a guarantee of better image quality and, in fact, more resolution in this manner can degrade image quality if the sensor doesn't employ microlenses or other devices that increase the overall quantum efficiency of light gathering and interpretation. The same can be said for larger-sensor formats. Ideally, a large sensor with a lot of pixel resolution offers the best of both worlds as long as that sensor is capable of using all of that information in an efficient manner. Larger sensors gather more light; for that reason, they're inherently less noisy than smaller sensors, but overall quantum efficiency matters most, and even large sensors with poorly designed imaging processors can perform worse than well-designed, sub-full-frame imaging chains.
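As a rough illustration of why efficiency can trump size, the toy comparison below multiplies sensor area by an assumed overall efficiency figure. The sensor dimensions are nominal and the efficiency numbers are purely illustrative, not measurements of any real camera.

```python
# A toy comparison of total light captured, assuming identical exposure and
# that usable signal scales with sensor area times overall efficiency.
# The efficiency figures below are invented for illustration only.

SENSORS = {
    "full frame, 25% efficient": (36.0 * 24.0, 0.25),
    "APS-C, 60% efficient":      (23.6 * 15.6, 0.60),
}

for name, (area_mm2, efficiency) in SENSORS.items():
    relative_signal = area_mm2 * efficiency
    print(f"{name}: relative signal ~ {relative_signal:.0f}")
```

With these made-up numbers, the smaller but more efficient chain edges out the larger, less efficient one, which is the article's point: area gives a head start, but the whole chain decides the outcome.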


Hasselblad H4D-40
This is because overall efficiency, in the broad sense used here, extends well beyond the sensor's quantum efficiency: it takes in every link that affects total image quality, from the lens to the sensor, the anti-aliasing filter, the analog-to-digital converter, processing algorithms and in-camera noise reduction, all the way out to software edits on a computer. All of these factors influence image quality to varying degrees, but barring glaringly bad manufacturing, no single one of them determines overall image quality on its own; at the same time, improving any single component will improve overall image quality, even if only marginally.

Even quantum efficiency as a measurement leaves out a number of fundamental imaging-system comparisons, such as color rendition and image distortion. Each element, no matter its perceived importance, is only a single parameter of the image chain, so it's worth evaluating an entire imaging system rather than getting stuck on any single specification, especially one as notorious as megapixel count or sensor size.


 
