If you’d like to see how much noise each ISO setting on your camera produces, you can shoot frames (in manual mode) at each ISO the camera provides, with the lens cap over the lens. The resulting images will be very dark. Enlarge each to 100% on-screen and apply Auto Contrast in Photoshop to really bring out the noise. Then compare the results. You may find some surprising things, as Tony Lorentzen did in the video cited at the start of this article.
If your DSLR does video (and you intend to use that feature), try the test in video mode, too. The optimal ISO setting(s) may differ from those for still photography, depending on how the manufacturer processes the different ISOs.
But that’s not the end of your testing. Once you’ve established the optimal ISO(s) for your camera in terms of noise, go out and shoot some real photos at different ISO settings and compare the results (the real photos will look better than the exaggerated grain test you just did). Then you’ll know what to expect at the various settings in the future. After all, sometimes you need to shoot at a higher or lower ISO than the optimal one(s) to get a shot, and it’s nice to know what to expect, quality-wise. The ultimate judge of the best ISO setting(s) for your gear and photography is you.
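If you want to put a number on what the dark-frame test shows you on-screen, the standard deviation of the pixel values in each dark frame is a reasonable noise estimate: a perfectly clean frame would be uniformly black, so any spread is noise. The sketch below simulates dark frames at a few ISO settings (the simulated noise levels are illustrative only, not real camera data); with actual captures you would load each frame with an image library instead.

```python
import random
import statistics

def noise_level(frame):
    """Estimate sensor noise as the population standard deviation of pixel values."""
    return statistics.pstdev(frame)

random.seed(0)
# Simulated dark frames as flat lists of pixel values.
# Assumption for illustration: read noise grows with ISO.
frames = {iso: [random.gauss(0, iso / 100) for _ in range(4096)]
          for iso in (100, 400, 1600, 6400)}

for iso in sorted(frames):
    print(f"ISO {iso:5d}: noise estimate {noise_level(frames[iso]):.2f}")
```

Comparing these numbers across ISO settings gives you the same ranking the Auto Contrast trick reveals visually, without relying on your eye.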
The photodiodes used in image sensors are color-blind—they detect only the amount of light (the number of photons), not its color. To produce color images, conventional sensors are covered by a grid of red, green and blue filters arranged in a Bayer array (named for Bryce Bayer, who created it)—see Figure A. Thus, each pixel receives only red, green or blue light. The image processor (the camera’s, for JPEG images; the RAW converter software and your computer, for RAW files) turns the image into a color one by a process known as demosaicing: Data for the missing colors at each pixel are derived from neighboring pixels via complex proprietary algorithms. This sounds convoluted, but actually produces excellent images.
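The idea behind demosaicing can be sketched in a few lines. The version below uses simple neighborhood averaging on an RGGB Bayer pattern—each pixel keeps averages of the same-color values in its 3×3 neighborhood for all three channels. Real converters use far more sophisticated, proprietary algorithms, so treat this only as a minimal illustration of the principle; the function and pattern names are my own.

```python
BAYER = [["R", "G"], ["G", "B"]]  # RGGB pattern, repeating every 2x2 block

def bayer_color(y, x):
    """Color of the filter over the photosite at row y, column x."""
    return BAYER[y % 2][x % 2]

def demosaic(mosaic):
    """Turn a 2-D list of raw sensor values into (R, G, B) pixels by
    averaging the same-color values in each pixel's 3x3 neighborhood."""
    h, w = len(mosaic), len(mosaic[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            samples = {"R": [], "G": [], "B": []}
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        samples[bayer_color(ny, nx)].append(mosaic[ny][nx])
            row.append(tuple(sum(v) / len(v) if v else 0.0
                             for v in (samples["R"], samples["G"], samples["B"])))
        out.append(row)
    return out
```

Feeding it a uniform gray mosaic, for example, returns the same gray in all three channels at every pixel, since every 3×3 neighborhood contains samples of all three filter colors.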
Sigma’s DSLRs use the unique Foveon sensor, which collects all three primary colors at every photosite. It does this by taking advantage of the fact that different wavelengths penetrate silicon to different depths. The Foveon sensor stacks three layers of pixels, the top collecting primary blue light, the middle layer, green, and the bottom layer, red—see Figure B. This does away with the need for the Bayer filter array (and the blurring filter needed to eliminate artifacts inherent in that system) and produces improved image quality for a given horizontal-by-vertical image pixel count.