Digital image sensors consist of a grid of light-sensitive photodiodes (pixels). When you make an exposure, each pixel receives a certain amount of light—a certain number of photons—according to the brightness of the part of the scene focused at that pixel. Inherent in this process is aliasing: moiré can result when a finely patterned subject's image at the focal plane conflicts with the pattern of the sensor's pixel grid (see Diagram A).
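For the technically inclined, the aliasing behind moiré can be sketched numerically. In this illustrative Python model (the frequencies and pixel pitch are invented for the example, not taken from any real camera), a stripe pattern finer than the pixel grid can resolve produces exactly the same samples as a much coarser pattern:

```python
import math

# Hypothetical numbers: a stripe pattern of 9 cycles/mm sampled by a
# grid of 10 pixels/mm. The grid can only resolve up to 5 cycles/mm
# (the Nyquist limit), so the fine pattern masquerades as a coarse one.
pattern_freq = 9.0    # cycles per mm
pixels_per_mm = 10.0

samples = [math.cos(2 * math.pi * pattern_freq * i / pixels_per_mm)
           for i in range(10)]

# The very same samples would be produced by a 1-cycle/mm pattern:
# this low-frequency impostor is the moire the sensor records.
alias = [math.cos(2 * math.pi * 1.0 * i / pixels_per_mm)
         for i in range(10)]

assert all(abs(s - a) < 1e-9 for s, a in zip(samples, alias))
```

Once the fine pattern has been sampled, no amount of processing can tell it apart from the coarse impostor, which is why the fix has to happen optically, before the light reaches the pixels.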
A further complication with conventional sensors is that a given pixel records only one primary color of light. Photodiodes don't detect color; they detect the quantity of light (photons) striking them, not its color (wavelength). To produce color images, most sensors employ a Bayer filter array, a grid of green, red and blue filters named after the Kodak scientist who devised it. This grid positions a red, green or blue filter over each pixel so that each pixel receives only one of these primary colors (see Diagram B). The missing color data for each pixel is produced by interpolating data from neighboring pixels, using complex proprietary algorithms, in a process known as demosaicing. Because of demosaicing, and because there are twice as many green pixels as red or blue ones, aliasing produces not only moiré but also false-color artifacts with Bayer-array sensors.
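Real demosaicing algorithms are proprietary and far more sophisticated, but the basic interpolation idea can be sketched for a single pixel. This minimal Python example assumes a Bayer neighborhood in which a red-filtered pixel's four edge neighbors are green and its four diagonal neighbors are blue; the raw values are invented for illustration:

```python
# A 3x3 patch of raw sensor values centered on a red-filtered pixel.
# Color-filter layout for this patch:   B G B
#                                       G R G
#                                       B G B
raw = [
    [60, 120, 64],
    [118, 200, 122],
    [58, 124, 62],
]

red = raw[1][1]  # measured directly by the center pixel
# The missing green and blue values are interpolated from neighbors:
green = (raw[0][1] + raw[1][0] + raw[1][2] + raw[2][1]) / 4
blue = (raw[0][0] + raw[0][2] + raw[2][0] + raw[2][2]) / 4

print(red, green, blue)  # prints: 200 121.0 61.0
```

Because two-thirds of every pixel's color is an educated guess, a strongly aliased pattern can fool the guesswork into producing colors that were never in the scene, which is where the false-color artifacts come from.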
To counter these effects, sensor manufacturers place an anti-aliasing filter (also known as an optical low-pass filter, or OLPF) atop the Bayer filter grid. The anti-aliasing filter is generally a multilayer unit with a top layer that slightly displaces the image horizontally, an infrared filter and a filter that slightly displaces the image vertically. This blurs the image’s high frequencies (fine detail) at the pixel level, eliminating (or at least greatly reducing) moiré and artifacts, but it also has the effect of reducing resolution.
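The displace-and-recombine trick amounts to averaging each pixel with a slightly shifted copy of the image. A toy one-dimensional Python model (an illustration of the principle, not a real filter design) shows why this suppresses moiré-prone detail while leaving coarse detail largely intact:

```python
def two_tap(signal):
    # Overlay the signal with a copy displaced by one pixel and
    # average the two (edge sample repeated at the end).
    return [(signal[i] + signal[min(i + 1, len(signal) - 1)]) / 2
            for i in range(len(signal))]

fine = [1, -1, 1, -1, 1, -1]   # detail alternating at the pixel pitch
coarse = [0, 3, 6, 9, 12, 15]  # a slowly varying gradient

filtered_fine = two_tap(fine)      # pixel-level detail cancels to zero
filtered_coarse = two_tap(coarse)  # the gradient is only slightly smoothed
```

The same averaging that cancels the alternating pattern also rounds off legitimate fine detail a little, which is the resolution penalty the paragraph above describes.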
Medium-format photographers want maximum sharpness and prefer to compensate for any moiré and artifacts on a per-image basis in postprocessing, so medium-format cameras don't have anti-aliasing filters. But most users of smaller-format cameras would rather not have to deal with moiré in postprocessing, so these cameras have used low-pass filters.
As pixel counts have grown in DSLRs and mirrorless interchangeable-lens cameras, manufacturers have started omitting the anti-aliasing filter. The thinking is that with pixel counts so high, and pixel pitch therefore so small, the pixel density itself reduces the possibility of moiré: when pixels are packed that closely, subject patterns are much less likely to intersect the pixel grid at a frequency that produces visible interference. Moiré is still possible, but with higher pixel counts and smaller pixel pitch, it's much less common.
Nikon began the trend with the 36.3-megapixel, full-frame D800E early in 2012. (The D800E actually has the top layer of a low-pass filter and a second layer that cancels the first layer's effect; Nikon also offers the D800 with a weak conventional low-pass filter.) Other current DSLRs with no low-pass filter include Nikon's 24-megapixel D7100, D5300 and D3300, and Pentax's 16-megapixel K-5 IIs and 24-megapixel K-3. The K-3 has a unique two-strength anti-aliasing filter simulator that uses the sensor-shift shake-reduction mechanism. It rapidly moves the sensor down one pixel, right one pixel and up one pixel during the exposure, slightly blurring the image at the pixel level as an anti-aliasing filter would.
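The principle behind that sensor-shift trick can be modeled as averaging the image over the one-pixel offsets the sensor visits during the exposure. This Python sketch (a deliberate simplification of the mechanism, with an invented test image) shows pixel-level detail being smoothed much as a weak optical low-pass filter would smooth it:

```python
def shift(img, dy, dx):
    # Return a copy of the image displaced by (dy, dx) pixels,
    # repeating edge pixels where the shift runs off the frame.
    h, w = len(img), len(img[0])
    return [[img[min(max(y - dy, 0), h - 1)][min(max(x - dx, 0), w - 1)]
             for x in range(w)] for y in range(h)]

def aa_simulate(img):
    # Average the exposures captured at the four corners of the
    # one-pixel square path the sensor traces.
    offsets = [(0, 0), (1, 0), (1, 1), (0, 1)]
    shifted = [shift(img, dy, dx) for dy, dx in offsets]
    h, w = len(img), len(img[0])
    return [[sum(s[y][x] for s in shifted) / len(shifted)
             for x in range(w)] for y in range(h)]

# A one-pixel checkerboard, the sort of detail that triggers moire:
checker = [[(x + y) % 2 * 255 for x in range(4)] for y in range(4)]
blurred = aa_simulate(checker)
# Away from the edges, every output pixel averages two black and two
# white samples, flattening the pattern to a uniform mid-gray (127.5).
```

The appeal of doing this mechanically rather than optically is that the blur can be switched off, or applied at two strengths, on a shot-by-shot basis.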