Tuesday, February 25, 2014
Radical advancements in AF technology are blowing away old assumptions about phase- and contrast-detection systems. Set those assumptions aside and look at what’s available today and what’s coming tomorrow.
Unfortunately, even that new rule is changing.
Once you've wrapped your head around the fact that contrast-detection systems can provide faster performance in some cameras, there's a new technique to consider: combining contrast detection and phase detection on the same chip instead of relying on a secondary phase-detection autofocus sensor.
Remember, contrast-detection autofocus can't perform predictive focus. It can't follow a subject as it moves across the frame without refocusing constantly. No matter how fast contrast systems become at focusing on a moving subject, that's still no substitute for being able to predict where to focus.
And even if the system were fast enough to perform on par with phase detection when capturing a moving subject, it wouldn't be as smooth as phase detection when capturing video, an area where predicting a subject's motion is vastly more important than it is for a still image.
Shoot enough still frames, and eventually one will be in focus, but with video, a lens racking back and forth to try to achieve focus is irreparably distracting. No one will put up with a video where the focus is constantly locking and then becoming soft.
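That racking behavior follows directly from how contrast detection works: the camera can only measure how sharp the image is right now, not which way or how far the lens is from focus, so it must hunt. A toy sketch of that hill-climbing search (an illustrative model, not any camera maker's actual algorithm; the `contrast` function and all numbers here are made up):

```python
def contrast(pos, true_focus=42.0):
    """Toy sharpness metric: peaks when the lens sits at the in-focus position."""
    return 1.0 / (1.0 + (pos - true_focus) ** 2)

def contrast_af(pos, step=8.0, tol=0.1):
    """Hill-climb toward peak contrast: keep moving while sharpness improves;
    when it drops, the lens has overshot, so reverse with a finer step.
    Those reversals are the visible back-and-forth 'racking'."""
    path = [pos]
    while abs(step) > tol:
        if contrast(pos + step) > contrast(pos):
            pos += step          # sharper: keep moving this direction
            path.append(pos)
        else:
            step = -step / 2     # softer: overshot, turn around and refine
    return pos, path

final, path = contrast_af(pos=0.0)
```

Note that the search only ever knows "better" or "worse," so every reversal in `path` is an overshoot a viewer would see as the focus locking and then going soft. A phase-detect measurement, by contrast, reports the direction and approximate distance to focus in a single reading, so the lens can drive straight there.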
To solve this problem, companies like Canon, Nikon and Olympus have added phase-detect sensors to their imaging sensors, creating a hybrid contrast/phase-detection tool on a single chip. That's exciting, as it opens up whole new possibilities for this relatively new breed of cameras.
The technique used is to replace some of the pixels on the imaging sensor with autofocus pixels, essentially making two discrete sensors out of one piece of silicon.
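One consequence of that substitution is that the dedicated autofocus pixels contribute no usable image data, so the camera must fill in those sites from their neighbors. A minimal sketch of the idea (the 8×8 "sensor," the AF-site pattern, and the simple two-neighbor interpolation are all illustrative assumptions, not any manufacturer's actual layout or algorithm):

```python
import numpy as np

# Toy 8x8 "sensor" readout, with a sparse grid of sites dedicated to
# phase-detect duty (a made-up pattern for illustration).
rng = np.random.default_rng(0)
image = rng.random((8, 8))
af_mask = np.zeros((8, 8), dtype=bool)
af_mask[2::4, 1::4] = True   # rows 2 and 6, columns 1 and 5: four AF sites

# The AF sites record focus data, not image data, so the camera
# reconstructs them -- here by averaging the left and right neighbors.
filled = image.copy()
for y, x in zip(*np.nonzero(af_mask)):
    filled[y, x] = (image[y, x - 1] + image[y, x + 1]) / 2
```

This also shows the trade-off Nikon's engineer describes below: every site given over to phase detection is an imaging pixel the camera must estimate rather than measure, which is why designers keep the AF pixels sparse.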
Different manufacturers have addressed this new technique differently, with various levels of performance. Nikon's solution, found in their Nikon 1 cameras, is integrated into their CX sensor, the chip at the heart of that system.
"In the early CX sensor," explains Nikon's Heiner, "there was a specific area where the phase detect was [located], and that was more toward the center. But it's a larger center area than you'd typically find on the SLR [sensors]. The algorithms to determine which of those pixels are used depends mostly on the light level. But the phase-detect [pixels] displace imaging pixels; that's why there are fewer of them in CX designs. There, you've got maybe about a third or less of the effective pixels. As soon as you start putting in many phase detect, then you've displaced a disproportionate number of pixels."