Tuesday, February 25, 2014

Autofocus Evolution

Radical advances in AF technology are upending old assumptions about phase- and contrast-detection systems. Set those assumptions aside and look at what's available today and what's coming tomorrow.

Since the advent of digital photography, certain aspects of a camera's operation have been shrouded in mystery and confusion. That's because many of the technologies involved in digital photography are rooted more deeply in optical physics and computational algorithms than in shutter speeds and lens openings.

Most recently, a seismic shift in the technologies and techniques used to perform a camera's autofocus has caused confusion about the operation and performance of everything from SLRs to mirrorless systems.

There are two different methods used to perform autofocus in a camera: contrast detection and phase detection. It used to be canon (pardon the pun) that contrast-detect autofocus was slower than phase-detect autofocus. That's not the case anymore (at least, not always, as we'll see). With the advent of compact mirrorless systems, the ground rules have changed, and thanks to some new products in the DSLR space, the rules are in the process of changing again.

Artificial Eyes
Autofocus systems function by way of one of two mechanisms, contrast detection or phase detection, and, at best, most photographers have a sketchy understanding of the differences. Those who are familiar with the two systems are likely to say that "contrast-detection systems are slower than phase detection," and while that was true until recently, it's now an obsolete assumption.

Here's an extremely rudimentary (and not scientifically precise) explanation. Phase-detection AF works by taking beams of light from opposite sides of the lens, bouncing them to a separate autofocus sensor and comparing the two beams. The lens focus is adjusted until the waveforms of the light from each side overlap. When those waveforms overlap, they're in "phase" with each other and the image is in focus.

This is something that would be familiar to anyone with glasses who has held them at arm's length. Try to read a sentence, and the words overlap and line up improperly. Bring the glasses back toward your face, and at a certain point, the images line up and everything is in focus.

The camera's phase-detect sensor looks to see whether the waveforms are misaligned because they're spread wide apart or because they've crossed past each other, and that difference indicates whether the subject is back-focused or front-focused (i.e., the lens is focused too close or too far).
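
To make the comparison concrete, here's a minimal sketch in Python of how a phase comparison like this could work. It's a toy model, not any camera maker's actual algorithm: the 1-D intensity strips, the cross-correlation approach and the front/back sign convention are all assumptions made for illustration.

import numpy as np

def phase_offset(left_strip, right_strip):
    # Return the shift (in samples) that best aligns the two strips.
    # Zero means the waveforms overlap (in phase); the sign says
    # which way the lens is mis-focused.
    # Subtract the mean so the correlation compares shape, not brightness.
    left = left_strip - left_strip.mean()
    right = right_strip - right_strip.mean()
    corr = np.correlate(left, right, mode="full")
    # Re-center the peak index so 0 means perfect alignment.
    return int(np.argmax(corr)) - (len(right_strip) - 1)

# Fake a sharp edge as seen from the two sides of the lens. A defocused
# subject shows up as the same edge shifted in opposite directions.
x = np.linspace(0.0, 1.0, 200)
left = np.tanh((x - 0.475) * 40)   # edge nudged one way...
right = np.tanh((x - 0.525) * 40)  # ...and the other

shift = phase_offset(left, right)
if shift == 0:
    print("In phase: subject is in focus")
else:
    # Which sign means front- vs. back-focus is a convention chosen
    # here for illustration; a real AF module is calibrated for it.
    direction = "back" if shift > 0 else "front"
    print(f"Out of phase by {abs(shift)} samples: {direction}-focused")

The useful point of the sketch is the last step: the size of the offset tells the camera roughly how far to drive the focus motor, and its sign tells it in which direction, which is why phase detection can jump straight toward correct focus instead of hunting.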
