What does it take to focus a lens? Well, not much. You gotta move some lens elements, that’s all. The only problem is to know in what direction and how far 😄
OK, it’s not the only problem; the other major problem is to do it real fast, but that’s the easier part.
If we disregard rangefinders, split-prism screens and all the other aids for manual focusing, there are two major focusing systems today: phase-detect and contrast-detect autofocus. Phase-detect is what DSLRs have and contrast-detect is for the toys. Or is it?
Most mirrorless cameras have contrast-detect autofocus. There is no extra light path via mirror and prism, and there are no dedicated phase-detect sensors (in a DSLR they normally sit in a separate module inside the mirror box), though we are beginning to see hybrid designs that use some sensor pixels for phase detection instead of imaging. This is still new, it may be the coming thing, but so far phase-detect autofocus is mostly a DSLR thing. It is commonly associated with speed and precision.
Speed is relative. Contrast detection is mainly image processing, a computing-intensive task, and with current fast processors it has become very, very fast indeed. Look at the OM-D, the Sony NEX cameras and Panasonic’s m4/3 cameras and you know what I mean. There is still an advantage to DSLRs when it comes to continuous tracking, because this requires not only image processing but image analysis, a much harder task, although it is only a matter of some more computing power until that gap closes as well.
Precision is a different matter though. Look at DSLRs. Their phase-detect autofocus sensors sit in a separate module; light reaches them via a light path that’s different from the main sensor path. It all depends on the lengths of those different light paths, but the mere fact that there are two paths instead of one already implies that there will be a deviation from the ideal. It may be within tolerances, the tolerances may be extremely tight, but there is no way to get rid of this deviation entirely.
Unfortunately the tolerances are not even in the hands of a single manufacturer, and they are in fact dependent on the particular lens/camera combination. Some lenses may have front focus on a given camera, back focus on a second and precise focus on a third, and different lenses of the same model may behave differently on the same camera. It’s a mess.
That’s what autofocus correction is for. While AF correction once was a pro feature only, it has trickled down into mid-range models. Just take a lens, make some test shots of a test target at various focus distances, check for front or back focus, dial in some positive or negative compensation, repeat until satisfied. So easy. Not.
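That trial-and-error loop amounts to a simple search over the camera’s compensation settings. Here is a toy sketch of the idea in Python — `measure_offset` is a hypothetical stand-in for the whole take-a-test-shot-and-measure-the-focus-error step, and the ±20 range of settings is an assumption, not any particular camera’s:

```python
def best_af_adjustment(measure_offset, settings=range(-20, 21)):
    """Sweep the AF fine-tune settings and return the one whose
    measured front/back-focus offset is closest to zero."""
    return min(settings, key=lambda s: abs(measure_offset(s)))

# Toy model: the lens back-focuses by 7.3 units, and each step of
# compensation shifts focus by one unit in the other direction.
print(best_af_adjustment(lambda s: 7.3 - s))  # → 7
```

The real chore, of course, is that every call to `measure_offset` means mounting a target, shooting and pixel-peeping — which is why “repeat until satisfied” takes an afternoon, not a millisecond.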
Contrast-detect autofocus has no problem with front or back focus. It does not exist. Never. Focusing does not depend on actual distances, it just depends on the sharpness of the picture. It’s in focus when the picture is sharp. End of story.
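In other words, contrast-detect focusing is hill-climbing on a sharpness metric computed from the main sensor itself. A minimal sketch in Python, assuming hypothetical `capture()` and `move_lens()` functions for the camera side; the metric (mean squared gradient) and the step-halving search are deliberately simplified:

```python
import numpy as np

def sharpness(image):
    """Contrast metric: mean squared gradient -- larger when the
    image has more local contrast, i.e. when it is sharper."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx**2 + gy**2))

def focus_by_contrast(capture, move_lens, step=16, min_step=1):
    """Hill-climb on sharpness: keep stepping the focus motor while
    the picture gets sharper; on overshoot, back up to the sharpest
    position, reverse direction, and halve the step size."""
    best = sharpness(capture())
    direction = 1
    while step >= min_step:
        move_lens(direction * step)
        s = sharpness(capture())
        if s > best:
            best = s                      # still climbing, keep going
        else:
            move_lens(-direction * step)  # overshot: back to best spot
            direction, step = -direction, step // 2
    return best
```

Note that nowhere in this loop does the camera know the subject’s distance or the lens’s calibration — it only ever compares pictures, which is exactly why front and back focus can’t happen.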
Contrast-detect autofocus has come a long way, but there is no reason why it shouldn’t get even better. Add computing power and it does.
This is much easier to improve than phase-detect autofocus, and indeed I expect a cross-over soon. Maybe not this year, maybe not next, but it is coming for sure. Image analysis is a problem that’s well understood, scales extremely well with processor speed and processor count, and once you’ve solved it in software, you can get rid of an expensive part of the camera. As I said, for still photography we are already there, for motion it’s only a matter of time.
The Song of the Day is “You Gotta Move”, and unlike in “1378 – You Gotta Move”, this time we don’t hear the Rolling Stones; this time it’s Cassandra Wilson on her 2002 album “Belly of the Sun”. Hear it on YouTube.