To fully exploit the sensor resolution of today’s high-resolution digital cameras, photographers must balance diffraction against lens performance and depth of field. Understanding how diffraction affects image quality will enable you to extract the best results possible from your camera.
Today’s digital cameras offer steadily improving color accuracy, dynamic range, and bit depth, together with the crowd favorite: ever-higher megapixel counts. Yet actual image detail is constrained by optical performance: overall sharpness and depth of field require stopping down, but stopping down too far degrades image quality due to diffraction, an optical effect that puts an upper bound on resolution. Worse, well before that resolution limit is reached, contrast has already been declining (whites and blacks become grayish), which we perceive as lower resolution and a loss of “snap.”
For the 21MP Canon EOS 1Ds Mark III, the problem is acute: with a top-performing lens, the loss of contrast from diffraction is already an observable factor by ƒ/8. By ƒ/11, image contrast drops noticeably. By ƒ/22, the degradation is enough that only a critical need for depth of field justifies its use.
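The figures above can be sanity-checked with the standard Airy-disk approximation: the diffraction spot diameter is roughly 2.44 × λ × N, where λ is the wavelength of light and N the f-number. The sketch below (my own illustration, not from the text) compares that spot size against the pixel pitch of a 21 MP full-frame sensor, assuming green light at 550 nm and the 1Ds Mark III’s published 36 mm sensor width and 5616-pixel horizontal resolution:

```python
# Sketch: estimate the Airy disk diameter at several apertures and compare
# it to the pixel pitch of a 21 MP full-frame sensor (Canon EOS 1Ds Mark III).
# Assumptions: green light (550 nm); sensor 36 mm wide, 5616 pixels across.

def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disk (to the first minimum), in micrometers:
    d = 2.44 * lambda * N."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

pixel_pitch_um = 36_000 / 5616  # sensor width in um / horizontal pixels, ~6.4 um

for n in (4, 5.6, 8, 11, 16, 22):
    d = airy_disk_diameter_um(n)
    ratio = d / pixel_pitch_um
    print(f"f/{n}: Airy disk {d:.1f} um = {ratio:.1f} x pixel pitch")
```

By this rough measure the diffraction spot already exceeds one pixel at ƒ/5.6 and covers several pixels by ƒ/16, consistent with contrast loss becoming visible well before the hard resolution limit.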
Of course, depth of field is not a prerequisite for a good picture; many beautiful and compelling images are made at wide apertures (ƒ/1.4, ƒ/2, ƒ/2.8, etc). But to fully exploit sensor resolution, even a high-quality lens requires stopping down one or two stops from maximum aperture for optimal performance across the frame. When focus error, field curvature, and vignetting are taken into account, the one-to-two-stops figure is almost an inviolate rule, with very few exceptions.
Diffraction is a moot point unless strict technical requirements are met: focus must be spot-on, camera stability must be rock-solid (high shutter speed and/or mirror lockup on a tripod), and lens optical misalignment must be ruled out; otherwise those errors will dominate any diffraction effects.