In recent years, image-capture devices such as cameras, lenses, and digital backs have improved at a remarkably fast pace. Post-capture software tools have also shown dramatic improvements. It appears to me that, as a result of these two trends, more and more photographers are becoming lazy, developing a laissez-faire attitude toward the capture process. There is a tendency to set the camera to autofocus, auto-exposure, auto image stabilization, auto everything, and simply point and release the shutter.
I constantly hear that no matter what the problem is with the original capture, it can always be fixed later in software. Wrong exposure? Image blurred or out of focus? Bad framing? Who cares, you can always fix it later in Photoshop or another image-editing program.
The real question is: Is this a wise way to operate? My answer to this is a resounding No!
Call me weird or unlucky, but when using the automatic settings on a professional-grade DSLR such as a Canon 1Ds Mark III or a Nikon D3 (or any other camera, for that matter), approximately 99% of the time the camera does the wrong thing for me, and I either miss the shot entirely or my capture is suboptimal. I'm not exaggerating. I just returned from a trip to Botswana, where I took two Canon 1Ds Mark III bodies and a variety of lenses with me. I shot roughly 4,000 images. I don't think there is a single image from this trip for which I did not override the camera's automatic settings to capture the image better.