Hi everyone!
I hope I'm not interfering with edalongthepacific's original question... However, this topic directly relates to my recent effort to learn how to use the hyperfocal distance to my advantage (I am really attracted to photos where a subject in the extreme foreground is in very sharp focus and the horizon is also in very sharp focus, so I am trying to learn the technique). I am also trying to appreciate the role diffraction plays in undermining these efforts.
I began by reading the tutorials on Cambridge in Colour, then went outdoors to test my understanding. I used a depth of field calculator on my iPhone (using 0.03 for the circle of confusion) to give me the hyperfocal distance at f/8 on my Canon 5D Mark II with an 18-35mm, then used a tape measure to ensure my foreground subject was at that precise distance. I took the shot, then took shots at f/22 for comparison.
Back home, comparing the two, the f/22 shot displayed a much broader range of sharp focus than the f/8 shot.
Is anyone else surprised? Am I missing something in my setup process?
Thanks
Jim
First off, great job running your own tests and experiments. Based on your output format (small prints, online display, etc.) and your personal subjective assessment, it COULD be okay to shoot f/22 all the time. Doing experiments is how you decide that and learn.
Now for a question: what focal length did you plug into the calculator? You don't mention the actual focal length, and I want to make sure you set up the test correctly. If done correctly, and your image was framed so that it only contained the range from half the hyperfocal distance to infinity, I would expect the f/8 image to be in better focus than the f/22 image. At f/8 and 18mm, your hyperfocal distance is a little over a meter. At 35mm the hyperfocal distance is closer to 5 meters.
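If you want to sanity-check your iPhone app, the standard thin-lens approximation is easy to work by hand: hyperfocal distance H = f^2 / (N * c) + f, with the focal length f, f-number N, and circle of confusion c all in millimetres. A minimal sketch (using the same 0.03 mm circle of confusion you used):

```python
# Thin-lens hyperfocal approximation: H = f^2 / (N * c) + f
# All quantities in millimetres; c = 0.03 mm is the common
# full-frame circle-of-confusion value used in the thread.

def hyperfocal_mm(focal_mm, f_number, coc_mm=0.03):
    """Hyperfocal distance in millimetres."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

for focal in (18, 35):
    h_m = hyperfocal_mm(focal, 8) / 1000  # convert mm to metres
    print(f"{focal}mm at f/8: hyperfocal ~ {h_m:.2f} m")
```

This reproduces the numbers above: about 1.37 m at 18mm and about 5.14 m at 35mm, which is why the focal length you entered matters so much.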
Why would people NOT shoot at f/22 all the time?
Some do, but most don't. The main penalty at f/22 is diffraction: unlike vignetting and most chromatic aberration, which actually improve as you stop down, diffraction softening gets steadily worse at small apertures. Most modern digital cameras start seeing a decrease in sharpness caused by diffraction somewhere around mid-apertures. For full frame it might not come in until f/16; for current APS-C cameras it is more like f/10 or f/11, while micro four thirds cameras see it begin closer to f/8. These are broad generalizations, and depending on your output format (large prints versus small online viewing) diffraction issues may never matter to you.
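Where those thresholds come from: the Airy disk produced by diffraction has a diameter of roughly 2.44 * wavelength * f-number, and diffraction becomes visible once that disk outgrows whatever blur criterion you care about. A rough sketch (the 550 nm green-light wavelength and the choice of criterion are assumptions on my part, not exact camera specs):

```python
# Rule of thumb: Airy disk diameter ~ 2.44 * wavelength * N.
# Diffraction starts to show once the disk exceeds your blur criterion.
# 550 nm (0.00055 mm) green light is an assumed representative wavelength.

WAVELENGTH_MM = 0.00055

def diffraction_limited_f_number(blur_criterion_mm, wavelength_mm=WAVELENGTH_MM):
    """f-number at which the Airy disk diameter equals the blur criterion."""
    return blur_criterion_mm / (2.44 * wavelength_mm)

# Print-viewing criterion: the usual circle-of-confusion values
# (assumed typical figures per format).
for fmt, coc_mm in [("full frame", 0.030),
                    ("APS-C", 0.019),
                    ("micro 4/3", 0.015)]:
    n = diffraction_limited_f_number(coc_mm)
    print(f"{fmt}: Airy disk reaches CoC around f/{n:.0f}")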
(In some cases people may not want to focus the camera at all, particularly with a wide angle lens for street photography, so they stop the aperture well down, set focus at the hyperfocal distance, and shoot away without a care in the world. Some old cheap cameras had no adjustable aperture or focus ring at all, and this is how every image was taken.)
Further muddying this discussion is lens sharpness itself. An ideal, aberration-free lens would be sharpest wide open, where diffraction is smallest; real lenses have optical aberrations that dominate at their widest aperture, so in practice they are often sharpest about one stop down from their fastest aperture. An f/2.8 lens is probably going to be sharpest at f/4. This isn't always true, but it is a reasonable guideline to follow until you have time to do your own tests on your equipment. This may seem confusing because there is so much less depth of field at f/4 than at f/8 or f/22, but that peak sharpness exists only within a very thin depth of field. For example, at f/1.4 a portrait may have a person's eyes in focus (great) while their ears and nose are out of focus (maybe not so great). Stopping that portrait down to f/22 will bring the whole face into focus, but the eyes won't be as perfectly sharp as they were at f/1.4. If you print that image at 13x19 inches, this could matter a lot; conversely, if you are making a thumbnail image, it won't matter at all.
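To put some numbers on that portrait example: when the subject is well inside the hyperfocal distance, total depth of field is approximately 2 * N * c * s^2 / f^2, where s is the subject distance. A sketch with assumed values (an 85mm lens at 1.5 m, which is my illustration, not something from this thread):

```python
# Close-focus approximation: total DoF ~ 2 * N * c * s^2 / f^2
# Valid when subject distance s is well inside the hyperfocal distance.
# The 85mm portrait at 1.5 m is an assumed example.

def depth_of_field_mm(focal_mm, f_number, subject_mm, coc_mm=0.03):
    """Approximate total depth of field in millimetres."""
    return 2 * f_number * coc_mm * subject_mm ** 2 / focal_mm ** 2

for n in (1.4, 4, 8, 22):
    dof_cm = depth_of_field_mm(85, n, 1500) / 10  # mm to cm
    print(f"f/{n}: total DoF ~ {dof_cm:.1f} cm")
```

At f/1.4 you get only about 2.6 cm of depth of field (eyes sharp, nose and ears not), while f/22 gives about 41 cm (the whole head), illustrating exactly the tradeoff described above.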
There are lots of complexities here that I haven't touched on, but remember everything is a tradeoff and keep doing experiments.