Luminous Landscape Forum

Equipment & Techniques => Cameras, Lenses and Shooting gear => Topic started by: ErikKaffehr on July 12, 2012, 03:43:34 pm

Title: Effects of diffraction
Post by: ErikKaffehr on July 12, 2012, 03:43:34 pm
Hi,

I made a small write-up on the effects of diffraction. It has all been said before; it's just information collected in one place. The article will be updated in the coming days.

http://echophoto.dnsalias.net/ekr/index.php/photoarticles/68-effects-of-diffraction

Best regards
Erik
Title: Re: Effects of diffraction
Post by: marcmccalmont on July 12, 2012, 09:37:19 pm
Erik
Thanks for the summary!
I can certainly verify the accuracy of your conclusions with both the IQ180/Rodenstock HRs and the Nikon D800E/Leica Rs.
The sweet spot is f/5.6 to f/8.0, but f/4 to f/16 is certainly usable and, with proper sharpening, quite good.
Marc
Title: Re: Effects of diffraction
Post by: Ray on July 13, 2012, 01:13:04 am
Erik,
That's an excellent description of the problem. Thanks for taking the trouble to do the tests and share your results.
I'm sure many folks who have never heard of the Airy Disk will be aware that their camera produces sharper results at F5.6 than at F16, but your graph of Imatest results, demonstrating the differences in resolution at 50% MTF at f-stops between F2.8 and F32 (using the same lens with 3 different cameras), is very revealing, and also very credible in my opinion in relation to my own tests.

You've done a more thorough and comprehensive test than I've done. I simply concentrated on the 10mp Canon 40D and the 15mp Canon 50D, as a result of claims on this forum, some years ago, that the 50% increase in pixel count of the 50D would serve no purpose beyond F8.

Your graph, which I've reproduced here (hope you don't mind) shows results for 3 cameras which have the pixel density of full-frame 13.7mp, 27.5mp and 54.7 mp. What surprises me is that, even at F32, both the Alpha 700 and the SLT-A77 show a slight resolution advantage over the 6mp Dimage 7D, and that resolution advantage is about the same magnitude as the resolution advantage of the 24mp A77 over the 12mp Alpha 700 at F16.

I say I'm surprised because both Guillermo and Bart in this context have claimed that at F16 the D800 will have no resolution advantage. Now such statements do not accord with my own tests, but I would probably have been prepared to accept the truth of such a statement in relation to F32. It now looks as though a comparison between the D800 and the D3 will show very marginally increased detail for the D800 shot even at F32.

When I carried out my tests back in 2009, between the 40D and 50D, it was very apparent that at F22 there was no difference whatsoever in detail, whatever the degree of magnification on the monitor. As a consequence there was no point in testing at F32.

If I ever get around to comparing my Canon 5D with the D800E, I'll test all the way down to F32, and expect to see some subtle increase in detail in the D800E shot, at maybe 300% magnification.

Title: Re: Effects of diffraction
Post by: free1000 on July 13, 2012, 02:11:21 am
Erik, that's a nice and clear explanation.
 
The sharpening information is very interesting; I'd never before realised the relationship between radius and detail. I've always used little of the Detail slider because of an increase in 'grittiness'. I now see that increasing the radius can remove that effect while retaining more detail.

I don't know who found these sharpening recipes but thank you for posting them, the effect on my recent test images is revelatory. It makes the Nikon 24 f2.8 more usable. At f16 the corners aren't too bad at all.

PS: The icing on the cake would be a diagram showing the size and shape of the Airy disk 3D plot at different apertures; even better, an interactive one!
Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 13, 2012, 02:54:13 am
Hi,

Detail works as "halo suppression" at low values; at high values it switches the algorithm to deconvolution.

Best regards
Erik


Erik, that's a nice and clear explanation.
 
The sharpening information is very interesting; I'd never before realised the relationship between radius and detail. I've always used little of the Detail slider because of an increase in 'grittiness'. I now see that increasing the radius can remove that effect while retaining more detail.

I don't know who found these sharpening recipes but thank you for posting them, the effect on my recent test images is revelatory. It makes the Nikon 24 f2.8 more usable. At f16 the corners aren't too bad at all.

PS: The icing on the cake would be a diagram showing the size and shape of the Airy disk 3D plot at different apertures; even better, an interactive one!
Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 13, 2012, 04:57:55 am
Your graph, which I've reproduced here (hope you don't mind) shows results for 3 cameras which have the pixel density of full-frame 13.7mp, 27.5mp and 54.7 mp. What surprises me is that, even at F32, both the Alpha 700 and the SLT-A77 show a slight resolution advantage over the 6mp Dimage 7D, and that resolution advantage is about the same magnitude as the resolution advantage of the 24mp A77 over the 12mp Alpha 700 at F16.

I say I'm surprised because both Guillermo and Bart in this context have claimed that at F16 the D800 will have no resolution advantage.

Ray,

That is because of the units used for the vertical axis of the chart. Line Widths per Image Height is a metric that lumps together actual resolution in cycles/mm together with number of (vertical) pixels. More pixels will always help to reduce the need for output magnification.

Using actual resolution in cycles/mm on the vertical axis would give a somewhat different picture, which is what I've been saying all along. Therefore it all depends on what one is comparing: output resolution based on same-size output, or the same output resolution at different sizes. These are different goals and thus will lead to different evaluation results.

LW/PH is a good criterion for same size output scenarios, Cy/mm is a good criterion for same output resolution at different output sizes. They are different sides of the same coin. LW/PH is more about image quality at a fixed output size (can I improve my image quality at this specific output size?), where Cy/mm is more about enlargement capability (can I produce larger output without sacrificing quality?).
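
As a rough illustration of how the two metrics relate, here is a minimal Python sketch; the 2-line-widths-per-cycle convention is standard, but the 15.6 mm APS-C picture height is just an assumed example value:

def lwph_to_cy_mm(lw_ph, picture_height_mm):
    # 2 line widths = 1 cycle (line pair), then divide by the picture height in mm
    return (lw_ph / 2.0) / picture_height_mm

def cy_mm_to_lwph(cy_mm, picture_height_mm):
    return cy_mm * 2.0 * picture_height_mm

if __name__ == "__main__":
    h = 15.6  # assumed APS-C picture height in mm
    for lw_ph in (1000, 2000, 3000):
        print(f"{lw_ph} LW/PH on a {h} mm high sensor ~ {lwph_to_cy_mm(lw_ph, h):.1f} cy/mm")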

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Ray on July 13, 2012, 07:17:37 am
Ray,

That is because of the units used for the vertical axis of the chart. Line Widths per Image Height is a metric that lumps together actual resolution in cycles/mm together with number of (vertical) pixels. More pixels will always help to reduce the need for output magnification.

Using actual resolution in cycles/mm on the vertical axis would give a somewhat different picture, which is what I've been saying all along. Therefore it all depends on what one is comparing: output resolution based on same-size output, or the same output resolution at different sizes. These are different goals and thus will lead to different evaluation results.

LW/PH is a good criterion for same size output scenarios, Cy/mm is a good criterion for same output resolution at different output sizes. They are different sides of the same coin. LW/PH is more about image quality at a fixed output size (can I improve my image quality at this specific output size?), where Cy/mm is more about enlargement capability (can I produce larger output without sacrificing quality?).

Cheers,
Bart

Hi Bart,
I have to say, I haven't heard anything so confusing in a long time. You've even exceeded the obfuscation records previously set by BJL. ;D

Whenever I compare images for resolution and detail, I always compare them at equal output size and always from an equal viewing distance. That seems just plain common sense to me. When one starts changing output size of one image in relation to another, or changing viewing distance of one image in relation to another, then one can achieve any result one desires, as regards perception of detail or resolution.

I've dug out the DVD where I recorded my tests of the 40D and 50D in 2009, and reconverted two of the images of the $50 banknote, a 40D shot and a 50D shot both taken at F16.

As far as I can tell, from my position of inexperience, there are no moiré or artifacts in either image. But there is clearly more real detail in the 50D shot, visible only in certain parts of the images at high magnification.

Now, how would you like me to display such images to demonstrate my point? If I display them both at 100%, and unequal size, the resolution differences will be ambiguous and uncertain. If I display both images at 200%, and unequal size, the differences will be more obvious. If I display them both at 300%, and unequal size, the differences will be even clearer.

On the other hand, I could display the lower resolution 40D at 242% and the higher resolution 50D at 200%, which would make them equal size. Or I could display the 40D at 300% and the 50D at 245%, which would also make them equal size.

Which would you prefer? I can be very accommodating.  ;D

Cheers!  Ray
Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 13, 2012, 08:22:08 am
Hi Bart,
I have to say, I haven't heard anything so confusing in a long time.

Which is understandable if you prefer to only look at one side of the coin ...

Quote
Whenever I compare images for resolution and detail, I always compare them at equal output size and always from an equal viewing distance. That seems just plain common sense to me.

Strange as it may seem to you, looking only at one side of the situation, there are actually people out there who buy a higher megapixel and/or larger sensor array size camera because they have a need for larger output. These are often the same kind of people that resort to stitching when such a larger physical size sensor is not feasible.

Quote
When one starts changing output size of one image in relation to another, or changing viewing distance of one image in relation to another, then one can achieve any result one desires, as regards perception of detail or resolution.

I know it must come as a shock that not everybody prints only 8x10 inch images, some have different requirements such as filling the wall of a booth on a tradeshow. A higher pixel or line count alone (LW/PH) only tells one part of the whole story, because one also needs to know the actual resolution in Cy/mm for each of those 'lines' to know what the output resolution (in Cy/mm) will be.

On-sensor resolution (cy/mm) divided by the output magnification factor equals output resolution (cy/mm). For reference, 5 cycles/mm is considered good output quality for normal reading distances with adequate light levels, and 8 cycles/mm is considered excellent. Double the viewing distance, and one can halve the resolution requirement. Grasping such a concept doesn't come even close to rocket science.
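
A minimal sketch of that arithmetic; the 90 cy/mm on-sensor resolution and 15.6 mm picture height below are assumed example numbers, not measurements:

def max_enlargement(sensor_cy_mm, target_output_cy_mm):
    # Output resolution = on-sensor resolution / magnification,
    # so the largest magnification meeting a target is their ratio.
    return sensor_cy_mm / target_output_cy_mm

if __name__ == "__main__":
    sensor_cy_mm = 90.0   # assumed on-sensor resolution
    height_mm = 15.6      # assumed APS-C picture height
    for target in (5.0, 8.0):  # "good" and "excellent" at normal reading distance
        m = max_enlargement(sensor_cy_mm, target)
        print(f"{target:.0f} cy/mm target: up to {m:.0f}x enlargement, "
              f"about {m * height_mm / 10:.0f} cm print height")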

Quote
Now, how would you like me to display such images to demonstrate my point?

I have already explained how images/figures can be presented with two different requirements in mind. I don't feel the need to repeat myself.

Feel free to post whatever you like to illustrate whatever point it is you are trying to make.

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Ray on July 13, 2012, 09:49:02 am
Which is understandable if you prefer to only look at one side of the coin ...

Strange as it may seem to you, looking only at one side of the situation, there are actually people out there who buy a higher megapixel and/or larger sensor array size camera because they have a need for larger output. These are often the same kind of people that resort to stitching when such a larger physical size sensor is not feasible.

I know it must come as a shock that not everybody prints only 8x10 inch images, some have different requirements such as filling the wall of a booth on a tradeshow. A higher pixel or line count alone (LW/PH) only tells one part of the whole story, because one also needs to know the actual resolution in Cy/mm for each of those 'lines' to know what the output resolution (in Cy/mm) will be.


C'mon now Bart. Everyone knows that you can produce whatever output size you want, whatever the megapixel count of your sensor. The only reason for requiring more megapixels is so that a print at a particular, large size will show finer detail from a closer viewing distance.

There must be thousands of billboards on the highways that were produced from an old-fashioned 6mp MFDB, or an 11mp 35mm DSLR, that look very detailed from a distance of 20 metres or so. Walk up close, and the image quality is crap.

My plasma HDTV is 65" diagonal. It can display no more than a 2mp image, which is quite pathetic in terms of modern DSLR resolution, and even pathetic in terms of the average P&S resolution, yet from the other side of the living room, say 5 or 6 metres away, I might find it difficult to discern any additional detail in an equal-size print from a 36mp file placed next to the TV.

However, that 2mp image displayed on my HDTV also looks crap from the same close distance one might view an 8"x10" print.

My prints are not coins. They have two sides, but the other side is a blank.

Cheers!  Ray
Title: Re: Effects of diffraction
Post by: bjanes on July 13, 2012, 12:12:36 pm
Hi Bart,
I have to say, I haven't heard anything so confusing in a long time. You've even exceeded the obfuscation records previously set by BJL. ;D

Whenever I compare images for resolution and detail, I always compare them at equal output size and always from an equal viewing distance. That seems just plain common sense to me. When one starts changing output size of one image in relation to another, or changing viewing distance of one image in relation to another, then one can achieve any result one desires, as regards perception of detail or resolution.

Ray,

Perhaps an example would help clarify the differences. Suppose that you want to take a picture of a bird at some distance and you have only a 200 mm lens and both the D800 and D7000 cameras. With both cameras you would take the shot and then crop the area of interest. If both crops are of the same pixel dimensions, there will be little difference in the images. The Nyquist frequencies of both cameras are slightly over 100 cy/mm. If your crop is 1000 x 800 pixels in the plane of the sensor, you are not using the full picture height of either sensor, and the crop covers an even smaller fraction of the D800's picture height. The resolution in terms of cy/mm is the determining factor.

On the other hand, if you take a picture so that the full frames of both cameras are used and the views are the same, then the determining factor is cy/PH.
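
A minimal numeric sketch of the two scenarios, using nominal pixel counts and sensel pitches for the two cameras (approximate published figures, not measurements):

cameras = {
    "D800":  {"pixels_high": 4912, "pitch_um": 4.88},
    "D7000": {"pixels_high": 3264, "pitch_um": 4.78},
}

for name, c in cameras.items():
    nyquist_cy_mm = 1000.0 / (2.0 * c["pitch_um"])  # governs a same-pixel-size crop
    nyquist_cy_ph = c["pixels_high"] / 2.0          # governs a full-frame composition
    print(f"{name}: ~{nyquist_cy_mm:.0f} cy/mm at Nyquist, {nyquist_cy_ph:.0f} cy/PH")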

Regards,

Bill




Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 13, 2012, 12:33:41 pm
Hi,

All the images discussed in the original article were taken with APS-C sensors of different generations, so LW/PH and cy/mm are essentially just related by a constant.

Best regards
Erik

Ray,

Perhaps an example would help clarify the differences. Suppose that you want to take a picture of a bird at some distance and you have only a 200 mm lens and both the D800 and D7000 cameras. With both cameras you would take the shot and then crop the area of interest. If both crops are of the same pixel dimensions, there will be little difference in the images. The Nyquist frequencies of both cameras are slightly over 100 cy/mm. If your crop is 1000 x 800 pixels in the plane of the sensor, you are not using the full picture height of either sensor, and the crop covers an even smaller fraction of the D800's picture height. The resolution in terms of cy/mm is the determining factor.

On the other hand, if you take a picture so that the full frames of both cameras are used and the views are the same, then the determining factor is cy/PH.

Regards,

Bill





Title: Re: Effects of diffraction
Post by: Ray on July 13, 2012, 12:38:40 pm
Ray,

Perhaps an example would help clarify the differences. Suppose that you want to take a picture of a bird at some distance and you have only a 200 mm lens and both the D800 and D7000 cameras. With both cameras you would take the shot and then crop the area of interest. If both crops are of the same pixel dimensions, there will be little difference in the images. The Nyquist frequencies of both cameras are slightly over 100 cy/mm. If your crop is 1000 x 800 pixels in the plane of the sensor, you are not using the full picture height of either sensor, and the crop covers an even smaller fraction of the D800's picture height. The resolution in terms of cy/mm is the determining factor.

On the other hand, if you take a picture so that the full frames of both cameras are used and the views are the same, then the determining factor is cy/PH.

Regards,

Bill


Bill,
Surely you know that sensor resolution is determined by pixel density. LW/PH is a combination of pixel density and sensor size. The D800 has approximately the same pixel density as the D7000 (very slightly less to an insignificant degree), so resolution in terms of line pairs per mm is about the same, but LW/PH is obviously greater for the D800.

When comparing equal size sensors, as in my comparison between the Canon 40D and the 50D, and in Erik's comparisons amongst 3 cropped-format sensors, one Minolta and two Sony, there is no distinction between resolution per mm (cy/mm) and Line Widths per Picture Height. A higher resolution in terms of cy/mm equates to a proportionally higher resolution in terms of LW/PH.

Bart has raised a complete red herring in this respect.

PS: I see Erik has said more or less the same. Cy/mm is not the same in precise mathematical terms as LW/PH for same-size sensors; different terminology. But it varies proportionally, assuming that the LW/PH is an extrapolation of centre resolution, which I believe it is.
Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 13, 2012, 02:44:54 pm
When comparing equal size sensors, as in my comparison between the Canon 40D and the 50D, and in Erik's comparisons amongst 3 cropped-format sensors, one Minolta and two Sony, there is no distinction between resolution per mm (cy/mm) and Line Widths per Picture Height. A higher resolution in terms of cy/mm equates to a proportionally higher resolution in terms of LW/PH.

Bart has raised a complete red herring in this respect.

You said you "haven't heard anything so confusing in a long time". I'm afraid you are, confused that is, and you also managed to get this thread off-topic. Oh well, what's new.

Erik's three camera models do have virtually similar physical sensor sizes but with quite different sensel pitches. The sensel pitch has shrunk with each generation, but with that, the on-sensor resolution has increased (assuming the same lens performance and similar AA-filter characteristics).

The Maxxum 7D has an approx. on-sensor resolution at Nyquist of 64 cy/mm, the Alpha A700 has approx. 91 cy/mm, and the SLT-A77 has approx. 128 cy/mm.

That means that for same output resolution the A700 can produce some 42% larger output, and the SLT-A77 can produce output that is 100% larger, twice the size. That is such a significant difference that I wouldn't call it a red herring, but then I'm not claiming to be confused.

BTW the on-sensor resolution of the 40D is approx. 88 cy/mm at Nyquist, and the 50D is 106 cy/mm. So that would allow the 50D to produce approx. 20% larger output with the same output resolution in cy/mm as the 40D, ceteris paribus (http://en.wikipedia.org/wiki/Ceteris_paribus).

So the differences, when expressed as LW/PH (which is good for comparing same-size output), tell a somewhat different story than when we compare cycles/mm (which is more useful for the same output resolution at a different magnification and thus output size). Actually it is the same story, but seen from two different usage goals/perspectives, hence the two-sides-of-the-same-coin analogy.
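
A minimal sketch of where such figures come from, computed from approximate sensel pitches (the pitch values below are nominal assumptions, not measurements):

pitches_um = {"Maxxum 7D": 7.8, "Alpha 700": 5.5, "SLT-A77": 3.9, "40D": 5.7, "50D": 4.7}

nyquist = {name: 1000.0 / (2.0 * p) for name, p in pitches_um.items()}  # cy/mm
for name, ny in nyquist.items():
    print(f"{name}: ~{ny:.0f} cy/mm at Nyquist")
print(f"A700 vs 7D:  {nyquist['Alpha 700'] / nyquist['Maxxum 7D']:.2f}x larger output")
print(f"A77 vs 7D:   {nyquist['SLT-A77'] / nyquist['Maxxum 7D']:.2f}x larger output")
print(f"50D vs 40D:  {nyquist['50D'] / nyquist['40D']:.2f}x larger output")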

I can also tell from first hand experience that the 2012 'Dutch Herring' (http://en.wikipedia.org/wiki/Hollandse_Nieuwe) tastes excellent, and that it is not red. And yes, we prefer to eat them raw (with chopped onions).

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 13, 2012, 03:17:31 pm
Hi,

I would suggest that these figures explain a lot:
(http://echophoto.dnsalias.net/ekr/images/DMBFigures/Dynax7_crop.png)
and
(http://echophoto.dnsalias.net/ekr/images/DMBFigures/Alpha77_crop.png)

The upper figure shows MTF as a function of LW/PH for the Dynax 7 at f/5.6 and at f/16. The dotted line is the diffraction limit. So if we had a sensor with MTF == 1.0 and a perfect lens, the MTF curve would fit the dotted line. Obviously the MTF curve is quite a bit below the diffraction limit even at f/16, but it is also obvious that diffraction affects MTF. If we check 1000 LW/PH, MTF is around 0.55 at f/5.6 but only 0.4 at f/16.

The lower curve shows MTF for the Alpha 77SLT. MTF is still less than what would be due to diffraction alone. If we compare 1000 LW/PH  we can see that we have MTF around 0.7 at f/5.6 and about 0.5 for f/16.

On the other hand, the Nyquist limit for the Dimage 7D is 2000 LW/PH, and here it produces virtually nil MTF at f/16. The Alpha 77SLT still has a significant MTF for 2000 LW/PH.

What I see is that the Dimage 7D is sensor-limited at f/16, while the Alpha 77SLT is diffraction-limited at f/16, as the diffraction limit crosses the horizontal axis at around 3400 LW/PH.
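
The dotted diffraction-limit curve can be reproduced from the textbook MTF of an aberration-free lens with a circular aperture; a minimal sketch, assuming a 564 nm wavelength and a 15.6 mm APS-C picture height as example values:

import math

def diffraction_mtf(cy_mm, f_number, wavelength_mm=564e-6):
    # MTF of an ideal circular aperture; zero beyond the cutoff 1/(lambda*N)
    cutoff = 1.0 / (wavelength_mm * f_number)
    s = cy_mm / cutoff
    if s >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

if __name__ == "__main__":
    height_mm = 15.6  # assumed APS-C picture height
    for lw_ph in (1000, 2000, 3400):
        cy_mm = (lw_ph / 2.0) / height_mm
        print(f"{lw_ph} LW/PH: diffraction-only MTF is "
              f"{diffraction_mtf(cy_mm, 5.6):.2f} at f/5.6, "
              f"{diffraction_mtf(cy_mm, 16):.2f} at f/16")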

The sample images in the article illustrate that the effect of diffraction is noticeable even at f/8.

Contrary to Bart's view, I'd say that LW/PH is a good measure: it says what amount of information the sensor can deliver. The MTF curve is much affected by the amount of sharpening, but that is a different can of worms, and a can of worms it is.

Best regards
Erik


Erik,
That's an excellent description of the problem. Thanks for taking the trouble to do the tests and share your results.
I'm sure many folks who have never heard of the Airy Disk will be aware that their camera produces sharper results at F5.6 than at F16, but your graph of Imatest results, demonstrating the differences in resolution at 50% MTF at f-stops between F2.8 and F32 (using the same lens with 3 different cameras), is very revealing, and also very credible in my opinion in relation to my own tests.

You've done a more thorough and comprehensive test than I've done. I simply concentrated on the 10mp Canon 40D and the 15mp Canon 50D, as a result of claims on this forum, some years ago, that the 50% increase in pixel count of the 50D would serve no purpose beyond F8.

Your graph, which I've reproduced here (hope you don't mind) shows results for 3 cameras which have the pixel density of full-frame 13.7mp, 27.5mp and 54.7 mp. What surprises me is that, even at F32, both the Alpha 700 and the SLT-A77 show a slight resolution advantage over the 6mp Dimage 7D, and that resolution advantage is about the same magnitude as the resolution advantage of the 24mp A77 over the 12mp Alpha 700 at F16.

I say I'm surprised because both Guillermo and Bart in this context have claimed that at F16 the D800 will have no resolution advantage. Now such statements do not accord with my own tests, but I would probably have been prepared to accept the truth of such a statement in relation to F32. It now looks as though a comparison between the D800 and the D3 will show very marginally increased detail for the D800 shot even at F32.

When I carried out my tests back in 2009, between the 40D and 50D, it was very apparent that at F22 there was no difference whatsoever in detail, whatever the degree of magnification on the monitor. As a consequence there was no point in testing at F32.

If I ever get around to comparing my Canon 5D with the D800E, I'll test all the way down to F32, and expect to see some subtle increase in detail in the D800E shot, at maybe 300% magnification.


Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 13, 2012, 05:52:42 pm
Contrary to Bart's view I'd say that LW/PH is a good measure, it says what amount of information the sensor can deliver.

Hi Erik,

Just to make sure, I'm not saying it is a bad measure, on the contrary. It is just one way of specifying performance, specifically normalized to 1 picture height (PH), or same output size. That is fine if that is the comparison: does the image quality improve when I choose one system instead of the other? However, there is a limit to how much quality can improve when we exceed a 720 PPI output resolution, because the printer becomes the limitation.

Another way of looking at a system comparison is; how much larger can the output become, before losing image quality? That comparison is not more limiting for one or the other system because the printer resolution will be the same.

Both are good methods to compare systems; they are just answering the question with a different goal in mind. That's all.

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Fine_Art on July 13, 2012, 11:05:15 pm
Good info, thanks Erik.
Title: Re: Effects of diffraction
Post by: Ray on July 14, 2012, 01:42:59 am
You said you "haven't heard anything so confusing in a long time". I'm afraid you are, confused that is, and you also managed to get this thread off-topic. Oh well, what's new.

Erik's three camera models do have virtually similar physical sensor sizes but with quite different sensel pitches. The sensel pitch has shrunk with each generation, but with that, the on-sensor resolution has increased (assuming the same lens performance and similar AA-filter characteristics).


As do mine, Bart. I've been quite clear all along that my comparisons relate to equal size sensors with a different pixel pitch. No confusion on my part, old chap.

Whenever a DSLR manufacturer significantly raises the pixel count of its latest model, we seem to get the same old concerns about diffraction raised again and again. Will the extra pixels serve any purpose when the camera is used above F5.6 or F8? Is there a cut-off point at a particular f-stop, beyond which no further resolution can be gleaned however large the output size?

Erik's graph demonstrates that even a camera with the very high pixel density of the SLT-A77, which is equivalent to a 55mp full-frame sensor of the same pixel density, can show a resolution edge at all apertures up to and including F16, compared with a camera of half the pixel count, such as the Alpha 700. The graph also shows that F22 is the point where no further resolution benefits are to be had from the 24mp A77, compared with the Alpha 700.

Now you are quite right to point out, Bart, that the difference in pixel densities between the 12.25mp Alpha 700 and the 24.3mp SLT-A77 is greater than the difference between the Canon 40D and 50D. In the case of the two Sony cameras, one has double the pixel count of the other. In the case of the two Canon cameras, one has only a 50% greater pixel count than the other.

One should therefore not presume from Erik's graph that a sensor with a lower pixel density than the A77, such as the Canon 50D, equivalent to a 38 or 39mp full frame, will also show a resolution edge at F16. And indeed I haven't presumed that. I carried out my own tests back in September 2008, and discovered for myself that even a camera with a modest 50% increase in pixel count over another of fairly similar pixel density to the Alpha 700, will show a very slight resolution edge at F16. (Alpha 700 roughly equivalent to 28mp full-frame; 40D roughly equivalent to 25.6mp. That's close enough).

The question that now remains, will a camera such as the D800 with 3x the pixel count of a D3 show any resolution advantage whatsoever at F32? I doubt it, but I'm fairly confident that the D800 will show a very slight advantage at F22, and a more significant advantage at F16, compared with the D3.

Enjoy your Dutch Herring, Bart.  ;D

Cheers!
Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 14, 2012, 05:30:21 am
The question that now remains, will a camera such as the D800 with 3x the pixel count of a D3 show any resolution advantage whatsoever at F32? I doubt it, but I'm fairly confident that the D800 will show a very slight advantage at F22, and a more significant advantage at F16, compared with the D3.

That is not too difficult to answer, denser sampling will get the most out of any optical projection. However, physics sets a hard absolute diffraction limit, based on wavelength and aperture. No other parameters play a role.

For a 564 nm wavelength (which is a nice Luminance-weighted average between Red, Green, and Blue) and a round aperture, diffraction will limit spatial frequency resolution in the focal plane by a zero modulation at resolutions beyond:
  • f/32 is limited at 55.4078 cycles/mm, a sample pitch of 9.0 micron
  • f/28 is limited at 62.1932 cycles/mm, a sample pitch of 8.0 micron
  • f/25 is limited at 69.8095 cycles/mm, a sample pitch of 7.2 micron
  • f/22 is limited at 78.3585 cycles/mm, a sample pitch of 6.4 micron
  • f/20 is limited at 87.9544 cycles/mm, a sample pitch of 5.7 micron
  • f/18 is limited at 98.7255 cycles/mm, a sample pitch of 5.1 micron
  • f/16 is limited at 110.816 cycles/mm, a sample pitch of 4.5 micron
There is zero resolution possible beyond those spatial frequencies, none, nada. The indicated sample pitches have a Nyquist frequency that limits resolution at that same maximum spatial frequency, and as a consequence of diffraction there cannot be any aliasing because there is no signal beyond Nyquist. These are the absolute optical and sampling limits.

Because our sensors are not point samplers but area samplers, and our lenses are not perfect, the optical limits are even a fraction lower, but the above limits cannot be broken no matter how close the small sampling area gets to resembling a point sampler, or how good our lenses are. Only by using shorter wavelength light can we squeeze a bit more resolution out of our diffraction limiting optics. That's why chip manufacturers use UV and X-ray wavelengths to expose the photo resist masking layers, to beat the absolute limits of diffraction.

And again, near those diffraction limits there will only be resolution for very high contrast features because the MTF response is so low. Micro contrast is increasingly limited once the diffraction pattern diameter starts to exceed 1.5x the sensel pitch. A very rough rule of thumb tells us that, for wavelengths around 555 nm, that starting point of low micro contrast loss is at the sensel pitch in microns plus 10%, so for a D800 with a 4.88 micron sensel pitch that would be at approx. f/5.6 . Therefore, for the scenario of magnification capability based on cycles/mm, the per pixel contrast at maximum (for a given purpose) magnification will be low, unless the subject contrast was very high.
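
A minimal sketch that reproduces the cutoff list above (the listed f-numbers are nominal third-stop values, so the script uses the exact ones) and the rough onset rule of thumb, assuming the 564 nm wavelength and 4.88 micron D800 pitch stated above:

WAVELENGTH_MM = 564e-6  # 564 nm expressed in mm

def cutoff_cy_mm(f_number):
    # Spatial frequency at which diffraction drives the MTF to zero
    return 1.0 / (WAVELENGTH_MM * f_number)

def matching_pitch_um(f_number):
    # Sensel pitch whose Nyquist frequency equals that cutoff
    return 1000.0 * WAVELENGTH_MM * f_number / 2.0

if __name__ == "__main__":
    # Exact third-stop apertures from f/32 down to f/16 (marked f/32, 28, 25, 22, 20, 18, 16)
    for k in range(7):
        n = 32.0 / 2.0 ** (k / 6.0)
        print(f"f/{n:.1f}: cutoff {cutoff_cy_mm(n):.1f} cy/mm, pitch {matching_pitch_um(n):.1f} micron")
    pitch_um = 4.88  # D800 sensel pitch, as above
    print(f"Rough micro-contrast onset: about f/{pitch_um * 1.1:.1f}")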

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Ray on July 14, 2012, 10:27:31 am
That is not too difficult to answer, denser sampling will get the most out of any optical projection. However, physics sets a hard absolute diffraction limit, based on wavelength and aperture. No other parameters play a role.

For a 564 nm wavelength (which is a nice Luminance-weighted average between Red, Green, and Blue) and a round aperture, diffraction will limit spatial frequency resolution in the focal plane by a zero modulation at resolutions beyond:
  • f/32 is limited at 55.4078 cycles/mm, a sample pitch of 9.0 micron
  • f/28 is limited at 62.1932 cycles/mm, a sample pitch of 8.0 micron
  • f/25 is limited at 69.8095 cycles/mm, a sample pitch of 7.2 micron
  • f/22 is limited at 78.3585 cycles/mm, a sample pitch of 6.4 micron
  • f/20 is limited at 87.9544 cycles/mm, a sample pitch of 5.7 micron
  • f/18 is limited at 98.7255 cycles/mm, a sample pitch of 5.1 micron
  • f/16 is limited at 110.816 cycles/mm, a sample pitch of 4.5 micron

There is zero resolution possible beyond those spatial frequencies, none, nada. The indicated sample pitches have a Nyquist frequency that limits resolution at that same maximum spatial frequency, and as a consequence of diffraction there cannot be any aliasing because there is no signal beyond Nyquist. These are the absolute optical and sampling limits.

Because our sensors are not point samplers but area samplers, and our lenses are not perfect, the optical limits are even a fraction lower, but the above limits cannot be broken no matter how close the small sampling area gets to resembling a point sampler, or how good our lenses are. Only by using shorter wavelength light can we squeeze a bit more resolution out of our diffraction limiting optics. That's why chip manufacturers use UV and X-ray wavelengths to expose the photo resist masking layers, to beat the absolute limits of diffraction.

And again, near those diffraction limits there will only be resolution for very high contrast features because the MTF response is so low. Micro contrast is increasingly limited once the diffraction pattern diameter starts to exceed 1.5x the sensel pitch. A very rough rule of thumb tells us that, for wavelengths around 555 nm, that starting point of low micro contrast loss is at the sensel pitch in microns plus 10%, so for a D800 with a 4.88 micron sensel pitch that would be at approx. f/5.6 . Therefore, for the scenario of magnification capability based on cycles/mm, the per pixel contrast at maximum (for a given purpose) magnification will be low, unless the subject contrast was very high.

Cheers,
Bart

Hi Bart,
That's an informative table which certainly makes sense to me as long as we emphasise that the sample pixel pitches are maximum sizes in relation to maximum resolutions and that any sensor with a smaller pixel pitch will potentially achieve the same diffraction-limited resolution at those F stops, provided the target is high contrast of course, such as B&W lines or similar contrasty detail one might find on a banknote.

For example, the 50D has a pixel pitch of 4.68 microns and a potential maximum resolution at the Nyquist limit of 106 cy/mm. It looks as though the 50D would be capable of delivering close to its maximum potential resolution at F16, whereas the 40D cannot deliver more than 88 cy/mm at any aperture up to F20.

One would therefore expect the 50D to deliver slightly higher resolution than the 40D at F16, which it does.

The D800 with a very marginally larger pixel than the 50D (4.7 versus 4.68 microns), should deliver a similar resolution to the 50D at F16 (in terms of cy/mm rather than LW/PH of course) especially if we use the D800E.

However, the D3 with a pixel pitch of 8.4 microns, and a maximum resolution at the Nyquist limit of only 60 cy/mm, will deliver worse resolution than the D800 all the way down to F28. At F32, diffraction will prevent it from achieving its full resolution potential under any circumstances. Resolution with both the D3 and D800 at F32 and F28 should be indistinguishable at any degree of magnification. But resolution at F25 could be very slightly better for the D800, viewed at say 200% on screen, I would predict. And if not at F25, then certainly at F22, wouldn't you agree?

Cheers!

Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 14, 2012, 07:49:52 pm
Hi Bart,
That's an informative table which certainly makes sense to me as long as we emphasise that the sample pixel pitches are maximum sizes in relation to maximum resolutions and that any sensor with a smaller pixel pitch will potentially achieve the same diffraction-limited resolution at those F stops, provided the target is high contrast of course, such as B&W lines or similar contrasty detail one might find on a banknote.

Yes, they are maximum values. Denser sampling will usually help because it provides a better oversampling of the diffraction pattern, which in turn allows more accurate restoration by deconvolution sharpening.

Quote
However, the D3 with a pixel pitch of 8.4 microns, and a maximum resolution at the Nyquist limit of only 60 cy/mm, will deliver worse resolution than the D800 all the way down to F28. At F32, diffraction will prevent it from achieving its full resolution potential under any circumstances. Resolution with both the D3 and D800 at F32 and F28 should be indistinguishable at any degree of magnification. But resolution at F25 could be very slightly better for the D800, viewed at say 200% on screen, I would predict. And if not at F25, then certainly at F22, wouldn't you agree?

Yes, that's how it happens to work out for high contrast detail. But do keep in mind that lower contrast subject matter will be mostly lost already at wider apertures than these limiting ones. Stopping down will kill low contrast micro-detail long before it kills all micro-detail (of any level of contrast). For the D800 it starts at f/5.6 and for the D3 it starts at f/9 .

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Ray on July 14, 2012, 11:24:52 pm
Yes, they are maximum values. Denser sampling will usually help because it provides a better oversampling of the diffraction pattern, which in turn allows more accurate restoration by deconvolution sharpening.

Yes, that's how it happens to work out for high contrast detail. But do keep in mind that lower contrast subject matter will be mostly lost already at wider apertures than these limiting ones. Stopping down will kill low contrast micro-detail long before it kills all micro-detail (of any level of contrast). For the D800 it starts at f/5.6 and for the D3 it starts at f/9 .

Cheers,
Bart

Hi Bart
Here we're getting into even higher degrees of speculation. How low is low-contrast? Also, at some predefined level of lowness, the strength of the camera's AA filter will have a more significant effect on the results. Many people were surprised to see how little resolution difference there seemed to be between the D800 and the D800E in the sample images on display. At a particular low level of contrast, I imagine the D800E would produce a more obvious improvement over the D800, than it does at high contrast.

What I propose is a test chart with the usual B&W lines spaced at progressively smaller intervals, then another column on the same sheet containing the same size and spacing of lines but with dark grey and white lines, then another column of medium grey and white lines, then pale grey and white lines, then very pale grey and white lines.

Such a chart would enable us to quantify more precisely the different capacities of different cameras, with different strengths of AA filters and different pixel densities, to handle predefined levels of low-contrast micro-detail at specific F stops.

However, I see another problem here. It is understood that one would try to use the same lens on both cameras when doing such a test, but the choice of lens may significantly affect the results, and I'm not referring just to the quality of the lens, in terms of good or bad, cheap or expensive, but rather the lens MTF characteristics and its design.

For example, certain high-contrast lenses, such as some of the Carl Zeiss lenses, are very appealing (and very expensive of course) because they have a high MTF response at relatively low frequencies, say up to 40 lp/mm. However at higher frequencies up to the Nyquist limit of the D800, say 100 cy/mm, the MTF response may be significantly lower in the Zeiss lens than another design of lens, whether the other lens is cheaper or not.

Therefore, when comparing the handling of low-contrast micro-detail with the D800 and D3, for example, the high-contrast lens would favour the D3 and the low-contrast lens would favour the D800. In other words, the relatively high MTF response of a low contrast lens at 70 to 100 lp/mm would be irrelevant with regard to the D3 because the D3 cannot resolve any more than 60 lp/mm, whatever the circumstances. But that additional contrast at high resolution could have a visible impact on the D800 results.

Cheers!

Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 15, 2012, 05:05:44 am
Marc,

Thanks for your feedback!

Best regards
Erik

Erik
Thanks for the summary!
I can certainly verify the accuracy of your conclusions with both the IQ180/Rodenstock HRs and the Nikon D800E/Leica Rs.
The sweet spot is f/5.6 to f/8.0, but f/4 to f/16 is certainly usable and, with proper sharpening, quite good.
Marc
Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 15, 2012, 08:49:46 am
Hi Bart
Here we're getting into even higher degrees of speculation. How low is low-contrast?

That's where MTF curves come to the rescue. When the MTF response for a given lens/aperture/camera combination is e.g. 10% for a certain spatial frequency, then a subject with 20% contrast and that spatial frequency of detail will be rendered at 2% resulting contrast. It's only just visible. If the subject itself only had some 5% contrast, then the resulting 0.5% would probably not even be resolved anymore.
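
The arithmetic is a straightforward multiplication; a trivial sketch of the numbers used above:

def rendered_contrast(subject_contrast, mtf_response):
    # Resulting contrast is (to first order) subject contrast times system MTF
    return subject_contrast * mtf_response

for subject in (0.20, 0.05):
    print(f"{subject:.0%} subject contrast at 10% MTF -> "
          f"{rendered_contrast(subject, 0.10):.1%} rendered contrast")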

Quote
Also, at some predefined level of lowness, the strength of the camera's AA filter will have a more significant effect on the results. Many people were surprised to see how little resolution difference there seemed to be between the D800 and the D800E in the sample images on display. At a particular low level of contrast, I imagine the D800E would produce a more obvious improvement over the D800, than it does at high contrast.

Indeed, such low contrast micro-detail will benefit from any boost in the MTF. However, at the boundaries of resolution we are also faced with demosaicing false color artifacts due to the aliased input signal. So it depends a bit on how things are implemented in the Raw converter, e.g. does noise reduction differentiate between high and low contrast micro-detail?

Quote
What I propose is a test chart with the usual B&W lines spaced at progressively smaller intervals, then another columm on the same sheet containing the same size and spacing of lines, but dark grey and white lines, then another column of medium grey and white lines, then pale gray and white lines, then very pale grey and white lines.

Such a chart would enable us to quantify more precisely the different capacities of different cameras, with different strengths of AA filters and different pixel densities, to handle predefined levels of low-contrast micro-detail at specific F stops.

That wouldn't be too difficult to make based on e.g. my resolution test chart (http://www.openphotographyforums.com/forums/showthread.php?t=13217), just lower the contrast by adjusting the output levels more towards mid-gray before assigning the output profile to it and printing it.
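
A minimal sketch of that levels adjustment using Pillow; 'chart.png' is a placeholder file name and the 20% target contrast is an arbitrary example:

from PIL import Image

def reduce_contrast(path_in, path_out, contrast=0.2):
    # Remap 0..255 onto a narrow band centred on mid-gray (128)
    img = Image.open(path_in).convert("L")
    lo = 128.0 * (1.0 - contrast)
    hi = 128.0 * (1.0 + contrast)
    img.point(lambda v: int(round(lo + (hi - lo) * v / 255.0))).save(path_out)

if __name__ == "__main__":
    reduce_contrast("chart.png", "chart_low_contrast.png", contrast=0.2)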

Quote
However, I see another problem here. It is understood that one would try to use the same lens on both cameras when doing such a test, but the choice of lens may significantly affect the results, and I'm not referring just to the quality of the lens, in terms of good or bad, cheap or expensive, but rather the lens MTF characteristics and its design.

Correct, we can only draw absolute conclusions based on a given combination of equipment. There is also variation between lens copies. But there are of course general findings that describe a cross-section of the population. One can use the slanted edge features on the test chart for that purpose.

For those who don't have Imatest software at their disposal, one can also use my Slanted Edge Analysis tool (http://bvdwolf.home.xs4all.nl/main/foto/psf/SlantedEdge.html). When the Blur radius is calculated, one can use section 4. of the tool page, and produce an MTF curve from that Gaussian blur.

That section 4 on the tool's page can also be used independently of the 3 prior steps, as a means to test various scenarios. What I have so far found is that blur radii of 0.7 are quite common for modern lenses around their optimal aperture of f/5.6 or thereabouts. Stopping down to f/11 - f/16 will produce blur radii in the order of 1.0 or a bit more, all before sharpening. Sharpening of course attempts to reduce the blur to very small values; a radius of 0.39 corresponds to an edge profile rise from 10-90% in approx. 1 pixel distance.
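
Assuming the 'blur radius' here is the standard deviation of a Gaussian PSF in pixels (an assumption, but consistent with the 0.39-radius / one-pixel 10-90% rise figure), a minimal sketch of how radius maps to MTF and edge rise:

import math

def gaussian_mtf(sigma_px, freq_cy_per_px):
    # MTF of a Gaussian blur of standard deviation sigma; frequency in cycles/pixel
    return math.exp(-2.0 * math.pi ** 2 * sigma_px ** 2 * freq_cy_per_px ** 2)

def rise_10_90_px(sigma_px):
    # 10-90% rise of a Gaussian-blurred step edge; 1.2816 is the 90th-percentile z-score
    return 2.0 * 1.2816 * sigma_px

if __name__ == "__main__":
    for sigma in (0.39, 0.7, 1.0):
        print(f"radius {sigma}: MTF at Nyquist {gaussian_mtf(sigma, 0.5):.2f}, "
              f"10-90% rise {rise_10_90_px(sigma):.2f} px")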

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: Hening Bettermann on July 15, 2012, 05:09:15 pm
Hi
sorry to interrupt this high-end discussion with a very trivial problem. Erik, I can't read your web site - when I adjust the browser to show the text lines all the way to the right, the characters are way too small for reading. When I enlarge them, the lines are cut off at right, and it does not help to use the horizontal scroll bar.
Kind regards - Hening.
Title: Re: Effects of diffraction
Post by: Ray on July 15, 2012, 07:45:50 pm
Correct, we can only draw absolute conclusions based on a given combination of equipment. There is also variation between lens copies. But there are of course general findings that describe a cross-section of the population. One can use the slanted edge features on the test chart for that purpose.


It's a pity that no-one is producing MTF results for the lens only, at frequencies that range as far as the Nyquist limit of modern DSLRs, say 10, 20, 40, 80 & 120 lp/mm. For best low-contrast micro-detail, the lens with the highest MTF response at 80 and 120 lp/mm might be preferred.

I suspect that most lenses are designed to make the coarsest detail look best, i.e. snappy and contrasty, at the expense of finer and fainter detail which gets lost. One can always enhance coarse detail that lacks contrast in post processing, but nothing can be done about detail that was never captured, because the contrast of the subject matter was too faint and the MTF response of the lens at that spatial frequency too low.

I wish Photodo had continued with their MTF testing of the lenses only, and added a few higher frequencies instead of discontinuing the whole process. I used to find their results useful.

Cheers!
Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 16, 2012, 12:21:47 am
Hi,

The 10, 20, 40 lp/mm standard of MTF presentation was developed in the film days. It's widely used; both Zeiss and Leica present MTF in that format.

Rodenstock and Schneider added curves for 60 lp/mm on some of their HR lenses for digital.

Olympus is using another set, taking their smaller format into account.

My guess is that fine detail is less important than high MTF at lower frequencies at normal viewing distances. A lot of research has gone into that. But now we often pixel peep, both on screen and in print.

A lens that "hits the diffraction limit" early  is probably a good lens. This can be used as a hint. A lens that needs to be stopped down to f/11 will not produce very good fine detail.

I agree that it would be nice to have MTF curves available that go to Nyquist. As an alternative it would be possible to show LW/PH or cy/px data for low MTF (like 20%).

Vendors' MTF data seem to be inconsistent. I presume that in many cases the MTF curve is more of an artist's impression than measured lens data.

Regarding the old Photodo data, it was measured by Hasselblad. Unfortunately the Photodo name was sold and the Photodo now in existence seems to provide little data of interest.

We have a periodical here in Sweden that still measures MTF at Hasselblad (AFAIK), but they only publish data for 20 lp/mm (although they measure up to 60 lp/mm).

DxOMark seems to have decent lens tests, and so do SLRgear and DPReview. And we have Photozone, of course.

Best regards
Erik


It's a pity that no-one is producing MTF results for the lens only, at frequencies that range as far as the Nyquist limit of modern DSLRs, say 10, 20, 40, 80 & 120 lp/mm. For best low-contrast micro-detail, the lens with the highest MTF response at 80 and 120 lp/mm might be preferred.

I suspect that most lenses are designed to make the coarsest detail look best, i.e. snappy and contrasty, at the expense of finer and fainter detail which gets lost. One can always enhance coarse detail that lacks contrast in post processing, but nothing can be done about detail that was never captured, because the contrast of the subject matter was too faint and the MTF response of the lens at that spatial frequency too low.

I wish Photodo had continued with their MTF testing of the lenses only, and added a few higher frequencies instead of discontinuing the whole process. I used to find their results useful.

Cheers!

Title: Re: Effects of diffraction
Post by: Bart_van_der_Wolf on July 16, 2012, 06:44:15 am
A lens that "hits the diffraction limit" early  is probably a good lens. This can be used as a hint. A lens that needs to be stopped down to f/11 will not produce very good fine detail.

Hi Erik,

Yes, that seems to be a fair assessment. To avoid confusion though, I'd rather call it "starts being affected by diffraction more than by residual lens aberrations" than limited. There is only one limit, and that's the absolute diffraction limit where resolution is lost even for high contrast signals.

A lens stopped down beyond its optimal aperture (often one or two stops narrower than wide open) will start to lose overall contrast due to diffraction, and the highest spatial frequencies will suffer most because they already have the lowest contrast. However, as long as the input signal has a high enough contrast at those highest spatial frequencies, there will still be a resolved final signal; it is not limited. It is only limited when, at any signal contrast, the resulting signal is always lost, and that is wavelength and aperture dependent.

Because our Bayer CFA sensors and Raw converters favor Luminance over Chroma resolution, the above-mentioned list of limits for 564 nm wavelengths allows a reasonable prediction of what to expect. As such it is also useful to predict 'total immunity' to aliasing artifacts. Without signal beyond the Nyquist spatial frequency, there can be no aliasing. This can be useful in situations (with stationary subjects) where it is easier to shoot an additional truly diffraction-limited shot to be used in postprocessing of problematic subject matter, than to spend valuable time on trying to fix luminosity moiré.

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: ErikKaffehr on July 16, 2012, 04:37:47 pm
Bart,


Thanks for the correction!

Best regards
Erik

Hi Erik,

Yes, that seems to be a fair assessment. To avoid confusion though, I'd rather call it "starts being affected by diffraction more than by residual lens aberrations" than limited. There is only one limit, and that's the absolute diffraction limit where resolution is lost even for high contrast signals.

A lens stopped down beyond its optimal aperture (often one or two stops narrower than wide open) will start to lose overall contrast due to diffraction, and the highest spatial frequencies will suffer most because they already have the lowest contrast. However, as long as the input signal has a high enough contrast at those highest spatial frequencies, there will still be a resolved final signal; it is not limited. It is only limited when, at any signal contrast, the resulting signal is always lost, and that is wavelength and aperture dependent.

Because our Bayer CFA sensors and Raw converters favor Luminance over Chroma resolution, the above-mentioned list of limits for 564 nm wavelengths allows a reasonable prediction of what to expect. As such it is also useful to predict 'total immunity' to aliasing artifacts. Without signal beyond the Nyquist spatial frequency, there can be no aliasing. This can be useful in situations (with stationary subjects) where it is easier to shoot an additional truly diffraction-limited shot to be used in postprocessing of problematic subject matter, than to spend valuable time on trying to fix luminosity moiré.

Cheers,
Bart
Title: Re: Effects of diffraction
Post by: bjanes on July 20, 2012, 10:12:20 am
That is not too difficult to answer, denser sampling will get the most out of any optical projection. However, physics sets a hard absolute diffraction limit, based on wavelength and aperture. No other parameters play a role.

For a 564 nm wavelength (which is a nice Luminance-weighted average between Red, Green, and Blue) and a round aperture, diffraction will limit spatial frequency resolution in the focal plane by a zero modulation at resolutions beyond:
  • f/32 is limited at 55.4078 cycles/mm, a sample pitch of 9.0 micron
  • f/28 is limited at 62.1932 cycles/mm, a sample pitch of 8.0 micron
  • f/25 is limited at 69.8095 cycles/mm, a sample pitch of 7.2 micron
  • f/22 is limited at 78.3585 cycles/mm, a sample pitch of 6.4 micron
  • f/20 is limited at 87.9544 cycles/mm, a sample pitch of 5.7 micron
  • f/18 is limited at 98.7255 cycles/mm, a sample pitch of 5.1 micron
  • f/16 is limited at 110.816 cycles/mm, a sample pitch of 4.5 micron

There is zero resolution possible beyond those spatial frequencies, none, nada. The indicated sample pitches have a Nyquist frequency that limits resolution at that same maximum spatial frequency, and as a consequence of diffraction there cannot be any aliasing because there is no signal beyond Nyquist. These are the absolute optical and sampling limits.

Because our sensors are not point samplers but area samplers, and our lenses are not perfect, the optical limits are even a fraction lower, but the above limits cannot be broken no matter how close the small sampling area gets to resembling a point sampler, or how good our lenses are. Only by using shorter wavelength light can we squeeze a bit more resolution out of our diffraction limiting optics. That's why chip manufacturers use UV and X-ray wavelengths to expose the photo resist masking layers, to beat the absolute limits of diffraction.

And again, near those diffraction limits there will only be resolution for very high contrast features because the MTF response is so low. Micro contrast is increasingly limited once the diffraction pattern diameter starts to exceed 1.5x the sensel pitch. A very rough rule of thumb tells us that, for wavelengths around 555 nm, that starting point of low micro contrast loss is at the sensel pitch in microns plus 10%, so for a D800 with a 4.88 micron sensel pitch that would be at approx. f/5.6 . Therefore, for the scenario of magnification capability based on cycles/mm, the per pixel contrast at maximum (for a given purpose) magnification will be low, unless the subject contrast was very high.

Bart,

Very useful data. I presume that 0 MTF represents the Dawes limit. To illustrate some of your points and invite further discussion I will present some of my own data for the Nikon D800e and the MicroNikkor 60mm f/2.8 AFS derived from your resolution chart and calculating the Gaussian radius with your method. I published the shots of your chart earlier, but have revised the resolution at f/5.6 as per your suggestion and also the other resolution figures. Since the Gaussian radius was excessive at f/2.8 and improved with stopping down, I suspect some degree of defocus may be present.

The table summarizes the results. MTF at the Rayleigh limit and at 50% contrast are from Roger Clark's site. At f/5.6, resolution is close to the Nyquist limit for low contrast (approximately the Rayleigh limit), but resolution at 50% contrast is appreciably lower as measured by Imatest, though it can be improved with sharpening. At f/16 and smaller apertures, resolution by your method and by Imatest is close to the theoretical limits.

Examination of the images from your resolution chart is interesting. The red circle indicates the Nyquist limit. At f/16 the system resolves close to the Nyquist limit, and contrast could be improved by careful deconvolution sharpening. At f/32 there is complete loss of any detail near Nyquist, and deconvolution sharpening would probably be of little benefit.

(http://bjanes.smugmug.com/Photography/800e-Resolution/i-LszW4fJ/1/O/Img0628120003.png)

(http://bjanes.smugmug.com/Photography/800e-Resolution/i-NqkrgVh/1/O/Img0628120006.png)

(http://bjanes.smugmug.com/Photography/800e-Resolution/i-kmJPpvR/1/O/Img0628120008.png)

(http://bjanes.smugmug.com/Photography/800e-Resolution/23864799_7bdFcp#!i=1942503873&k=wG7Wkxx)

The edge response as measured by Imatest demonstrates why an increased sharpening radius is needed for f/32. The sharpening algorithm needs to look out further, to areas where there is a reasonable semblance of white and black. A narrow radius would encompass only gray values.

(http://bjanes.smugmug.com/Photography/800e-Resolution/i-9LLQkKC/0/O/Img0628120003YA901cpp.png)

(http://bjanes.smugmug.com/Photography/800e-Resolution/i-vSzJjzD/0/O/Img0628120008YA901cpp.png)
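
A minimal sketch of that behaviour, using Gaussian stand-ins for a soft, heavily diffracted edge and a plain unsharp mask; the blur and radius values are illustrative, not measured:

import numpy as np
from scipy.ndimage import gaussian_filter1d

def blurred_edge(sigma, length=129):
    edge = np.zeros(length)
    edge[length // 2:] = 1.0
    return gaussian_filter1d(edge, sigma)

def unsharp(signal, radius, amount=1.5):
    # Classic unsharp mask: add back the high-pass residual
    return signal + amount * (signal - gaussian_filter1d(signal, radius))

soft = blurred_edge(2.5)  # stand-in for a heavily diffracted edge
print(f"unsharpened: max edge slope {np.diff(soft).max():.3f} per pixel")
for radius in (0.7, 2.5):
    print(f"USM radius {radius}: max edge slope {np.diff(unsharp(soft, radius)).max():.3f} per pixel")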

Comments would be appreciated.

Regards,

Bill