
Author Topic: How does one tell if a lens outresolves a sensor?  (Read 10935 times)

AFairley

How does one tell if a lens outresolves a sensor?
« on: February 10, 2012, 12:50:24 pm »

Well, how do you?  Is it something that is determined on a test bed, or calculated from MTF charts and the like?  Thank you.

Chairman Bill

Re: How does one tell if a lens outresolves a sensor?
« Reply #1 on: February 10, 2012, 12:51:41 pm »

And the corollary - how do you know whether your lens will be out-done by the resolving power of your sensor?

Fine_Art

Re: How does one tell if a lens outresolves a sensor?
« Reply #2 on: February 10, 2012, 02:23:19 pm »

It should be very easy for anyone testing a new camera or lens for purchase.

Modern cameras are very well designed, so a lens that can out-resolve the sensor should give pictures that come very close to the Nyquist limit of the sensor, i.e. one line per pixel. Even better, to test all angles at once, it should resolve down to a central blur of close to 92 pixels in diameter on Bart van der Wolf's 144-cycle version of the Siemens star chart. It's an elegant test. Print his chart large enough that the 92-pixel circle falls well outside the central region where your printer chokes on the converging lines. With film there used to be a factor of about 70% for losses in the camera/film system; that is gone these days, and you pretty much hit the limits of the sensor.
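A minimal sketch of where the 92-pixel figure comes from, assuming a 144-cycle star and a sensor Nyquist limit of 0.5 cycles/pixel:

Code:
import math

def nyquist_blur_diameter(cycles, limit_cy_per_px=0.5):
    # A Siemens star with `cycles` spokes has a local frequency of
    # cycles / (pi * d) cycles/pixel along a circle of diameter d pixels.
    # Solve for the diameter where that frequency reaches the given limit.
    return cycles / (math.pi * limit_cy_per_px)

print(nyquist_blur_diameter(144))  # ~91.7 px, i.e. the "92 pixel" blur diameter

Any detail that appears "resolved" inside that diameter is above Nyquist and can only show up as aliasing.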

If the sensor out-resolves the lens, you get mush in the details of the picture. It can also show up as a defect in a cheap lens, like smearing in the corners. Look at the corners of the maple shot in the D800 samples, the top right and bottom left in particular. It looks like a lens element is out of alignment or simply failing there.



Bart_van_der_Wolf

Re: How does one tell if a lens outresolves a sensor?
« Reply #3 on: February 10, 2012, 02:44:13 pm »

Quote from: Fine_Art on February 10, 2012, 02:23:19 pm
It should be very easy for anyone testing a new camera or lens for purchase.

Modern cameras are very well designed, so a lens that can out-resolve the sensor should give pictures that come very close to the Nyquist limit of the sensor, i.e. one line per pixel. Even better, to test all angles at once, it should resolve down to a central blur of close to 92 pixels in diameter on Bart van der Wolf's 144-cycle version of the Siemens star chart.

Hi,

Yes, I was just about to post the same suggestion. Some results are discussed here, and here. There are links to the target itself.

Testing whether the lens or the sensor/OLPF is pulling down the combined result is made a bit easier with my resolution target. When the lens is good, one will be able to resolve close to the Nyquist frequency at a central blur of 92 pixels, and thus the sensor will be the limiting factor. If the lens completely fails to come even close to the 92-pixel minimum, then the lens will be the limiting factor.
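A rough way to read the measured central blur diameter, as a sketch; the 1.2 tolerance is an arbitrary illustrative threshold, not part of the target's method:

Code:
def limiting_factor(measured_blur_px, nyquist_blur_px=92.0, tolerance=1.2):
    # Blur close to the theoretical ~92 px: detail is resolved almost up to
    # the sensor's Nyquist frequency, so the sensor/OLPF is the bottleneck.
    # A much larger blur: the lens gives up well before the sensor does.
    if measured_blur_px / nyquist_blur_px <= tolerance:
        return "sensor/OLPF limited (lens out-resolves the sensor)"
    return "lens limited (sensor out-resolves the lens)"

print(limiting_factor(100))  # close to 92 -> sensor/OLPF limited
print(limiting_factor(220))  # far above 92 -> lens limited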

Cheers,
Bart

AJSJones

Re: How does one tell if a lens outresolves a sensor?
« Reply #4 on: February 10, 2012, 05:10:15 pm »

It's not quite as simple a question as that!  Both the sensor and the lens have properties that determine the ability to resolve details in an image (e.g., the white between adjacent black lines) as some shade of gray; this is what the so-called MTF measures. As the lines get closer together, the value approaches "pure gray". What you see in the image is the combination of the lens's properties and the sensor's properties. The "outresolving" phenomenon is a gentle slope, not a sudden cut-off.
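A small sketch of that combination, assuming an idealized diffraction-limited lens and a 100% fill-factor pixel aperture (no OLPF, no aberrations; the wavelength, f-number and pitch are illustrative assumptions):

Code:
import numpy as np

wavelength_mm = 550e-6   # 550 nm (assumed)
f_number = 5.6           # assumed
pitch_mm = 4.7e-3        # 4.7 micron pixel pitch (assumed)

f = np.linspace(0, 200, 401)             # spatial frequency in cycles/mm

# Diffraction-limited lens MTF for a circular aperture
f_cutoff = 1.0 / (wavelength_mm * f_number)
x = np.clip(f / f_cutoff, 0.0, 1.0)
mtf_lens = (2 / np.pi) * (np.arccos(x) - x * np.sqrt(1 - x**2))

# Pixel-aperture MTF (np.sinc is sin(pi x)/(pi x))
mtf_sensor = np.abs(np.sinc(f * pitch_mm))

# To first order, the recorded image sees the product of the two
mtf_system = mtf_lens * mtf_sensor

nyquist = 0.5 / pitch_mm
print(f"system MTF at Nyquist ({nyquist:.0f} cy/mm): "
      f"{np.interp(nyquist, f, mtf_system):.2f}")

The product falls off smoothly, which is why "outresolving" is a matter of degree rather than a hard threshold.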

Christoph C. Feldhaim

Re: How does one tell if a lens outresolves a sensor?
« Reply #5 on: February 10, 2012, 05:31:41 pm »

This forum outresolves my brain. But how can I measure that ?

AJSJones

Re: How does one tell if a lens outresolves a sensor?
« Reply #6 on: February 10, 2012, 07:06:56 pm »

Quote from: Bart_van_der_Wolf on February 10, 2012, 02:44:13 pm
Hi,

Yes, I was just about to post the same suggestion. Some results are discussed here, and here. There are links to the target itself.

Testing whether the lens or the sensor/OLPF is pulling down the combined result is made a bit easier with my resolution target. When the lens is good, one will be able to resolve close to the Nyquist frequency at a central blur of 92 pixels, and thus the sensor will be the limiting factor. If the lens completely fails to come even close to the 92-pixel minimum, then the lens will be the limiting factor.

Cheers,
Bart

Thanks Bart - I just read the info at your links - very helpful!

AJSJones

Re: How does one tell if a lens outresolves a sensor?
« Reply #7 on: February 11, 2012, 05:28:05 pm »

A techie follow-up, if I may!
How low does the lens's MTF have to drop before one sees a significant decrease in the limiting resolution (or increase in the 92 pixel diameter) - and is there some MTF value associated with this "limiting resolution"?

Thanks
Andy

bjanes

Re: How does one tell if a lens outresolves a sensor?
« Reply #8 on: February 11, 2012, 06:04:37 pm »

Quote from: AJSJones on February 11, 2012, 05:28:05 pm
A techie follow-up, if I may!
How low does the lens's MTF have to drop before one sees a significant decrease in the limiting resolution (or increase in the 92 pixel diameter) - and is there some MTF value associated with this "limiting resolution"?

I think that the limiting resolution is near the Rayleigh limit, which is often quoted at around 10% MTF, but some calculations by Bart indicate the Rayleigh limit is closer to 20%. Hopefully, he will comment.

Regards,

Bill


Bart_van_der_Wolf

Re: How does one tell if a lens outresolves a sensor?
« Reply #9 on: February 11, 2012, 08:18:04 pm »

Quote from: AJSJones on February 11, 2012, 05:28:05 pm
A techie follow-up, if I may!
How low does the lens's MTF have to drop before one sees a significant decrease in the limiting resolution (or increase in the 92 pixel diameter) - and is there some MTF value associated with this "limiting resolution"?

Hi Andy,

Good questions, but hard ones to answer with a single number.

First of all, with my proposed target one doesn't test optical MTF alone, but rather the system MTF (lens (de)focus, residual aberrations, veiling glare, diffraction, OLPF, microlenses, sensel aperture). When the system MTF goes to zero, that's the limit. But it is of course not that simple, because MTF describes contrast reduction as a function of spatial frequency; if one increases the input contrast enough, there will be more residual contrast at any given frequency. I've designed the target to produce a moderate contrast, but one would need to spot-measure it after printing (which is beyond my control) to get a more accurate input value.

Second, as for an MTF value associated with the limiting resolution, the ISO standard for digital camera resolution and the ones for scanner resolution used to remark that 10% MTF (or SFR) corresponds well with the limiting resolution of the human eye. I don't know if the more recent versions of the standards still mention that.

Quote from: bjanes on February 11, 2012, 06:04:37 pm
I think that the limiting resolution is near the Rayleigh limit, which is often quoted at around 10% MTF, but some calculations by Bart indicate the Rayleigh limit is closer to 20%. Hopefully, he will comment.

Yes, Bill is correct. The difference is due to accidental alignment with the sensel grid. When the gap at the Rayleigh limit between two diffraction PSF peaks aligns with a sensel sitting between the diffraction patterns, the MTF can be something like 26%. When, however, the peaks of the adjacent diffraction patterns happen to be aligned with the sensels, the gap could have almost zero MTF, depending on sensel pitch. So on average one could say it's 10%, but in practice it varies between approx. 26% and 0%, depending on the particular parameters (wavelength, aperture number, sensel pitch, sensel grid alignment).

EDIT
I'll illustrate it a bit better. This is how the Rayleigh criterion at f/5.6, assuming a 564 nm wavelength, looks when sampled at a 0.1 micron pitch:

[image: Rayleigh criterion at f/5.6, 564 nm, sampled at 0.1 micron pitch]

Now imagine area sampling that same Rayleigh criterion with 1 to 6 micron sensels, which will look like this:

[images: the same Rayleigh criterion area-sampled with 1, 2, 3, 4, 5 and 6 micron sensels]

When you magnify the images, it becomes apparent that, due to the area sampling of our sensels, it quickly becomes impossible to find any contrast between the diffraction peaks. In fact, in order to reliably resolve the two diffraction patterns according to the Rayleigh criterion, we require more than 4 pixels per diffraction pattern diameter. In other words, to resolve the two diffraction patterns at f/5.6 (assuming a 564 nm wavelength), we require a sensel pitch of less than (2.44 x 0.564 x 5.6) / 4 = 1.93 micron, and even then the alignment of the diffraction patterns may blend them to a uniform brightness in the worst case.
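That arithmetic, spelled out (564 nm and f/5.6 as assumed above, with the "more than 4 pixels per Airy diameter" rule of thumb):

Code:
wavelength_um = 0.564
f_number = 5.6

airy_diameter_um = 2.44 * wavelength_um * f_number   # null-to-null Airy disk diameter
rayleigh_sep_um  = 1.22 * wavelength_um * f_number   # peak separation at the Rayleigh limit
max_pitch_um     = airy_diameter_um / 4              # more than 4 pixels per Airy diameter

print(f"Airy disk diameter : {airy_diameter_um:.2f} um")   # ~7.71 um
print(f"Rayleigh separation: {rayleigh_sep_um:.2f} um")    # ~3.85 um
print(f"Max sensel pitch   : {max_pitch_um:.2f} um")       # ~1.93 um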

Fortunately, my resolution target is relatively insensitive to hor/ver sensel alignment because it tests the resolution at various angles, not just the ones that happen to align with the grid.

Cheers,
Bart

P.S. I've been working on a webpage that illustrates it with visual examples. Although it basically works, the browser keeps asking for confirmation because the JavaScript takes a long time to finish, so I need some more time to get things sorted out and sped up.
« Last Edit: February 12, 2012, 01:37:55 pm by BartvanderWolf »

AJSJones

Re: How does one tell if a lens outresolves a sensor?
« Reply #10 on: February 11, 2012, 10:40:30 pm »

Thanks Bart - very helpful. 
Given that most high-frequency stuff I might shoot would be unlikely to be regular enough to vary between alignment and non-alignment (not too many brick walls or fabrics in my landscapes), the 10% value sounds practical, as well as close to the Rayleigh limit (I've also seen 9% in some descriptions, but that's essentially the same!).

Clearly, a ton of parameters affect the actual numbers, but if everything is kept constant except the lens (i.e. I simply use the same camera), then comparison results will ultimately come down to differences in lens performance, right?  In other words, we can't answer the question quite as the OP posed it, but we can say when a lens doesn't outresolve the sensor plus the other system components, provided the taking conditions are optimal. For example, by progressively stopping down, or moving out from the centre of the image circle to the edges, and noting the central blur diameter, I would get curves containing useful information for lens performance comparisons. The distance and framing independence of the star makes such tests easier to contemplate. If I get round to doing such tests, I'll post them here :D  Perhaps when Canon releases an FF sensor with a nice high pixel count, so we can really test when the lenses poop out.

ErikKaffehr

Re: How does one tell if a lens outresolves a sensor?
« Reply #11 on: February 11, 2012, 11:30:31 pm »

Hi,

As a side comment, I recently measured MTF on three cameras using Imatest. In this case I used sample images from Imaging Resource, although I have two of the three cameras and the third on order. The cameras are all Sony Alphas:

- A700 12 MP
- A55   16 MP
- A77   24 MP

Interestingly enough the MTF curves were very similar and all cameras had between 16.6% and 21.5% MTF at Nyquist according to Imatest.

I also plotted MTF50 against sqrt(MP). MTF50 is the spatial frequency at which the MTF drops to 50%, and according to Norman Koren (the author of Imatest) it correlates closely with our perception of sharpness. It seems that MTF50 (in LW/PH) increases linearly with the number of pixels along one dimension. So it seems that good lenses outresolve the best sensors of today near the axis (where the measurement was made).
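For reference, the conversion behind that plot: MTF50 in cycles/pixel scales to line widths per picture height (LW/PH) by the pixel count along the height, so a constant per-pixel MTF50 gives LW/PH rising linearly with sqrt(MP). A sketch, with an assumed 0.30 cy/px MTF50 and the nominal pixel heights of these bodies:

Code:
def lwph(mtf50_cy_per_px, picture_height_px):
    # 1 cycle = 2 line widths, so LW/PH = 2 * (cycles/pixel) * (pixels of height)
    return 2 * mtf50_cy_per_px * picture_height_px

for name, height in [("A700", 2848), ("A55", 3264), ("A77", 4000)]:
    print(name, round(lwph(0.30, height)), "LW/PH")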

Best regards

Erik


« Last Edit: February 18, 2012, 03:31:52 pm by ErikKaffehr »

framah

Re: How does one tell if a lens outresolves a sensor?
« Reply #12 on: February 12, 2012, 09:28:06 am »

Quote from: Christoph C. Feldhaim on February 10, 2012, 05:31:41 pm
This forum outresolves my brain. But how can I measure that ?

It depends on 2 factors... whether blood is trickling out of one ear or both ears.

Glad to help. ;D

BJL

Re: How does one tell if a lens outresolves a sensor?
« Reply #13 on: February 12, 2012, 10:16:56 am »

So at the smallest of these pixel sizes, about 4 microns, the same lenses on 36x24mm would still give more or less full advertised resolution, at least in the central part of an image, which in many cases is enough to make the image noticeably better than it would be with a lower-resolution sensor. That is 9000x6000, or 54 MP (in linear resolution not very far beyond the 7360x4912 of the D800), which by the way is close to what large format scanning backs used to offer last time I looked, though those are "X3", measuring all three colors at each location.
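The pixel-count arithmetic behind those numbers, as a sketch:

Code:
def pixel_grid(sensor_w_mm, sensor_h_mm, pitch_um):
    # Number of pixels along each side, and the resulting megapixel count
    w = int(sensor_w_mm * 1000 / pitch_um)
    h = int(sensor_h_mm * 1000 / pitch_um)
    return w, h, w * h / 1e6

print(pixel_grid(36, 24, 4.0))    # (9000, 6000, 54.0)  -> the 54 MP figure
print(pixel_grid(36, 24, 4.88))   # roughly the D800's 7360 x 4912 grid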

And these tests show near-linear improvement, so there is surely considerable room to go on the shoulder of sub-linear improvement (still getting more lines per picture height, but fewer lines per pixel).

On the other hand, MF might well still win comparisons of overall IQ, if it turns out that the best MF lenses handle the edges and corners better.

Another conclusion: without an OLPF, moiré will still be an issue at 54MP in 36x24mm, and probably well beyond!

Bart_van_der_Wolf

Re: How does one tell if a lens outresolves a sensor?
« Reply #14 on: February 12, 2012, 12:07:11 pm »

Quote from: BJL on February 12, 2012, 10:16:56 am
Another conclusion: without an OLPF, moiré will still be an issue at 54MP in 36x24mm, and probably well beyond!

Hi,

Yes, as you can deduce from the info that I just added to my post, to avoid aliasing in an image taken with a perfect lens at f/5.6, one requires sensels of at most 2 microns if diffraction is to be used as an OLPF. Such small sensels will have other issues, but then we can say that resolution is lens limited and no longer mostly sensor limited.
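Extending the same "more than 4 pixels per Airy diameter" criterion to other apertures (564 nm assumed) gives a feel for where diffraction alone starts to act as the low-pass filter:

Code:
wavelength_um = 0.564
for n in (2.8, 4, 5.6, 8, 11, 16):
    max_pitch_um = 2.44 * wavelength_um * n / 4
    print(f"f/{n:<4}: sensel pitch <= {max_pitch_um:.1f} um")
# f/5.6 comes out at ~1.9 um, i.e. the "at most 2 microns" figure above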

Cheers,
Bart
« Last Edit: February 12, 2012, 01:38:45 pm by BartvanderWolf »

ErikKaffehr

Re: How does one tell if a lens outresolves a sensor?
« Reply #15 on: February 12, 2012, 03:13:10 pm »

Hi,

As a side note, I also found raw samples from the Leica M9 on Imaging Resource. If we compare the Leica to the Sony Alpha 900 we get about the same LW/PH at 50% MTF assuming "standard sharpening", but MTF at Nyquist is much higher on the Leica.

LW/PH at 18% MTF is around Nyquist on the Sony but well beyond Nyquist on the Leica. I guess that the Leica may be slightly favored by a larger aperture.

Best regards
Erik


bjanes

Re: How does one tell if a lens outresolves a sensor?
« Reply #16 on: February 12, 2012, 05:53:10 pm »

Quote from: ErikKaffehr on February 12, 2012, 03:13:10 pm
As a side note, I also found raw samples from the Leica M9 on Imaging Resource. If we compare the Leica to the Sony Alpha 900 we get about the same LW/PH at 50% MTF assuming "standard sharpening", but MTF at Nyquist is much higher on the Leica.

LW/PH at 18% MTF is around Nyquist on the Sony but well beyond Nyquist on the Leica. I guess that the Leica may be slightly favored by a larger aperture.

Erik,

Aliasing and sharpening artifacts can result in "resolution" well beyond Nyquist when one is using a slanted-edge target. The best reference I could find after a brief search is here. Some time ago, I had an online discussion with Norman Koren and he acknowledged this limitation of the slanted-edge method and said that alternate methods to eliminate this false resolution were under consideration, but I don't know if he implemented anything. Hopefully, Bart or others can amplify here, but I don't think that the "resolution" beyond Nyquist that you observed with the Leica (which lacks a low-pass filter) is desirable; rather, it is a penalty that must be paid for getting more contrast below Nyquist when one omits the low-pass filter.

Regards,

Bill

Bart_van_der_Wolf

Re: How does one tell if a lens outresolves a sensor?
« Reply #17 on: February 12, 2012, 08:11:00 pm »

Quote from: bjanes on February 12, 2012, 05:53:10 pm
Erik,

Aliasing and sharpening artifacts can result in "resolution" well beyond Nyquist when one is using a slanted-edge target. The best reference I could find after a brief search is here. Some time ago, I had an online discussion with Norman Koren and he acknowledged this limitation of the slanted-edge method and said that alternate methods to eliminate this false resolution were under consideration, but I don't know if he implemented anything. Hopefully, Bart or others can amplify here, but I don't think that the "resolution" beyond Nyquist that you observed with the Leica (which lacks a low-pass filter) is desirable; rather, it is a penalty that must be paid for getting more contrast below Nyquist when one omits the low-pass filter.

Hi Bill,

The issue, as usual, is in the interpretation of the results. The slanted-edge method, through some clever math, will give an over-sampled resolution result. A slanted edge at an angle of approx. 5.7 degrees (atan(1/10)) oversamples the edge profile by a factor of 10x, which the ISO standard method that Imatest adheres to (as an option) bins down to 4x (thus adding statistical robustness, with noise in mind).
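The geometry behind the 10x figure, as a sketch: tilting the edge by atan(1/10) shifts it by 1/10 pixel per row, so ten consecutive rows sample the edge at ten distinct sub-pixel phases.

Code:
import math

rows_per_pixel_shift = 10   # the edge crosses one pixel column every 10 rows
angle_deg = math.degrees(math.atan(1 / rows_per_pixel_shift))
print(f"slant angle: {angle_deg:.2f} degrees")   # ~5.71
print(f"edge-profile oversampling: {rows_per_pixel_shift}x (binned to 4x by the ISO method)")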

That means that Imatest, and the ISO standard resolution tests, will give more information than is actually visible. When I used to exchange information directly with Norman, he showed a lot of interest in my Radial Gradient chart suggestion, and added it as one of the charts that Imatest can produce and read. Since then, the ISO standards organisation has also embraced it as an alternative method to determine the resolution of discrete-sampling sensors.

This means that the signal shown above the Nyquist limit, the shaded area in the graphs that Erik showed, is basically an indication of the amount of aliasing (if spatial detail above the Nyquist frequency is present in the scene). Conceptually, the signal shown above the Nyquist frequency is mirrored horizontally at the Nyquist frequency and added to the signal below the Nyquist frequency as aliases (which appear as increasingly larger aliases of lower amplitude). You are correct in describing that as a penalty for a higher MTF below Nyquist.
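A small sketch of that folding: any frequency above Nyquist (0.5 cycles/pixel) reappears mirrored below it.

Code:
def aliased_frequency(f_cy_per_px, nyquist=0.5):
    # Content above Nyquist folds back: it is mirrored about the Nyquist frequency
    f = f_cy_per_px % (2 * nyquist)      # sampling frequency = 2 * Nyquist
    return f if f <= nyquist else 2 * nyquist - f

for f in (0.3, 0.5, 0.6, 0.8, 0.95):
    print(f"{f:.2f} cy/px in the scene -> {aliased_frequency(f):.2f} cy/px in the image")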

My Radial Gradient test chart doesn't provide the analytical detail above Nyquist (although it does offer the slanted edges to do so), but it does show a direct visual representation of the aliasing artifacts as they could manifest themselves given repetitive patterns of certain spatial frequencies at the given angles.

Cheers,
Bart

ErikKaffehr

Re: How does one tell if a lens outresolves a sensor?
« Reply #18 on: February 12, 2012, 11:42:59 pm »

Hi,

I got a bit enthusiastic about the Leica. The reason was really that it is one of the few cameras without OLP-filter, and I wanted to see how OLP-filtering affects the MTF curve.

I'm aware that there is no real resolution beyond Nyquist. What I really saw was that the MTF to the left of Nyquist was quite similar to that of the Sony Alpha 900 I compared with. MTF at Nyquist was much higher on the Leica, which will obviously cause aliasing.

Best regards
Erik



Bart_van_der_Wolf

Re: How does one tell if a lens outresolves a sensor?
« Reply #19 on: February 13, 2012, 06:34:49 am »

Quote from: ErikKaffehr on February 12, 2012, 11:42:59 pm
Hi,

I got a bit enthusiastic about the Leica. The reason was really that it is one of the few cameras without OLP-filter, and I wanted to see how OLP-filtering affects the MTF curve.

Hi Erik,

No problem. What is relevant is that for subject matter that is imaged large enough, the MTF response near Nyquist is higher than with an OLPF, and that is welcome. It's only the subject matter that has a higher spatial frequency than the Nyquist frequency that is going to cause trouble when it's in the DOF zone. Outside the DOF zone, defocus will gradually start to act as a low-pass filter.

When one has the choice, one could also use a longer focal length or shoot from a shorter distance to increase the magnification factor and thus reduce certain aliasing artifacts. One might need to use stitching to regain the desired FOV, but for e.g. stationary product shots it would work.
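A sketch of why that works: magnification scales the subject detail's frequency at the sensor, so enough magnification pushes it back below Nyquist (all numbers here are assumed for illustration).

Code:
def cycles_per_pixel(subject_detail_cy_per_mm, magnification, pitch_um):
    # Frequency of a subject detail as recorded on the sensor, in cycles/pixel
    cy_per_mm_on_sensor = subject_detail_cy_per_mm / magnification
    return cy_per_mm_on_sensor * pitch_um / 1000.0

# e.g. a 3 cy/mm fabric weave on a 4.7 um pitch sensor at three magnifications
for m in (0.02, 0.04, 0.08):
    f = cycles_per_pixel(3.0, m, 4.7)
    print(f"magnification {m}: {f:.2f} cy/px", "(aliases)" if f > 0.5 else "(below Nyquist)")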

Cheers,
Bart