
Author Topic: How much sensor resolution do we need to match our lenses?  (Read 29690 times)

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word

There has been much discussion in this and other forums of how much resolution we need in our sensors. Erik Kaffehr started a thread about how much we can see. I’d like to come at it from another angle.

I have been reading a book by Robert Fiete, entitled Modeling the Imaging Chain of Digital Cameras.

There’s a chapter on balancing the resolution of the lens and the sensor, which introduces the concept of system Q, defined as:

Q = 2 * fcs / fco

where fcs is the cutoff frequency of the sampling system (sensor), and fco is the cutoff frequency of the optical system (lens).

An imaging system is in some sense “balanced” when the frequencies are the same, and thus Q=2.

The assumptions of the chapter where Q is discussed are probably appropriate for the kinds of aerial and satellite surveillance systems the author works with, but they are not usually met in the photographic systems that most of us work with:

1)   Monochromatic sensors (no CFA)
2)   Diffraction-limited optics
3)   No anti-aliasing filter

Under these assumptions, the cutoff frequency of the sensor is half the inverse of the sensel pitch; we get that from Nyquist.

To get the cutoff frequency of the lens, we need to define the point where diffraction prevents the detection of whether we’re looking at one point or two. Lord Rayleigh came up with this formula in the 19th century:

R = 1.22 * lambda * N, where lambda is the wavelength of the light, and N is the f-stop.

Fiete uses a criterion that makes it harder on the sensor, the rounded Sparrow criterion:

S = lambda * N

Or, in the frequency domain, fco = 1 / (lambda * N)

Thus Q is:

Q = lambda * N / pitch

I figure that some of the finest lenses that we use are close to diffraction-limited at f/8. If that’s true, for 0.5 micrometer light (in the middle of the visible spectrum), a Q of 2 implies:

Pitch = N / 4 (pitch in micrometers)

At f/8 we want a 2-micrometer pixel pitch, finer than any currently available sensor of micro 4/3 size or larger. A full-frame sensor with that pitch would have 216 megapixels.
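
A quick sketch of that arithmetic, for anyone who wants to check it (the 0.5 micrometer wavelength and the 36 x 24 mm frame are my assumptions):

```python
# Sanity check of the Q = 2 pitch and megapixel figures above.
def pitch_for_q(wavelength_um, f_number, q=2.0):
    """Sensel pitch (um) giving the requested Q, from Q = lambda * N / pitch."""
    return wavelength_um * f_number / q

pitch = pitch_for_q(0.5, 8)                   # 2.0 um at f/8
mp = (36000 / pitch) * (24000 / pitch) / 1e6  # frame dimensions in um
print(pitch, round(mp))                       # 2.0, 216
```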

You can try to come up with a correction to take into account the Bayer array. Depending on the assumptions, the correction should be between 1 and some number greater than 2, but in any case, the pixel pitch should be at least as fine as for a monochromatic sensor.

As an aside, note that you don’t need an AA filter for a system with a Q of 2, since the lens diffraction does the job for you. That’s not true with a Bayer CFA.

I have several questions for anyone who cares to get involved in a discussion:

1)   Is any of this relevant to our photography?
2)   Have I made a math or logical error?
3)   At what aperture do our best lenses become close to being diffraction-limited?
4)   What other questions should I be asking?

For details about the Sparrow criterion, click here.
For more details on calculating Q, take a look here.
For ruminations on corrections for a Bayer CFA, look at this.

Thanks,

Jim
« Last Edit: May 08, 2014, 05:25:04 pm by Jim Kasson »

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: How much sensor resolution do we need to match our lenses?
« Reply #1 on: May 08, 2014, 06:14:36 pm »

Hi,

Thanks for sharing knowledge!

I don't know the relevance of this in this context, but I made a test with my P45+, and I had very significant aliasing at f/11 but virtually none at f/16. That sensor has a 6.8 micron pitch. You can find it here (look for Aliasing): http://echophoto.dnsalias.net/ekr/index.php/photoarticles/80-my-mfd-journey-summing-up?start=1

My understanding is that for balanced performance at f/16 a pitch of four microns would be needed.
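
(As a cross-check with the Q formula from the opening post, assuming Q = 2 and 0.5 micrometer light: pitch = lambda * N / Q = 0.5 * 16 / 2 = 4 micrometers, which agrees.)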

The lens I used here was my Sonnar 150/4, and I would guess that it reaches maximum performance around f/8. On top of that there is also the amount of defocus to consider.

My Sony Alpha 77 with 3.9 micron pixels is able to show moiré at f/8, but only very little. That sensor probably has an OLP filter.

I don't have high end primes, just pretty decent zooms.

Best regards
Erik
« Last Edit: May 08, 2014, 11:09:51 pm by ErikKaffehr »
Erik Kaffehr
 

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: How much sensor resolution do we need to match our lenses?
« Reply #2 on: May 08, 2014, 09:21:46 pm »

R in your Rayleigh formula is the radius of each diffraction spot, so you have to double it to get the distance between two spots matching your pixel grid.

3) It's easy to find out. Do a test of your lens at a variety of aperture settings. A good zoom, probably f/8. A prime, probably f/5.6; a top prime, f/4; the Otus, f/2. Further stopping down will give less detail. Wider than that, lens aberrations damage the output.

Does it matter? It depends on what you need the image for.

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: How much sensor resolution do we need to match our lenses?
« Reply #3 on: May 08, 2014, 11:14:24 pm »

Hi Jim,

I was considering my results a bit, and I would mention that there is considerable diffusion of light in the pixels. I have the impression the diffusion length of light in silicon is around 2 microns or more, depending on wavelength.

Best regards
Erik
Erik Kaffehr
 

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #4 on: May 08, 2014, 11:25:26 pm »

R in your Rayleigh formula is the radius of each diffraction spot, so you have to double it to get the distance between two spots matching your pixel grid.

The radius of the Airy function is not well-defined, since the distance to which it extends depends on the precision used in the calculations. Can you resolve the 3rd ring? The 9th? The 38465th? If you say that the radius is the distance to the first zero, then the R in the formula is that. However, the Rayleigh criterion is more subtle than that.

Lord Rayleigh's criterion states that if the distance between the center positions of two impulses, or spatial Dirac delta functions (although he never used that term, having predated Paul Dirac), is such that the center of each lies on the first zero of the other, the two points are barely resolvable.



Subsequently, astronomers found that they could resolve (in the sense that they could tell if there were one or two stars) points closer than that. Hence the Sparrow criterion.



Jim



« Last Edit: May 08, 2014, 11:51:55 pm by Jim Kasson »

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: How much sensor resolution do we need to match our lenses?
« Reply #5 on: May 09, 2014, 12:46:26 am »

Hi Jim,

Although I have recently focused on what is visible in print, I have also looked into the effects of aliasing on large and small pixels; the best article is probably this: http://echophoto.dnsalias.net/ekr/index.php/photoarticles/78-aliasing-and-supersampling-why-small-pixels-are-good

I would say that the feather shots in that article are quite interesting.

One thing I have noticed in the "differences at A2 size" article is that the obvious differences in image quality are in areas showing high-contrast detail with significant aliasing. So aliasing has a negative effect on image quality. In this case the IQ-180 showed less aliasing, probably because the sensor mostly outresolved the test target.
 
Best regards
Erik

« Last Edit: May 09, 2014, 12:50:02 am by ErikKaffehr »
Erik Kaffehr
 

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: How much sensor resolution do we need to match our lenses?
« Reply #6 on: May 09, 2014, 01:13:40 am »

The radius of the Airy function is not well-defined, since the distance to which it extends depends on the precision used in the calculations. Can you resolve the 3rd ring? The 9th? The 38465th? If you say that the radius is the distance to the first zero, then the R in the formula is that. However, the Rayleigh criterion is more subtle than that.

Lord Rayleigh's criterion states that if the distance between the center positions of two impulses, or spatial Dirac delta functions (although he never used that term, having predated Paul Dirac), is such that the center of each lies on the first zero of the other, the two points are barely resolvable.



Subsequently, astronomers found that they could resolve (in the sense that they could tell if there were one or two stars) points closer than that. Hence the Sparrow criterion.



Jim





That is fine. What does your CFA need to make your raw converter generate 2 distinct spots? The spots may or may not align with your pixels. I would venture that you need peaks at least a diagonal apart.

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: How much sensor resolution do we need to match our lenses?
« Reply #7 on: May 09, 2014, 01:29:42 am »

To make this conversation more practical for photography, maybe we can ask a new question. If our eyes resolve about 1 minute of arc, we need systems with the same or greater ability to give textures the same look. If we photograph a fabric and it has more of a plastic look than a satin sheen, it may be that the components you refer to are lacking. Or skin tones, which we were discussing in another thread; the surface is somewhat translucent. There are also very fine lines with very different contrast in most light.

So do our lenses always handle 1 minute of arc? For wide lenses, I would say no. If we do photograph with enough detail to get the 1 minute, do our screens or prints show the detail while still showing the "forest"? Big prints, yes. Our screens, no. Prints lack the dynamic range of our eyes. Screens are getting good enough. So maybe the weakest link is our screen resolution. We need to get to 8K soon to be able to show our images as they were seen.
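
As a rough sketch of the arithmetic behind that last point (the screen width and viewing distance are my example numbers, not anything from the thread):

```python
import math

# How many pixels across must a display be so that one pixel subtends
# 1 arcminute of the viewer's visual field?
def pixels_for_one_arcmin(width, distance):
    """Horizontal pixel count for 1 arcmin per pixel; width and distance in the same units."""
    angle_deg = 2 * math.degrees(math.atan(width / (2 * distance)))
    return angle_deg * 60  # arcminutes subtended = pixels needed

# Example: a 27-inch-wide screen viewed from 24 inches.
print(round(pixels_for_one_arcmin(27, 24)))  # ~3523: 4K barely clears it, 8K has headroom
```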

EinstStein

  • Sr. Member
  • ****
  • Offline
  • Posts: 501
Re: How much sensor resolution do we need to match our lenses?
« Reply #8 on: May 09, 2014, 10:05:20 am »

I don't think the definition of Q and the balance criterion (=2) are correct.
There is a well-known theorem about the required sampling rate for band-limited signals, which requires

Sampling rate = 2 x BW, where BW is the difference between the min. frequency and the max. frequency.

This equation should work even if the min. frequency is not 0, for the original signal can be shifted in the frequency domain by convolution with the sinusoid at the min. frequency.

If we focus on the case with min. frequency = 0, then a system with balanced sensor and lens should have

Sensor resolution = lens resolution x 2,
so a better definition of Q should be

Q = lens resolution x 2 / sensor resolution.
The balance criterion should be Q = 1.
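
A minimal sketch of the sampling statement, in case it helps (a toy example of my own; frequencies in cycles per sample):

```python
import numpy as np

# A sinusoid above the Nyquist frequency (0.5 cycles/sample) is
# indistinguishable, at the sample points, from its folded alias.
n = np.arange(32)
above_nyquist = np.cos(2 * np.pi * 0.7 * n)   # 0.7 cycles/sample
alias = np.cos(2 * np.pi * (1 - 0.7) * n)     # folds to 0.3 cycles/sample

print(np.allclose(above_nyquist, alias))      # True
```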


bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: How much sensor resolution do we need to match our lenses?
« Reply #9 on: May 09, 2014, 10:05:43 am »

There has been much discussion in this and other forums of how much resolution we need in our sensors. Erik Kaffehr started a thread about how much we can see. I’d like to come at it from another angle.

I have several questions for anyone who cares to get involved in a discussion:

1)   Is any of this relevant to our photography?
2)   Have I made a math or logical error?
3)   At what aperture do our best lenses become close to being diffraction-limited?
4)   What other questions should I be asking?

Jim


Jim,

Thanks for an excellent post. With regard to question 1, the contrast at the Rayleigh limit (usually regarded as 9-10%, although Bart van der Wolf has calculated a higher value) is too low to be photographically useful. 50% contrast is often quoted as corresponding most closely to perceived image sharpness and, if one uses that criterion, the equations change. Table 1 of the excellent treatise by Osuna and Garcia, Do Sensors Outresolve Lenses, lists resolutions for 50% and 80% contrast. Using the Nikon D800e for reference, the sensor has a Nyquist of 103 lp/mm. Using the Rayleigh criterion, the sensor is nowhere near outresolving a diffraction-limited lens, but few mass-produced lenses are diffraction-limited at their widest apertures. The best lenses are nearly diffraction-limited at mid apertures. Using the Rayleigh criterion and the 800e, a diffraction-limited lens will outresolve the sensor until f/16. However, if one uses 50% or 80% contrast, the equation changes. Furthermore, as Osuna and Garcia point out, one may need to sample at more than 2 pixels per lp.
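
A sketch of that comparison using the standard incoherent diffraction-limited MTF; the 0.55 micrometer wavelength and 4.88 micrometer D800e pitch are my assumptions, not numbers from the table:

```python
import numpy as np
from scipy.optimize import brentq

def diffraction_mtf(f, f_cutoff):
    """Diffraction-limited (incoherent) MTF at spatial frequency f."""
    nu = min(f / f_cutoff, 1.0)
    return (2 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1 - nu**2))

wavelength_mm, pitch_mm = 0.55e-3, 4.88e-3
nyquist = 1 / (2 * pitch_mm)                  # ~102 lp/mm for the D800e
for N in (8, 16, 22):
    fc = 1 / (wavelength_mm * N)              # cutoff: MTF falls to zero here
    f50 = brentq(lambda f: diffraction_mtf(f, fc) - 0.5, 0, fc)
    print(f"f/{N}: cutoff {fc:.0f} lp/mm, MTF50 at {f50:.0f} lp/mm, Nyquist {nyquist:.0f} lp/mm")
```

At f/16 the cutoff (about 114 lp/mm) still sits just above the D800e's Nyquist, consistent with the statement above, while the MTF50 frequency falls far below it.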

Also, as SQF analysis points out, human perception is most sensitive to high contrast at relatively low frequencies. Depending on the print size, a lens with high contrast at lower frequencies may give better results than a lens with resolution to the Rayleigh limit.

Regards,

Bill

P.S. Edited 16 May 2014 to correct a typo: contrast at Rayleigh is in terms of percent, not lp/mm.
« Last Edit: May 16, 2014, 07:44:36 am by bjanes »

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #10 on: May 09, 2014, 01:18:54 pm »

I don't think the definition of Q and the balance criterion (=2) are correct.
There is a well-known theorem about the required sampling rate for band-limited signals, which requires

Sampling rate = 2 x BW, where BW is the difference between the min. frequency and the max. frequency.

This equation should work even if the min. frequency is not 0, for the original signal can be shifted in the frequency domain by convolution with the sinusoid at the min. frequency.

If we focus on the case with min. frequency = 0, then a system with balanced sensor and lens should have

Sensor resolution = lens resolution x 2,
so a better definition of Q should be

Q = lens resolution x 2 / sensor resolution.
The balance criterion should be Q = 1.



Thanks for pointing out the bandwidth/min frequency nicety. I hadn't thought of that in a long time. Let's assume min freq is zero, though.

I believe your restatement of Q, essentially changing it from frequency to distance, is accurate, if lens resolution is interpreted as half the Sparrow distance. Stated in the frequency domain, Q=2 says that the lens stops delivering contrast (the contrast actually drops to 1%, which is close enough) at one half cycle per pixel, so there can be no aliasing. Let's look at it in distance. Q = 2 says that there are 4.88 samples across the first ring of the Airy function. For two point spread functions at the rounded Sparrow distance, there is one sample between the two peaks, assuming the sampling grid is aligned with the two peaks. That one sample in the middle is necessary to distinguish a drop in signal level as the points move apart. 

If your definition of Q says that the lens resolution is the Sparrow distance, it's a new definition, and yes, I think that in that case, Q should be 1 for a "balanced" system.

Let's work through an example, with an f/8 lens and 0.5 micrometer light. The Sparrow distance is

S = 0.5 * 8 = 4 micrometers. The resolution of the lens is half that, or 2 micrometers. Remember, we're (barely) resolving a point halfway in between the two Airy functions.

Let's say the sensor pitch is 2 micrometers.

Using your formula,  Q = lens resolution x 2 / sensor resolution = 2 * 2 / 2 = 2.
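
In code, with the 4.88-samples figure from my earlier post thrown in as a check (units are micrometers):

```python
wavelength, N, pitch = 0.5, 8.0, 2.0

q_fiete = wavelength * N / pitch            # frequency-domain definition: 2.0
sparrow = wavelength * N                    # rounded Sparrow distance: 4 um
lens_resolution = sparrow / 2               # half the Sparrow distance: 2 um
q_restated = lens_resolution * 2 / pitch    # your distance-based definition: 2.0

samples_across_airy = 2 * 1.22 * wavelength * N / pitch  # 4.88
print(q_fiete, q_restated, samples_across_airy)
```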

Does that make sense?

Jim

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #11 on: May 09, 2014, 03:03:29 pm »

...the contrast at the Rayleigh limit (usually regarded as 9-10%, although Bart van der Wolf has calculated a higher value) is too low to be photographically useful. 50% contrast is often quoted as corresponding most closely to perceived image sharpness and, if one uses that criterion, the equations change. Table 1 of the excellent treatise by Osuna and Garcia, Do Sensors Outresolve Lenses, lists resolutions for 50% and 80% contrast. Using the Nikon D800e for reference, the sensor has a Nyquist of 103 lp/mm. Using the Rayleigh criterion, the sensor is nowhere near outresolving a diffraction-limited lens, but few mass-produced lenses are diffraction-limited at their widest apertures. The best lenses are nearly diffraction-limited at mid apertures. Using the Rayleigh criterion and the 800e, a diffraction-limited lens will outresolve the sensor until f/16. However, if one uses 50% or 80% contrast, the equation changes. Furthermore, as Osuna and Garcia point out, one may need to sample at more than 2 pixels per lp.

Bill, thanks for the perspective, and for the link to the excellent tutorial.

Jack Hogan has pointed out elsewhere that there are ways to do presampling AA filtering that don't affect spatial frequencies under the Nyquist frequency as much as doing AA filtering with diffraction. But there seems to be a trend towards sensors with no AA filtering, so it's probably worthwhile to look at that case.

The rounded Sparrow criterion gives a contrast of 2% ((peak - valley) / peak). My calculations for the Rayleigh criterion put the contrast at (1 - 0.75) / 1 = 0.25. Am I using the wrong definition of contrast? Maybe it should be half those numbers. If we can agree on a definition of contrast, then I can do a plot of contrast versus point spread function separation, and we'll have a way to correct the Q definition for whatever level of contrast we deem crucial, or, probably better, pick a target Q based on our contrast criterion.
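
Here's the sort of calculation I have in mind, as a sketch: sum two Airy patterns incoherently along the line joining them and take (peak - valley) / peak, with the valley at the midpoint and the peak wherever the summed profile is highest. My numbers come out near the figures above, though they shift a little depending on where you take the peak:

```python
import numpy as np
from scipy.special import j1

def airy(x):
    """Normalized Airy intensity; first zero at |x| = 1.22 (x in units of lambda * N)."""
    v = np.pi * np.abs(np.asarray(x, dtype=float))
    v = np.where(v == 0, 1e-12, v)            # avoid 0/0 at the center
    return (2 * j1(v) / v) ** 2

def contrast(separation):
    u = np.linspace(-separation, separation, 4001)
    profile = airy(u - separation / 2) + airy(u + separation / 2)
    valley = 2 * airy(separation / 2)         # summed intensity at the midpoint
    return (profile.max() - valley) / profile.max()

for d in (1.0, 1.22):                         # rounded Sparrow, Rayleigh
    print(f"separation {d} * lambda * N: contrast {float(contrast(d)):.3f}")
# roughly 0.018 at the rounded Sparrow distance, 0.266 at Rayleigh
```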

Jim

Iluvmycam

  • Sr. Member
  • ****
  • Offline
  • Posts: 533
Re: How much sensor resolution do we need to match our lenses?
« Reply #12 on: May 09, 2014, 03:08:57 pm »

Don't know. Flatbed-scanned 35mm film = 3 or 4 MP. Everything above that is gravy for me.

http://photographycompared.tumblr.com/

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #13 on: May 09, 2014, 03:48:31 pm »

http://photographycompared.tumblr.com/

Nice real-world comparisons. Thanks.

WRT film comparisons, I recall an experience I had at the Kodak research center in Rochester in the early 90s, about a year before they announced the PhotoCD. They handed me two 16x20s, one from a 35mm Ektar 25 negative, and one scanned at what we then quaintly called 6-megapixel resolution (it was 3K by 2K, but since it had separate sensors for the red, green, and blue pixels, we'd call it an 18 MP capture today) from that negative, edited, output to a 5x7 interneg on a film recorder, and printed from that interneg. They challenged me to tell the difference, and were disappointed when I told them which was which. It was close, but I had my secret weapon: I'm nearsighted, and I took off my glasses.

The buzz on the street at that time was that Ektar 25 could resolve 200 lp/mm. It would take a sensor with a 2.5 micrometer pitch to do that. At full frame 35mm size, that would be 138 MP. The Kodak folks never explained why they thought they could digitize the film well with a sensor with a 12 micrometer pitch. I suspect the answer had something to do with MTF50 trumping MTF0.
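
The arithmetic behind those two numbers, for anyone checking (a 36 x 24 mm frame assumed):

```python
# Two samples per line pair at 200 lp/mm sets the pitch:
pitch_mm = 1 / (2 * 200)                      # 0.0025 mm = 2.5 um
mp = (36 / pitch_mm) * (24 / pitch_mm) / 1e6  # full-frame 35mm
print(pitch_mm * 1000, mp)                    # 2.5 (um), 138.24 (MP)
```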

Jim

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #14 on: May 09, 2014, 04:15:39 pm »

3) It's easy to find out. Do a test of your lens at a variety of aperture settings. A good zoom, probably f/8. A prime, probably f/5.6; a top prime, f/4; the Otus, f/2. Further stopping down will give less detail. Wider than that, lens aberrations damage the output.

I get that it is a necessary condition for a diffraction-limited lens that resolution decreases as f-stops numerically increase. I'm not sure that it is a sufficient condition, and I'd appreciate guidance on that point.

Let's assume that we have a test protocol that holds camera ISO setting and shutter speed constant, and compensates for the exposure differences at various f-stops by varying the amount of light on the test chart or by using a variable-absorption ND filter, or both. (If we use flash we have to use a variable ND filter or be sure that the camera's vibration is not a factor, since varying the output of strobes varies the flash duration.)

If we focus wide open, could focus shift upon stopping down cause a change so great that the DOF wouldn't compensate, and we'd see the resolution decreasing due to focus error? A fix for this would be to focus at the shooting aperture for each exposure, but we'd have to make enough exposures to gather statistics on each group to make sure that focusing error isn't skewing the result.

Are there lens optical defects that increase with numerically-increasing f-stops that could cause us to think we'd found diffraction before it actually occurs?

The best way to tell when a lens is diffraction-limited is to put it on a bench and look for the rings, but I don't have the equipment to do that.

By the way, my admittedly cursory testing of the Otus makes me think that f/2 is not its sharpest f-stop, but now you've got me interested and I'll have to get serious.

Jim

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: How much sensor resolution do we need to match our lenses?
« Reply #15 on: May 09, 2014, 05:59:03 pm »

Bill, thanks for the perspective, and for the link to the excellent tutorial.

Jack Hogan has pointed out elsewhere that there are ways to do presampling AA filtering that don't affect spatial frequencies under the Nyquist frequency as much as doing AA filtering with diffraction. But there seems to be a trend towards sensors with no AA filtering, so it's probably worthwhile to look at that case.

The rounded Sparrow criterion gives a contrast of 2% ((peak - valley) / peak). My calculations for the Rayleigh criterion put the contrast at (1 - 0.75) / 1 = 0.25. Am I using the wrong definition of contrast? Maybe it should be half those numbers. If we can agree on a definition of contrast, then I can do a plot of contrast versus point spread function separation, and we'll have a way to correct the Q definition for whatever level of contrast we deem crucial, or, probably better, pick a target Q based on our contrast criterion.

Jim

Jim,

I have not seen Jack Hogan's presampling method. Do you have a link?

As to MTF at the Rayleigh criterion, I am no expert and don't know how to do the calculation, but Osuna and Garcia (among others) cite a value of 9%. Bart van der Wolf did some calculations and derived a considerably higher value, but I don't remember the link. I would think that 25% would be usable in a terrestrial photographic situation, which is more demanding than separating point sources (stars) on a dark background.

Bill

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: How much sensor resolution do we need to match our lenses?
« Reply #16 on: May 09, 2014, 06:24:17 pm »

I get that it is a necessary condition for a diffraction-limited lens that resolution decreases as f-stops numerically increase. I'm not sure that it is a sufficient condition, and I'd appreciate guidance on that point.

Let's assume that we have a test protocol that holds camera ISO setting and shutter speed constant, and compensates for the exposure differences at various f-stops by varying the amount of light on the test chart or by using a variable-absorption ND filter, or both. (If we use flash we have to use a variable ND filter or be sure that the camera's vibration is not a factor, since varying the output of strobes varies the flash duration.)

If we focus wide open, could focus shift upon stopping down cause a change so great that the DOF wouldn't compensate, and we'd see the resolution decreasing due to focus error? A fix for this would be to focus at the shooting aperture for each exposure, but we'd have to make enough exposures to gather statistics on each group to make sure that focusing error isn't skewing the result.

Are there lens optical defects that increase with numerically-increasing f-stops that could cause us to think we'd found diffraction before it actually occurs?

The best way to tell when a lens is diffraction-limited is to put it on a bench and look for the rings, but I don't have the equipment to do that.

By the way, my admittedly cursory testing of the Otus makes me think that f/2 is not its sharpest f-stop, but now you've got me interested and I'll have to get serious.

Jim


Manual focus with the DoF preview button pressed would take care of focus shift.

To my limited knowledge of lens aberrations, they are strongest over the full set of rays (wide open). My subjective testing of the Art 35 and the Nikon 85/1.8G gave the best results at f/4. My old Minolta primes were best at f/5.6 for the f/2.8-and-wider lenses. You can only see the rings on a diffraction-limited lens, and most photography lenses are not diffraction limited. Sure, you can stop them down to make them so; I'm not sure that would work as expected. In amateur astronomy we make artificial stars for testing simply: a bright light source behind a pinhole off in the distance at night.

My Dob is supposedly diffraction limited at 0.5 seconds of arc. I modified it with a smaller secondary (from about 35% down to about 15-18%), so I don't know what it is now. I think it is now mirror-limited at about 1/4 wave.

EinstStein

  • Sr. Member
  • ****
  • Offline
  • Posts: 501
Re: How much sensor resolution do we need to match our lenses?
« Reply #17 on: May 10, 2014, 11:25:42 am »


It's OK to play with the definition of each term in the equation, but it should not affect the result of the following statement:

If the lens can resolve 100 lines in 1 mm, the lens resolving power has a spatial wavelength of 10 um.
The required sampling on the sensor should then have a pixel-to-pixel distance of 5 um.

Note that this theory is based on an ideal, square (brick-wall) band shape. In the real world, the spectrum has a long tail beyond the max. frequency, which can creep in as aliasing, so the required sensor resolution would be even higher. The quantification of that effect is an engineering art. I can accept any fudge factor, but that would be a matter of taste... I would like to know if someone can find the statistically popular taste.





Thanks for pointing out the bandwidth/min frequency nicety. I hadn't thought of that in a long time. Let's assume min freq is zero, though.

I believe your restatement of Q, essentially changing it from frequency to distance, is accurate, if lens resolution is interpreted as half the Sparrow distance. Stated in the frequency domain, Q=2 says that the lens stops delivering contrast (the contrast actually drops to 1%, which is close enough) at one half cycle per pixel, so there can be no aliasing. Let's look at it in distance. Q = 2 says that there are 4.88 samples across the first ring of the Airy function. For two point spread functions at the rounded Sparrow distance, there is one sample between the two peaks, assuming the sampling grid is aligned with the two peaks. That one sample in the middle is necessary to distinguish a drop in signal level as the points move apart. 

If your definition of Q says that the lens resolution is the Sparrow distance, it's a new definition, and yes, I think that in that case, Q should be 1 for a "balanced" system.

Let's work through an example, with an f/8 lens and 0.5 micrometer light. The Sparrow distance is

S = 0.5 * 8 = 4 micrometers. The resolution of the lens is half that, or 2 micrometers. Remember, we're (barely) resolving a point halfway in between the two Airy functions.

Let's say the sensor pitch is 2 micrometers.

Using your formula,  Q = lens resolution x 2 / sensor resolution = 2 * 2 / 2 = 2.

Does that make sense?

Jim


ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: How much sensor resolution do we need to match our lenses?
« Reply #18 on: May 10, 2014, 11:36:19 am »

Hi,

On the other hand, we probably would like to avoid aliasing and fake detail. It seems that only a little contrast (MTF) at Nyquist is needed to generate significant aliasing.

Best regards
Erik


Jim,

Thanks for an excellent post. With regard to question 1, the contrast at the Rayleigh limit (usually regarded as 9-10%, although Bart van der Wolf has calculated a higher value) is too low to be photographically useful. 50% contrast is often quoted as corresponding most closely to perceived image sharpness and, if one uses that criterion, the equations change. Table 1 of the excellent treatise by Osuna and Garcia, Do Sensors Outresolve Lenses, lists resolutions for 50% and 80% contrast. Using the Nikon D800e for reference, the sensor has a Nyquist of 103 lp/mm. Using the Rayleigh criterion, the sensor is nowhere near outresolving a diffraction-limited lens, but few mass-produced lenses are diffraction-limited at their widest apertures. The best lenses are nearly diffraction-limited at mid apertures. Using the Rayleigh criterion and the 800e, a diffraction-limited lens will outresolve the sensor until f/16. However, if one uses 50% or 80% contrast, the equation changes. Furthermore, as Osuna and Garcia point out, one may need to sample at more than 2 pixels per lp.

Also, as SQF analysis points out, human perception is most sensitive to high contrast at relatively low frequencies. Depending on the print size, a lens with high contrast at lower frequencies may give better results than a lens with resolution to the Rayleigh limit.

Regards,

Bill
Erik Kaffehr
 

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: How much sensor resolution do we need to match our lenses?
« Reply #19 on: May 10, 2014, 12:09:53 pm »

If the lens can resolve 100 lines in 1 mm, the lens resolving power has a spatial wavelength of 10 um.
The required sampling on the sensor should then have a pixel-to-pixel distance of 5 um.

Are you sure you don't mean, "If the lens can resolve 100 line pairs in 1 mm, the lens resolving power has a spatial wavelength of 10 um"?

Jim