Neither lenses nor sensors have a single resolution cutoff, as if they resolved everything coarser than, say, 100 line pairs per mm and nothing finer. Instead, there is a gradual decline as image features get smaller, often described by MTF graphs: roughly, the fraction of contrast that the lens or sensor delivers for image details at various sizes (lp/mm). The overall performance of a lens-sensor combination, as measured by MTF, is the product of the lens MTF and the sensor MTF.
So even if one lens out-resolves the sensor, in the sense of having the higher MTF curve, another lens with an even higher MTF can still improve the combined lens-sensor MTF, and so improve the overall resolution.
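To make the product rule concrete, here is a minimal Python sketch with made-up, Gaussian-ish curves (none of these numbers come from a real lens or sensor):

```python
import numpy as np

# Spatial frequencies at the sensor plane, in line pairs per mm.
freq = np.linspace(0, 120, 7)

# Made-up MTF curves for illustration only -- not measurements.
mtf_lens_a = np.exp(-(freq / 80.0) ** 2)    # a weaker lens
mtf_lens_b = np.exp(-(freq / 120.0) ** 2)   # a sharper lens
mtf_sensor = np.sinc(freq / 200.0)          # crude stand-in for sensor sampling

# The system MTF is (to a good approximation) the product of the parts.
system_a = mtf_lens_a * mtf_sensor
system_b = mtf_lens_b * mtf_sensor

for f, a, b in zip(freq, system_a, system_b):
    print(f"{f:6.1f} lp/mm: system with lens A {a:.3f}, with lens B {b:.3f}")
```

The point to notice in the output: the lens with the higher MTF raises the system MTF at every frequency, whether or not it already "out-resolves" the sensor there.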
It seems that the hands-on/practical types out there have devised many practical "rules of thumb". This is all nice and good as long as they are used in the proper context.
I am no expert on optics or quantum physics, but I am convinced that rules about the "diffraction limit" are just that: rules of thumb.
-h
It's also possible that a lens can be weak, or just lack contrast, at a certain spatial frequency which happens to be present in the image. Sometimes you see a drop in the MTF curve at one point.
Another possibility is a protective filter on that lens which is not on the other.
When we are talking lines per mm, I assume this is at the sensor or film plane?
Hi,
I think you are mixing things up a little. I have never seen an MTF vs. frequency plot that is non-monotonic (sorry for the term!). What I'd suggest is that you are seeing MTF vary over the image. One common reason is that the field of focus is wavy, which comes from the correction of field curvature.
Best regards
Erik
I'm not sure what you mean by non-monotonic. Probably a translation difference. Your English is excellent, btw.
Have a look at the 4th diagram (diffraction MTF) here:
http://www.optenso.com/optix/ex_diffana.html
That S-shape on the red line is typical of mirror lenses, for example. I've seen other MTF charts with serious sudden drops at higher frequencies.
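For the curious, the S-shape can be reproduced numerically: the MTF is the (normalized) autocorrelation of the pupil, so an annular pupil, as in a mirror lens with a central obstruction, produces a mid-frequency dip. A rough Python sketch; the grid size, pupil radius, and the 45% obstruction ratio are all arbitrary illustration values:

```python
import numpy as np

N = 512
y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
r = np.hypot(x, y)

R = 60.0            # pupil radius in grid samples (arbitrary)
obstruction = 0.45  # central-obstruction ratio, typical of a mirror lens

pupil_clear = (r <= R).astype(float)
pupil_mirror = pupil_clear * (r >= obstruction * R).astype(float)

def mtf_slice(pupil):
    # PSF = |FT(pupil)|^2; OTF = FT(PSF) = autocorrelation of the pupil.
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))) ** 2
    otf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(psf))))
    otf /= otf[N // 2, N // 2]   # normalize to 1 at zero frequency
    return otf[N // 2, N // 2:]  # radial slice, zero frequency outward

for name, p in (("clear aperture ", pupil_clear), ("45% obstruction", pupil_mirror)):
    print(name, np.round(mtf_slice(p)[: int(2 * R) : 20], 3))
```

The obstructed pupil's curve sags in the middle frequencies relative to the clear aperture, which is the kink you see on those mirror-lens charts.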
Hi,

The question posed in the first post is (as I understand it): can the "diffraction limit" be used to perfectly predict at what pixel pitch/aperture combination the final resolution is limited only by diffraction (i.e. increasing the number of megapixels will give exactly zero benefit)?
Not at all. A lens that is diffraction limited will not improve on stopping down; technically, you could say that its MTF is dominated by diffraction and not by aberrations.
(http://www.coinimaging.com/blogimages/merging-discs.jpg)

In a classical-physics, linear, noiseless world, one would expect proper deconvolution to be theoretically able to resolve point sources whose Airy disks overlap to a very high degree (assuming that the blur kernel can be precisely estimated, that there is very little noise, etc.).
The distance between two Airy discs at which they are still considered to be resolved separately is the radius of the disc, also called the Rayleigh criterion. A smaller Airy disc means a smaller disc radius and a higher resolution. This distance is somewhat arbitrary, as there is still a small amount of contrast remaining in the space between the discs at the Rayleigh criterion. If the discs are moved any closer, the remaining contrast between the two objects will completely disappear. The point where all contrast is lost between adjacent discs is called the Sparrow criterion and is the absolute limit of resolution.
The Rayleigh criterion is the most commonly used measure of resolution:
Airy disc radius = 1.22 * N * wavelength (N = f-number; the wavelength is commonly taken as 546 or 550 nm, i.e. about 0.55 µm, green light)
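Plugged into Python, with a hypothetical 4.3 µm pixel pitch thrown in purely for scale:

```python
# Airy disc radius to the first minimum: r = 1.22 * wavelength * N.
wavelength_um = 0.550   # green light, 550 nm
pixel_pitch_um = 4.3    # hypothetical sensor, for comparison only

for f_number in (2.8, 4, 5.6, 8, 11, 16, 22):
    radius_um = 1.22 * wavelength_um * f_number
    print(f"f/{f_number}: Airy radius {radius_um:5.2f} um "
          f"= {radius_um / pixel_pitch_um:.1f} pixels")
```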
Rayleigh's resolution limit is reached when the two stars are separated by the radius of the Airy disk, but many astronomers say they can still distinguish the two stars even when they are closer than Rayleigh's resolution limit. Sparrow's resolution limit improves on this by saying that the ultimate resolution limit is reached when the combined image from the two stars no longer has a dip in brightness between them, but instead has a roughly constant brightness from the peak of one star's image to the other. Because the combined image is elongated, it is still distinguishable from a single star.
Sparrow's resolution limit is about half Rayleigh's resolution limit. For example, for an eight-inch telescope, Rayleigh's resolution limit is 0.70 seconds of arc, but Sparrow's resolution limit is 0.35 seconds of arc.
Sparrow's resolution limit is also used for optical microscopes.
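The telescope figures above are easy to check with a couple of lines of Python; this assumes 550 nm green light (the quoted 0.70" presumably used a slightly different wavelength) and the "about half" rule for Sparrow:

```python
ARCSEC_PER_RAD = 206265.0
wavelength_m = 550e-9    # green light
aperture_m = 8 * 0.0254  # eight-inch telescope

rayleigh = 1.22 * wavelength_m / aperture_m * ARCSEC_PER_RAD
print(f"Rayleigh: {rayleigh:.2f} arcsec")      # ~0.68, vs. the quoted 0.70
print(f"Sparrow:  {rayleigh / 2:.2f} arcsec")  # using the 'about half' rule
```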
(http://www.olympusmicro.com/primer/digitalimaging/deconvolution/images/deconresolutionfigure1.jpg)

-h
It should also be understood that any resolution criterion is not an absolute indicator of resolution, but rather an arbitrary criterion that is useful for comparing different imaging conditions.
...
In some applications, such as localization of a moving object, resolution below the Rayleigh limit is possible. This highlights the fact that resolution is task-dependent and cannot be defined uniformly for all situations.
In addition, resolution also depends to a great extent on image contrast, or the ability to distinguish specimen-generated signals from the background.
From what I recall from building pinhole cameras, the red and infra-red parts of the spectrum were more prone to diffraction due to their longer wavelength, so I would always use an IR (Red 99) filter over the front of my camera to try to limit diffraction.
I have no idea if this works, or whether the red pixels are less diffraction-prone compared to the green or blue ones in the Bayer array.
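For what it's worth, the Airy radius formula quoted earlier settles the wavelength question: diffraction scales linearly with wavelength, so the red channel is affected the most, not the least. A quick Python check (f/11 chosen arbitrarily):

```python
# Airy radius (1.22 * wavelength * N) grows linearly with wavelength,
# so at any given aperture the red channel suffers the most diffraction.
f_number = 11  # arbitrary choice for the comparison
for name, wavelength_um in (("blue", 0.450), ("green", 0.550), ("red", 0.650)):
    print(f"{name:>5}: Airy radius {1.22 * wavelength_um * f_number:.2f} um")
```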
The question posed in the first post is (as I understand it): can the "diffraction limit" be used to perfectly predict at what pixel pitch/aperture combination the final resolution is limited only by diffraction (i.e. increasing the number of megapixels will give exactly zero benefit)?
http://coinimaging.com/blog1/?p=139

In a classical-physics, linear, noiseless world, one would expect proper deconvolution to be theoretically able to resolve point sources whose Airy disks overlap to a very high degree (assuming that the blur kernel can be precisely estimated, that there is very little noise, etc.).
In practice, this is next to impossible because of noise and other errors, but that is a different story from the claim that "theoretically, you can never move beyond the diffraction limit, no matter what".
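To illustrate both halves of that argument, here is a toy 1-D Wiener-deconvolution sketch in Python. A Gaussian stands in for the diffraction PSF, and all the numbers (separation, sigma, noise level, regularisation) are invented for illustration:

```python
import numpy as np

n = 512
x = np.arange(n)

# Two point sources closer together than the blur would naively allow.
scene = np.zeros(n)
scene[250] = 1.0
scene[256] = 0.8

# A Gaussian stands in for the diffraction PSF; its width exceeds the
# source separation, so the blurred image is a single merged blob.
sigma = 4.0
psf = np.exp(-0.5 * ((x - n // 2) / sigma) ** 2)
psf /= psf.sum()
H = np.fft.fft(np.fft.ifftshift(psf))
blurred = np.real(np.fft.ifft(np.fft.fft(scene) * H))

def wiener(img, H, nsr):
    # Wiener deconvolution: multiply by H* / (|H|^2 + noise-to-signal ratio).
    return np.real(np.fft.ifft(np.fft.fft(img) * np.conj(H)
                               / (np.abs(H) ** 2 + nsr)))

# Noiseless: a tiny regularisation term suffices, and the two sources
# reappear as two distinct peaks.
clean = wiener(blurred, H, 1e-12)

# With a little noise, the regularisation must be much larger, which
# visibly degrades how much fine separation can be recovered.
noisy = blurred + np.random.default_rng(0).normal(0.0, 1e-3, n)
restored = wiener(noisy, H, 1e-4)

print("blurred :", np.round(blurred[248:259], 3))
print("clean   :", np.round(clean[248:259], 3))
print("restored:", np.round(restored[248:259], 3))
```

In the noiseless case the two sources separate cleanly even though their blurred images merge into one blob; once noise is added, the regularisation needed to keep it under control also limits how far below the blur radius one can dig.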
Hi h,

This is exactly the way of reasoning that I wanted to suggest. Rather than claiming that some arbitrary Airy-disk distance is the absolute limit for the spatial information recoverable, it seems reasonable to assume that diffraction leads to a gradual loss of practical resolution due to vanishing (high-spatial-frequency) contrast. The limit is likely to depend somewhat on the input, raw processing, etc.
The real problem will be that lower-than-maximum-contrast detail will have been attenuated into a featureless signal, even if there were no noise. The quantization limit of 14-bit ADCs leaves some room to retain relevant micro contrast, but where higher spatial frequencies are concerned, contrast will already have been reduced a lot by the area sampling of the sensels.
Suppose some scene micro detail has a contrast of 10:1 (call it a 10% signal) by itself; area sampling could reduce that to perhaps 20% of that, so a 2% signal would be left even if no optics were involved. When the MTF of the lens/diffraction reduces that further to 10%, only 0.2% will remain. Add a bit of noise (photon shot noise, read noise), and there will be no relevant signal left to deconvolve. Adding a bit of defocus (even within the DOF zone) will crush all hopes of recovery.
That's why I mentioned that the stronger impact on higher spatial frequencies (= lower MTF) will kill those first; somewhat lower spatial frequencies stand a better chance because their MTF will be higher.
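That cascade is just straight multiplication of the stage contrasts; a few lines of Python make it explicit (reading the 10:1 detail as a 10% signal, as above, and with a purely hypothetical noise floor):

```python
# The cascade above, as straight multiplication of stage contrasts.
detail = 0.10     # the 10:1 micro detail, read as a 10% signal
sampling = 0.20   # MTF of sensel area sampling at this frequency
lens = 0.10       # MTF of lens + diffraction at the same frequency

remaining = detail * sampling * lens
print(f"remaining signal: {remaining:.4%}")  # 0.2000%

noise_floor = 0.005  # hypothetical combined shot/read-noise floor
print("anything left to deconvolve?", remaining > noise_floor)
```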
Cheers,
Bart