Pages: 1 [2] 3 4 5   Go Down

Author Topic: Do Sensors “Outresolve” Lenses?  (Read 24236 times)

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: Do Sensors “Outresolve” Lenses?
« Reply #20 on: October 23, 2014, 02:43:07 am »

Fine_Art,

Not sure what exactly you are disagreeing with, but my point is a point of fact.  A lens's performance is independent of the camera sensor onto which its image circle shines.  If a lens has the resolving power equal to 12 MPs of data (FF Size Sensor), then no matter what sensor reads that FF image circle, you get the same data.  Cutting a pie into 36 slices instead of 12 slices doesn't give you any more pie!

What a higher-resolution sensor will do is show you the limited resolving power of the lens, but the lens has not changed its performance.  And done properly, all else constant, the print or displayed image will be the same from a 12MP or 36MP sensor.  You could upsample in the camera or in post-processing, but you still started with the same amount of actual data.  The other benefit of a higher-resolution sensor is that it can capture all the data from any lens that resolves less than or equal to its data saturation point.  Put a better-resolving lens on both those sensors and the 12MP starts throwing away data while the 36MP keeps it.

The first question I ask my friends when they want to upgrade their camera is: why?  Usually it is more MPs.  So if they have a 12MP camera, I ask them, "Assuming the format of the picture (2x3) stays the same, what is double the resolution of a 12MP camera?"  They are usually dumbfounded to learn it is 48MP!




I was referring to this : "...what you are seeing is that the sensor could resolve more than the lens could give. "

You may infer that, but it is not what you are seeing.
Logged

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8913
Re: Do Sensors “Outresolve” Lenses?
« Reply #21 on: October 23, 2014, 06:06:39 am »

What a higher-resolution sensor will do is show you the limited resolving power of the lens, but the lens has not changed its performance.

Hi,

This is where that theory falls apart. Have another look at the chart that Jim posted earlier. A higher sampling density (smaller sampling pitch) will continue to extract more resolution from a lens. While there is more to be gained from a good lens, it also works that way with a lesser lens. The simple reason is that one needs to combine the MTF functions of both lens and sampling system: the combined result always sits closer to the worse of the two contributors, so improving the better one gains little, while improving the worse of the two raises the combined quality much more.

It's rather basic arithmetic: 50% x 50% is 25%, while 50% x 90% is 45% (closer to the worse of the two). Raising the worse of the two to e.g. 75% gives 75% x 90% = 67.5% (again closer to the worse of the two, and a much better combination). You can consider the sampling density as the worse of the two, holding back the combined result most, until the two get closer to each other's performance, at which point improvement will slow down (though not stop).

Lens resolution and sensor sampling density are not independent limitations, they work in combination to produce a system MTF.
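As a quick sketch of that arithmetic in Python (the percentages are just the examples above):

```python
# Cascading MTF contributions: the system contrast at a given spatial
# frequency is the product of the component contrasts, so the result is
# pulled toward the weaker component.

def system_mtf(*components):
    """Multiply per-component contrast values (0.0 - 1.0) into a system value."""
    result = 1.0
    for c in components:
        result *= c
    return result

print(round(system_mtf(0.50, 0.50), 3))  # 0.25
print(round(system_mtf(0.50, 0.90), 3))  # 0.45 (closer to the worse of the two)
print(round(system_mtf(0.75, 0.90), 3))  # 0.675 (improving the worse one helps most)
```

The product form is why neither component alone sets the ceiling: improving either factor raises the system MTF, but the smaller factor dominates how far it can go.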
 
Cheers,
Bart
Logged
== If you do what you did, you'll get what you got. ==

Petrus

  • Sr. Member
  • ****
  • Offline
  • Posts: 952
Re: Do Sensors “Outresolve” Lenses?
« Reply #22 on: October 23, 2014, 07:57:18 am »

Slightly OT, but in connection with hi-fi systems I have tried to get across the idea that the total throughput quality of the system is the product of the quality indices of each component (1 = perfect, 0 = total non-function), just like the lens and sensor explained above. Cables are practically perfect, as are the electronics in digital audio, but speakers and rooms are far from perfect. So if we multiply a cable index of 0.99 with a CD-player index of 0.97, amplifier 0.95, speaker 0.80 and room 0.75, we get 0.55 as the total quality index. Paying thousands or even tens of thousands to improve the player or amplifier is futile, when investing the same money in better speakers or room acoustics achieves a much bigger improvement. It's just that playing with thousand-dollar cables and $10,000 players is much more "hi-fi" than gluing acoustic materials to walls and building bass traps.
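The same multiplication, sketched in Python with the example indices above:

```python
# Total throughput quality as the product of per-component quality
# indices (1.0 = perfect, 0.0 = total non-function).
chain = {
    "cable": 0.99,
    "CD player": 0.97,
    "amplifier": 0.95,
    "speakers": 0.80,
    "room": 0.75,
}

total = 1.0
for index in chain.values():
    total *= index
print(round(total, 2))  # 0.55

# Spending on the already-good player barely moves the total, while
# fixing the worst component (the room) helps far more.
perfect_player = total / chain["CD player"] * 1.00
better_room = total / chain["room"] * 0.90
print(round(perfect_player, 2), round(better_room, 2))  # 0.56 0.66
```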

Back to original programming...
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: Do Sensors “Outresolve” Lenses?
« Reply #23 on: October 23, 2014, 08:47:55 am »

This is where that theory falls apart. Have another look at the chart that Jim posted earlier. A higher sampling density (smaller sampling pitch) will continue to extract more resolution from a lens. While there is more to be gained from a good lens, it also works that way with a lesser lens. The simple reason is that one needs to combine the MTF functions of both lens and sampling system: the combined result always sits closer to the worse of the two contributors, so improving the better one gains little, while improving the worse of the two raises the combined quality much more.

It's rather basic arithmetic: 50% x 50% is 25%, while 50% x 90% is 45% (closer to the worse of the two). Raising the worse of the two to e.g. 75% gives 75% x 90% = 67.5% (again closer to the worse of the two, and a much better combination). You can consider the sampling density as the worse of the two, holding back the combined result most, until the two get closer to each other's performance, at which point improvement will slow down (though not stop).

Lens resolution and sensor sampling density are not independent limitations, they work in combination to produce a system MTF.

Bart,

Well stated. Looking at Jim's charts, we see that the smallest pixel spacing currently available in 135-format cameras is 4.8 um with the 36 MP Sony Exmor chips. That is the second-widest pixel spacing that Jim considered. We really need finer pixel spacing to make the best use of the Otus lens, and Jim states that the fastest route to higher MTF is to decrease the pixel spacing of the sensor.

While lenses don't outdate as rapidly as digital cameras, the question arises for those of us with limited resources, 36 MP cameras, and very good rather than excellent lenses: would the best value be obtained by keeping the current camera and upgrading to the Otus lens, or keeping our current optics and upgrading to a higher-MP camera?

Bill
Logged

Manoli

  • Sr. Member
  • ****
  • Offline
  • Posts: 2296
Re: Do Sensors “Outresolve” Lenses?
« Reply #24 on: October 23, 2014, 09:30:58 am »

While lenses don't outdate as rapidly as digital cameras,

That's something of an understatement; good-quality manual lenses can last more than a lifetime.

... the question arises for those of us with limited resources, 36 MP cameras, and very good rather than excellent lenses: would the best value be obtained by keeping the current camera and upgrading to the Otus lens, or keeping our current optics and upgrading to a higher-MP camera?

This is the question that DxO have tried to quantify with their (very subjective) P-Mpix rating, and we know that this is indeed a difficult issue, as there are many factors relating to lens selection. But by way of (an extreme) example:

You have an M8 (10 MP), combined with the Leica Summilux 50/1.4 - until the arrival of the Otus, pretty well universally accepted as the ultimate 50mm; today, displaced by the 50/2 APO-Summicron. Do you upgrade your lens or buy an M240 (24 MP)?

From a resolution(only)/best value perspective - upgrade the camera first.

(Note: I do have the M8, and no, I would not upgrade to the M240 - but that is for reasons wholly unconnected to the core topic of this thread.)






Logged

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8913
Re: Do Sensors “Outresolve” Lenses?
« Reply #25 on: October 23, 2014, 10:36:34 am »

From a resolution(only)/best value perspective - upgrade the camera first.

Hi M.,

That's often the case. Sensor sampling density is relatively easier to upgrade than lens resolution, unless the latter is very poor (e.g. and if it matters, in the extreme corners).

We've seen sampling density, and thus the limiting resolution (Nyquist frequency), go up from a 6.4-7.2 micron pitch to 4-4.88 micron, so say 35% in some 7 years. Lens resolution is more complex to capture in a single number, but I think the pace is much slower and less dramatic. The biggest difference was due to analog-sensor (film) oriented optical designs being replaced by digital-sensor oriented optical designs (taking into account the optical filter stack and cover glass, which also made rear anti-reflection coating and lens shape more important).
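For reference, the limiting (Nyquist) frequency follows directly from the pitch; a small Python sketch using the pitches above:

```python
# Nyquist frequency from pixel pitch: one line pair needs two pixels,
# so f_N = 1 / (2 * pitch). Pitch in micrometres, result in lp/mm.

def nyquist_lp_per_mm(pitch_um):
    return 1000.0 / (2.0 * pitch_um)

for pitch_um in (7.2, 6.4, 4.88, 4.0):
    print("%.2f um -> %.1f lp/mm" % (pitch_um, nyquist_lp_per_mm(pitch_um)))
```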

Maybe the Otus jumps a bit further, instead of crawling, but that's mostly for wide-open use, because at smaller apertures things get diffraction limited pretty fast.

This is also kind of consistent with Jim's "quiver plots" which show that even with a given lens, the more significant improvement can be achieved by increasing the sampling density (i.e. reducing the pitch).

Of course there are other ways to improve resolution as well, e.g. shooting with a longer focal length and stitching for the angle of view, or using super-resolution techniques. But resolution alone is not as big a requirement for most, except for those who need to produce large output sizes.

So, changing the camera/sensor is often the faster approach to better image quality (resolution, dynamic range and quantum efficiency), and it will also allow you to utilize improved features like live view, tethering, improved autofocus, faster shooting intervals, articulating LCDs, etc.

Modern lenses will last many generations of camera bodies to come, so there is less of a need to upgrade, unless for replacing a dud. Of course lens manufacturers will think of other features to incorporate in lenses, like autofocus improvements, which will only work together with the newest generation of bodies, but 'built-in obsolescence' or forced upgrading/replacement is a way of survival for those companies.

Cheers,
Bart
Logged
== If you do what you did, you'll get what you got. ==

dwswager

  • Sr. Member
  • ****
  • Offline
  • Posts: 1375
Re: Do Sensors “Outresolve” Lenses?
« Reply #26 on: October 23, 2014, 11:22:53 am »

Hi,

This is where that theory falls apart. Have another look at the chart that Jim posted earlier. A higher sampling density (smaller sampling pitch) will continue to extract more resolution from a lens. While there is more to be gained from a good lens, it also works that way with a lesser lens. The simple reason is that one needs to combine the MTF functions of both lens and sampling system: the combined result always sits closer to the worse of the two contributors, so improving the better one gains little, while improving the worse of the two raises the combined quality much more.

It's rather basic arithmetic: 50% x 50% is 25%, while 50% x 90% is 45% (closer to the worse of the two). Raising the worse of the two to e.g. 75% gives 75% x 90% = 67.5% (again closer to the worse of the two, and a much better combination). You can consider the sampling density as the worse of the two, holding back the combined result most, until the two get closer to each other's performance, at which point improvement will slow down (though not stop).

Lens resolution and sensor sampling density are not independent limitations, they work in combination to produce a system MTF.
 
Cheers,
Bart

Most seem to have missed my initial condition on the discussion, which was that 12MP was all the lens had to give.  And yes, each link in the chain impacts the overall output.  But it is a process of subtraction from image quality, which starts at 100%.  The best performance each link in the chain can give is to not detract from image quality, a practical impossibility.  It can never add to it.  My point is not that more MPs are a bad thing, only that they are not necessarily helping, depending on the rest of the chain.  Having extensive experience with military imaging sensors, I can say there are uses for oversampling, but it never gives you more data than you started with.

Oh, and the image circle produced by a lens is ABSOLUTELY independent from the surface upon which that image circle shines!  While the quality of the final image is not, the output of the lens most certainly is.   At the same size, it doesn't matter if the surface is a 12MP 1990s sensor, a 36MP 2014 sensor or a waffle!  To believe otherwise is beyond credibility.

I love theoretical discussions as much as almost anyone.  But that is what they are...theoretical.  In the real world, we deal with practical limitations that make a lot of improvements moot, unless the rest of the chain improves with them.  And finally, from Jack Dykinga (via John Shaw): “Cameras and lenses are simply tools to place our unique vision on film.  Concentrate on equipment and you’ll take technically good photographs. Concentrate on seeing the light’s magic colors and your images will stir the soul.”  The best photographs, in my opinion, are both technically good and stir the soul, so that technical failures don't interfere with the soul stirring!
Logged

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8913
Re: Do Sensors “Outresolve” Lenses?
« Reply #27 on: October 23, 2014, 11:57:32 am »

Most seem to have missed my initial condition on the discussion, which was that 12MP was all the lens had to give.

Hi,

Maybe because 12MP means nothing without further context, e.g. sampling density or surface area, and even that is only part of the image chain...

Quote
And yes, each link in the chain impacts the overall output.  But it is a process of subtraction from image quality which starts at 100%.

But that 100% is not the lens; it's the scene we want to image. Each component of the imaging chain offers 100% of its own performance, yet it may be the weaker or the stronger link in the cascade of interactions that follows. It is the weakest contributor that sets the ceiling (not the floor).

Quote
My point is not that more MPs is a bad thing, only that it is not necessarily helping, depending on the rest of the chain.


Yet MPs, if defined as sampling density (for limiting resolution, the Nyquist frequency) and number of sensels (for field of view, or the required image magnification factor to cover a certain field of view), are usually the weakest link, not the lens (unless it is diffraction limited or severely aberration limited). The proof is that image resolution improves proportionally faster from denser sampling of the projected image of an existing lens than from putting better lenses (assuming normal lens designs) on a sensor that is limiting resolution.

Cheers,
Bart
Logged
== If you do what you did, you'll get what you got. ==

Manoli

  • Sr. Member
  • ****
  • Offline
  • Posts: 2296
Re: Do Sensors “Outresolve” Lenses?
« Reply #28 on: October 23, 2014, 12:01:14 pm »

I love theoretical discussions as much as almost anyone.  

The title of this thread is "Do sensors 'outresolve' lenses ?"
Before we end up going down a warren of rabbit holes, let me answer with a simple truism -

Today the majority of sensors out-resolve most of the lenses currently in production. The incremental gains to be had from upgrading favour the sensor, both from an economic POV and the consequential IQ benefits.

Even as far back as the analog days, that truism held - even in the debate of 35mm v MF. The larger negative had the IQ advantage, no matter that, back then, MF lenses were generally inferior to their 35mm counterparts.


Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: Do Sensors “Outresolve” Lenses?
« Reply #29 on: October 23, 2014, 12:18:18 pm »

Hi,

I would say it is the other way around: almost any decent lens outresolves any sensor of today at medium apertures and near the optical axis. Truly great lenses also outresolve any sensor at large, though normally not maximum, apertures over the largest part of the sensor.

That is definitely what I see.

But clearly, there are deviations. It is possible that small-pixel cameras like the Nikon 1 or Sony RX100 are limited more by lens than sensor.

Consumer zooms, especially superzooms, can be really bad at some focal lengths.

Good evidence of this is the need for OLP filtering on DSLRs and the tendency to yield colour moiré on cameras that lack an OLP filter. Moiré is a sure sign that the lens outresolves the sensor - or, as I would put it, that the lens has significant MTF at the Nyquist frequency.
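The one-dimensional analogue of that moiré is easy to demonstrate: a sinusoid above the Nyquist frequency samples identically to a lower, false frequency. A short Python sketch (the sampling rate and frequencies are arbitrary illustrative values):

```python
import math

# Aliasing: sampling a 70-cycle sinusoid at 100 samples/unit (Nyquist = 50)
# folds it down to a false 30-cycle signal, the 1-D analogue of moire.
fs = 100.0
f_signal = 70.0
f_alias = fs - f_signal  # 30.0

samples = [math.sin(2 * math.pi * f_signal * n / fs) for n in range(200)]
folded = [math.sin(2 * math.pi * f_alias * n / fs) for n in range(200)]

# For integer n: sin(2*pi*70*n/100) == -sin(2*pi*30*n/100), so the sampled
# values are indistinguishable from a phase-reversed 30-cycle signal.
aliased = all(abs(s + a) < 1e-9 for s, a in zip(samples, folded))
print(aliased)  # True
```

An OLP filter attenuates the lens MTF near Nyquist precisely so this folding has little contrast to work with.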

Best regards
Erik

The title of this thread is "Do sensors 'outresolve' lenses ?"
Before we end up going down a warren of rabbit holes, let me answer with a simple truism -

Today the majority of sensors out-resolve most of the lenses currently in production. The incremental gains to be had from upgrading favour the sensor, both from an economic POV and the consequential IQ benefits.

Even as far back as the analog days, that truism held - even in the debate of 35mm v MF. The larger negative had the IQ advantage, no matter that, back then, MF lenses were generally inferior to their 35mm counterparts.



Logged
Erik Kaffehr
 

Manoli

  • Sr. Member
  • ****
  • Offline
  • Posts: 2296
Re: Do Sensors “Outresolve” Lenses?
« Reply #30 on: October 23, 2014, 12:23:14 pm »

I would say it is the other way around ...

and I would say, you're correct - I typed it 'back to front'!
Thanks, Erik.

Logged

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: Do Sensors “Outresolve” Lenses?
« Reply #31 on: October 23, 2014, 12:57:09 pm »

Not sure what exactly you are disagreeing with, but my point is a point of fact.  

A dangerous way to get started. It puts any person disagreeing with you in the position of being one who denies facts. I will continue anyway.

A lens performance is independent of the camera sensor onto which [its] image circle shines.

I agree with that statement.

If a lens has the resolving power equal to 12 MPs of data (FF Size Sensor), then no matter what sensor reads that FF image circle, you get the same data.  

There's an assumption buried in that statement that the resolving power of a lens can be measured in megapixels (presumably on a Bayer CFA sensor). Can you cite a test protocol that would allow a lens to be characterized in that way?

The only one that I can think of is circular with respect to the definition. Take a lens, and make resolution tests with finer and finer pixel pitches until, say, the MTF10 in cy/ph stops changing. Then say that the number of pixels on the sensor just before the MTF10 stopped changing is the pixel resolving power of the lens.

But that's an impossible test to perform, except in simulation. I have performed it in simulation, and the numbers of pixels obtained for even a prime lens that most would consider to be mediocre are very large; beyond what you can buy in a consumer camera today.
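A toy version of that protocol can be run in simulation; the Gaussian lens model, the pixel-aperture sinc, and the 1 cy/mm "inconsequential improvement" threshold below are purely illustrative assumptions, not the actual simulation described above:

```python
import math

# Toy protocol: system MTF = lens MTF * pixel-aperture MTF; keep halving
# the pitch until the MTF10 frequency improves by less than a threshold.

def lens_mtf(f):
    # Illustrative Gaussian lens MTF with MTF50 at 50 cycles/mm.
    return math.exp(-math.log(2.0) * (f / 50.0) ** 2)

def pixel_mtf(f, pitch_mm):
    # Sinc MTF of a square pixel aperture equal to the pitch.
    x = math.pi * f * pitch_mm
    return 1.0 if x == 0 else abs(math.sin(x) / x)

def mtf10_freq(pitch_mm):
    # Scan upward until the system MTF first drops to 10%.
    f = 0.0
    while lens_mtf(f) * pixel_mtf(f, pitch_mm) > 0.10:
        f += 0.1
    return f

pitch_mm = 0.008  # start at an 8 um pitch
prev = mtf10_freq(pitch_mm)
while True:
    pitch_mm /= 2.0
    cur = mtf10_freq(pitch_mm)
    if cur - prev < 1.0:  # "inconsequential" improvement threshold, cy/mm
        break
    prev = cur
print("converged at pitch %.1f um, MTF10 ~ %.1f cy/mm" % (pitch_mm * 1000, cur))
```

Even with this mediocre model lens, the loop only stops at a pitch far finer than current sensors, which is the circularity (and the threshold-dependence) of the "pixel resolving power" definition.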

Jim
« Last Edit: October 23, 2014, 01:40:46 pm by Jim Kasson »
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: Do Sensors “Outresolve” Lenses?
« Reply #32 on: October 23, 2014, 01:02:56 pm »

Modern lenses will last many generations of camera bodies to come, so there is less of a need to upgrade, unless for replacing a dud. Of course lens manufacturers will think of other features to incorporate in lenses, like autofocus improvements, which will only work together with the newest generation of bodies, but 'built-in obsolescence' or forced upgrading/replacement is a way of survival for those companies.

That is true for lenses with brass helicoid manual focusing mechanisms like Leica and Zeiss, but not necessarily true for autofocusing or vibration reduction (image stabilization) lenses. A friend and I have both experienced US$500 repair bills for our Nikon 70-200 f/2.8 VR1 lenses. Neither were subjected to any impact damage or extraordinary use.

Bill
Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: Do Sensors “Outresolve” Lenses?
« Reply #33 on: October 23, 2014, 01:15:42 pm »

Hi,

I would say that Bill is right; on the other hand, I have something like 20 lenses, some dating from 1985, all AF, and no real failures on any lens.

But generally, the more complex something is, the more probable it is that it will break sooner or later. I don't think plastic materials are bad, BTW, if the plastic used is of good quality.

Best regards
Erik

That is true for lenses with brass helicoid manual focusing mechanisms like Leica and Zeiss, but not necessarily true for autofocusing or vibration reduction (image stabilization) lenses. A friend and I have both experienced US$500 repair bills for our Nikon 70-200 f/2.8 VR1 lenses. Neither were subjected to any impact damage or extraordinary use.

Bill
Logged
Erik Kaffehr
 

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: Do Sensors “Outresolve” Lenses?
« Reply #34 on: October 23, 2014, 01:19:34 pm »

That is true for lenses with brass helicoid manual focusing mechanisms like Leica and Zeiss, but not necessarily true for autofocusing or vibration reduction (image stabilization) lenses. A friend and I have both experienced US$500 repair bills for our Nikon 70-200 f/2.8 VR1 lenses. Neither were subjected to any impact damage or extraordinary use.

Good point, Bill. Then there's obsolescence. A good lens stays good judged by the standards of the day it was designed, but standards change over time. I got rid of almost all my Hasselblad V-series lenses when the H-series came out. (I kept the 500, even though it's not very sharp, and the 250 APO, which is pretty sharp.) In fact, aside from view camera lenses and the 50mm f/2 that's on my Nikon S2, those are the oldest lenses I own.

Another thing to consider. When you buy a sharp lens, you've got a sharp lens that will be useful for many years. When you buy a hi-res body, all the lenses you own (except for some zooms) get better.

Jim

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: Do Sensors “Outresolve” Lenses?
« Reply #35 on: October 23, 2014, 01:44:35 pm »

Take a lens, and make resolution tests with finer and finer pixel pitches until, say, the MTF10 in cy/ph stops changing. Then say that the number of pixels on the sensor just before the MTF10 stopped changing is the pixel resolving power of the lens.

It occurs to me that, since this method results in asymptotically approaching a certain cy/ph, you could argue that it never actually converges. That would be pedantic. But it certainly would be true to say that the answer you get depends entirely on what you decide is an inconsequential improvement.

Jim

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: Do Sensors “Outresolve” Lenses?
« Reply #36 on: October 23, 2014, 03:58:50 pm »

Hi,

As it happened I had MTF curves from three of my lenses on screen.

Top: Zeiss Sonnar 150/4 CF (30 years old?); centre: Minolta 80-200/2.8 (around 30 years); and the Sony 70-400/4G (2 years old). All shot on a Sony Alpha SLT 77 with 3.9 micron pixels, and no sharpening. I guess the SLT 77 has an OLP filter.


Best regards
Erik


Good point, Bill. Then there's obsolescence. A good lens stays good judged by the standards of the day it was designed, but standards change over time. I got rid of almost all my Hasselblad V-series lenses when the H-series came out. (I kept the 500, even though it's not very sharp, and the 250 APO, which is pretty sharp.) In fact, aside from view camera lenses and the 50mm f/2 that's on my Nikon S2, those are the oldest lenses I own.

Another thing to consider. When you buy a sharp lens, you've got a sharp lens that will be useful for many years. When you buy a hi-res body, all the lenses you own (except for some zooms) get better.

Jim
« Last Edit: October 23, 2014, 04:11:16 pm by ErikKaffehr »
Logged
Erik Kaffehr
 

Here to stay

  • Newbie
  • *
  • Offline
  • Posts: 15
Re: It's not binary
« Reply #37 on: October 24, 2014, 12:30:21 am »

Use them as you wish. If you post them, please credit me and link to my blog.

The two links above are a good place to start people on this topic, although, if you poke around a little, you'll see that's starting in the middle.

Here's the beginning: http://blog.kasson.com/?p=5720

Jim
Thank you, Jim, and I will surely link to your blog.
I have also started to follow some of your other posts at another unnamed site.
Thank you for all this work.
 
Logged

Here to stay

  • Newbie
  • *
  • Offline
  • Posts: 15
Re: It's not binary
« Reply #38 on: October 24, 2014, 12:44:13 am »

The title of your post makes it seem that there is a point, with decreasing pixel pitch, where a sensor "outresolves" a lens, so that no further improvement in resolution is possible as the pitch continues to decrease. In fact, over a broad range of pitches, lenses, and lens apertures, both making the lens sharper and making the pixel pitch finer will improve resolution.

Here's an example, from a simulation of a RGGB Bayer-CFA sensor of variable pitch with a beam-splitting AA filter and a model of the Otus 55mm f/1.4.



MTF50 in cycles per picture height for a FF sensor is the vertical axis, pitch in um is coming towards you, and f-stop is from left to right.

If we look down from the top at a "quiver plot", with the arrows pointing in the direction of greatest improvement and the length of the arrows proportional to the slope, we can see that, over much of the aperture range of the lens, the fastest path towards improvement is finer pixel pitch.




Details here and here.

Note that some would call a 2 um sensor used with this lens underutilized, since, on a per-pixel level, it is not as sharp as the same lens on a 4 um sensor. However, in cycles per picture height, the finer sensor is sharper.

Jim

Am I correct to assume that the maximum resolution (I really should be calling it contrast) that a lens and sensor can resolve is at the point where the lens projects an Airy disk whose size, brightness and location the sensor's pixel pitch can accurately measure in an image?

For example, the D800 is able to accurately locate a smaller Airy disk (wider f-stop), so for the highest resolution (contrast) it peaks sooner than, let's say, a D700, which would show its greatest resolution (contrast) at a narrower f-stop (and to be specific, both cameras using the same lens).

The D700 is only able to accurately detect a larger Airy disk, and because of this the D700 peaks at a narrower f-stop than the D800. Is this correct?

To simplify, it would look something like this:

blur from resolution-limited sensor -> highest resolution (highest Airy disk edge contrast that the sensor can detect) <- blur from diffraction.
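For the diffraction side of that balance, a back-of-the-envelope Python sketch; the two-pixels-per-Airy-disk criterion, the 0.55 um wavelength, and the quoted pitches are illustrative assumptions rather than an exact optical model:

```python
# Airy disk first-null diameter is about 2.44 * wavelength * N. Comparing
# it with the pixel pitch suggests the f-stop at which diffraction blur
# starts to dominate a given sensor.

WAVELENGTH_UM = 0.55  # green light, an illustrative choice

def airy_diameter_um(n_stop):
    return 2.44 * WAVELENGTH_UM * n_stop

def limiting_fstop(pitch_um, pixels_per_disk=2.0):
    # f-stop at which the Airy disk spans `pixels_per_disk` pixels.
    return pixels_per_disk * pitch_um / (2.44 * WAVELENGTH_UM)

for name, pitch_um in (("D800, 4.88 um", 4.88), ("D700, 8.45 um", 8.45)):
    print("%s: diffraction dominates near f/%.1f" % (name, limiting_fstop(pitch_um)))
```

Under these assumptions the finer-pitched sensor runs into diffraction at a wider f-stop, which is consistent with the D800 peaking "sooner" than the D700.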
Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: It's not binary
« Reply #39 on: October 24, 2014, 02:09:30 am »

Hi,

The reason that resolution figures are problematic is that they are totally unrelated to our vision. Resolution figures are very interesting for aerial-reconnaissance-type photography, but not for visual observation.

Panavision has a great series explaining this (it is for motion, but also applies to stills):

https://www.youtube.com/watch?feature=player_detailpage&v=iBKDjLeNlsQ

https://www.youtube.com/watch?feature=player_detailpage&v=v96yhEr-DWM

Looking at MTF at different feature sizes (frequencies) is thus much more interesting.

I have run MTF tests at pixel sizes from 9 um to 3.8 um, and lens performance essentially always peaks at the same apertures, but with smaller pixels we get more resolution at a given MTF (which is often chosen at 50%).

So I would say the advantage of smaller pixels is better definition of whatever the lens renders, and that applies to any reasonably well-corrected lens.

Best regards
Erik


Am I correct to assume that the maximum resolution (I really should be calling it contrast) that a lens and sensor can resolve is at the point where the lens projects an Airy disk whose size, brightness and location the sensor's pixel pitch can accurately measure in an image?

For example, the D800 is able to accurately locate a smaller Airy disk (wider f-stop), so for the highest resolution (contrast) it peaks sooner than, let's say, a D700, which would show its greatest resolution (contrast) at a narrower f-stop (and to be specific, both cameras using the same lens).

The D700 is only able to accurately detect a larger Airy disk, and because of this the D700 peaks at a narrower f-stop than the D800. Is this correct?

To simplify, it would look something like this:

blur from resolution-limited sensor -> highest resolution (highest Airy disk edge contrast that the sensor can detect) <- blur from diffraction.

Logged
Erik Kaffehr
 