
Author Topic: Question re diffraction and downsampling  (Read 4305 times)

AreBee

  • Sr. Member
  • ****
  • Offline
  • Posts: 638
Question re diffraction and downsampling
« on: December 10, 2013, 06:03:38 pm »

Folks,

I hope you can help me out with something I have been pondering.

MTF is a measure of contrast. As a lens is stopped down the visible effect of diffraction increases when viewed at 100%. An increase in diffraction reduces MTF, or contrast, of the captured image. My understanding is that pure black tends to grey, as does pure white. I am not sure if the 'dulling' effect of diffraction applies to colour other than black and white (and this may be where I am going wrong - refer below!).

For the same aperture, as sensor resolution increases the effect of diffraction is increasingly visible when the captured image is viewed at 100%. I appreciate that viewing magnification has increased. Given an increase in the visible effect of diffraction, I extrapolate that at 100% view, colour in the captured image will be dulled more so than it will be for the same image viewed at a magnification less than 100% or, in the case of a camera with lower resolution sensor, at a viewing magnification up to and including 100%. I am compelled to logically reason that the converse, downsizing an image, will reduce dullness of colour.

I am fairly sure that I have gone wrong somewhere in my reasoning as I seriously doubt that colour decreases/increases with a change of sensor resolution, when an image is viewed at 100%. Therefore, I would be grateful if someone will point out my mistake. :)
Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: Question re diffraction and downsampling
« Reply #1 on: December 10, 2013, 06:18:53 pm »

Hi,

I guess that you are seeing two different effects. Look at the image below, showing the effects of diffraction.

[Image: Airy diffraction pattern, a bright central peak surrounded by concentric rings of decreasing intensity]

As you see, a bright point will be rendered as a central peak (the Airy disc), surrounded by a series of rings of lower intensity. My understanding/guess is that the central peak is responsible for the loss of resolution, while the rings, being spread over a larger area, cause the global loss of contrast. Reducing magnification will reduce the visible loss of resolution, but the global image contrast will not be affected.
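To put rough numbers on this (mine, not Erik's): the diameter of the first dark ring of the Airy pattern is about 2.44·λ·N, so a quick sketch shows how many sensels the central peak spans. The 8.4 µm and 4.9 µm pitches below are my own example figures, roughly a 12 MP and a 36 MP full-frame sensor.

```python
# Back-of-envelope: Airy disc diameter vs. sensel pitch.
# d = 2.44 * wavelength * f-number (first dark ring, circular aperture)

def airy_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disc (to the first dark ring) in micrometres."""
    return 2.44 * wavelength_nm * 1e-3 * f_number

for N in (4, 8, 16):
    d = airy_diameter_um(N)
    # Example pitches: ~8.4 um vs ~4.9 um (my figures, for illustration)
    print(f"f/{N}: Airy disc {d:.1f} um = "
          f"{d / 8.4:.1f} px @ 8.4 um, {d / 4.9:.1f} px @ 4.9 um")
```

The same blur diameter simply covers more sensels on the finer-pitched sensor, which is why it is more visible at 100% view.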

You can do a lot to counteract it with sharpening.

You can check this: http://echophoto.dnsalias.net/ekr/index.php/photoarticles/68-effects-of-diffraction

Best regards
Erik

Logged
Erik Kaffehr
 

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8914
Re: Question re diffraction and downsampling
« Reply #2 on: December 11, 2013, 04:38:25 am »

Quote
Folks,

I hope you can help me out with something I have been pondering.

MTF is a measure of contrast. As a lens is stopped down the visible effect of diffraction increases when viewed at 100%. An increase in diffraction reduces MTF, or contrast, of the captured image. My understanding is that pure black tends to grey, as does pure white. I am not sure if the 'dulling' effect of diffraction applies to colour other than black and white (and this may be where I am going wrong - refer below!).

Hi Rob,

Yes, it also applies to color. The diffraction pattern can be considered a non-uniform blur filter. The pixel(s) near the center of the blur pattern will dominate the summed result, and surrounding pixels will have a weighted contribution.

Quote
For the same aperture, as sensor resolution increases the effect of diffraction is increasingly visible when the captured image is viewed at 100%. I appreciate that viewing magnification has increased.

Correct, the diameter of the diffraction pattern that affects the image is a given for a combination of aperture size and shape, and wavelength. Smaller sensels just take more accurate, denser-spaced samples of that blurred image. That will visibly affect the contrast of micro-detail most, and larger detail somewhat less; this is clearly visible when output large (or when zoomed in close).

Quote
Given an increase in the visible effect of diffraction, I extrapolate that at 100% view, colour in the captured image will be dulled more so than it will be for the same image viewed at a magnification less than 100% or, in the case of a camera with lower resolution sensor, at a viewing magnification up to and including 100%. I am compelled to logically reason that the converse, downsizing an image, will reduce dullness of colour.

It's not so much a dulling effect, but rather a blending effect with nearby colors. If the colors are similar, then not much will change, but fine color detail will become an averaged tone (center weighted) of a region around each pixel covered by the diffraction pattern diameter. Since human vision is very sensitive to edge detail, any blurring of that will be noticed faster than a subtle blending of colors, but all are blurred in a similar way. The green wavelengths are contributing most to the luminance of a scene, so the wavelength dependent diffraction of those wavelengths will dominate our impression of contrast and color purity loss.
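Bart's center-weighted blending can be sketched with a 1-D normalized kernel. The kernel weights below are illustrative only, not a real Airy profile:

```python
# Center-weighted blur as a stand-in for the diffraction pattern.
# Weights are illustrative, not a real Airy profile; they sum to 1.0.
KERNEL = [0.1, 0.2, 0.4, 0.2, 0.1]

def blur(values):
    """Convolve with KERNEL, repeating edge values at the borders."""
    r = len(KERNEL) // 2
    padded = [values[0]] * r + list(values) + [values[-1]] * r
    return [sum(w * padded[i + k] for k, w in enumerate(KERNEL))
            for i in range(len(values))]

similar = [200, 205, 200, 205, 200, 205]   # near-uniform tone
detail  = [255, 0, 255, 0, 255, 0]         # fine alternating detail

print(blur(similar))  # stays close to the originals
print(blur(detail))   # collapses toward a mid tone
```

Similar neighbours barely change after blurring, while fine alternating detail collapses toward a center-weighted average, matching the "blending, not dulling" description.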

Cheers,
Bart
Logged
== If you do what you did, you'll get what you got. ==

AreBee

  • Sr. Member
  • ****
  • Offline
  • Posts: 638
Re: Question re diffraction and downsampling
« Reply #3 on: December 11, 2013, 02:05:41 pm »

Hello Erik,

Thanks for the link. Interesting to see the extent to which deconvolution sharpening can help (I don't use this type of sharpening...yet, but that's a story for another day).

Hello Bart,

Quote
Yes, it also applies to color. The diffraction pattern can be considered as a non uniform blur filter. The pixel(s) near the center of the blur pattern will dominate the summed result, and surrounding pixels will have a weighted contribution.

Is it the case that even though the blur pattern is non-uniform, each pixel is the centre of its own blur pattern - the 'spike' in the diagram in Erik's post above - and therefore the effect of each pixel on its neighbours returns a uniform effect across the sensor? Presumably this does not hold true at sensor edges (the summed result is not equal to that from 'balanced' pixels, i.e. pixels remote from an edge)?

Quote
It's not so much a dulling effect, but rather a blending effect with nearby colors. If the colors are similar, then not much will change...

The first sentence I understand, and in the context of different coloured pixels adjacent to each other, also your second sentence. However, your use of "much" has confused me somewhat. Consider the following thought experiment:

Say we shoot a theoretical pure white background, capturing a series of images in which the camera is progressively stopped down but the shutter speed is adjusted to give the same exposure (assume ISO is held constant and that noise is non-existent). Is it correct that we would find the images tending to grey at smaller apertures? If so, then given that in this example "the colors [of adjacent pixels] are similar" (identical, actually), why would "not much...change"?

Quote
The green wavelengths are contributing most to the luminance of a scene, so the wavelength dependent diffraction of those wavelengths will dominate our impression of contrast and color purity loss.

Yes, this makes sense to me.

Cheers,
Logged

AreBee

  • Sr. Member
  • ****
  • Offline
  • Posts: 638
Re: Question re diffraction and downsampling
« Reply #4 on: December 11, 2013, 02:11:17 pm »

Think I may have answered my own question - if all the pixels are white then the summed result will be...white! :P
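Exactly. As a quick numeric check (toy kernel, not a real diffraction profile): any blur whose weights sum to 1 leaves a uniform field unchanged, and only greys out transitions:

```python
# A normalized blur leaves a uniform field untouched; only edges grey out.
KERNEL = [0.25, 0.5, 0.25]  # illustrative weights summing to 1

def blur(values):
    """Convolve with KERNEL, repeating edge values at the borders."""
    padded = [values[0]] + list(values) + [values[-1]]
    return [sum(w * padded[i + k] for k, w in enumerate(KERNEL))
            for i in range(len(values))]

white = [255] * 6
edge  = [255, 255, 255, 0, 0, 0]
print(blur(white))  # all 255.0 -- pure white survives
print(blur(edge))   # [255.0, 255.0, 191.25, 63.75, 0.0, 0.0] -- the edge greys
```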
Logged

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: Question re diffraction and downsampling
« Reply #5 on: December 11, 2013, 02:30:50 pm »

Chances are the color mudding due to diffraction will happen along the edges of objects. You would also see it in fine lines like the colored veins in plants. The same thing happens in our eyes.
Logged

allegretto

  • Sr. Member
  • ****
  • Offline
  • Posts: 660
Re: Question re diffraction and downsampling
« Reply #6 on: December 22, 2013, 12:30:02 am »

Quote
Chances are the color mudding due to diffraction will happen along the edges of objects. You would also see it in fine lines like the colored veins in plants. The same thing happens in our eyes.

yep

our senses are all digital

The type of cell recruited, the number of cells recruited, and the integration/processing are what account for the differences.
Logged

01af

  • Sr. Member
  • ****
  • Offline
  • Posts: 296
Re: Question re diffraction and downsampling
« Reply #7 on: January 17, 2014, 08:53:33 am »

Quote
For the same aperture, as sensor resolution increases the effect of diffraction is increasingly visible when the captured image is viewed at 100%.
That's right.


Quote
I appreciate that viewing magnification has increased.
And exactly this is the reason why the diffraction effect becomes more visible at higher sensor resolution. However, if you keep the viewing magnification constant, then there won't be an increase in the visibility of the diffraction effect.


Quote
Given an increase in the visible effect of diffraction ...
And here is where you are mistaken, as is almost everybody else. You're starting to confuse the increase of the effect's visibility at 100% view with an increase of the effect itself. For some folks, this confusion may eventually lead to the myth that lower-resolution sensors (of the same size and under the same lens) are less sensitive to diffraction and hence give sharper pictures at medium and small apertures than higher-resolving sensors ... which of course is bunkum.


Quote
I am compelled to logically reason that the converse, downsizing an image, will reduce dullness of colour.
This would be the logical consequence ... if the premise were correct in the first place. Which it isn't. So, to be totally clear: no, downsampling won't reduce diffraction-related losses of sharpness and contrast. On the contrary, downsampling will always reduce resolution and detail rendition.
Logged

xpatUSA

  • Sr. Member
  • ****
  • Offline
  • Posts: 390
    • Blog
Re: Question re diffraction and downsampling
« Reply #8 on: January 18, 2014, 12:05:03 pm »

Quote
Folks,

I hope you can help me out with something I have been pondering.

MTF is a measure of contrast. As a lens is stopped down the visible effect of diffraction increases when viewed at 100%. An increase in diffraction reduces MTF, or contrast, of the captured image. My understanding is that pure black tends to grey, as does pure white. I am not sure if the 'dulling' effect of diffraction applies to colour other than black and white (and this may be where I am going wrong - refer below!).

A picky point or two:

MTF is not a measure of contrast per se. It is the ratio of the resulting picture contrast to the scene contrast, at a given spatial frequency:

Quote
MTF is the contrast at a given spatial frequency ( f ) relative to contrast at low frequencies.

source: http://www.normankoren.com/Tutorials/MTF.html

Also, the scene itself could have very low low-frequency contrast, but that would not change the imaging system's MTF. By which I mean that we're not talking about just pure blacks and whites, which I imagine the OP does realize.

As has been said, color affects MTF in that the amount of diffraction depends directly on wavelength (amongst other things). The MTF for a system varies with monochromatic color and is usually stated for an implied wavelength of 555nm. For this reason, infra-red microscopy is not very popular  ;)
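To put rough numbers on the wavelength dependence (my arithmetic, not from Koren's page): an ideal, aberration-free lens has a diffraction cutoff frequency of 1/(λ·N), so longer wavelengths cut off proportionally sooner, which is why the infra-red remark above holds:

```python
# Diffraction cutoff frequency of an ideal (aberration-free) lens:
# f_cutoff = 1 / (wavelength * f-number), in cycles/mm with wavelength in mm.

def cutoff_cycles_per_mm(f_number, wavelength_nm):
    return 1.0 / (wavelength_nm * 1e-6 * f_number)

for wl in (450, 555, 1000):  # blue, the usual 555 nm reference, near-IR
    print(f"{wl} nm @ f/8: cutoff ~{cutoff_cycles_per_mm(8, wl):.0f} cycles/mm")
```

At f/8 the cutoff drops from roughly 225 cycles/mm at the 555 nm reference to about 125 cycles/mm at 1000 nm.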

Quote
I am compelled to logically reason that the converse, downsizing an image, will reduce dullness of colour.

That is my experience on a per-pixel basis, based on slant-edge testing. MTF50 in cycles/pixel terms does increase after downsampling, and so does the MTF value at Nyquist. The improved MTF curve could indeed be perceived as less dullness of color. Not talking about resolution now, just MTF.
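Ted's per-pixel point can be illustrated with a toy 1-D experiment (made-up blur kernel and test pattern, not a real slant-edge measurement): blur a square wave, then 2x box-downsample, and compare contrast at the same cycles/pixel:

```python
# Per-pixel contrast at a fixed cycles/pixel, before and after downsampling.

def blur3(v):
    """Small fixed blur [0.25, 0.5, 0.25], edge values repeated."""
    p = [v[0]] + list(v) + [v[-1]]
    return [0.25 * p[i] + 0.5 * p[i + 1] + 0.25 * p[i + 2]
            for i in range(len(v))]

def downsample2(v):
    """2x box downsample by averaging adjacent pairs."""
    return [(v[i] + v[i + 1]) / 2 for i in range(0, len(v) - 1, 2)]

def contrast(v):
    core = v[4:-4]  # ignore edge-padding artifacts
    return max(core) - min(core)

fine   = ([0] * 2 + [255] * 2) * 16  # 0.25 cycles/px on the native grid
coarse = ([0] * 4 + [255] * 4) * 8   # 0.125 cycles/px; 0.25 cycles/px once downsampled

print(contrast(blur3(fine)))                 # blurred, sampled natively
print(contrast(downsample2(blur3(coarse))))  # same cycles/px after downsampling
```

The native sampling gives contrast 127.5 at 0.25 cycles/px, while the downsampled version gives 191.25 at that same per-pixel frequency, i.e. the per-pixel MTF curve improves even though the blur itself is unchanged.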

Cheers,
« Last Edit: January 18, 2014, 12:21:34 pm by xpatUSA »
Logged
best regards,

Ted

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word
Re: Question re diffraction and downsampling
« Reply #9 on: January 18, 2014, 12:21:08 pm »

Is it the case that even though the blur pattern is non-uniform, each pixel is the centre of its own blur pattern - the 'spike' in the diagram in Erik's post above - and therefore the effect of each pixel on its neighbours returns a uniform effect across the sensor?

The diffraction occurs before the image is sampled by the sensor, so the blur pattern due to diffraction is independent of the sensel pitch or its displacement with respect to the diffracted image. Therefore, each pixel is not the center of its own diffractive blur pattern, although, because the sensels sample a part of the image larger than a point, they introduce their own blur, which is unrelated to diffraction. In addition, if an anti-aliasing filter is present, it blurs the image that has already been blurred by diffraction and passes that result on to the sensels, which add their own blur.
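The cascade Jim describes is commonly modelled by noting that independent blur stages convolve in the spatial domain, so their MTFs multiply in the frequency domain. A toy sketch with made-up per-stage values:

```python
# Independent blur stages convolve in the spatial domain, so their MTFs
# multiply in the frequency domain. Values below are made up for illustration.

def system_mtf(*stage_mtfs):
    """Combined MTF at one spatial frequency, given each stage's MTF there."""
    total = 1.0
    for m in stage_mtfs:
        total *= m
    return total

# Hypothetical contrast ratios at some fixed spatial frequency:
diffraction, aa_filter, sensel_aperture = 0.6, 0.8, 0.9
print(system_mtf(diffraction, aa_filter, sensel_aperture))  # ~0.432
```

Since each stage can only lose contrast (MTF ≤ 1), the system MTF is never better than its worst stage.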

Some -- or all, in some circumstances -- of this blur, including diffraction blur, may be desirable for some subject-matter/camera combinations; it tends to reduce aliasing errors in all cameras, and false-color effects stemming from the non-coincident color samples in cameras using a color filter array.

Jim