
Author Topic: D800 hyperbole  (Read 43313 times)

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: D800 hyperbole
« Reply #120 on: March 28, 2012, 06:12:27 am »

Sampling, aliasing, quantisation etc are relevant in many fields and in creative audio I can see how the artefacts will be potentially problematic, but as a genuine question, why does this even matter to most photographers? Except as a fun discussion:-)
Aliasing at capture usually does not affect audio recordings in a perceptible way, at least not for sensible equipment built in the last 20 years.
Aliasing at capture can affect image recordings in a perceptible way, especially for AA-less cameras.
Quote
As a second question, does aliasing introduce detail or just smear what is there? (I think I understand the concepts of sampling, aliasing and bayer matrix construction and understand how sampling is an issue for example in audio or scientific measurements, but can't get my head round how it applies at the camera sensor/RAW conversion level.) Any explanation or links would be good thanks.
When sampling with significant aliasing, you capture "something". This something is a function of the scene and the camera, but it is ambiguous: two (or very many) quite different scenes can generate the exact same raw files. As the raw developer has no other information than the raw files, which of those scenes would you like it to render?
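That ambiguity is easy to show numerically. In this illustrative NumPy sketch (not from the original post; the rates are arbitrary), a coarse cosine pattern and a much finer one, mirrored about the Nyquist frequency, produce exactly the same point samples — no raw developer could tell which scene was in front of the camera:

```python
import numpy as np

fs = 100.0            # sampling rate (samples per unit length)
n = np.arange(32)     # sample indices
f1, f2 = 10.0, 90.0   # f2 = fs - f1, i.e. f1 mirrored about the Nyquist frequency

# Two quite different "scenes": a coarse and a fine cosine pattern
scene1 = np.cos(2 * np.pi * f1 * n / fs)
scene2 = np.cos(2 * np.pi * f2 * n / fs)

# Their point samples are identical: the sampler cannot tell them apart
print(np.allclose(scene1, scene2))  # True
```

Any frequency above fs/2 folds back onto one below it, which is why the raw file alone cannot disambiguate the scenes.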

I think that map-making is a good analogy. Imagine making a topographic map that is to be represented as 1 km x 1 km squares ("pixels"). How would you like to calculate each square's value? Measure the elevation above sea level by placing your GPS or something similar at the exact middle of each square? Or would you rather do it by averaging the elevation of all points inside the square? Or would you want to smooth even more?
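The map analogy can also be sketched numerically. In this made-up example (the terrain and grid are invented for illustration), a gentle hill carries a ripple much finer than the 1 km grid; the midpoint "GPS readings" pick up the ripple as a spurious slow pattern, while per-square averages largely suppress it:

```python
import numpy as np

# A fine 1-D "terrain": a gentle hill plus a ripple much finer than the 1 km grid
x = np.arange(10000) / 1000.0                       # position in km, 1 m steps
terrain = 100 * np.exp(-(x - 5) ** 2) + 5 * np.sin(2 * np.pi * 7.9 * x)

cells = terrain.reshape(10, 1000)                   # ten 1 km "squares"
midpoint = cells[:, 500]                            # one GPS reading per square
average = cells.mean(axis=1)                        # average elevation per square

# The fine ripple survives in the midpoint readings (as aliasing)
# but largely cancels in the averages, so the two maps disagree
print(np.abs(midpoint - average).max())
```

Neither map is "wrong", but the point-sampled one contains detail that is not really there at that scale — which is the aliasing question in a nutshell.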

-h
Logged

MikeMac

  • Newbie
  • *
  • Offline
  • Posts: 31
Re: D800 hyperbole
« Reply #121 on: March 28, 2012, 11:58:23 am »

Aliasing at capture usually does not affect audio recordings in a perceptible way, at least not for sensible equipment built in the last 20 years.
Aliasing at capture can affect image recordings in a perceptible way, especially for AA-less cameras.
When sampling with significant aliasing, you capture "something". This something is a function of the scene and the camera, but it is ambiguous: two (or very many) quite different scenes can generate the exact same raw files. As the raw developer has no other information than the raw files, which of those scenes would you like it to render?

I think that map-making is a good analogy. Imagine making a topographic map that is to be represented as 1 km x 1 km squares ("pixels"). How would you like to calculate each square's value? Measure the elevation above sea level by placing your GPS or something similar at the exact middle of each square? Or would you rather do it by averaging the elevation of all points inside the square? Or would you want to smooth even more?

-h
Thanks for reply, I need to think about this a bit more:-)
Logged

BJL

  • Sr. Member
  • ****
  • Offline
  • Posts: 6600
Aliasing effects: averaging over space vs measuring only at discrete points
« Reply #122 on: March 28, 2012, 01:24:00 pm »

I think that map-making is a good analogy. Imagine making a topographic map that is to be represented as 1 km x 1 km squares ("pixels"). How would you like to calculate each square's value? Measure the elevation above sea level by placing your GPS or something similar at the exact middle of each square? Or would you rather do it by averaging the elevation of all points inside the square? Or would you want to smooth even more?
This is maybe the difference between the theoretical case of aliasing, where sampling is measurement at single discrete instants in time or single points in space [your "midpoint of the square"], and the case with photography, which is more like averaging light levels over each photosite [your "averaging the elevation of all points inside the square"]. Isn't that some kind of low-pass filtering in itself?

But color filter arrays mess the simple view up, and maybe that sampling of each color over only 1/2 to 1/4 of the area is the main villain. Could examples of luminosity aliasing in nearly monochrome subjects be due mainly to the luminosity values given by demosaicing being based mostly on data from green pixels, so that there are gaps in the spatial coverage of those "luminosity" measurements?
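The "gaps in spatial coverage" point can be demonstrated with a small NumPy sketch (illustrative only; the rates are invented, and a real Bayer mosaic is 2-D). A single color sampled at every other sensel sees only half the sampling rate, so detail that the full sensel grid resolves cleanly is indistinguishable, at those sites, from a much coarser alias:

```python
import numpy as np

fs = 8.0                 # full-resolution sampling rate (all sensels)
t = np.arange(64) / fs   # sensel positions
f = 3.0                  # below the full Nyquist rate (4), above the half-rate Nyquist (2)
detail = np.cos(2 * np.pi * f * t)

# One color sampled at every other site sees only half the sampling rate
red_sites = detail[::2]

# At the half rate, f = 3 is indistinguishable from the alias at fs/2 - f = 1
alias = np.cos(2 * np.pi * (fs / 2 - f) * t[::2])
print(np.allclose(red_sites, alias))  # True
```

So detail the sensor pitch could in principle resolve becomes aliased color information once it is seen through only a fraction of the sites.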
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051

This is maybe the difference between the theoretical case of aliasing, where sampling is measurement at single discrete instants in time or single points in space [your "midpoint of the square"], and the case with photography, which is more like averaging light levels over each photosite [your "averaging the elevation of all points inside the square"]. Isn't that some kind of low-pass filtering in itself?

But color filter arrays mess the simple view up, and maybe that sampling of each color over only 1/2 to 1/4 of the area is the main villain. Could examples of luminosity aliasing in nearly monochrome subjects be due mainly to the luminosity values given by demosaicing being based mostly on data from green pixels, so that there are gaps in the spatial coverage of those "luminosity" measurements?
Yes, integrating the signal over a square is low-pass filtering the signal. It is not a very efficient low-pass filter, though. For a monochrome sensor with a 100% fill factor or perfect microlenses, the "integrate all light within a pixel" idea might be right. For a color-filtered sensor with imperfect microlenses and a fill factor below 100%, it is not.
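How inefficient the pixel-aperture filter is can be quantified. Averaging over a box of width p has the frequency response |sinc(f·p)| (normalized sinc), and this sketch (a textbook calculation, not from the post) evaluates it at the Nyquist frequency:

```python
import numpy as np

# Averaging over one pixel of width p has frequency response |sinc(f * p)|,
# where np.sinc(x) = sin(pi x) / (pi x) is the normalized sinc function.
p = 1.0                                    # pixel pitch (arbitrary units)
f_nyquist = 1.0 / (2.0 * p)                # Nyquist frequency for that pitch
response = np.abs(np.sinc(f_nyquist * p))  # = 2/pi

print(round(response, 3))  # 0.637
```

At the very frequency where aliasing begins, the box average still passes about 64% of the amplitude (roughly -4 dB), which is why integrating over the photosite alone is nowhere near enough of an anti-aliasing filter.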

It is a question of degree: how much aliasing and how much passband blurring will there be if I do operation "X" on my camera/scene combination. Removing the AA filter will (everything else being equal) tend to increase aliasing and increase passband sharpness.

-h
Logged

BJL

  • Sr. Member
  • ****
  • Offline
  • Posts: 6600
D800 vs D800E: so the CFA does make aliasing worse
« Reply #124 on: March 29, 2012, 10:26:35 am »

Thanks "h", that seems to confirm my rough reasoning that the use of a color filter array makes aliasing far worse. This is supported, I suppose, by the worse aliasing that happens in video made from still sensors, where only a selection of the photosites are read at all, so that it is closer to the "point samples" considered in the simplest mathematical models of sampling and aliasing.

So it would be nice if someone could produce an "X3" technology (all color information measured at each spatial location) which works better than the somewhat flawed, noise-prone Foveon implementation. Then the AA filter would be far less needed, or could at least have a lighter touch. I have read of patents on several alternative approaches to X3 from several major sensor makers, using stacks of color filters and such, but at most these have been deployed in small, special-purpose sensors.
« Last Edit: March 29, 2012, 02:53:34 pm by BJL »
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: D800 vs D800E: so the CFA does make aliasing worse
« Reply #125 on: March 30, 2012, 01:52:54 am »

Thanks "h", that seems to confirm my rough reasoning that the use of a color filter array makes aliasing far worse. This is supported, I suppose, by the worse aliasing that happens in video made from still sensors, where only a selection of the photosites are read at all, so that it is closer to the "point samples" considered in the simplest mathematical models of sampling and aliasing.

So it would be nice if someone could produce an "X3" technology (all color information measured at each spatial location) which works better than the somewhat flawed, noise-prone Foveon implementation. Then the AA filter would be far less needed, or could at least have a lighter touch. I have read of patents on several alternative approaches to X3 from several major sensor makers, using stacks of color filters and such, but at most these have been deployed in small, special-purpose sensors.
I agree that a Foveon-type sensor would be less prone to aliasing-induced artifacts at a given sensel-pitch, but it will still have luminance aliasing.

I believe that, spatially, if sensel density can be made some factor X higher, much the same characteristics can be achieved using traditional Bayer technology.

I speculate that this is the reason why we don't see these exotic designs available; it is simply easier and less expensive to shrink current methods, than it is to do revolutionary things at sufficiently small spatial scale and modest cost. Perhaps this trend will continue until we hit some hard quantum law?

I am only addressing spatial behaviour here; noise, saturation, color response etc. are also interesting.

-h
(edit: fix my quotes)
« Last Edit: April 03, 2012, 03:45:54 am by hjulenissen »
Logged

MikeMac

  • Newbie
  • *
  • Offline
  • Posts: 31
Re: D800 vs D800E: so the CFA does make aliasing worse
« Reply #126 on: April 03, 2012, 03:12:19 am »

I agree that a Foveon-type sensor would be less prone to aliasing-induced artifacts at a given sensel-pitch, but it will still have luminance aliasing.

I believe that, spatially, if sensel density can be made some factor X higher, much the same characteristics can be achieved using traditional Bayer technology.

I speculate that this is the reason why we don't see these exotic designs available; it is simply easier and less expensive to shrink current methods, than it is to do revolutionary things at sufficiently small spatial scale and modest cost. Perhaps this trend will continue until we hit some hard quantum law?

I am only addressing spatial behaviour here; noise, saturation, color response etc. are also interesting.

-h

Is this why some of the older MF backs used to have a 3-shot mode? I think that was the name: three shots were taken, each with a different colour filter in front of the sensor, and then the shots were combined.
Logged

marcmccalmont

  • Sr. Member
  • ****
  • Offline
  • Posts: 1780
Re: D800 vs D800E: so the CFA does make aliasing worse
« Reply #127 on: April 03, 2012, 03:20:17 am »

I agree that a Foveon-type sensor would be less prone to aliasing-induced artifacts at a given sensel-pitch, but it will still have luminance aliasing.

I believe that, spatially, if sensel density can be made some factor X higher, much the same characteristics can be achieved using traditional Bayer technology.

I speculate that this is the reason why we don't see these exotic designs available; it is simply easier and less expensive to shrink current methods, than it is to do revolutionary things at sufficiently small spatial scale and modest cost. Perhaps this trend will continue until we hit some hard quantum law?

I am only addressing spatial behaviour here; noise, saturation, color response etc. are also interesting.

-h

I hope some day we can not only count photons at a photosite but also measure the frequency of the light, doing away with color filters. Years ago there was an idea of small piezoelectric "spikes" for photosites; these would vibrate at the frequency of the light hitting them. If you could read out the frequency and voltage for each "piezo-spike", the problem would be solved.
Marc
Logged
Marc McCalmont