Luminous Landscape Forum

Equipment & Techniques => Cameras, Lenses and Shooting gear => Topic started by: imagico on September 14, 2009, 02:52:25 pm

Title: Aliasing on image sensors
Post by: imagico on September 14, 2009, 02:52:25 pm
With the recent discussion of the Leica M9 and the fact that it does not have a low pass filter, something I have occasionally pondered came to mind again:

As you commonly learn in the context of sampling theory (see for example http://en.wikipedia.org/wiki/Sampling_theorem), aliasing occurs with discretely sampled signals: signal components above the Nyquist frequency are folded back into the lower frequency range (as explained on Wikipedia as well: http://en.wikipedia.org/wiki/Aliasing).  This results in Moiré artefacts in sampled images.  This is essentially how aliasing, Moiré and the need for low pass filters are explained in every text I have read on the topic, and countless digital photos showing Moiré patterns support this.

It is however important to note that this is about sampled signals.  If the change of light intensity across the sensor is indeed sampled at discrete points, the effect of aliasing is maximized.  You can observe this extreme form of aliasing, for example, in raytraced synthetic images when no anti-aliasing technique (like super-sampling of pixels) is used.  In real-world sensors, light is always collected not at a point but across a light-sensitive area of finite size, and many recent sensor designs increase this further by using microlenses.  As I understand things, this should significantly reduce the intensity of aliasing artefacts, up to the point where a hypothetical perfect sensor without insensitive gaps between the pixels (even the so-called gapless microlens designs surely still lose a lot of light) would not generate any aliasing artefacts.  In other words: I think the need for a low pass filter should decrease as the sensor efficiency (the fraction of the light hitting the sensor that is actually detected by the electronics) increases.
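A quick numerical sketch of this idea (a hypothetical 1-D model, not from the original post): point-sampling a sinusoid above the Nyquist frequency preserves the full alias amplitude, while area-sampling over a gapless pixel attenuates it by a sinc factor, though it does not remove it.

```python
import math

def point_sample(freq, n):
    # sample cos(2*pi*freq*x) at pixel centres x = 0, 1, ... (pitch 1, Nyquist = 0.5)
    return [math.cos(2 * math.pi * freq * i) for i in range(n)]

def area_sample(freq, n):
    # integrate the same cosine over each gapless pixel [i - 0.5, i + 0.5];
    # this attenuates the signal by |sinc(freq)| but does not shift its frequency
    w = 2 * math.pi * freq
    return [(math.sin(w * (i + 0.5)) - math.sin(w * (i - 0.5))) / w for i in range(n)]

f = 0.9  # cycles/pixel, well above Nyquist: aliases down to 0.1 cycles/pixel
pts = point_sample(f, 10)
areas = area_sample(f, 10)
print(max(abs(v) for v in pts))    # ~1.0  : full-strength alias
print(max(abs(v) for v in areas))  # ~0.11 : same alias frequency, much weaker
```

Both sets of samples contain the same folded-down alias frequency; the area sample merely records it at much lower contrast.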

The question is of course whether real-world observations confirm these theoretical considerations.  The M9 is supposed to use a new sensor design with microlenses, most likely more efficient in light collection than older cameras without AA filters (the old Kodak FF DSLRs and MFDBs come to mind).  It would not be too surprising to see that the M9 does not actually suffer much from aliasing problems.  On the other hand, the presence of aliasing artefacts and the fact that Canon, Nikon, Sony etc. still feel the need for AA filters indicate there is still a lot of room for improving sensor efficiency.
Title: Aliasing on image sensors
Post by: BJL on September 14, 2009, 03:28:10 pm
Quote from: imagico
It is however important to note that this is about sampled signals. ... a hypothetical perfect sensor without insensitive gaps between the pixels (even the so called gapless microlens designs still lose a lot of light for sure) would not generate any aliasing artefacts.
I think that is right ... and so monochrome sensors or "X3" type sensors could avoid any need for low-pass (AA) filters.

But sensors using color filter arrays sample each color with gaps:
GxGx
xGxG for Green, x for gaps in coverage

xRxR
xxxx
xRxR for Red

and so on. These sampling gaps seem to make aliasing errors impossible to avoid.
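To make those gaps concrete, here is a sketch of the per-channel sample positions in a hypothetical 4x4 Bayer tile (layouts vary by manufacturer; this one is illustrative only). Red and blue are sampled at half the pixel pitch in each axis, so their per-channel Nyquist limit is half the monochrome one.

```python
# Hypothetical Bayer tile; R sites sit on every other row and column,
# so the red channel is sampled with pitch 2 in both axes.
bayer = [["G", "R", "G", "R"],
         ["B", "G", "B", "G"],
         ["G", "R", "G", "R"],
         ["B", "G", "B", "G"]]
red = [(r, c) for r in range(4) for c in range(4) if bayer[r][c] == "R"]
print(red)  # [(0, 1), (0, 3), (2, 1), (2, 3)]
```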
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 14, 2009, 03:56:00 pm
Although a point sample is the worst case for aliasing problems, a true area sample, where each sample area joins the next with no gap, will still produce aliasing artifacts. No type of sensor is immune to aliasing. The added problem with Bayer pattern sensors is that aliasing can appear as chroma moire as well as luma moire. However, for the most part, chroma moire is removable without severe detriment to the image. Luma moire is not, and hence it is important that it is not ignored.

So just to put it plainly, to answer the question - yes, a gapless sensor can and will alias.

Graeme
Title: Aliasing on image sensors
Post by: imagico on September 14, 2009, 04:00:10 pm
Quote from: BJL
I think that is right ... and so monochrome sensors or "X3" type sensors could avoid any need for low-pass (AA) filters.

But sensors using color filter arrays sample each color with gaps:
GxGx
xGxG for Green, x for gaps in coverage

xRxR
xxxx
xRxR for Red

and so on. These sampling gaps seem to make aliasing errors impossible to avoid.

This Bayer-specific aliasing indeed seems unavoidable.  The AA filter does not remove it either, though (unless it is so strong that the resolution advantage of the Bayer sensor is pointless anyway).
Title: Aliasing on image sensors
Post by: ErikKaffehr on September 14, 2009, 04:03:34 pm
Hi,

There have been some discussions on this forum about this issue. I'll try to sum them up, along with some other reading.

1) If the lens resolves higher (let's say has significant MTF) above the Nyquist limit, we will get aliasing and false resolution, and these are clearly artifacts.
2) The above artifacts cannot be eliminated numerically.
3) So AA filtering is, strictly speaking, necessary.

However,

1) Monochrome artifacts are normally not disturbing, and it's hard to tell false resolution from real resolution. So for photography, monochrome moiré is probably not a big issue.
2) Color artifacts arise because of aliasing over the Bayer RGBG pattern. This is very disturbing, but it can be eliminated by essentially throwing away color information. Luminance is more important for detail.
3) Aliasing is only a problem with certain types of subjects having patterns at frequencies near the pitch of the sensor, thin branches and so on.
4) Aliasing only occurs in the narrow plane of best focus.
5) Aliasing only occurs around optimum apertures.

One reason MFDBs don't have AA filters may be cost; those filters are expensive. Another is that MFDB customers can live with moiré.

Why the situation is different on DSLRs, I don't know. One guess is that they can output JPEG and cannot do local processing on moiré artifacts, so they can either do a global moiré reduction, losing quality, or ignore it and have moiré.

I understand that aliasing is very bad for motion filming, because the artifacts, colorful or not, work against the motion-prediction-based compression algorithms used for video.

You may check this discussion: http://luminous-landscape.com/forum/index.php?showtopic=30758

Best regards
Erik

Quote from: imagico
[...]
Title: Aliasing on image sensors
Post by: imagico on September 14, 2009, 04:04:02 pm
Quote from: Graeme Nattress
So just to put it plainly, to answer the question - yes, a gapless sensor can and will alias.

Graeme

If we ignore the Bayer-specific issues (i.e. let's imagine a monochrome sensor), how does that come about?
Title: Aliasing on image sensors
Post by: ErikKaffehr on September 14, 2009, 04:11:33 pm
Hi Graeme,

To my understanding, a gapless sensor will also resolve less than a sensor with a smaller active area, because the gapless design actually works like an AA filter.

Technically, the AA filter splits the light beam into four parts. The pitch of these four parts is normally much smaller than a pixel. Now, if we do not have separation between pixels, a beam of light will hit more than one pixel anyway.

Best regards
Erik

Quote from: Graeme Nattress
[...]
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 14, 2009, 04:25:14 pm
Yes, the MTF curve of the sensor will be lower for a gapless monochrome sensor, and that is why aliasing is reduced a little, but not eliminated. A point sample, or a sensor with gaps, will have a higher sensor MTF and hence more propensity for aliasing. But if you point both of these theoretical cameras at an object that causes aliasing, both will alias. It will just be more intense on the one with the gaps.
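This can be made quantitative with the standard aperture-MTF model (a sketch assuming a square pixel aperture; not from the original post): the MTF contribution of a pixel of width `fill` (as a fraction of the pitch) at frequency f is |sinc(f * fill)|, which is lower for a gapless pixel but still nonzero above Nyquist, matching Graeme's point.

```python
import math

def pixel_mtf(f, fill):
    # MTF of a square pixel aperture of width `fill` (fraction of the pixel
    # pitch) at spatial frequency f in cycles/pixel: |sinc(f * fill)|
    x = math.pi * f * fill
    return 1.0 if x == 0 else abs(math.sin(x) / x)

# response at 0.75 cycles/pixel (above the Nyquist limit of 0.5 cycles/pixel):
for fill in (0.01, 0.5, 1.0):  # near-point sample, 50% gaps, gapless
    print(fill, round(pixel_mtf(0.75, fill), 3))
```

The gapless pixel attenuates this above-Nyquist frequency to about 0.30 of its original contrast; the near-point sample passes it almost untouched. Neither drives it to zero, so both alias.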

How does an area sample alias?

think of a black and white target pattern of very fine detail, and area sample pixels 1 through 6:

BWBWBWBWBWBWBWBWBW
111222333444555666

pixel 1 sees BWB, pixel 2 sees WBW - so one is dark and one is bright, which we'll see as a repeating alias / moire pattern. The extra-high frequency in the source has been folded back into a visible lower frequency in the resulting image, but at reduced MTF.
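Graeme's example can be reproduced numerically (B = 0, W = 1; stripe and pixel widths as in his diagram, values otherwise hypothetical):

```python
# BWBW... target with stripes one unit wide, area-sampled by
# gapless pixels three units wide
target = [i % 2 for i in range(18)]                            # 0,1,0,1,...
pixels = [sum(target[3 * i:3 * i + 3]) / 3 for i in range(6)]
print(pixels)  # alternates 1/3, 2/3: a low-frequency alias at reduced contrast
```

The output alternates between 1/3 and 2/3: the stripe frequency has folded back to a visible two-pixel period, at one third of the original contrast.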

Graeme
Title: Aliasing on image sensors
Post by: imagico on September 14, 2009, 04:26:04 pm
Quote from: ErikKaffehr
[...]

You may check this discussion: http://luminous-landscape.com/forum/index.php?showtopic=30758

Thanks for the pointer - I was aware of most of the points discussed there, but some got clearer by reading it.

What I wanted to discuss here was the specific question of how improvements in sensor efficiency could, as a side effect, diminish the need for a low pass filter.

The observation mentioned in that thread that the Kodak DSLRs suffered strongly from moiré while newer AA-filter-free sensors do less might indicate that this is indeed the case.

Greetings,

Christoph
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 14, 2009, 04:35:04 pm
It's probably down to improved demosaic algorithms that don't show chroma moire, and the use of micro-lenses to increase the apparent fill factor of the sensor. Couple that with increased resolutions, and you've got an explanation.

Graeme
Title: Aliasing on image sensors
Post by: imagico on September 14, 2009, 04:40:43 pm
Quote from: Graeme Nattress
How does an area sample alias?

think of a black and white target pattern of very fine detail, and area sample pixels 1 through 6:

BWBWBWBWBWBWBWBWBW
111222333444555666

pixel 1 sees BWB, pixel 2 sees WBW - so one is dark, one is bright which we'll see in a repeating alias / moire pattern. The extra high frequency in the source has been folded back into a visible lower frequency in the resulting image, but at reduced MTF.

That seems a valid point indeed. I have the impression, though (correct me if I am wrong), that the low pass filter will not diminish this effect other than by reducing contrast in general, equally for real signals and for artefacts like this.
Title: Aliasing on image sensors
Post by: ErikKaffehr on September 14, 2009, 04:48:33 pm
Hi,

Yes, but moiré still seems to be around in MF photography, and the Leica lenses are designed to be very sharp, so it should be a problem.

Ideally, I'd suggest that Leica should perhaps have opted for a higher-resolution sensor, but none are available (Sony has a 24.5 MP CMOS, but there is not a lot of difference between 24.5 MP and 18 MP). I guess Leicas are often used in conditions where the full capacity of the lens is not utilized anyway. When have you seen a Leica on top of a 5-series Gitzo?

One more reflection:

One issue with the M cameras is that the distance between the shutter and the back end of the lens is very short. There may simply not be room for an AA filter.

Best regards
Erik

Quote from: imagico
[...]
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 14, 2009, 05:09:20 pm
Quote from: imagico
That seems a valid point indeed - i have the impression though (correct me if i am wrong) that the low pass filter will not diminish this effect apart from reducing contrast in general - equally that of real signals and artefacts like this.

A low pass filter cannot infinitely attenuate high frequencies - it can only reduce them. The goal is not to eliminate aliasing entirely, but to practically eliminate it: the combination of the lens MTF, OLPF and sensor should be such that under practical circumstances aliasing artifacts do not occur.

In real-world use, there are factors that limit resolution: camera steadiness, lens aperture (wide open might be a bit soft, fully stopped down may be diffraction limited), DOF, etc.

With the Leica, the OLPF would be very close to the sensor. Any imperfections in the OLPF would most probably show up as visible artifacts, and an artifact-free OLPF may be very costly to produce.

Although visible moire is a problematic artifact of aliasing, excessive edge sharpness can appear unnatural, and to me that's just as much an artifact.

Graeme
Title: Aliasing on image sensors
Post by: ErikKaffehr on September 14, 2009, 06:18:18 pm
Hi,

This chart from Erwin Puts demonstrates the problem:

http://www.imx.nl/photo/leica/camera/page155/files/m9chartb.jpg


The full article is here: http://www.imx.nl/photo/leica/camera/page155/page155.html

This plot only shows a central crop, but it's obvious that there is a lot of aliasing here.

I generated a similar plot for the Nikon D3X, converted by Lightroom (with capture sharpening), also using Imatest, in a less professional version.

[attachment=16570:D3X.jpg]

So the problem is there, period. On the other hand, many photographers prefer sharpness, artifact or not...

Regardless of aliasing or other factors, I still think that the M9 is a great product!

Best regards
Erik


Quote from: Graeme Nattress
[...]
Title: Aliasing on image sensors
Post by: imagico on September 15, 2009, 01:00:11 am
Quote from: Graeme Nattress
A low pass filter cannot infinitely attenuate high frequencies - it can only reduce them. The goal is not to eliminate aliasing, but to practically eliminate aliasing. The goal being the combination of the MTF, OLPF and sensor is such that under practical circumstances aliasing artifacts do not occur.

Right, but I think your one-dimensional example is in fact about something different: it shows that the best sensor with respect to aliasing would not have uniform sensitivity across the pixel area, but sensitivity decreasing towards the edges.  In the example: if the right and left thirds of each pixel were less sensitive than the center third, the artefacts would be reduced.

This of course means the best sensor for light collection efficiency is not automatically the best for aliasing - too bad.
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 15, 2009, 08:18:24 am
Nope - if you reduce pixel sensitivity towards the edges, you're more closely approximating a point sample, and the aliasing gets worse, not better.

Graeme
Title: Aliasing on image sensors
Post by: imagico on September 15, 2009, 09:49:06 am
Quote from: Graeme Nattress
Nope, if you reduced pixel sensitivity to the edges, you're more closely approximating a point source, and the aliasing gets worse, not better.

Well - if a gap-free, uniform-sensitivity monochrome sensor does not show any sampling-related aliasing, but only the kind of aliasing in your example, and that can be reduced by sensitivity decreasing towards the edges of the pixel, this would mean the optimum lies somewhere in between.

This problem seems closely related to the question of the best algorithm for downscaling an image.  Nearest-neighbor and simple binning techniques are worse than sinc or Lanczos resizing.
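The parallel can be sketched in code (an illustrative, stdlib-only sketch; the Lanczos-2 kernel and the 3x factor are my assumptions): nearest-neighbour downsampling passes an above-Nyquist frequency through at full strength, while a Lanczos prefilter attenuates it strongly before resampling.

```python
import math

def lanczos(x, a=2):
    # Lanczos-2 kernel: sinc(x) * sinc(x / a) for |x| < a, else 0
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def downsample(signal, factor, kernel):
    # prefilter-and-resample: the kernel, stretched by `factor`, acts as a low pass
    out = []
    for i in range(len(signal) // factor):
        c = (i + 0.5) * factor - 0.5  # output sample centre in input coordinates
        acc = wsum = 0.0
        for j in range(len(signal)):
            w = kernel((j - c) / factor)
            acc += w * signal[j]
            wsum += w
        out.append(acc / wsum)   # normalize so flat areas stay flat
    return out

# 0.45 cycles/pixel: fine detail that aliases after 3x downsampling
fine = [math.cos(2 * math.pi * 0.45 * i) for i in range(60)]
nn = fine[1::3]                     # nearest neighbour: alias at full strength
lcz = downsample(fine, 3, lanczos)  # Lanczos prefilter: alias strongly attenuated
print(max(abs(v) for v in nn), max(abs(v) for v in lcz[2:-2]))
```

(The edge samples are excluded from the comparison because the kernel is truncated there.)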
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 15, 2009, 10:23:37 am
Now you're getting confusing.....

Yes, a gap free sensor shows aliasing. A sensor with gaps shows worse aliasing. Both alias.

I don't know what you're meaning by "kind of aliasing in your example" - aliasing is aliasing.

Yes, the problem is exactly the same as downscaling or downsampling an image. Both the initial sampling of an image and downsampling are sampling processes. Both will show aliasing if unwanted high frequencies are not removed first.

NN is the worst - it's like a point sample.
Binning is bad - it's like an area sample.

Sinc and Lanczos, or other filters, are low pass filters. They pass low, wanted frequencies and attenuate high, unwanted frequencies. Their function is the same as that of the optical low pass filter in a camera.

The issue is that digitally you can design any filter shape you want - there are no limits other than how long you are willing to take to render the downsample. In optics, we lack negative photons, which means we can't have filter designs with negative lobes, and hence we can only have slow, soft filters, whereas digitally we can have very sharp filters.

The problem is that the sharper a digital filter, the more it rings (think of sharpness halos around objects), so that is not a perfect solution either.

There is no one perfect downsampling filter - you're always balancing softness, ringing and aliasing. Always.

Graeme
Title: Aliasing on image sensors
Post by: imagico on September 15, 2009, 11:16:37 am
Quote from: Graeme Nattress
Now you're getting confusing.....

Yes, a gap free sensor shows aliasing. A sensor with gaps shows worse aliasing. Both alias.

I don't know what you're meaning by "kind of aliasing in your example" - aliasing is aliasing.

Yes, the problem is exactly the same as downscaling or downsampling an image. Both the initial sampling of an image and downsampling are sampling processes. Both will show aliasing if un-wanted high frequencies are not removed first.

NN is the worst - it's like a point sample.
Binning is bad - it's like an area sample.

Yes, and since nearest neighbor corresponds to a sensor pixel sensitive only at a single point, and binning is like a pixel of uniform sensitivity across the whole pixel area, we can conclude that both of these extremes are not the best choice.  The optimum filter shape of course depends on the priorities you have.

I understand you see aliasing with point sampling and with area sampling as two aspects of the same effect.  It does not matter for my main hypothesis though: that an improved sensor design with respect to sensitivity across the pixel area will diminish aliasing and therefore reduce the need for an additional optical low pass filter.  I see now that my initial idea that aliasing could be completely avoided this way is not right, but it still seems to make quite a difference.

Title: Aliasing on image sensors
Post by: ErikKaffehr on September 15, 2009, 11:24:05 am
Hi,

In my opinion there is a simple solution, and that's more pixels. That would affect SNR negatively at the pixel level, but not really in print. DR will suffer, though.

Best regards
Erik

Quote from: imagico
[...]
Title: Aliasing on image sensors
Post by: Graeme Nattress on September 15, 2009, 11:58:22 am
Quote from: imagico
[...]

Adjusting sensitivity across the pixel area will not help one little bit. Full-area sampling is the very best a pixel can do with respect to aliasing performance. If you diminish that in any way, aliasing will increase, not decrease. You can look at individual artificial examples that might, for that example, show better or worse performance, but that is the nature of aliasing - you will get frequency combinations that cancel out to mid grey, for instance. But even in the real world, and even with test charts, patterns are never perfectly aligned or uniform.

Filters work among groups of pixels, and that is why they can perform better. Optical filters spread the light which would touch one pixel over many pixels, and that is why they can reduce aliasing.
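A minimal 1-D model of this spreading (my own sketch, assuming a two-spot beam-splitter OLPF that displaces the image by half a pixel each way, i.e. one pixel pitch between the spots): at the Nyquist frequency the two displaced copies cancel exactly, which the pixel aperture alone cannot achieve.

```python
import math

def sampled_amplitude(f, olpf=False):
    # area-sample cos(2*pi*f*x) with gapless pixels of pitch 1; optionally
    # split each ray into two spots displaced +-0.5 px first (1-D OLPF model)
    w = 2 * math.pi * f

    def area(i, shift):
        # integral of the cosine over pixel i, with the image shifted by `shift`
        return (math.sin(w * (i + 0.5 + shift)) - math.sin(w * (i - 0.5 + shift))) / w

    vals = []
    for i in range(40):
        v = 0.5 * (area(i, 0.5) + area(i, -0.5)) if olpf else area(i, 0.0)
        vals.append(v)
    return max(abs(v) for v in vals)

print(sampled_amplitude(0.5))        # aperture alone: alias survives at ~0.64
print(sampled_amplitude(0.5, True))  # two-spot OLPF: exact null at Nyquist
```

Because the filter works across pixels, its response has a true zero (cos(pi f) for a one-pitch spot separation, zero at f = 0.5), whereas a single pixel's sinc response only reaches zero at f = 1.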

Graeme