Erik,
Yes, AA filterless sensors are known to produce moire and various other artifacts.
Cheers,
Bernard
Hi,
According to signal processing theory, we get aliasing on any structures that are transferred with significant contrast (MTF) beyond the resolution limit of the sensor. This is mostly seen as "moiré". Colour moiré is the most obvious kind and is normally removed by local desaturation of the offending colour pattern. In the absence of colour moiré we often see fake detail, which may actually enhance the image by giving the impression of detail that is not actually there.
This is easy to reproduce on artificial subjects, but may not be seen that often on natural subjects.
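The fold-back described above is easy to show numerically. A minimal numpy sketch (the frequencies are made up for illustration): a 70-cycle tone sampled at 100 samples per unit produces exactly the same sample values as a 30-cycle tone, so the sensor cannot tell them apart.

```python
import numpy as np

fs = 100.0           # sampling rate (samples per unit length)
f_in = 70.0          # input frequency, beyond Nyquist (fs/2 = 50)
f_alias = fs - f_in  # fold-back: 70 folds to 30

n = np.arange(32)                            # sample indices
high = np.cos(2 * np.pi * f_in * n / fs)     # what the "scene" contains
low = np.cos(2 * np.pi * f_alias * n / fs)   # what the samples pretend to be

# The two sample sets are numerically identical: the samples of the
# 70-cycle tone are indistinguishable from those of a 30-cycle tone.
print(np.allclose(high, low))  # True
```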
Hi Erik,
Thanks for the example/demonstration. Thank goodness the theory is not just theory, it can actually happen ... ;)
There are of course some requirements that need to be fulfilled for it to happen, and knowing them may allow us to reduce the risk of aliasing showing up when we can least use it.
As always, knowing your enemy is the best way to cope with the situation and succeed in the end. It may save some retouching time/cost, and nasty surprises with a deadline approaching.
Cheers,
Bart
Hi,
Sometimes they are hard to explain; check the step chart in the image below.
What I speculate a bit is that sometimes the detail we see is actually aliases, so aliases may enhance an image, for instance by giving structure to fur or feathers.
I would speculate that the reason I often (almost always) find color aliases in test images and seldom in real images is that test images tend to have more high-contrast detail. Also, I tend to stop down a lot for DoF.
The false color aliasing is due to the differing sampling densities and locations of the different colors of the Bayer CFA.
The artefacting results from frequencies present in the image beyond sampling limit getting folded back during the reconstitution process.
So sampling position is not the issue.
Red and Blue will alias sooner, and at larger scale, for the same level of diagonal detail.
Sampling density is the issue.
No, sampling density is not the issue since the sampling density for Red and Blue is equal. However, sampling position is different and that is why they alias at different positions. That causes false color. Their density relative to green is utterly irrelevant, because for a decent demosaic algo the green is equal to the detail signal and should already have been subtracted from the color signals prior to reconstruction.
You should also be aware that this is not your normal sampling theorem problem: the sampling occurs at disjunct positions which, specifically for AA-filterless sensors, makes the problem slightly different. It is the disjunct nature of the sampling that causes most of these artifacts.
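The disjunct-positions point can be illustrated with a 1-D toy model (not any real sensor geometry): take two subgrids of equal density but offset position, as a stand-in for the R and B sensels of a Bayer CFA. Both fold the same over-Nyquist signal to the same alias frequency, but with different phase — and that per-channel disagreement is what shows up as false color.

```python
import numpy as np

k = np.arange(16)
f = 0.3   # signal frequency in cycles/pixel; each subgrid's Nyquist is 0.25

# "Red" sensels at even pixels, "Blue" at odd pixels: equal sampling
# density (pitch 2), different sampling positions.
red = np.cos(2 * np.pi * f * (2 * k))
blue = np.cos(2 * np.pi * f * (2 * k + 1))

# Each channel folds the input to the same alias frequency
# (1 - 2f = 0.4 cycles per subgrid step), but the positional offset
# shifts the alias phase by 2*pi*f:
f_alias = 1 - 2 * f
phi = 2 * np.pi * f
print(np.allclose(red, np.cos(2 * np.pi * f_alias * k)))        # True
print(np.allclose(blue, np.cos(2 * np.pi * f_alias * k - phi))) # True
```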
My main interest here is aliasing in general, not color aliasing in particular. That is the reason I posted B&W images. There are tools to reduce color moiré, but I don't think monochrome aliases can be removed.
Hi Edmund,
Exactly. Red and Blue will create aliasing faster because of the lower sampling density, and thus also larger (lower spatial frequency, due to the fold-back).
On 45-degree rotated sensel layouts like the ones Fuji used, the horizontal/vertical versus diagonal sampling densities switch, but the same channel differences remain. In effect, the higher diagonal luminance resolution is merely traded for higher horizontal/vertical resolution (which makes logical sense in a gravity-driven environment). On the newer X-Trans sensors the whole false-color artifacting situation becomes worse.
Cheers,
Bart
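The diagonal-versus-axis trade Bart mentions follows from simple geometry; a small sketch with an assumed unit pitch:

```python
import math

p = 1.0  # sensel pitch of a square grid, arbitrary units

# Square grid: line spacing along the axes is p, but along the 45-degree
# diagonal the rows of sensels are only p/sqrt(2) apart.
nyq_axis = 1 / (2 * p)
nyq_diag = 1 / (2 * (p / math.sqrt(2)))

print(nyq_diag / nyq_axis)  # sqrt(2): ~41% more diagonal resolution

# Rotating the same layout by 45 degrees swaps the two numbers: the
# diagonal advantage becomes a horizontal/vertical advantage, with the
# total sensel count unchanged.
```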
At Photokina, the Fuji marketing guy held a press presentation in the presence of the engineers. He claimed there was no aliasing. I later asked an engineer, "And what if we have a signal at 2x the Nyquist limit?" The engineer replied, "That frequency is suppressed by the lens" :)
No, sampling density is not the issue since the sampling density for Red and Blue is equal. However, sampling position is different and that is why they alias at different positions. That causes false color.
Yes, AA filterless sensors are known to produce moire and various other artifacts.
The feathers of some bird species are especially likely to produce these artifacts even when a camera with an AA filter is used; for example the genus Callipepla, i.e. California Quail and Gambel's Quail.
What software did you use for processing Erik?
Therefore, only different aliasing amounts can cause these false color issues,
Luminance aliasing cannot always be removed easily, because it involves multiple different spatial frequencies: content from beyond Nyquist is folded back and shows up as multiple lower-spatial-frequency aliases. It requires elaborate reconstruction, by ...
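The "multiple lower spatial frequency aliases" point is easy to tabulate. A small helper (illustrative, 1-D, made-up frequencies) maps input frequencies to their folded-back observed frequencies — note that distinct inputs can land on the same alias, which is part of why undoing it is ill-posed:

```python
def alias_freq(f, fs):
    """Frequency observed after sampling a tone f at rate fs (fold-back)."""
    f = f % fs
    return min(f, fs - f)

fs = 100.0
for f in (60.0, 75.0, 130.0, 170.0):
    print(f, "->", alias_freq(f, fs))
# 60 -> 40, 75 -> 25, 130 -> 30, 170 -> 30
```

Several scattered input frequencies end up as several scattered lower ones, and 130 and 170 both fold to the same 30 — there is no single blur radius that targets them all.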
Hi,
LR 5, but I also made some quick and dirty checks with Capture One and RawDeveloper. The results are very similar. The amount of color artifacts differs, but the monochrome aliasing, which is what I was looking at, is similar.
I did not really try to suppress color moiré. Regarding color artifacts on the LensAlign target, there were a lot with both LR 5 and C1. Raw Developer was not immune either. I decided to skip Capture One Pro; I own it, but we are not friends. I use it for testing, but not for creative work.
Best regards
Erik
Aliased information cannot be distinguished from real information because both are recorded together in the same sensel position. Therefore, only different aliasing amounts can cause these false color issues, and all that the demosaicing algorithms can do is iterative reduction of the local color differences where RB and G channel luminances significantly differ.
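A toy 1-D sketch of that iterative color-difference reduction idea (not any particular raw converter's algorithm): smooth the R−G difference rather than R itself, so luminance detail in G survives while an isolated false-color spike is spread out and attenuated.

```python
import numpy as np

rng = np.random.default_rng(0)

g = rng.uniform(0.4, 0.6, 100)   # luminance-like green channel
r = g.copy()
r[50] += 0.3                     # an isolated R-G mismatch: false color

# Work on the color difference, not on R itself, so luminance detail in G
# is untouched; repeatedly average the difference locally.
diff = r - g
for _ in range(20):
    diff = np.convolve(diff, np.ones(3) / 3, mode="same")

r_fixed = g + diff
print(abs(r[50] - g[50]) > 0.29)        # True: visible false-color spike
print(abs(r_fixed[50] - g[50]) < 0.05)  # True: spike strongly attenuated
```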
Exactly, and different aliasing amounts are caused by different position, not by different density, since there is no difference in density.
Was that Capture One 6 or 7?
If Erik doesn't mind, I think this exchange about the root cause of visible aliasing artifacts may add some insight, therefore I've prepared some examples (attached).
I wouldn't dream of contradicting your expertise in these matters if I didn't think it worthwhile for our readers, so, consider this the devil's advocate response to improve our collective understanding:
Your usual precision in these matters is not making much sense to me at the moment. What's inside the yellow circle tells me precious little about aliasing. How do you propose that false color results from what happens inside the yellow circle? What happens inside the yellow circle shows perfect anti-aliasing blur...
ALL samples are an alias. False color results from what happens directly outside the yellow circle, where the aliasing is quite obvious but very different for each channel. The difference is due to position, not density. Of course, we can go all red-herring about how the "amplitude" of the aliasing is the same for all samples, but that wasn't contested to begin with. The aliasing characteristic is the same for each sensel; the aliasing effect of sampling at disjunct positions is not.
LR5 | Capture One | Raw Developer |
(http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/20130714-CF043488.jpg) | (http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/20130714-CF043488_C1.jpg) | (http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/20130714-CF043488_RD.jpg) |
No, aliasing can always be removed very easily by (excessive) blurring.
The issue is that aliased signals or artifacts can be at a lower frequency than Nyquist (in theory they can be at any frequency), so how much are you going to blur? IMHO it is better to have a low-pass filter before sampling (not necessarily an AA layer on top of the sensor).
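Francisco's point — filter before sampling, not after — in a numpy sketch (an ideal FFT brick-wall stands in for the optical low-pass filter; the numbers are made up):

```python
import numpy as np

fs = 8.0                          # sensel pitch -> Nyquist 4 cycles/unit
x_fine = np.arange(0, 4, 1 / 64)  # "analog" scene, finely resolved
scene = np.sin(2 * np.pi * 6 * x_fine)  # 6 cycles/unit: beyond Nyquist

# Sampling without a pre-filter: the 6-cycle tone folds to 2 cycles,
# arriving at full amplitude in the samples.
samples = scene[:: int(64 / fs)]

# Pre-filtering (an ideal FFT brick-wall at Nyquist, standing in for an
# optical low-pass / AA filter) removes the offending tone *before*
# sampling, so there is nothing left to fold back.
spec = np.fft.rfft(scene)
freqs = np.fft.rfftfreq(scene.size, d=1 / 64)
spec[freqs > fs / 2] = 0
filtered = np.fft.irfft(spec, n=scene.size)
clean_samples = filtered[:: int(64 / fs)]

print(np.max(np.abs(samples)) > 0.9)         # True: alias at full amplitude
print(np.max(np.abs(clean_samples)) < 1e-6)  # True: nothing to alias
```

Blurring `samples` after the fact would only attenuate the already-folded 2-cycle pattern along with all the real detail around it.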
Much better, but even if a lot less evident, there are still artifacts or fake detail along the edges of the feather (this is splitting hairs, I know). It would require another 5-10 minutes to restore an artifact-free capture. Okay... if it's a really important image it may take half an hour or maybe even an hour (since it's "important"... who cares...).
But above all, I am all for solutions to the issues caused by said limitations... which is why I've posted the "corrected" feather...
Hi Francisco,
That's indeed what Erik's file demonstrates; nothing new under the sun. The image data was already aliased when it was recorded, because the analog input signal was not 'properly' low-pass filtered.
To avoid such issues one can attempt to shoot additional images, e.g. with a much narrower aperture to create diffraction blur, or a bit of defocus, and then make a composite in postprocessing. That of course works best with stationary subjects. Shooting at a larger magnification factor or at an angle may also work.
Blurring the aliased image data will also destroy other useful detail, unless one uses elaborate processing tricks.
Cheers,
Bart
Hi,
Just to make a point: I think the image was shot at f/5.6, but I did reshoot it at f/11 and most of the aliasing was still there.
I shot the subject with both the 80/2.8 and 150/4 lenses, and both showed obvious aliases. Both images were shot at 3.5 m, so the aliasing is much less sensitive to distance than I would have thought.
P45+ | A99 | A77 |
(http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/0716/20130716-CF043522PhaseOne_A_SP45+_small.jpg) | (http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/0716/20130716-_DSC2300SONYSLT-A99V_small.jpg) | (http://echophoto.dnsalias.net/ekr/Articles/MFDJourney/FakeDetail/0716/20130716-_DSC5172SONYSLT-A77V.jpg) |