Hi Oscar,
If Erik doesn't mind, I think this exchange about the root cause of visible aliasing artifacts may add some insight, so I've prepared some examples (attached).
Aliased information cannot be distinguished from real information, because both are recorded together at the same sensel position. Therefore, only differing amounts of aliasing between channels can cause these false-color issues, and all a demosaicing algorithm can do is iteratively reduce the local color differences where the R/B and G channel luminances differ significantly.
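To illustrate the kind of iterative color-difference reduction I mean (a minimal sketch of the general technique, not any specific Raw converter's algorithm), one common approach is to repeatedly median-filter the R−G and B−G difference planes of an already-demosaiced image:

```python
import numpy as np

def median3x3(plane):
    """3x3 median of a 2D array, with replicated edges."""
    p = np.pad(plane, 1, mode='edge')
    h, w = plane.shape
    shifts = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(shifts), axis=0)

def suppress_false_color(rgb, iterations=3):
    """Iteratively median-filter the R-G and B-G difference planes.

    Smoothing the color *differences* (rather than the channels
    themselves) preserves luminance detail while pulling isolated,
    aliased chroma spikes toward their local neighborhood.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    for _ in range(iterations):
        r = g + median3x3(r - g)
        b = g + median3x3(b - g)
    return np.stack([r, g, b], axis=-1)
```

A single-pixel false-color spike (R differing from G in one sensel) is pulled back to its neighbors' color difference after one pass, while any detail common to all channels is left untouched.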
Exactly, and those different aliasing amounts are caused by a difference in position, not by a difference in density, since between the two Green channels there is no difference in density.
If position changed the amplitude (not just the shape of the aliasing, which does change because we sample a different position in object space), then it should be reflected in a difference between the two Green-filtered sensel sets, which are only 1.4 diagonal sensel-pitch widths apart.
Have a look at the first attachment. I've taken a crop from the Raw sensel data, without demosaicing, and extracted the two Green channel data sets separately, removing the Red, Blue, and other-Green sensel data in each case. Each image is thus essentially an almost point-sampled rendition of a single CFA color channel, with the other channels eliminated.
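For anyone who wants to reproduce that separation, here is a minimal sketch (assuming an RGGB mosaic layout; the actual CFA order and origin vary by camera, so check your raw converter's pattern description):

```python
import numpy as np

def split_green_planes(raw):
    """Separate the two Green sensel populations of an RGGB Bayer mosaic.

    Assumed RGGB layout per 2x2 tile:   R  G1
                                        G2 B
    G1 sits at (even row, odd col) and G2 at (odd row, even col);
    diagonally, the two Greens are sqrt(2) ~ 1.4 sensel pitches apart.
    """
    g1 = raw[0::2, 1::2]  # G1 sensels only
    g2 = raw[1::2, 0::2]  # G2 sensels only
    return g1, g2
```

Each returned plane is a half-resolution grid of one Green population, with every other sensel value discarded rather than interpolated.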
The amplitude of the Green channel aliasing shown inside the yellow Nyquist-limit circle is basically identical for both Green sets. There is also a difference in position, caused by sampling a different position in object space; maybe that is what you were thinking of. But that Phase shift will be dealt with by the signal-reconstruction process, a.k.a. demosaicing. Likewise, the seemingly aliased region outside the Nyquist-limit circle will be reconstructed into a smoother approximation of the non-discrete original scene content, so the choice of Raw converter also plays a role in all this.
Now compare that with the Red channel aliasing amplitude inside the yellow circle of the second attachment. The combination of a lower sampling density (a 2.8x sensel-pitch sampling distance) and the diffraction of this f/5.6 shot significantly reduced the amplitude of the Red channel aliasing. The Phase shift again differs from all the other B/G1/G2 sampling positions, so the non-aliased signal outside the yellow circle also aligns differently, but that reconstruction is the task of the demosaicing algorithm. The Blue channel, while sampled with the same large sampling distance as the Red channel, is much less affected by diffraction and therefore shows more aliasing amplitude inside the yellow circle.
As is hopefully clear, the amplitudes of aliasing (inside the yellow circles) differ quite a bit. That their Phase also differs is because they sample different parts of the scene, with a one-sensel offset. That Phase difference will be used to reconstruct the luminance portion of the image and, as shown earlier, quite successfully, because some 93.6% of full luminance resolution can be restored. The Chroma portion of the image, which in normal (not stress-test test-chart) images has a much lower spatial frequency in the actual scene, can therefore be reconstructed with relatively good fidelity to our eyes, except where the aliasing differences throw a spanner in the reconstruction works.
So to summarize: the majority of the effect of sensel position comes from sampling different scene content, which will be reconstructed as different luminance detail. The aliasing amplitude differences are what cause the reconstruction artifacts, because they are combined with genuinely phase-shifted luminance detail and cannot be separated afterwards. After all, one cannot unscramble an omelet.
Hope that helps to clarify some of the causality.
Cheers,
Bart