Graeme suggests that the best solution is oversampling, that is, more pixels. The industry is clearly moving in that direction, possibly for the wrong reasons, namely that pixels sell. Anyway, in my eyes there seems to be agreement on this forum that aliasing phenomena are bad and cannot really be handled in software. So there still seems to be a good reason for using those filters.
To my understanding, based on this discussion, color moiré can be reduced in postprocessing, whereas the removal of BW moiré and other aliasing artifacts is not really feasible in software. On the "positive" side, aliasing effects can give spurious resolution, giving the impression of more detail than is actually resolved.
The discussion did not really answer my other question: why don't MFDBs have an AA-filter, with the sole exception of the Mamiya ZD, which had it as an option? One suggestion was that a "raw" workflow is the rule on MFDBs, while on DSLRs JPEG is at least an option.
My personal view would be that:
- The AA filters are there for a good reason
- AA-filters lose some acutance (by smearing an image point over multiple pixels), but this can partly be recovered using sharpening
- The softening by the AA-filter is just an addition to other softening effects, like residual aberrations in the lens and focusing errors
I have a small comment regarding resolution. If we have a very good lens which resolves much more finely than the sensor, resolution is dominated by the sensor. A well-designed AA-filter would not reduce resolution, as that is set by the pixel pitch (or the Nyquist limit), but it would reduce contrast by "spilling" a controlled amount of light into the surrounding pixels. So contrast (or MTF) would be reduced, not resolution. If resolution is not dominated by the sensor, the situation is much more complicated. This explanation may be simplistic and possibly not even correct.
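To put a rough number on the "reduce contrast, not resolution" idea, here is a minimal sketch (my own toy model, not any manufacturer's actual filter design): treat a one-axis AA filter as a beam splitter that spreads each image point equally over two adjacent pixels, i.e. a kernel of [0.5, 0.5], and evaluate its MTF at a few spatial frequencies.

```python
import numpy as np

# Hypothetical one-axis AA filter: each image point is split equally
# over two adjacent pixels, giving the convolution kernel [0.5, 0.5].
kernel = np.array([0.5, 0.5])

def mtf(f):
    """MTF = magnitude of the kernel's frequency response; f is the
    spatial frequency as a fraction of the sampling frequency."""
    return abs(np.sum(kernel * np.exp(-2j * np.pi * f * np.arange(kernel.size))))

for f in (0.0, 0.25, 0.5):   # f = 0.5 is the Nyquist limit
    print(f, round(mtf(f), 3))
# -> 0.0 1.0
#    0.25 0.707
#    0.5 0.0
```

Frequencies below Nyquist still get through (at reduced contrast), while contrast at the Nyquist limit itself drops to zero, which is roughly what a well-designed AA filter is supposed to do.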
Do you have a reference for that? I'd be interested.
There are two common uses of the term "aliasing". One is the stairstepping of diagonal lines, also called "jaggies". The other, which is what I believe Graeme is referring to, is the shifting of the spatial frequency of a signal by a multiple of the sampling frequency k_max, due to the fact that sampling cannot distinguish a signal of frequency k from one of frequency k-k_max when k > k_max. For some pretty pictures, see
I would think the main reason for the AA filter is to combat moiré and color aliasing. Though the latter can be mitigated by post-processing techniques, the former is quite nasty and nearly impossible to back out, since the information that would allow one to distinguish that an oscillation of luminance at frequency k-k_max really should have been at frequency k is irretrievably lost.
On the other hand, there is substantial room for improvement in demosaicing algorithms. Many of the ones I see break down badly on texture near the Nyquist frequency, and if they don't produce moiré as a result, they introduce maze artifacts and other structures that are just as bad. AA filters help here, but I think there is progress to be had on the processing side.
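To see why a Bayer sensor generates false color on detail near the pixel-level Nyquist limit, here is a toy sketch (my own illustration, not any real demosaicer): a grayscale grating whose columns alternate at full pixel frequency, sampled through an RGGB mosaic. Each color plane sees only every other pixel, so the stripes alias down to DC, and the planes disagree with each other.

```python
import numpy as np

# Luminance grating at the pixel-level Nyquist limit: columns alternate 1, 0.
scene = np.tile([1.0, 0.0], (8, 4))   # 8x8 grayscale image

# RGGB Bayer sampling: each channel sees only a subset of the pixels.
r  = scene[0::2, 0::2]   # red:   even rows, even cols -> all 1.0
g1 = scene[0::2, 1::2]   # green: even rows, odd cols  -> all 0.0
g2 = scene[1::2, 0::2]   # green: odd rows, even cols  -> all 1.0
b  = scene[1::2, 1::2]   # blue:  odd rows, odd cols   -> all 0.0

# Each subsampled plane is constant -- the stripe pattern has aliased to DC --
# and the planes disagree (red sees 1.0, blue sees 0.0), so any demosaicing
# step will paint uniform false color where the scene had neutral stripes.
print(r.mean(), (g1.mean() + g2.mean()) / 2, b.mean())  # -> 1.0 0.5 0.0
```

This is the "irretrievably lost" case from the previous paragraph: once each plane has aliased to a constant, no amount of post-processing can reconstruct the original stripes from these samples alone.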