There have been many discussions advocating and rebutting the idea of removing the anti-aliasing (AA) filter. This post discusses the basis for AA filters in terms of digital signal processing. My goal is to have a courteous discussion about the design of digital cameras.

The first thing I would point out is that many devices we use today are based on digital signal processing: phones, TVs, CDs and other digital music formats. All of them rest on digital signal theory. The goal of digital signal processing is to represent a continuous signal with a finite number of samples. The basis for this is the Shannon sampling theorem, which guarantees that a continuous signal can be recovered exactly from its samples, provided the signal contains no frequencies at or above half the sampling rate (the Nyquist limit). When that condition is violated, higher frequencies masquerade as lower ones (this is aliasing), and no amount of processing afterward can undo it. For our discussion, the samples are the data in the raw image. A Bayer sensor adds many complications, including the fact that the R, G, and B sensor elements are not sampled at the same density.

Nevertheless, the point I want to make is that in 2010 nobody builds a sensor without a very good idea of how it will perform. It is possible, and desirable, to simulate the whole process, AA filter included, to predict the capability of the system. The endpoint of the simulation may be a modulation transfer function (MTF), or even a test image, and the effect of the different noise components can be included as well. I don't know whether Canon or Nikon does this sort of thing, but I would be shocked to find out that they don't; this is how instruments are designed today. If a better system could be built without an AA filter, Canon and Nikon would certainly build it.
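To make the aliasing point concrete, here is a small sketch of my own (nothing to do with any camera maker's actual tools), using NumPy in one dimension: a sinusoid sampled below the Nyquist limit is recovered at its true frequency, while one sampled above it folds down and shows up at a lower frequency, indistinguishable in the samples from a genuine low-frequency signal. The sampling rate and test frequencies are arbitrary choices for illustration.

```python
import numpy as np

fs = 100.0           # sampling rate in Hz; the Nyquist limit is fs / 2 = 50 Hz
n = 1000             # number of samples
t = np.arange(n) / fs

def dominant_freq(f_signal):
    """Sample a sinusoid of frequency f_signal at rate fs, then return
    the frequency (Hz) where the one-sided DFT magnitude peaks."""
    x = np.sin(2 * np.pi * f_signal * t)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.arange(n // 2 + 1) * fs / n   # DFT bin center frequencies
    return freqs[np.argmax(spectrum)]

# 30 Hz is below the 50 Hz Nyquist limit: recovered at roughly 30 Hz.
print(dominant_freq(30.0))
# 70 Hz is above the limit: it aliases down and also peaks near 30 Hz.
print(dominant_freq(70.0))
```

Once the 70 Hz signal has been sampled, its samples are identical (up to sign) to those of a 30 Hz signal, which is exactly why an optical AA filter removes the offending frequencies before the sensor samples them, not after.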