The fill factor (the fraction of a pixel's area that is light-sensitive) is, IQ-wise, one of the most important technical aspects of a sensor. The higher the fill factor, the more photons hit the (larger) light-sensitive surface, and the more photons it can collect before saturation sets in (important for dynamic range). Of course the analogue sensor signal then has to be amplified and converted into digital information, but what's lost in the sensor cannot be restored.
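A back-of-the-envelope sketch of that reasoning (all numbers below are made-up illustrative assumptions, not measured values for any real sensor): a larger photodiode holds more electrons before saturating, and dynamic range is roughly the ratio of that full-well capacity to the read-noise floor.

```python
import math

# Illustrative assumptions only, not data for any real sensor:
pixel_pitch_um = 6.8     # pixel pitch in micrometres
well_per_um2 = 1500.0    # assumed full-well electrons per square micrometre of photodiode
read_noise_e = 15.0      # assumed read noise in electrons (RMS)

def dynamic_range_stops(fill_factor):
    """DR in stops ~ log2(full-well capacity / read noise)."""
    photodiode_area = fill_factor * pixel_pitch_um ** 2
    full_well = well_per_um2 * photodiode_area
    return math.log2(full_well / read_noise_e)

# Higher fill factor -> larger well -> more headroom before saturation:
for ff in (0.3, 0.6, 0.9):
    print(f"fill factor {ff:.0%}: {dynamic_range_stops(ff):.1f} stops")
```

Note that under this simple model, doubling the fill factor buys exactly one extra stop of dynamic range, which is why the lost light-sensitive area cannot be recovered downstream.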
Full-frame CCDs need special manufacturing sites (so they're expensive), they're slow, they consume a lot of energy and they're dumb (just a large array of photodiodes). But they have one big advantage: fill factor. It's still not 100%, but close.
So they are able to gather more information and deliver a high-quality, still largely unprocessed (no filtering etc.) signal. That's why they are used in professional digital cameras.
CMOS-based systems have come a long way, but their fill factor is still quite low. In return they are capable of very sophisticated image processing to reduce noise (and they have to be, especially against certain kinds of pattern noise caused by the CMOS-specific amplification process). That's why high-ISO files from these cameras appear quite clean but also tend to look more artificial.
When comparing noise, we have to compare processed RAW files (because CMOS files are always filtered internally) with regard to noise AND detail. For a fair CMOS vs. CCD comparison we also have to match pixel pitch, microlens usage and the age of the architecture: the common 6.8µm CCDs are from 2004! We will have such a rare opportunity to compare "CMOS vs. CCD" again with the appearance of the S2. It has microlenses (the other 6µm CCD systems don't), it is comparable to the 20+MP DSLRs, and if the engineers haven't messed up the processing (conversion etc.), its noise/detail will look different but won't be worse than with CMOS-based systems.
CMOS is the future; the new EVIL systems need its capability to implement live view, even in professional camera systems. But the approach will be different: without AA filters, with internal processing reduced to a minimum and with similar HQ color filters as in current CCDs, their "look" won't be much different from that of current CCD-based systems - if there is a difference at all!
CMOS or CCD sensors in themselves don't have a "look"; they're just electronic devices that convert light into electrical information! The size of the sensels, the microlenses, the architecture etc. don't even affect sharpness/MTF, because these things all act on individual pixels (just as sharpness is no longer an issue with TFT displays, unlike CRTs). Only the AA filter affects "sharpness" on the sensor side (given the same pixel pitch/size).
AA filters are needed to reduce aliasing artifacts straight out of the sensor and have nothing to do with the choice of CCD vs. CMOS. That's what the Japanese DSLRs are designed for: professional press photography at the upper end and amateurs at the lower end. Both need their images fast, without post-processing, even if they have to compromise IQ. MFDB files, by contrast, are processed carefully in post.
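Aliasing is purely a sampling effect, independent of sensor type. A toy numeric sketch (frequencies and pixel counts are arbitrary illustrative choices): a pattern finer than the Nyquist limit of the pixel grid is sampled into values indistinguishable from a coarser false pattern, which is exactly what the AA filter suppresses by blurring such detail before it reaches the pixels.

```python
import math

pitch = 1.0            # pixel pitch (arbitrary units)
nyquist = 0.5 / pitch  # highest spatial frequency the grid can represent

signal_freq = 0.8      # a pattern finer than Nyquist (0.8 > 0.5 cycles/pixel)
samples = [math.cos(2 * math.pi * signal_freq * x * pitch) for x in range(16)]

# The frequency folds around the sampling rate (1 cycle/pixel) down to 0.2:
alias_freq = abs(signal_freq - 1.0 / pitch)
alias = [math.cos(2 * math.pi * alias_freq * x * pitch) for x in range(16)]

# The sampled fine pattern is identical to the coarse false pattern:
assert all(abs(a - b) < 1e-9 for a, b in zip(samples, alias))
```

Once the fine 0.8-cycle pattern has been sampled, no downstream processing can tell it apart from a genuine 0.2-cycle pattern, which is why the filtering has to happen optically, before the sensor.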
CCDs can be used without microlenses, which makes them more tolerant of oblique light rays and avoids aberrations - important for technical cameras. But you lose about one stop of sensitivity, because microlenses compensate for the low fill factor by focusing the light onto the light-sensitive areas of the sensor. That's why microlenses are even more important in CMOS designs: their loss in sensitivity without microlenses would be much bigger.
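The "about one stop" figure is just log₂ of the fraction of light actually collected. A minimal sketch (the fill-factor values are made-up assumptions chosen to illustrate the argument, not specs of real sensors):

```python
import math

def stops_lost(collected_fraction):
    """Stops of sensitivity lost vs. capturing all light falling on the pixel."""
    return -math.log2(collected_fraction)

# Illustrative assumptions: without microlenses, only the photodiode's
# share of the pixel (the fill factor) collects light.
ccd_fill = 0.50   # assumed CCD fill factor
cmos_fill = 0.25  # assumed (lower) CMOS fill factor

print(f"CCD  without microlenses: {stops_lost(ccd_fill):.1f} stops lost")   # ~1 stop
print(f"CMOS without microlenses: {stops_lost(cmos_fill):.1f} stops lost")  # ~2 stops
```

With these assumed numbers, dropping the microlenses costs a CCD about one stop, but costs the lower-fill-factor CMOS design twice as much, which is why microlenses are effectively mandatory there.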