Hi,
From what I read, my understanding is that:
1) Readout is different. A CCD shifts the charge from pixel to pixel to a common output, while CMOS reads the voltage from each cell directly. This makes per-pixel correlated double sampling (CDS) possible, a technique used by most CMOS vendors and one of the reasons CMOS has lower readout noise. (It essentially means that the voltage is measured both after sensor reset and after exposure, so the noise from the reset can be subtracted out; see the sketch after this list.)
2) CMOS needs more on-chip electronics. Each sensel contains several transistors, so the chip "estate" is split between sensels and readout electronics.
3) CMOS can integrate more electronics on chip. Sony's Exmor, for instance, does analog-to-digital conversion on chip using thousands of column-parallel ADCs, while a CCD needs off-chip preamps and an ADC.
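To make the CDS idea in point 1 concrete, here is a minimal numerical sketch (all noise figures are made up for illustration): the reset (kTC) noise is frozen on the pixel after reset, so the same random offset appears in both samples and cancels in the difference, leaving only the amplifier noise.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels = 100_000
signal = 500.0       # true signal per pixel, in electrons

# Hypothetical noise figures, chosen only to illustrate the principle:
reset_noise = 10.0   # kTC reset noise, electrons RMS (frozen after reset)
amp_noise = 2.0      # amplifier noise per sample, electrons RMS

# The reset offset is identical in both samples of a given pixel.
reset_offset = rng.normal(0.0, reset_noise, n_pixels)

sample_after_reset = reset_offset + rng.normal(0.0, amp_noise, n_pixels)
sample_after_exposure = signal + reset_offset + rng.normal(0.0, amp_noise, n_pixels)

single = sample_after_exposure                     # single sampling keeps reset noise
cds = sample_after_exposure - sample_after_reset   # CDS subtracts it out

print(f"single-sample noise: {single.std():5.2f} e-")  # ~sqrt(10^2 + 2^2) = 10.2
print(f"CDS noise:           {cds.std():5.2f} e-")     # ~sqrt(2) * 2.0  =  2.8
```

Note that CDS trades the large reset noise for a sqrt(2) increase of the small amplifier noise, which is a good deal.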
To sum up, CCD has a small advantage in that it utilizes the chip area better for collecting electrons (each electron corresponding to a detected photon). That advantage is quite small: microlenses on CMOS improve the fill factor, but they do not help full-well capacity. CMOS, on the other hand, can have much lower readout noise and also much faster readout.
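As a rough illustration of the readout-speed point: a CCD clocks every pixel serially through one output amplifier and ADC, while a column-parallel CMOS design (like Exmor) converts a whole row at once. The clock rates below are assumptions for the sake of the arithmetic, not datasheet values.

```python
# Back-of-the-envelope frame readout times (all rates are assumptions).
cols, rows = 6000, 4000        # assumed 24 MP sensor geometry
ccd_pixel_rate = 25e6          # Hz, single serial output chain
cmos_row_rate = 120e3          # rows/s, one ADC per column working in parallel

t_ccd = cols * rows / ccd_pixel_rate   # every pixel through one ADC
t_cmos = rows / cmos_row_rate          # one row per conversion cycle

print(f"CCD (serial readout):   {t_ccd * 1e3:6.1f} ms per frame")   # ~960 ms
print(f"CMOS (column-parallel): {t_cmos * 1e3:6.1f} ms per frame")  # ~ 33 ms
```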
It seems that readout noise on a CCD is about a dozen electrons, while CMOS can reach 2-3 electrons. As the full-well capacity is only reduced by about 30% (or so?), CMOS ends up with an advantage in dynamic range (which is FWC divided by readout noise). CCD may keep a very small advantage in shot noise, as its chip area is better utilized.
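Plugging the numbers above into DR = FWC / readout noise shows why the smaller well does not matter much (the FWC values are illustrative assumptions, not measured specs):

```python
import math

def dynamic_range(fwc_e, read_noise_e):
    """DR = full-well capacity / readout noise, in stops and dB."""
    ratio = fwc_e / read_noise_e
    return math.log2(ratio), 20 * math.log10(ratio)

# Assumed wells: CCD 60k e-, CMOS 30% smaller; noise figures from the text.
for name, fwc, noise in [("CCD", 60_000, 12.0), ("CMOS", 42_000, 2.5)]:
    stops, db = dynamic_range(fwc, noise)
    print(f"{name}: {stops:4.1f} stops ({db:5.1f} dB)")
# CCD:  12.3 stops ( 74.0 dB)
# CMOS: 14.0 stops ( 84.5 dB)
```

So even with a 30% smaller well, the much lower readout noise gives the CMOS sensor roughly 1.7 stops more dynamic range in this example.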
The major difference may be that CCD is the older technology. CMOS seems to be preferred, at least for cameras, but no one makes CMOS sensors for cameras in large sizes. CMOS is said to be a more complex technology than CCD, and it may be that fabricating large CMOS sensors in small series would not be profitable.
Best regards
Erik
Never understood very well whether there are significant differences (pros/cons) between the two sensor types, or is it in the end not relevant?