Bernard, before producing an $8000 camera, with the huge investment in research and plant tool-up implied therein, I would be very surprised if Canon had NOT "done their homework". I mean, this isn't exactly a fly-by-night start-up company - they've been in the camera business for a while, don't ya think?
They have been in the business long enough to know where to stop their effort and still make money. But where they decide to stop, based on business considerations, doesn't necessarily meet my expectations as a potential customer.
OK, we agree on that (even before you answer!), so what's the deal? I'd frankly be concerned about buying 30MP in a 35mm frame - that sounds like an awful lot of photosites to cram into a small space. Maybe the bleeding edge of the technology as it stands today simply says that the trade-off between resolution and pixel quality (noise, etc.) is optimized at 21MP for a 24x36 sensor, full stop.
Well, I would not be concerned if it were agreed that the 1dsIII is a camera targeting maximum image quality with a reduced scope of usage, namely focusing on low/medium ISO.
I am 100% sure that it is possible to design, produce and sell TODAY a camera with great 100 ISO image quality at 30MP (and acceptable 400 ISO) on a FF sensor. Extrapolating the resolution of a D2x to FF gives about 28MP, and the per-pixel image quality of the D2x at low ISO is still among the best even though it is 3 years old (OK, DR could be better).
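Just to put numbers on that extrapolation - a quick back-of-envelope sketch, using the published D2x sensor specs and assuming the pixel pitch stays constant when scaled up to 24x36:

```python
# Scale the Nikon D2x pixel count to a full-frame sensor area,
# keeping the same pixel pitch (density).
d2x_mp = 12.4            # D2x resolution in megapixels
d2x_area = 23.7 * 15.7   # D2x APS-C sensor area, mm^2
ff_area = 36.0 * 24.0    # full-frame (24x36) sensor area, mm^2

ff_mp = d2x_mp * ff_area / d2x_area
print(round(ff_mp, 1))   # roughly 28.8 MP at the same pixel pitch
```

That lands right around the ~28MP figure above.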
I don't understand why Canon is trying to make the 1ds3 a jack-of-all-trades tool with very good high ISO capability. IMHO they are missing the target segment for the 1ds3. The 1dIII is there for those applications where users agree up front to compromise on image quality by using high ISO.
Now, the truth is probably that:
1. They don't have the lens technology it takes to produce wide angles that would benefit from 30MP,
2. They decided to settle at 21MP because it will enable them to sell us in 3 years the 30MP camera they could have produced today.
Your story about the impact of bit depth is really an interesting one, but I'm not impressed by the 60 layers. In principle, more bit depth MUST translate into smoother tonal gradations because there are more possible shades of grey per pixel, right? So the added tonal depth will be there in the image file. The question then becomes whether our monitors and printers are blind to the difference between 12 and 14 bits. You know the chain and the weakest-link business. But who knows - monitors and printers two years from now may be able to render yet finer subtleties that escape us today. Then we can begin to ask at what point perfection is perfect...
I am not saying that high bit depth isn't good, just that so far I haven't seen any evidence that Canon's implementation of 14 bits has any actual real-world value. Until this is proven, 14 bits will remain a marketing tool in my eyes.
Will a display/output technology released in 2 years be able to tap into this if we cannot see a hint of value today? Maybe, but I am not sold.
The value of 16 bits on a MFDB can already be seen easily with today's technology, so why would it be different with Canon's 14 bits?