It's easy to think it's just a matter of bits, but consider an analogy: what if I put the fastest Intel Core Duo in your PC, then bottleneck it with a minimal cache, a slow cache clock, slow (or too little) memory, a weak video card, 5400 RPM hard drives, etc.?
Adding bit depth goes further than just the A/D component. There has to be a pipe that can move that data off the sensor quickly enough (in this case the Canon has 8 channels), and the sensor has to be reset and prepared for the next image capture. As the data comes off the sensor, it has to be prepped for conversion (various stages of amplification and filtering). Once converted, the data has to be processed (in-camera tone curve, noise reduction, WB, etc.) and written into the RAW file (larger files mean more CPU power). Finally, the image is moved to the buffer and then stored to CF.
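To get a feel for the pipeline load, here's a rough back-of-envelope sketch. The sensor resolution and burst rate are my own assumed numbers for illustration, not Canon's specs; only the 8-channel readout comes from the discussion above:

```python
MEGAPIXELS = 10          # assumed sensor resolution (illustrative)
CHANNELS = 8             # parallel readout channels, as noted for the Canon
FPS = 10                 # assumed burst rate (illustrative)

def per_frame_mb(bits_per_sample: float, megapixels: float = MEGAPIXELS) -> float:
    """Uncompressed sensor data per frame, in megabytes."""
    return megapixels * 1e6 * bits_per_sample / 8 / 1e6

for bits in (12, 14):
    frame = per_frame_mb(bits)
    per_channel = frame * FPS / CHANNELS
    print(f"{bits}-bit: {frame:.1f} MB/frame, "
          f"{per_channel:.1f} MB/s per channel at {FPS} fps")
```

So going from 12 to 14 bits adds roughly 17% more raw data per frame, and every stage downstream of the sensor (readout channels, A/D, image processor, buffer, CF write) has to absorb that extra throughput to keep the same frame rate.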
The CPUs have to run cool (heat is a bad thing), and they can't drain too much battery power. The quality of the A/D converter is very important too. More data means higher-grade components as well, because the CPUs and VLSI engines have to run at higher clock speeds.
I doubt 14 bits will hurt Canon's image quality. The improvement may not be radical, but it should help. I looked at the 1D3 samples and they look so-so; I really think my 1Ds2 matches or exceeds that image quality. The 1Ds2 has far greater resolution - those 10 MP files felt downright puny!