Hi,
With modern sensors, the cost is that everything slows down by a factor of four: one frame per second instead of four frames per second. How hard is that to get?
The pipeline is 16 bits wide. Processors are normally 8, 16, or 32 bits wide; no one builds a 12- or 14-bit processor today. That said, it may be that some parts of the ASICs are optimized for 14 bits, but digital technology normally works in multiples of 8.
Modern CMOS sensors use column-wise conversion, and the converters are probably ramp-type converters: a reference voltage is ramped up until it matches the voltage from the pixel, and the value is simply the number of clock cycles until the match is reached.
If you want to measure with 16-bit rather than 14-bit precision, you need a ramp step one quarter the size, which increases conversion time by a factor of four. So, you get a slow conversion.
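To illustrate the point, here is a toy model of a single-slope (ramp) converter. It is a sketch, not anything from a real sensor datasheet: it just counts clock ticks until the ramp crosses the pixel voltage, so worst-case conversion time is 2^bits clocks and 16 bits takes four times as long as 14 bits.

```python
def ramp_adc(pixel_voltage, full_scale, bits):
    """Toy single-slope ADC: return (output code, clocks used)."""
    steps = 2 ** bits
    step_size = full_scale / steps
    ramp = 0.0
    for code in range(steps):
        ramp += step_size          # reference ramps up one step per clock
        if ramp >= pixel_voltage:  # comparator trips: stop counting
            return code, code + 1
    return steps - 1, steps        # pixel at (or above) full scale

_, t14 = ramp_adc(0.9, 1.0, 14)
_, t16 = ramp_adc(0.9, 1.0, 16)
print(t16 / t14)  # ≈ 4.0: the factor-of-four slowdown
```

Real column ADCs use tricks (coarse/fine ramps, higher clocks) to soften this, but the basic time-versus-precision tradeoff is as shown.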
On older systems, most CCDs and some CMOS sensors, the voltage from the pixel would go through a preamp to an off-sensor ADC. Those would be flash-type ADCs, so you could get a 16-bit device from Burr-Brown, and that would deliver 16-bit data. But the input to the converter would still be limited by pixel noise. So you have 72 dB of signal, corresponding to 12 bits, and feed it into a 16-bit converter: you get 16 bits of data with the low four bits representing noise.
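The dB-to-bits conversion above follows from each bit adding 20·log10(2) ≈ 6.02 dB of dynamic range, which is why 72 dB of real signal corresponds to about 12 usable bits even when the converter outputs 16:

```python
import math

def db_to_bits(db):
    """Convert dynamic range in dB to equivalent bits of precision."""
    return db / (20 * math.log10(2))  # one bit ≈ 6.02 dB

print(round(db_to_bits(72)))  # → 12
```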
If you did that in engineering school, it would be regarded as a serious error: if your input data has, say, two decimals, you should never present the result with more than two decimals.
So, the material cost of going to 16 bits is zero, but with modern sensors it would mean a significant performance loss, and the older sensors don't deliver 16 bits of accurate data anyway.
Best regards
Erik
Erik, I'm not really sure what you're responding to here. I agree that 16-bit recording is superfluous as yet. But the question I was responding to was "Is it difficult... or just expensive?"
So, cost aside, if you are going to record 16 bits (apart from the question of whether you need to), more-expensive tech can move the bits faster, but not without creating proportionately more heat. And in a small camera, heat is a very important limiting factor, because it's hard to get rid of and it affects both usability and image quality.
Red took a novel approach to this problem with their Dragon sensor by making its optimal operating temperature much higher than the norm. That helped them keep operating temps where they need to be, but the tradeoff is a very long startup time (20 minutes) before the camera reaches its proper operating temperature and optimal noise levels.