I recently read comments from Kodak suggesting that digital sensors have a quite low natural ISO (50-80?) and that most or all of the ISO adjustment on digital cameras is done digitally, after A/D conversion, with little or no amplification in the analog stage.
Digital amplification makes sense if the data is then going to be cut down to 8 bits, because it throws away useless "leading zero bits", making room for more of the somewhat useful less-significant bits. But it seems to make no sense if the file is going to be stored as RAW (or 16-bit TIFF): at worst, it risks blowing out highlights that the camera had recorded successfully if you overestimate the ISO needed.
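To make the tradeoff concrete, here is a small sketch with hypothetical numbers (it assumes a 12-bit A/D converter; the specific pixel values and the 2-stop gain are made up for illustration):

```python
# Why digital gain helps an 8-bit output but can only hurt full-depth RAW.
# Assumes a hypothetical 12-bit sensor/A-D; all values are illustrative.
FULL_SCALE = 4095          # 12-bit full scale

def to_8bit(value_12bit):
    """Truncate a 12-bit value to 8 bits (keep only the top 8 bits)."""
    return value_12bit >> 4

def digital_gain(value_12bit, gain):
    """Multiply after A/D conversion, clipping at full scale."""
    return min(value_12bit * gain, FULL_SCALE)

shadow = 40                # a dim pixel: only ~5 significant bits
highlight = 3000           # a bright pixel, well within range

# Without gain, truncation to 8 bits discards most shadow detail:
print(to_8bit(shadow))                    # 2  -> almost nothing left
# A 2-stop digital push (x4) before truncation keeps those low bits:
print(to_8bit(digital_gain(shadow, 4)))   # 10 -> detail survives
# But the same push clips a highlight the sensor recorded fine:
print(digital_gain(highlight, 4))         # 4095 -> blown out for good
```

In the 8-bit case the push rescues shadow bits that truncation would have thrown away; in the RAW case there was nothing to rescue, and the clipped highlight is lost permanently.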
Why not instead offer only a smaller range of ISO adjustments through gain settings for the analog amplifier section, and leave the rest to "digital push processing" in the RAW conversion or editing software?
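The "digital push in software" idea can be sketched the same way (again with made-up numbers, assuming normalized 12-bit linear RAW data): because the unclipped values are still in the file, the push amount remains adjustable after the fact.

```python
# Sketch of "digital push processing" deferred to RAW conversion.
# Hypothetical 12-bit linear RAW values, normalized to 0..1.
import numpy as np

raw = np.array([40, 500, 3000]) / 4095.0

def push(data, stops):
    """Apply exposure compensation in software; clip only for display."""
    return np.clip(data * 2.0 ** stops, 0.0, 1.0)

# A 2-stop push clips the bright pixel at this output setting,
# but the RAW data still holds it, so we can back off the push:
print(push(raw, 2)[2])     # 1.0 -> clipped at 2 stops
print(push(raw, 0.4)[2])   # ~0.97 -> highlight retained at 0.4 stops
```

Unlike an in-camera digital push, nothing here is irreversible: the clip happens only at output, and a different gain can always be chosen later.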
Do some DSLRs offer that more conservative option?
Can one find out the lowest ISO setting for each analog amplifier gain level and use only those settings, to avoid "in-camera digital pushing"?
P. S. [added] If I recall correctly, the Kodak 14n uses more than the usual amount of variation in analog amplification levels as one varies the ISO setting, which might contribute to its greater variation of noise level with changing ISO. [Did I phrase that diplomatically enough?]