That's a good point. I'd forgotten that the A900 doesn't have 14-bit encoding like some of its competitors. One might also wonder how a camera with 12-bit encoding can deliver greater DR than another camera, such as the 1Ds3, which has 14-bit encoding.
Nevertheless, the fact that 12.6 stops might be a bit of an exaggeration is not significant in practical terms, provided the DR of the other cameras with which the A900 is compared is exaggerated to the same degree.
Hi Ray, the problem is not that they give a 12-bit camera a higher DR than a 14-bit camera. In fact, that's perfectly possible since DR is not bit-depth-limited but noise-limited, and the 14-bit camera could have a worse SNR than the 12-bit one.
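Just to illustrate that point with invented numbers (not real data for either camera), the usual engineering estimate is DR ≈ log2(full-well capacity / read noise), which is about the sensor itself, before any quantisation by the ADC. A quieter 12-bit sensor can then out-range a noisier 14-bit one:

```python
import math

def sensor_dr_stops(full_well_e, read_noise_e):
    """Engineering dynamic range in stops: log2(saturation signal / noise floor)."""
    return math.log2(full_well_e / read_noise_e)

# Hypothetical sensors, numbers made up purely for illustration:
quiet_12bit_camera = sensor_dr_stops(full_well_e=60000, read_noise_e=5)   # ~13.6 stops
noisy_14bit_camera = sensor_dr_stops(full_well_e=60000, read_noise_e=12)  # ~12.3 stops
print(round(quiet_12bit_camera, 1), round(noisy_14bit_camera, 1))
```

Whether the RAW file can actually record all of that sensor DR is a separate question, which is the encoding limit below.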
The problem is that a 12-bit linear encoding can _NEVER_ encode (I am not talking about noise here, just about encoding) more than 12 f-stops of DR. That's why anyone looking at a table where a 12-bit camera is given more than 12 f-stops of DR should not think of any exaggeration, but simply of wrong information.
Ideally, the following levels would be encoded in a 12-bit RAW file if the whole range 0..4095 were used:
0EV: 2048 levels, 2048..4095
-1EV: 1024 levels, 1024..2047
-2EV: 512 levels, 512..1023
-3EV: 256 levels, 256..511
-4EV: 128 levels, 128..255
-5EV: 64 levels, 64..127
-6EV: 32 levels, 32..63
-7EV: 16 levels, 16..31
-8EV: 8 levels, 8..15
-9EV: 4 levels, 4..7
-10EV: 2 levels, 2..3
-11EV: 1 level, 1
There is no room for more than 12 f-stops, and in any case the lowest f-stops would be poorly represented (1 level, 2 levels, 4 levels, ...).
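If anyone wants to check the arithmetic, here is a minimal Python sketch that reproduces the table above; the 0EV point is simply the top stop of an ideal 12-bit ADC, not any particular camera's clipping point:

```python
# Count how many raw levels an ideal 12-bit linear encoding assigns to each stop.
BITS = 12
TOP = 2 ** BITS  # 4096 codes, values 0..4095

for stop in range(BITS):
    hi = TOP // (2 ** stop) - 1             # brightest level in this stop
    lo = max(TOP // (2 ** (stop + 1)), 1)   # darkest level in this stop (never below 1)
    n = hi - lo + 1
    label = "0EV" if stop == 0 else f"-{stop}EV"
    print(f"{label}: {n} level{'s' if n > 1 else ''}, {lo}..{hi}")

# After 12 iterations only level 0 remains, so a 13th stop has nothing left to encode.
```

So the file tops out at exactly 12 encodable stops, no matter how clean the sensor behind it is.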
BR