Canon DR usually goes up from ISO 100 to 200 in current models for a combination of two reasons: (1) read noise goes down by almost a factor of two; and (2) the sensor saturates well before the max raw level 16383 at ISO 100. This means that one loses less than a stop at the highlight end in going from ISO 100 to 200, while getting nearly a stop at the shadow end.
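The arithmetic behind this can be sketched numerically: compute the engineering DR (saturation over read noise, in stops) at each ISO. The ADU figures below are hypothetical, chosen only to illustrate the mechanism for a 12-bit Canon-like body; real values vary per camera:

```python
import math

def dr_stops(sat_adu, read_noise_adu):
    """Engineering dynamic range in stops: log2(saturation / read noise)."""
    return math.log2(sat_adu / read_noise_adu)

# Hypothetical ADU figures (illustration only):
# at ISO 100 the sensor saturates at 3692 ADU, below the 4095 raw clip;
# at ISO 200 the doubled gain reaches the 4095 raw clip first.
sat_100, read_100 = 3692, 2.5
sat_200, read_200 = 4095, 2.7   # output-referred read noise barely rises,
                                # i.e. input-referred noise nearly halves

highlight_loss = 1 - math.log2(sat_200 / sat_100)  # < 1 stop lost at the top
shadow_gain = math.log2(2 * read_100 / read_200)   # ~1 stop gained at the bottom

print(f"DR @ ISO100: {dr_stops(sat_100, read_100):.2f} stops")
print(f"DR @ ISO200: {dr_stops(sat_200, read_200):.2f} stops")
print(f"highlight loss: {highlight_loss:.2f} stops, shadow gain: {shadow_gain:.2f} stops")
```

With these made-up numbers the highlight loss is about 0.85 stops against a shadow gain of about 0.89 stops, so the net DR comes out slightly ahead at ISO 200, which is the pattern Emil describes.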
Emil, please correct me if I am wrong: what you mean by this is that:
1. These Canon cameras _almost_ double their signal-to-read-noise ratio when going from ISO100 to ISO200, which is equivalent to having very close curve plots at the low end (the curves never cross, however), and that amounts to admitting that almost all read noise is produced after the ISO amplification.
2. On the other hand, the difference in saturation point (lower at ISO100 than at ISO200) means we lose less DR at the highlight end than we gain at the shadow end when going from ISO100 to ISO200.
With * marking the saturation points for ISO100 and ISO200, this is what we are talking about:
1. If the saturation points were the same, we would _never_ get more DR at a higher ISO, no matter how much improvement in signal-to-read-noise ratio we could get from ISO100 to ISO200. The most we could expect in that case would be the same DR. Right?
2. If it is true that for a given camera a higher ISO provides more DR, then unless we need extra-long exposure times it would be completely stupid to use the lower ISO for any application (this is what Ray asked).
3. Why is there such a big difference in Canon saturation points? At ISO100 my cheap old 350D saturates at 4095, while the 5D saturates at 3692. I never understood this. It seems as if the output of the ISO amplifier gets clipped before entering the ADC, and even that clipping point depends on the ISO setting.
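On point 1 above: with an identical saturation point, doubling the gain can at best leave DR unchanged, because output-referred read noise can only stay the same (if all noise arises after the amp) or grow (if some arises before it). A toy sketch with made-up numbers:

```python
import math

def dr_stops(sat_adu, read_noise_adu):
    """Engineering dynamic range in stops: log2(saturation / read noise)."""
    return math.log2(sat_adu / read_noise_adu)

sat = 4095          # same hypothetical raw clip at both ISOs
read_100 = 2.5      # output-referred read noise at ISO 100, in ADU

# Doubling the gain doubles any pre-amp noise in ADU; post-amp noise is fixed.
# Sweep from "all noise post-amp" (best case) to "all noise pre-amp" (worst):
for read_200 in (2.5, 3.5, 5.0):
    print(f"ISO200 read noise {read_200} ADU -> DR {dr_stops(sat, read_200):.2f}"
          f" vs ISO100 DR {dr_stops(sat, read_100):.2f}")
# In every case DR(ISO200) <= DR(ISO100): equal saturation means no DR win.
```

So the DR advantage at ISO 200 exists only because the ISO 100 saturation point sits below the raw clip, exactly as in point 2.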