Hi all, I have a question for those brilliant minds around here.
We all know that:
- The higher the ISO for a given exposure, the lower the SNR.
- The lower the exposure for a given ISO, the lower the SNR as well.
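The second point follows directly from photon shot noise: in a shot-noise-limited model (read noise ignored for simplicity, which is an assumption, not a claim about any specific camera), the noise equals the square root of the collected signal, so SNR grows as the square root of exposure. A minimal sketch:

```python
import math

# Shot-noise-limited model (assumption: read noise ignored).
# Noise = sqrt(S) photoelectrons, so SNR = S / sqrt(S) = sqrt(S).
for electrons in (100, 400, 1600):
    snr = electrons / math.sqrt(electrons)
    print(f"{electrons} e-  ->  SNR {snr:.1f}")
```

Each 2-stop (4x) reduction in collected light halves the SNR, which is why underexposed shadows look so much noisier than midtones.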
But if we are forced to use fixed exposure parameters (aperture and shutter speed), what is the recommended way to obtain the best possible SNR (Signal-to-Noise Ratio), i.e. the least visible noise: to push the ISO up, or to accept some underexposure?
One could think both options would be the same in terms of visible noise, but this is not the case. I did a test shooting the same scene with the same exposure parameters, changing only the ISO, and the result was that the higher the ISO, the lower the visible noise, especially in the shadows.
Test: 3 shots at 70mm, 0.3s, f/8 using ISO 100, 200 and 400.
This is the scene:
And these are crops of normal, dark and very dark areas of the scene, from top to bottom (development done with no noise reduction in DCRAW, exposures balanced in post-processing):
I am sorry for the large file size caused by the noise. It is easy to see that the higher the ISO, the better the SNR we achieve.
The first conclusion I draw from this is that it is better to use high ISO values than to leave the shot underexposed. Also, exposing to the right makes sense not only at ISO 100 but at any ISO value, so ISO can be used to achieve ETTR when we are forced to use fixed exposure parameters (aperture and shutter speed).
I wonder whether the same holds for even higher ISO values, but I guess it does.
My question is: what explanation do you think this phenomenon has?
1. Does the camera apply some noise reduction during the signal amplification process when using high ISO (i.e. with large amplification values)?
2. Is there some noise source added after amplification, so that the higher the signal level is at that point (thanks to ISO gain), the less that 'last minute' noise source matters?
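Hypothesis 2 corresponds to the standard "downstream read noise" explanation for ISO-variant sensors: any noise added after the programmable gain stage (ADC quantisation, output-stage noise) is divided by the gain when referred back to the input, so its contribution shrinks as ISO rises. Here is a minimal Monte-Carlo sketch of that model; all sensor parameters are made-up illustration values, not measurements from any real camera:

```python
import numpy as np

rng = np.random.default_rng(0)

def snr_at_iso(signal_e, iso, read_e=3.0, downstream_adu=6.0, n=200_000):
    """Monte-Carlo SNR for a fixed exposure at a given ISO.

    Hypothetical sensor model (assumed values for illustration):
      signal_e       - mean photoelectrons captured (fixed by exposure)
      read_e         - read noise added BEFORE amplification (electrons)
      downstream_adu - noise added AFTER amplification (ADU), e.g. ADC noise
    """
    gain = iso / 100.0                        # assumed: ADU per electron scales with ISO
    photons = rng.poisson(signal_e, n)        # photon shot noise
    pre_amp = photons + rng.normal(0, read_e, n)
    adu = gain * pre_amp + rng.normal(0, downstream_adu, n)
    return gain * signal_e / adu.std()        # mean signal over total noise, in ADU

for iso in (100, 200, 400):
    print(f"ISO {iso}: SNR {snr_at_iso(50, iso):.2f}")
```

With these assumed numbers, the input-referred downstream noise falls from 6 e- at ISO 100 to 1.5 e- at ISO 400, so the simulated SNR improves with ISO exactly as in your test. Once the gain is large enough that shot noise and upstream read noise dominate, further ISO increases stop helping, which is the so-called ISO-invariant regime.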
Regards