I read this:
http://www.luminous-landscape.com/tutorials/understanding-series/u-raw-files.shtml

A 12-bit raw file:
- 1st f-stop (the brightest tones): 2048 levels available
- 2nd f-stop (bright tones): 1024 levels available
- 3rd f-stop (the mid-tones): 512 levels available
- 4th f-stop (dark tones): 256 levels available
- 5th f-stop (the darkest tones): 128 levels available

An 8-bit JPG file:
- 1st f-stop (the brightest tones): 69 levels available
- 2nd f-stop (bright tones): 50 levels available
- 3rd f-stop (the mid-tones): 37 levels available
- 4th f-stop (dark tones): 27 levels available
- 5th f-stop (the darkest tones): 20 levels available
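For reference, the raw column is just each stop halving the number of linear codes, and the JPG column looks like what a plain gamma-2.2 encode would give. Here is a small Python sketch of that guess; the gamma-2.2 assumption is mine, not something the tutorial states, and the last stop comes out a level off depending on rounding:

```python
# Sketch assuming the JPG numbers come from a pure gamma-2.2 encode
# (my guess; the tutorial does not say how it derived them).

GAMMA = 2.2

def jpg_levels_in_stop(stop, bits=8, gamma=GAMMA):
    """Count 8-bit output codes inside a given stop below clipping.

    stop=1 is the brightest stop (linear 0.5..1.0), stop=2 is 0.25..0.5, etc.
    """
    max_code = 2 ** bits - 1
    top = 2.0 ** (1 - stop)                        # linear value at top of stop
    hi = round(max_code * top ** (1 / gamma))
    lo = round(max_code * (top / 2) ** (1 / gamma))
    return hi - lo

# Linear 12-bit raw: each stop down simply halves the available codes.
print([2 ** (12 - s) for s in range(1, 6)])          # [2048, 1024, 512, 256, 128]

# Gamma-encoded 8-bit JPG: codes spread far more evenly across the stops.
print([jpg_levels_in_stop(s) for s in range(1, 6)])  # [69, 50, 37, 27, 19]
```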
My interpretation is that the brightest 2048 raw tones are mapped to the brightest 69 jpg tones: 2048/69 ≈ 30 raw tones per jpg tone.
If the exposure slider moves to -1, I assume that means -1 EV, so do the top two jpg stops (69 + 50 = 119 tones) now contain the top 2048 raw tones?
Is the brightest jpg tone an average of the brightest ~30 raw tones at 0 EV? And at -1 EV, is it the average of the brightest ~15 raw tones?
When you move the exposure slider to the left, what is the raw converter doing to generate more jpg tones?
Is it spreading those top ~30 raw tones across more than just the brightest jpg tones logarithmically, linearly, or in some other way that I can't fathom? On top of this, the tone curve can be manipulated in ACR, which means there is an additional transformation being applied.
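To make my question concrete, here is a toy model of what I imagine the converter might be doing. This is entirely my assumption: a linear multiply for the exposure slider, applied before a plain gamma-2.2 curve; ACR's real tone curve is proprietary and does more than this.

```python
import numpy as np

GAMMA = 2.2  # stand-in for the real output curve (assumption)

def develop(raw_linear, ev=0.0):
    """Toy pipeline: exposure multiply, clip, gamma-encode to 8-bit codes."""
    exposed = np.clip(raw_linear * 2.0 ** ev, 0.0, 1.0)   # exposure slider
    return np.round(255.0 * exposed ** (1.0 / GAMMA)).astype(np.uint8)

# Brightest raw stop: linear 0.5..1.0, about 2048 levels in a 12-bit file.
top_stop = np.linspace(0.5, 1.0, 2048)

for ev in (0.0, -1.0):
    codes = np.unique(develop(top_stop, ev))
    print(f"EV {ev:+.0f}: top raw stop -> jpg codes {codes.min()}..{codes.max()}"
          f" ({codes.size} distinct levels)")
```

In this toy model the top raw stop doesn't pick up more jpg levels at -1 EV; it slides down the curve into codes 136..186 (about 50 levels), and anything beyond that would have to come from the tone curve applied afterwards. Is that roughly what real converters do?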
In case anyone claims that I should do the work and figure this out for myself, I have tried.
Attached are a jpg file, its histogram from GIMP (log scale), its histogram from CS6, and the raw histogram from RawDigger.
Why are the histograms of GIMP and CS6 so different?
Does the raw histogram indicate that the raw is a 14-bit file, given that it shows roughly 16,000 levels?
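For that last question, this is the arithmetic I'm relying on:

```python
# 12-bit vs 14-bit level counts; ~16,000 levels would point to 14-bit data.
print(2 ** 12, 2 ** 14)   # 4096 16384
```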