> Perhaps I'm making incorrect assumptions based upon the way 35mm DSLRs operate
You certainly are, if you believe it has anything to do with 35mm DSLRs.
The issue is that the histogram the camera displays is based on the resulting JPEG image (with raw recording, a JPEG image is embedded in the raw file).
That JPEG has been created from the raw data based on the camera settings:
- contrast
- saturation
- sharpening
- color tone
but the single biggest factor is the white balance (which is not always an explicit setting; it may have been evaluated automatically by the camera).
I posted some raw histograms in this thread, and later the corresponding RGB histogram without demosaicing but after white balancing. You may notice that the raw green values are over twice as high as the reds, yet in the resulting image the reds end up as high as the greens (the brightest part of that image is the white card, so red, green, and blue have to be equal at the right end).
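A toy sketch of that effect (the raw levels are made-up numbers, not from my actual files): white balancing scales each channel so that a neutral subject comes out with equal red, green, and blue, which is why the red histogram peak "moves up" to meet the green.

```python
# Hypothetical raw levels for the white card; green dominates because of
# the sensor's spectral sensitivity, as in the posted raw histograms.
raw_white_card = {"R": 1800, "G": 3900, "B": 2100}

# WB multipliers are chosen so the neutral patch comes out equal;
# conventionally green is the reference channel (multiplier 1.0).
wb = {ch: raw_white_card["G"] / raw_white_card[ch] for ch in raw_white_card}

balanced = {ch: raw_white_card[ch] * wb[ch] for ch in raw_white_card}
print(wb)        # R and B multipliers well above 1.0, G exactly 1.0
print(balanced)  # all three channels equal after white balancing
```

The multipliers here (over 2x for red) are in the typical range for daylight, which is why the JPEG histogram can look so different from the raw one.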
There are some enviable cameras (or cameras of some enviable owners) which accept WB coefficients directly as a setting. Some Nikons do that; among Canons, only the very top models are on par.
Otherwise you have to experiment a lot with your camera to find a WB setting (not only temperature but tint as well) that results in coefficients of 1.0, 1.0, 1.0.
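The trial-and-error can be reduced to a simple comparison. A sketch, with made-up setting names and coefficients: shoot a test frame at each candidate temperature/tint, read the RGB multipliers your raw converter reports, and keep the setting whose coefficients are closest to unity.

```python
def unity_wb_error(coeffs):
    """Worst-case deviation of the reported WB coefficients from 1.0."""
    return max(abs(c - 1.0) for c in coeffs)

# Hypothetical candidates: in-camera WB setting -> coefficients the raw
# converter reported for a test shot at that setting (invented numbers).
candidates = {
    "5200K, tint 0":  (2.02, 1.00, 1.48),
    "2400K, tint -9": (1.03, 1.00, 0.97),
}

best = min(candidates, key=lambda k: unity_wb_error(candidates[k]))
print(best)  # the setting whose coefficients are closest to 1.0, 1.0, 1.0
```

In practice you would iterate: narrow the temperature first, then fine-tune the tint until the residual error is small enough for your taste.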
> I rely upon the flashing highlight warning and the appearance of the histogram to tell me how close I am to full exposure
It is unreliable unless you guess the WB right.
There are some downsides to such a "raw setting":
- the image appears off-color on the camera,
- the thumbnail is off-color,
- the embedded JPEG is useless,
- you cannot reasonably combine raw with in-camera JPEG.