1. Why do you need ETTR specifically? How many times do you actually have so little DR in a scene and enough time to capture it, that ETTR is both viable and advantageous?
2. How do you know the camera isn't already doing ETTR under the hood?
3. Would you be willing to sacrifice colour accuracy for true ETTR?
4. What do you hope to learn from a RAW histogram? If the histogram clips, what does that mean for actual image pixels? Specular highlights, or entire image elements like clouds?
5. Would you prefer a proper clipping indication instead?
1. Because unlike film, which had a genuine, physical roll-off in the highlights, digital sensors have a "correct" exposure: the highest one consistent with not clipping any individual channel, modulo the complication of specular highlights (which I'll discuss below). Coming from a physics background, I just want to be able to expose my sensor optimally, and in the vast majority of situations there actually is an optimal exposure choice.
How many times would I use it? All the time, for landscape. Especially if sunlit clouds are in shot, which where I live means "almost every shot".
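To make the "optimal exposure" idea concrete: it is just a headroom calculation against sensor saturation. Here is a minimal numpy sketch, assuming an already-demosaiced (H, W, 3) array of linear raw values and made-up black/white levels (the real constants vary per camera and per channel):

```python
import numpy as np

def headroom_stops(raw, white_level=16383, black_level=512):
    """Per-channel exposure headroom, in stops, before clipping.

    `raw` is a (H, W, 3) array of linear raw values; white_level and
    black_level are hypothetical sensor constants. A positive result
    means exposure can be pushed that many stops before the brightest
    value in that channel saturates.
    """
    signal = raw.astype(np.float64) - black_level
    peak = signal.reshape(-1, 3).max(axis=0)          # brightest value per channel
    full_scale = white_level - black_level
    return np.log2(full_scale / np.maximum(peak, 1))  # stops to saturation

# Synthetic frame: red peaks 2 stops below clipping, green 1 stop, blue 3
frame = np.full((4, 4, 3), 512)
frame[0, 0] = [512 + 15872 // 4, 512 + 15872 // 2, 512 + 15872 // 8]
print(headroom_stops(frame))  # roughly [2, 1, 3] stops
```

The ETTR decision then reduces to "increase exposure by the smallest of these three numbers" (leaving aside specular highlights, as above).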
2. Because when you dig into RAW files with RawDigger or similar, it clearly isn't. For example, the clipping indicator on my Sony A7RII shows clipping a fair bit before actual saturation occurs. And your question answers itself: if I had RAW histograms, I'd KNOW whether or not my camera was doing ETTR under the hood. I want to KNOW, right there on the camera, which is the whole point of my request. The information is right there in the RAW file immediately after capture, but modern cameras have no facility to display it.
With that information, the choice is mine as to whether to redo the exposure to get closer to optimum. Without it, I frequently bracket and hope. I would prefer to remove the "hope" part of that operation.
3. I fail to see why presenting the RAW histogram would have any effect at all on colour accuracy. Possibly automatic ETTR in the camera would compromise colour accuracy, but presenting the INFORMATION to me cannot. And generally speaking, ETTR is the best way to prevent loss of colour accuracy, by making sure no individual channel is clipping.
4. I want to see the range of values in the image as shot. Simple as that. We have histograms now, and very useful they are, but they would be much more useful if supplemented by the OPTION to see what's actually come directly off the sensor. RAW histograms are useful when you are shooting RAW in the same way that histograms of the in-camera JPEG are useful when shooting in-camera JPEG.
5. Proper clipping indication would be hugely useful as well! Personally, if I had to choose one of the two, I'd prefer histograms, with a bin indicating the total number of overflowed sensels for each channel (it simply gives you more information). But yes, RED-style RGB clipping indicators would be a big step forward. It's one of the very best features of my RED Scarlet.
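The "histogram plus overflow bin" idea can be sketched in a few lines of numpy. This assumes demosaiced (H, W, 3) linear raw data and a single made-up white level; real cameras store a mosaiced Bayer array with per-channel white levels, but the principle is the same:

```python
import numpy as np

def raw_histogram_with_overflow(raw, white_level=16383, bins=64):
    """Per-channel histogram of linear raw values, with a final
    'overflow' bin counting sensels at or above saturation.

    Sketch only: `raw` is a (H, W, 3) array and white_level is a
    hypothetical saturation point.
    """
    out = []
    for ch in range(raw.shape[-1]):
        vals = raw[..., ch].ravel()
        clipped = int(np.count_nonzero(vals >= white_level))
        hist, _ = np.histogram(vals[vals < white_level],
                               bins=bins, range=(0, white_level))
        out.append(np.append(hist, clipped))   # bins + 1 entries per channel
    return np.stack(out)

# A frame where only the red channel has saturated sensels
frame = np.zeros((8, 8, 3), dtype=np.int32)
frame[:2, :2, 0] = 16383                       # 4 clipped red sensels
h = raw_histogram_with_overflow(frame)
print(h[:, -1])                                # overflow counts: [4 0 0]
```

The overflow bin is exactly what a JPEG-based histogram hides: values that the raw file records as saturated but the picture-profile curve compresses into the top of the visible range.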
I'd really like it if the clipping overlay would also show which pixels have clipped in each colour, so I can judge whether or not to care about a small amount of clipping. Specular highlights: don't care. Sun's disk in a sunset: do care, if I possibly can. Sun's disk in a sunstar: do care, and want the smallest amount of clipping consistent with not having noise in the shadows. A combination of a proper RAW-based clipping indication, a RAW-based noise floor indication, channel-by-channel clipping indicators and RAW histograms gives one the necessary information, and that's what I want added to camera UIs.
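The per-channel overlay I'm describing is also trivial to compute once you have the raw data. A minimal sketch, again assuming (H, W, 3) linear raw values and a single hypothetical white level:

```python
import numpy as np

def clipping_report(raw, white_level=16383):
    """Per-channel clipping masks plus the clipped fraction of the frame.

    The masks say *where* each channel clips, which is what lets you
    distinguish a specular dot from a blown-out cloud; the fractions say
    how much of the frame is affected. white_level is a made-up constant.
    """
    masks = raw >= white_level            # True where a channel has clipped
    fractions = masks.mean(axis=(0, 1))   # share of sensels clipped, per channel
    return masks, fractions

# One specular-style clipped pixel in a 10x10 frame
frame = np.zeros((10, 10, 3), dtype=np.int32)
frame[0, 0] = 16383
masks, fractions = clipping_report(frame)
print(fractions)                          # 1% of each channel clipped
```

A camera UI could paint each mask in its own colour over the preview, which is essentially what the RED-style indicators do, but driven from raw values rather than the processed image.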
Hywel