I think people are overcomplicating things here.
No, you just find it complicated. It's really exposure 101; you're probably not there yet. Re-read what hjulenissen wrote, it's spot on:
I think that one exposure is "ideal" in the sense that it maximizes information about the scene.
Exactly! ETTR is the beginning of the process of producing that. What's the actual sensitivity of the sensor, and what's the role of the ISO setting? What do I have to do, starting from the meter's recommendation (which again is aimed at a JPEG or film), to produce maximum information in the raw? Simple exposure and development testing, just as we did when shooting film. Nothing complicated about it; it's been part of photography since the day it was invented!
There is one optimal, correct, ideal exposure for any scene, with input from the image creator of course (especially if the scene's DR is wider than what the camera can capture). ETTR is simply about producing that exposure while ignoring the incorrect meter/exposure/development (for JPEG) shown on the camera histogram. The 'to the right' means compensating, because exposing correctly for the JPEG under exposes the raw data. If you have the false understanding that the two should be treated the same, you fail to understand the differences in the data and encoding... oh, I'm probably going over your head, sorry. Bottom line: just as you would never treat a neg 'rated' at ISO 400 the same as a chrome 'rated' at ISO 100 until you had tested exposure and development to produce the best results, you don't do that with digital capture either.
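The 'to the right' compensation above is just arithmetic in stops. A toy sketch (the raw values here are hypothetical; you'd find your own camera's numbers by the exposure testing described above):

```python
import math

def ettr_compensation(metered_raw_peak, clip_level):
    """Stops of positive compensation available before the raw clips.

    metered_raw_peak: brightest raw value produced by the meter's
    JPEG-oriented recommendation (hypothetical example number).
    clip_level: raw saturation point, e.g. 16383 for 14-bit raw.
    Each stop 'to the right' doubles the recorded signal.
    """
    return math.log2(clip_level / metered_raw_peak)

# Example: the meter leaves the brightest highlight at 4096 on a
# sensor that clips at 16383, so about 2 stops of headroom remain.
print(round(ettr_compensation(4096, 16383), 2))  # -> 2.0
```

So in this made-up case you'd dial in roughly +2 EV over the meter and pull the rendering back down in the raw converter.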
Now if you want to point a meter, reflective or incident, treat film, JPEG, raw, any ISO, and the development all the same, and expect the same results, you clearly don't understand exposure and development. Again, photography 101.
ETTR should just be called 'proper exposure for raw data' or whatever you want to name it. The only so-called conflicting goal is being forced to under expose (and over develop, at the expense of data quality) because you don't have enough light to fully embrace ETTR, i.e. correct exposure. In the film days we DID have to do this, and we did see the result: more grain. Stopping the action versus producing better data: any photographer understands that sometimes they have to push the process, with grain/noise as the net result. Lesser of two evils and, again, photography 101.
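The noise cost of that push can be put in numbers with a toy shot-noise model (photon noise only, read noise ignored; the full-well figure is a made-up example, not any particular sensor):

```python
import math

def snr_after_push(full_well_photons, stops_under):
    """Shot-noise-limited SNR after under exposing by `stops_under`
    stops and pushing back to normal brightness in post.

    Pushing multiplies signal and noise equally, so the SNR is set
    at capture: SNR = N / sqrt(N) = sqrt(N) for N photons collected.
    """
    photons = full_well_photons / (2 ** stops_under)
    return math.sqrt(photons)

base = snr_after_push(40000, 0)    # full ETTR exposure
pushed = snr_after_push(40000, 2)  # 2 stops under, pushed in post
print(round(base / pushed, 2))     # -> 2.0
```

Two stops under and pushed costs a factor of 2 in SNR in this model: the digital equivalent of the extra grain we accepted when push-processing film.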
There has been massive improvement in autoexposure systems since the eighties.
The tools may be better, but the photographer who doesn't understand what they are being told, and who follows the advice incorrectly, is not helped by them. The better metering you speak of isn't going to help the raw shooter if it's based on the JPEG and they don't implement ETTR.