We had no such LCD on film cameras, yet many photographers learned to expose properly for their film and development, and didn't take any ISO setting at face value. ETTR is simply ideal exposure for raw data. That the camera LCD is lying to us just forces us to learn to expose as we all did prior to digital capture. If you capture raw data and use the LCD as your guide, you are essentially underexposing that data, which isn't ideal exposure. Just as we could underexpose film and push it (to a degree), that wasn't considered an ideal exposure and development process.
I agree...sorta.
Just like with film, we can "calibrate" our results so we "know" how to "get it right in the camera"...which means optimally exposed. That optimum differs by medium, whether it's different film stocks, digital JPEG, or raws from different digital sensors.
Doing my own testing with RawDigger, I found that, for the 5D3, I could spot meter the highest significant highlight and then place it at +3 to +3.5 stops over metered. This gave me an image exposed as far to the right as possible without clipping any channel.
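For anyone who wants to see the arithmetic behind "place the highlight +3 stops over metered": each stop doubles the exposure, so the metered shutter time gets multiplied by 2 per stop. A minimal sketch (the function name and the example meter reading are just illustrations, not anything from RawDigger or the camera):

```python
def shutter_for_ettr(metered_time_s: float, stops_over: float = 3.0) -> float:
    """Shutter time that places the spot-metered highlight `stops_over`
    stops above the meter's middle-gray suggestion.

    Each stop doubles exposure, so the metered time is scaled by 2**stops_over.
    """
    return metered_time_s * (2.0 ** stops_over)

# Example: the spot meter reads 1/1000 s on the brightest significant
# highlight. Placing it +3 stops means 1/1000 * 8 = 1/125 s.
print(shutter_for_ettr(1 / 1000, 3.0))  # 0.008 s, i.e. 1/125 s
```

The same scaling applies if you open the aperture or raise ISO instead of slowing the shutter; the point is only that +3 stops means 8x the metered exposure, and +3.5 stops about 11x.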
Jim Kasson, on his blog, The Last Word, did a lot of testing on different cameras, many of which did not have spot meters, to arrive at ways to adjust the camera histogram to more closely mimic a raw histogram.