Because I don't believe that on the back of the camera you have enough information to know that you haven't lost data off the right side of the histogram.
(...)
as I increased the exposure to move to the right, the shape of the histogram changed: the slope and magnitude of each peak changed. This suggests the histogram isn't a linear representation of the exposure data as you move to the right.
Fike, I totally agree with what you say, but we must be clear: the reason is not the principle of ETTR (which is conceptually correct), nor the way sensors work (they are very linear devices). The one and only reason for this uncertainty is that
camera makers build machines oriented to the JPEG shooter, never letting the RAW shooter know the exposure accurately. Camera makers keep us RAW shooters prisoners in Plato's cave, trying to decipher what the shadows on the wall mean.
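To illustrate the point numerically (a sketch, not any camera's actual pipeline): pushing linear data one stop leaves its spread on a log2 (EV) axis unchanged, while the same push through an sRGB-style tone curve, standing in here for whatever curve the camera bakes into its JPEG preview, changes the spread. All the numbers (mid grey at 0.18, noise of 0.03) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical linear "sensor" values for a flat grey patch
# (mid grey ~0.18 of full scale); illustrative, not real camera data.
# The small floor avoids log2(0) below.
linear = np.clip(rng.normal(0.18, 0.03, 100_000), 1e-4, 1.0)

def srgb(x):
    # Standard sRGB transfer curve, standing in for the tone curve a
    # camera applies before building its JPEG-based histogram.
    return np.where(x <= 0.0031308, 12.92 * x, 1.055 * x ** (1 / 2.4) - 0.055)

jpeg_std, raw_log_std = [], []
for stops in (0, 1):
    exposed = np.clip(linear * 2.0 ** stops, 1e-4, 1.0)
    jpeg_std.append(np.std(srgb(exposed)))        # spread after tone mapping
    raw_log_std.append(np.std(np.log2(exposed)))  # spread on a log2 (EV) axis

# The raw spread in EV is unchanged by a 1-stop push; the JPEG-domain
# spread is not, so the in-camera histogram's peaks change shape.
print(f"raw EV spread:  {raw_log_std[0]:.4f} -> {raw_log_std[1]:.4f}")
print(f"JPEG spread:    {jpeg_std[0]:.4f} -> {jpeg_std[1]:.4f}")
```

This is exactly the shape change described in the quote above: nothing nonlinear happened to the sensor data, only to the tone-mapped copy the camera histograms.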
These are RAW histograms obtained from a series of shots, equally spaced 1 stop apart, of a white wall under uniform lighting, plotted on a log scale. They move at perfectly regular 1 EV intervals (as expected), and they don't change shape (as expected; only the B channel spread slightly once all of its surrounding R and G photosites reached saturation):
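The same behaviour can be reproduced with synthetic data (a sketch, not these actual shots): on a log2 axis, doubling the signal shifts the histogram by exactly 1 EV, bin for bin, without changing its shape. The peak position and width are made-up numbers:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "raw" values for a uniformly lit wall: a narrow peak around
# 1000 DN with 40 DN of noise (illustrative, not real sensor data).
raw = rng.normal(1000.0, 40.0, 200_000)

ev = np.log2(raw)                      # raw values on a log2 (EV) axis
edges = 5.0 + np.arange(145) * 0.0625  # 1/16 EV bins spanning 5..14 EV

h0, _ = np.histogram(ev, bins=edges)        # base exposure
h1, _ = np.histogram(ev + 1.0, bins=edges)  # same shot pushed +1 EV

# +1 EV is exactly 16 bins of 1/16 EV: the log-scale histogram shifts
# rigidly, keeping its shape, just like the wall-shot series above.
print(np.array_equal(h0[:-16], h1[16:]))
```

A linear-axis histogram, by contrast, stretches by a factor of 2 per stop, which is part of why log-scale RAW histograms are the natural tool for judging exposure.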