The Exposure slider in ACR and Lightroom is intentionally measured in stops (e.g., from -4 to +4 stops), and for the most part (*) it does correspond to the standard notion of photographic exposure. For example, if you set Exposure to +1, this will double the exposure of the image. The reason it may seem as if other, non-linear things are going on is that images are rendered in ACR/LR with tone curves -- e.g., for shaping the highlights and shadows. The default point curve of Medium Contrast, for instance, is part of this tone shaping. If you adjust Exposure, you change how the image values get mapped through that tone curve.
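To make the idea concrete, here is a minimal sketch (not Adobe's actual pipeline) of how an Exposure value in stops acts as a simple 2^EV multiplier on linear, scene-referred values, with the tone curve applied afterward. The tone curve below is a made-up placeholder just to illustrate the ordering:

```python
import numpy as np

def apply_exposure(linear, ev):
    """Scale scene-referred linear values by 2^EV (one stop = a doubling)."""
    return linear * (2.0 ** ev)

def example_tone_curve(linear):
    """Placeholder rolloff curve standing in for ACR/LR tone shaping.
    The real rendering is more involved; this just shows that a curve
    is applied after the linear exposure scaling."""
    return linear / (linear + 0.18)

scene = np.array([0.02, 0.18, 0.50, 0.90])   # linear scene-referred values
plus_one = apply_exposure(scene, +1.0)       # Exposure +1 doubles each linear value...
print(example_tone_curve(plus_one))          # ...but the rendered output is shaped by the curve,
                                             # so it does not look uniformly "twice as bright"
```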
While a log-base-2 histogram is useful for judging exposure levels, it is not particularly useful for judging the rendered output of a color image once tone mapping has been applied. Remember, the histogram in ACR/LR reflects the rendered output (the output-referred image), not the original input (the scene-referred data).
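As a rough illustration (again not how ACR/LR computes its histogram), here is the difference between a stops-based histogram of the linear data and a histogram of the tone-mapped output, reusing the placeholder curve from the sketch above:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic linear "raw" data, clipped to the [0, 1] sensor range
raw_linear = np.clip(rng.lognormal(mean=-2.0, sigma=1.0, size=100_000), 1e-6, 1.0)

# Log-base-2 (stops) histogram of the scene-referred data:
# 0 = clipping point, -1 = one stop below clipping, and so on.
stops_below_clip = np.log2(raw_linear)
exposure_hist, _ = np.histogram(stops_below_clip, bins=np.arange(-10, 1))

# Histogram of the rendered, tone-mapped output -- closer in spirit to what ACR/LR displays.
rendered = raw_linear / (raw_linear + 0.18)
output_hist, _ = np.histogram(rendered, bins=np.linspace(0.0, 1.0, 33))

print(exposure_hist)   # distribution in stops, useful for exposure decisions
print(output_hist)     # distribution of the rendered values, shaped by the tone curve
```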
Eric
(*) The reason I say "for the most part" is that there are of course limits. In the field, if you reduce the exposure (e.g., by halving the exposure time), you can keep recording more highlight detail. But trying to do the same thing in raw conversion software doesn't work the same way: pulling the Exposure slider down can reveal a bit more highlight detail, but you very quickly exhaust what was actually recorded in the raw file.
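A tiny numerical sketch of that limit (with made-up values): once a highlight is clipped at sensor saturation, negative Exposure in the converter just scales the clipped values; it cannot bring back detail that was never recorded.

```python
import numpy as np

sensor_clip = 1.0
scene = np.array([0.6, 0.9, 1.4, 2.5])        # true scene luminances; the last two exceed saturation
recorded = np.minimum(scene, sensor_clip)     # what the raw file stores: [0.6, 0.9, 1.0, 1.0]

# Exposure -1 in the converter halves the recorded values...
print(recorded * 0.5)                         # [0.3, 0.45, 0.5, 0.5]
# ...but the two clipped highlights remain identical: their detail was never captured.
# Halving the exposure time in the field instead would have recorded:
print(np.minimum(scene * 0.5, sensor_clip))   # [0.3, 0.45, 0.7, 1.0] -- real highlight detail retained
```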