Perhaps my reply above is a bit abstract. Maybe it's better to show some examples to clarify what (I think) is happening.
To start with, I made a completely out-of-focus shot of an evenly lit white subject. I did this at several exposure settings; here I show one rather underexposed and one overexposed. I opened the resulting files (made with a Sony a99, which uses the same compression as the a7(R)) in RawDigger, selected a small area in the center of the image, and made a histogram of that area. The results are attached. In the underexposed example one can clearly see the limited resolution: there are clearly separated discrete values. This is due to the linear nature of the ADC. At these low values no compression is applied yet, and we simply see the step size of the ADC. And there is not a single value, as one might expect, but a range of values. This spread is caused by noise. Part of it may come from differences in the sensitivity of the individual pixels, but the bulk is most likely a mix of noise sources.
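To make that concrete, here is a rough Python sketch of what is going on in the shadows. It is not the actual a99 pipeline; the gain and read-noise figures are illustrative assumptions, not measured values. It simulates a flat, evenly lit patch at a very low signal level and quantizes it with a linear 14-bit ADC:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed (illustrative) sensor parameters, not measured a99 values
gain = 2.0          # conversion gain, electrons per DN
read_noise_e = 4.0  # read noise in electrons
signal_e = 30.0     # mean signal in electrons (deep shadow)

n_pixels = 100_000
electrons = rng.poisson(signal_e, n_pixels)                    # shot noise
electrons = electrons + rng.normal(0, read_noise_e, n_pixels)  # read noise
dn = np.clip(np.round(electrons / gain), 0, 16383).astype(int) # linear 14-bit ADC

# The histogram shows a handful of discrete DN values (the ADC steps),
# spread over a small range by the combined noise sources.
values, counts = np.unique(dn, return_counts=True)
for v, c in zip(values, counts):
    print(f"DN {v:3d}: {c}")
```

With these assumptions you get exactly the picture in the first attachment: a comb of discrete ADC levels, dithered across a range by the noise.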
The second attachment shows the result for a high light intensity; the maximum of 16384 is almost reached. RawDigger does not show every individual value in this histogram: the results are grouped in bins of 1/96 EV. Effectively, RawDigger bins the result in a logarithmic fashion (not totally different from the way the Sony compression algorithm works). So the 8192 theoretical raw values in the highest EV (the top stop runs from 8192 up to the maximum) are reduced to 96 bins. One might expect to see only a single value in the histogram: all pixels receive the same amount of light, so the values should be the same, and read noise no longer plays a role at this high level of illumination. But instead of a single value, there is a group of values. This is (mostly) caused by shot noise. One could do the same experiment with a D800, and the result should be comparable. In other words, the noise completely dominates the ADC resolution, and there is really no need for the excess resolution in the upper part of the EV range.
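As a quick sanity check of that last claim, here is a back-of-the-envelope calculation with assumed numbers (the full-well capacity is a guess for illustration, not a measured a99 figure). It compares the shot noise near saturation, expressed in DN, with the 1 DN ADC step and with the width of one 1/96 EV RawDigger bin:

```python
import math

# Assumed sensor model: ~50,000 electrons at saturation, mapped linearly to 16383 DN
full_well_e = 50_000.0
max_dn = 16383
gain = full_well_e / max_dn          # electrons per DN, ~3 here

signal_e = 0.9 * full_well_e         # a pixel near the top of the range
shot_noise_e = math.sqrt(signal_e)   # Poisson statistics: sigma = sqrt(N)
shot_noise_dn = shot_noise_e / gain  # the same noise expressed in DN

# Width of one 1/96 EV histogram bin at this signal level
dn = signal_e / gain
bin_width_dn = dn * (2 ** (1 / 96) - 1)

print(f"shot noise:  ~{shot_noise_dn:.0f} DN")  # roughly 70 DN
print(f"ADC step:     1 DN")
print(f"1/96 EV bin: ~{bin_width_dn:.0f} DN")   # roughly 100 DN
```

Under these assumptions the shot noise spans many tens of ADC steps near the top of the scale, which is why the histogram shows a cluster of bins rather than a single spike, and why the fine linear resolution up there carries no real information.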