Please correct me if I am wrong, but I believe that TIFF and JPEG output, whether direct from cameras or from RAW converters, has had a tone curve and gamma compression applied; the numerical digital levels are no longer in direct proportion to recorded luminosity unless one does something very unusual. Even with the smallest common gamma of 1.8 (the pre-press and Macintosh standard), up to about fourteen stops of sensor dynamic range (fourteen significant bits of linear data) are compressed into the range covered by 8-bit formats (roughly, bits times gamma = dynamic range in stops).
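That rule of thumb can be checked with a few lines of arithmetic. This is only an illustrative sketch assuming a pure power-law gamma with no additional tone curve; the `gamma_encode` helper is my own invention, not anything from a real converter:

```python
import math

def gamma_encode(linear, gamma=1.8, bits=8):
    """Map a linear luminance value in [0, 1] to an integer code value
    using a pure power-law gamma (an idealized model, no tone curve)."""
    max_code = 2 ** bits - 1
    return round(max_code * linear ** (1.0 / gamma))

# The deepest shadow an 8-bit gamma-encoded file can distinguish from
# black is code value 1. Decode it back to linear light:
darkest_linear = (1 / 255) ** 1.8

# Dynamic range in stops between full scale (1.0) and that darkest level:
stops = math.log2(1.0 / darkest_linear)   # = 1.8 * log2(255), about 14.4
print(f"{stops:.1f} stops")
```

So 8 bits at gamma 1.8 can span roughly 8 × 1.8 ≈ 14.4 stops of linear range, which is where the "bits times gamma" figure comes from.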
If so, with 8-bit formats one loses fineness of tonal gradations within the overall range, but not dynamic range. For example, if, optimistically, the tone curve plus gamma choice divide the 256 available 8-bit levels more or less uniformly across an 11-stop range, one gets levels spaced a bit less than 1/23 of a stop apart.
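The spacing figure is just the ratio of levels to stops, under the stated (optimistic) assumption of a uniform distribution:

```python
# Assume, optimistically, that all 256 8-bit levels are spread
# uniformly over an 11-stop range.
levels = 256
stops = 11

levels_per_stop = levels / stops   # about 23.3 levels in each stop
step_in_stops = stops / levels     # about 0.043 stop between adjacent levels

print(f"{levels_per_stop:.1f} levels per stop; "
      f"each step is {step_in_stops:.4f} stop")
```

256 / 11 ≈ 23.3 levels per stop, so adjacent levels sit about 1/23.3 of a stop apart, i.e. a bit less than 1/23 of a stop.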
How visible are such increments of luminosity? I do not know, but I have never heard of anyone working with adjustments finer than 1/12 of a stop.