Well, a 24-bit analog-to-digital conversion using subtractive dither, where the signal occupies 16 bits and the dither 8 bits, has advantages over a pure 16-bit signal. I don't know whether any manufacturers actually implement such systems.
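For anyone unfamiliar with the idea, here is a minimal numpy sketch of subtractive dither, not any particular converter's implementation; the step size, test tone, and uniform dither distribution are my assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, step):
    """Round to the nearest quantizer step (mid-tread quantizer)."""
    return np.round(x / step) * step

# Hypothetical parameters: one 16-bit LSB for signals in [-1, 1)
step = 2.0 / 2**16

# A 1 kHz test tone at 48 kHz sample rate (arbitrary choice)
x = 0.5 * np.sin(2 * np.pi * 1000 * np.arange(48000) / 48000)

# Subtractive dither: add uniform dither spanning one LSB before
# quantizing, then subtract the *same* dither sequence on playback.
d = rng.uniform(-step / 2, step / 2, size=x.shape)
y_sub = quantize(x + d, step) - d

# Non-subtractive comparison: the dither stays in the output.
y_non = quantize(x + d, step)

# With subtractive dither the residual error is uniformly distributed
# and statistically independent of the signal.
print("subtractive error RMS:    ", np.sqrt(np.mean((y_sub - x) ** 2)))
print("non-subtractive error RMS:", np.sqrt(np.mean((y_non - x) ** 2)))
```

The catch, and presumably why it is rare in practice, is that the playback side needs the exact dither sequence the recording side used.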
Note that I did say "playback". For recording, there is no limit to how much precision might be beneficial once you allow creative mangling of the sound files, much like the case in photography.
I don't think anyone is able to distinguish even naively truncated 16-bit audio from properly dithered 16-bit audio in a properly blinded and "relevant" test.
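To make the comparison concrete, here is a rough sketch of the two signals such a test would compare, assuming TPDF dither (the sum of two uniform sources) for the dithered case; the quiet tone and peak/mean spectral measure are just illustrative choices of mine:

```python
import numpy as np

rng = np.random.default_rng(0)
step = 2.0 / 2**16  # one 16-bit LSB for signals in [-1, 1)

# A very quiet tone, where truncation distortion is most exposed.
t = np.arange(48000) / 48000
x = 4 * step * np.sin(2 * np.pi * 1000 * t)

# Plain truncation to 16 bits: the error is correlated with the
# signal, showing up as harmonic distortion rather than benign noise.
y_trunc = np.floor(x / step) * step

# TPDF dither before rounding: the error becomes signal-independent
# noise with a flat spectrum.
d = (rng.uniform(-step / 2, step / 2, x.shape)
     + rng.uniform(-step / 2, step / 2, x.shape))
y_dith = np.round((x + d) / step) * step

# Crude spectral flatness check on the error: truncation produces
# discrete harmonics (high peak/mean), dither a flat noise floor.
for name, y in [("truncated", y_trunc), ("dithered", y_dith)]:
    spec = np.abs(np.fft.rfft((y - x) * np.hanning(len(x))))
    print(name, "error peak/mean:", spec.max() / spec.mean())
```

The measurable difference is real; the claim above is only that at 16-bit levels it sits too far down to survive a blind listening test.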
In layman's terms, I think the very best 24+ bit audio ADCs/DACs out there perform comparably to an ideal 20-bit (or so) converter. In theory, having 20 rather than 16 bits of accuracy is a good thing, but once you put it into a room with only 60 dB or 80 dB between peak levels and the noise floor, the difference gets buried very deep in that noise floor.
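The arithmetic behind that, using the standard rule of thumb for an ideal quantizer's SNR with a full-scale sine:

```python
def dynamic_range_db(bits: int) -> float:
    # Ideal quantizer SNR for a full-scale sine: 6.02 * N + 1.76 dB
    return 6.02 * bits + 1.76

for bits in (16, 20, 24):
    print(f"{bits} bits: {dynamic_range_db(bits):.1f} dB")

# 16 bits -> ~98 dB, 20 bits -> ~122 dB, 24 bits -> ~146 dB.
# Against a listening room offering 60-80 dB between peak level and
# the ambient noise floor, the extra range of 20+ bits is inaudible.
```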
Of some relevance to this question, see e.g.:
"Audibility of a CD-Standard A/D/A Loop Inserted into High-Resolution Audio Playback*", Meyer & Moran, J. Audio Eng. Soc., Vol. 55, No. 9, 2007 September
http://drewdaniels.com/audible.pdf