It is my understanding that it is _really_ hard to see the difference between 8-bit and 10-bit delivery for real-world images, given that:
1. The amount of in-display processing is low (color correction, sharpening,...)
2. The display brightness/DR is moderate (as is the norm for photographers)
Remember that 8-9 bits of gamma-encoded image data correspond to roughly 12-14 bits of linearly encoded data in terms of perceptual fidelity. That is, you may not need as many bits at your display/printer (nonlinear) as you do at your camera (linear), at least if you apply dynamic processing lightly.
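To make that concrete, here is a quick sanity check of the equivalence, assuming the nonlinear encoding is the standard sRGB curve (linear toe plus a 2.4-power segment): it finds the finest linear-light step between adjacent gamma-encoded codes and asks how many bits a uniform linear encoding would need to resolve a step that fine. It's a rough proxy, not a full perceptual model.

```python
import numpy as np

def srgb_to_linear(v):
    """sRGB decoding (IEC 61966-2-1): linear toe below 0.04045, power 2.4 above."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

for gamma_bits in (8, 9):
    codes = np.arange(2 ** gamma_bits) / (2 ** gamma_bits - 1)
    # Smallest linear-light step between adjacent gamma-encoded codes;
    # it occurs near black, where the encoding concentrates its precision.
    finest = np.diff(srgb_to_linear(codes)).min()
    # Bits a uniform linear encoding needs for an equally fine step
    print(gamma_bits, round(np.log2(1.0 / finest), 1))
# → 8 bits gamma needs ~11.7 linear bits, 9 bits gamma ~12.7
```

So by this crude measure, 8-9 gamma-encoded bits land in the ~12-13 linear-bit range, in the same ballpark as the 12-14 figure above.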
Of course, in the midst of a heavy image-processing pipeline, extra bits can make a real difference.
I have a Dell U2711 and an ATI graphics card running Win7-64, all 10-bit capable AFAIK (I don't know about Lightroom). But since the display only accepts 10 bits over DisplayPort, and there have been reports of stability issues with DP, I haven't tested it.
-h