That is good to know, but is the operating system, Photoshop, or the DVI output limiting the bit depth of the system?
The video card output determines the bit depth that goes to the display.
For example, my Mac Pro (2007) came with the ATI X1900XT video card. This card takes whatever bit depth it is fed and converts it to 10 bits per color (30-bit output) if the display supports it; if not, it can dither the output down to match an 8-bit or 6-bit display.
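As a rough illustration of what that dithering step does, here is a minimal sketch in Python (the 4-frame cycle and the function names are my own assumptions for illustration, not the card's actual algorithm): temporal dithering alternates between the two nearest 8-bit levels so that the time-averaged output approximates the original 10-bit value.

```python
# Minimal sketch (illustrative, not ATI's actual algorithm): reducing a
# 10-bit channel value to 8 bits, by truncation or by temporal dithering.

def truncate_10_to_8(v10: int) -> int:
    """Plain truncation: drop the two least significant bits."""
    return v10 >> 2

def temporal_dither_10_to_8(v10: int, frame: int) -> int:
    """Temporal dithering: the 2-bit remainder controls how often we
    round up instead of down across an assumed 4-frame cycle, so the
    average displayed level approximates the full 10-bit value."""
    base = v10 >> 2            # nearest lower 8-bit level
    remainder = v10 & 0b11     # discarded fraction, 0..3
    return min(base + (1 if frame % 4 < remainder else 0), 255)

# Averaged over 4 frames, the dithered output recovers the 10-bit value:
v10 = 513                      # a 10-bit level (0..1023)
frames = [temporal_dither_10_to_8(v10, f) for f in range(4)]
print(frames, sum(frames) / 4, v10 / 4)   # [129, 128, 128, 128] 128.25 128.25
```

Here is the relevant portion of the card's spec sheet: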
"flexible display support
Dual integrated dual-link DVI transmitters
DVI 1.0 compliant / HDMI interoperable and HDCP ready*
Dual integrated 10 bit per channel 400 MHz DACs
16 bit per channel floating point HDR and 10 bit per channel DVI output
Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
Complete, independent color controls and video overlays for each display
High quality pre- and post-scaling engines, with underscan support for all outputs
Content-adaptive de-flicker filtering for interlaced displays
Xilleon™ TV encoder for high quality analog output
YPrPb component output for direct drive of HDTV displays
Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
Fast, glitch-free mode switching
VGA mode support on all outputs
Drive two displays simultaneously with independent resolutions and refresh rates"
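The "programmable piecewise linear gamma correction ... (10 bits per color)" item above refers to a per-channel lookup table applied before the signal leaves the card. A minimal sketch of the idea (the gamma value and the 16-segment count are assumptions for illustration; the card's actual LUT layout is not published):

```python
# Minimal sketch of piecewise linear gamma correction at 10 bits per color.
# A handful of breakpoints sample the true gamma curve; values between
# breakpoints are linearly interpolated, hence "piecewise linear".

GAMMA = 2.2        # assumed target gamma, for illustration
SEGMENTS = 16      # assumed number of linear segments
MAX_10BIT = 1023

# Breakpoints sampled from the exact gamma curve.
xs = [i * MAX_10BIT // SEGMENTS for i in range(SEGMENTS + 1)]
ys = [round((x / MAX_10BIT) ** (1 / GAMMA) * MAX_10BIT) for x in xs]

def gamma_correct(v10: int) -> int:
    """Map a 10-bit input to a gamma-corrected 10-bit output by linear
    interpolation between the two breakpoints that bracket it."""
    seg = min(v10 * SEGMENTS // MAX_10BIT, SEGMENTS - 1)
    x0, x1 = xs[seg], xs[seg + 1]
    y0, y1 = ys[seg], ys[seg + 1]
    return y0 + (v10 - x0) * (y1 - y0) // (x1 - x0)

print(gamma_correct(0), gamma_correct(512), gamma_correct(1023))  # 0 746 1023
```

Because the table is programmable, color management software can load a display-specific curve into it rather than relying on a fixed gamma.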
Photoshop CS4 also features native 16-bit printing on Mac OS X, which suggests that both the OS and Photoshop can handle the bit depth.