I’m not confusing either setting; I’m quite familiar with both settings you’re referring to. If the GTX 1070 weren’t actually capable of displaying in 10/30-bit, why would Ps give me the option to enable the 30-bit setting if the card couldn’t actually display at that bit depth? I can take a screenshot if it’ll make you happy, but I’m not confused. I may be wrong, sure, but if I am, then Ps is wrong too. Also, how would it be possible for the desktop to be displayed in 10 bits but not the Ps workspace? That doesn’t really make sense, but maybe you know something I don’t. If you could elaborate and explain how the desktop can be displayed in 10 bits and Ps can offer the 30-bit setting in its preferences while none of this is actually 30 bits, I’d be quite interested to know.
OK. Then I am sorry for saying "you're confused"... You're wrong!
Yes, Ps will show the option and let you select it, but... it's lying! Try with the file digital dog proposed and you'll see banding. Another way to confirm that you're not actually getting 10 bits is to check what Photoshop itself reports: select Help > System Info. End of lies!
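If you want to roll your own banding test instead of using that file, a smooth 16-bit gradient ramp does the job: on an 8-bit display path you'll count roughly 256 visible steps, while a true 10-bit path looks far smoother. This is just a minimal sketch, not digital dog's actual file; it assumes numpy and tifffile are installed, and the output filename is arbitrary.

    # Generate a 16-bit grayscale gradient ramp TIFF to use as a banding test.
    import numpy as np
    import tifffile

    width, height = 4096, 512
    # One row covering the full 0..65535 range in smooth steps
    ramp = np.linspace(0, 65535, width, dtype=np.uint16)
    # Repeat the row vertically so the ramp is easy to inspect on screen
    image = np.tile(ramp, (height, 1))
    tifffile.imwrite("gradient_ramp_16bit.tif", image)

Open the resulting file in Ps at 100% zoom with 30-bit display enabled and look for discrete bands in the ramp.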
I am running a dual video card system, a GTX 1080 Ti FE and a FirePro W7100, both hooked up to the same NEC. Trust me: when real 10 bits is enabled for the FirePro, switching to the other port where the GTX is hooked up makes Ps crash, which proves the 10 bits on the GTX is... not there!
Again, a common and forgivable mistake... yet a msitake (pun intended)!