
Author Topic: 10-bit graphics cards, what gives?  (Read 10258 times)

DrakeJ • Jr. Member • Posts: 79
10-bit graphics cards, what gives?
« on: September 21, 2017, 08:19:47 am »

Since support for external graphics cards is coming with macOS High Sierra, I am planning to extend the life of my mid-2015 MacBook Pro with such an upgrade.

And since I've acquired a 10-bit monitor (BenQ SW320), the graphics card needs to be able to produce true 10-bit color for Photoshop over a DisplayPort cable.

A while back, only FirePro or Quadro graphics cards produced 10-bit color. Is that still true, or are there other (read: cheaper) alternatives out there? It's difficult to google this topic, since there is so little information regarding true 10-bit color.

ChrisMax • Newbie • Posts: 22
Re: 10-bit graphics cards, what gives?
« Reply #1 on: September 21, 2017, 04:09:26 pm »

I had an AMD Radeon HD 7870 which did output 10-bit color over the DisplayPort connection, but AMD tech support informed me it wouldn't do 10-bit color in Photoshop. So I got an Nvidia Quadro P2000 for my Eizo CS2730, which is a native 10-bit screen, not 8-bit plus FRC. Bottom line: all the consumer-grade video cards like the HD 7870 will put out the 10-bit signal, just not in Photoshop; for that you still need a Quadro or FirePro card.

KagetsuBuic • Newbie • Posts: 1 • Kagetsu Buic Photography
Re: 10-bit graphics cards, what gives?
« Reply #2 on: September 22, 2017, 12:11:47 am »

Quote from: DrakeJ on September 21, 2017, 08:19:47 am
A while back, only FirePro or Quadro graphics cards produced 10-bit color. Is that still true, or are there other (read: cheaper) alternatives out there?
The Nvidia 10-series cards support 10-bit output.
I'm currently using an SW320 monitor with a GTX 1080 and can confirm it.

TonyW • Sr. Member • Posts: 643
Re: 10-bit graphics cards, what gives?
« Reply #3 on: September 22, 2017, 05:34:28 am »

I believe you will find the situation has not changed: you will need either an Nvidia Quadro or an AMD FirePro card if you wish to take advantage of 10-bit in Photoshop.

bjanes • Sr. Member • Posts: 3387
Re: 10-bit graphics cards, what gives?
« Reply #4 on: September 22, 2017, 08:11:56 am »

Quote from: TonyW on September 22, 2017, 05:34:28 am
I believe you will find the situation has not changed: you will need either an Nvidia Quadro or an AMD FirePro card if you wish to take advantage of 10-bit in Photoshop.

Does this apply to both Mac and Windows?

Thanks,

Bill

Bart_van_der_Wolf • Sr. Member • Posts: 8913
Re: 10-bit graphics cards, what gives?
« Reply #5 on: September 22, 2017, 08:37:59 am »

Quote from: KagetsuBuic on September 22, 2017, 12:11:47 am
The Nvidia 10-series cards support 10-bit output.
I'm currently using an SW320 monitor with a GTX 1080 and can confirm it.

Yet I cannot find information on the NVIDIA website about any cards other than the Quadro models that support 30-bit (10 bits per channel) output.

How did you confirm that yours does support it?

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

TonyW • Sr. Member • Posts: 643
Re: 10-bit graphics cards, what gives?
« Reply #6 on: September 22, 2017, 01:13:31 pm »

Quote from: bjanes on September 22, 2017, 08:11:56 am
Does this apply to both Mac and Windows?
Hi Bill
I do not know for sure, but my best guess is that it does apply to both Mac and Windows. My reasoning is that Windows has had 10-bit output available since Windows 7, yet without Quadro or FirePro cards you could not take advantage of it. I think Apple introduced 10-bit sometime in late 2015, and I assume the situation must be the same there, since Nvidia and AMD seem to cut back the capabilities of the cards considered consumer or gaming products. I accept that I may be wide of the mark here, as I do not have any real-world evidence to support this.

EDIT: Thinking again about the question, I did a quick search, and it seems my guess about the cards may be wrong:
https://imagescience.com.au/knowledge/10-bit-output-support

Quote from: Bart_van_der_Wolf on September 22, 2017, 08:37:59 am
Yet I cannot find information on the NVIDIA website about any cards other than the Quadro models that support 30-bit (10 bits per channel) output.

How did you confirm that yours does support it?
Hi Bart,
The information does seem to take a little digging to find, but: Nvidia GeForce cards have 10-bit-per-color output via DirectX. That path is not used by applications such as Photoshop, which use OpenGL; for 10-bit OpenGL output Nvidia requires a Quadro with a DisplayPort connector:
http://nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus
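
For anyone who wants to check what their own card actually grants, here is a minimal sketch - my own illustration using the Python glfw and PyOpenGL packages, not anything from the NVIDIA page above - that requests a 10-bit-per-channel OpenGL framebuffer and prints what the driver hands back:

Code:
# Ask the driver for a 10-bit-per-channel OpenGL framebuffer and
# report what it actually granted; consumer cards will typically
# hand back 8 bits per channel here.
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")

# Request 10 bits per color channel; the driver is free to ignore this.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)

window = glfw.create_window(320, 240, "30-bit probe", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("could not create an OpenGL window")
glfw.make_context_current(window)

# GL_*_BITS can be queried in the default (compatibility) context that
# GLFW creates when no core-profile hint is set.
for name, enum in (("red", GL_RED_BITS), ("green", GL_GREEN_BITS), ("blue", GL_BLUE_BITS)):
    print(name, "bits:", glGetIntegerv(enum))

glfw.terminate()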

I have not found much reference on the AMD site, but Eizo published this for CS6:
http://www.eizoglobal.com/support/compatibility/photoshopcs6_nvidia_amd/

I have seen references elsewhere that the general Nvidia and AMD cards have been hobbled, even though they may offer the same chipset, to mark the difference from the so-called professional cards (Quadro, FirePro) that can drive applications such as Photoshop, AutoCAD, Solidworks etc. in 10-bit mode. I hasten to add that this is web chatter rather than something I would consider fact.

Here is a test ramp, which should show no steps in gradation if 10-bit is active:
10 bit test ramp psd
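
If you would rather roll your own ramp than download the PSD, here is a short sketch along the same lines - mine, assuming the numpy and Pillow packages are installed - that writes a 16-bit grayscale gradient; it collapses to 256 visible steps on an 8-bit path but shows 1024 on a true 10-bit one:

Code:
# Build a 4096-pixel-wide 16-bit grayscale ramp. On an 8-bit display
# path the 4096 columns quantise to 256 levels (16-pixel-wide bands);
# a 10-bit path shows 1024 levels and looks essentially smooth.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 4096, 512
row = np.linspace(0, 65535, WIDTH, dtype=np.uint16)  # one full-range row
ramp = np.tile(row, (HEIGHT, 1))                     # repeat it downwards
Image.fromarray(ramp).save("ramp_16bit.tif")         # 16-bit grayscale TIFF

Open the TIFF in Photoshop at 100% zoom; any banding you see on a gradient this shallow is coming from the display path, not from the file.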
« Last Edit: September 22, 2017, 01:30:34 pm by TonyW »

digitaldog (Andrew Rodney) • Sr. Member • Posts: 20630 • http://www.digitaldog.net/
Re: 10-bit graphics cards, what gives?
« Reply #7 on: September 22, 2017, 03:38:03 pm »

Quote from: TonyW on September 22, 2017, 01:13:31 pm
Here is a test ramp, which should show no steps in gradation if 10-bit is active:
10 bit test ramp psd
That's what I use. However, there is still some debate about whether this is actually providing a full high-bit video path in Photoshop or whether it's applying some dither.
I'm using a 2016 MacBook Pro Touch Bar model, which I believe has two different graphics cards, and the ramp looks smooth to me in PS CC on the Mac. This is what I'm shown for Graphics:

[screenshot of the Mac's graphics info, showing a Radeon Pro GPU]
http://www.digitaldog.net/
Author "Color Management for Photographers".

Alan Goldhammer • Sr. Member • Posts: 4344 • A Goldhammer Photography
Re: 10-bit graphics cards, what gives?
« Reply #8 on: September 22, 2017, 05:26:44 pm »

Quote from: digitaldog on September 22, 2017, 03:38:03 pm
That's what I use. However, there is still some debate about whether this is actually providing a full high-bit video path in Photoshop or whether it's applying some dither.
I'm using a 2016 MacBook Pro Touch Bar model, which I believe has two different graphics cards, and the ramp looks smooth to me in PS CC on the Mac.
If it is a laptop GPU, it might not be true 10-bit, as only top-end workstation GPUs can enable this in PS. As noted in an earlier post, PS only implements 10-bit with OpenGL, not DirectX, where it's implemented through dithering. Are you looking at the ramp on the laptop screen or on an external monitor?

digitaldog (Andrew Rodney) • Sr. Member • Posts: 20630 • http://www.digitaldog.net/
Re: 10-bit graphics cards, what gives?
« Reply #9 on: September 22, 2017, 05:29:41 pm »

Quote from: Alan Goldhammer on September 22, 2017, 05:26:44 pm
If it is a laptop GPU, it might not be true 10-bit, as only top-end workstation GPUs can enable this in PS.
Yet the gradient is as smooth as a baby's behind. That being the case, should I care whether it is or isn't a true high-bit display path, at least for Photoshop?
Note too, earlier versions of OS X and Photoshop did not produce these smooth gradients.
I'm viewing on an NEC PA272W.
http://www.digitaldog.net/
Author "Color Management for Photographers".

Alan Goldhammer • Sr. Member • Posts: 4344 • A Goldhammer Photography
Re: 10-bit graphics cards, what gives?
« Reply #10 on: September 22, 2017, 05:33:17 pm »

It probably does implement it correctly. Here's the list of Mac products that implement OpenGL and OpenCL; you can confirm your model, as you didn't provide enough information for me to find it.  https://support.apple.com/en-us/HT202823

digitaldog (Andrew Rodney) • Sr. Member • Posts: 20630 • http://www.digitaldog.net/
Re: 10-bit graphics cards, what gives?
« Reply #11 on: September 22, 2017, 05:36:04 pm »

Quote from: Alan Goldhammer on September 22, 2017, 05:33:17 pm
It probably does implement it correctly. Here's the list of Mac products that implement OpenGL and OpenCL; you can confirm your model, as you didn't provide enough information for me to find it.  https://support.apple.com/en-us/HT202823
Neat, thanks! All I see listed next to my model (nearly the newest) is 4.1 (OpenGL) and 1.2 (OpenCL), so that's a yes, I suppose.
http://www.digitaldog.net/
Author "Color Management for Photographers".

TonyW • Sr. Member • Posts: 643
Re: 10-bit graphics cards, what gives?
« Reply #12 on: September 22, 2017, 05:49:39 pm »

Quote from: digitaldog on September 22, 2017, 03:38:03 pm
That's what I use. However, there is still some debate about whether this is actually providing a full high-bit video path in Photoshop or whether it's applying some dither.
One mention of dither, attributed to Lloyd Chambers and a conversation with someone at Adobe:
https://macperformanceguide.com/blog/2015/20151105_2300-OSX_ElCapitan-10bit-dualNEC.html

Quote
"Apple supports 30-bit output on iMac 5K under 10.11, while other devices get dithering with that option checked." That raises the question of what OS X 30-bit color really does, especially with different combinations of graphics cards and displays.

So it is not as clear as I thought it might be with the Mac, and I guess there has been no definitive statement by Adobe themselves?

As to PCs, I went through the choice and ended up with an Nvidia GTX 960 as a stop-gap until I could decide if 10-bit was really worth the extra expenditure, even though I have an Eizo display that definitely supports 30-bit. In a strange way I am pleased to say that it fails the test ramp file: it does show vertical banding, as would be expected, and I am confident that with a Quadro or FirePro (and maybe FireGL) the test file would look smooth. So far I have not come across any real-world image that displays any banding, and if I did, I think I would run a check print to verify whether it was baked into the data or just a screen effect. Incidentally, the check box for 30-bit does show and can be ticked in PS, but obviously it does nothing, as the graphics card does not support a 30-bit pipeline.

I am a little more confident that the info for Windows is fairly accurate: to make PS work in 10-bit you must use a suitable Quadro/FirePro card. Although I would be concerned about whether the driver issues have been resolved by Adobe - maybe there is more up-to-date info somewhere, but as of February 27, 2017:
https://helpx.adobe.com/photoshop/kb/photoshop-cc-gpu-card-faq.html

Quote
Photoshop graphics processor (GPU) card FAQ

30-bit Display (Windows® only): Allows Photoshop to display 30-bit data directly to screen on video cards that support it
Note: 30-bit display is not functioning correctly with current drivers. We are working to address this issue as soon as possible.




ChrisMax • Newbie • Posts: 22
Re: 10-bit graphics cards, what gives?
« Reply #13 on: September 23, 2017, 10:07:56 am »

In response to digitaldog:

You have a Radeon Pro card in your machine, which is a new line from AMD. I almost bought the Radeon Pro 5100 but found that the Nvidia P2000 was better for the Eizo. I'm pretty sure all the Radeon Pro cards can do 10-bit in Photoshop. That must be why the test is smooth on your monitor.

Link to some info:

http://creators.radeon.com/radeon-pro-workstation/
« Last Edit: September 23, 2017, 10:20:07 am by ChrisMax »

TonyW • Sr. Member • Posts: 643
Re: 10-bit graphics cards, what gives?
« Reply #14 on: September 23, 2017, 10:53:08 am »

Quote from: ChrisMax on September 23, 2017, 10:07:56 am
You have a Radeon Pro card in your machine, which is a new line from AMD. I almost bought the Radeon Pro 5100 but found that the Nvidia P2000 was better for the Eizo. I'm pretty sure all the Radeon Pro cards can do 10-bit in Photoshop.
I think you are probably correct, as Radeon Pro replaced the FirePro brand last year. And of course the Nvidia P2000 is a Quadro card.

digitaldog (Andrew Rodney) • Sr. Member • Posts: 20630 • http://www.digitaldog.net/
Re: 10-bit graphics cards, what gives?
« Reply #15 on: September 23, 2017, 12:12:23 pm »

I'm happy the gradients are smooth, but prior to all this (prior to the Apple OS release that finally supported it, along with an update from Adobe), the gradient on a high-bit panel was already pretty darn smooth, just not as smooth as after said updates and on my new MBP. So yeah, it's nice, but certainly not a necessity that IMHO warrants buying a new video card. I suspect the biggest contributors to near-perfect smoothness were my high-bit panel and the great functionality of SpectraView.


Am also happy to know that the smoothness isn't due to Adobe 'cheating' by using some kind of dither (if that is indeed the case).
http://www.digitaldog.net/
Author "Color Management for Photographers".

Alan Goldhammer • Sr. Member • Posts: 4344 • A Goldhammer Photography
Re: 10-bit graphics cards, what gives?
« Reply #16 on: September 23, 2017, 12:57:39 pm »

Quote from: digitaldog on September 23, 2017, 12:12:23 pm
Am also happy to know that the smoothness isn't due to Adobe 'cheating' by using some kind of dither (if that is indeed the case).
I don't think it would be Adobe 'cheating', as PS can only specify which interface of the video card driver it uses (in this case OpenGL rather than DirectX). Any dithering would be done by the GPU driver itself.
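
For what it's worth, here is a toy sketch - my own illustration, not anything from Adobe or the driver vendors - of why a dithered 8-bit path can still look smooth: each 10-bit level is spread over a small patch of 8-bit pixels whose average lands back on the original level.

Code:
# Quantise single 10-bit levels to 2x2 patches of 8-bit pixels using a
# Bayer 2x2 ordered dither. Plain truncation would map levels 100-103
# all to 25; the dithered patch means still tell them apart.
import numpy as np

bayer = np.array([[0, 2], [3, 1]]) / 4.0   # 2x2 ordered-dither thresholds

def dither_patch(level10):
    """One 10-bit level -> a 2x2 patch of 8-bit pixel values."""
    x = level10 / 4.0                       # ideal 8-bit value, usually fractional
    return np.floor(x + bayer).clip(0, 255).astype(np.uint8)

for level in (100, 101, 102, 103):          # four consecutive 10-bit levels
    patch = dither_patch(level)
    print(level, "->", patch.ravel(), "patch mean:", patch.mean())
    # means come out 25.0, 25.25, 25.5, 25.75 - the lost bits survive
    # as spatial noise that the eye averages away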

ChrisMax • Newbie • Posts: 22
Re: 10-bit graphics cards, what gives?
« Reply #17 on: September 23, 2017, 07:16:49 pm »

Quote from the MacRumors forum:

My NEC PA271W hooked up over a DisplayPort connection reported 10-bit color fine (460 was outputting to it via Monoprice USB-C to DP cable).

It seems the MacBook Pro itself has an 8-bit panel, but when hooked to an external 10-bit monitor the Radeon Pro 460 does output 10-bit color. (Hey, I've got nothing better to do on a Saturday night in the country than research this stuff. :)

Link to page:
https://forums.macrumors.com/threads/tell-my-why-you-guys-bought-a-device-with-no-hdr-support.2017611/page-4
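
If anyone wants to check their own setup, here is a quick sketch - mine, not from the MacRumors thread, and the exact wording system_profiler prints can vary by macOS version:

Code:
# Ask macOS what framebuffer depth each attached display is running at.
# A 10-bit path is typically reported as something like
# "Framebuffer Depth: 30-Bit Colour (ARGB2101010)".
import subprocess

out = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.splitlines():
    if "Depth" in line:
        print(line.strip())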