
Author Topic: LR 4 support of 10bit video pipeline?  (Read 7093 times)

wcwest

  • Newbie
  • Offline
  • Posts: 37
LR 4 support of 10bit video pipeline?
« on: August 27, 2012, 06:55:52 pm »

I have a PA271W monitor on a Win7 Pro workstation, and I'm wondering if there is any advantage to installing a Radeon video card that supports a 10-bit pipeline to the monitor. I know PS CS6 supports and benefits from 10-bit, but I question whether there is any advantage in LR4. I have looked at the ramp.psd file in PS CS6 and can see the vertical banding; however, when I open the same file in LR4 the banding is essentially gone. Several Google searches haven't answered the question. I would greatly appreciate an answer.
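For anyone who wants to try the same test without ramp.psd, here is a minimal sketch of a comparable gradient ramp. It assumes Python with the numpy and tifffile packages installed; the file name and the narrow tonal range are my own choices to make any banding easier to spot.

Code:
# A minimal sketch, not Adobe's ramp.psd: write a 16-bit grayscale ramp
# so a 10-bit pipeline has real precision beyond 8 bits to display.
import numpy as np
import tifffile  # assumed installed: pip install tifffile

WIDTH, HEIGHT = 2048, 256
# A shallow ramp over a narrow tonal range makes banding easiest to see.
ramp = np.linspace(0.25, 0.35, WIDTH)
img16 = np.tile(np.round(ramp * 65535).astype(np.uint16), (HEIGHT, 1))
tifffile.imwrite("ramp16.tif", img16)  # inspect in PS CS6 and LR4 side by side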

Thanks,
Bill

Sheldon N

  • Sr. Member
  • Offline
  • Posts: 828
Re: LR 4 support of 10bit video pipeline?
« Reply #1 on: August 27, 2012, 07:53:05 pm »

I have a 10-bit display set up with a FirePro card, and 10-bit display also works well in CS6. I use the FirePro V4900 video card via DisplayPort into a Dell U2711.

In LR 4.1 I see banding on the 10-bit test ramp in Loupe view, but if I view the same ramp in the Develop module there is no banding. Then again, I also see no banding in the Develop module when working on a non-10-bit display. I think there might be some sort of dithering going on, but I'm not sure.
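To show the kind of dithering I mean, here is a rough sketch in Python (my own guess at the general technique, not Lightroom's actual code; numpy assumed):

Code:
# Hard 8-bit quantization of a smooth gradient produces wide bands;
# adding ~half a code value of noise before rounding breaks them up.
import numpy as np

rng = np.random.default_rng(0)
grad = np.linspace(0.40, 0.50, 2048)                    # smooth float gradient
plain = np.round(grad * 255).astype(np.uint8)           # straight quantization
noise = rng.uniform(-0.5, 0.5, grad.shape)              # dither noise
dithered = np.clip(np.round(grad * 255 + noise), 0, 255).astype(np.uint8)

def mean_band_width(a):
    # average run length of identical adjacent values (wide runs = visible bands)
    return a.size / (np.count_nonzero(np.diff(a)) + 1)

print("mean band width, plain:   ", mean_band_width(plain))     # roughly 75-80 px
print("mean band width, dithered:", mean_band_width(dithered))  # roughly 3 px

The dithered version trades wide visible bands for fine noise that the eye averages out, which would explain a smooth-looking render even on an 8-bit display.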

My understanding was always that 10 bit display only worked in Photoshop.

« Last Edit: August 27, 2012, 08:02:24 pm by Sheldon N »
Sheldon Nalos

madmanchan

  • Sr. Member
  • Offline
  • Posts: 2115
Re: LR 4 support of 10bit video pipeline?
« Reply #2 on: August 27, 2012, 10:48:06 pm »

Library display (e.g., Grid, Loupe, etc.) is lower quality than Develop, because Library is essentially showing a JPEG, which is lossy compressed (quantized, etc.).  Thus banding artifacts that you may occasionally see in Library usually disappear in Develop.
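As a simple illustration of that quantization (my own sketch, not Lightroom internals), a fine gradient collapses to a handful of steps once it is forced into 8 bits:

Code:
# My illustration, not Lightroom code: an 8-bit preview cannot hold
# the fine steps of a gradient that higher-precision data resolves.
import numpy as np

grad = np.linspace(0.40, 0.45, 4096)                 # subtle tonal ramp
levels16 = np.unique(np.round(grad * 65535)).size    # steps at 16-bit precision
levels8  = np.unique(np.round(grad * 255)).size      # same ramp squeezed to 8 bits
print(levels16, "distinct levels at 16 bits ->", levels8, "at 8 bits")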
Eric Chan

mac_paolo

  • Sr. Member
  • Offline
  • Posts: 431
Re: LR 4 support of 10bit video pipeline?
« Reply #3 on: August 28, 2012, 05:19:19 am »

Quote
Library display (e.g., Grid, Loupe, etc.) is lower quality than Develop, because Library is essentially showing a JPEG, which is lossy compressed (quantized, etc.). Thus banding artifacts that you may occasionally see in Library usually disappear in Develop.

Eric, correct me if I'm wrong: Library view is in the Adobe RGB color space, while the Develop module is in LightroomRGB. Is that right?

hjulenissen

  • Sr. Member
  • Offline
  • Posts: 2051
Re: LR 4 support of 10bit video pipeline?
« Reply #4 on: August 28, 2012, 06:11:17 am »

It is my understanding that it is _really_ hard to see the difference between 8-bit and 10-bit delivery for real-world images, given that:
1. The amount of in-display processing is low (color correction, sharpening,...)
2. The display brightness/DR is moderate (as is the norm for photographers)

Remember that 8-9 bits of gamma-encoded image data correspond to roughly 12-14 bits of linearly encoded data in terms of perceptual fidelity. I.e., you might not need as many bits in your display/printer (nonlinear) as you do in your camera (linear), at least if you apply dynamic processing lightly.
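A quick back-of-the-envelope check of that claim, using the sRGB curve as the gamma encoding (my own sketch, not a citation):

Code:
# How many linear bits does it take to match the finest step that
# 8-bit sRGB encoding resolves near black? (sRGB EOTF per IEC 61966-2-1)
import numpy as np

def srgb_to_linear(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

codes = np.arange(256) / 255.0            # all 8-bit sRGB code values
steps = np.diff(srgb_to_linear(codes))    # linear-light step per code
bits = int(np.ceil(np.log2(1.0 / steps.min())))
print("linear bits to match 8-bit sRGB:", bits)   # prints 12

At 9 code bits the same arithmetic lands around 13 linear bits, which is the same ballpark as the 12-14 figure above.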

Of course, in the midst of a heavy image-processing pipeline, extra bits can make a real difference.

I have a Dell U2711 and an ATI graphics card + Win7-64, all capable of 10 bits AFAIK (don't know about Lightroom). But since the display only does 10 bits over DisplayPort, and there were some reports of stability issues with DP, I have not tested it.

-h
« Last Edit: August 28, 2012, 06:14:34 am by hjulenissen »

Sheldon N

  • Sr. Member
  • Offline
  • Posts: 828
Re: LR 4 support of 10bit video pipeline?
« Reply #5 on: August 28, 2012, 11:01:30 am »

Quote
It is my understanding that it is _really_ hard to see the difference between 8-bit and 10-bit delivery for real-world images

This is quite true. You probably won't notice a difference in most real-world images.

Where you might notice a difference is in the display of smooth gradients across areas of continuous tone.

Quote
I have a Dell U2711 and an ATI graphics card + Win7-64, all capable of 10 bits AFAIK (don't know about Lightroom).

Unless it's a FirePro card, I don't think you'll be able to get 10-bit display to work. I tried with a Radeon card and couldn't get it to work; only the FirePro would do it.
Sheldon Nalos