
Author Topic: sRGB tone response curve for calibrating a Spectraview Monitor?  (Read 11730 times)

tommm

  • Jr. Member
  • **
  • Offline
  • Posts: 78

As I understand it, the tone curve used to calibrate your screen should match the tone curve of the working space in the program used for editing. As I use Lightroom, and Lightroom uses an sRGB tone curve, would using an sRGB curve (an option with SpectraView) give better results than the normal 2.2?

Thanks,

Tom
Logged

Simon Garrett

  • Sr. Member
  • ****
  • Offline
  • Posts: 742
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #1 on: August 15, 2013, 06:15:40 am »

If colour management is working properly it ought not to matter, as the colour management will map the image data from working space TRC to the monitor TRC. 

However, two points:
  • The sRGB tone response curve is very close to a gamma of 2.2. See http://en.wikipedia.org/wiki/Srgb, and look at the graph on the right.  The red curve and the dashed black curve underneath it are sRGB and 2.2 gamma respectively.  You can't see the black curve as it's, err, behind the red curve!
  • Lightroom uses a linear tone response curve in its working space.  The sRGB TRC is used only in displaying the histogram. 
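The closeness of the two curves is easy to check numerically. A minimal Python sketch, using the published piecewise sRGB decoding formula against a plain 2.2 power law:

```python
# Compare the sRGB transfer curve with a plain 2.2 gamma curve.
def srgb_to_linear(v):
    # Piecewise sRGB decoding (IEC 61966-2-1): a linear segment
    # below 0.04045, a 2.4 power segment above it.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    return v ** 2.2

for v in [0.02, 0.1, 0.25, 0.5, 0.75, 1.0]:
    s, g = srgb_to_linear(v), gamma22_to_linear(v)
    print(f"{v:.2f}  sRGB={s:.5f}  gamma2.2={g:.5f}  diff={s - g:+.5f}")
```

The two agree closely through the midtones and highlights; the visible divergence is confined to the darkest input values, where the sRGB linear segment sits above the pure power curve.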
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #2 on: August 15, 2013, 09:25:42 am »

As I understand it the tone curve used to calibrate your screen should match the tone curve used in the working space of the programme used for editing.

Not at all. The two do not need to match.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

tommm

  • Jr. Member
  • **
  • Offline
  • Posts: 78
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #3 on: August 15, 2013, 01:34:37 pm »

Simon,

They are very similar but not the same, the difference being mainly in the shadows.

I'm pretty sure the sRGB curve is used in displaying the image not just the histogram, otherwise images would appear very dark.


Andrew,

I think you probably know more than they do, but the SpectraView manual suggested that L* was the best curve to use, though only if the program you use is also based on L* (Capture One apparently is). This is what led me to believe that using sRGB might be best with Photoshop, but if you say not, that's good enough for me.

Thanks
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #4 on: August 15, 2013, 01:38:07 pm »

L* is a 'big deal' in Europe and for the European version. There were a few discussions on the ColorSync list in the past which, to my reading, seem to indicate that L* has not gained peer-review acceptance. IOW, it's all the rage (like Lab editing) but with little to back up its usefulness, depending on which party you talk to.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

D Fosse

  • Guest
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #5 on: August 15, 2013, 05:40:58 pm »

L* gives a little shadow lift in the calibration LUT, to compensate for the native response of an average LCD. Of course this doesn't matter in a color managed environment where it's all accounted for and compensated in the profile. But outside a color managed environment it should improve the monitor's response, and give a better match between the two.

Personally I'm wary of anything that manipulates the monitor response more than absolutely necessary, especially in the shadows where there's great risk of banding. It might be acceptable with hardware calibration to the monitor's internal 10 bit LUT (NEC/Eizo), but certainly not with an 8 bit video LUT.
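The shadow lift shows up directly in the numbers. A small sketch (inverting the CIE L* formula to luminance, with a plain 2.2 power law for comparison):

```python
# Invert the CIE L* curve to luminance and compare with gamma 2.2
# near black, where the "shadow lift" of an L* calibration shows up.
def lstar_to_linear(v):
    # v in 0..1 maps to L* in 0..100; invert L* to relative luminance Y.
    L = v * 100.0
    if L > 8.0:
        return ((L + 16.0) / 116.0) ** 3
    return L / 903.3  # linear segment of the CIE L* formula

def gamma22_to_linear(v):
    return v ** 2.2

for v in [0.05, 0.10, 0.20]:
    print(f"input {v:.2f}: L* target {lstar_to_linear(v):.5f}, "
          f"gamma 2.2 target {gamma22_to_linear(v):.5f}")
```

For the same input, the L* target luminance near black is noticeably higher than the 2.2 target, which is the lift the calibration LUT has to apply.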

Here's how the Color Eyes Display Pro video LUT looks with the two options, gamma 2.2 on the left and L* on the right:
Logged

D Fosse

  • Guest
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #6 on: August 16, 2013, 03:01:53 am »

I should add that the left curve above is close to a straight line because the monitor has already been adjusted to match the calibration targets. I'd get an even straighter line if I did this to my work monitor, a hardware calibrated Eizo. But I don't have Color Eyes installed on that system.
Logged

Simon Garrett

  • Sr. Member
  • ****
  • Offline
  • Posts: 742
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #7 on: August 16, 2013, 04:07:38 am »

Simon,

They are very similar but not the same, the difference being mainly in the shadows.

I'm pretty sure the sRGB curve is used in displaying the image not just the histogram, otherwise images would appear very dark.
Yes, they are very similar, which is what I said.  But it doesn't matter if the TRC of the working space and the monitor are the same or different.  Colour management means that the software maps the colour space, white point and tone curve from that of the working space to that of the monitor.  To say, "...otherwise images would appear very dark" is not correct, as you see the same thing no matter what tone curve is used.    

And Lightroom really does use a linear tone curve in its working space (and an sRGB tone curve for histograms); see for example http://livedocs.adobe.com/en_US/Lightroom/1.0/help.html?content=WS0F7BFFFA-CE53-4ceb-B3D3-9D6256B8917D.html.  

Update: I think that reference is an old one (I just Googled for it).  A more up-to-date one seems to be http://help.adobe.com/en_US/lightroom/using/WS268F3399-80B2-4169-A598-04C7F769FFA0.html
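The mapping can be sketched with pure power functions standing in for the real profiles (a simplification: actual profiles also handle primaries and white point, not just the tone curve):

```python
# The same linear scene value, stored under three different working-space
# gammas, arrives at the same monitor value after colour management.
def encode(linear, gamma):
    return linear ** (1.0 / gamma)

def decode(encoded, gamma):
    return encoded ** gamma

scene = 0.18          # mid-grey, linear light
monitor_gamma = 2.2   # assumed monitor TRC

for working_gamma in (1.0, 1.8, 2.2):   # linear, ProPhoto-ish, sRGB-ish
    stored = encode(scene, working_gamma)            # working-space value
    on_monitor = encode(decode(stored, working_gamma), monitor_gamma)
    print(f"working gamma {working_gamma}: monitor value {on_monitor:.6f}")
```

The stored numbers differ wildly between working spaces, but after decoding to linear and re-encoding for the monitor they are identical, which is why the working-space TRC is invisible on screen.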
« Last Edit: August 16, 2013, 08:05:34 am by Simon Garrett »
Logged

eliedinur

  • Sr. Member
  • ****
  • Offline
  • Posts: 328
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #8 on: August 16, 2013, 04:49:29 am »


I'm pretty sure the sRGB curve is used in displaying the image not just the histogram, otherwise images would appear very dark.

As Simon says, because LR is color managed, any image displayed is converted to the monitor space, with its embedded TRC (whatever it may be) and its defined primary RGB values. The sRGB curve and the ProPhoto primaries can only be used for the histogram and its associated numerical readout.
« Last Edit: August 16, 2013, 04:51:51 am by elied »
Logged
Roll over Ed Weston,
Tell Ansel Adams th

tommm

  • Jr. Member
  • **
  • Offline
  • Posts: 78
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #9 on: August 19, 2013, 05:35:01 am »

Thanks for helping me get to grips with this. I think I'm starting to understand it a little better.

So, am I right in thinking that, as far as what the image looks like on screen is concerned, only the image's embedded profile (or lack of one) and the monitor TRC matter? With the working space (linear in Lightroom, whatever is chosen in Photoshop) purely being used for the calculations in the background and not affecting the visible image? (Though obviously bigger colour spaces allow those calculations to be done without unnecessary clipping and with minimum degradation until conversion to an output profile.)

In other words the same image opened in Lightroom and Photoshop should look the same on screen irrespective of the working space used in Photoshop?

If this is a correct understanding, then maybe you could also help me figure out why, when I open the same image in Photoshop and Lightroom, they don't look the same on screen, and look even more different when soft proofed in the two programs with the same soft-proofing profile, paper, and black point compensation (see separate post)?

Thanks for all the explaining!

Tom
Logged

D Fosse

  • Guest
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #10 on: August 19, 2013, 07:33:50 am »

It's really quite simple: the image data are converted from the source profile (linear ProPhoto in Lr; whatever you choose in Ps) to the monitor profile and then sent to the display. This is done by the application, on the fly, as you adjust the image.

So the result on screen should be identical whatever the source profile is, save for gamut limitations. If they are not identical, there is a problem with the monitor profile. The actual conversion to the monitor profile is different for different source profiles, so one conversion may go bad but not others. It can also happen that different applications react differently to problem profiles.

Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #11 on: August 19, 2013, 09:44:55 am »

L* is a 'big deal' in Europe and for the European version. There were a few discussions on the ColorSync list in the past which, to my reading, seem to indicate that L* has not gained peer-review acceptance. IOW, it's all the rage (like Lab editing) but with little to back up its usefulness, depending on which party you talk to.

As I understand it, the L* TRC has a theoretical advantage when one is encoding with limited precision (e.g. 8 bits), since the perceived change in brightness for one step is the same for small values as for larger values, and this helps to prevent banding at the lower luminances. As shown in Figure 2 of Greg Ward's article on encodings, a linear ramp has big changes in perceived brightness for the lower values, whereas a gamma 2.2 encoding evens out the changes in perceived brightness and is less susceptible to banding in the lower range.

The gamma that best approximates the L* curve is discussed by Bruce Lindbloom here. A gamma of approximately 2.2 is best, but there are still problems in the lower range, where the slope of a pure gamma encoding curve approaches infinity as the luminance approaches zero. This is addressed in sRGB by a linear segment at low luminances. The companding calculator only works for straight gamma TRCs and does not include an sRGB response. However, a gamma of 2.2 closely approximates the L* TRC except at the lowest luminances, and the sRGB TRC, with its linear segment at the low end, is sufficient for practical work.
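A rough numerical check of that argument, using the CIE L* formula as the measure of perceived lightness (a sketch, not a reproduction of Ward's or Lindbloom's figures):

```python
# Perceived-lightness step between adjacent 8-bit codes near black,
# for a linear encoding vs. a gamma 2.2 encoding.
def lightness(Y):
    # CIE lightness L* from relative luminance Y in 0..1.
    if Y > (6.0 / 29.0) ** 3:
        return 116.0 * Y ** (1.0 / 3.0) - 16.0
    return 903.3 * Y

def step(code, gamma):
    # L* difference between codes `code` and `code + 1` under the encoding.
    y0 = (code / 255.0) ** gamma
    y1 = ((code + 1) / 255.0) ** gamma
    return lightness(y1) - lightness(y0)

for code in (1, 5, 20):
    print(f"code {code:2d}: linear step {step(code, 1.0):.3f} L*, "
          f"gamma 2.2 step {step(code, 2.2):.3f} L*")
```

Under a linear encoding the first few codes jump by several L* units each (clearly visible as banding), while the gamma 2.2 encoding keeps the steps far smaller and more uniform.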

Regards,

Bill
Logged

WombatHorror

  • Sr. Member
  • ****
  • Offline
  • Posts: 299
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #12 on: September 05, 2013, 10:00:24 pm »

As I understand it the tone curve used to calibrate your screen should match the tone curve used in the working space of the programme used for editing. As I use Lightroom and Lightroom uses an sRGB tone curve would using an sRGB curve (an option with spectraview) give better results than the normal 2.2?

Thanks,

Tom

LR doesn't use an sRGB tone response curve. I think it uses the Melissa color space, which I believe is the gamut of ProPhoto RGB combined with a linear gamma, but it doesn't use ProPhoto's 1.8 gamma or the complex piecewise sRGB TRC, for sure.

It's more critical to set the monitor TRC to match for non-LR work, since LR at least uses color management fully and translates to whatever the monitor is set to, and it works in 16 bits, so there should not be much of an issue.

One thing is that if you set your monitor to gamma 2.2, people think that if they use something only partly color-managed, such as IE, they'll see everything properly, but no: IE doesn't use the monitor profile at all, so it doesn't translate tone curves, and even sRGB images are all shown wrong unless you calibrated your display to the sRGB TRC.

For photo work I set the monitor to its native gamut so I can use as much as it has to offer, and usually gamma 2.2; maybe the native TRC would be better, or just the same.

For TV/movies/some games I set it to sRGB gamut with gamma 2.2.

For the web, if I am not using a good browser such as Firefox but something so-so such as IE, or something terrible (in terms of color) such as Chrome, I set it to the sRGB gamut and the sRGB TRC.

If you use IE in sRGB-gamut mode to view sRGB images and the monitor is set to gamma 2.2, shadow detail is crushed, contrast looks a bit too high, highlights are a trace different, and saturation looks a trace too high. If you set the monitor to the sRGB TRC, they look right even in IE or Chrome (although wide-gamut images will look messed up in Chrome; if the monitor is in wide-gamut mode instead of sRGB, wide-gamut images will look worst in IE, pseudo-OK-ish but not really right in Chrome, and perfect in Firefox). If you use Firefox they look right no matter how anything is set or what gamut the images are in.
« Last Edit: September 05, 2013, 10:04:26 pm by WombatHorror »
Logged

WombatHorror

  • Sr. Member
  • ****
  • Offline
  • Posts: 299
Re: sRGB tone response curve for calibrating a Spectraview Monitor?
« Reply #13 on: September 05, 2013, 10:02:27 pm »

If colour management is working properly it ought not to matter, as the colour management will map the image data from working space TRC to the monitor TRC. 

However, two points:
  • The sRGB tone response curve is very close to a gamma of 2.2. See http://en.wikipedia.org/wiki/Srgb, and look at the graph on the right.  The red curve and the hashed black curve underneath it are sRGB and 2.2 gamma respectively.  You can't see the black curve as it's, err, behind the red curve!

All the same, the difference is pretty easy to spot. Just set the monitor to gamma 2.2 and then view in IE vs. Firefox: shadows clearly look too dark in IE, contrast a bit too high, saturation a trace too high, etc. It's much more noticeable than commonly thought.
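The size of that mismatch can be estimated: an sRGB-encoded value sent unmanaged to a 2.2-gamma display lands well below the intended luminance in the shadows. A minimal sketch, treating the display as an exact power law:

```python
# An sRGB-encoded pixel sent straight to a pure gamma 2.2 display,
# with no colour management in between.
def srgb_to_linear(v):
    # Piecewise sRGB decoding (IEC 61966-2-1).
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in [0.05, 0.10, 0.20, 0.50]:
    intended = srgb_to_linear(v)   # luminance the image actually encodes
    displayed = v ** 2.2           # luminance a 2.2-gamma screen produces
    print(f"{v:.2f}: intended {intended:.5f}, displayed {displayed:.5f}, "
          f"ratio {displayed / intended:.2f}")
```

Midtones come out nearly right, but the deepest shadow values are shown at a fraction of their intended luminance, which is exactly the crushed-shadow, raised-contrast look described above.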
Logged