
Author Topic: TRC of the monitor vs gamma of the image in color aware applications  (Read 14737 times)

tho_mas

  • Sr. Member
  • ****
  • Offline
  • Posts: 1799
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #20 on: January 14, 2013, 07:55:52 pm »

Photoshop can't really show the TRC with only one gamma value, but it's significantly close to gamma 2.2, so when you open the custom Profile dialog box, where it shows you a sort of DNA of the profile, it simply says simplified sRGB. :)
sRGB is only similar to Gamma 2.2 in mid and bright tones. In dark tonal values it's different (see attachment).
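
If you prefer numbers to a graph, here is a rough Python sketch (my own illustration, nothing to do with the attachment) comparing the piecewise sRGB decoding curve with a plain 2.2 power law at a few input levels:

def srgb_to_linear(v):
    """Piecewise sRGB decoding: a linear segment near black, a 2.4 power above it."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """Plain power-law 'gamma 2.2' decoding."""
    return v ** 2.2

for v in (0.01, 0.02, 0.05, 0.10, 0.25, 0.50, 0.75, 1.00):
    s, g = srgb_to_linear(v), gamma22_to_linear(v)
    print(f"input {v:.2f}   sRGB {s:.5f}   gamma 2.2 {g:.5f}   ratio {s / g:.2f}")

# The ratio is far from 1.0 for the darkest inputs and close to 1.0 from the
# midtones up, which is exactly the difference visible in the attached graph.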

Logged

Krunoslav Stifter

  • Newbie
  • *
  • Offline
  • Posts: 12
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #21 on: January 14, 2013, 07:58:41 pm »

If your display is an LCD, I'm really inclined to doubt your assessment that your display's "native gamma" is 1.9. How did you determine that? Most LCDs have a native gamma of 2.2-2.3, unless something else is going on with your display pipeline...

Well, I used the Spyder 3 Elite to measure the TRC using one of their tests. This is the result.



...unless I misinterpreted something, as you seem to suspect, I assumed it's gamma 1.9, since even out of the box it was quite bright.

EDIT: I just remembered to check the help section for the Spyder and here is what it says for the test.

"The Tone Response test measures the response (gamma) of the monitor and plots a graph comparing the measured tone response to standard gamma curves. If your monitor has a control for selecting different gamma settings, you can measure all of the different settings and then find out what is the actual gamma that the control sets. This helps you to select the best setting of such a control before calibration.

For example, a monitor may have a setting in its On Screen Display (OSD) for Gamma with choices "Gamma 1" and "Gamma 2". You can run the Tone Response test and measure the display in both of these settings. When the results are plotted you may find that "Gamma 1" is actually Gamma 2.1 and "Gamma 2" is actually Gamma 1.9. If you were then going to calibrate the monitor to Gamma 2.2 you would know that setting this control to "Gamma 1" would be the best choice since it is closer to the desired 2.2 target value."
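
I don't know exactly how the Spyder software arrives at its single number, but presumably it is a best-fit exponent through the measured patch luminances. Here is a rough sketch of that idea in Python, with made-up measurement values (not my actual readings):

import math

# Hypothetical measurements: (input level 0-1, luminance normalized so that
# black = 0 and white = 1). The real software reads these from the sensor.
measurements = [(0.10, 0.012), (0.25, 0.068), (0.50, 0.270),
                (0.75, 0.580), (0.90, 0.820)]

# Fit L = V**gamma by least squares in log-log space:
# log(L) = gamma * log(V)  =>  gamma = sum(x*y) / sum(x*x)
num = sum(math.log(v) * math.log(l) for v, l in measurements)
den = sum(math.log(v) ** 2 for v, _ in measurements)
print(f"best-fit gamma = {num / den:.2f}")   # about 1.9 for these made-up numbers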


This monitor is an older one and does not have an option to set gamma from the OSD.

Lenovo L220XWC 22-inch LCD Monitor

Panel Technology: S-PVA
Panel Manufacturer: Samsung
Glossy panel: no
Screen Size: 22.0 inches
Pixel Error Class: 2
Pixel Pitch: 0.246 mm
Response Time (rise + fall): -
Response Time (grey-to-grey): 6 ms
Native Resolution: 1920 x 1200
Brightness: 325 cd/m2
Contrast Ratio: 1200:1
Dynamic Contrast: -
Viewing Angle vertical (10:1): 178°
Viewing Angle horizontal (10:1): 178°
Viewing Angle vertical (5:1): -
Viewing Angle horizontal (5:1): -
Color depth: 16.7 million colors
Net dimensions (WxHxD): -
Housing Color: black
Input Video Signal: D-Sub (analog), DVI-D (digital)

Inputs (quantity): 2
HDCP: yes
S-Video Input: no
Base (swiveling): yes
Base (height adjustment): yes
Display slanting: yes
Pivotable: yes
Integrated Speakers: no
USB-Hub: yes
Weight: 7.2 kg

It doesn't mention gamma. I assumed the gamma measured with the Spyder 3 is the native gamma. Was I wrong?


GPU nVidia GeForce 8800 GT
OS: Windows 7 x64
« Last Edit: January 14, 2013, 08:48:10 pm by Krunoslav Stifter »
Logged

Krunoslav Stifter

  • Newbie
  • *
  • Offline
  • Posts: 12
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #22 on: January 14, 2013, 08:01:08 pm »

sRGB is only similar to Gamma 2.2 in mid and bright tones. In dark tonal values it's different (see attachment).

That is a nice image. Thanks. Can I use it in the tutorial if I need it, please?

But do you think it affected what I was trying to show, or are you saying that we can't talk about sRGB as being gamma 2.2 even though Adobe says it is? I mean, it would depend on the context in which we use it, right?
Logged

samueljohnchia

  • Sr. Member
  • ****
  • Offline
  • Posts: 498
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #23 on: January 14, 2013, 08:15:56 pm »

That is a nice image. Thanks. Can I use it in the tutorial if I need it, please?

But do you think it affected what I was trying to show, or are you saying that we can't talk about sRGB as being gamma 2.2 even though Adobe says it is? I mean, it would depend on the context in which we use it, right?

Adobe does not ever mention that sRGB is gamma 2.2! Please ignore the misleading "simplified gamma" in the custom profile dialog box. Photoshop honors the 1024-point TRC in the sRGB profile. Again, for accuracy, don't call it your display's "gamma"; it's a TRC. And sRGB is not 2.2...
Logged

tho_mas

  • Sr. Member
  • ****
  • Offline
  • Posts: 1799
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #24 on: January 14, 2013, 08:18:09 pm »

That is a nice image. Thanks. Can I use it in the tutorial if I need it, please?
Sure... but better use this one (pure black was cropped in the first image).
Logged

samueljohnchia

  • Sr. Member
  • ****
  • Offline
  • Posts: 498
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #25 on: January 14, 2013, 08:21:00 pm »

Well, I used the Spyder 3 Elite to measure the TRC using one of their tests. This is the result.
...unless I misinterpreted something, as you seem to suspect, I assumed it's gamma 1.9, since even out of the box it was quite bright.

It is possible that the Spyder software is approximating your too-bright display to gamma 1.9 rather than gamma 2.2. tlooknbill suggested this early on by saying that you can change the display "gamma" by adjusting the brightness setting. Not the most technically correct statement, but we get the idea. If you lower your brightness setting, it may reflect a native TRC closer to gamma 2.2.
Logged

Krunoslav Stifter

  • Newbie
  • *
  • Offline
  • Posts: 12
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #27 on: January 14, 2013, 08:33:44 pm »

Adobe does not ever mention that sRGB is gamma 2.2! Please ignore the misleading "simplified gamma" in the custom profile dialog box. Photoshop honors the 1024-point TRC in the sRGB profile. Again, for accuracy, don't call it your display's "gamma"; it's a TRC. And sRGB is not 2.2...

How about approx Gamma 2.2, does that satisfy you?

Context is everything. To an average user it won't matter much and would only add to the confusion, but I'm not dealing with average users here, so I will use "TRC of sRGB" from now on.

Sure... but better use this one (pure black was cropped in the first image).

Thank you. :)


It is possible that the Spyder software is approximating your too-bright display to gamma 1.9 rather than gamma 2.2. tlooknbill suggested this early on by saying that you can change the display "gamma" by adjusting the brightness setting. Not the most technically correct statement, but we get the idea. If you lower your brightness setting, it may reflect a native TRC closer to gamma 2.2.

Not sure. But if you can affect the measured gamma of the display by changing the brightness setting, won't you compromise the target "luminance" in the process? Even if I do that, there had to be a native gamma set by the manufacturer, right? If I can affect the gamma myself, then the term "native gamma" seems to lose its importance, doesn't it?

I can't choose native gamma in the calibration settings, only this.



Under "Other" I can manually specify whatever I wish.



And there is also a non-gamma option for adjusting a more complex TRC.



Logged

samueljohnchia

  • Sr. Member
  • ****
  • Offline
  • Posts: 498
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #28 on: January 14, 2013, 10:23:41 pm »

Quote
How about approx Gamma 2.2, does that satisfy you?

Context is everything. To an average user it won't matter much and would only add to the confusion, but I'm not dealing with average users here, so I will use "TRC of sRGB" from now on.

I fully agree that context is everything! If you include tho_mas' file in your tutorial, it would show that sRGB indeed does not have a 2.2 gamma in that context. In other contexts, sRGB could be a little different...

Quote
Not sure. But if you can affect the measured gamma of the display by changing the brightness setting, won't you compromise the target "luminance" in the process?

Yes, that was why in one of my earlier posts I mentioned that the Brightness setting is meant to allow you to achieve your target luminance/white level, not a target gamma. But as all devices are non-linear, lowering the display brightness from its out-of-the-box too-bright state to something around 100 cd/m2 may shift its TRC a bit...

Quote
I can't choose native gamma in the calibration settings, only this.

If I'm not mistaken, all the commonly known display calibration and profiling packages attempt to map the TRC of a display to a single gamma value, or to some special gray curve like sRGB-1024 or L*. If you wish to calibrate it to "native gamma", you let the software measure the uncalibrated display (as you have done) and then calibrate to that gamma value. That would be the closest to the native TRC of the display. It might be possible to plot the native TRC of your display and build a gray curve profile to use under the non-gamma > other option, but that's a bit extreme...
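
To make that concrete, here is a rough sketch (my own simplification, not what the Spyder software literally does) of the kind of single-channel correction table a calibration package would load into the video card when remapping a native response of roughly 1.9 to a 2.2 target:

NATIVE_GAMMA = 1.9   # assumed measured native response
TARGET_GAMMA = 2.2   # calibration target

# Each LUT entry is chosen so that, after the panel applies its native 1.9
# response, the light leaving the screen follows input**2.2 instead.
lut = [round(((i / 255.0) ** (TARGET_GAMMA / NATIVE_GAMMA)) * 255) for i in range(256)]
print(lut[:8], "...", lut[-4:])

# If TARGET_GAMMA == NATIVE_GAMMA the exponent becomes 1.0 and the table is a
# straight identity ramp - which is what calibrating to "native gamma" amounts
# to: only white point / neutrality corrections are left for the LUT to do.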

P.S. I've sent you a PM...
« Last Edit: January 14, 2013, 10:25:25 pm by samueljohnchia »
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #29 on: January 15, 2013, 12:34:51 am »

Quote
Not sure. But if you can affect the measured gamma of the display by changing the brightness setting, won't you compromise the target "luminance" in the process? Even if I do that, there had to be a native gamma set by the manufacturer, right? If I can affect the gamma myself, then the term "native gamma" seems to lose its importance, doesn't it?

I can't choose native gamma in the calibration settings, only this.

What happens when you choose that radio button titled "Non-Gamma"? Maybe that's Datacolor's term for Native Gamma.

Seeing that your display is S-PVA with an outrageously high brightness and contrast ratio, you haven't mentioned your display's white luminance (in cd/m2) as measured by the Spyder3. It needs to be at or below 120 cd/m2 unless you're working in a bright office environment.

Like I mentioned before, you need to adjust the display's OSD Brightness and Contrast so that its native gamma measures 2.2 and white measures between 100-120 cd/m2. The way I do this on my Dell 2209WA is to load a 21-step linear gray ramp, first made in Lab space and then converted to AdobeRGB, as my desktop. See the attached image below of my Dell showing how this gray ramp should look with regard to tonal distribution. A gray ramp created with 2.2 gamma will not allow you to clearly see how black appears next to another tone.
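
If you want to roll your own ramp instead of using my attachment, here is a rough sketch of the idea in Python (a simplified AdobeRGB encode and assumed values, not the exact recipe behind the attached file):

def lstar_to_Y(lstar):
    """CIE L* (0-100) to relative luminance Y (0-1)."""
    if lstar > 8.0:
        return ((lstar + 16.0) / 116.0) ** 3
    return lstar / 903.3

ADOBE_RGB_GAMMA = 563.0 / 256.0   # AdobeRGB (1998) encoding gamma, roughly 2.2

# 21 patches with equal L* spacing (0, 5, 10 ... 100), expressed as AdobeRGB
# gray values - the same idea as building the ramp in Lab and converting.
ramp = [round((lstar_to_Y(i * 5.0) ** (1.0 / ADOBE_RGB_GAMMA)) * 255) for i in range(21)]
print(ramp)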

You need to adjust the brightness and contrast OSD buttons with NO LUT/Spyder3 profile loaded. Select sRGB as your system profile temporarily. You may need to reboot Windows if the display doesn't change globally when switching from the Spyder3 profile to sRGB.

If there is a noticeable change on the fly, then start adjusting the brightness/contrast buttons to make that gray ramp show an even and gradual distribution of each step from black to white. Also, while you're at it, adjust the RGB gains, if available on your display, so white looks neutral, or if no gains are available, pick a color temp/WB setting that gives the most neutral appearance.

When done, reprofile with the Spyder3 choosing Native WB and 2.2 gamma target. Check to see if Spyder3 measures native gamma as 2.2 as well.

I've also included the AdobeRGB tagged 21stepLinearLabGrayramp for you to use.




« Last Edit: January 15, 2013, 12:50:06 am by tlooknbill »
Logged

Krunoslav Stifter

  • Newbie
  • *
  • Offline
  • Posts: 12
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #30 on: January 15, 2013, 01:58:37 am »

What happens when you choose that radio button titled "Non-Gamma"? Maybe that's Datacolor's term for Native Gamma.

"Non-Gamma" simply allows TRC that is more complex than one that can be describe with a single gamma value. I don't think Native gamma is mentioned anywhere, but under "gamma > other settings" I can spcify any gamma value that I want. I posted a screen shot earlier of that.

Seeing that your display is S-PVA with an outrageously high brightness and contrast ratio, you haven't mentioned your display's white luminance (in cd/m2) as measured by the Spyder3. It needs to be at or below 120 cd/m2 unless you're working in a bright office environment.

I set it to 110 cd/m2, which works fine for my environment. I think the lowest I tried was 90 cd/m2.

You need to adjust the brightness and contrast OSD buttons with NO LUT/Spyder3 profile loaded. Select sRGB as your system profile temporarily. You may need to reboot Windows if the display doesn't change globally when switching from the Spyder3 profile to sRGB.

I only have a brightness option in the OSD, not contrast. And I believe sRGB is set as the Windows default; it is Microsoft, after all.

If there is a noticeable change on the fly, then start adjusting the brightness/contrast buttons to make that gray ramp show an even and gradual distribution of each step from black to white. Also, while you're at it, adjust the RGB gains, if available on your display, so white looks neutral, or if no gains are available, pick a color temp/WB setting that gives the most neutral appearance.

When done, reprofile with the Spyder3 choosing Native WB and 2.2 gamma target. Check to see if Spyder3 measures native gamma as 2.2 as well.

I've also included the AdobeRGB tagged 21stepLinearLabGrayramp for you to use.

First I just ran the Tone Curve test with the profile loaded and it was gamma 1.9. Then I deactivated the Spyder profile, set sRGB as the default, and it was still gamma 1.9 despite the screen being brighter. So my guess is that when the Spyder runs the test it measures the TRC on its own, without the loaded profile. I also tried changing the brightness, not drastically, but enough that it should have made a difference. It still showed gamma 1.9 for the measured tone curve of the display.

« Last Edit: January 15, 2013, 02:00:35 am by Krunoslav Stifter »
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 2436
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #31 on: January 15, 2013, 02:52:12 am »

Quote
It still showed gamma 1.9 for the measured tone curve of the display.


Since you don't have a contrast adjustment, I'd suggest you choose 1.9 as your target gamma. That way it will act somewhat as a null effect, basically forcing/tricking the Spyder software into accepting the target as native gamma, since that's what it measures as your native response. There's your native gamma setting; it just doesn't come with a menu selection.

Hopefully this will induce the software to build a profile that doesn't drastically and globally change the brightness of the display (as choosing 1.00 gamma would) and instead builds RGB LUT curves that are basically flat/linear in shape and only correct for neutrality throughout the tonal scale according to the white balance measurements, similar to a Mac's VCGT (video card gamma tag).
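
Just to illustrate what I mean by flat/linear-shaped curves that only correct neutrality, here is a rough sketch with made-up gain numbers (not my actual measurements):

# Per-channel ramps of the kind a VCGT / video card LUT holds when the only
# job left is neutralizing the grays. Here white measured slightly blue, so
# the blue ramp is pulled back a touch while red and green stay nearly straight.
channel_gain = {"red": 1.00, "green": 0.99, "blue": 0.95}   # assumed, not measured

vcgt = {name: [round((i / 255.0) * gain * 255) for i in range(256)]
        for name, gain in channel_gain.items()}

# Endpoints show the small pull-back on blue: red 255, green 252, blue 242.
print({name: ramp[-1] for name, ramp in vcgt.items()})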

See the screen grab below of the blue channel curve that's embedded in my i1Display profile and downloaded to the video card. Red and green are similarly shaped, but slightly pulled back or pushed forward according to the color tint of white, in order to make the tint of the grays match the tint of white.
Logged

Krunoslav Stifter

  • Newbie
  • *
  • Offline
  • Posts: 12
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #32 on: January 15, 2013, 02:59:13 am »


Since you don't have a contrast adjustment, I'd suggest you choose 1.9 as your target gamma. That way it will act somewhat as a null effect, basically forcing/tricking the Spyder software into accepting the target as native

Yes, but the difference between 2.2 and 1.9 is still pretty big. I don't want my UI and everything else to be that much brighter; I would rather stick to gamma 2.2, or possibly L-star. I don't see any advantage in calibrating to gamma 1.9, even if that is the native gamma. My gradients on this display are still not going to be perfect either way; I tried it and everything just got brighter. Everything outside color-aware applications is already way too saturated, and I don't want to make it brighter on top of that as well. BTW, everything is oversaturated because it's a wide-gamut display. For example, I can only use Firefox with color management activated for all rendered graphics; all other browsers are completely unusable. So alongside the gamut problems I would like to avoid gamma problems as well. My safest bet is to simply calibrate to the standard 2.2 and that's it.
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: TRC of the monitor vs gamma of the image in color aware applications
« Reply #33 on: January 15, 2013, 03:09:49 am »

You're right. I forgot the brightening effect of a 1.9 gamma on the non-color-managed apps and the OS GUI. A 2.2 gamma target it is.
Logged