Luminous Landscape Forum

Raw & Post Processing, Printing => Colour Management => Topic started by: jljonathan on October 18, 2009, 12:57:46 pm

Title: 2.2 or L* ?
Post by: jljonathan on October 18, 2009, 12:57:46 pm
I have used ColorEyes to calibrate an older-model 24" iMac to 90 cd/m2. I have run it at both 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer a recommendation as to which would be preferable? I work in PS CS4 using 16-bit ProPhoto.
Thanks
Jonathan
Title: 2.2 or L* ?
Post by: Czornyj on October 18, 2009, 01:47:35 pm
Quote from: jljonathan
I have used ColorEyes to calibrate an older-model 24" iMac to 90 cd/m2. I have run it at both 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer a recommendation as to which would be preferable? I work in PS CS4 using 16-bit ProPhoto.
Thanks
Jonathan

It's not a good idea to calibrate an iMac - which is natively gamma 2.2 calibrated - to an L* TRC. It's also not a good idea to calibrate to L* when your working space (ProPhoto) is gamma 1.8 encoded. In your situation, gamma 2.2 is the optimal choice.
Title: 2.2 or L* ?
Post by: Mark D Segal on October 18, 2009, 03:24:08 pm
You didn't explain the nature of the differences you are seeing. In the final analysis, one purpose of colour management (though not the only one) is to achieve a good match between what you see on your display and what comes out of your printer. Use whichever gamma setting systematically does a better job of this with your equipment and images.
Title: 2.2 or L* ?
Post by: Czornyj on October 18, 2009, 04:09:13 pm
Quote from: MarkDS
You didn't explain the nature of the differences you are seeing. In the final analysis, one purpose of colour management (though not the only one) is to achieve a good match between what you see on your display and what comes out of your printer. Use whichever gamma setting systematically does a better job of this with your equipment and images.

It's actually a good idea to match the gamma of the monitor to the TRC of our working space to minimize banding, but it only pays off if we have a display with a high-bit panel, LUT and hardware calibration. In the case of an L*-calibrated iMac and ProPhoto, the banding will only get worse.
Title: 2.2 or L* ?
Post by: Mark D Segal on October 18, 2009, 07:46:37 pm
I calibrated and profiled my LaCie 321 (1st edition) with ColorEyes Display, used L* as the gamma parameter and I have no problem with banding.
Title: 2.2 or L* ?
Post by: Czornyj on October 19, 2009, 03:57:53 am
Quote from: MarkDS
I calibrated and profiled my LaCie 321 (1st edition) with ColorEyes Display, used L* as the gamma parameter and I have no problem with banding.

But the LaCie is a 10-bit panel with a 12-bit LUT and internal calibration. On the iMac you can only calibrate the 8-bit LUT on the graphics card. The iMac is gamma 2.2 precalibrated, so when you calibrate it to L* you lose at least 22 levels per channel, and then you lose another 20-30 levels per channel to display a gamma 1.8 encoded image on an L*-calibrated display.
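A rough sketch of where numbers like those can come from (my own arithmetic, not Czornyj's; it assumes an idealized gamma 2.2 panel, an 8-bit video-card LUT and an 8-bit source-to-display conversion, and the exact counts depend on rounding):

Code:
import numpy as np

def lstar_to_Y(L):
    # CIE L* -> relative luminance Y (0..1)
    L = np.asarray(L, dtype=float)
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

def Y_to_lstar(Y):
    # relative luminance Y (0..1) -> CIE L*
    Y = np.asarray(Y, dtype=float)
    return np.where(Y > 0.008856, 116.0 * Y ** (1.0 / 3.0) - 16.0, 903.3 * Y)

v = np.arange(256) / 255.0

# 1) Pull a natively gamma 2.2 panel to an L* tone curve through an 8-bit LUT
#    and count how many distinct panel codes survive.
lut = np.rint(255.0 * lstar_to_Y(100.0 * v) ** (1.0 / 2.2))
print(len(np.unique(lut)))    # well short of 256 - roughly 20-25 levels collapse

# 2) Then send a gamma 1.8 encoded gray ramp to that L*-calibrated display in 8 bits.
codes = np.rint(255.0 * Y_to_lstar(v ** 1.8) / 100.0)
print(len(np.unique(codes)))  # roughly another 25-35 levels collapse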
Title: 2.2 or L* ?
Post by: Mark D Segal on October 19, 2009, 08:37:45 am
OK, I missed the iMac part of it, and the gamma recommendation depends, inter alia, on the hardware one is using.

Title: 2.2 or L* ?
Post by: digitaldog on October 19, 2009, 10:36:39 am
L* calibration for displays is all the rage in Europe for some reason. It's yet to be proven useful in any peer-reviewed scientific piece I'm aware of. This was debated on the ColorSync list a while ago. One of the most readable posts was from Lars Borg of Adobe (their head color scientist):


Quote
L* is great if you're making copies. However, in most other
scenarios, L* out is vastly different from L* in.  And when L* out is
different from L* in, an L* encoding is very inappropriate as
illustrated below.

Let me provide an example for video. Let's say you have a Macbeth
chart. On set, the six gray patches would measure around  L* 96, 81,
66, 51, 36, 21.

Assuming the camera is Rec.709 compliant, using a 16-235 digital
encoding, and the camera is set for the exposure of the Macbeth
chart, the video RGB values would be 224,183,145,109,76,46.

On a reference HD TV monitor they should reproduce at L* 95.5, 78.7,
62.2, 45.8, 29.6, 13.6.
If say 2% flare is present on the monitor (for example at home), the
projected values would be different again, here: 96.3, 79.9, 63.8,
48.4, 34.1, 22.5.

As you can see, L* out is clearly not the same as L* in.
Except for copiers, a system gamma greater than 1 is a required
feature for image reproduction systems aiming to please human eyes.
For example, film still photography has a much higher system gamma
than video.

Now, if you want an L* encoding for the video, which set of values
would you use:
96, 81, 66, 51, 36, 21 or
95.5, 78.7, 62.2, 45.8, 29.6, 13.6?
Either is wrong, when used in the wrong context.
If I need to restore the scene colorimetry for visual effects work, I
need 96, 81, 66, 51, 36, 21.
If I need to re-encode the HD TV monitor image for another device,
say a DVD, I need 95.5, 78.7, 62.2, 45.8, 29.6, 13.6.

In this context, using an L* encoding would be utterly confusing due
to the lack of common values for the same patches.  (Like using US
Dollars in Canada.)
Video solves this by not encoding in L*. (Admittedly, video encoding
is still somewhat confusing. Ask Charles Poynton.)

When cameras, video encoders, DVDs, computer displays, TV monitors,
DLPs, printers, etc., are not used for making exact copies, but
rather for the more common purpose of pleasing rendering, the L*
encoding is inappropriate as it will be a main source of confusion.

Are you planning to encode CMYK in L*, too?

And
Quote
This discussion seems to use the wrong premises,
focusing on one very narrow point,  the "optimal"
TRC for non-rerendered grays. This is a rat hole.
Grays make up only 0.00152588 percent of your
device colors. Unrendered colors are
uninteresting, unless you're making copiers. So
look at the other issues.

First, why does the L* gray TRC matter, when all
devices, even the eye, have to rerender for
optimal contrast (unless you're making copiers).
The digital system has to recode any (L* or not)
encoded data into other (L*) encoded data. To
make this more clear, optimally reproduce the
same image on newsprint, SWOP and transparency.
Clearly, the in-gamut L* values will differ. Show
me how an L* TRC would reduce quantization errors
when re-encoding from one dynamic range to
another.

Second, so far this discussion has completely
ignored colors. Even with L* TRCs, you have to
encode non-neutral colors. I know of no 8-bit
encoding (L* or not) that reduces quantization
errors when you convert say all saturated greens
from eci RGB to Adobe RGB. Show me the
quantization errors with different TRCs.

Third, where is the scientific foundation? Where
is the science that shows that the eye has a
natural L* TRC for any arbitrary color, not only
for neutrals? Where is the science that shows
that the eye has a natural L* TRC for neutrals at
arbitrary luminance levels and arbitrary flare
levels?
As far as I can tell, CIE LAB doesn't show any such thing.

I'm not picking on anyone in particular, but maybe you, Karl, could answer?

Lars
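As a minimal sketch of the camera-side arithmetic in the first quote (my own code, assuming the standard Rec.709 OETF and 16-235 quantization; the monitor-side L* figures additionally depend on whatever reference EOTF and flare Lars assumed, so it stops at the code values):

Code:
import numpy as np

def lstar_to_Y(L):
    # CIE L* -> relative luminance Y (0..1)
    L = np.asarray(L, dtype=float)
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

def rec709_oetf(Y):
    # Rec.709 camera (scene light -> signal) transfer function
    Y = np.asarray(Y, dtype=float)
    return np.where(Y < 0.018, 4.5 * Y, 1.099 * Y ** 0.45 - 0.099)

grays_on_set = np.array([96, 81, 66, 51, 36, 21])   # Macbeth grays, scene L*
codes = np.rint(16 + 219 * rec709_oetf(lstar_to_Y(grays_on_set)))
print(codes)   # 224, 183, 145, 109, 76, 46 - the video RGB values quoted above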

Title: 2.2 or L* ?
Post by: digitaldog on October 19, 2009, 10:41:13 am
Quote from: Czornyj
iMac is gamma 2.2 precalibrated...


I don’t know that that's true for all iMacs. I recall a session at PPE with Karl Lang and Chris Murphy reporting that their data showed the iMacs they tested actually had a native TRC gamma of 1.8. This was a good two years ago, and it wouldn't surprise me based on Apple's ideas about gamma, which they thankfully fixed in Snow Leopard.
Title: 2.2 or L* ?
Post by: jljonathan on October 19, 2009, 11:40:11 am
Thanks for the replies. 2.2 seems to be what's suggested and what I will try. Apple's site also recommends 2.2 and D65.
Jonathan
Title: 2.2 or L* ?
Post by: Czornyj on October 19, 2009, 11:49:21 am
Quote from: digitaldog
I don’t know that that's true for all iMacs. I recall a session at PPE with Karl Lang and Chris Murphy reporting that their data showed the iMacs they tested actually had a native TRC gamma of 1.8. This was a good two years ago, and it wouldn't surprise me based on Apple's ideas about gamma, which they thankfully fixed in Snow Leopard.

Quote from: jljonathan
Thanks for the replies. 2.2 seems to be what's suggested and what I will try. Apple's site also recommends 2.2 and D65.
Jonathan

If my memory serves me well, the two samples of the white 24" iMac that I profiled were ~2.2 rather than 1.8, but it could change over time. The native white point was at ~6700-6800K, so yes, D65 seems to be a good recommendation.
Title: 2.2 or L* ?
Post by: JeffKohn on October 19, 2009, 02:54:54 pm
I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.
Title: 2.2 or L* ?
Post by: Czornyj on October 19, 2009, 03:43:16 pm
Quote from: JeffKohn
I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.

On an L*-calibrated display the shadows may only look different in a non-color-managed environment; otherwise it doesn't make a difference.
Title: 2.2 or L* ?
Post by: Brian Gilkes on October 19, 2009, 08:02:30 pm
The opening up of shadows can be very useful for some images, especially if perceptual edits for profiles or RIPs ensure deep saturated shadows maximize colour and not black ink. You can get deep, old-tapestry colours. I would still stick to 2.2, though. A superior approach is to duplicate the file, convert the duplicate's mode to Lab, and duplicate the background layer. Move this layer onto the top of the original file and blend on luminosity. This will open the shadows. If the effect is too great, adjust the opacity. This gives control and a printable result. Assigning the screen response to L* will not, as screen steps and printer steps will not match. I see no reason to adjust Mac screens to 1.8. In addition, if an accurate perceptual response is required, a file space using Joe Holmes's spaces will give a better result than Adobe RGB etc. With screen, file and output spaces to be considered, the whole thing gets a bit complicated. If all else fails, follow the instructions.
Hope this helps
Brian
www.pharoseditions.com.au
Title: 2.2 or L* ?
Post by: MPatek on October 20, 2009, 01:25:47 am
Recently I did some comparison of gamma 2.2 and L-star calibration using theoretical models as well as a "real" calibrated display. Graphs are included in a rather technical discussion at this page (http://www.marcelpatek.com/gamma.html#lstar).

As others have already pointed out, calibration to an L-star gamma indeed results in opened shadows, clearly apparent in non-color-managed viewers for untagged images. I found this distracting when viewing images, web sites and pictures on the web. The shadow opening is noticeable relative to both gamma 1.8 and 2.2. I remember seeing quite posterized web graphics - compressed JPEGs optimized for gamma 2.2. So unless you work with ICC-profile-tagged images in a completely color-managed environment, I also recommend a gamma of 2.2 for general work.
Title: 2.2 or L* ?
Post by: Arkady on October 21, 2009, 05:06:29 pm
Lars is talking about encoding. I don't think it is directly applicable to monitor rendering (though indirectly it is, but the conclusion could be different).

Quote from: jljonathan
I have used Coloreyes to calibrate an older model Imac 24" to 90 cd/m2. I have run it using 2.2 and L* and do see a shift between the two gammas. Could someone please explain the differences between the two that would account for what I am seeing, and offer some recommendation as to which would be preferable. I work in PS CS4 using ProPhoto 16 bit.

The idea behind gamma 2.2 is to have a non-color-managed solution for an sRGB workflow. There are no other benefits, except that manufacturers may hardwire their monitors to be close to 2.2.

The idea behind L* is to have as linear a color transform as possible while rendering to a monitor. That is to say, since all the color transforms happen in a single 3D LUT, commonly with tetrahedral interpolation, L*-ing your monitor allows a more linear relationship between the PCS (CIE Lab) and the device (your monitor). If L* were the only nonlinear component in the transformation, then matching your monitor to L* would allow "lossless" color rendering to the monitor, namely no bit resolution would be sacrificed during rendering. That means it would reduce banding! Sounds exciting?! Well, the problem is that your monitor has its own LUT, so if the native gamma is 2.7 (quite a common case) any deflection from the native gamma results in losing resolution bits.
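A rough sketch of that trade-off (my own arithmetic, assuming an idealized panel with the native gamma of 2.7 used as the example above and only an 8-bit LUT to reshape it):

Code:
import numpy as np

def lstar_to_Y(L):
    # CIE L* -> relative luminance Y (0..1)
    L = np.asarray(L, dtype=float)
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

native_gamma = 2.7                  # example native panel response from the post
v = np.arange(256) / 255.0

def surviving_levels(target_Y):
    # 8-bit LUT entries needed to make the native panel produce target_Y
    lut = np.rint(255.0 * target_Y ** (1.0 / native_gamma))
    return len(np.unique(lut))

print(surviving_levels(v ** 2.2))             # pull the panel to gamma 2.2
print(surviving_levels(lstar_to_Y(100 * v)))  # pull the panel to L*
# Either deflection from the native response costs levels; how many depends on
# how far the target curve sits from the native one.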

Is it beneficial? Theoretically it might be, if you have a higher-bit monitor that allows some room for TRC adjustment without losing color resolution, or if the monitor is a CRT driven through a VGA cable.

However, in general the answer will depend on many components - the color transform LUT dimensions, the interpolation used by the CMM, the monitor's native gamma, and the LUT bit depth and size.

I would think that in the real world an optimal TRC lies somewhere between the native gamma and L*.


BTW, I don't think ProPhoto is a good choice, at least not in an ICC-based workflow.
Title: 2.2 or L* ?
Post by: digitaldog on October 21, 2009, 07:14:08 pm
Quote from: Arkady
The idea behind gamma 2.2 is to have a non-color-managed solution for an sRGB workflow. There are no other benefits, except that manufacturers may hardwire their monitors to be close to 2.2.

No, there ARE other benefits, mainly that the TRC gamma of displays is generally pretty darn close to 2.2!

Of course Lars is talking about encoding, that’s the entire topic here!

As for ProPhoto, let's not go there; there's too much else to disagree about...
Title: 2.2 or L* ?
Post by: tho_mas on October 22, 2009, 11:01:10 am
Quote from: JeffKohn
I have an Eizo display with its own LUT, but I still prefer gamma 2.2 over L*. What I've found is that when calibrating to L*, the lowest shadow tones are more opened up. Not only does it look somewhat artificial, but it's pretty much impossible to get those tones to reproduce in print. I get a better screen->print match when my display is calibrated to gamma 2.2.
In a color-managed workflow based on 16-bit files you actually should not notice any difference. The problem is Eizo's calibration software: L* is simply inaccurate in ColorNavigator, so the entire distribution of luminance will be off when you use L* as the target in ColorNavigator, whereas gamma 2.2 or 1.8 (or any gamma) works fine.



Title: 2.2 or L* ?
Post by: Arkady on October 22, 2009, 01:40:55 pm
Quote from: digitaldog
No, there ARE other benefits, mainly that the TRC gamma of displays is generally pretty darn close to 2.2!

Well, that's what I said, didn't I?


Though I respectfully disagree regarding the "darn close" part. It may, however, be a dated disagreement. A few years back, even mid-to-high-range LCD monitors set gamma on a per-channel basis, and the resulting gamma on the device neutral was usually far off 2.2 - usually higher; as I mentioned, 2.7 could be a quite common number.

Quote from: digitaldog
Of course Lars is talking about encoding, that’s the entire topic here!

No. Losing bit resolution due to interpolation is not an encoding problem, it is a round-off problem. And that is what I was talking about in the previous post when explaining the rationale behind L* calibration. They have different causes, different math behind them and, correspondingly, different optimal solutions.

Though again, I don't advocate L* calibration, and I completely agree with you that the advantages of L* are yet to be determined.

Title: 2.2 or L* ?
Post by: tho_mas on October 22, 2009, 03:39:10 pm
Quote from: Arkady
I completely agree with you that the advantages of L* are yet to be determined.
In 8-bit workflows the advantages are quite obvious, e.g. if you are printing gamma 2.2 files. In 16-bit workflows TRC translations and quantization errors are negligible IMO.
Advantages - if you'd like to consider them advantages - are e.g.:
- an equidistant, perceptually uniform TRC
- mid gray = L* 50|0|0 = RGB 127/127/127
Both might be helpful for editing with gradation curves and other global adjustments.
Gamma 1.8 differentiates better in bright tonal values and gamma 2.2 differentiates better in dark ones; L* differentiates equally all along the gray axis (see the sketch below).
This is why L* color spaces can be considered "media-" or "device-independent" (gamma 2.2 actually refers to TV and gamma 1.8 to print media).
So... the concept is very good - at least much better than the mixture of gamma 2.2 for the display (only for historical reasons) and ~gamma 1.8 for printers (the advantages of this mismatch are yet to be determined ;-)...).
Still, in 16-bit workflows this is negligible.
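A small sketch of the differentiation claim (my own arithmetic; it just compares the L* width of the first and last 8-bit code step under each encoding):

Code:
import numpy as np

def Y_to_lstar(Y):
    # relative luminance Y (0..1) -> CIE L*
    Y = np.asarray(Y, dtype=float)
    return np.where(Y > 0.008856, 116.0 * Y ** (1.0 / 3.0) - 16.0, 903.3 * Y)

codes = np.arange(256) / 255.0
for name, L in [("gamma 1.8", Y_to_lstar(codes ** 1.8)),
                ("gamma 2.2", Y_to_lstar(codes ** 2.2)),
                ("L*", 100.0 * codes)]:
    dL = np.diff(L)
    # smaller step = finer differentiation in that part of the gray axis
    print(f"{name:9s}  darkest step {dL[0]:.3f} L*, brightest step {dL[-1]:.3f} L*")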
Title: 2.2 or L* ?
Post by: MHMG on October 22, 2009, 04:20:01 pm
Not sure about all the high color theory, but on a practical note, using ColorEyes Display Pro to calibrate my MacBook Pro tethered to an Apple Cinema Display, I concluded my results were slightly better with L* compared to G2.2, and I had to lift my blackpoint a little (i.e., not use the "minimum" setting in the ColorEyes advanced menu) to get best results. Hence, I think the practical answer is system dependent and one should choose the settings that work best with one's system. An 8-bit system is always going to face some compromises. In my case, L* tamed the "color ripple" I see in my 1 delta L step ramp compared to a G2.2 calibration, but it's certainly not perfect. I'd need a more high-end system, I think, to do better. Nevertheless, I'm quite pleased with how well ColorEyes calibrates my display overall.

My pragmatic decision maker is a target called "MonitorChecker(v4)_LAB.psd".  I developed this target over the years starting with the interlacing approach I first saw in the vintage Adobe Gamma 1.8 target that Adobe used to bundle with Pagemaker (ah, the good ole days). I have expanded the target in scope to add shadow/highlight and "color ripple" guidance, and also to teach students about the fascinating surround effects that enter into image appearance decisions when zooming in and out of shadow areas, for example. I also include a Photoshop layer of instructions (toggle the layer off to use).

The Monitorchecker target is specifically designed to exercise the Photoshop-to-monitor profile hand-off and see how well it all works after monitor calibration. If you want to try it, with any luck it will download directly from here:

http://www.aardenburg-imaging.com//cgi-bin...DU2Nzg5LyoxMDM= (http://www.aardenburg-imaging.com//cgi-bin/mrk/_4919ZGxkLzBeMjAwMDAwMDAwMTIzNDU2Nzg5LyoxMDM=)

Otherwise, you can find it on my website "documents" page by scrolling down several items.

http://www.aardenburg-imaging.com/documents.html (http://www.aardenburg-imaging.com/documents.html)

cheers,

Mark


Title: 2.2 or L* ?
Post by: Czornyj on October 22, 2009, 05:01:18 pm
Quote from: MHMG
Not sure about all the high color theory, but on a practical note, using ColorEyes Display Pro to calibrate my MacBook Pro tethered to an Apple Cinema Display, I concluded my results were slightly better with L* compared to G2.2, and I had to lift my blackpoint a little (i.e., not use the "minimum" setting in the ColorEyes advanced menu) to get best results. Hence, I think the practical answer is system dependent and one should choose the settings that work best with one's system. An 8-bit system is always going to face some compromises. In my case, L* tamed the "color ripple" I see in my 1 delta L step ramp compared to a G2.2 calibration, but it's certainly not perfect. I'd need a more high-end system, I think, to do better. Nevertheless, I'm quite pleased with how well ColorEyes calibrates my display overall.

My pragmatic decision maker is a target called "MonitorChecker(v4)_LAB.psd".  I developed this target over the years starting with the interlacing approach I first saw in the vintage Adobe Gamma 1.8 target that Adobe used to bundle with Pagemaker (ah, the good ole days). I have expanded the target in scope to add shadow/highlight and "color ripple" guidance, and also to teach students about the fascinating surround effects that enter into image appearance decisions when zooming in and out of shadow areas, for example. I also include a Photoshop layer of instructions (toggle the layer off to use).

The Monitorchecker target is specifically designed to exercise the Photoshop-to-monitor profile hand-off and see how well it all works after monitor calibration. If you want to try it, with any luck it will download directly from here:

http://www.aardenburg-imaging.com//cgi-bin...DU2Nzg5LyoxMDM= (http://www.aardenburg-imaging.com//cgi-bin/mrk/_4919ZGxkLzBeMjAwMDAwMDAwMTIzNDU2Nzg5LyoxMDM=)

Otherwise, you can find it on my website "documents" page by scrolling down several items.

http://www.aardenburg-imaging.com/documents.html (http://www.aardenburg-imaging.com/documents.html)

cheers,

Mark

If you display a 1 dE gradient on an 8-bit gamma 2.2 panel it will always look bad - in reality the colorimetric distances will vary, so you'll see some "ripples". It will only look smooth on an L*-calibrated high-bit display. Let's take the L* 13, 14 and 15 values as an example - on a gamma 2.2 display they really come out around L* 12.719, 14.364 and 14.906, so the colorimetric distance between 13 and 14 is dE 1.645, while the distance between 14 and 15 is only dE 0.542.
The ACD is gamma 2.2 calibrated, and the most popular editing spaces are gamma 2.2 or 1.8 encoded (with the exception of L-star, aka ECI v2), so L* calibration can only make things worse.

It only makes sense with a high-bit, hardware-calibrated display and an L*-encoded editing space.
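A small sketch of that arithmetic (my own rounding, so individual codes can land a step away from the figures above; the uneven spacing is the point):

Code:
import numpy as np

def lstar_to_Y(L):
    # CIE L* -> relative luminance Y (0..1)
    L = np.asarray(L, dtype=float)
    return np.where(L > 8.0, ((L + 16.0) / 116.0) ** 3, L / 903.3)

def Y_to_lstar(Y):
    # relative luminance Y (0..1) -> CIE L*
    Y = np.asarray(Y, dtype=float)
    return np.where(Y > 0.008856, 116.0 * Y ** (1.0 / 3.0) - 16.0, 903.3 * Y)

ramp = np.arange(10, 31)                                  # a 1-dL* gray ramp, L* 10..30
codes = np.rint(255.0 * lstar_to_Y(ramp) ** (1.0 / 2.2))  # nearest 8-bit code on a gamma 2.2 panel
shown = Y_to_lstar((codes / 255.0) ** 2.2)                # the L* the panel actually produces
print(np.diff(shown))   # step sizes wobble (roughly 0.5 to 1.2 here) instead of staying at 1.0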
Title: 2.2 or L* ?
Post by: MHMG on October 22, 2009, 05:28:52 pm
Quote from: Czornyj
If you display a 1 dE gradient on an 8-bit gamma 2.2 panel it will always look bad - in reality the colorimetric distances will vary, so you'll see some "ripples". It will only look smooth on an L*-calibrated high-bit display.
The ACD is gamma 2.2 calibrated, and the most popular editing spaces are gamma 2.2 or 1.8 encoded (with the exception of L-star, aka ECI v2), so L* calibration can only make things worse.

It only makes sense with a high-bit, hardware-calibrated display and an L*-encoded editing space.

Well, a neutral delta L ramp of 1 L value on a G2.2 curve increases RGB values by about 2-6 RGB units. If the display is calibrated to show equal RGB values (e.g., 100,100,100) as neutral, then the calibration technique that gets closest to appearing stepwise neutral over the full RGB range is doing a better job with low-chroma color rendition. Color ripple is reduced to a minimum. Not perfect, but definitely minimized. In my case, L* calibration outperformed the G2.2 calibration on my ACD driven by my MacBook Pro video card. We can debate theories as to why one setting should or shouldn't be better. I simply suggest one try both calibrations and make a choice as to which one better suits your system's real versus theorized performance. I doubt it will always be G2.2 or G1.8, and I doubt it will always be L*. A pragmatic verification is the rationale for me to be using the MonitorChecker target. It steps outside any calibration software package's closed-loop "validation" routine.

Title: 2.2 or L* ?
Post by: Czornyj on October 22, 2009, 05:40:07 pm
Quote from: MHMG
Well, a neutral delta L ramp of 1 L value on a G2.2 curve increases RGB values by about 2-6 RGB units. If the display is calibrated to show equal RGB values (e.g., 100,100,100) as neutral, then the calibration technique that gets closest to appearing stepwise neutral over the full RGB range is doing a better job with low-chroma color rendition. Color ripple is reduced to a minimum. Not perfect, but definitely minimized. In my case, L* calibration outperformed the G2.2 calibration on my ACD driven by my MacBook Pro video card. We can debate theories as to why one setting should or shouldn't be better. I simply suggest one try both calibrations and make a choice as to which one better suits your system's real versus theorized performance. I doubt it will always be G2.2 or G1.8, and I doubt it will always be L*. A pragmatic verification is the rationale for me to be using the MonitorChecker target. It steps outside any calibration software package's closed-loop "validation" routine.

It may only make your 1 dE L* gradient look a little bit better. But your images are most probably gamma 2.2 or 1.8 encoded, so it still takes you nowhere. Try that trick with a normal 0-255 RGB gradient and it will definitely look worse, especially if it is tagged as Adobe RGB/sRGB/ProPhoto.
Title: 2.2 or L* ?
Post by: MHMG on October 22, 2009, 06:20:13 pm
Quote from: Czornyj
It may only make your 1 dE L* gradient look a little bit better. But your images are most probably gamma 2.2 or 1.8 encoded, so it still takes you nowhere. Try that trick with a normal 0-255 RGB gradient and it will definitely look worse, especially if it is tagged as Adobe RGB/sRGB/ProPhoto.


OK. Convert a 1 L gradient to sRGB, aRGB or ProPhoto, or make your own neutral gradient in RGB stepwise increments that aren't L*-related. That gradient still looks more neutral on my ColorEyes L*-calibrated display versus the same ACD calibrated to G2.2. Maybe not on your system, but definitely on mine. What more is there to say?
Title: 2.2 or L* ?
Post by: Czornyj on October 22, 2009, 06:35:48 pm
Quote from: MHMG
What more is there to say?

...that such a gradient has nothing to do with usual RGB images. Of course, I can precisely calibrate my high-bit display to an L* TRC and it will look perfect, but it's meaningless as long as you're not working with images that were rendered to L*a*b* or ECI v2.
Open that 0-255 gradient, assign the Adobe RGB profile, then change your monitor profile in your system preferences to the default, open it once again and compare the difference:
(http://members.chello.pl/m.kaluza/szarosc.png)
Title: 2.2 or L* ?
Post by: MHMG on October 22, 2009, 08:51:37 pm
Quote from: Czornyj
...that such a gradient has nothing to do with usual RGB images. Of course, I can precisely calibrate my high-bit display to an L* TRC and it will look perfect, but it's meaningless as long as you're not working with images that were rendered to L*a*b* or ECI v2.

I must be missing something here. Those 100 L*-designated increments in an L* = 0 to 100 neutral step wedge having 1 delta L steps are also precisely 100 RGB neutral triplets in any image file encoded with idealized working spaces like sRGB, aRGB or ProPhoto. My digital images often contain at least some of those 100 RGB-specified neutral values. Now, if my L*-calibrated display did a worse job than gamma 2.2 on the remaining 156 RGB neutral triplets that can be used to represent a neutral color in a 24-bit RGB image, or produced more clipping, tone distortion in highlights or shadows, or poor color reproduction, then I might reconsider using L* as the gamma encoding choice for my display. But it doesn't. On my particular system, choosing the L* calibration option in ColorEyes Display Pro renders better overall neutral gray balance and equally uniform tonal separation from deep shadow to maximum highlights compared to calibrating it to G2.2. Others' mileage may vary, particularly if you use different hardware or different calibration software.

Yikes, and we didn't even get to debate the influence of matrix versus LUT-based display profiles. IMHO, that choice can produce much bigger differences than one's choice of gamma encoding. Ditto for different calibration software packages: I haven't seen any two yet whose profiled display behavior agrees to within anything like the subtle differences we've been debating as a consequence of gamma encoding choice.

cheers,

Mark
Title: 2.2 or L* ?
Post by: Arkady on October 22, 2009, 11:14:54 pm
Quote from: tho_mas
In 8-bit workflows the advantages are quite obvious, e.g. if you are printing gamma 2.2 files.

Well, up to this point I was talking about screen calibration and rendering, with no printing involved.

Quote from: tho_mas
Gamma 1.8 differentiates better in bright tonal values and gamma 2.2 differentiates better in dark ones; L* differentiates equally all along the gray axis.

This will happen only outside of a color-managed workflow. Otherwise the color management system, using the appropriate profile, should produce very similar outputs.

But again, in my first post I explained why, theoretically, L* calibration may work better. L* calibration allows for more linear LUTs in a profile, resulting in more accurate interpolation and preserved bit resolution. BUT at the same time, L* calibration of a monitor optimized for gamma 2.2 may result in resolution loss due to the limited bit depth of the internal monitor LUT (logic).

But to date I don't know of any documented evaluation of L* vs. gamma 2.2 calibration of reasonable scale.

Title: 2.2 or L* ?
Post by: tho_mas on October 23, 2009, 04:40:00 am
Quote from: Arkady
This will happen only outside of a color-managed workflow. Otherwise the color management system, using the appropriate profile, should produce very similar outputs.
No, this has to do with the characteristics of the profiles.
E.g. Adobe RGB (with gamma 2.2) is "wasting" roughly 10% of the entire coding space on the darkest 3% of the gray axis: from L* 0 to L* 3 it uses 20 coordinates (RGB 0-19), but these tonal values do not contain any relevant data, actually just noise.
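A quick check of the 20-codes figure (my own arithmetic, assuming a pure gamma 2.2 tone curve as in Adobe RGB):

Code:
def Y_to_lstar(Y):
    # relative luminance Y (0..1) -> CIE L*
    return 116.0 * Y ** (1.0 / 3.0) - 16.0 if Y > 0.008856 else 903.3 * Y

dark = [c for c in range(256) if Y_to_lstar((c / 255.0) ** 2.2) <= 3.0]
print(dark)   # codes 0..19 -> 20 of the 256 codes sit at or below L* 3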

http://luminous-landscape.com/forum/index....st&id=13908 (http://luminous-landscape.com/forum/index.php?act=attach&type=post&id=13908)
http://luminous-landscape.com/forum/index....st&id=13909 (http://luminous-landscape.com/forum/index.php?act=attach&type=post&id=13909)
from: http://www.colormanagement.org/download_fi...king-Spaces.zip (http://www.colormanagement.org/download_files/Working-Spaces.zip)

Title: 2.2 or L* ?
Post by: Arkady on October 23, 2009, 06:35:07 pm
Quote from: tho_mas
No, this has to do with the characteristics of the profiles.
E.g. Adobe RGB (with gamma 2.2) is "wasting" roughly 10% of the entire coding space on the darkest 3% of the gray axis: from L* 0 to L* 3 it uses 20 coordinates (RGB 0-19), but these tonal values do not contain any relevant data, actually just noise.

Well, I guess we are talking about different animals. You're talking about how to place light levels so that an observer may perceive as many levels as possible. This is a problem of encoding, on which, BTW (thanks to Andrew, who brought it up), Lars Borg of Adobe said:

"Third, where is the scientific foundation? Where
is the science that shows that the eye has a
natural L* TRC for any arbitrary color, not only
for neutrals? Where is the science that shows
that the eye has a natural L* TRC for neutrals at
arbitrary luminance levels and arbitrary flare
levels?
As far as I can tell, CIE LAB doesn't show any such thing."

I absolutely agree with this statement/question and would just add that CIE Lab is NOT a color space; it was not designed as such. CIE Lab is a color difference formula. It is not perceptually equidistant (such a space is yet to be discovered). So the phrase ""wasting" roughly 10% of the entire coding space" rests on quite a number of scientifically unfounded assumptions.

Thus, I would be careful drawing any conclusion regarding perception phenomena based solely on L*.


But again, I was talking about a completely different rationale for L*-ing a monitor, which is based on the fact that LUT-based transformations are more accurate and produce fewer artifacts if the transform encoded in the LUT is linear. Thus calibrating a monitor close to L* results in a more linear LUT in the monitor profile, which, in turn, may result in smoother gradations.

Title: 2.2 or L* ?
Post by: digitaldog on October 23, 2009, 08:47:16 pm
Quote from: Arkady
CIE Lab is NOT a color space, it was not designed as such. CIE Lab is a color difference formula.

You mean deltaE?

CIE Lab is a (theoretical) color space.
Title: 2.2 or L* ?
Post by: Arkady on October 24, 2009, 04:44:53 pm
Quote from: digitaldog
You mean deltaE?

CIE Lab is a (theoretical) color space.

Well, I have to agree here. It is commonly called a color space; even XYZ is called a color space. I'm not sure that is correct, but let it be that way.

Just a note:
In saying that it is not a color space, I meant that CIE Lab was not intended to reflect much of appearance phenomena (color is one of them); instead, CIE Lab is a system that targets quantitative evaluation of (barely perceived) color differences. It was pretty much designed around this goal and was based on color-matching experiments. Thus it does not guarantee to reflect appearance phenomena, and moreover it is guaranteed that in many cases it does not (which is what Lars is referring to). That is why I think CIE Lab is not a color space, but rather a color metric space or color difference space. That's a pretty wordy off-topic explanation.
Title: 2.2 or L* ?
Post by: digitaldog on October 24, 2009, 07:26:56 pm
Quote from: Arkady
In saying that it is not a color space, I meant that CIE Lab was not intended to reflect much of appearance phenomena (color is one of them); instead, CIE Lab is a system that targets quantitative evaluation of (barely perceived) color differences.

Yes, that statement I’d fully agree with.
Title: 2.2 or L* ?
Post by: MHMG on November 01, 2009, 07:21:49 pm
Update on L* versus 2.2 calibration:

I just purchased the Datacolor Spyder3 StudioSR kit at a generous discount at the Photo Plus East show in NY. As a current owner of ColorEyes Display Pro running a Monaco Optix XR colorimeter (aka X-Rite DTP94), the newly purchased Spyder3 Elite colorimeter (also supported by ColorEyes Pro) gave me a chance to revisit the L* versus gamma 2.2 discussion using the same software but with a different sensor. It also gave me a chance to try a different calibration software package on my system.

Previously in this thread, I reported better calibration using the L* calibration on my system compared to G2.2 (MacBook Pro running an Apple Cinema Display, i.e., the previous fluorescent ACD, not the latest LED version). I suggested that, all theory aside, real-world hardware/software compatibility may dictate what is best. So, now that I have a Spyder3 Elite instrument, I used ColorEyes on the same hardware setup, with a D50 aimpoint as before, and revisited the G2.2 versus L* aimpoint results. Surprise... L* didn't work as well with the Spyder3 unit. G2.2 was the best overall calibration as validated by the MonitorChecker target I provided URL access to earlier in this thread. This result indicates that the optimum calibration settings are both hardware- and software-dependent, notwithstanding all the theoretical constructs (and that even includes one's choice of calibration device).

Next, I installed the Datacolor Spyder3 Pro software, which in advanced mode also supports both G2.2 and L* calibration. This is a more affordable package than ColorEyes. The result? It was unable to achieve an excellent calibration of my ACD to either L*/D50 or G2.2/D50, but resetting to G2.2/native whitepoint calibration did produce an excellent result, albeit not at the whitepoint I desired. The more expensive ColorEyes Pro software could calibrate my ACD to either D50 or the native whitepoint with excellent results using the Monaco XR colorimeter, but produced lesser-quality results with the Spyder3 unit and L* calibration, whereas G2.2 was excellent with the Spyder3 colorimeter on my system.

My conclusion: at least for 8-bit video technology, the best calibration state is a delicate interaction between display, video card, calibration software, calibration sensor, and desired gamma/whitepoint. This optimum state must be determined empirically using a good independent test target; relying on the monitor calibration software to "validate" itself doesn't get there. Excellent validation results were returned in all cases by both software packages, whereas the real impact on my MonitorChecker target indicated that very real differences existed. The "best" calibration therefore requires some experimentation. Not all calibration aimpoints produce equal calibrated accuracy.


Cheers,

Mark
Title: 2.2 or L* ?
Post by: Czornyj on November 02, 2009, 08:55:25 am
Quote from: MHMG
but resetting to G2.2/native whitepoint calibration did produce an excellent result, albeit not at the whitepoint I desired.

I'd guess it's the optimal target for an ACD. The native TRC is close to gamma 2.2, and the native white point is closer to D65 than to D50. I'd leave it like that - it's an 8-bit panel, so the more you change the TRC and white point, the stronger the posterisation you get.
Title: 2.2 or L* ?
Post by: MHMG on November 02, 2009, 11:51:27 am
Quote from: Czornyj
I'd guess it's the optimal target for an ACD. The native TRC is close to gamma 2.2, and the native white point is closer to D65 than to D50. I'd leave it like that - it's an 8-bit panel, so the more you change the TRC and white point, the stronger the posterisation you get.

Right, perhaps, in theory, but in practice ColorEyes Display Pro did an outstanding job on this system at both D50/L* and native/L* (native being, as you noted, close to D65 on this ACD), whereas the Spyder Elite software was optimal at a native/G2.2 aimpoint and not so great at D50/G2.2. And Spyder Elite wasn't able to match ColorEyes Display Pro at all with an L* calibration aimpoint at any chosen color temperature. It seems that the calibration software plus the mated sensor unit also plays a critical role in what settings are pragmatically best, despite what may be a theoretical optimum for a particular display. Much of it probably has to do with the final display profile generated by the calibration software. For example, CEDpro has a relative-versus-absolute feature (a very interesting option). My ACD is only spec'd at a 400:1 contrast ratio, which means that with the "absolute" rendering option used to build the profile, my MonitorChecker target should show blackpoint clipping from about L* = 3 down to 0 on my system. It does! So why would one want to use an absolute rendering setting anyway? Well, in the case where one wants better overall agreement between two mediocre displays, or when softproofing matte papers that don't get down to low L* minimums, so that the monitor blackpoint clipping is not seen in the softproof.

Since I work under 5000K print lighting in my lab, D50 is my preferred monitor color aimpoint. The "validation" tools in both of these monitor calibration software packages indicated that all of the various gamma/whitepoint choices calibrated and "validated" with equally excellent results, but my eyes and the MonitorChecker target say different. Where perhaps Czornyj and I disagree is whether my MonitorChecker target is somehow biasing the outcome, i.e., optimized to return a favorable result more for L* versus G2.2. If that were the case, then both ColorEyes Display Pro and the Datacolor Spyder3 Elite software should have gravitated to the same "optimal" settings for this image target on my system. This didn't happen.

The only true constant I have in evaluating the various calibration outcomes is my target. The software "validation" tools (essentially based on delta E analyses of various measured patches) clearly fail to reveal the subtle differences that lead to shadow clipping and/or posterization. If anyone has a superior image test target or set of targets, I'd be very interested in giving it/them a try. Colorful images don't seem to do it; it takes some very demanding gray ramps to bring out the differences.


cheers,

Mark
Title: 2.2 or L* ?
Post by: digitaldog on November 02, 2009, 12:04:16 pm
Quote from: MHMG
The "validation" tools in both of these monitor calibration software packages indicated that all of the various gamma/whitepoint choices calibrated and "validated" with equally excellent results, but my eyes and the MonitorChecker target say different.

Probably because these validation processes - at least those that use the same instrument to read a set of patches the software decides it wants to send to the device, and then produce a deltaE report - are basically bogus. It's a feel-good button.
Title: 2.2 or L* ?
Post by: MHMG on November 02, 2009, 05:08:25 pm
Quote from: digitaldog
Probably because these validation processes - at least those that use the same instrument to read a set of patches the software decides it wants to send to the device, and then produce a deltaE report - are basically bogus. It's a feel-good button.

Agreed in most cases. If the sensor is highly repeatable but also inaccurate on an absolute basis, the sensor's systematic error will get nulled out in the validation process and you have no way of knowing. A bogus result, as Andrew has noted. But in the case where the sensor is indeed both highly accurate and repeatable, the validation process will return delta E values that do have some technical merit. Of course, the validation is even more rigorous if the sensor is then invited to read color patches that weren't all used in the initial calibration (and some software packages have allowed for this), but ultimately it is the limitation of delta E, and not the sensor response, that is the ultimate weakness in the current validation methods. Take, for example, the "absolute" display calibration of my ACD with CEDpro. It dutifully mapped all RGB values with L* < 3 to monitor black because my ACD can't render L* lower than 3 on an absolute basis. Assuming that pure neutral gray color patches with L* = 3 or less are rendered perfectly neutral at the monitor blackpoint, the delta E for those color patches compared to the aimpoint would only be 3 or less. That doesn't sound too bad in a "validation" trial until one realizes that this clipping can totally wreck critical shadow contrast and appear posterized in those deepest blacks. The same situation would be true anywhere along the tonal curve if two near-neighbor patches with a 1, 2, or 3 L* difference suddenly ended up in a "flat spot" in the curve where the delta L relationship between them went to zero. Again, a delta E validation routine would say everything along the gray scale was within 1, 2, or 3 delta E, assuming no additional hue and chroma errors. Therein lies the visual contrast flaw in any delta E analysis: delta E analyses don't adequately evaluate loss of local contrast, which we detect as posterization. Andrew, as you and others already know, my answer to this technical limitation of delta E is the I* metric. A validation routine using the I* metric rather than delta E would flag the posterization problem immediately.

regards,

Mark
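A tiny sketch of that failure mode (my own illustration, using plain delta L* on hypothetical neutral shadow patches rather than the full delta E or I* math):

Code:
import numpy as np

# Hypothetical aim values vs. what a display that clips everything below L* 3
# to its black actually shows.
aim      = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
measured = np.array([3.0, 3.0, 3.0, 3.0, 4.0, 5.0])

per_patch_error = np.abs(measured - aim)   # what a patch-by-patch "validation" sees
local_contrast  = np.diff(measured)        # what the eye sees as steps

print(per_patch_error)   # every patch within 3 -> the report looks fine
print(local_contrast)    # the first steps are 0 -> a flat spot, i.e. posterization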
Title: 2.2 or L* ?
Post by: digitaldog on November 02, 2009, 05:54:25 pm
Also, a big issue is that the profiler software can decide what values within the color space to send to the display to measure and produce the report. That makes it easy to stack the deck in favor of a good report, because there are areas in color space that are a cinch to hit and others that are very, very difficult. A bit like those defining CRI: when you get to pick the tiles, it really does make it easier to produce a higher value.

Yup, it would be great to load an I* analysis and maybe some additional problematic colors into the validation process and see what results.