
Author Topic: 2.2 or L* ?  (Read 33116 times)

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #20 on: October 22, 2009, 04:20:01 pm »

Not sure about all the high color theory, but on a practical note: using ColorEyes Display Pro to calibrate my MacBook Pro tethered to an Apple Cinema Display, I concluded my results were slightly better with L* compared to G2.2, and I had to lift my blackpoint a little (i.e., not use the "minimum" setting in the ColorEyes advanced menu) to get the best results. Hence, I think the practical answer is system dependent, and one should choose the settings that work best with your system. An 8-bit system is always going to face some compromises. In my case, L* tamed the "color ripple" I see in my 1 delta L step ramp compared to a G2.2 calibration, but it's certainly not perfect. I'd need a more high-end system, I think, to do better. Nevertheless, I'm quite pleased with how well ColorEyes calibrates my display overall.

My pragmatic decision maker is a target called "MonitorChecker(v4)_LAB.psd". I developed this target over the years, starting with the interlacing approach I first saw in the vintage Adobe Gamma 1.8 target that Adobe used to bundle with PageMaker (ah, the good ole days). I have expanded the target in scope to add shadow/highlight and "color ripple" guidance, and also to teach students about the fascinating surround effects that enter into image appearance decisions when zooming in and out of shadow areas, for example. I also include a Photoshop layer of instructions (toggle the layer off to use the target).

The MonitorChecker target is specifically designed to exercise the Photoshop-to-monitor profile hand-off and show how well it all works after monitor calibration. If you want to try it, with any luck it will download directly from here:

http://www.aardenburg-imaging.com//cgi-bin...DU2Nzg5LyoxMDM=

Otherwise, you can find it on my website "documents" page by scrolling down several items.

http://www.aardenburg-imaging.com/documents.html

cheers,

Mark


Logged

Czornyj

  • Sr. Member
  • Posts: 1948
    • zarzadzaniebarwa.pl
2.2 or L* ?
« Reply #21 on: October 22, 2009, 05:01:18 pm »

Quote from: MHMG
...I think the practical answer is system dependent, and one should choose the settings that work best with your system. An 8-bit system is always going to face some compromises. In my case, L* tamed the "color ripple" I see in my 1 delta L step ramp compared to a G2.2 calibration, but it's certainly not perfect...

If you display a 1 dE gradient on an 8-bit gamma 2.2 panel, it will always look bad - in reality the colorimetric distances will vary, so you'll see some "ripples". It will only look smooth on an L*-calibrated, high-bit display. Let's take the L* values 13, 14 and 15 as an example - on a gamma 2.2 display they are really shown as L* 12.719, 14.364 and 14.906, so the colorimetric distance between 13 and 14 is dE 1.645, while the distance between 14 and 15 is only dE 0.542.
The ACD is gamma 2.2 calibrated, and the most popular editing spaces are gamma 2.2 or 1.8 encoded (with the exception of L*, aka eciRGB v2), so L* calibration can only make things worse.

It only makes sense with a high-bit, hardware-calibrated display and an L*-encoded editing space.
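The quantization arithmetic behind those numbers can be sketched in a few lines of Python. This is a minimal sketch assuming an idealized display (black at Y = 0, white at Y = 1, no flare) and nearest-code rounding, so the exact figures may differ slightly from the ones quoted above, but the uneven step sizes show up the same way:

```python
def lstar_to_Y(L):
    """CIE L* -> relative luminance Y (white at Y = 1)."""
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

def Y_to_lstar(Y):
    """Relative luminance Y -> CIE L*."""
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

def shown_lstar(L_target, gamma=2.2):
    """L* actually displayed after quantizing to the nearest 8-bit code."""
    code = round(255 * lstar_to_Y(L_target) ** (1 / gamma))
    return Y_to_lstar((code / 255) ** gamma)

prev = shown_lstar(12)
for L in (13, 14, 15):
    cur = shown_lstar(L)
    print(f"target L* {L}: displayed L* {cur:6.3f}, step from previous {cur - prev:.3f}")
    prev = cur
```

On an L*-encoded 8-bit display the same loop returns steps much closer to 1.0, since the L* encoding spends its codes evenly along the L* axis (about 2.55 codes per L* unit).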
« Last Edit: October 22, 2009, 05:27:05 pm by Czornyj »
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #22 on: October 22, 2009, 05:28:52 pm »

Quote from: Czornyj
If you display a 1 dE gradient on an 8-bit gamma 2.2 panel, it will always look bad - in reality the colorimetric distances will vary, so you'll see some "ripples". It will only look smooth on an L*-calibrated, high-bit display.
The ACD is gamma 2.2 calibrated, and the most popular editing spaces are gamma 2.2 or 1.8 encoded (with the exception of L*, aka eciRGB v2), so L* calibration can only make things worse.

It only makes sense with a high-bit, hardware-calibrated display and an L*-encoded editing space.

Well, each step of a neutral delta L ramp in 1 L* increments corresponds, on a G2.2 curve, to an increase of roughly 2-6 RGB code values. If the display is calibrated to show equal RGB values (e.g., 100,100,100) as neutral, then the calibration technique that gets closest to appearing stepwise neutral over the full RGB range is doing a better job with low-chroma color rendition. Color ripple is reduced to a minimum - not perfect, but definitely minimized. In my case, L* calibration outperformed the G2.2 calibration on my ACD driven by my MacBook Pro video card. We can debate theories as to why one setting should or shouldn't be better. I simply suggest one try both calibrations and choose whichever better suits your system's real, as opposed to theorized, performance. I doubt it will always be G2.2 or G1.8, and I doubt it will always be L*. Pragmatic verification is my rationale for using the MonitorChecker target: it steps outside any calibration software package's closed-loop "validation" routine.
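For what it's worth, the size of those per-step RGB increments is easy to tabulate with the same idealized formulas as above. In this simplified computation the steps come out a bit narrower than the quoted 2-6 range (nearest-code rounding and the CIE linear toe at the very bottom move the extremes around), but the order of magnitude is the same:

```python
def lstar_to_Y(L):
    """CIE L* -> relative luminance Y (white at Y = 1)."""
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

# Nearest 8-bit code for every integer L* target under a gamma 2.2 encode.
codes = [round(255 * lstar_to_Y(L) ** (1 / 2.2)) for L in range(101)]
steps = [b - a for a, b in zip(codes, codes[1:])]
print("codes per 1 L* step:", min(steps[1:]), "to", max(steps[1:]))
print("first step (L* 0 -> 1):", steps[0])  # largest, due to the linear toe
```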

Logged

Czornyj

  • Sr. Member
  • Posts: 1948
    • zarzadzaniebarwa.pl
2.2 or L* ?
« Reply #23 on: October 22, 2009, 05:40:07 pm »

Quote from: MHMG
...I simply suggest one try both calibrations and choose whichever better suits your system's real, as opposed to theorized, performance...

It may only make your 1 dE L* gradient look a little bit better. But your images are most probably gamma 2.2 or 1.8 encoded, so it still gets you nowhere. Try the same trick with a normal 0-255 RGB gradient and it will definitely look worse, especially if it is tagged as Adobe RGB/sRGB/ProPhoto.
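One way to see this point in numbers: to show gamma 2.2 encoded content on an L*-calibrated display, color management has to remap every document code to the nearest display code, and wherever that remap runs flatter than 1:1, adjacent codes merge and levels are lost. A rough sketch under the same idealized assumptions as before (a real pipeline goes through profile LUTs and possibly dithering, so the actual losses vary):

```python
def lstar_to_Y(L):
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

def Y_to_lstar(Y):
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

# Remap each gamma 2.2 document code to the nearest code of an 8-bit
# L*-calibrated display (display code d shows L* = 100 * d / 255).
remap = [round(255 * Y_to_lstar((c / 255) ** 2.2) / 100) for c in range(256)]
print("distinct display codes used:", len(set(remap)), "of 256")
print("adjacent document codes that merge:",
      sum(1 for a, b in zip(remap, remap[1:]) if a == b))
```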
« Last Edit: October 22, 2009, 05:43:14 pm by Czornyj »
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #24 on: October 22, 2009, 06:20:13 pm »

Quote from: Czornyj
It may only make your 1 dE L* gradient look a little bit better. But your images are most probably gamma 2.2 or 1.8 encoded, so it still gets you nowhere. Try the same trick with a normal 0-255 RGB gradient and it will definitely look worse, especially if it is tagged as Adobe RGB/sRGB/ProPhoto.


OK. Convert the 1 L* gradient to sRGB, aRGB, or ProPhoto, or make your own RGB neutral gradient in stepwise RGB increments that aren't L* related. Either gradient still looks more neutral on my ColorEyes L*-calibrated display than on the same ACD calibrated to G2.2. Maybe not on your system, but definitely on mine. What more is there to say?
Logged

Czornyj

  • Sr. Member
  • Posts: 1948
    • zarzadzaniebarwa.pl
2.2 or L* ?
« Reply #25 on: October 22, 2009, 06:35:48 pm »

Quote from: MHMG
What more is there to say?

...that such a gradient has nothing to do with usual RGB images. Of course, I can precisely calibrate my high-bit display to an L* TRC and it will look perfect, but it's meaningless as long as you're not working with images that were rendered to L*a*b* or eciRGB v2.
Open that 0-255 gradient, assign the AdobeRGB profile, then change your monitor profile in your system preferences to the default, open the gradient once again and compare the difference.
« Last Edit: October 22, 2009, 06:37:13 pm by Czornyj »
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #26 on: October 22, 2009, 08:51:37 pm »

Quote from: Czornyj
...that such a gradient has nothing to do with usual RGB images. Of course, I can precisely calibrate my high-bit display to an L* TRC and it will look perfect, but it's meaningless as long as you're not working with images that were rendered to L*a*b* or eciRGB v2.

I must be missing something here. Those 101 L* values in an L* = 0 to 100 neutral step wedge with 1 delta L steps are also precisely 101 neutral RGB triplets in any image file encoded in an idealized working space like sRGB, aRGB, or ProPhoto. My digital images often contain at least some of those 101 RGB-specified neutral values. Now, if my L*-calibrated display did a worse job than gamma 2.2 on the remaining 155 neutral RGB triplets that can represent a neutral color in a 24-bit RGB image, or produced more clipping, tone distortion in highlights or shadows, or poorer color reproduction, then I might reconsider using L* as the gamma encoding choice for my display. But it doesn't. On my particular system, choosing the L* calibration option in ColorEyes Display Pro renders better overall neutral gray balance and equally uniform tonal separation from deep shadow to maximum highlight compared to calibrating to G2.2. Others' mileage may vary, particularly with different hardware or different calibration software.
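The counting in that paragraph is easy to verify with the same idealized gamma 2.2 encode used earlier in the thread; each integer L* value lands more than a full code away from its neighbors, so all of them survive as distinct neutral triplets:

```python
def lstar_to_Y(L):
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

# Nearest 8-bit neutral code for each integer L* from 0 to 100.
codes = {round(255 * lstar_to_Y(L) ** (1 / 2.2)) for L in range(101)}
print("distinct neutral codes on the wedge:", len(codes))   # 101
print("neutral codes not on the wedge:", 256 - len(codes))  # 155
```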

Yikes, and we didn't even get to debate the influence of matrix- versus LUT-based display profiles. IMHO, that choice can produce much bigger differences than one's choice of gamma encoding. Ditto for different calibration software packages - I haven't yet seen any two that produce profiled display behavior that agrees to within the subtle differences we've been debating as a consequence of the gamma encoding choice.

cheers,

Mark
Logged

Arkady

  • Newbie
  • Posts: 19
    • http://www.colorkeeper.com/
2.2 or L* ?
« Reply #27 on: October 22, 2009, 11:14:54 pm »

Quote from: tho_mas
In 8-bit workflows the advantages are quite obvious, e.g. if you are printing gamma 2.2 files.

Well, to this point I was talking about screen calibration and rendering, no printing involved.

Quote from: tho_mas
Gamma 1.8 differentiates better in bright tonal values and gamma 2.2 differentiates better in dark ones; L* differentiates equally all along the gray axis.

This will happen only outside of a color-managed workflow. Otherwise, a color management system using an appropriate profile should produce very similar output.

But again, in the first post I explained why, theoretically, L* calibration may work better. L* calibration allows for more linear LUTs in a profile, thus resulting in more accurate interpolation and preserving bit resolution (a sketch of this interpolation point follows below). BUT at the same time, L* calibration of a monitor optimized for 2.2 gamma may result in resolution loss due to the limited bit depth of the internal monitor LUT (logic).

But to date I don't know of any documented evaluation of L* vs. 2.2 gamma calibration of reasonable scale.
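The interpolation argument can be illustrated with a toy computation. A display profile's tone table maps device code toward L*-based PCS values: for an L*-calibrated display that table is essentially a straight line, and piecewise-linear interpolation between sparse entries is exact; for a gamma 2.2 display the table is curved, and a sparse LUT interpolates with error. A hedged sketch using a made-up 17-entry table (real profiles use larger tables, so the error is smaller, but not zero):

```python
def Y_to_lstar(Y):
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

def curve(x):
    """Gamma 2.2 device code (0..1) -> L*-based table value (0..1)."""
    return Y_to_lstar(x ** 2.2) / 100

N = 17                                    # sparse LUT, as in a small profile table
lut = [curve(i / (N - 1)) for i in range(N)]

def interp(x):
    """Piecewise-linear lookup into the sparse LUT."""
    i = min(int(x * (N - 1)), N - 2)
    t = x * (N - 1) - i
    return lut[i] * (1 - t) + lut[i + 1] * t

worst = max(abs(interp(c / 255) - curve(c / 255)) for c in range(256))
print(f"worst-case interpolation error: {worst * 255:.2f} 8-bit codes")
# For a linear table (an L*-calibrated display) the same interpolation is exact.
```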

tho_mas

  • Sr. Member
  • Posts: 1799
2.2 or L* ?
« Reply #28 on: October 23, 2009, 04:40:00 am »

Quote from: Arkady
This will happen only outside of a color-managed workflow. Otherwise, a color management system using an appropriate profile should produce very similar output.
No, this has to do with the characteristics of the profiles.
e.g. AdobeRGB (with gamma 2.2) is "wasting" roughly 10% of the entire coding space on the darkest 3% of the gray axis: from L* 0 to L* 3 it uses 20 coordinates (RGB 0-19). But these tonal values do not contain any relevant data - actually just noise.

http://luminous-landscape.com/forum/index....st&id=13908
http://luminous-landscape.com/forum/index....st&id=13909
from: http://www.colormanagement.org/download_fi...king-Spaces.zip
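That figure is straightforward to check numerically. A quick sketch assuming AdobeRGB's transfer function is its specified pure power of 563/256 (about 2.2):

```python
def Y_to_lstar(Y):
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

gamma = 563 / 256                 # AdobeRGB's specified gamma, about 2.2
dark = [c for c in range(256) if Y_to_lstar((c / 255) ** gamma) <= 3.0]
print("codes at or below L* 3:", len(dark))              # 20 (RGB 0-19)
print(f"share of coding space: {len(dark) / 256:.1%}")   # about 8%
```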

Logged

Arkady

  • Newbie
  • Posts: 19
    • http://www.colorkeeper.com/
2.2 or L* ?
« Reply #29 on: October 23, 2009, 06:35:07 pm »

Quote from: tho_mas
No, this has to do with the characteristics of the profiles.
e.g. AdobeRGB (with gamma 2.2) is "wasting" roughly 10% of the entire coding space on the darkest 3% of the gray axis: from L* 0 to L* 3 it uses 20 coordinates (RGB 0-19). But these tonal values do not contain any relevant data - actually just noise.

Well, I guess we are talking about different animals. You're talking about how to place light levels so that an observer may perceive as many levels as possible. This is a problem of encoding - on which, BTW (thanks to Andrew, who brought it up), Lars Borg of Adobe said:

"Third, where is the scientific foundation? Where
is the science that shows that the eye has a
natural L* TRC for any arbitrary color, not only
for neutrals? Where is the science that shows
that the eye has a natural L* TRC for neutrals at
arbitrary luminance levels and arbitrary flare
levels?
As far as I can tell, CIE LAB doesn't show any such thing."

I absolutely agree with this statement/question, and just add that CIE Lab is NOT a color space, it was not designed as such. CIE Lab is a color difference formula. It is not perceptually equidistant (such a space has yet to be discovered). So the phrase "'wasting' roughly 10% of the entire coding space" rests on quite a significant number of scientifically unfounded assumptions.

Thus, I would be careful drawing any conclusions regarding perception phenomena based solely on L*.


But again, I was talking about a completely different rationale for L*-ing a monitor, which is based on the fact that LUT-based transformations are more accurate and produce fewer artifacts if the transform encoded in the LUT is linear. Thus, calibrating a monitor close to L* results in a more linear LUT in the monitor profile, which, in turn, may result in smoother gradations.

digitaldog

  • Sr. Member
  • Posts: 20614
  • Andrew Rodney
    • http://www.digitaldog.net/
2.2 or L* ?
« Reply #30 on: October 23, 2009, 08:47:16 pm »

Quote from: Arkady
CIE Lab is NOT a color space, it was not designed as such. CIE Lab is a color difference formula.

You mean deltaE?

CIE Lab is a (theoretical) color space.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Arkady

  • Newbie
  • Posts: 19
    • http://www.colorkeeper.com/
2.2 or L* ?
« Reply #31 on: October 24, 2009, 04:44:53 pm »

Quote from: digitaldog
You mean deltaE?

CIE Lab is a (theoretical) color space.

Well, I have to agree here. It is commonly called a color space; even XYZ is called a color space. I'm not sure that's correct, but let it be that way.

Just a note:
In saying it is not a color space, I meant that CIE Lab was not intended to reflect much of appearance phenomena (color being one of them); instead, CIE Lab is a system that targets quantitative evaluation of (barely perceived) color differences. It was pretty much designed around this goal and was based on color-matching experiments. Thus it does not guarantee to reflect appearance phenomena, and moreover it is guaranteed that in many cases it does not (that is what Lars is referring to). That is why I think CIE Lab is not a color space, but rather a color metric space, or color difference space. That's a pretty wordy off-topic explanation.

digitaldog

  • Sr. Member
  • Posts: 20614
  • Andrew Rodney
    • http://www.digitaldog.net/
2.2 or L* ?
« Reply #32 on: October 24, 2009, 07:26:56 pm »

Quote from: Arkady
In saying it is not a color space, I meant that CIE Lab was not intended to reflect much of appearance phenomena (color being one of them); instead, CIE Lab is a system that targets quantitative evaluation of (barely perceived) color differences.

Yes, that statement I’d fully agree with.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #33 on: November 01, 2009, 07:21:49 pm »

Update on L* versus 2.2 calibration:

I just purchased the Datacolor Spyder3 StudioSR kit at a generous discount at the PhotoPlus East show in NY. I already own ColorEyes Display Pro running a Monaco Optix XR colorimeter (aka X-Rite DTP94), so the newly purchased Spyder3 Elite colorimeter (also supported by ColorEyes Display Pro) gave me a chance to revisit the L* versus gamma 2.2 discussion using the same software but with a different sensor. It also gave me a chance to try a different calibration software package on my system.

Previously in this thread, I reported better calibration using the L* aimpoint on my system compared to G2.2 (a MacBook Pro driving an Apple Cinema Display, i.e., the previous fluorescent ACD, not the latest LED version). I suggested that, all theory aside, real-world hardware/software compatibility may dictate what is best. So, now that I have a Spyder3 Elite instrument, I used ColorEyes on the same hardware setup, with a D50 aimpoint as before, and revisited the G2.2 versus L* results. Surprise... L* didn't work as well with the Spyder3 unit. G2.2 was the best overall calibration, as validated by the MonitorChecker target I linked earlier in this thread. This result indicates that the optimum calibration settings are both hardware and software dependent, notwithstanding all the theoretical constructs (and that even includes one's choice of calibration device).

Next, I installed the Datacolor Spyder3 Pro software, which in advanced mode also supports both G2.2 and L* calibration. This is a more affordable package than ColorEyes. The result? It was unable to achieve an excellent calibration of my ACD to either L*/D50 or G2.2/D50, but resetting to G2.2/native whitepoint calibration did produce an excellent result, albeit not at the whitepoint I desired. The more expensive ColorEyes Display Pro software could calibrate my ACD to either D50 or the native whitepoint with excellent results using the Monaco XR colorimeter, but produced lesser quality results with the Spyder3 unit and L* calibration, whereas G2.2 was excellent with the Spyder3 colorimeter on my system.

My conclusion: at least for 8-bit video technology, the best calibration state is a delicate interaction between display, video card, calibration software, calibration sensor, and desired gamma/whitepoint. This optimum state must be determined empirically using a good independent test target; relying on the monitor calibration software to "validate" itself doesn't get there. Both software packages returned excellent validation results in all cases, whereas the real impact on my MonitorChecker target showed that very real differences existed. The "best" calibration therefore requires some experimentation. Not all calibration aimpoints produce equally accurate calibrations.


Cheers,

Mark
Logged

Czornyj

  • Sr. Member
  • Posts: 1948
    • zarzadzaniebarwa.pl
2.2 or L* ?
« Reply #34 on: November 02, 2009, 08:55:25 am »

Quote from: MHMG
but resetting to G2.2/native whitepoint calibration did produce an excellent result, albeit not at the whitepoint I desired.

I'd guess it's the optimal target for an ACD. The native TRC is close to gamma 2.2, and the native white point is closer to D65 than to D50. I'd leave it like that - it's an 8-bit panel, so the more you change the TRC and white point, the stronger the posterization you get.
« Last Edit: November 02, 2009, 08:55:44 am by Czornyj »
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #35 on: November 02, 2009, 11:51:27 am »

Quote from: Czornyj
I'd guess it's the optimal target for an ACD. The native TRC is close to gamma 2.2, and the native white point is closer to D65 than to D50. I'd leave it like that - it's an 8-bit panel, so the more you change the TRC and white point, the stronger the posterization you get.

Right, perhaps, in theory, but in practice ColorEyes Display Pro did an outstanding job on this system at both D50/L* and native/L* (native being, as you noted, close to D65 on this ACD), whereas the Spyder Elite software was optimal at a native/G2.2 aimpoint and not so great at D50/G2.2. And Spyder Elite wasn't able to match ColorEyes Display Pro at all with an L* calibration aimpoint at any chosen color temperature. It seems the calibration software plus its mated sensor also plays a critical role in which settings are pragmatically best, despite what may be a theoretical optimum for a particular display. Much probably has to do with the final display profile generated by the calibration software. For example, CEDpro has a relative-versus-absolute feature (a very interesting option). My ACD is only spec'd at a 400:1 contrast ratio, which means that with the "absolute" rendering option used to build the profile, my MonitorChecker target should show blackpoint clipping from about L* = 3 down to 0 on my system. It does! So why would one want to use an absolute rendering setting anyway? In the case where one wants better overall agreement between two mediocre displays, or when softproofing matte papers that don't reach low L* minimums, so that the monitor's blackpoint clipping is not seen in the softproof.
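The L* = 3 clipping figure follows directly from the panel's contrast ratio. A one-liner sketch, assuming the profile normalizes white to Y = 1 so that black sits at Y = 1/contrast (flare would raise these numbers):

```python
def Y_to_lstar(Y):
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

for contrast in (400, 700, 1000):
    print(f"{contrast}:1 panel -> deepest black is about L* {Y_to_lstar(1 / contrast):.2f}")
```

For a 400:1 panel that comes out near L* 2.3, which is why everything from about L* 3 down collapses toward black under an absolute-rendering profile.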

Since I work under 5000K print lighting in my lab, D50 is my preferred monitor color aimpoint. The "validation" tools in both of these monitor calibration software packages indicated that all of the various gamma/whitepoint choices calibrated and "validated" with equally excellent results, but my eyes and the MonitorChecker target say different. Where perhaps Czornyj and I disagree is whether my MonitorChecker target is somehow biasing the outcome, i.e., optimized to return a result more favorable to L* than to G2.2. If that were the case, then both ColorEyes Display Pro and the Datacolor Spyder3 Elite software should have gravitated to the same "optimal" settings for this image target on my system. That didn't happen.

The only true constant I have in evaluating the various calibration outcomes is my target. The software "validation" tools (essentially based on delta E analyses of various measured patches) clearly fail to reveal the subtle differences that lead to shadow clipping and/or posterization. If anyone has a superior image test target or set of targets, I'd be very interested in giving it (or them) a try. Colorful images don't seem to do it; it takes some very demanding gray ramps to tease out the differences.


cheers,

Mark
Logged

digitaldog

  • Sr. Member
  • Posts: 20614
  • Andrew Rodney
    • http://www.digitaldog.net/
2.2 or L* ?
« Reply #36 on: November 02, 2009, 12:04:16 pm »

Quote from: MHMG
The "validation" tools in both of these monitor calibration software packages indicated that all of the various gamma/whitepoint choices calibrated and "validated" with equally excellent results, but my eyes and the MonitorChecker target say different.

Probably because these validation processes - at least those that use the same instrument to read a set of patches the software itself decides to send to the device, and then produce a deltaE report - are basically bogus. It's a feel-good button.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

MHMG

  • Sr. Member
  • Posts: 1285
2.2 or L* ?
« Reply #37 on: November 02, 2009, 05:08:25 pm »

Quote from: digitaldog
Probably because these validation processes - at least those that use the same instrument to read a set of patches the software itself decides to send to the device, and then produce a deltaE report - are basically bogus. It's a feel-good button.

Agreed in most cases. If the sensor is highly repeatable but inaccurate on an absolute basis, the sensor's systematic error will get nulled out in the validation process and you have no way of knowing - a bogus result, as Andrew has noted. But in the case where the sensor is indeed both highly accurate and repeatable, the validation process will return delta E values that do have some technical merit. Of course, the validation is even more rigorous if the sensor is then invited to read color patches that weren't all used in the initial calibration (and some software packages have allowed for this), but ultimately it is the limitation of delta E, and not the sensor response, that is the ultimate weakness of current validation methods. Take, for example, the "absolute" display calibration of my ACD with CEDpro. It dutifully mapped all RGB values with L* < 3 to monitor black, because my ACD can't render L* lower than 3 on an absolute basis. Assuming that pure neutral gray patches with L* = 3 or less are rendered perfectly neutral at the monitor blackpoint, the delta E for those patches compared to aimpoint would only be 3 or less. That doesn't sound too bad in a "validation" trial, until one realizes that this clipping can totally wreck critical shadow contrast and appear posterized in the deepest blacks. The same would be true anywhere along the tonal curve if two near-neighbor patches with a 1, 2, or 3 L* difference ended up on a "flat spot" in the curve where the delta L between them went to zero. Again, a delta E validation routine would say everything along the gray scale was within 1, 2, or 3 delta E, assuming no additional hue and chroma errors. Therein lies the visual contrast flaw in any delta E analysis: delta E doesn't adequately evaluate loss of local contrast, which we detect as posterization. Andrew, as you and others already know, my answer to this technical limitation of delta E is the I* metric. A validation routine using the I* metric rather than delta E would flag the posterization problem immediately.
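A toy illustration of that flat-spot argument (not the actual I* metric, just the local-contrast idea behind it): a clipped shadow wedge passes a per-patch delta E check even though the step-to-step contrast has collapsed to zero. The patch values here are hypothetical:

```python
# Target shadow wedge L* 0..6 shown on a display that clips below L* 3.
target    = list(range(7))               # 0, 1, 2, 3, 4, 5, 6
displayed = [max(t, 3) for t in target]  # everything below 3 lands on 3

# Per-patch error (for pure neutrals, delta E reduces to delta L*):
print("per-patch dE:", [abs(d - t) for t, d in zip(target, displayed)])
# -> never worse than 3, so a delta E validation "passes"

# Local contrast between neighboring patches:
print("step-to-step dL*:", [b - a for a, b in zip(displayed, displayed[1:])])
# -> 0, 0, 0 in the clipped region: visible posterization
```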

regards,

Mark
Logged

digitaldog

  • Sr. Member
  • Posts: 20614
  • Andrew Rodney
    • http://www.digitaldog.net/
2.2 or L* ?
« Reply #38 on: November 02, 2009, 05:54:25 pm »

Also, a big issue is that the profiler software can decide which values within the color space to send to the display to measure and produce the report. That makes it easy to stack the deck in favor of a good report, because there are areas in color space that are a cinch to hit and others that are very, very difficult. A bit like the tiles defining CRI: when you get to pick the tiles, it really does become easier to produce a higher value.

Yup, it would be great to load an I* target and maybe some additional problematic colors into the validation process and see what results.
« Last Edit: November 02, 2009, 06:27:48 pm by digitaldog »
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".