A while back in this thread, before Eric Chan pointed out that there were two image processing pipelines in Lightroom and that all my integer TIFF testing wasn't really relevant to raw processing, some of you asked about different error measures than CIELab Delta-E, specifically something related to hue errors. These color difference measures can get quite complicated.

Here's an example. I can, if there's sufficient interest, implement some of these more complex, and presumably more accurate, measures, but here I'll present just the basics. In the previous post, I presented the Delta-E curves.
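For anyone who wants to follow along at home, here's a minimal sketch of the plain CIE76 Delta-E computation underlying those curves. The function name and the (L*, a*, b*) tuple convention are my own for illustration; this isn't code from my test harness:

```python
import math

def delta_e_ab(lab_ref, lab_test):
    """CIE76 Delta-E*ab: Euclidean distance between two CIELAB colors,
    each given as an (L*, a*, b*) tuple."""
    dL = lab_test[0] - lab_ref[0]
    da = lab_test[1] - lab_ref[1]
    db = lab_test[2] - lab_ref[2]
    return math.sqrt(dL * dL + da * da + db * db)
```

Since it's a Euclidean distance, Delta-E is always non-negative; it tells you how big an error is, but nothing about its direction.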

Here's a set of Delta-H curves; Delta-H is the basic CIELab difference metric for hue error:

Like Delta-E, Delta-H can't go negative.
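To make that concrete, Delta-H is usually computed from Delta-E by subtracting out the lightness and chroma components in quadrature, which is why it can't go negative. A minimal sketch (again with my own illustrative names, and a clamp for floating-point round-off):

```python
import math

def delta_h(lab_ref, lab_test):
    """CIELAB hue difference Delta-H*ab:
    sqrt(Delta-E^2 - Delta-L^2 - Delta-C^2), clamped at zero."""
    dL = lab_test[0] - lab_ref[0]
    da = lab_test[1] - lab_ref[1]
    db = lab_test[2] - lab_ref[2]
    # Metric chroma of each color: radial distance from the grey axis
    c_ref = math.hypot(lab_ref[1], lab_ref[2])
    c_test = math.hypot(lab_test[1], lab_test[2])
    dC = c_test - c_ref
    dE_sq = dL * dL + da * da + db * db
    # Round-off can push the difference slightly negative; clamp it
    return math.sqrt(max(0.0, dE_sq - dL * dL - dC * dC))
```

For two colors with the same L* and the same chroma but different hue angles, all of the Delta-E shows up as Delta-H.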

Here are the Delta-C curves, which measure differences in what's called "metric chroma", the radial distance from the grey axis:

Delta-C is positive if the test color is more chromatic than the reference, and negative if the test color is less chromatic than the reference. The fact that there's a tendency towards positive Delta-C indicates that making positive LR exposure adjustments tends to make the colors more chromatic than they should be. You can see that from the scatter plots as well.
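Unlike Delta-E and Delta-H, Delta-C is signed, which is what lets it reveal that bias. A minimal sketch of the computation (illustrative naming, not my actual test code):

```python
import math

def delta_c(lab_ref, lab_test):
    """Signed metric-chroma difference Delta-C*ab.
    Positive: test color is more chromatic than the reference.
    Negative: test color is less chromatic than the reference."""
    c_ref = math.hypot(lab_ref[1], lab_ref[2])    # sqrt(a^2 + b^2)
    c_test = math.hypot(lab_test[1], lab_test[2])
    return c_test - c_ref
```

A test color farther from the grey axis than its reference gives a positive result, which is the pattern the plots show for positive exposure adjustments.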

If you look at the numbers in the Delta-H and Delta-C plots, you can see that most of the chromaticity error comes from too much chroma, not from hue shift.

And, although it's not a CIELab standard measure, I like to look at hue angle errors. The big potential problem with looking at hue angle alone is that errors near the grey axis, where they are inconsequential, can have the same value as errors in highly chromatic colors, where they are significant. Still, it gives an indication of any hue-shift bias. Here is the hue angle data:

Pretty symmetrical.
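For completeness, the hue angle itself is just the angle of the (a*, b*) point around the grey axis, and the error is the wrapped difference between the two angles. A sketch (my own naming; the wrap keeps the signed error in the range -180 to +180 degrees):

```python
import math

def delta_hue_angle(lab_ref, lab_test):
    """Signed hue-angle error in degrees, where hue angle
    h = atan2(b*, a*), with the difference wrapped to [-180, 180)."""
    h_ref = math.degrees(math.atan2(lab_ref[2], lab_ref[1]))
    h_test = math.degrees(math.atan2(lab_test[2], lab_test[1]))
    d = h_test - h_ref
    # Wrap so that, e.g., 350 degrees of difference reads as -10
    return (d + 180.0) % 360.0 - 180.0
```

Because it's signed, a symmetrical histogram of these values around zero, like the one above, is what you'd expect when there's no systematic hue-shift bias.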

Jim