
Author Topic: DSLR testing sites like DXOmark and Imaging Resource use HMI and LEDs for color  (Read 53708 times)

Doug Gray

  • Sr. Member
  • ****
  • Offline
  • Posts: 2192

The ColorChecker is good at representing memory colors in a small sample set. BTW, you don't have many "daylight metamers" for orange, yellow, red, and aqua in nature. Highly saturated reflected colors are less prone to metamerism.
100% LI conformity is not the goal, the "acceptable" color accuracy is. Perfect is the enemy of good.
No, reaching LI nirvana is unrealistic. However, I have a white LED with quite good spectral characteristics apart from a hump at 460nm and a dip around 500nm. It would have a hard time rendering a highly saturated cyan (in chromaticity). The luminosity of a saturated cyan is quite low. But it has no problem rendering the cyan in a ColorChecker, because that patch isn't very saturated and the peak/valley pretty much overlaps the cyan patch's reflectance spectrum. But if I had a cyan with a passband from 480 to 530nm, the LED would fail, creating a much darker cyan than D50 at the same Lux.

I also took a look at a couple of oranges that are D50 metamers. It turns out the reflectance spectrum of one differs quite a bit from the other, with the main transition shifted about 15nm between them. They would fail to be metamers if the illuminant had abrupt changes in that 15nm interval, and there are fluorescents that do. And orange is one of those colors very sensitive to hue shift.
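Doug's metamer point can be sketched numerically. The toy example below uses made-up Gaussian stand-ins for the CIE color matching functions (not real observer data): it constructs two reflectances that match exactly under a smooth illuminant, then shows the match breaking under an illuminant with an abrupt notch in the sensitive interval.

```python
import numpy as np

wl = np.arange(400, 701, 10)  # 31 spectral samples, 400-700 nm

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Made-up Gaussian stand-ins for the observer's color matching functions
cmfs = np.stack([gauss(600, 40) + 0.3 * gauss(450, 25),
                 gauss(555, 45),
                 1.7 * gauss(450, 25)])

smooth = np.ones(wl.size)                # flat, daylight-like illuminant
spiky = smooth.copy()
spiky[wl == 540] = 0.1                   # abrupt notch, like a fluorescent gap

def xyz(illum, refl):
    # Tristimulus response: illuminant x reflectance integrated against the CMFs
    return cmfs @ (illum * refl)

r1 = 0.2 + 0.6 * gauss(600, 50)          # an orange-ish reflectance

# Second reflectance: add a perturbation projected into the null space of the
# smooth-illuminant response, so r1 and r2 are exact metamers under `smooth`
basis = (cmfs * smooth).T                # 31x3 response vectors
pert = gauss(540, 10)
pert = pert - basis @ (np.linalg.pinv(basis) @ pert)
r2 = r1 + 0.3 * pert

match_smooth = np.allclose(xyz(smooth, r1), xyz(smooth, r2))   # metamers under smooth light
shift_spiky = np.abs(xyz(spiky, r1) - xyz(spiky, r2)).max()    # match breaks under spiky light
```

The null-space projection is what makes the pair metameric by construction; under the notched illuminant the residual difference at 540nm no longer cancels, which is exactly the failure mode Doug describes.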

But it comes down to both the illuminant and the CFA filters. Both contribute to the underlying error. How would you quantify their relative contributions? I've looked at the color shifts induced by F8 fluorescents and a high-CRI (>95) LED. The LED wins out over the F8 but still produces dEs of 1-3 with my print inks. I believe Jim K. has done some work with camera imagers; IIRC, the dEs were somewhat larger.

In any case, the contribution of both the illuminant and the sensor/CFA to color error needs to be quantified. I don't put much credence in Imatest's color accuracy measurements because they have no way to measure error from the illuminant, even though I suspect it is smaller than that from the sensor/CFA. But who knows for sure?
« Last Edit: March 18, 2018, 01:09:11 am by Doug Gray »

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20305
  • Andrew Rodney
    • http://www.digitaldog.net/

No, reaching LI nirvana is unrealistic.
So what about post #16?
http://www.digitaldog.net/
Author “Color Management for Photographers”.

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word

It occurs to me that if the camera were LI, the light source wouldn't make any difference. The camera would faithfully recreate the response of the Standard Observer to the patch under consideration with the illuminant chosen.

Jim
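Jim's observation can be shown in a few lines of linear algebra: if a camera's spectral sensitivities are a fixed invertible 3x3 mix of the observer's color matching functions (the Luther-Ives condition), then one fixed matrix recovers the Standard Observer's response exactly, whatever the light source. A toy sketch with made-up Gaussian CMFs (not real observer data, and a hypothetical mixing matrix M):

```python
import numpy as np

wl = np.arange(400, 701, 10)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Made-up Gaussian stand-ins for the Standard Observer's color matching functions
cmfs = np.stack([gauss(600, 40) + 0.3 * gauss(450, 25),
                 gauss(555, 45),
                 1.7 * gauss(450, 25)])

# Luther-Ives: camera sensitivities are a fixed nonsingular 3x3 mix of the CMFs
M = np.array([[0.9, 0.2, 0.0],
              [0.1, 1.1, 0.1],
              [0.0, 0.2, 0.8]])
camera = M @ cmfs

rng = np.random.default_rng(0)
max_err = 0.0
for _ in range(100):
    stimulus = rng.random(wl.size)        # arbitrary illuminant x reflectance product
    xyz_true = cmfs @ stimulus            # Standard Observer's response
    raw = camera @ stimulus               # what the LI camera records
    xyz_rec = np.linalg.solve(M, raw)     # one fixed 3x3 correction, no illuminant model
    max_err = max(max_err, np.abs(xyz_true - xyz_rec).max())
```

Since `raw = M @ (cmfs @ stimulus)` for every stimulus, inverting M recovers the true tristimulus values exactly; the illuminant never enters the correction.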

Jim Kasson


So what about post #16?

I can't find it at B&H.

digitaldog

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 20305
  • Andrew Rodney
    • http://www.digitaldog.net/

It occurs to me that if the camera were LI, the light source wouldn't make any difference. The camera would faithfully recreate the response of the Standard Observer to the patch under consideration with the illuminant chosen.

Jim
I absolutely agree!

Doug Gray


It occurs to me that if the camera were LI, the light source wouldn't make any difference. The camera would faithfully recreate the response of the Standard Observer to the patch under consideration with the illuminant chosen.

Jim

I absolutely agree!

Which is why the spectral response of the CFA/sensor is the gold standard. Illuminant drops out of the equation.

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436

The CRI replacement that is gaining the most interest lately is IES TM-30-15.

In the video, the hue shift of the bell peppers with saturation and luminance changes (supposedly caused by the light source) does not happen in reality, as far as I've seen. But how much of that color change is due to the camera and software? To me the scene doesn't look quite accurate with regard to contrast levels. I realize it's just a demo, but it does show what I have to deal with when editing images, and that has nothing to do with the light source used.

After editing thousands of Raws over the years, I am constantly custom-shaping ACR's Point Curve in the midrange-to-deep-shadows section in order to override the contrast-inducing hue/sat shift that's part of the "filmic" look engineered into ACR's color rendering tools. A flat Linear curve looks too washed out, making it not true to the actual scene; clarity and definition take a hit.

The accuracy of the scene to the viewer is highly affected by this contrast-weighted appearance, which has to be viewed on a display with the same contrast and overall luminance. I don't see how anyone can separate which image rendering characteristic is being influenced by the other capture/rendering mechanics and know for sure it's caused by the light source the scene is lit by rather than by the display.

Even when image-to-scene accuracy examples are rendered by measured numbers, they still don't look accurate. The camera will always get in the way, and a camera profile only affects hue/sat and some contrast rendering, but not enough to make the captured scene appear accurate.

digitaldog


Even when image-to-scene accuracy examples are rendered by measured numbers, they still don't look accurate. The camera will always get in the way, and a camera profile only affects hue/sat and some contrast rendering, but not enough to make the captured scene appear accurate.
Because (again!) there is a massive difference between color perception and color appearance. We've been over this no*?
I'll refrain from asking you what you mean by 'accuracy' below.


*Colorimetry and dE testing are about color perception, not color appearance. The reason viewing a print is more valid than measuring it is that measurement is about comparing solid colors, while color appearance is about evaluating images and color in context, which measurement devices can't provide. Colorimetry was never designed as a color appearance model. It was never designed even to be used as an interchange space between device-dependent color models. It's not designed for imagery at all. Colorimetry is based on solid colors in very specific ambient and surround conditions.

joofa

  • Sr. Member
  • ****
  • Offline
  • Posts: 544

I can't find it at B&H.

Doesn't matter. The point was having a camera. Not one at B&H.
Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins

Tim Lookingbill


Because (again!) there is a massive difference between color perception and color appearance. We've been over this no*?
I'll refrain from asking you what you mean by 'accuracy' below.


*Colorimetry and dE testing are about color perception, not color appearance. The reason viewing a print is more valid than measuring it is that measurement is about comparing solid colors, while color appearance is about evaluating images and color in context, which measurement devices can't provide. Colorimetry was never designed as a color appearance model. It was never designed even to be used as an interchange space between device-dependent color models. It's not designed for imagery at all. Colorimetry is based on solid colors in very specific ambient and surround conditions.

You don't offer solutions, Andrew. You only state absolutes based on data that doesn't help photographers or explain the issue I just described about contrast, and what and why there are hue shifts with saturation and luminance changes. You don't provide any practical information that makes a connection between the light source used to photograph, or to view prints under, and this contrast/hue change.

And thanks for refraining from stating the obvious about color perception and appearance. It would help if you could provide data other than spectral graphs and dry color science articles.

digitaldog


You don't offer solutions, Andrew. You only state absolutes based on data that doesn't help photographers or explain the issue I just described about contrast, and what and why there are hue shifts with saturation and luminance changes.
I'd advise photographers to ignore this entirely and render the images as they desire. Which has absolutely nothing to do with color accuracy nor must it!


Jim Kasson


You only state absolutes based on data that doesn't help photographers or explain the issue I just described about contrast, and what and why there are hue shifts with saturation and luminance changes.


The reason that most -- but not all -- saturation and luminance changes performed in RGB color spaces result in hue shifts is not a mystery, nor is it without mitigation:

http://www.ingentaconnect.com/content/ist/cic/1994/00001994/00000001/art00023

Ps has blending modes that do similar things to the techniques that I developed in the paper. If you're interested in the math behind this and don't want to spring for the 12 bucks, I might be able to dredge up a copy of the paper and scan it.

Jim

Tim Lookingbill


The reason that most -- but not all -- saturation and luminance changes performed in RGB color spaces result in hue shifts is not a mystery, nor is it without mitigation:

http://www.ingentaconnect.com/content/ist/cic/1994/00001994/00000001/art00023

Ps has blending modes that do similar things to the techniques that I developed in the paper. If you're interested in the math behind this and don't want to spring for the 12 bucks, I might be able to dredge up a copy of the paper and scan it.

Jim

I don't see how that explains what any of this has to do with a light source's spectral effect on color appearance accuracy in repro work, where the subject, lit by manufactured lighting, is right next to the calibrated editing workstation. I also don't see how that can be integrated into a Raw workflow within the Raw converter. I'm not interested in working on pixels, if that's where the math is being applied.

Yes, I know this RGB hue shift behavior is not a mystery. It's been a known issue for some time, but the corrections I've seen implemented, and the complexity involved in mitigating this, show a complete lack of knowing when something doesn't look right; the methods are so complex and science-based that finding what to do to make it look right isn't worth the time to dig into.

Nothing's perfect I guess even when math and science is used.

Scientists seem to have a different perception of color relationships.

Thanks for the offer to scan the papers you mentioned but I'm not going to be able to understand it anyway.

Doug Gray


I don't see how that explains what any of this has to do with a light source's spectral effect on color appearance accuracy in repro work, where the subject, lit by manufactured lighting, is right next to the calibrated editing workstation.

Repro work is quite a different beast. There, colorimetry rules. You don't even need a good monitor, because you are trying to get the same colors (in a colorimetric sense) on the repro as are on the original. The appearances should match if the colorimetry matches. The largest sources of error here are the illuminant's divergence from D50 (if using D50 for profiles) and the camera's (or scanner's) divergence from LI.

Generally, for repro work one doesn't tweak colors (there are exceptions where the imager's capture is off in critical areas due to departure from LI). One also uses Absolute Colorimetric intent. The most important thing is to match the white point, or most luminous part, of the original to the captured image. This all assumes the gamut of the printer is sufficient to encompass the colors in the original; if not, your reproduction will be flawed from the start.
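The white-point matching step Doug describes can be sketched as a simple von Kries-style diagonal scaling in XYZ. This is only a minimal illustration of the idea, not Doug's actual workflow (which would go through profiles); the measurement values below are made up.

```python
import numpy as np

# Hypothetical spectro readings (illustrative numbers, not real measurements):
# XYZ of the original's paper white, and of that same white in the capture.
original_white = np.array([0.92, 0.96, 1.02])
captured_white = np.array([0.80, 0.85, 0.95])

def match_white(xyz_in, captured_white, original_white):
    """Per-channel (von Kries-style) scaling in XYZ so the captured white
    lands exactly on the original's measured white, as one would want
    ahead of an Absolute Colorimetric conversion."""
    return xyz_in * (original_white / captured_white)

patch = np.array([0.40, 0.30, 0.20])          # some captured patch
corrected = match_white(patch, captured_white, original_white)
white_check = match_white(captured_white, captured_white, original_white)
```

By construction the captured white maps exactly onto the original's white, and every other patch is scaled by the same per-channel factors.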

digitaldog


The largest sources of errors doing this are from the illuminant's divergence from D50 (if using D50 for profiles) and the camera (or scanner) divergence from LI.
What is the degree of such errors?

Tim Lookingbill


Repro work is quite a different beast. There, colorimetry rules. You don't even need a good monitor, because you are trying to get the same colors (in a colorimetric sense) on the repro as are on the original. The appearances should match if the colorimetry matches. The largest sources of error here are the illuminant's divergence from D50 (if using D50 for profiles) and the camera's (or scanner's) divergence from LI.

Generally, for repro work one doesn't tweak colors (there are exceptions where the imager's capture is off in critical areas due to departure from LI). One also uses Absolute Colorimetric intent. The most important thing is to match the white point, or most luminous part, of the original to the captured image. This all assumes the gamut of the printer is sufficient to encompass the colors in the original; if not, your reproduction will be flawed from the start.

Don't you think the hue/sat shifts in the video's bell pepper examples, demonstrated with colored LEDs as a predictor for white light of various spectra, aren't caused by the light but perhaps by software manipulation? I've never seen any of the white lights I've used do that to a bell pepper.

That site's info and video are made by very learned color scientists and engineers, so I'm surprised they attribute that color behavior to lighting. Most white lighting I've worked with applies a flatter filter effect, an evenly distributed wash/stain of the color-temperature hue over the entire scene, but never shifts hue/sat that severely.

I've never seen a camera profile attempt to characterize that behavior and correct it when the profile is applied in post. Further editing is required.

Doug Gray


Don't you think the hue/sat shifts in the video's bell pepper examples, demonstrated with colored LEDs as a predictor for white light of various spectra, aren't caused by the light but perhaps by software manipulation? I've never seen any of the white lights I've used do that to a bell pepper.

That site's info and video are made by very learned color scientists and engineers, so I'm surprised they attribute that color behavior to lighting. Most white lighting I've worked with applies a flatter filter effect, an evenly distributed wash/stain of the color-temperature hue over the entire scene, but never shifts hue/sat that severely.

I've never seen a camera profile attempt to characterize that behavior and correct it when the profile is applied in post. Further editing is required.

Tim, what they are doing is rather unusual. They are creating "white" light by selecting and varying the contributions of a large number of LEDs, each with a fairly narrow bandwidth. There is a diffuser so that you don't see any of the LED colors separately. And the "whites" are all computer controlled so that the chromaticity values are those of D60 above 6000K and the black body below 4000K, avoiding the jump that is made between D50 and 4999K.

They then modify the spectrum but maintain the chromaticity of the white point.

Think of the specific example of your monitor. Let's assume you profiled it for D65 and filled it with white. That white does not have the same spectrum as D65; it just has the same chromaticity, so it looks white and will appear to have the same white as D65 daylight. But spectrally it is VERY lumpy. One consequence is that if you use your monitor's "white" output to illuminate, say, a red apple, it will actually appear a more saturated red than if you illuminated it with D65 daylight at the same Lux.

What they are demonstrating is that by creating lumpy spectra that are still "white," one can increase the color saturation of many common objects. They believe this might be of commercial interest, for instance to enhance the appearance of produce or make product packaging stand out more.

And there are ways to decrease saturation, by moving the spectrum inwards from both ends a bit, so long as the chromaticity of the white point is maintained. The whites still look the same, but all the colors of the objects change.

Along the way they describe a couple of metrics, one of which could be used as an improved CRI.
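The "same white, different object colors" effect Doug describes can be sketched numerically: add a lumpy perturbation with zero visible (XYZ) component to a smooth illuminant, so the white point is untouched while non-flat reflectances shift. The Gaussian CMFs below are made-up stand-ins for the real observer, so this is a toy demonstration of the principle, not a simulation of their rig.

```python
import numpy as np

wl = np.arange(400, 701, 10)

def gauss(mu, sigma):
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Made-up Gaussian stand-ins for the color matching functions
cmfs = np.stack([gauss(600, 40) + 0.3 * gauss(450, 25),
                 gauss(555, 45),
                 1.7 * gauss(450, 25)])

def xyz(illum, refl=1.0):
    # Tristimulus response of illuminant x reflectance under the toy CMFs
    return cmfs @ (illum * refl)

flat = np.ones(wl.size)                          # smooth reference "white"

# Lumpy perturbation projected to have zero XYZ component: adding it
# changes the spectrum but leaves the white point exactly where it was.
pert = gauss(620, 15) - gauss(500, 15)
pert = pert - cmfs.T @ (np.linalg.pinv(cmfs.T) @ pert)
lumpy = flat + 0.8 * pert                        # spectrally lumpy, same white XYZ

red = 0.1 + 0.8 * gauss(630, 30)                 # a red-apple-like reflectance
white_shift = np.abs(xyz(flat) - xyz(lumpy)).max()          # ~0: whites match
red_shift = np.abs(xyz(flat, red) - xyz(lumpy, red)).max()  # nonzero: object color moves
```

This is the dual of a reflectance metamer: here the two illuminants are metameric whites, and any reflectance that samples the spectrum unevenly, like the red apple, renders differently under them.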

« Last Edit: March 19, 2018, 12:47:12 am by Doug Gray »

Doug Gray


Is what is degree such errors?
Not sure exactly what you're asking. Generally, the few problems I run into where repro colors are noticeably different are in things like magentas and oranges that are more perceptually sensitive to hue errors. For instance, a purplish-blue dress might not get rendered accurately by the imager. The problem is almost always that my imaging system simply has the color too far off; LI strikes again. I'll take a spectro reading of the original if it's critical. Still, it's pretty rare and easily fixed in Photoshop.

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto

What camera conforms to the Luther-Ives condition?

Hi,

I think Jim Kasson did present a list of cameras fulfilling the Luther-Ives condition. It is reproduced here:

[image attachment not reproduced]

Best regards
Erik
Erik Kaffehr
 

Lundberg02

  • Sr. Member
  • ****
  • Offline
  • Posts: 379

since there was nothing there. I assume that's what you mean.