
Author Topic: DSLR testing sites like DXOmark and Imaging Resource use HMI and LEDs for color  (Read 55957 times)

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word


I read Jim's paper last night.  Even though I didn't understand it, I was impressed with the complexity that goes on behind the scenes in image processing programs.

Here's a better place to start:

https://blog.kasson.com/the-last-word/the-color-reproduction-problem/

Jim

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word

My experiments were designed to find a better CRI. 

If we're ignoring capture metameric error, then I think a reasonable way to improve the CRI is to extend the patch set. This is not too hard to do with simulations (though it can take a while to run if you've got thousands of patches) but can be difficult IRL. There is a NASA set of natural spectra that you can download, and I think that Berns at RIT published a set of paint spectra. Sorry, I don't have links, but I might be able to find them if you want to go that route.
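For anyone wanting to try this, here is a minimal numpy sketch of that kind of patch-set simulation. Everything in it is a synthetic placeholder: Gaussian "sensor" curves, a flat illuminant, and random reflectances standing in for a real extended patch set (measured NASA or Berns spectra would replace `patches` in practice).

```python
import numpy as np

wl = np.arange(400, 701, 10)                     # wavelength grid, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Made-up camera spectral sensitivities (R, G, B), shape (3, n_wl)
camera = np.stack([gaussian(600, 40), gaussian(540, 40), gaussian(460, 40)])

illuminant = np.ones(wl.size)                    # flat equal-energy light

rng = np.random.default_rng(0)
patches = rng.uniform(0.05, 0.95, size=(5000, wl.size))   # 5000 patches

# Simulated raw responses: reflectance x light, integrated against
# each channel's sensitivity
raw = (patches * illuminant) @ camera.T          # shape (5000, 3)
print(raw.shape)
```

With thousands of patches the whole "capture" is one matrix product, which is why the simulation route is so much cheaper than shooting real charts.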

Jim

Doug Gray

  • Sr. Member
  • ****
  • Offline
  • Posts: 2197

Good stuff, Jim!
Logged

Alexey.Danilchenko

  • Sr. Member
  • ****
  • Offline
  • Posts: 257
    • Spectron

If we're ignoring capture metameric error, then I think a reasonable way to improve the CRI is to extend the patch set. This is not too hard to do with simulations (though it can take a while to run if you've got thousands of patches) but can be difficult IRL. There is a NASA set of natural spectra that you can download, and I think that Berns at RIT published a set of paint spectra. Sorry, I don't have links, but I might be able to find them if you want to go that route.

I would be very interested in those Jim if you can find them.

Thanks
Logged

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word

I would be very interested in those Jim if you can find them.

Here's one:

https://speclib.jpl.nasa.gov/

Unfortunately, the lambda range is greater than we'd prefer. I've seen more appropriate data sets from NASA. I'll keep looking.

Here's another set:

https://crustal.usgs.gov/speclab/QueryAll07a.php

Here is Berns' artist paint database:

https://www.rit.edu/cos/colorscience/mellon/pubs.php

By the way, there was a thread a while back that is tangentially related to this thread. It was concerned with CFA filters, but the same techniques could be used to evaluate lighting and camera interactions.

https://www.dpreview.com/forums/post/60278621

Jim
« Last Edit: June 13, 2018, 02:16:10 pm by Jim Kasson »
Logged

Alexey.Danilchenko

  • Sr. Member
  • ****
  • Offline
  • Posts: 257
    • Spectron

Here's one:

https://speclib.jpl.nasa.gov/

Unfortunately, the lambda range is greater than we'd prefer. I've seen more appropriate data sets from NASA. I'll keep looking.

Here's another set:

https://crustal.usgs.gov/speclab/QueryAll07a.php

Here is Berns' artist paint database:

https://www.rit.edu/cos/colorscience/mellon/pubs.php

By the way, there was a thread a while back that is tangentially related to this thread. It was concerned with CFA filters, but the same techniques could be used to evaluate lighting and camera interactions.

https://www.dpreview.com/forums/post/60278621


Brilliant - thanks Jim.
Logged

WayneLarmon

  • Full Member
  • ***
  • Offline
  • Posts: 162

Here's a better place to start:
https://blog.kasson.com/the-last-word/the-color-reproduction-problem/

Thanks.  This is a more approachable explanation.  The DPR CFA II modeling thread explains the difficulties you alluded to in your earlier post.  All of this is, of course, several orders of magnitude above my head.  But reading your blog posts and looking at the DPR threads makes me appreciate the complexities.

I really appreciate your using prose instead of equations in the fifth post in your series.  If Amazon would make an equation-to-prose translator I'd understand Wyszecki and Stiles better.  :)

In blog post 14 you noted how tedious it was to transcribe Lab numbers from Photoshop.  This is exactly why I tried my 3D plot approach of visualizing spectra.  I had followed Andrew's recommendation for BabelColor CC&T and transcribing Photoshop Lab values from different ColorChecker chart readings got old real fast.  I was looking at different spectral analysis programs hoping to find one where you could just refer to image files of a CC chart and a reference CC chart (like we can with the Adobe DNG Profile editor) when I got to the video of the photographer using Chromix ColorThink Pro to plot the gamut of an image against a color profile and I remembered that I could do the same thing with ArgyllCMS programs.  It was only a short leap to plot a CC gamut against a different CC gamut.

I'm looking for a middle ground that shows something meaningful about an illuminant's color rendering accuracy that doesn't require a very long detour into complicated math.  I can handle scripting ArgyllCMS (or other) command line programs.   

Doug said in an earlier post that hue shifts are more important than saturation changes.  If so, then doesn't this also apply to printer profiling?  Aren't hue shifts meaningful when profiling printers?  I don't think we can plot hue shifts with ArgyllCMS 3D plotting.

I like 3D plots because, as Doug said, they are pretty.  I'd also like them to be meaningful.
« Last Edit: June 14, 2018, 07:00:13 am by WayneLarmon »
Logged

Iliah

  • Sr. Member
  • ****
  • Offline
  • Posts: 770

You are not a color scientist, Alexey. You're a programmer as you've pointed out.

If you're so sure about this process then provide practical application that solves the problem of color reproduction using the photographic process.

As it is you don't know diddly squat about what you are doing. Get a degree in color science and come back and report that you solved a problem with that education that actually helped people, not just informed them. Talk is cheap and you're no expert. No one here is an expert, not even me.

I'm a pragmatist. I get things done! I don't talk about getting it done!

From the colour science perspective, you are utterly failing to refute anything Alexey is saying. Maybe it is you who are in need of an education to even begin understanding what is discussed here.
Logged

WayneLarmon

  • Full Member
  • ***
  • Offline
  • Posts: 162

The problem is that getting the closest to some CCT ignores the green/magenta error, which is unlimited.

Could you explain the unlimited green/magenta error?  Is this something about Duv?

Quote
And those (hue shifts) are what human vision is most sensitive to as saturation increases. In fact, generally human vision becomes less sensitive to changes in saturation as the saturation increases while hue shifts remain more perceivable. This is a principal reason I find 3D gamut images not very useful. It's more useful for showing saturation than hue shifts.

OK, 3D gamut plots aren't useful for comparing different measured lights against daylight, but what about the simpler case of comparing the gamut of a real-world image against the gamut of a color space (e.g., sRGB) as a 3D plot?  Illustrated by this plot of a synthetic Lab CC chart against sRGB, compared to my real-world shot of a CC chart taken under Walmart lights. In both cases cyan pokes out.  (The point of the two comparisons is to show that the gamut of my real-world shot of the CC chart is comparable to the gamut of the synthetic CC chart, in addition to showing the general validity of plotting image gamuts against color space gamuts.)

This is under the assumption that you shouldn't use a color space that is larger than needed.  We all know approximately what the gamut of a CC chart is, but we don't know the gamut of an arbitrary real-world image.  Isn't plotting the gamut of an image against a color space useful?
Logged

Doug Gray

  • Sr. Member
  • ****
  • Offline
  • Posts: 2197

Could you explain the unlimited green/magenta error?  Is this something about Duv?
Yes. For any given CCT there are an infinite number of xy (or uv) coordinates with that same CCT. Duv is one way to specify how far away from the black-body locus a specific color with that CCT lies.

That said, how small the Duv is, together with the CCT, only determines what the "white illuminant" is and says almost nothing about how well actual objects are rendered. Here's a specific example. There exist multiple pairs of saturated single wavelengths, each at the boundary of the CIE xy human vision gamut, that can produce a "white" at exactly D50*. One might look like a strong cyan, the other a strong orange. Mixed together in the right proportions, the result is "white." But if you then illuminate a ColorChecker with those two wavelengths, all the colors except for the 6 neutrals will look highly distorted. So focusing on how close a "white" is to D50 (or any other white point) tells only a small part of the story. You can have an extremely spectrally spiky illuminant that is very close to D50 yet is awful at rendering colors and likely has a very poor CRI. As inadequate as CRI is, it's more useful than Duv.

http://www.brucelindbloom.com/Eqn_XYZ_to_T.html

*This assumes a person with color perception that matches the "Standard Observer" and for cases of only two wavelengths there are significant variations between individuals.
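Doug's two-wavelength example rests on the fact that chromaticities mix linearly when weighted by each light's X+Y+Z sum. A tiny sketch of that point, using two illustrative chromaticities deliberately chosen to be collinear with D50 (they are not the actual monochromatic spectrum-locus pair he describes):

```python
D50 = (0.3457, 0.3585)          # CIE xy white point of D50
light_a = (0.2257, 0.3985)      # green-cyan on its own
light_b = (0.4657, 0.3185)      # orange on its own

def mix_xy(p, q, w):
    """Chromaticity of a mix; w is p's share of the total X+Y+Z."""
    return (w * p[0] + (1 - w) * q[0], w * p[1] + (1 - w) * q[1])

m = mix_xy(light_a, light_b, 0.5)
print(round(m[0], 4), round(m[1], 4))   # lands on the D50 white point
```

Neither endpoint looks remotely white, yet the 50/50 mix sits exactly on D50, which is why CCT and Duv alone say so little about color rendering.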
Quote


OK, 3D gamut plots aren't useful for comparing different measured lights against daylight, but what about the simpler case of comparing the gamut of a real-world image against the gamut of a color space (e.g., sRGB) as a 3D plot?  Illustrated by this plot of a synthetic Lab CC chart against sRGB, compared to my real-world shot of a CC chart taken under Walmart lights. In both cases cyan pokes out.  (The point of the two comparisons is to show that the gamut of my real-world shot of the CC chart is comparable to the gamut of the synthetic CC chart, in addition to showing the general validity of plotting image gamuts against color space gamuts.)

This is under the assumption that you shouldn't use a color space that is larger than needed.  We all know approximately what the gamut of a CC chart is, but we don't know the gamut of an arbitrary real-world image.  Isn't plotting the gamut of an image against a color space useful?

Usefulness depends. People understand comparisons in different ways. I'm somewhat numbers oriented and would rather see the 24 Lab numbers resulting from the Walmart light compared with the D50 standard. Especially with the dE00 difference on each patch. It's a short text listing. For me it's more informative. I do like comparing to Colorchecker patches since I'm more familiar with them.
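The "short text listing" Doug prefers is easy to sketch. The two patches below are invented placeholder Lab values, not measured ColorChecker data, and for brevity this uses ΔE*ab (CIE76); the ΔE00 he mentions needs the full CIEDE2000 formula or a library such as colour-science.

```python
import math

reference = [(38.0, 13.0, 14.0), (66.0, 17.0, 18.0)]   # D50 reference Lab
measured  = [(37.2, 14.1, 12.8), (67.5, 16.0, 19.3)]   # under test light

def delta_e76(lab1, lab2):
    """Plain Euclidean distance in Lab (CIE76)."""
    return math.dist(lab1, lab2)

for i, (r, m) in enumerate(zip(reference, measured), 1):
    print(f"patch {i}: dE76 = {delta_e76(r, m):.2f}")
```

Extending the lists to all 24 patches gives exactly the per-patch listing described, with the worst offenders immediately visible.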
« Last Edit: June 16, 2018, 10:30:28 pm by Doug Gray »
Logged

Jim Kasson

  • Sr. Member
  • ****
  • Offline
  • Posts: 2370
    • The Last Word


This is under the assumption that you shouldn't use a color space that is larger than needed. 

I think that is old-fashioned thinking from the bad old days of 8-bit precision. I am perfectly happy to use PPRGB at Ps's 15-bit-plus-one-value precision.

https://blog.kasson.com/?s=color+space+conversion

Jim

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/

I think that is old-fashioned thinking from the bad old days of 8-bit precision. I am perfectly happy to use PPRGB at Ps's 15-bit-plus-one-value precision.

https://blog.kasson.com/?s=color+space+conversion

Jim

Yes, that concept, outside of bit-depth considerations, is an urban legend, perhaps created on the DPR forums, and it is colorimetrically false. First, those who propose it can't tell us how we should determine whether the image gamut would fit. Sure, you can render from raw into various color spaces and plot them. Image by image? More importantly, it's unnecessary. When I heard this silly idea that you should use a working space whose gamut isn't any larger than the image's, I colorimetrically proved it wrong. I took an image that from raw can easily fit into sRGB, did so, then rendered into ProPhoto RGB. There's no difference using either color space!
If an image from raw can't fit into the resulting color space gamut, you clip colors; not good.
If an image easily fits, using something significantly larger makes NO difference.


These are empty containers of differing sizes and without an encoded pixel, they contain nothing.
Anyway, here's a video I did, dedicated to a fellow who states this nonsense over and over again in another forum and can't explain nor prove what he states has any validity, because it doesn't.

sRGB urban legend Part 1

In this 30 minute video I'll cover:
Is there benefit or harm using a wider color gamut working space than the image data?
Should sRGB image data always be encoded into sRGB?
What are RGB working spaces and how do they differ?
What is Delta-E and how we use it to evaluate color differences.
Color Accuracy: what it really means, how we measure it!
Using Photoshop to numerically and visually see color differences when using differing working spaces.
Using ColorThink Pro and BabelColor CT&A to show the effects of differing working space on our data and analyzing if using a smaller gamut working space is necessary.
Appendix: testing methodology, how differing raw converters encode into working spaces, capturing in-camera JPEG data and color accuracy.

Low resolution (YouTube): https://www.youtube.com/watch?v=1w0zUIl-dzY
High resolution: http://digitaldog.net/files/sRGBMyths.mp4
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Doug Gray

  • Sr. Member
  • ****
  • Offline
  • Posts: 2197

Anyway, here's a video I did, dedicated to a fellow who states this nonsense over and over again in another forum and can't explain nor prove what he states has any validity, because it doesn't.

sRGB urban legend Part 1

Low resolution (YouTube): https://www.youtube.com/watch?v=1w0zUIl-dzY
High resolution: http://digitaldog.net/files/sRGBMyths.mp4
As usual, another excellent video. I've watched many of yours but this is the first time I saw this one.  Quite good and should help folks get more of a handle on colorspaces and discard misconceptions many seem to have acquired.

One note. You pointed out that Photoshop truncates Lab values in the Info panel. They finally fixed that! Photoshop now shows high-resolution Lab values in the Info panel, but only if you select 32 bits and Lab. Then you get readings like (71.23, 19.84, 18.64). It works even with 8-bit RGB images and shows the actual fractional Lab values.
Logged

Iliah

  • Sr. Member
  • ****
  • Offline
  • Posts: 770

I think that is old-fashioned thinking from the bad old days of 8-bit precision. I am perfectly happy to use PPRGB at Ps's 15-bit-plus-one-value precision.

https://blog.kasson.com/?s=color+space+conversion

Jim

I also doubt "need" is something easily predictable. I know of 4 versions of Monet's San Giorgio Maggiore by Twilight, all in different colours; from measurements, one version doesn't fit sRGB.
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/

You pointed out that Photoshop truncates Lab values in the Info panel. They finally fixed that! Photoshop now shows high-resolution Lab values in the Info panel, but only if you select 32 bits and Lab. Then you get readings like (71.23, 19.84, 18.64). It works even with 8-bit RGB images and shows the actual fractional Lab values.
Yes, but the newer values don't make a lot of sense to me when set to 16-bit; why? Sampler point on some pixels with the readout set to 8-bit first, then 16-bit next.
Indeed, 32 bit is the way to go....
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/

Just discovered a cool trick: Make multiple sampler points on an image. Hold down Option key (Mac), toggle to Lab then toggle to 32 bit; all sampler points update to reflect that setting.
Too bad we can't set this in Preferences so it always reflects the settings on all documents. :-[  Everything defaults back to RGB/8-bits except the 'main' readout. At least it's sticky.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Iliah

  • Sr. Member
  • ****
  • Offline
  • Posts: 770

Yes, but the newer values don't make a lot of sense to me when set to 16-bit; why? Sampler point on some pixels with the readout set to 8-bit first, then 16-bit next.
Indeed, 32 bit is the way to go....
2 raised to the power of 15: 2^15 = 32768
44 / 100 × 2^15 = 14417.92
Given that 44 is probably rounded, that seems close enough.
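As a tiny sketch of that arithmetic, assuming Photoshop's "16-bit" scale really is 15 bits plus one value (0..32768):

```python
SCALE = 2 ** 15               # 32768

def lab_L_to_15bit(L):
    """Map L* in 0..100 onto the 0..32768 readout scale."""
    return L * SCALE / 100

print(lab_L_to_15bit(44))     # 14417.92
```

So an L* readout of 44 (itself rounded) landing near 14418 on the 16-bit scale is exactly what this mapping predicts.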
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/

2 raised to the power of 15: 2^15 = 32768
44 / 100 × 2^15 = 14417.92
Given that 44 is probably rounded, that seems close enough.
Thanks, good to know. I wonder however, why Adobe provided that option. Is there any reason to use it? It's just a readout that's quite foreign to me. 
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

WayneLarmon

  • Full Member
  • ***
  • Offline
  • Posts: 162

Yes, that concept, outside of bit-depth considerations, is an urban legend, perhaps created on the DPR forums, and it is colorimetrically false.

My assumption was based on several premises:
  • Smaller is smaller and computationally cheaper.  CPUs and storage space are neither free nor infinite.
  • If you do extensive editing in a smaller space, there is less danger of posterization in gradients (skies) than if you do extensive editing in a larger space and then export to a smaller space.
  • If you do extensive editing in a larger space (one that contains colors no real-world monitor can display), there is increased danger of hue and saturation shifts when you export to a smaller space.

If you know beforehand how your real-world image fits in various color spaces, you don't have to spend a lot of time with Photoshop's crude gamut-clipping soft-proofing tools.

Quote
First, those who propose it can't tell us how we should determine whether the image gamut would fit. Sure, you can render from raw into various color spaces and plot them. Image by image?

I am working on a tool that uses 100% free components and is easy to use.  Where is the disadvantage?

Quote
More importantly, it's unnecessary. When I heard this silly idea that you should use a working space whose gamut isn't any larger than the image's, I colorimetrically proved it wrong. I took an image that from raw can easily fit into sRGB, did so, then rendered into ProPhoto RGB. There's no difference using either color space!
If an image from raw can't fit into the resulting color space gamut, you clip colors; not good.
If an image easily fits, using something significantly larger makes NO difference.

I already told you I watched your sRGB Myths video.  I just watched it again.  I obliquely referenced it in my post to Doug when I highlighted CC chart cyan poking out of sRGB.  Your video covered a CC chart and an unedited image of a white dog and snow.  Neither illustrates the concerns I raised in my list, above.

I also said that I did a random walk examining different spectral analysis tools: BabelColor CC&T ($125), Robin Myers Imaging SpectraShop 5 ($99), and Chromix ColorThink Pro ($399).

I started out using CC&T to compare Lab values of different CC charts (ones I shot under different light against the reference CC Lab values), and hand-transcribing Lab values from the Photoshop Info panel into CC&T got tedious real quick.  I wanted something that would compare the Lab values from each square of the CC charts as one operation.  I examined all three programs and I don't think that any of them could do it.  So I returned to my existing methods of running images (and color spaces) through the ArgyllCMS utilities to produce 3D plots.  This is quick and easy--all I had to do was add the filenames of the various image files and ICC profiles to a configuration file and run my script.  And the interactive HTMLish 3D plots are easily sharable.  (ColorThink Pro's...?)

I also read Jim's blog posts

https://blog.kasson.com/the-last-word/the-color-reproduction-problem/.  As I posted earlier "In blog post 14 you noted how tedious it was to transcribe Lab numbers from Photoshop"  and then the post referenced Matlab.  I did a quick check of Matlab and that is "write for a quote", which I assumed didn't mean that it was cheap.  So I kept on with my script and ArgyllCMS programs and asked for suggestions on how to improve my methodology.

My random walk continued.  Last night I got to GNU Octave, which is free and claims to be "Drop-in compatible with many Matlab scripts".   So possibly this could be used as a free, untedious way of comparing CC chart Lab values? (And for doing many other things.)

But, not knowing anything about Matlab, I could use a head start in doing this.
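As a possible head start, the "compare all 24 patches in one operation" step is only a few lines of array code. This sketch is Python/numpy rather than Octave, but the Octave translation is nearly line for line; the file layout (24 rows of "L a b" per chart reading) and the demo values are assumptions, not real measurements.

```python
import numpy as np

def load_lab(path):
    """Read a chart reading: a text file of 24 rows of 'L a b'."""
    return np.loadtxt(path)                             # -> shape (24, 3)

def compare(reference, measured):
    de = np.linalg.norm(reference - measured, axis=1)   # dE76 per patch
    return de, de.mean(), de.max()

# Demo with two synthetic patches instead of real files:
ref = np.array([[38.0, 13.0, 14.0], [96.0, 0.0, 1.0]])
mea = ref + np.array([[1.0, -0.5, 0.3], [0.2, 0.1, -0.4]])
de, avg, worst = compare(ref, mea)
print(np.round(de, 2), round(avg, 2), round(worst, 2))
```

Swap the synthetic arrays for `load_lab("reference.txt")` and `load_lab("walmart.txt")` and the whole chart comparison, with mean and worst-case ΔE, is one script run.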
« Last Edit: June 17, 2018, 12:46:17 pm by WayneLarmon »
Logged

Doug Gray

  • Sr. Member
  • ****
  • Offline
  • Posts: 2197

Just discovered a cool trick: Make multiple sampler points on an image. Hold down Option key (Mac), toggle to Lab then toggle to 32 bit; all sampler points update to reflect that setting.
Too bad we can't set this in Preferences so it always reflects the settings on all documents. :-[  Everything defaults back to RGB/8-bits except the 'main' readout. At least it's sticky.

Nice!  One caution on the hi-rez (32 bit) Lab readouts. They are wrong when zoom levels are under 100%. Looks like Adobe is grabbing them from downsampled 8 bit data.

I do find the hi-rez stuff useful for things like examining the color shifts over a neutral gradient R=B=G in printer device space to see the sorts of color perturbations from device neutral. Big differences between my 9500 and 9800. The profiling software works harder (and needs more patches) in a few areas of the neutrals.
Logged