
Author Topic: DxO PhotoLab 2 working space tests  (Read 12536 times)

Jack Hogan

    • Hikes -more than strolls- with my dog
Re: DxO PhotoLab 2 working space tests
« Reply #80 on: January 19, 2019, 09:31:45 am »

Hi Doug,

That's just me thinking floating point vs unsigned integers.  It appears that many folks over-think the in/out of working gamut issue, which is instead pretty simple both in principle and in practice: start by visualizing a cube, end by visualizing a cube; whatever image information falls outside of the ending cube is out of gamut.  That's it.

Some free-wheeling thoughts.  Assuming linearity for the sake of simplicity, image information:

1) starts out as a plane of raw data (say RGGB);
2) is assembled, after white balance and demosaicing, into an initial RGB cube with origin at [0,0,0]; and
3) remains the same cube, albeit viewed from a different perspective, when it is stored in the file to be displayed.  The final point of view is specified by the relevant spec (e.g. Adobe RGB).

So how can image information end up falling out of the final cube?  There are only two ways (they can actually be considered one and the same):

i) White balance
ii) The change of perspective matrix multiplication.

White balance multipliers are what they are given the sensor and the illuminant.  For my 5DSR ISO100 example above they were

 1 / [0.45819,1,0.65374]

for r, g and b respectively.  This means that, absent some brightness manipulation, image information from the red CFA channel greater than 45.8% of full scale will end up out of gamut (likewise blue greater than 65.4% and green greater than 100%).
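A minimal numeric sketch of that clipping point, using the multipliers above (the pixel values are hypothetical, chosen to straddle the 45.8% red threshold):

```python
import numpy as np

# 5DSR ISO100 white balance multipliers from the example: 1 / [0.45819, 1, 0.65374]
wb_mul = 1.0 / np.array([0.45819, 1.0, 0.65374])

# Hypothetical demosaiced raw pixels, normalized so full scale = 1.0
pixels = np.array([
    [0.40, 0.90, 0.50],   # red below 45.8% of full scale
    [0.60, 0.90, 0.50],   # red above 45.8% of full scale
])

balanced = pixels * wb_mul                       # apply white balance
out_of_gamut = np.any(balanced > 1.0, axis=1)    # any channel pushed past 1.0?
print(out_of_gamut)   # [False  True]
```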

After white balancing (so the initial cube will have origin at [0,0,0] and normalized vertex at [1,1,1]) only positive normalized values between 0 and 1 are part of the color space cube, everything else is out of gamut.  The wbraw->aRGB compromise color matrix for this case happens to be:

[   1.4047   -0.3878   -0.0169
   -0.2301    1.7805   -0.5504
    0.0043   -0.4620    1.4577 ]

This matrix will project all rgb image data in the initial cube 2) to the final cube 3) above.  Note for instance that the origin of the initial cube [0,0,0] also maps to [0,0,0] in the final cube; and the initial normalized vertex [1,1,1] maps to [1,1,1] in the final cube.  Black to black, white to white, so far so good.

But it is also obvious that the projection may push some image data out of the final RGB cube.  For instance a full green CFA input signal with no red and blue in the initial cube, [0,1,0], will result in a tone outside of the final cube: [-0.3878, 1.7805, -0.4620].  Those coordinates are not all between zero and one, so the tone is out of gamut.  Many more tones in that neighborhood behave the same way.
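The black, white and green cases can be checked numerically (a small sketch; note that multiplying by [0,1,0] simply picks out the second column of the matrix):

```python
import numpy as np

# The wbraw -> Adobe RGB compromise matrix from the post
M = np.array([
    [ 1.4047, -0.3878, -0.0169],
    [-0.2301,  1.7805, -0.5504],
    [ 0.0043, -0.4620,  1.4577],
])

black = M @ np.array([0.0, 0.0, 0.0])   # [0, 0, 0]: black maps to black
white = M @ np.array([1.0, 1.0, 1.0])   # each row sums to 1, so white maps to white
green = M @ np.array([0.0, 1.0, 0.0])   # second column: [-0.3878, 1.7805, -0.4620]

print(np.all((green >= 0) & (green <= 1)))   # False: outside the Adobe RGB cube
```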

That's all there is to it, no magic.  Just in or out of the final RGB cube, which in this example is Adobe RGB.

My earlier point was simply that if one sticks to floating point and retains all values (less than zero, greater than one and all) until the final conversion, it really does not make any difference whatsoever to tone 'accuracy' what the matrix (hence the working color space or its gamut size) is.  No need to work in ProPhoto RGB, for instance.  Even so, I would want a working color space with a gamut similar to that of the monitor on which the image being adjusted is viewed, so that one can see what one is doing.  For most of us these days that means, at best, a true 8-bit video path to an Adobe RGB monitor.

Cheers,
Jack



32BT

    • Pictures
Re: DxO PhotoLab 2 working space tests
« Reply #81 on: January 19, 2019, 10:22:20 am »

Is that a roundabout way of saying your working space is unlimited XYZ?

I implemented that at some point: I used a perceptual XYZ space (which is effectively an RGB space, just with virtual primaries) as a working space for additional processing.  Technically you just want to stay in source space as long as possible for exposure and white balance compensation.

For creative adjustments, however, one may opt for an independent working space, which should at least operate relatively predictably.  Perceptual XYZ seemed to work fine on modern hardware.
Regards,
~ O ~
If you can stomach it: pictures

Doug Gray

Re: DxO PhotoLab 2 working space tests
« Reply #82 on: January 19, 2019, 11:31:35 am »

Quote from: Jack Hogan
"That's just me thinking floating point vs unsigned integers. ... start by visualizing a cube, end by visualizing a cube, what image information falls outside of the ending cube is out of gamut.  That's it."

Sure. Any RGB space represented in floating point (negatives included) will cover the entire human gamut. There is even a spec for signed sRGB. The notion was that negative digital values would simply be clipped when going to an sRGB-like monitor, while larger-gamut monitors could display more saturated colors.
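A small sketch of that clipping notion (the color value is made up for illustration; the point is only the clip-at-the-monitor behavior):

```python
import numpy as np

# A floating-point RGB color with a negative coordinate: outside this
# monitor's gamut, but still a meaningful (more saturated) color.
color = np.array([-0.12, 0.85, 0.30])

# Naive handling for an sRGB-like monitor: clip to [0, 1], losing
# saturation; a wider-gamut display could render it more faithfully.
clipped = np.clip(color, 0.0, 1.0)
print(clipped)   # the negative red channel becomes 0.0
```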
Quote from: Jack Hogan
"So how can image information end up falling out of the final cube? ... That's all there is to it, no magic.  Just in or out of the final RGB cube, which in this example is Adobe RGB."

Absolutely! Camera sensors don't have a gamut, at least in the normal sense of the word; they have a response. Sensors, in combination with their CFAs, are quite linear. To the degree the combination meets the Luther-Ives (L/I) condition, the sensor's channels will map perfectly to XYZ (positive numbers only), and to any other RGB space if negative numbers are allowed. L/I is an unmet ideal, so the mapping is not one to one. If one sweeps the wavelengths from 400nm to 700nm, the xy transform will move in and out of the human xy horseshoe for linear curve fits.
Quote from: Jack Hogan
"My earlier point was simply that if one sticks to floating point and retains all values (less than zero, greater than one and all) until the final conversion, it really does not make any difference whatsoever to tone 'accuracy' what the matrix (hence the working color space or its gamut size) is."

Of course.

Check your PM for email.

digitaldog

  • Andrew Rodney
    • http://www.digitaldog.net/
Re: DxO PhotoLab 2 working space tests
« Reply #83 on: January 19, 2019, 11:47:29 am »

“Human gamut”? May I suggest something like simply “human perception of color”? Humans (like cameras) do not have a color gamut. Considering the forum and a lack of quotes around human gamut, I hope no one is upset by the clarification.
http://www.digitaldog.net/
Author "Color Management for Photographers".

Doug Gray

Re: DxO PhotoLab 2 working space tests
« Reply #84 on: January 19, 2019, 12:37:29 pm »

Quote from: digitaldog
"May I suggest something like simply 'human perception of color'? Humans (like cameras) do not have a color gamut."
Yeah. Probably a much better term is the CIE xy gamut, which is the physical limit of single-wavelength chromaticity response modeled through the color matching functions. A specific xyY defines a stimulus, but not the perceived apparent color, which depends strongly on adaptation state and the current surround.

GWGill

  • Author of ArgyllCMS & ArgyllPRO ColorMeter
    • ArgyllCMS
Re: DxO PhotoLab 2 working space tests
« Reply #85 on: January 19, 2019, 08:49:55 pm »

Quote from: Jack Hogan
"start by visualizing a cube, end by visualizing a cube, what image information falls outside of the ending cube is out of gamut.  That's it."
Beware - the cube assumption is good for many situations, but not all. Move to CMYK and you get a hyper-cube. Add in ink limits and the hyper-cube gets cutting planes taken out of it. When you map a hyper-cube into 3 dimensions (especially with non-linearity) there is no guarantee that planes remain well ordered - different surfaces can inter-penetrate, so that no simple geometry describes the surface. Real-world devices and inks can behave non-monotonically (i.e. especially the maximum end may fold back), so in general you can't simply assume that the gamut surface corresponds to the maximum device value.
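As a toy illustration of the ink-limit point (the 300% total-area-coverage limit below is an assumed value for illustration, not tied to any particular press or profile):

```python
# With CMYK channels in [0, 1], a total ink limit of, say, 300% slices
# off the corner of the 4-D hyper-cube where c + m + y + k > 3.0:
# those device values exist numerically but are not printable.
TAC_LIMIT = 3.0   # 300%, assumed for illustration

def within_ink_limit(c, m, y, k, limit=TAC_LIMIT):
    """Return True if the CMYK combination respects the total ink limit."""
    return (c + m + y + k) <= limit

print(within_ink_limit(1.0, 1.0, 1.0, 1.0))  # False: 400% coverage
print(within_ink_limit(0.9, 0.8, 0.7, 0.5))  # True: 290% coverage
```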

Jack Hogan

    • Hikes -more than strolls- with my dog
Re: DxO PhotoLab 2 working space tests
« Reply #86 on: January 20, 2019, 04:06:34 am »

Quote from: Jack Hogan
"It appears that many folks over-think the in/out of working gamut issue ... start by visualizing a cube, end by visualizing a cube, what image information falls outside of the ending cube is out of gamut.  That's it."

Quote from: GWGill
"Beware - the cube assumption is good for many situations, but not all. ... in general, you can't simply assume that the gamut surface corresponds to the maximum device value."

Right.  To clarify further: my simple visualization was aimed at the choice of a working color space facing the OP and the typical raw-conversion operator adjusting a capture while viewing it on a monitor: from demosaiced, white-balanced raw data to the file to be shared or output, typically sRGB, Adobe RGB or ProPhoto RGB (I guess we'll have to add Rec. 2020 to that list in the near future).

Jack

Jack Hogan

    • Hikes -more than strolls- with my dog
Re: DxO PhotoLab 2 working space tests
« Reply #87 on: January 20, 2019, 04:11:28 am »

Quote from: 32BT
"Is that a roundabout way of saying your working space is unlimited XYZ?"

More like white-balanced and demosaiced raw 'space', the normalized 'gamut' of which is a cube - though I know some purists would object to some of these words.

32BT

    • Pictures
Re: DxO PhotoLab 2 working space tests
« Reply #88 on: January 20, 2019, 04:39:41 am »

Quote from: Jack Hogan
"More like white-balanced and demosaiced raw 'space', the normalized 'gamut' of which is a cube - though I know some purists would object to some of these words."

Right, so what would be the objectives?

I'd presume:

1. Keep the data in capture space as long as possible
2. Some kind of consistency when editing
3. A useful representation of output limitations

Regarding 1:
That would be useful for editing exposure, white balance and, very importantly, lens corrections.  Exposure and white balance adjustments within capture space allow one to edit while maintaining the (in)consistency of the capture device.  Chromatic aberration corrections are best done before anything introduces cross-channel talk (including demosaicing).

Regarding 2:
When editing images creatively, it is useful if the data is controllable and operates predictably.  This requires a controllable color space, which is to say: perceptual, not too large, and equal for different input devices.  (It may allow negative values and overshoot, which has consequences for user controls.)

Regarding 3:
There are obviously different ways of representing the limitations, usually out-of-gamut colors.  Most out-of-gamut colors are simply oversaturated colors.  An interesting problem is capture colors with one or two negative coordinates: no amount of exposure compensation will bring those colors back into output space.
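That last observation can be sketched numerically: exposure compensation in linear light is a positive scalar multiply, so it can never flip the sign of a negative coordinate (the color value below is made up for illustration):

```python
import numpy as np

# A capture color with one negative output-space coordinate
color = np.array([-0.05, 0.60, 0.20])

# Exposure compensation by +/- stops is multiplication by 2**stops > 0,
# which preserves the sign of every coordinate.
for stops in (-2, -1, 0, 1):
    scaled = color * (2.0 ** stops)
    print(stops, scaled, bool(np.all(scaled >= 0)))   # always False: still outside
```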
Regards,
~ O ~
If you can stomach it: pictures