Andrew, most of those are straw-man arguments. The only one that clearly isn't is #3.
Yes they are, that's why the video is to address myths and misinformation.
With all due respect, you're going down the same useless rat hole that resulted in a 27-page thread by talking about sRGB vs Adobe.
Yes I am, that's kind of the point of the video, a rebuttal if you will; much of it was discussed here, but in one location for those who don't want to read through 27 pages. A video would also allow the use of 3D gamut plots, showing the effect of the Assign Profile command and so forth. IOW, I could write an article or just copy and paste all the good stuff from those 27 pages, but a 10-15 minute video would IMHO be much more effective to present to someone who is looking for the crux of the facts. It isn't intended for this audience.
I'd have to agree with Sandy. A more useful approach is "what happens when color management breaks down" - and the various ways in which that can happen. Explain how and why it's essential to use a profile that actually describes the color space it refers to. "There are no sRGB printers" would fit nicely in here. Using the wrong display profile also.
I was hoping the Assign Profile demo would illustrate this. If you take Adobe RGB (1998) data and Assign sRGB, what you'd see on-screen would appear incorrect despite the numbers not changing and I'd have the Info palette up to show this. This illustrates that the numbers must have the correct scale associated or they "look dull" or "look too saturated". It is a color management failure. If you move forward with that incorrect tag, the output will of course suffer. So the numbers are and have been correct from the beginning. The color management failure is the improper assignment (scale) which affects everything moving forward. Message to viewer: don't do that!
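To make the failure concrete, here is a minimal Python sketch (an illustration only, assuming numpy; the matrices are the commonly published D65 RGB-to-XYZ values, and Adobe RGB's tone curve is approximated as a plain 2.2 gamma). The same 8-bit numbers produce a different XYZ, i.e. a different color, depending on which space is assigned:

```python
import numpy as np

# RGB -> XYZ (D65) matrices, as commonly published for each space
M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                   [0.2126, 0.7152, 0.0722],
                   [0.0193, 0.1192, 0.9505]])
M_ADOBE = np.array([[0.5767, 0.1856, 0.1882],
                    [0.2973, 0.6274, 0.0753],
                    [0.0270, 0.0707, 0.9911]])

def srgb_to_linear(v8):
    """Undo the sRGB transfer curve for 8-bit values."""
    v = np.asarray(v8) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def adobe_to_linear(v8):
    """Adobe RGB (1998), approximated with a plain 2.2 gamma."""
    return (np.asarray(v8) / 255.0) ** 2.2

pixel = np.array([50, 200, 60])  # one fixed set of 8-bit numbers

xyz_if_srgb = M_SRGB @ srgb_to_linear(pixel)     # numbers assigned sRGB
xyz_if_adobe = M_ADOBE @ adobe_to_linear(pixel)  # same numbers assigned Adobe RGB

# The RGB numbers never changed, but the resulting color (XYZ) did
print(xyz_if_srgb, xyz_if_adobe)
```

The numbers in the file never change; only their interpretation does, which is exactly what the Info palette demo would show.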
Discussing the relative merits of different color spaces is indeed the wrong question and beside the point.
Well, there are merits to posting sRGB for web viewing, right? And there are merits to sending Adobe RGB (1998) instead of sRGB to the printer, as Bill illustrated in his flower image. What the message should be, IMHO, is: test both (here's how). Just as Bill did. Take Adobe RGB (1998) and convert to sRGB, maybe the same with ProPhoto too. Take each and properly send them to a printer and examine the results.
An important point to get across, IMO, is that color management isn't nearly as difficult as many try to make it. The problems start when color management breaks down and stops.
Duly noted and I agree. That's why I suggest that assigning sRGB to Adobe RGB data or vice versa is wrong, but not due to the color spaces; rather, the user handled the data incorrectly.
Yes they are, that's why the video is to address myths and misinformation...
Andrew, most of those are straw-man arguments. The only one that clearly isn't is #3.
#4 is interesting, because it is confusing; it confuses me too. As someone noticed, if we perceive more differentiated colors (or variations, nuances, more saturated ones) in, say, Adobe RGB, then in plain English it does have more colors.
I am sure you understand the meaning of "straw-man" argument, i.e., it is you who first creates a "myth," and then gloriously debunks it. Otherwise, nobody claims that "Adobe produces duller colors," nobody claims there are "sRGB printers," etc., these are just your hairsplitting, literal interpretations of contextual shorthands people use.
I'm glad you brought that up because after you wrote the first piece, I did look it up (I've heard it for years but didn't understand your context). One example I saw:
Description of Straw Man
So someone watches a video where a person talks about duller colors and sRGB printers. What can and should be done to dismiss those fallacies and reeducate that viewer? Is this idea of mine pointless? (I'm willing to abandon it, but I think folks need some facts presented to them properly.)
The Straw Man fallacy is committed when a person simply ignores a person's actual position and substitutes a distorted, exaggerated or misrepresented version of that position. This sort of "reasoning" has the following pattern:
Person A has position X.
Person B presents position Y (which is a distorted version of X).
Person B attacks position Y.
Therefore X is false/incorrect/flawed.
This sort of "reasoning" is fallacious because attacking a distorted version of a position simply does not constitute an attack on the position itself. One might as well expect an attack on a poor drawing of a person to hurt the person.
I'd have to agree with Sandy. A more useful approach is "what happens when color management breaks down" - and the various ways in which that can happen. Explain how and why it's essential to use a profile that actually describes the color space it refers to. "There are no sRGB printers" would fit nicely in here. Using the wrong display profile also.
I agree, and think also that this is a bit of a fool's errand. What is the audience for this video? Not those of us on LuLa who have learned all of this. I don't think it's the general public that shoots JPEG and posts to various media sites. Jeff Schewe condenses all of this to a relatively few very well written pages in his books on the digital negative and print. Your book can be gone to for more in-depth coverage (and I have a well-worn copy!). Anyone who does printing should have some good knowledge of things; those who just post to the Internet couldn't care less.
Discussing the relative merits of different color spaces is indeed the wrong question and beside the point.
An important point to get across, IMO, is that color management isn't nearly as difficult as many try to make it. The problems start when color management breaks down and stops.
Regarding number 4, I think a good, accurate analogy might help people understand. I think the main problem is that the question is ill-formed. AdobeRGB and sRGB are just spaces, they don't inherently have any information (other than specifications for primaries, white point, etc). Until you actually have a pixel, there isn't any color information. The analogy I like, which I think is accurate enough, is a thermometer. If you have a thermometer that goes from 0-100º C and another that goes from 0-200º C, which one has more information? It's kind of a dumb question, right? Thermometers don't have information; the information is the actual measurements. If I take two measurements with each, do I have more information if I use the 0-200º thermometer? No, you have exactly two measurements worth of information in each case. Those measurements could be specified by how accurate they are, and this could be expressed as bits. You probably have to draw the wonk-line somewhere before you start talking about Claude Shannon, but I'm not sure where that is. A question you can ask with this analogy is this: how fine of a distinction in temperature can a person make? If you determined that and added a scale in these units to the thermometer, then the 0-200º thermometer would measure more of these units. But that's different than saying it has more information and very different from saying that's all the precision we need in practice.
Absolutely brilliant! Great analogy. I'm going to want to steal it <g>.
... do I have more information if I use the 0-200º thermometer?...
You do if your temperature is 150 degrees.
You do if your temperature is 150 degrees.
Well yes, and you're dead ;D
I agree and think also that this is a bit of a fool's errand.
You might indeed be correct.
What is the audience for this video? Not those of us on LuLa who have learned all of this.
No, not this audience by a long shot.
It would be better with a video explaining color management in such a clear and concise way that there are no more misconceptions. Such a video could have an audience other than this preaching-to-the-choir stuff.
I'm beginning to think you folks are correct; this is a fool's errand. When I ask people for advice, I generally take it (depending on the people of course ;D).
Others have said, loosely quoted "as simple as possible but no simpler" and "increasing the signal".
... If you have a thermometer that goes from 0-100º C and another that goes from 0-200º C, which one has more information?... If I take to two measurements with each, do I have more information if I use the 0-200º thermometer? No, you have exactly two measurements worth of information in each case...
I would strongly suggest we leave Gary out of this thread. If Andrew wants to create a video, it has to have a broader goal than just continue the bickering with one person.
I'll start with this last thread first. I agree. There are so many equally (?) incorrect videos about sRGB vs. Adobe RGB (1998) on YouTube that it isn't necessary to single out one person.
Your analogy is based on a presumption that the reading will be within the healthy human range, i.e., up to 98.6.
Well, let's agree up to 105 or so, whatever agreed-upon temp under which you're not dead. The reason for taking the measurement is to see if you have a fever and, if so, how high. As such, if we agree the max value is 105, the 200-degree meter isn't useful.
The best rebuttal to bad info is, I think, not to get dragged into a tit-for-tat exchange with the disinformationists, tactically fighting problems on a case-by-case basis. I think you'd be better spending your time putting out quality information that engages the viewership it's meant to help, and then circulating that material around. Like immunizing a population against a common virus. :)
I agree.
Andrew, in my humble opinion the issue has never been about an argument pitting sRGB and AdobeRGB against each other.
+1 (and Royce's post)
Instead Gary was committing acts of violence against a rational colour management-based workflow.
Far better to explain the place of the various colourspaces in a rational workflow.
How, depending on one's goals at shooting and the intended output, one can make very rational choices about assigning workspaces in camera (not applicable when shooting RAW - this should be explained too).
Some time should be spent discussing the rationale behind Adobe giving photographers no choice in their working colourspace - it is ProPhotoRGB for everyone - so many of the strange misconceptions that Gary has can be put to bed here.
A good explanation of colourspaces on input, working colourspaces, and output colourspaces integrated into the above discussion.
How it is possible to take an image from one colourspace to another without doing the sort of violence to it that Gary demonstrated i.e. what happens with a colourspace conversion and the place of softproofing.
Thinking about it - I am not sure that any of us could adequately explain any of this in a few minutes, but the challenge is to make rational colour management-based workflows accessible to the very owners of "Digital Rebels" that Gary Fong believes cannot understand principles of colour management.
You may note that I have not included Gary's classification of individuals who are not interested in colour management - they will no more view Gary Fong's videos on colour management than they will Andrew Rodney's.
I was once one of those Digital Rebel owners who knew nothing about colour management but gradually became dimly aware that it may actually have some relevance - the journey started there.
So, I strongly believe that we should not get sucked into Gary Fong's misinformation agenda (the sRGB vs AdobeRGB non-issue) and concentrate on explaining rational and practical colour management-based workflow decision-making.
My 0.02c
Tony Jay
If you look at the triangles, how can you NOT say that the bigger triangle (Adobe RGB) has more colors?
For "printers with a color space larger than sRGB", besides showing gamut plot differences, it would be great if you could somehow show what those differences translate to in an actual print. I know it is like judging a high-end stereo over the telephone but I think giving your audience a hint of what to look for (increased saturation, more detail/less hue shift/less blockiness/posterization in saturated areas) would be helpful.
AdobeRGB and sRGB are just spaces, they don't inherently have any information (other than specifications for primaries, white point, etc). Until you actually have a pixel, there isn't any color information.
I believe that is quite true.
Those measurements could be specified by how accurate they are, and this could be expressed as bits.
We should consider bit depth and encoding. If the two PS documents above are 8 bits per color or 16 bits per color, are there any differences?
If you look at the triangles, how can you NOT say that the bigger triangle (Adobe RGB) has more colors?
And here's the rub. The gamut is larger, no question. The range of colors is greater in one than the other. Does that mean one has more colors? Isn't that an attribute of the encoding of the pixels? Which goes back to gamut volume. Adobe RGB (1998) does have a larger gamut volume than sRGB, but more colors?
Remember your geometry? A big triangle and a small triangle both have an infinite number of points. One does not have more than the other.
The statement also fails from a practical standpoint since the nature of digital imagery caps the points/colors for images in both color spaces to the same maximum. You have a larger range/gamut for one but the same number of potential colors as dictated by the number of bits used.
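One way to see this "potential vs. actual" cap numerically: count the distinct RGB triplets in an image. A short numpy sketch (the image here is random, purely for illustration); the profile you tag the file with has no effect on the count:

```python
import numpy as np

rng = np.random.default_rng(42)
# A hypothetical 100x100-pixel, 8-bit image (random, just for illustration)
img = rng.integers(0, 256, size=(100, 100, 3), dtype=np.uint8)

# Count the distinct RGB triplets actually present
actual_colors = len(np.unique(img.reshape(-1, 3), axis=0))

# The count is capped by the pixel count (10,000 here), not by the
# 16.7 million values 8-bit encoding could represent, and the cap is
# the same whether the file is tagged sRGB or Adobe RGB (1998)
print(actual_colors)
```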
Also, the fact that they might have the same number of potential colors does not imply they have the same number of actual colors.
And that's a critical distinction!
HuH!? I might have a rusty geometry memory, but I never heard of an "infinite number of points." Surely one has a larger area than the other?
Also, the fact that they might have the same number of potential colors does not imply they have the same number of actual colors. Again, I am speaking from a layman's perspective.
Colors are not physical properties, they are perceptions in our brain.
... We are almost literally into "number of angels on the head of a pin" territory.
I just think that the clearest and least-misleading way to talk about it is by sticking with words like gamut, range, and volume and NOT talking about quantities of discrete colors.
Oh, your triangle explanation is definitely in the "number of angels on the head of a pin" territory. ;)
If "sticking with words like gamut, range, and volume" you are DEFINITELY weaseling out of using plain English. You are hiding behind another layer that needs explanation (gamut, range...). All this just to avoid saying simply that Adobe RGB does have more discernible, actual colors, while at the same time having the same number of potential colors as sRGB.
This is basically what Chromix is doing when it uses a Delta-e value to define the area of a "color". Just be aware though that this is placing a human perceptual limitation on what a color is.
But that's pretty important. As already pointed out, color is a perceptual property. So if you can't see it, it's not a color. A coordinate in a "colorspace" outside the spectrum locus is not a color.
Buffalo city or buffalo animal?
Animal.
Also Slobodan, if you're subtly trying to imply with all this dialog that "that guy's video" was in layman's terms territory and we should have given him a break, I'll remind you of the following:
I don't know why you have to, and why Slobodan appears to be deliberately obtuse on the subject (sorry).
... Also Slobodan, if you're subtly trying to imply with all this dialog that "that guy's video" was in layman's terms territory and we should have given him a break...
...All he needed to say was "I recommend you stick with sRGB because AdobeRGB can get converted incorrectly by you, your printing company, or your web audience resulting in a dull image. You can re-consider using a larger color space like AdobeRGB as you learn more about color management and are better able to reduce the risks while better appreciating the benefits." He basically did what he accuses all of us doing by trying to impress people with charts and terminology for no good reason.
... Your reply was to ask if I was referring to a city or an animal...
... Do you really want to understand this and add to the conversation or just always play (and not always well) devil's advocate....
...The triangles are of differing size just like the two states. Assuming that means the larger triangle has more colors is faith-based on your part, unless you can prove otherwise. If you look at the US states, how can you NOT say that the bigger state (New Mexico) has more people than NY? Because it isn't true!
I don't know why you have to and why Slobodan appears to be deliberately obtuse on the subject (sorry).
Slobodan, you stated: If you look at the triangles, how can you NOT say that the bigger triangle (Adobe RGB) has more colors? I gave you an analogy with the size of a state and the incorrect assumption that it therefore has more people, which isn't true. Your reply was to ask if I was referring to a city or an animal. Do you really want to understand this and add to the conversation, or just always play (and not always well) devil's advocate? The triangles are of differing size just like the two states. Assuming that means the larger triangle has more colors is faith-based on your part, unless you can prove otherwise.
If you look at the US states, how can you NOT say that the bigger state (New Mexico) has more people than NY? Because it isn't true!
Because you wrote Buffalo NY with a capital B? And there is such a city and it is in the state of NY. I was just clarifying for the purpose of analogy. No games here.
OK. Sorry for the confusion.
Ok... let's stick to this analogy. I do get it that two states might have different area but same (and definitely different) population. But if they have the same population, doesn't it mean that population density is different, i.e., there is more space between humans in the larger state?
I don't think so, not necessarily. Take NM. There is just a tad over 2 million in that very large sized state. The vast majority of the population reside in two cities (Albuquerque and Las Cruces) and there are huge areas with no population. Not sure what that has to do with color gamut. The space between colors in an image in a color space would again be an attribute of the encoding.
Andrew,
Are you saying that the colour points aren't uniformly distributed throughout the gamut volume? I can understand that the population of a state may not be uniformly distributed, but surely points within a gamut are.
Martin
Are you saying that the colour points aren't uniformly distributed throughout the gamut volume?
No, not at all. At least I don't believe that is the case. I'm saying the size of the triangle doesn't have anything to do with the number of colors. Slobodan and others see a larger gamut and assume a larger number of colors.
Gamut maps are not necessarily perceptually uniform, which is equivalent to saying that identifiable colors are not uniformly distributed in a gamut. In any case, any plot, 2D or 3D, of sRGB and Adobe RGB will show that sRGB is a subset of Adobe RGB, so I don't even understand how this can be a discussion.
That's what's confusing me. If we have an sRGB volume and we look at the 16.7 million colour points in it, I think we all agree we can only perceive a subset of those because some of them are just too close to see any difference. If we look at an AdobeRGB volume and look at the 16.7 million possible colour points in it, it would seem logical to me that since they are spaced further apart because they need to cover a larger volume, that we should be able to perceive a greater number of them. Is this not the case?
Martin
Adobe RGB (1998) has more colors than sRGB.
This seems to be more about introducing the concept of a perceptually uniform colorspace, and the distinction between such a space and non-perceptually uniform spaces. Once perceptually uniform space is understood, then a gamut volume measured in such a space makes sense.
Comments?
If we have an sRGB volume and we look at the 16.7 million colour points in it, I think we all agree we can only perceive a subset of those because some of them are just too close to see any difference. If we look at an AdobeRGB volume and look at the 16.7 million possible colour points in it, it would seem logical to me that since they are spaced further apart because they need to cover a larger volume, that we should be able to perceive a greater number of them. Is this not the case?
That seems reasonable but...
It is the case; you will be able to perceive a greater number of them. My point is that when you look at a graph or a plot of the color spaces, you cannot conclude from the relative difference in size the amount of additional colors in the larger space. E.g., a 2D plot shows one gamut being 20% larger than the smaller gamut; that does not necessarily mean 20% more identifiable colors, though certainly some more.
AdobeRGB and sRGB are just spaces, they don't inherently have any information (other than specifications for primaries, white point, etc). Until you actually have a pixel, there isn't any information.
This seems to be more about introducing the concept of a perceptually uniform colorspace, and the distinction between such a space and non-perceptually uniform spaces. Once perceptually uniform space is understood, then a gamut volume measured in such a space makes sense.
That sounds very reasonable. There is some debate if Lab is truly a perceptually uniform colorspace.
That's what's confusing me. If we have an sRGB volume and we look at the 16.7 million colour points in it, I think we all agree we can only perceive a subset of those because some of them are just too close to see any difference. If we look at an AdobeRGB volume and look at the 16.7 million possible colour points in it, it would seem logical to me that since they are spaced further apart because they need to cover a larger volume, that we should be able to perceive a greater number of them. Is this not the case?
Martin
something from a digital capture?
That sounds very reasonable. There is some debate if Lab is truly a perceptually uniform colorspace.
There's no debate - it's not a perfectly perceptually uniform space, and its many flaws have been well documented :-)
There's no debate - it's not a perfectly perceptually uniform space, and its many flaws have been well documented :-)
But it's much more perceptually uniform than XYZ or most device spaces, is widely accepted and understood, and there is no accepted replacement for it.
There is some debate if Lab is truly a perceptually uniform colorspace.
A "bigger" space does not need to encompass all of a "smaller" space
All possible colors in sRGB can be described in Adobe RGB
Some colors in Adobe RGB are out of sRGB
Result: More colors can be described in Adobe RGB
This isn't precisely true. sRGB has greater precision than AdobeRGB, so there will be sRGB values that cannot be captured by AdobeRGB to the same precision. Or to put it another way, there will be pairs (or more) of sRGB numbers that are described by a single AdobeRGB value.
On reflection, the thermometer analogy has a serious flaw. Alan Goldhammer hit on it. Here's the problem.
You might say that if I take one measurement from each thermometer I have the same amount of information from each, but that may not be true - all measurements are not the same. The amount of information a measurement has depends on the precision and scale of the system (I think this is a simplification of what information theory calls the entropy of the system). Say both thermometers go from 0-100. If one thermometer measures to a hundredth of a degree I will get a measurement like 90.75, but another might only measure to a tenth of a degree so I'll get a measurement of 90.8. The first measurement contains more information. To express it on a computer would require me to distinguish all values between 0 and 100 to hundredth of a degree precision - significantly more values than the 0-100 at a tenth degree precision.
Now if we go back to the original thermometers, 0-100 and 0-200, and imagine both have a precision of 1 degree. Measurements from the 0-200 will have more information: they will require 1 more bit to store (i.e. you need to distinguish between 201 values as opposed to 101). The situation changes, however, if the scale changes. Rather than 1 degree precision, let's say the thermometers are each marked off from 0-255 (and readings between the lines are meaningless). Now a measurement from each contains precisely the same amount of information, exactly 8 bits. The 0-100 thermometer is twice as precise, but the 0-200 thermometer has twice the range.
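The bookkeeping above can be checked with a couple of log2 calculations (plain Python, just restating the paragraph's arithmetic):

```python
import math

# 1-degree precision: 101 vs 201 distinguishable readings
bits_0_100 = math.log2(101)   # about 6.66 bits per measurement
bits_0_200 = math.log2(201)   # about 7.65 bits: roughly one bit more

# Same thermometers re-marked 0-255: both scales give exactly 8 bits;
# the 0-100 thermometer is simply twice as precise per tick
bits_fixed_scale = math.log2(256)
print(bits_0_100, bits_0_200, bits_fixed_scale)
```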
That might be true if you are limiting to 8 bit Jpegs, but not if you use 16 bit Tiffs (or 15 bit as Photoshop really works).
The abstraction of color spaces per se does not limit the amount of digits you can use
Bruce Lindbloom has proposed a "Uniform Perceptual Lab" profile, you can read about it here (http://www.brucelindbloom.com/UPLab.html)
There have been many efforts to produce more perceptually uniform colorspaces. Many of them are of a similar form to a cLUT profile - i.e. numerically optimizing against CIEDE2000, MacAdam ellipses, CIECAM02 space, etc. - but this form is not very useful for general use; a simple set of equations is what's needed. Something like IPT space or DIN99 comes close to that (though neither is as simple as L*a*b*), but so far nothing seems to have hit the "sweet spot" of being noticeably better than L*a*b*, simple enough to be widely useful, and widely or officially accepted.
It doesn't matter if you are using 16 bits or 10 billion bits: sRGB will always be more precise for a given bit depth because those bits will be spread across a smaller area than in AdobeRGB. Maybe I'm misunderstanding you on this…
sRGB has greater precision than AdobeRGB so there will be sRGB values that cannot be captured by AdobeRGB to the same precision.
With the same encoding, because the colorimetric distance is farther apart in the wider gamut space? (the pixels are farther apart ;D).
You might say that if I take one measurement from each thermometer I have the same amount of information from each, but that may not be true - all measurements are not the same. The amount of information a measurement has depends on the precision and scale of the system (I think this is a simplification of what information theory calls the entropy of the system). Say both thermometers go from 0-100. If one thermometer measures to a hundredth of a degree I will get a measurement like 90.75, but another might only measure to a tenth of a degree so I'll get a measurement of 90.8. The first measurement contains more information. To express it on a computer would require me to distinguish all values between 0 and 100 to hundredth of a degree precision - significantly more values than the 0-100 at a tenth degree precision.
Analogous to the encoding, 24-bit/48-bit finer possible ways to divide up the data?
This second scenario is exactly the situation we find ourselves in with colorimetric values. Because the amount of information contained in a pixel is given by definition of the encoding, the space is not relevant. You can have a space 50 billion times larger than proPhoto, but you still have 24 bits of information by definition.
Makes sense.
But that's pretty important. As already pointed out, color, is a perceptual property. So if you can't see it it's not a color. A coordinate in a "colorspace" outside the spectrum locus is not a
color. Color is not a particular wavelength of light. It is a cognitive perception, the excitation of photoreceptors followed by retinal processing and ending in our visual cortex, within our brains. As such, colors are defined based on perceptual experiments. And from that, we get deltaE.
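For reference, the simplest delta-E (the original CIE76 formula) is just Euclidean distance in L*a*b*; a value around 1 is roughly the smallest difference a person can see. A minimal sketch:

```python
import math

def delta_e76(lab1, lab2):
    """CIE76 delta-E: Euclidean distance between two L*a*b* colors."""
    return math.dist(lab1, lab2)

# Two grays one L* step apart: delta-E of 1.0, roughly the visibility
# threshold (later formulas such as CIEDE2000 refine this considerably)
print(delta_e76((50.0, 0.0, 0.0), (51.0, 0.0, 0.0)))
```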
... I guess you could state that "AdobeRGB can contain a greater number of colors than sRGB".... Notice though that I am still not saying that AdobeRGB "has more colors"...
Hmmm... "a greater number" does NOT equal "more"?
Substitute "more" for "a greater number". No problem. The important difference for me is between "has" and "can contain".
The important difference for me is between "has" and "can contain".
Indeed. So we go back to the image itself, the encoding, other possible factors. Meaning it's probably not a good idea to say "Adobe RGB has more colors than sRGB" any more than "New Mexico has a larger population than New York".
..."New Mexico has a larger population than New York".
Ah, OK. So, when does "can contain" transition to "has"? When the subject has colors outside of sRGB?
Soon, soon...
Maybe, but my point is, today one could say the population of New Mexico is larger than New York if one doesn't specify the population of specifically what. It is true for buffalo and untrue if we assume humans. It's probably not a good idea to say "Adobe RGB has more colors than sRGB" without laying out a lot of caveats first.
Perhaps a more correct way to say it is that an image in the AdobeRGB color space can contain more colors than one in the sRGB color space.
I like that!
The color space does not "have" the colors, the image does.
The "can" then changes to "has" when you have identified a particular image and determined if it does or does not.
Perhaps a more correct way to say it is that an image in the AdobeRGB color space can contain more colors than one in the sRGB color space...
Since we always assume, for all practical purposes, that when we say "Adobe RGB" we mean "an image in the AdobeRGB color space," (i.e., it is a shorthand), thus we can then say that Adobe RGB has more colors than sRGB.
Not necessarily. Again, the image content and the encoding need to be considered. Eyeball summed it up best thus far and a key part of that was: The color space does not "have" the colors, the image does.
However, under no circumstances or caveats can we say that sRGB can have more colors than Adobe RGB. Nor can we say that sRGB can have the same amount of colors as Adobe RGB.
However, under no circumstances or caveats can we say that sRGB can have more colors than Adobe RGB. Nor can we say that sRGB can have the same amount of colors as Adobe RGB.
With proper explanation, without assumptions, yes, sRGB could easily have more colors than Adobe RGB (1998). Bill's photo of flowers in sRGB vs. a photo of a gray card in Adobe RGB comes to mind.
With proper explanation, without assumptions, yes, sRGB could easily have more colors than Adobe RGB (1998). Bill's photo of flowers in sRGB vs. a photo of a gray card in Adobe RGB comes to mind...
Oh, come on! You have to assume the same image.
You didn't say that. I didn't assume it. Again, you need to be clear and define what you're talking about; people or buffalo.
You didn't say that. I didn't assume it. Again, you need to be clear and define what you're talking about; people or buffalo.
You really don't get that using a simplistic statement "Adobe RGB has more colors than sRGB" is as dumb as saying NM has a larger population than NY? After all these posts?
Do you know the old saying about assumptions?
Can we agree on: "The AdobeRGB98 colour space has the capability of representing more human perceivable colours than the sRGB colour space. A given image may, or may not, take advantage of this potential depending on subject matter."
I'm OK with that thus far.... ;D
Andrew, it goes without saying it must be the same image for comparison purposes. You know, apples with apples, oranges with oranges.
IF the issue is a comparison with images, sure. But that's not the statements we see or hear from others. They show a gamut plot of two color spaces. They jump to the same conclusions you did about the size of the two triangles and say "Adobe RGB has more colors than sRGB". This thread, which has been very useful, illustrates that without an image, which has a gamut, the statement alone doesn't make sense.
Ah, the lengths you guys are going to go just to avoid admitting that, in plain English, and for all practical purposes, "Adobe RGB has more colors than sRGB."I disagree. This is like Gary saying "just tell me what's going to be better, Adobe RGB or sRGB" based on his flawed testing. And no, not for all practical purposes, only in specific situations with the rules set so no goal posts get moved. Without assumptions.
And here's another question: what about encoding?
We have an image (Bill's flowers) in 16-bit Adobe RGB which we convert to sRGB. We then convert his Adobe RGB image to 8-bits. Could the sRGB image have more colors?
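The mechanics behind that question can be sketched with a toy example (not Bill's actual file): quantizing 16-bit data down to 8 bits collapses many distinct triplets into one, which is where unique colors get lost regardless of color space.

```python
import numpy as np

# Toy stand-in for a 16-bit image: a neutral ramp hitting every 16-bit level.
ramp16 = np.arange(65536, dtype=np.uint16).reshape(-1, 1).repeat(3, axis=1)

# Quantize to 8 bits by dropping the low byte (a simple conversion model).
ramp8 = (ramp16 >> 8).astype(np.uint8)

def unique_colors(pixels):
    """Count distinct RGB triplets."""
    return len(np.unique(pixels, axis=0))

print(unique_colors(ramp16))  # 65536
print(unique_colors(ramp8))   # 256
```

A real conversion also involves gamut mapping, so the counts for an actual image would differ, but the bit-depth collapse shown here is the dominant effect for smooth gradients.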
I would really like to see a real example where, thanks to the "increased precision" of sRGB, it is possible to differentiate two colors that would otherwise appear as one color in AdobeRGB due to the "lower precision" of using the same number of digits in both spaces.
Volume of AdobeRGB in DeltaE^3 = 1,208,631 (Bruce Lindbloom)
Possible numbers in 8 bits x 3 channels = (256*256*256) = 16,777,216
Even in Adobe RGB, with 8 bits you are already one order of magnitude above the number of possibly identifiable colors (if I understand correctly that a person will not differentiate between colors less than 1 DeltaE apart), and this is assuming no noise (which does not happen in the real world).
So my answer would be no, the sRGB image cannot have more colors. More numbers? Yes, but they do not lead to different identifiable colors.
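The arithmetic above can be checked in a couple of lines, using 256 levels per 8-bit channel and Lindbloom's published volume figure:

```python
# Lindbloom's estimate of the Adobe RGB volume in cubic Delta E units.
adobe_rgb_volume = 1_208_631

# Number of addressable triplets at 8 bits per channel (256 levels each).
triplets_8bit = 256 ** 3

print(triplets_8bit)                                # 16777216
print(round(triplets_8bit / adobe_rgb_volume, 1))   # ~13.9x more numbers than volume
```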
Sure, it's not hard - for every AdobeRGB color value that falls outside the sRGB gamut, there is one more sRGB value packed inside the sRGB gamut than there are AdobeRGB values there. Here are 2 sRGB colors that map to one AdobeRGB color in Photoshop CC on the Mac: [2, 255, 240] & [1, 255, 240] - they both map to AdobeRGB [144, 255, 240].
I've attached the file I used (it's only 2 pixels so you'll need to zoom in).
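Mark's pair can be reproduced approximately with the textbook sRGB and Adobe RGB (1998) matrices. Photoshop's CMM works from higher-precision profile data, so exact agreement isn't guaranteed, but with these published values both triplets land on the same 8-bit Adobe RGB value:

```python
import numpy as np

# Published D65 matrices for sRGB and Adobe RGB (1998).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_ARGB = np.array([[ 2.04159, -0.56501, -0.34473],
                        [-0.96924,  1.87597,  0.04156],
                        [ 0.01344, -0.11836,  1.01517]])

def srgb_decode(v8):
    """8-bit sRGB triplet -> linear light."""
    c = np.asarray(v8, dtype=float) / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def to_adobe_rgb_8bit(srgb8):
    """Convert an 8-bit sRGB triplet to an 8-bit Adobe RGB (1998) triplet."""
    xyz = SRGB_TO_XYZ @ srgb_decode(srgb8)
    lin = np.clip(XYZ_TO_ARGB @ xyz, 0.0, 1.0)
    encoded = lin ** (1.0 / 2.19921875)   # Adobe RGB tone curve
    return tuple(int(round(255.0 * x)) for x in encoded)

print(to_adobe_rgb_8bit([2, 255, 240]))   # (144, 255, 240)
print(to_adobe_rgb_8bit([1, 255, 240]))   # (144, 255, 240)
```

The collision happens because the two source values differ only in a near-black red channel, where sRGB's finer spacing inside its smaller gamut exceeds what an 8-bit Adobe RGB encoding can resolve.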
While this might be true about color spaces, in the case of this discussion, Adobe RGB does encompass all of sRGB.
I agree that this can become too abstract and that we can easily get carried away in technicalities.
A possible source for learning is the set of tutorials at Cambridge in Colour here (http://www.cambridgeincolour.com/color-management-printing.htm). In this (http://www.cambridgeincolour.com/tutorials/sRGB-AdobeRGB1998.htm) comparison between Adobe RGB and sRGB, I find the color gamut graphs at the bottom of the page useful, showing the difference when you use CIE xy vs. CIE u'v', the latter giving a better approximation to what we really perceive.
Mark, I understand your point and I agree that those different triplets in sRGB map to a single triplet in Adobe RGB. On my monitor, which is calibrated but not a sophisticated one, just a regular one, I cannot see any difference between them in sRGB. I computed the DeltaE between the two colors and the result I get is 0.0211 (this might not be correct, I'd appreciate it if anybody could confirm), which by definition implies that the two colors cannot be differentiated.
If the issue were precision, we could just move to 16-bit TIFF, or if that were not enough, some high-end application would handle 32 or more bits per channel, but no real need for it has been demonstrated.
... To me, sorry to say, when you say that Adobe RGB has more colors than sRGB, it sounds a bit like that fellow who started this whole mess in the first place :D :o
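The 0.0211 figure quoted above can be sanity-checked with a quick CIE76 calculation (D65 white, textbook sRGB matrix). The exact number depends on the Delta E formula and profile precision used, but it comes out in the same ballpark, around 0.02:

```python
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
D65_WHITE = np.array([0.95047, 1.0, 1.08883])

def srgb_to_lab(v8):
    """8-bit sRGB triplet -> CIELAB, relative to D65."""
    c = np.asarray(v8, dtype=float) / 255.0
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    t = (SRGB_TO_XYZ @ lin) / D65_WHITE
    f = np.where(t > (6/29) ** 3, np.cbrt(t), t / (3 * (6/29) ** 2) + 4/29)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def delta_e76(c1, c2):
    """CIE76 Delta E between two 8-bit sRGB triplets."""
    return float(np.linalg.norm(srgb_to_lab(c1) - srgb_to_lab(c2)))

print(round(delta_e76([2, 255, 240], [1, 255, 240]), 3))
```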
...it has the exact same number of RGB combinations....
1. Is it possible that within one of the images we now observe at least two patches that can no longer be distinguished? If so, is this more likely to occur in the sRGB image or the aRGB image?
2. When comparing individual patches between the two images, will we see differences in some (but not all) patches?
3. If, between the two images, some patches are distinguishably different, is this confined to the patches which were outside the sRGB gamut?
...It's a container...
I really don't like the idea of saying that aRGB has more colors than sRGB. Because in fact, it has the exact same number of RGB combinations. And each RGB combination is a unique color value, mathematically speaking.
I think the confusion in this thread illustrates why this whole subject can be impenetrable to mere mortals.
And, just to add to it, I really don't like the idea of saying that aRGB has more colors than sRGB. Because in fact, it has the exact same number of RGB combinations.
The problem is right at the beginning. Don't talk about "number of colors" if that is not the concept you want to convey. Say "wider gamut", the correct technical term, and illustrate what that means in being able to reach more vivid colors with AdobeRGB.
Saying "wider gamut" just shifts the burden to then defining "gamut." Besides, "more vivid colors" simply means more colors.
Ok, that sounds promising. If so, what makes them different?The primary difference between sRGB and aRGB, in my view, is the level of color saturation that can be encoded. The actual encoding in any given color space is limited by the number of bits per primary (in this case R, G, B) used. So it comes down to the color, actually the color objective, you have in an image and whether it can be encoded in a color space without falling outside of the color space boundaries, i.e. its gamut.
Please, FORGET about the numbers. Do you want more? What if I develop software that handles 1 billion bits per channel? Will you have more colors because you have more mathematically different color values?
The numbers are just tools. This is signal processing; just use as many digits as you need. There can be more identifiable colors in Adobe RGB than in sRGB.
I understand your argument, but you confuse the concept. And to be fair, in 8 bit workflows, gamut density does have a real effect. Our novices are working in 8 bit jpg. It doesn't matter what software you can develop. Why make this more abstract?
fellow color geeks...
I think the confusion in this thread illustrates why this whole subject can be impenetrable to mere mortals.
And, just to add to it, I really don't like the idea of saying that aRGB has more colors than sRGB. Because in fact, it has the exact same number of RGB combinations. And each RGB combination is a unique color value, mathematically speaking. When you start saying that some values are so close, as to be the same color, it really confuses the whole concept by bringing perception into muddy the idea.
I would suggest maybe you start with something like this: Assume two color spaces, space "A" and space "B" where each color space consists of 4 colors. Color space "A" has its most saturated colors a little bit less saturated than the most saturated colors in color space "B"...
But both spaces contain ONLY 4 colors. This is something everyone can see without complex 3d graphs. Extrapolate from there to the spaces with millions of colors.
To me, sorry to say, when you say that Adobe RGB has more colors than sRGB, it sounds a bit like that fellow who started this whole mess in the first place :D :o
I didn't even worry about color space until the recent Fong blowup and after perusing various websites, I think I have come to the conclusion that Gary Fong got the right answer using the wrong math. I think these two articles explain it better, at least for me:You do realise that RAW files cannot have a colourspace assigned to them in camera like a JPEG.
http://help.smugmug.com/customer/portal/articles/93362
http://www.cambridgeincolour.com/tutorials/sRGB-AdobeRGB1998.htm
For workflow purposes, why put yourself through extra steps by shooting aRGB? You wind up having to convert it to sRGB a lot of times for sharing online/on tablets/etc. and even for some printing services apparently. So it seems like for most (not all) people, it'd be more time-efficient to shoot JPG + RAW and have the JPG on sRGB. That way you can share online/on an iPad/etc. without any more steps, and for the few photos that need printing, you can just go back to the RAW and tinker with it in aRGB or ProPhoto or whatever your destination printer uses.
This won't work for some people but I think it would work for most people. Y'all can argue about color management in an ideal world, but we don't live in such an ideal world yet, so shooting sRGB JPG + RAW seems like the most time-efficient way to deal with current realities, yes?
If I can make a suggestion - the best success that I've had in talking to people for whom "gamut" wouldn't be appropriate is to talk about a "range of colors". So e.g., Adobe RGB can represent a greater range of colors than sRGB.
Talking about "number of colors" generally results in total confusion, because it confuses number of bits in a particular file format with color space.
So actually, in an 8-bit JPEG file, sRGB and Adobe RGB can represent exactly the same number of colors (depending on how you count colors, which is another topic on its own; say 16.7 million odd). However:
- The sRGB version of the file can show a smaller range of colors, but with greater granularity (better color resolution)
- The Adobe RGB version of the file can show a greater range of colors, but with lower granularity
In practice the ability of the human eye to perceive subtle differences in color isn't too great, so the likelihood is that you won't actually see any difference from the granularity.
Sandy
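Sandy's granularity point can be illustrated numerically. Assuming the textbook primaries and tone curves for both spaces (a sketch, not a profile-accurate calculation), one 8-bit code value step just below pure green spans a somewhat larger Lab distance in Adobe RGB than in sRGB; the wider gamut is sampled more coarsely:

```python
import numpy as np

D65 = np.array([0.95047, 1.0, 1.08883])
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ARGB_TO_XYZ = np.array([[0.57667, 0.18556, 0.18823],
                        [0.29734, 0.62736, 0.07529],
                        [0.02703, 0.07069, 0.99134]])

def xyz_to_lab(xyz):
    t = xyz / D65
    f = np.where(t > (6/29) ** 3, np.cbrt(t), t / (3 * (6/29) ** 2) + 4/29)
    return np.array([116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])])

def srgb_lab(v8):
    c = np.asarray(v8, float) / 255.0
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    return xyz_to_lab(SRGB_TO_XYZ @ lin)

def argb_lab(v8):
    lin = (np.asarray(v8, float) / 255.0) ** 2.19921875  # Adobe RGB tone curve
    return xyz_to_lab(ARGB_TO_XYZ @ lin)

# Lab distance covered by one code value step just below maximum green.
step_srgb = np.linalg.norm(srgb_lab([0, 255, 0]) - srgb_lab([0, 254, 0]))
step_argb = np.linalg.norm(argb_lab([0, 255, 0]) - argb_lab([0, 254, 0]))
print(round(step_srgb, 2), round(step_argb, 2))
```

Both steps come out well below the roughly 1 Delta E just-noticeable-difference threshold, which supports Sandy's point that the granularity penalty is unlikely to be visible in practice.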
If we define colour as unique RGB value
A unique RGB value is not a colour, so it would be a wrong and misleading definition.
I was talking about unique RGB values within Bill's flower 16 bit image in the sRGB space. Are you saying those aren't colours?
Martin
I also think after reading through this thread that Andrew is going to have his work cut out for him.Coming to many new posts after my nap ;D. I do want to address a few items in the order I've read them (up to this post).
If I can make a suggestion - the best success that I've had in talking to people for whom "gamut" wouldn't be appropriate is to talk about a "range of colors". So e.g., Adobe RGB can represent a greater range of colors than sRGB.That's exactly the language I use Sandy. Range not number. In fact I was asked to do a 15-minute lecture to business folks who deal with selling a product that produces color (can't go into details). They were being told that product A produces billions of colors (due to bit depth) while product B only millions of colors. The competition was also showing gamut maps in 2D. I used exactly the same language by saying gamut and number of colors specified are completely different. Gamut is range of color, not number of colors. Bit depth can specify number of colors but not their saturation within the gamut maps (maybe I should have said Chroma).
Talking about "number of colors" generally results in total confusion, because it confuses number of bits in a particular file format with color space.It is similar in some respect to the confusion between dynamic range and bit depth going back to the old and I believe useful staircase analogy (height of staircase is a very different attribute than the number of steps).
Saying "wider gamut" just shifts the burden to defining then "gamut." Besides, "more vivid colors" simply means more colors.No it doesn't. We're back to the human vs. buffalo analogy. How does more vivid equate to larger number?
The reason that the sRGB image has fewer unique colors is that with relative colorimetric rendering, some unique colors in the ProPhoto image that are out of sRGB gamut are clipped to the same value, reducing the number of colors. Had I used perceptual rendering that did not clip values, the number of color values would be same in both images. However, current perceptual rendering does not look at the gamut of the colors actually in the image and proceed accordingly with compression. It merely compresses the values by an arbitrary amount, whether or not they are out of gamut. If the compression is insufficient, clipping can and often does occur. Whether or not all of these values are perceptibly different is another matter.Thanks Bill, I'm seeing a similar report too. I just did another test using an image that doesn't have much saturation (one of my dogs at White Sands).
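The collapse Bill describes can be shown schematically: relative colorimetric rendering clips out-of-gamut values, mapping several distinct source values onto the same destination value. The numbers below are made up purely for illustration:

```python
import numpy as np

# Hypothetical linear channel values after converting from a wider space;
# anything above 1.0 is outside the destination gamut.
values = np.array([0.80, 1.02, 1.10, 1.25])
clipped = np.clip(values, 0.0, 1.0)

print(len(np.unique(values)))   # 4 distinct source values
print(len(np.unique(clipped)))  # 2 remain after clipping
```

A perceptual rendering that compressed rather than clipped would keep the values distinct, at the cost of shifting in-gamut colors too, which matches Bill's description of how current perceptual tables behave.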
...RAW files not optimised by ETTR will show excessive noise in the shadows...
An important point to get across, IMO, is that color management isn't nearly as difficult as many try to make it. The problems start when color management breaks down and stops.
I think trying to talk about the “number of colours” in a given colour space is a complete red herring and will confuse the target audience of Andrew’s proposed video.
Trying to keep it simple…
As I understand it (and I am ready to be corrected on this) a colour space defines a gamut or ‘range’ of colours. It is, I believe, misleading to talk about a gamut or range as containing a given “number” of colours. To put it very simply the gamut or range defines the boundaries of what can be represented.
Still keeping it simple my understanding is that bit depth defines the number of differences between colours that can be represented and would, therefore, be colour space agnostic.
I think if I were asked by a neophyte photographer “Does sRGB have more colours than Adobe RGB ?”, my answer would be “No, but Adobe RGB can contain a wider range or gamut of colours than sRGB. It is not about the ‘number’ of colours but about the range that can be represented [here a picture of the two colour spaces would depict this very well and aid understanding].”. I would go on to emphasise that in terms of colour spaces it is not the ‘number’ of colours than can be represented but the range or gamut that is important. I think this is all someone who asks this question would need to know. If they want to know more there are plenty of excellent books out there – authors Andrew Rodney and Jeff Schewe to name but two.
If it were me I would forget the thermometer analogy and stick to a picture of the two colour spaces (2D or 3D – doesn’t matter). These show exactly what we are talking about (and no one need die from overheating !). ;D
but today we have more new OS and printer driver updates routinely busting some part of the color managed chain, often doing so in an effort to "simplify it" for the average consumer.As far as I'm aware this only affects Mac users. The Windows CM system hasn't really changed in the last 15 years. Almost all colour management issues in Windows software have come from trying to deal with Mac changes (eg losing the NCM print option in Photoshop).
The Windows CM system hasn't really changed in the last 15 years.
I didn't even worry about color space until the recent Fong blowup and after perusing various websites, I think I have come to the conclusion that Gary Fong got the right answer using the wrong math.He didn't even get to math, that would have been even more hilarious.
For workflow purposes, why put yourself through extra steps by shooting aRGB?For superior output? Bill illustrated that in the locked post, and Gary admitted it too.
You wind up having to convert it to sRGB a lot of times for sharing online/on tablets/etc. and even for some printing services apparently.Yes you do, if your goal is to optimize the data for the output. If you don't care (or don't know this) then yes, stick to sRGB.
So it seems like for most (not all) people, it'd be more time-efficient to shoot JPG + RAW and have the JPG on sRGB.Yes and no (maybe). If your goal is quick and dirty (OK, quick and clean), just skip raw. If your goal is to capture all the data, not have some in-camera processor decide what is 'correct color', you want raw. And rendering the image for output is a fundamental part of photography if you have an interest in that process (see: http://www.lumita.com/site_media/work/whitepapers/files/pscs3_rendering_image.pdf). Again, if your goal is to produce 'snap shots', you don't care about how the image is rendered, stick with sRGB and move on.
This won't work for some people but I think it would work for most people.It is dangerous to make assumptions and speak for anyone but yourself in this regard.
Y'all can argue about color management in an ideal world, but we don't live in such an ideal world yet, so shooting sRGB JPG + RAW seems like the most time-efficient way to deal with current realities, yes?If that workflow makes sense to you, works for you, then the answer for you is clear. And I don't see too many arguments about color management, at least in the locked thread. There were agreements about technically correct teachings of a technical subject with the person who was writing technically incorrect statements ;D. I don't recall anyone suggesting not implementing color management.
Y'all can argue about color management in an ideal world, but we don't live in such an ideal world yet, so shooting sRGB JPG + RAW seems like the most time-efficient way to deal with current realities, yes?That can work, yes.
Not quite true! Microsoft introduced a new colour management system called "Windows Color System" with Vista in around 2008.But it didn't significantly change or break anything!
Having been practicing fully color managed workflows for nearly twenty years now, I respectfully disagree that color managed workflows are easier today.I think they are overall (look at Lightroom as an example) but I agree, there are lots of areas that stuff can break and has. Especially for printing which isn't getting the love from Apple and other's it used to.
This of course opens up whether we should be capturing and editing in higher bit depths. I would definitely argue that we need to, even if we can't see any difference, because it gives us more latitude to manipulate and adjust the image before it starts to posterize.No question about that! Especially given the high bit data we're starting with from our capture devices. Unlike the 'what is better, Adobe RGB or sRGB?' debate some whose name we will not mention used to attract attention to themselves, I've not yet seen the same people argue high bit capture and editing is a poor idea. But let's not give them any ammo in that respect. Those same people will tell their audience there's no reason to capture raw data!
If you have any aspirations to improve your photos and do any post processing on them, eg cropping, adjusting colour balance/exposure/generally getting the best from your camera, you're going to have to use some sort of image editor. Once you choose to do that you might as well just use RAW in the first place, then pass everything though a package like Lightroom/Aperture/Capture One and let that handle resizing and colourspace issues for online use ..
You can go bigger gamut to smaller but it is pointless to go the other way. Think of it as starting with a gallon container holding water. You can pour that into a quart container. But pouring a quart of water into an empty gallon container doesn't give you a gallon of water. So I would render from raw to the highest resolution your camera can produce, widest gamut (ProPhoto RGB), 16-bit, do all the work on that as your master image archive. Then you can size down the resolution and gamut for output to other needs like posting to the internet, slide shows etc.
- Always shoot RAW
- Get a wide-gamut monitor and a calibration system and use the latter at least 1x/month
- In software, always work in ProPhoto
- Make sure you have the right profile for your printer and paper
- Apply the profile in software or at the printer, never both
- Softproofing is your friend
Color management is deeply rooted in CIEXYZ or CIELAB as the underlying backbone of color translation. I have no problem with that at all. However, image reproduction quality when comparing original source color to output color is also overwhelmingly expressed in terms of various flavors of delta E, and delta E is of course a very useful metric for many types of color discrimination tasks like matching paint or textile colors. However, both delta E and total color gamut estimations fall flat on their faces when it comes to properly characterizing tone reproduction fidelity.
Bravo, Mark!
I read An Introduction to the I* Metric. Revelatory. Do you have Matlab code available?
Jim
Andrew,
Can't think of a more succinct explanation or better advice for a 'newb' than the above three posts. If you want to succeed in getting your message across, keep it as simple and direct as possible - don't get drawn into the minutiae (if you're addressing the wider public).
No, I don't have MatLab code for the I* metric, but it can be relatively easily programmed in Excel.
Great post Mark. In terms of dE, and in the context of this discussion, is it useful in terms of evaluating what is and thus isn't visible when trying to decide if one color space with a fixed image does or doesn't contain more colors than the other? Or is it simply fruitless to even go there? Do you suppose ColorThink is using this metric in any way to produce the extraction of unique colors in the reports Bill and I produced?
Mark,
Can I just use the formulae in this paper: http://wilhelm-research.com/ist/WIR_IST_2004_11_MMG_HW_DS.pdf
or is there something newer?
Thanks,
Jim
In terms of dE, and in the context of this discussion, is it useful in terms of evaluating what is and thus isn't visible when trying to decide if one color space with a fixed image does or doesn't contain more colors than the other? Or is it simply fruitless to even go there? Do you suppose ColorThink is using this metric in any way to produce the extraction of unique colors in the reports Bill and I produced?
Try taking your all white image, for example, that would generate only one unique color even if the image had, say, a total count of 100 all white (RGB =255,255,255) pixels in it. Next, take just one of those white pixels and change its RGB triplet value by a large enough amount to generate a new LAB value that differs by 1 unit or more in any of the L*, a*, b* values returned in the info tool. That now guarantees the image has two unique colors whether counted in RGB or LAB numeric calculations by Colorthink, so Colorthink should return a unique count of two no matter what.Great idea! I'm not sure if the test I just did is valid, I've had a few glasses of wine with dinner ::).
Would be cool if Colorthink allowed you to set your own threshold for this type of analysis. Does it?I see a preference for Unique Color Extraction: Ignore white (255/255/255) and Ignore Black (0/0/0) they are both off. There are options for different dE formula but only when comparing two color sets, they don't alter the extraction of the colors (altering dE formula doesn't update the reported values above).
Great idea! I'm not sure if the test I just did is valid, I've had a few glasses of wine with dinner ::).
In Photoshop I made a two pixel document IN Lab.
One pixel is Lstar 100/0/0. The other is Lstar 99/0/0.
ColorThink extracts two unique values which is kind of expected.
What is interesting is the Lab values it provides which is a tad different than Photoshop and may be a clue to what is going on with the report.
#1 100.00/-0.00/0.00
#2 99.12/-0.00/0.00
ColorThink has more precision in the values than Photoshop.
Now I make a similar 2 pixel document in ProPhoto RGB. One pixel is 255/255/255, the other is set for 254/254/254. I use the pencil tool set for one pixel and click on that 2nd pixel.Yes, interesting that you chose RGB triplets with equal values so that L* is the only value showing movement. Keep in mind that some sRGB profiles use simple matrix math with a gamma 2.2 tone curve, which matches aRGB but not ProPhoto, while true sRGB has flare compensation in the low end of the RGB scale, so L* calculations can differ from aRGB in the low end; and of course ProPhoto is gamma 1.8. Converting between these color spaces is therefore bound to generate different numerically unique color values depending on how many shadow versus highlight tones are represented in the image, even if the image is a perfectly neutral gray monochrome RGB image. Since ColorThink is merely looking for unique math values (which is more likely with more significant digits than the PS info tool shows), the analysis of an image in sRGB versus aRGB is bound to differ, but not necessarily in a visually significant way, since combination theory is not being used to rule out pairs with Delta E below any rational JND like 1.0. In other words, the fact that ColorThink claims two pixel values are unique because one has L* = 99.73 and the other has L* = 100.0 proves again that ColorThink is looking at the calculated numeric values but not rounding them as much as the PS info tool does, and not checking against a JND threshold to verify that the calculated difference can rationally be perceived by a human observer.
ColorThink reports the white pixel as we expect, Lstar 100. The pixel that was set for 254/254/254 is reported as Lstar 99.73
Great post Mark. In terms of dE, and in the context of this discussion, is it useful in terms of evaluating what is and thus isn't visible when trying to decide if one color space with a fixed image does or doesn't contain more colors than the other? Or is it simply fruitless to even go there? Do you suppose ColorThink is using this metric in any way to produce the extraction of unique colors in the reports Bill and I produced?
If you take the 2-pixel image I attached earlier (I'll attach again) and extract the color list, Colorthink will identify both colors even though they are very close together. They are separated by about 0.02 ∆E that should be well below the threshold for distinct visual colors. This would suggest ColorThink doesn't really attempt to identify visually distinct colors and is using some other metric.
If you take the 2-pixel image I attached earlier (I'll attach again) and extract the color list, Colorthink will identify both colors even though they are very close together. They are separated by about 0.02 ∆E that should be well below the threshold for distinct visual colors. This would suggest ColorThink doesn't really attempt to identify visually distinct colors and is using some other metric.Oops, I missed that too, sorry.
I think it's safe to assume the color list is not a good indication of the number of discernible colors in a file.Excellent, I'm happy we went down this route and came to this conclusion. ColorThink was one tool that I used for analysis to determine if Adobe RGB had more colors than sRGB, it told me it did, that threw me for a bit of a loop. Since we appear to agree that the number of colors have to be discernible to be valid, I can forget this analysis. Hopefully Bill who also conducted similar CT tests is seeing all this too. He also had a good explanation why using CT for this analysis wasn't effective in answering the original question: does Adobe RGB (1998) have more colors than sRGB.
Andrew, I think one of the issues you'll have to wrestle with when you construct your video is what level of mathematical skills to assume on the part of your viewers.Like me, virtually none (I'm quite mathematically challenged).
Here's another image that's illuminating when opened in ColorThink. This image is a 256 x256 slice of the RGB space where the blue value is zero. ColorThink seems to limit its color list to 10000 points and it seems to decide this by tossing values at set RGB value intervals (culling more in the shadows and fewer in the highlights). I think it's safe to assume the color list is not a good indication of the number of discernible colors in a file.
It might also be nice to post the link to the X-Rite/Pantone Hue Color Test (http://www.xrite.com/custom_page.aspx?PageID=77&Lang=en) that viewers can take. This really highlights how we all see and perceive color.Well kind of. I think it's fun, and it does give people an idea of what a correct vision test could do, but it's largely a marketing vehicle. Considering how it arrives at a viewer (on the internet), depending on browser, calibration of the display and other factors, I'm not sure. I could just as easily create two squares in Photoshop to show on a video that had a dE of less than one, 1 and higher, but the same viewing caveats would apply. Maybe a document one could download and view in Photoshop? Or just the recipe for the colors one could create, like I did to test CT?
Since we appear to agree that the number of colors have to be discernible to be valid…
While many people in this thread have pointed out that different stimuli can result in the same color perception, the opposite is also true. An example might help. Consider the sRGB value (130, 70, 15). Does that stimulus map to one perception? If you were to count the number of distinct colors in the following image, will that RGB value be tallied as one perception? If so, which perception does it map to, the top center square or the bottom?Lots to chew on with that post Mark. It seems to reinforce even more the reason not to say: Adobe RGB (1998) has more colors than sRGB. If someone wants to know why, we've got several pages of reasons.
Ok... so they have the same number of colors... what's different then? And btw, try to answer in one, simple-English sentence, not "several pages."What's different? The range of colors is different. G255 in sRGB is in a different location within human vision as defined by the CIE chromaticity plot than Adobe RGB (1998).
Raw does have a greater dynamic range. JPG's are 8 bit per channel.Let's not go there ;D
... G255 in sRGB is in a different location within human vision as defined by the CIE chromaticity plot than Adobe RGB (1998).
Let me see if I get it: are you saying that both spaces have a color defined as G255 (by which I assume you mean a maximum pure green?), just in different locations? Are those the two circles I pointed out in the attached image (assume that I more or less correctly pinpointed the locations)?
The plot you provided, that Gary can't understand or use, shows this quite clearly. That horseshoe shape represents human vision.
If we're going to use chromaticity diagrams, can we at least use u'v' instead of xy, which unrealistically overemphasizes the importance of green changes? u'v' isn't perceptually uniform, but it's a lot better than xy.We (Slobodan) could but considering the audience and the lesson being taught, the method he used is certainly adequate.
Let me see if I get it: are you saying that both spaces have a color defined as G255 (by which I assume you mean a maximum pure green?), just in different locations? Are those the two circles I pointed out in the attached image (assume that I more or less correctly pinpointed the locations)?

Maybe to point out the obvious, Slobodan (apologies if that is the case), but 0,255,0 in sRGB can likely be represented by some other set of numbers in Adobe RGB, say 0,221,0.
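That remapping can actually be computed. Below is a rough sketch using the commonly published D65 conversion matrices for sRGB and Adobe RGB (1998); the 0,221,0 triplet above was just a hypothetical, and running the standard matrices gives a different triplet, but the underlying point stands: the same color gets different numbers in each space.

```python
# Sketch: convert sRGB (0, 255, 0) into Adobe RGB (1998) numbers using the
# standard published D65 matrices (values assumed from common references,
# not from any post in this thread).

def srgb_to_linear(v):
    v /= 255.0
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_srgb_to_xyz(r, g, b):
    return (0.4124 * r + 0.3576 * g + 0.1805 * b,
            0.2126 * r + 0.7152 * g + 0.0722 * b,
            0.0193 * r + 0.1192 * g + 0.9505 * b)

def xyz_to_adobe_rgb(x, y, z):
    # XYZ -> linear Adobe RGB (1998), then its 2.2 gamma encoding
    lin = (2.04159 * x - 0.56501 * y - 0.34473 * z,
           -0.96924 * x + 1.87597 * y + 0.04156 * z,
           0.01344 * x - 0.11836 * y + 1.01517 * z)
    return tuple(round(255 * max(0.0, min(1.0, c)) ** (1 / 2.2)) for c in lin)

r, g, b = (srgb_to_linear(v) for v in (0, 255, 0))
print(xyz_to_adobe_rgb(*linear_srgb_to_xyz(r, g, b)))  # roughly (144, 255, 60)
```

Same color, different numbers, because the scale (the space) is different.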
Saying "wider gamut" just shifts the burden to defining "gamut." Besides, "more vivid colors" simply means more colors.

That's the point. Shift the burden to a valid viewpoint, rather than a nonsense one.
2) All of these discretely addressable values are visually discernible values, but with three important caveats: a) to be discernible, each addressable color value must be applied to enough adjacent pixels to cover an area in the image that subtends a large enough viewing angle to be resolved by the human observer; b) appropriate surround conditions must be met, i.e., the surrounding near-neighbor color values differ by at least 1 JND (just noticeable difference) to the human observer; and c) the surrounding color values must also cover image area that can be easily resolved by the viewer.
I'm not quite sure what you mean by this one. You can easily find pairs of colors in an RGB space that will be very close to each other like the previously-mentioned cyan pair that are separated by only .02 ∆E.
I have trouble imagining a situation in which you could discern between those two values, but I'm not sure how our ability to discern color differences changes in different situations. Also spaces like ProPhoto RGB have colors that fall outside the spectral locus — they aren't real colors. I think it's hard to argue that they are visually discernible in any meaningful way.
1) All the RGB working spaces have exactly the same number of addressable color values, and the total number is set by the bit depth of the image file (i.e., 8-bit, 16-bit, etc.).
Colorspaces are conceptually continuous, not discrete, therefore it's wrong to talk about number of colors.
In practice they are often quantized for compact transmission of images, but this is orthogonal to gamut size, and rather arbitrary (i.e., a power of 2). Ideally such quantization is too small to be perceivable, and so isn't something you can use to make the colors countable.
2) All of these discretely addressable values are visually discernible values, but with three important caveats: a) to be discernible, each addressable color value must be applied to enough adjacent pixels to cover an area in the image that subtends a large enough viewing angle to be resolved by the human observer; b) appropriate surround conditions must be met, i.e., the surrounding near-neighbor color values differ by at least 1 JND (just noticeable difference) to the human observer; and c) the surrounding color values must also cover image area that can be easily resolved by the viewer.
Then you have Mark Fairchild going out on a limb and claiming an infinite number: http://www.rit-mcsl.org/fairchild/WhyIsColor/files/ExamplePage.pdf.
... Shift the burden to a valid viewpoint, rather than a nonsense one.
Colorspaces are conceptually continuous, not discrete, therefore it's wrong to talk about number of colors...
For the purpose of this debate (which wasn't meant to be a geeks' orgy, btw), this makes as much sense as saying to someone who asked you how far away is the nearest town: "Sorry, Sir, that's a nonsense question. Everybody knows that distances are conceptually continuous, therefore it's wrong to talk about number of miles or kilometers."

The purpose of what debate? And who says it wasn't meant to be anything but a clarification of terminology (does Adobe RGB (1998) have more colors than sRGB)?
... I'm not sure what your point of posting here...
... to present to someone who is looking for the crux of the facts. It isn't intended for this audience.
Andrew, would you like me to stop posting here? Stop asking questions?

Not at all! I'd like you to attempt to understand, and then verify back to us, what you've been told, rather than argue about what you've been told. Big difference IMHO. I pointed out three posts you made on the topic that contained answers; you didn't even verify you read or understood them. You just argued with some about what you've been told, which isn't useful for either party.
Not at all! I'd like you to attempt to understand, and then verify back to us, what you've been told, rather than argue about what you've been told.
Sounds more like a religious dogma to me. Just listen what we say, repeat, and do not question it. You got to be kidding me, Andrew! That is exactly why I do not go to church.

Just the opposite, considering everything discussed here is based on science (color science). The guy you attempted to defend in that closed thread was the fellow with the religious dogma. His information about color was technically so off base, and grounded in religion rather than science, that it was a bit shocking to see you come to his defence.
... I'm not sure what your point of posting here is unless you are still unclear about what color gamut is and what it defines. Are you?
Yes. And that is why I am still posting here.

Well, what is unclear?
For the purpose of this debate (which wasn't meant to be a geeks' orgy, btw), this makes as much sense as saying to someone who asked you how far away is the nearest town: "Sorry, Sir, that's a nonsense question. Everybody knows that distances are conceptually continuous, therefore it's wrong to talk about number of miles or kilometers."
... I have come to these conclusions...:
1) All the RGB working spaces have exactly the same number of addressable color values, and the total number is set by the bit depth of the image file (i.e., 8-bit, 16-bit, etc.).
2) All of these discretely addressable values are visually discernible values, but with three important caveats: a) to be discernible, each addressable color value must be applied to enough adjacent pixels to cover an area in the image that subtends a large enough viewing angle to be resolved by the human observer; b) appropriate surround conditions must be met, i.e., the surrounding near-neighbor color values differ by at least 1 JND (just noticeable difference) to the human observer; and c) the surrounding color values must also cover image area that can be easily resolved by the viewer.
3) That leaves us with the important concepts of RANGE and SCALE (thank you Andrew). ProPhoto covers a larger chroma range than aRGB, and aRGB covers a larger chroma range than sRGB. The range of hues and the range of lightness values remain the same in all of the working RGB color spaces. Thus, chroma is the only color property that gets expanded to a larger range, and this is done by assigning higher numeric values on the a* and b* scales when transforming the RGB triplets to CIELAB values.
4) The consequence of encoding a larger chroma range is that precision between the discretely encoded color values is reduced; hence the often-recommended advice to use higher bit depth, especially with ProPhoto, to ensure that image-editing software can manipulate the image tones and colors with sufficient mathematical precision to avoid subtle banding or posterization effects in the final image reproduction.
That's my best shot. Be gentle ::)
best,
Mark
http://www.aardenburg-imaging.com
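Mark's point 4 can be seen with a deliberately simplified 1-D sketch (the ranges and values below are illustrative only, not real colorimetry): stretch the same number of codes over a wider range, and fewer of them land on any fixed sub-range of content.

```python
# A 1-D sketch of the precision point: give the same 8 bits a wider range
# to cover, and fewer codes land on any fixed sub-range of the data.
# The [0, 1] data stands in for sRGB-sized content; the [0, 2] container
# stands in for a wider-gamut space (numbers are illustrative only).
def encode(x, lo, hi, bits=8):
    return round((x - lo) / (hi - lo) * (2 ** bits - 1))

data = [i / 1000 for i in range(1001)]          # content confined to [0, 1]
narrow = {encode(x, 0.0, 1.0) for x in data}    # container exactly fits
wide = {encode(x, 0.0, 2.0) for x in data}      # container twice as wide
print(len(narrow), len(wide))  # the wider container spends about half the codes
```

Which is exactly why the usual advice is to move to 16-bit when working in ProPhoto: the extra codes restore the precision the wider range spreads thin.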
Right, good point. ProPhoto does indeed have encoded values that become imaginary on a theoretical basis. I do concede that point. However, those values are forced in-bounds by any display system for viewing the image, by one method or another. Hence they become discernible, albeit with the caveats I listed before to be met for that imaginary-turned-real color value to be useful and discernible in the image.

This seems to be yet another data point for completely separating the number of colors from the color gamut.
"What is the difference between various color spaces, does Adobe RGB have more colors than sRGB, and why do my Adobe RGB files appear dull and muted on some displays and in some printouts?"

Adobe RGB doesn't have more colors. And it doesn't appear dull unless you mishandle it and treat it as something it isn't (like sRGB).
All RGB spaces have the same number of colors. They differ in their ability to display vivid colors, with wider spaces being able to display a wider range of colors, i.e., more saturated colors. If wider spaces display muted colors, blame the human error, not the color space.

Much better!
Does Adobe RGB (1998) have more colors than sRGB? No. But to see why, we have to look at a few facts about color spaces, specifically RGB working spaces like sRGB, Adobe RGB (1998), and ProPhoto RGB.
Adobe RGB (1998) and sRGB are just color spaces, containers. They don't inherently have any information other than specifications for primaries, white point, and gamma. Until we actually have a pixel, they don't contain any information. The pixel has what is called an encoding, which can provide a number of possible color values. For example, 24-bit color (three channels, 8 bits each) can mathematically define 16.7 million color values. Can we see 16.7 million colors? No, far fewer. The exact number is up for debate, but the point is, we can use math to produce a value that has no actual relationship to what we can see. All the RGB working spaces have exactly the same number of addressable color values, and the total number is set by the bit depth of the image file, i.e., 8-bit, 16-bit.
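The arithmetic behind "16.7 million" is worth making explicit; the count depends only on the encoding, never on which profile is attached. (As an aside, Photoshop's "16-bit" mode actually uses a 0..32768 range internally, but the principle is unchanged.)

```python
# The count of addressable values comes from the encoding alone; the color
# space (sRGB, Adobe RGB, ProPhoto) never changes it.
for bits in (8, 16):
    per_channel = 2 ** bits
    print(f"{bits}-bit/channel: {per_channel ** 3:,} addressable triplets")
```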
Before we can define a number of colors, we have to define: what is color? Color is a perceptual property, something that occurs deep inside our brains. So if you can't see it, it's not a color. As such, colors are defined based on perceptual experiments. Color is not a particular wavelength of light; it is a cognitive perception. We can use math and a metric called deltaE to define when one set of color numbers is indistinguishable from another set of numbers. For the sake of argument, let's say that in one color space it isn't possible to see a difference between R0/G78/B0 and R0/G79/B0. As such, we can't count that example as being two colors; we can't see any difference between them, they look identical.
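That R0/G78/B0 vs R0/G79/B0 example can be put into numbers with a rough sketch: convert both through sRGB to CIELAB (standard D65 matrix assumed) and apply the simple deltaE 1976 formula. The usual rule of thumb puts the visibility threshold around a deltaE of 1.

```python
import math

# Sketch: CIELAB delta-E (1976) between sRGB (0,78,0) and (0,79,0).
# Standard D65 sRGB matrix and CIELAB equations assumed.
def srgb_green_to_lab(g8):
    v = g8 / 255.0
    g = v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4
    # For a pure green, XYZ is just the green column of the sRGB matrix
    x, y, z = 0.3576 * g, 0.7152 * g, 0.1192 * g
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

l1, a1, b1 = srgb_green_to_lab(78)
l2, a2, b2 = srgb_green_to_lab(79)
dE = math.sqrt((l1 - l2) ** 2 + (a1 - a2) ** 2 + (b1 - b2) ** 2)
print(round(dE, 2))  # below 1.0, i.e. at or under the usual visibility threshold
```

So by this simple metric the two "different" triplets don't count as two colors.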
Now we have to look at color spaces like ProPhoto RGB. If you examine a plot of this synthetic color space on top of the gamut of human vision, part of it falls outside the plot. It can define numbers which represent colors we can't see. So these "imaginary colors" can't be counted when we ask whether ProPhoto RGB has more colors than sRGB or another color space. One of the best explanations of why it is folly to even attempt to put a number (of colors) on a color space comes from Graeme Gill, the creator of the Argyll Color Management System: "Colorspaces are conceptually continuous, not discrete, therefore it's wrong to talk about number of colors." Examining ProPhoto RGB further illustrates that it's impossible to define the number of colors it can contain, as it can contain values that we can't see as colors, just as 24-bit color can define more color values than we can see. Encoding is, however, a useful attribute when editing our images, so the point isn't to dismiss it but rather to point out that it provides numbers for something that isn't a color; it's just a number. As an analogy: if you were to purchase a ruler to measure something, it is possible the tiny lines that divide up the unit of measure could be finer than you can see. What would be the point of a one-foot ruler whose lines were a micron apart instead of 1/16 of an inch? The micron unit is valid, but you can't see it or use it with your naked eye to measure anything. Think of the encoding of a pixel value the same way with respect to color, except that unlike a micron, which does exist, a value defining a color you can't see doesn't exist; it's not a color.
The difference between color gamuts is their range and scale of colors, not their number of colors. This confuses many people because they see a larger gamut plot, a larger volume, and assume larger means more colors. But one has nothing to do with the other. ProPhoto RGB covers a larger range of chroma (what some call saturation) than Adobe RGB (1998). Adobe RGB (1998) covers a larger range of chroma than sRGB. This has nothing to do with the number of colors; that's an attribute of how we encode the pixel values. And we can use finer ways to divide up this data. For example, in 16-bit color the math allows us to define billions of color values, but that doesn't change the fact that we still can't see even the 16.7 million values of 24-bit encoding. As such, it's best to talk about encoding having the potential to define millions or billions of numbers that could be associated with a color if we could see them. But if we can't differentiate them visibly, it is kind of silly to suggest they are indeed colors. Don't confuse a color number for a color itself, a color you can see!
Yes! The green isn't the same. The scale is different even though both use G255 to define this green. One yard and one meter are not the same distance; the scale is different, even though the number (1) is the same. G255 in sRGB is a different color than in Adobe RGB (1998). The plot you provided, that Gary can't understand or use, shows this quite clearly. That horseshoe shape represents human vision. One green appears more saturated than the other. But the two spaces have the same number of colors. Color numbers without a scale (the color space) do not provide enough information to define a color. If you ask me how far away I live and I say 389, am I talking miles, kilometers, light-years? And none of this has anything to do with the number of colors.
The Photoshop Info palette allows values to be expressed as either 8- or 16-bit integers or in floating-point notation with 3 significant digits, and I think this is what the percentage readout in Lightroom is also using, instead of reporting values as 0..255 or 0..32768. This actually makes more sense to me.

I agree, but you can't believe how many LR users hate it and want the same feedback as Photoshop. I think LR should give them that option, but I prefer the current behavior. For new users it makes more sense and is far easier to grasp.
With floating point notation, the number of encoded colors can be arbitrarily large.
So based on the comments on color gamut vs. color space, how does this sound? I've obviously done some picking of pieces from here and there to come up with this. It's got nothing to do with a video per se; it's more a distillation of the last number of pages on color gamut vs. color numbers:

IMHO an excellent summary that really does highlight both the similarities and differences between colourspaces.
<nit-pick mode on> The number of discrete values expressible as an n-bit unsigned integer and as an n-bit floating point number are equal for all values of n in the set of positive integers, are they not? In both cases, the number of discrete values is 2^n. Actually, as I think about it, the number of discrete values expressible in floating point notation is somewhat less, since some bit combinations are not allowed, for instance if subnormal numbers aren't allowed. Then there's signed zero.<nit-pick mode off>
Jim
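Jim's nit-pick can actually be checked by brute force for the 16-bit case (a sketch; Python's struct module can decode IEEE 754 half-precision bit patterns directly):

```python
import math
import struct

# Enumerate all 65,536 bit patterns of an IEEE 754 half-precision float.
# The NaN patterns, the two infinities, and the doubled zero mean the
# number of distinct finite values is somewhat less than 2^16.
finite = set()
nans = infs = 0
for bits in range(1 << 16):
    (x,) = struct.unpack('<e', bits.to_bytes(2, 'little'))
    if math.isnan(x):
        nans += 1
    elif math.isinf(x):
        infs += 1
    else:
        finite.add(x)  # +0.0 and -0.0 compare equal, so they collapse here

print(nans, infs, len(finite))  # 2046 NaN patterns, 2 infinities, 63487 values
```

So for the same bit budget, the float encoding addresses fewer distinct finite values than the integer one, exactly as the nit-pick suggests.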
For the purpose of this debate (which wasn't meant to be a geeks' orgy, btw), this makes as much sense as saying to someone who asked you how far away is the nearest town: "Sorry, Sir, that's a nonsense question. Everybody knows that distances are conceptually continuous, therefore it's wrong to talk about number of miles or kilometers."

You have taken what I wrote and misconstrued it into something I didn't say. Counting the "number of colors" in a colorspace is not the only way to measure it. By all means measure it in some device-independent way like delta E. But introducing such a metric is hardly an easy explanation for color novices.
The thing is, as regards gamut, there is no "miles or kilometers" or any generally accepted thing that you could count - at least not in any sense that would be relevant to a non-geek.

Yes there is: distance in a device-independent space (i.e., a colorspace directly related to what we see).
To count things, you have to be counting individual values, which immediately means you're in a particular file format.

Not at all. We don't measure the distance between things by seeing how many 1 meter (or 1 foot) rulers we can lay end to end, beyond elementary school. At some stage in our education we are introduced to real numbers rather than simple counting. So it is with measuring colorspaces.
<nit-pick mode on> The number of discrete values expressible as an n-bit unsigned integer and as an n-bit floating point number are equal for all values of n in the set of positive integers, are they not? In both cases, the number of discrete values is 2^n. Actually, as I think about it, the number of discrete values expressible in floating point notation is somewhat less, since some bit combinations are not allowed, for instance if subnormal numbers aren't allowed. Then there's signed zero.<nit-pick mode off>
Good post, though. And I think Andrew is on the right track.
Jim
I've been following this thread...as best I can :)
From something that started to explain color spaces to the un-initiated, it's become quite technical and even perceptual.
So I'd like to throw another wrench into the works:
I will assume that ProPhotoRGB is such a large color space that it contains the full range of color that our digital capture devices are capable of today.
So, by working in ProPhotoRGB, in a high enough bit depth, I'm basically maintaining all my captured image data. So far, so good.
But, I need to process my RAW data and map it into ProPhotoRGB. OK, but even if I have the best display available, I must STILL edit and make my decisions viewing the data in Adobe RGB (or something close). I'll leave final output to print, or whatever, for later consideration.
But I have a question: isn't it possible to convert from RAW directly into Adobe RGB without throwing away any data? Simply by re-mapping all the values so that they fit into Adobe RGB? Yes, I understand that the colors will look dull, at first, on the display. But consider this:
Each color space contains the same number of addressable values. So, if I can fit the image into ProPhotoRGB, then I can fit the same image into Adobe RGB, or even sRGB. Upon editing (manipulating the values) I can choose which values to present in my final image and which to throw away (clip). And, if working at floating-point bit depth, I suppose the clipped data can still be saved with the image file.
So my point really is: no matter what we do, we are really working in the color space of our display device. And no matter how we parse it, this will ALWAYS be our limitation. If someone has an sRGB (or near-sRGB) display device, they will be editing in sRGB no matter what. And "soft proofing", which is a good aid to previewing prints, still cannot show any values outside the display's ability to present them.
We have a tendency to think "bigger is always better". A 36mp camera is twice as good as an 18mp camera etc. But, what we are really doing here is saying "11 is one louder than 10". (Joke from the movie "this is spinal tap")
I will assume that ProPhotoRGB is such a large color space, that it contains all the range of color that our digital capture devices are capable today.

Not necessarily, but you can go to the locked-down "Stand Up comic" thread and see the discussion of cameras with respect to gamut and color mixing functions. But for this purpose, let's just agree with the above premise.
So, by working in ProPhotoRGB, in a high enough bit depth, I'm basically maintaining all my captured image data. So far, so good.

Correct.
But, I need to process my RAW data and map it into ProPhotoRGB. OK, but, I if I have the best display available, I must STILL edit, and make my decisions viewing the data in Adobe RGB (or something close).

Correct. So just be careful when editing such that if you are moving a slider, as an example, and suddenly you stop seeing the preview update, you're probably affecting colors you can't see. Back off!
But I have a question: Isn't it possible convert from RAW directly into Adobe RGB without throwing away any data?

If you clip colors doing this because the working space is smaller than the gamut of the image, no. You clipped that data.
Simply by re-mapping all the values so that they fit into Adobe RGB? Yes, I understand that the colors will look dull, at first, on the display.

NO, they will not. Don't drink that Fong Kool-Aid. It will only look 'dull' if you don't properly color manage the data (treat Adobe RGB as sRGB).
So my point really is. No matter what we do, we are really working in the color space of our display device.

Absolutely not. Since Photoshop 5, the display and how we edit our images have been divorced. That's why they introduced RGB working spaces: editing spaces that have nothing to do with your display.
We have a tendency to think "bigger is always better".

In this context, that's right. At least better unless you are OK clipping colors you can capture and reproduce.
And I'm completely serious. It is possible to fit all the camera data into sRGB.

No, not if your definition of the camera data is the colors resulting from a raw you could encode into a working space.
And in the ultimate reality is that we WILL be editing in our display gamut, like it or NOT.

Yes, and if that's your only output, use sRGB.
Here's the question to ask yourself. Do you want to retain all the color data you can but not necessarily see it on-screen, data you can output? Or do you want to limit the data so you can see it all but not use all you could have retained and used for output to another device? I have no issue which you pick. Pick the one that makes the most sense to you.
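The clipping side of that choice can be shown numerically. A sketch, using the commonly published D65 matrices (assumed, not quoted from this thread): Adobe RGB (1998) pure green lands outside sRGB, so converting it produces negative, unreproducible sRGB channel values that a real conversion would have to clip.

```python
# Sketch: does Adobe RGB (1998) pure green fit inside sRGB? Take its XYZ
# (the green column of the published D65 Adobe RGB matrix, an assumed
# reference value), then convert to linear sRGB. Channels outside [0, 1]
# are out of gamut and would be clipped.
x, y, z = 0.18556, 0.62736, 0.07069   # XYZ of Adobe RGB (0, 255, 0)

r = 3.2406 * x - 1.5372 * y - 0.4986 * z
g = -0.9689 * x + 1.8758 * y + 0.0415 * z
b = 0.0557 * x - 0.2040 * y + 1.0570 * z

print(round(r, 3), round(g, 3), round(b, 3))  # r and b come out negative
```

Negative channel values have no sRGB representation, which is precisely the data you give up if you convert early.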
My question is: will I like viewing the "newly visible" data?

Soft proof and you'll know (pretty much).
Maybe, maybe not?
Soft proof and you'll know (pretty much).

How can soft proofing show me colors my display cannot reproduce?
How can soft proof show me colors my display can not reproduce?

One of these images is in sRGB, one in Adobe RGB (1998), and the screen capture was made on a wide-gamut display. You're making a mountain out of a molehill.
I could add something like: a deltaE of less than 1 between two colors is said to be imperceptible, but to complicate matters, there are several formulas for calculating this metric. Further, the eye's ability to distinguish two colors as different is more limited for yellows and better for greens and blues. This just adds even more difficulty to assigning a meaningful and accurate number of colors to these color spaces.

Yes indeed, deltaE does not compute well with human colour perception.
Yes indeed, deltaE does not compute well with human colour perception.

I'm not sure why you say that, as it's not true. Delta E attempts to accord well with human perception.
I guess a decision would need to be made as to how helpful this is as "entry-level" information about colourspaces.

It's pretty basic: you can't manipulate color in a meaningful way without introducing a device-independent color space, and the limitations of interpreting the significance of numbers in XYZ space lay a trap that will quickly catch the unwary.
Andrew, I haven't drunk any Kool-Aid from you know who :)
And I'm completely serious. It is possible to fit all the camera data into sRGB. It just will not display the way you expect. But, it is possible to reverse the process and display it as YOU WANT. (not yelling, just trying to emphasize a little:)
And in the ultimate reality is that we WILL be editing in our display gamut, like it or NOT.
I think I explained this a little bit in the 2nd half of my post which came after your reply.
I'm not sure why you say that, as it's not true. Delta E attempts to accord well with human perception, whereas delta sRGB, delta AdobeRGB, delta printerCMYK or delta XYZ is very much worse.
One of these images is in sRGB, one in Adobe RGB (1998) and screen capture was made on a wide gamut display. You're making a mountain out of a molehill.
Just to emphasize what Andrew has said:
Nearly every camera can capture a gamut that is far larger than sRGB (and AdobeRGB, for that matter), so, no: once a file is imported from the camera using sRGB as an embedded colourspace, a lot of colour information is potentially lost.
It is not possible to "reverse" the process and regain the lost colour information.
This is an example of what is called "early binding" in colour management and it means that you have taken a committed step that cannot be undone.
This is the reason most of us use a workflow that employs "late binding" - ie we only commit late in the process to embedding an ICC profile or colourspace to our images once we know exactly what we want to do with that image.
In this context the fact that we may be using ProPhotoRGB as our working colourspace is fine since it does not limit our decision-making and choices.
Tony Jay
I'm not sure why you say that, as it's not true. Delta E attempts to accord well with human perception, whereas delta sRGB, delta AdobeRGB, delta printerCMYK or delta XYZ is very much worse, and makes no attempt to accord with human perception of color difference.

What I mentioned earlier is that different colours are more or less difficult to discriminate as different: some, such as the yellows, need a deltaE of up to 6, compared to the greens and blues, where a deltaE of perhaps only 0.2 is sufficient for our vision to tell them apart.
Sorry Tony, I think you misunderstand me. If I lower the saturation by half, I can fit my wide-gamut image into sRGB, for example. Then I can increase my saturation and get back an image that retains the color detail, at the appropriate values for sRGB. Nothing lost. Still "late binding". We do this all the time in motion capture. Not a new idea.
No worries!
Given the amount of misinformation that has been squirted around recently it was worthwhile making sure that everyone is on the same page though.
Tony Jay
You've told me my Epson printer has a larger color gamut than sRGB. Why do I like the way it looks on my sRGB display, more than my print?
The printer in question may well have a larger gamut than sRGB, but there is still a lot of sRGB that your printer cannot reproduce. The printer gamut does not contain the sRGB gamut.
And yes, it could also have a lot to do with reflective vs. transmissive viewing, especially if you are used to working in the motion picture industry.
Brian A
Agreed, but maybe worth restating: everything in CIE colorimetry, including ∆E values, is based on judging color matches in a very controlled and artificial environment. If you take a look at the Beau Lotto image on page 9 of this thread (http://www.luminous-landscape.com/forum/index.php?topic=93004.msg757757#msg757757), you'll see an example of two very different "colors" with a ∆E of precisely zero. These tools are too blunt for asking nuanced questions like how many individual colors are in a real-world image, if by 'color' we mean something other than a number.

Use the right tool for the job. CIE colorimetry doesn't attempt to model spatial visual appearance. If you want to do that, you need to look at refinements of CIE such as Retinex, iCAM, etc.
Yup!

It isn't even "not really true", it is absolutely untrue!
And then, I'm thinking here of the whole issue of luminance levels, or ranges. Color spaces seem to assume fixed luminance ranges. Doesn't this affect the color gamut as well? It's an interesting limitation of the "color space" concept.
I can squeeze my 16 stops of dynamic range from my camera into any display range by showing a low-contrast image. But what happens to the mapping of the colors that we see in this low-contrast image? When we add an "s" curve to cram that into the display, or print, what happens to the color?
What I mean is that now we are dealing very much with perception. And the whole idea that this is all kind of "automated" as long as one uses color management is not really true. Since we are thinking about how to explain this stuff to novices, I think we should just forget about it. That's why it takes an "artist" to make a good print. Even with a "soft proof", it still takes the interpretation of the artist to make the color space jumps successful.
Yes there is - distance in device independent space (ie. a colorspace directly related to what we see).
For a measure that can be compared in significance, using a perceptually uniform space is even better.
Hence the use of delta E as a measure of gamut size.
...BTW, delta E is not a measure of gamut size; it is the measure of the difference between two points in color space, i.e., the length of a line, not an area. http://en.wikipedia.org/wiki/Color_difference...

Correct.
Not even on this forum, which is color geek heaven, have I ever seen anyone actually quote the size of a color space in any units.
BTW, delta E is not a measure of gamut size, it is the measure of the difference between two points in color space. Aka, it the length of a line, not an area. http://en.wikipedia.org/wiki/Color_difference

If delta E is a distance measure, then naturally you measure gamut volume in delta E cubed.
The first posting in this thread, point 4:
http://www.luminous-landscape.com/forum/index.php?topic=93004.msg757129#msg757129 (http://www.luminous-landscape.com/forum/index.php?topic=93004.msg757129#msg757129)
The prior thread:
http://www.luminous-landscape.com/forum/index.php?topic=92767.msg756292#msg756292
http://www.luminous-landscape.com/forum/index.php?topic=92767.msg756299#msg756299
http://www.luminous-landscape.com/forum/index.php?topic=92767.msg756379#msg756379
http://www.luminous-landscape.com/forum/index.php?topic=92767.msg756381#msg756381
etc.
And there are ArgyllCMS tools too, that compute gamut volume, as well as intersecting volumes.
(And I'd hardly call this forum "color geek heaven" - it's not that much fun.)
If delta E is a linear measure, then naturally you measure gamut volume in delta E cubed.
You're making a mountain out of a molehill.I want to go back to this discussion for smthopr because it raises an important additional point a video could address: the claim that a wide gamut editing space is an issue because you can't see all the colors. As I pointed out to him, you have to decide if you want to contain and use colors you can't see, or clip them so you can see them all.
It isn't even "not really true", it is absolutely untrue!
The concept of just ticking the softproofing option in Lightroom and "accepting" the result is complete anathema to a good understanding of colour management.
The problem is that one is actually not "doing " anything.
Ticking the softproofing button has been eloquently described as the "how to make my image look like cr@p button".
If no action is taken to make the image not look like cr@p then it will still look like cr@p when printed.
Usually some attention to contrast and perhaps some tweaks to colour may be necessary to get the proof copy to resemble as closely as possible the master.
The bottom line is that none of this happens by default.
It is true that it takes some aesthetic ability to get a good master.
It takes extra work to get the proof copy right.
Even then with an excellent ICC printer/paper profile it also takes some time to get used to the characteristics of both paper and printer to get the best out of them.
As for the characteristics of RGB colourspaces it is true that changing luminance affects chroma directly.
In Photoshop, at least, one can make adjustments in Lab if required, since Lab allows luminance or tone to be adjusted independently of the color information (the "L" axis carries lightness, while "a" and "b" are the opponent-color axes that carry the color information).
Most of us, I think, would not bother to do that unless what we were doing was absolutely colour critical.
Again, I humbly apologize if all of this is known to you, but it may help those still trying to get to grips with the actual process of softproofing.
Tony Jay
So yes, I think when people dismiss wide gamut working spaces to suggest 'there's no ProPhoto RGB displays' they are making a mountain out of a molehill. Just keep a steady hand on Vibrance and Saturation controls and don't keep moving them if you don't see any update on-screen. Danger Will Robinson, Danger.
And no, delta E cubed would NOT be a gamut measure, unless you believe that luminance is a component of gamut. (Is RGB 10,10,10 a different color to RGB 20,20,20?)
As per my post, a gamut measure would be an area measure, so a square, not a cube. Practically, if you really wanted one, an xy area from an xy gamut plot would be more useful.
Sandy, Yes, it is a brighter shade of gray in a three-dimensional space. Its chromaticity is on the axis between whitepoint and blackpoint of the particular space you are using.
Not really, unless you are confusing the CIE1931 diagram for a full description of a gamut, which it isn't. Gamuts are 3-dimensional, and one attempts to describe the perceptually relevant color differences with a delta-E cubed metric. As described in this thread, those delta-E's can be calculated in various ways, some methods are more useful (closer to actual human perception) than others, but none are perfect.
Cheers,
Bart
Yes, exactly. This idea works in the opposite direction as well.No it really doesn't, not unless you want to use a simplistic (Garry like) cause and effect principle to suggest it does. Did you see the example of Bill's flowers in the closed post? Or my example of data clumping in dark saturated shadows in the same thread? We could plot the dE differences between what can and cannot be seen on a display, but there's far more to the inferior output seen in those examples than the dE distance between visible and out-of-gamut (OOG) colors and its effect on the conversions to print.
At least the difference, as in your illustration, is a maximum of 5 DeltaE.Actually 6, but now go and convert ProPhoto to the output color space, then sRGB to the output color space, and the differences in how saturated colors are affected can be an issue, as seen in Bill's examples. Suggesting or equating a small dE difference between what we can see between two color spaces on a display and the effect of the differing gamuts on conversions to print is something Gary might propose. Further, your capture device provided you this data. You are perhaps proposing we clip it solely so we can see some colors, while removing colors we can use for output. If you want to do that, go ahead. But retaining and using all the data is, for many of us, far more critical than deleting color so we don't have to deal with colors we can't see on one device.
And, all one's editing decisions will be made while looking at sRGB (in this example).That statement would only be valid if the only output for the data was your display. There are colors we can retain and print that we can't see on the display but could see on the print. Or on another person's display. Clipping them so you can see them on one device seems rather pointless unless you're sure the only reproduction will be on that one device.
So how do you now explain this stuff to the novice? Isn't that what this thread was about?You show them the gamut maps compared to images as I did in my video. You tell them what I told you: do you want to clip colors you can't see on one device to edit them, colors that you can use on another device? Or do you retain all the color data and carefully edit the image, soft proof and use that data? It is then up to them to decide what route to take.
If I lower the saturation by 1/2 I can fit my wide gamut image into sRGB for example. Then, I can increase my saturation and get back an image that retains the color detail, at the appropriate values for sRGB. Nothing lost. Still "late binding". We do this all the time in motion capture.Yes, you can dial down the saturation when recording video in Rec 709 (~sRGB) color space, but there is something lost. The main deficiency in the gamut of sRGB compared to aRGB or ProPhoto is in saturation range, so if you reduce the saturation level at capture, you can indeed fit a wider range of saturation into the recording. What you lose, however, is the effective bit-depth of your recording's saturation range. When you reduce saturation by 50% at capture and double it again in post, you lose one bit of color resolution in the process. And since Rec 709 is typically recorded at 8-bit resolution, degrading color bit-depth to 7 bits can be quite noticeable while grading in post. The reason you did this, however, is that you'd rather lose some detail in color resolution in order to prevent the intensely saturated colors from being clipped.
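The halve-then-double loss described above can be sketched in a few lines of Python. This is a toy model of a single 8-bit channel, not any particular camera pipeline:

```python
# Toy model: "capture" an 8-bit channel at half saturation, then
# double it in post, and count how many distinct levels survive.
captured = [v // 2 for v in range(256)]         # 0..127: only 7 bits used
restored = [min(255, v * 2) for v in captured]  # stretched back toward 0..255
print(len(set(range(256))), "levels in ->", len(set(restored)), "levels out")
```

Half of the 256 possible output codes can never occur after the round trip, which is exactly the one-bit (8-bit to effectively 7-bit) loss mentioned above.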
Here are some radical differences, smthopr. We start with a Granger Rainbow that is converted to ProPhoto RGB and then sRGB from Lab, where it was created. The ProPhoto RGB and sRGB iterations are converted to an output color space (Epson 3880 Luster). Big dE differences (keep in mind, they are both in the same output color space seen here):
http://www.digitaldog.net/files/Granger_sRGB_ARGB_Epson_CT.jpg
sRGB to Epson on the left side of this report.
Now let's look at a close-up of one area of the two images in Photoshop, 100% zoom. What I see on this end, which may not appear on the web, is of course a more saturated color from ProPhoto to the output color space (left side in this example) but, just as importantly, look at the effect on smoothness using ProPhoto, or the lack thereof from sRGB, when converted to the output color space on the right side.
http://www.digitaldog.net/files/Granger_sRGB_ARGB_Epson.jpg
Clearly the source working space used to convert to the output color space plays a role in both the appearance of saturation (albeit on a display for this print) but also in terms of how smoothly colors map to the output color space (Epson 3880 luster). And this is a JPEG so you'll see some artifacts in both. None the less, like Bill's flowers, I think this illustrates how the two RGB working spaces and their gamuts affect conversions to the output color space. Starting with a larger gamut has advantages IMHO in both the appearance of saturation and smoothness. It might not be a worthwhile advantage to some, and that's a value judgement.
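For readers who want to experiment with a similar target without downloading the file, a Granger-style rainbow is easy to synthesize: hue ramps horizontally, lightness vertically, at full saturation. This rough Python sketch works in HSL via the stdlib colorsys module rather than in Lab (where Andrew's original was built), so it's only an approximation of that test file:

```python
import colorsys

def granger_rainbow(width=256, height=128):
    """Return rows of 8-bit RGB tuples: hue ramps left-to-right,
    lightness ramps white (top) to black (bottom), saturation = 1."""
    rows = []
    for y in range(height):
        lightness = 1.0 - y / (height - 1)
        row = []
        for x in range(width):
            r, g, b = colorsys.hls_to_rgb(x / width, lightness, 1.0)
            row.append((round(r * 255), round(g * 255), round(b * 255)))
        rows.append(row)
    return rows

img = granger_rainbow()
print(img[0][0], img[-1][0])  # → (255, 255, 255) (0, 0, 0)
```

Because every hue appears at every tone, converting such a target through different working spaces makes banding and gamut clipping easy to spot.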
Have you made a video yet Andrew?Nope, but I'm working on it. The test file I just referenced is for those who want to forgo having to shoot something and just want to see the effect on their printer using a wide gamut source working space and sRGB or something smaller out to print.
And no, delta E cubed would NOT be a gamut measure, unless you believe that luminance is a component of gamut. (Is RGB 10,10,10 a different color to RGB 20,20,20?)This may be news to you, but color spaces are 3 dimensional. Therefore their gamut is a volume.
All of which I think makes the point - if a correct measure doesn't immediately come to the mind of the author of ArgyllCMS, then a generally accepted measure doesn't exist outside of deep geek world.Ah - I see - you're a troll.
Ah - I see - you're a troll.
Delta E is, at its simplest, the length of a line in 3-space (i.e. the square root of a sum of squares) - volume calculations don't need or want squares and square roots.
This thread started as a way to communicate with people that don't know much at all about color management. If they have ever seen a plot of Adobe RGB or sRGB, it will have been a 2-D xy plot. So an area plot.
Yes, technically gamuts (at least in LAB, etc) are 3D. But good luck trying to address that to this audience.
Hi Sandy,
Don't mistake the integer coordinates (e.g. 0 ... 255) you know from RGB coordinate systems for their origin, which is a floating point approximation of a continuous colorspace, usually in XYZ coordinates or their conversion to L*a*b* space.
By changing the colorspace, straight line relationships in one space can be non-linear relationships in another space. Therefore delta-E cubes can become e.g. ellipsoidal volumes in another space (the mathematical cube operation in floating point notation can result in extremely smooth shapes, not just "cubes").
It's all just a bit difficult to show in 2-D display or print, therefore interactive VRML displays of 3-D gamuts are commonly used.
Cheers,
Bart
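Bart's point that equal steps in one space are unequal steps in another can be shown numerically. Below is a minimal Lab-to-XYZ sketch using the standard CIE formulas and a D65 white point; the same 1 ΔE76 move along L* shifts luminance Y by very different amounts in the shadows versus the highlights:

```python
def lab_to_xyz(L, a, b):
    """CIE L*a*b* -> XYZ with a D65 white point (standard CIE formulas)."""
    fy = (L + 16) / 116
    fx = fy + a / 500
    fz = fy - b / 200
    def finv(t):
        # cube for large t, linear segment near black (t <= 6/29)
        return t ** 3 if t > 6 / 29 else 3 * (6 / 29) ** 2 * (t - 4 / 29)
    return 0.95047 * finv(fx), 1.00000 * finv(fy), 1.08883 * finv(fz)

# A 1-unit (1 dE76) step along L* moves Y by different amounts
# depending on where you start:
dark = lab_to_xyz(11, 0, 0)[1] - lab_to_xyz(10, 0, 0)[1]
light = lab_to_xyz(91, 0, 0)[1] - lab_to_xyz(90, 0, 0)[1]
print(dark < light)  # → True: unit Lab cells are not unit XYZ cells
```

So a "cube" of one delta E on a side in Lab really does become a differently sized and shaped region in XYZ, which is Bart's point.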
Bart, it doesn't have anything to do with integer or float; if you want to compute the volume of any shape its an integral over all three dimensions. To go mathematical, a triple integral of the constant function over its volume. For a cube, that simplifies to multiplying the lengths of the sides. What it is NOT is the triple integral of the delta E function, which is what is being claimed above. That makes no sense - it's a mathematical nonsense.
Sandy
I don't know precisely the delta E function, but I would like to know if it is a conservative one. I mean: if I move a specific amount of delta E in one dimension, then another amount in a second dimension, and one more in the remaining direction, will I get to the same point regardless of the order of operations, and can I go back to the point of origin using the same delta E steps? If it is not conservative, then I will have to agree that it makes no sense.
Francisco, there are different Delta E functions, but almost all are a square root of squares (of real numbers). So the function is always positive, hence you can never get back to point of origin.
Sandy
Francisco, there are different Delta E functions, but almost all are a square root of squares (of real numbers). So the function is always positive, hence you can never get back to point of origin.The many refinements of DeltaE functions used for the specific purpose of calculating delta E's have little or nothing to do with measuring gamut. The point is that a perceptually uniform colorspace strives to have coordinate units that are one delta E. So by definition volume in such a space is delta E cubed.
In the L*a*b space, aren't the units in the coordinates related to delta E?They are CIE76 delta E.
Sandy, I'm not sure I follow you. I understand that delta E is a "distance" or equivalent. Going back does not necessarily require the distance to be negative, just a change of direction.
Maybe I change my question to:
In the L*a*b space, aren't the units in the coordinates related to delta E?
The point is that a perceptually uniform colorspace strives to have coordinate units that are one delta E. So by definition volume in such a space is delta E cubed.
Again, this time with meaning - delta E is a derived measure of color difference. That's all. It is not a measure of gamut. You cannot sensibly plot a gamut in delta E units and you cannot sensibly measure the volume of a gamut in delta E cube units.
Exactly, DE76 is the straight line distance between any two points in L*a*b* space, which can only be the case if the units of L*a*b* space are DE76.
There's nothing complicated going on there, it's exactly like plotting real world objects against 3 orthogonal axes. If you want to compute the distance between any two points in such a coordinate system in (say) meters using straight line between the points, then the axes will also have to be measured in meters.
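To make the "units of Lab are DE76" point concrete, here is the entire CIE76 formula in code. A sketch only; later refinements such as dE94 and dE2000 add weighting terms and are no longer plain Euclidean distances:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 delta E: the straight-line (Euclidean) distance between
    two colors given as (L*, a*, b*) tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

print(delta_e_76((50, 0, 0), (50, 3, 4)))  # → 5.0
```

Like any distance it is non-negative and symmetric, which answers Francisco's question: the length of the step back equals the length of the step out, whatever order you take the steps in.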
Delta E is only defined as difference between lab values, without a physical unit.
You're trying to make a "the lab value is a GPS coordinate, and the delta E is like mileage" analogy?There is no analogy here, DE76 is the distance between two points in L*a*b* space.
There is no analogy here, DE76 is the distance between two points in L*a*b* space.
But you'd be much more helpful if you went and looked some of these things up
rather than causing confusion by making assertions off the top of your head.
But however the rest of your stay on this forum turns out, you're now on my personal ignore list.Perfect - you're going on mine. I just pity all those being misled by your determination to half understand color.
Do you have any access to a "sRGB printer"? you know what I mean by that.Unfortunately I don't. But I am shocked at the difference on a print from my 3880 between ProPhoto RGB and sRGB using my Gamut test file.
If you can get access to one, you could demonstrate and explain why the Noritsu or Fuji controller is or isn't doing the transformation correctly.
Well, if we agree that color is not a physical property but a perception of our brain, then it cannot possibly be described by a physical unit. The "volume" in this case is an abstraction.Not sure if stepping into this part of the discussion is useful or dangerous, but when I asked the nice folks at CHROMIX about color volume in CT they wrote back:
The ColorThink Grapher calculates the gamut volume in terms of cubic Lab values. A Lab value of one is one delta E (dE76, considering the way the Grapher is currently made). So each of these cubic Lab values represents the smallest discernible color difference, and each cubic Lab value represents a unique human-discernible color. So in that sense, the larger gamut will necessarily represent more distinct, humanly-perceptible colors than a smaller gamut. (Now that statement comes with our usual caveat that this volume number is a rough estimate, not a precise one - and it works well for and is intended for making comparisons between profiles, not for defining absolute volume numbers.)
ON the OTHER hand…..
There is a philosophical issue at stake here: Just what constitutes a color? Depending on how the numbers are encoded, you can have several million combinations of different numbers representing different colors in theory, but some will point out that these different number combinations do not constitute individual “colors” since they are not distinct enough to be *different* to the human vision system.
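ColorThink's "cubic Lab values" idea can be imitated crudely: convert RGB encodings to Lab and count how many distinct 1x1x1 Lab cells they land in. The sketch below uses the standard sRGB-to-Lab math (D65 white) and is a rough estimate, exactly as the CHROMIX caveat warns; the gray-axis count alone shows many encodings collapsing into each roughly-one-dE cell:

```python
import itertools

# Precompute sRGB linearization for all 8-bit code values.
LINEAR = [(c / 255) / 12.92 if (c / 255) <= 0.04045
          else (((c / 255) + 0.055) / 1.055) ** 2.4 for c in range(256)]

def srgb_to_lab(r8, g8, b8):
    """8-bit sRGB -> CIE L*a*b* (D65 white), via linear RGB and XYZ."""
    r, g, b = LINEAR[r8], LINEAR[g8], LINEAR[b8]
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) / 0.95047
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) / 1.08883
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = f(x), f(y), f(z)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# 1) The 256 neutral gray encodings land in only ~100 cells that are
#    one dE76 wide: several encodings per discernible gray.
grays = {round(srgb_to_lab(v, v, v)[0]) for v in range(256)}
print(len(grays), "distinct unit-L cells for 256 gray encodings")

# 2) A coarse sample of the whole cube, binned into 1x1x1 Lab cells
#    (each roughly one "cubic Lab value" in ColorThink's sense):
cells = {tuple(round(c) for c in srgb_to_lab(r, g, b))
         for r, g, b in itertools.product(range(0, 256, 8), repeat=3)}
print(len(cells), "occupied cells from a 32x32x32 sample")
```

Enumerating all 16.7 million encodings this way (slow, and still only an estimate) lands in well under a million unit cells for sRGB, which is why "number of encodings" and "number of discernible colors" are different things.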
Well, if we agree that color is not a physical property but a perception of our brain...
Isn't everything else in the universe as well?I do not believe so. But let's not go down that new rabbit hole of yours until we know if you get the concept of gamut by staying on topic. OK?
I do not believe so. But let's not go down that new rabbit hole of yours until we know if you get the concept of gamut by staying on topic. OK?
After 14 pages of this thread, and the previous 27, you still can not come up with a reasonable, succinct, plain-English definition of anything under discussion, let alone gamut.One you can understand? Not sure. If you spent as much time letting us know what you understand as going off topic, we might know if you have a clue about the subject or not.
Well, Andrew, I am your resident dummy, I am someone looking for "the crux of the facts," explainable in plain English, and in as few sentences as possible.You got the facts, explained in plain English. Reply #192 on: September 01, 2014, 01:47:12 PM. A suggestion was made (« Reply #196 on: September 01, 2014, 03:47:10 PM ») concerning the ability of the eye to distinguish two colors as different being limited in the yellows but better in the greens and blues, which was added in another post.
After 14 pages of this thread, and the previous 27, you still can not come up with a reasonable, succinct, plain-English definition of anything under discussion, let alone gamut.That you can apparently understand. You should move on then. This isn't a topic for a resident dummy, this video on the subject is apparently more your speed:
It took only 13 pages for our resident geeks to jump off the deepest end of the philosophy pool, into solipsism (http://en.wikipedia.org/wiki/Solipsism).
... Oh, when he mentions muffins, ask yourself if it's blueberry or yellow corn. And does that have an effect on the gamut?
Hi Slobodan,
Bringing in solipsism, which is not appropriate in the context because we are talking about a physical phenomenon common to all humans and other species, does not help the subject under discussion, of course IMHO.
Explaining a complex subject to laymen (not used as a qualifier of intelligence, just people not educated in this field) is not a simple task. The concepts used to describe the phenomena we perceive, e.g. as different colors, are also not straightforward (differences between individuals aside).
Even if we can pin-point (which is hard enough) that there is a certain minimum difference, say a delta-E of 1 unit, needed between two colors to perceive them as being different, we're not there yet. That's because it is a relative difference in a continuous volume. So if we use a coordinate to describe those colors, we can still add e.g. 0.1 delta-E to both colors which will thus still be perceivable as different, even if the change by 0.1 is too small to perceive.
That's even before we attempt to quantify/approximate gamut volumes, and differences between them.
Your curiosity would probably also not be fully satisfied by the scientific explanation either, because that is probably in the form of physiologically-relevant colour matching functions. Hence the difficult task of translation between the scientifically correct description and the more accessible form for normal human beings...
Of course it does!Maybe you are getting some of this despite your repeated need to go OT! Progress? Perhaps.
You see, if it is a yellow corn muffin, "the ability of the eye to distinguish two colors as different is limited in the yellows."
If it is a blueberry muffin, however, "the ability... is better in the greens and blues"
It wasn't me who brought solipsism into the debate, I just used a proper label for what (some of) you are saying.Oh, it was the other Slobodan « Reply #262 on: Today at 09:18:00 AM »
... According to some, at a certain level of abstraction, nothing is possible to define; colors do not exists, they can not be counted, differences can not be pinpointed, etc. ...
I appreciate the difficulty, the extent of which is exemplified in the 40+ pages so far, as you are apparently failing to reach an agreement among yourselves, let alone translate it to us "normal human beings."
I appreciate the difficulty, the extent of which is exemplified in the 40+ pages so far, as you are apparently failing to reach an agreement among yourselves, let alone translate it to us "normal human beings."We haven't failed to reach an agreement on a number of critical aspects of this discussion of gamut and color numbers. Just the opposite. That you state this illustrates you're not getting the subtle but important technical distinctions. And all of us are normal human beings. Not a single person here was born with an innate knowledge of this subject. Most of us worked quite hard to learn about it, and the 40+ pages you appear to look down on have continued to add to some of our understandings of the topic. Including you, one would hope. At one point you asked if there were colors in a gamut larger than sRGB that can be used for reproduction; you got an answer. You stated wider gamut means more colors, which hopefully you now know isn't the case. If wading through 40 pages to learn just those important facts on color is too much for you and what you learned wasn't worth the effort, move on.
... With a 24-bit encoding system, we can define 16.7 million color values. Does this equate to our ability to see 16.7 million colors?
By the way, I (the "normal human being" in Bart's terminology) am expecting an answer from you, the scientist, not the other way round.I'm not a scientist by training or profession.
But I think the answer is: no, we can not. Which leads then to the next question:If you understand what color is, and no, it isn't solipsism, the answer is no (the ones we can't see). Here's about as simple a sentence as I can come up with: if you can't see it, it's not a color. Can we count what we can see? Yes, but how is much of the recent debate.
- If we can not see them all, can we count or measure those we can, the discernible ones?
Could it be theoretically possible that we could discern more colors in a wider space, given that "the larger gamut will necessarily represent more distinct, humanly-perceptible colors than a smaller gamut."The given above isn't so, based on what I believe you are writing. The crux of much of the 20 odd pages here.
... The given above isn't so..
So, you are basically saying that the following sentence is not correct: "the larger gamut will necessarily represent more distinct, humanly-perceptible colors than a smaller gamut."That is the crux of the "does a wider gamut working space have more colors than the smaller one?" question. Colors have to be humanly perceptible to be considered colors and thus counted. And the recent debate is how they are counted as being humanly perceptible. Go back to Reply #192 on: September 01, 2014, 01:47:12 PM »
That is the crux...
Once again, are you saying that sentence is NOT correct?Yes. Further, it's akin to the analogy I made about the weight of New Mexico. As Graeme said, it's wrong to talk about number of colors.
Yes...
Well, if we agree that color is not a physical property but a perception of our brain, then it cannot possibly be described by a physical unit. The "volume" in this case is an abstraction.But it does have a component that can be measured, and we do this all the time in science. Back in the good old days I often used different spectrophotometers to make laboratory measurements in experiments that I was carrying out. I also often used fluorescent molecules to tag proteins, and the instrument made the measurement, not my brain (though I could see the color of the dye visually).
So, you are now saying that the sentence I quoted from YOUR quote from "the nice folks at CHROMIX" (post #261) is not correct!? You are now disputing your own quote!?No, I'm disputing their quote as you misunderstand it.
(Now that statement comes with our usual caveat that this volume number is a rough estimate, not a precise one - and it works well for and is intended for making comparisons between profiles, not for defining absolute volume numbers.)
ON the OTHER hand…..There is a philosophical issue at stake here: Just what constitutes a color?
That is the crux of the "does a wider gamut working space have more colors than the smaller one?" Colors have to be humanly-[perceptible] to be considered colors and thus counted.
I would love to see him give a 25-word, clear, simple explanation of General Relativity. ;)Or solipsism. ;D
Andrew, a few posts ago to Sandy I questioned the utility of gamut volume calculations. This whole "number of colors" thing is, IMHO, even less useful.Agreed. That's why several times here I've pasted Graeme's simple statement: therefore it's wrong to talk about number of colors.
Therefore, I think the challenge of the video is to get across the nature of color gamuts in three dimensions to people who are unsophisticated in envisioning 3D objects from looking at 2D displays (paper, computer monitor, TV, etc)Agreed. And that's why a video can work where a paper can't. We can show the gamuts in 3D, move them around etc. The only excuse for showing a 2D gamut is if you're limited to only two dimensions ;-)
No, I'm disputing their quote as you misunderstand it.
In your need to disagree to be disagreeable, you've decided to filter all the other points made here and a few even the CHROMIX email states (which is typical of you):
Further, what CHROMIX is talking about is gamut volume, NOT NUMBER OF COLORS. Please try to read and understand the points made before automatically deciding to disagree, go off topic, or try to be another color management comedian. You're not very good at any of them!
... I would love to see him give a 25-word, clear, simple explanation of General Relativity. ;)
You quoted them, without saying you disagree. You are only now disagreeing, when caught contradicting yourself.But I don't. I disagree with your understanding of what they wrote. Big difference.
I am not unaware of their caveat ("that this volume number is a rough estimate, not a precise one") and a rough estimate is good enough for me (and them, apparently). That's why neither I nor they use a "number" but a descriptor "more," which again is good enough for me. We do not know how much more "more" is, but more it is.Gamut volume isn't number of colors. Nor is it the number of encoded values of a pixel! Try connecting the dots.
Which also stands to reason: if there is the same number of theoretical colors in each space, it follows that wider spaces would have wider distances between "addressable locations," thus those wider distances would be easier discerned by the eye. The eye would have more trouble discerning different colors in a "squeezed" space like sRGB than in a wide space like ProPhoto RGB.It only stands to reason in your mind at this point of understanding the topic. The colorimetric distance between colors in a wider color space is farther apart than in a smaller space. Back to the half-inflated balloon with 16.7 million dots vs. a balloon twice the size. That doesn't mean (and it isn't true) that there are 16.7 million colors! It's foolish to assume encoding of color values equals colors!
My understanding is exactly the same as theirs, i.e., I am OK with a rough estimate, as they are. I did not, nor did they, use the word "number," just a more general descriptor "more," which again is fine by me.It has nothing to do with rough or fine estimates, you still don't get it. Your understanding isn't exactly the same as theirs. That's the problem.
Thus their sentence "the larger gamut will necessarily represent more distinct, humanly-perceptible colors than a smaller gamut." I accept as correct.You accept it not understanding much of what has transpired over the last pages, or just don't want to accept it. You accept it by ignoring what was written in the same email:
it works well for and is intended for making comparisons between profiles, not for defining absolute volume numbersYou continue to ignore what the gamut volume metric is designed to describe. Believe whatever you wish.
Depending on how the numbers are encoded, you can have several million combinations of different numbers representing different colors in theory, but some will point out that these different number combinations do not constitute individual “colors” since they are not distinct enough to be *different* to the human vision system.
I would love to see him give a 25-word, clear, simple explanation of General Relativity. ;)An apple falling from a tree hits Newton's head but not Einstein's.
- Could it be theoretically possible that we could discern more colors in a wider space, given that "the larger gamut will necessarily represent more distinct, humanly-perceptible colors than a smaller gamut."
You keep introducing non-discernible colors into a discussion about the discernible ones.And you keep taking a simple statement you don't really understand and trying to use it to prove something that doesn't make sense.
I think you got the crux of the problem right here Slobodan...
The problem is that it is quite difficult to define these volumes of humanly-perceptable colors in LAB space. Although it's in the right spirit, ∆E doesn't work.Exactly. And let's not forget (or let Slobodan forget) how Lab, and thus dE, and the original mathematical models on which they're based, came about in the first place: how the CIE came up with CIE XYZ (1931) to build the model. The tests involved showing groups of volunteers a sample color under very controlled conditions, whereby each subject adjusted the intensity of red, green, and blue lights until the mix of the three matched the sample color.
Subject: Re: Understanding Colour Gamuts
Date Received: Wednesday, November 25, 1998 2:06:53 PM
Date Sent: Wednesday, November 25, 1998 2:06:53 PM
From: Fred Bunting <fred_bunting@ls.com>
Cameron writes ... a lot of questions. Most of this is addressed in the
ColorShop Color Primer which I wrote but is currently available only as
part of the documentation for various X-Rite products <shameless plug>.
But I'll 'summarize'.
Actually, I am amazed at how well you seem to have 'guessed' the right
answers to most of your questions.
The XYZ space is derived from color matching studies using human test
volunteers. From this data, the CIE defined three imaginary primaries X,
Y, and Z that model the color matching characteristics of the average test
subject, (what the CIE calls the Standard Observer). By 'imaginary' I mean
that these primaries do not actually exist in nature, but are
mathematically derived from the experimental data. And by 'model' I mean
that we can predict when most people would consider two stimuli a 'match',
based on whether the two stimuli have the same X, Y, and Z values, i.e.
whether the two stimuli have the same location in XYZ space.
And yes, you are correct that this matching must be done under some
reference lighting condition. A light source defined spectrally (i.e. in
terms of the energy at each wavelength of the spectrum) is called an
'illuminant'. The CIE has outlined a system of Standard Illuminants,
labeled A, B, C, D, E, and F. The D and F illuminants are actually series
of illuminants, representing various 'daylight' and 'fluorescent'
illuminants respectively. The D illuminants include D50, D55, D65, and
D75. The number represents the 'correlated color temperature' (which, yes,
does vary for daylight depending on time of day and latitude). By far the
most commonly used illuminants are D50 and D65, corresponding approximately
to 5000K and 6500K respectively.
The large asymmetrical diagram you speak of is indeed a 2D plot of the CIE
XYZ color space. More correctly, it is a projection of the space onto a
plane. The axes of this plane are labeled simply (lower-case) x and y,
and the x and y values for a given color are known as the 'chromaticity
coordinates' for that color. The 'achromatic' (luminance) component of a
color can be represented simply by the (cap) Y value from XYZ ... another
clever feature of the way the CIE defined the X, Y, Z, primaries. Thus xyY
is often thought of as another 'space' for representing colors.
All of the above (XYZ, the xy-diagram, Standard Observer, and Standard
Illuminants) were all defined by the CIE in 1931.
CIELAB was 'recommended' by the CIE in 1976, to address a specific problem:
namely, while identical XYZ values could tell you when two stimuli would be
experienced as the same 'color' by most observers, it did not tell you how
'close' two colors were if they were not exactly the same XYZ. I.e. if you
wanted to know whether sample A was more closely approximated by sample B
or C, the distances between the three points, A, B, and C in XYZ space did
not give the right answer as judged by test subjects. A number of
industries were asking for a way to compute 'color distance'. To this end,
the CIE introduced L*a*b* (otherwise written, CIELAB), which was just a
mathematical warping of the XYZ space to make the distances correlate
better with the results of test subjects.
CIELUV, by the way, was also introduced in 1976, for essentially the same
purpose, and was a rival method of warping the space ... and the two spaces
developed their different adherents in different industries.
The purpose of both CIELAB and CIELUV was to define a color difference
metric, known as 'delta-E' that is simply the distance between two points
in the color space (in this case CIELAB or CIELUV, although several other
ways of computing delta-E have also been proposed both within and outside
of the CIE, since 1976).
That's the general scoop, now to specific questions:
I vaguely remember someone on this list telling us that some or all CIELAB steps were not equal and I now want to know why?
CIELAB was an *attempt* to create a space where equal steps correlated to
color 'closeness' as judged by test subjects. No one ... especially not
the CIE ... claims that CIELAB is perfect in this regard. For example,
most color scientists will tell you that CIELAB tends to overexaggerate
distances in the yellows, and underreport distances in the blues. It also
tends to 'bend hue lines', i.e. if you try to increase saturation only, by
moving directly away from white in CIELAB space, most test subjects
perceive a change in hue as well ... again most pronounced in the blues ...
which is why we often see the classic purple sky problem. People have to
remember that CIELAB was invented in 1976 long before its purpose as a
computational space for 1990s digital Color Management was even a dream.
As such it has done remarkably well, but it is not perfect, and this is a
big area of research both by color scientists, and companies working in
color management.
I am told that the eye sees different wavelengths and the mind see them as
different colours.
May I wax philosophic? One legacy we have from Mssr. Descartes is to
separate things where no separation really exists in nature. To me there
is no dividing line between the 'eye' and the 'brain' ... color
'processing' occurs right in the first three layers of neurons in the
retina, and right down optic nerve to various way-stations in the brain on
the way back to the visual cortex ... and processing doesn't stop there.
Whether there is a dividing line between the 'brain' and the 'mind' (as you
put it) is the classic mind-body problem through which Descartes (IMO)
completely screwed us up. ... But that's another thread. :-)
Does the reference light source contain equal `amounts` of each wavelength,
That is illuminant E, the 'equal energy' illuminant ... entirely
hypothetical as no such illuminant actually exists in nature. This is just
one of several reference illuminants.
Do the three cones in the eye respond to differing wavelengths or wavebands in the same or different ways, with
equal or varying efficiency?
Yes, each type of cone responds to different regions of the spectrum, and
each has a peak wavelength that it is most responsive to with this response
trailing off in both directions up and down the spectrum.
In short, what accounts for the asymmetry in the diagram, is it based in reality ...
It (the asymmetry of the xy-diagram) is based on mathematical manipulation
of experimental data. I.e. it is indeed based on psychophysics (measuring
psychological response to a physical stimulus), but you should not read too
much into the *degree* of asymmetry, as this may be exaggerated (or
diminished) by the mathematics.
... and is it [this asymmetry] necessary for colour management?
Well, the *asymmetry* of the xy-diagram is not necessary for color
management ... in fact it is a hindrance that the distances are distorted
... which is why most color management computations use CIELAB which
addresses this distortion. However, there are other features of the
diagram, such as the fact that it illustrates additive color quite nicely,
and offers a map of the entire visual system, which are useful in
explaining certain things like device gamuts.
Fred Bunting
X-Rite CTC
Well done Fred, well done!
But it does have a component that can be measured and we do this all the time in science. Back in the good old days I often used different spectrophotometers to make laboratory measurements in experiments that I was carrying out. I also often used fluorescent molecules to tag proteins and the instrument made the measurement, not my brain (though I could see the color of the dye visually).
Since then, thanks almost entirely to the LuLa resident gurus (Andrew, Jeff, Michael, and Eric Chan) I have learned some of the rudiments, and the results I get with calibrated monitor, reasonable printer profiles, and soft-proofing are quite generally satisfactory.
... If we're being truly practical about this, the whole question is not really important...
All I need to know, as a practicing photographer, is that Adobe RGB (or any other wider space) CAN reproduce more vivid colors if and when I need them and if the subject has them.The guy who stated that Adobe RGB (the wider color space) produced duller colors. Or that the two ends of each color space he was talking about (sRGB and Adobe RGB (1998)) are the same colors. And of course, that one of those color spaces had more colors than the other. He basically got just about everything wrong when talking about the two working spaces! All of that deserved ridicule.
The only reason I am pursuing clarification on the "more" issue, is that another practicing photographer, whose name shall not be mentioned here, was severely ridiculed for using it.
Isn't everything else in the universe as well?
It took only 13 pages for our resident geeks to jump off the deepest end of the philosophy pool, into solipsism (http://en.wikipedia.org/wiki/Solipsism).
Thanks for the opportunity, Eric!Then Einstein fails math as well, Slobodan. He used 28 words (and 28 is greater than -- or, as you might say, more than -- 25).
This is how Einstein himself explained: "Imagine one minute spent with a cute secretary... then imagine the same minute spent with your bare ass on a hot stove... which one is longer? That's relativity!"
Then Einstein fails math as well, Slobodan...
Of course he did.This is more of believing what one wants to believe, not necessarily the well-established facts.
...the original myth I hear over and over again I was hoping to address in a video (among other myths), namely this working space has more colors than that working space. And perhaps it doesn't have to be aimed at the beginner.
Andrew, I don't think it should be part of a discussion aimed at beginners at all.I agree. So the question is, should the video be aimed at beginners?
when you convert to Lab (and the integer values of Lab are 1 delta E apart)
This is more of believing what one wants to believe, not necessarily the well-established facts.
http://content.time.com/time/specials/packages/article/0,28804,1936731_1936743_1936758,00.html
http://www.todayifoundout.com/index.php/2011/12/albert-einstein-did-not-fail-at-mathematics-in-school/
http://www.andyborne.com/math/downloads/myth-buster-einstein.pdf
Back in 1905, Einstein had the biggest year of his life. He wrote, with the help of his wife, Mileva, five ground-breaking papers that, according to the Encyclopaedia Britannica "forever changed Man's view of the Universe"
with the help of his wife, Mileva Maric, to double check his work (she was a physicist and slightly more advanced than he in mathematics), he wrote four papers that changed the landscape of Physics:
If you look at the gamut plots of both, ProPhoto does show a wider area within the spectrum locus, but lots outside it too. It appears that CT doesn't take into account 'colors' that fall outside the very gamut plot CT provides to build the Gamut Volume metric (sorry for the 2D map):
(http://www.digitaldog.net/files/ProPhotovsBetaRGB.jpg)
I agree. So the question is, should the video be aimed at beginners?Well what you've been discussing in tedious, pedantic, solipsistic detail over the last eight pages is, to say the very least, so esoteric only experts need watch.
In LAB space, L* has 100 integer values onto which the 256 values from an 8-bit encoding are mapped (at least that's the way the TIFF spec does it). This means that an 8-bit image has steps smaller than ∆E along the L* axis, even if Photoshop is only reporting the integers.
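To put a number on that, here's a small sketch assuming the linear TIFF-style scaling described above (code 0 maps to L* 0, code 255 to L* 100); this is an illustration of the claim, not a reading of any particular file format's spec:

```python
# Map an 8-bit code value onto L* with a simple linear scale,
# as a sketch of the TIFF-style encoding described above.
def code_to_L(code: int) -> float:
    return code * 100.0 / 255.0

# The step between adjacent 8-bit codes along L*:
step = code_to_L(1) - code_to_L(0)
print(step)  # ~0.392 L* units, comfortably below 1 dE76 along the L* axis
```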
Why was I the only person that, correctly, answered Gary Fong's challenge to say whether a print from an Adobe RGB file would be better than from an sRGB file of the same subject? Yes, assuming there was colour in the subject that wasn't in sRGB. Just arguing about the pedantic definitions made everyone look as bad as he expected (hoped?).
Why was I the only person that, correctly, answered Gary Fong's challenge to say whether a print from an Adobe RGB file would be better than from an sRGB file of the same subject?
There seems to be a huge resistance here to simplify and state the obvious in accessible terms.
Well what you've been discussing in tedious, pedantic, solipsistic detail over the last eight pages is, to say the very least, so esoteric only experts need watch.Well that's one opinion and you are of course entitled to it.
There seems to be a huge resistance here to simplify and state the obvious in accessible terms.I don't agree any of it is obvious. Or tedious, pedantic, solipsistic, just the opposite. I believe I know a little bit about this subject and I've learned a lot over the last few pages. As the original poster, I'm happy I posed the questions I did and appreciate the replies of those who also found the subject worth discussing even debating.
Why was I the only person that, correctly, answered Gary Fong's challenge to say whether a print from an Adobe RGB file would be better than from an sRGB file of the same subject?I can only speak for myself. I refused to take Gary's bait based on his absolutely silly described testing methodology.
Just arguing about the pedantic definitions made everyone look as bad as he expected (hoped?).Again, that's your opinion; I frankly found it quite different. The person who continued to look silly was Gary (and anyone defending him), based on his flat-earth color theories.
You've all been arguing about the statement "Has Adobe RGB got more colours than sRGB?" Why? Just think of it as a typing error and take the S out.Gary doesn't deserve that honor! It's a statement that you can find all over the internet. Much like "all displays are 72dpi," "dynamic range is the same as bit depth," "all output should have a resolution of 300DPI," etc.
You get "AdobeRGB has more colour than sRGB" which is an easily understood way of explaining why we use wide gamut colourspaces.Adobe RGB doesn't have more color than sRGB. It has different colors. It can produce a more saturated color. Or, like Gary, you can mangle how to use that working space and get duller colors.
Andrew, you are going off-topic here, but since you opened the door, I'll play.Since it's my topic, I'm OK with that ;D
Thanks, by the way, for the provided links, as they prove my point, rather than debunk it.I never disputed anything about his wife's skills, but thanks for adding data points that are neither necessary nor up for dispute. The part about him failing math was very much up for dispute. Again, you seem to have difficulty separating these differing items out when going OT and disagreeing.
I wonder what the kinks in the PP lines are about...Yeah, that is quite odd! But I think at this point we can probably move away from using CT and its gamut volume metric to get to the bottom of the gamut's number of anything. And in defence of this fine product, the email I provided once again states clearly that it isn't intended for this analysis in the first place.
I agree. So the question is, should the video be aimed at beginners?
... Adobe RGB doesn't have more color than sRGB. It has different colors...
my 0.02c worth.Good points, thanks. It will help focus the presentation and I agree, it has to help folks make informed decisions. It will have to cover how to test this stuff too, from their own end. It doesn't matter what color space they pick, as long as they make the choice based on sound information and proper testing methodology.
Wow, thanks for taking the time to outline this Jim! I have a lot to think about and chew on. This may end up being a series, most likely more than one piece.
... A good discriminator for what to include is: does knowing this or that directly help one in making this or that workflow decision?...
Are you saying that, because they have different colors, Adobe RGB can not display colors that sRGB can?That isn't what I'm saying, no. The question you ask has been previously answered, and I believe more than once.
Ok, let's work with that. Are you saying that, because they have different colors, Adobe RGB can not display colors that sRGB can?
Again depends on how you are defining color. sRGB can encode finer gradations of color than AdobeRGB can. Several pages back (http://www.luminous-landscape.com/forum/index.php?topic=93004.msg757332#msg757332) I gave an example. In that example sRGB (1, 255, 240) and (2, 255, 240) both map to the same AdobeRGB color.
If you are defining color differences purely in terms of visual distinctions, then these are the same color (at least to my eyes). But if you are going by the numbers, the distinction between these two color values is one AdobeRGB can't make but sRGB can.
Increase bit depth and you will be able to do it. This is an encoding problem, not a limitation of the color space per se, but I think we have covered that before
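Mark's collision can be checked numerically. Below is a rough sketch using the published D65 RGB-to-XYZ matrices for sRGB and Adobe RGB (1998); the rounding to 8 bits is mine, so a tool like ColorThink may differ in the last digit, but the collapse of the two sRGB values into one Adobe RGB value survives:

```python
import numpy as np

# RGB -> XYZ matrices (both D65, so no chromatic adaptation needed).
SRGB_TO_XYZ = np.array([[0.4124564, 0.3575761, 0.1804375],
                        [0.2126729, 0.7151522, 0.0721750],
                        [0.0193339, 0.1191920, 0.9503041]])
ADOBE_TO_XYZ = np.array([[0.5767309, 0.1855540, 0.1881852],
                         [0.2973769, 0.6273491, 0.0752741],
                         [0.0270343, 0.0706872, 0.9911085]])

def srgb_decode(v8):
    """8-bit sRGB code values -> linear light (piecewise sRGB curve)."""
    v = np.asarray(v8, dtype=float) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def to_adobe8(srgb8):
    """8-bit sRGB triplet -> 8-bit Adobe RGB (1998) triplet."""
    xyz = SRGB_TO_XYZ @ srgb_decode(srgb8)
    lin = np.clip(np.linalg.inv(ADOBE_TO_XYZ) @ xyz, 0.0, 1.0)
    # Adobe RGB uses a simple power-law encoding (gamma 563/256).
    return tuple(int(round(c)) for c in 255.0 * lin ** (1 / 2.19921875))

print(to_adobe8([1, 255, 240]))  # (144, 255, 240)
print(to_adobe8([2, 255, 240]))  # (144, 255, 240): the distinction is lost
```

Note the green channel actually lands just above 1.0 in linear Adobe RGB and gets clipped, which is part of why the two inputs merge.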
Increase bit depth and you will be able to do it.Not in terms of Mark's example. Take 16-bit sRGB: (1, 255, 240) and (2, 255, 240) both map to the same AdobeRGB color, 144/255/240.
Not in terms of Mark's example. Take 16-bit sRGB: (1, 255, 240) and (2, 255, 240) both map to the same AdobeRGB color, 144/255/240.
Since Photoshop reports 8-bit values even for 16-bit images, we can't really tell by looking at the Info palette because it's reporting rounded values back to us.FWIW:
In 16-bit, using the example, ColorThink reports them as 144.1/255.0/240.2 and 144.2/255/240.2.
In 8-bit, using the example, ColorThink reports them as 144.0/255.0/241.0 for both.
Save both out as a color list for CT for its dE report. One is 0.06 dE, the other 0.24. That's using dE2000. As such, I think we have to agree, they are the same color.
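For anyone following along, dE76 is nothing more than Euclidean distance in Lab (dE2000 is far more involved, with separate lightness, chroma, and hue weighting). A minimal sketch of the dE76 case:

```python
import math

def delta_e76(lab1, lab2):
    """CIE 1976 color difference: straight Euclidean distance in L*a*b*."""
    return math.dist(lab1, lab2)

# Two Lab values roughly one 'just noticeable difference' apart on a*:
print(delta_e76((50.0, 0.0, 0.0), (50.0, 1.0, 0.0)))  # 1.0
```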
So it does make the distinction in 16 bits. That's a lot of angles on this particular pin.Yes indeed. But Photoshop can't give us the dE differences, so we can't suggest there is a color difference we can see (and call them two colors). More proof that this 'color number' rabbit hole is messy.
Yes indeed. But Photoshop can't give us the dE differences, so we can't suggest there is a color difference we can see (and call them two colors). More proof that this 'color number' rabbit hole is messy.
Here's a fun one too. Take ProPhoto RGB and mess with highly saturated values like 255/0/0 and 250/9/9 and convert them to Adobe RGB, let alone sRGB. Same values result. Perhaps that's why we see blobs of colors (if that's a fair term), a lack of color detail, as I see on my prints from the Gamut Test file going from ProPhoto to Epson RGB vs. sRGB to Epson RGB.
I can only speak for myself. I refused to take Gary's bait based on his absolutely silly described testing methodology.In the process of that flounce, you just confirmed his prejudices.
Again, that's your opinion, I frankly found it quite different. The person who continued to look silly was Gary (and anyone defending him) based on his flat-earth color theories.If you weren't so personally involved and looked at that thread independently, you'd see a lot of people played into his hands by refusing to agree on anything, thus proving his point.
Adobe RGB doesn't have more color than sRGB. It has different colors.<sigh> come on, look at 3D gamut plots. There's virtually no colour range of any significance that is only in sRGB. For all practical purposes Adobe RGB encompasses sRGB.
I would say you were correct that an AdobeRGB file can make a better print, but only got lucky that his test agreed with you.Everyone that refused to make a prediction just played into his hands by refusing to offer the most basic simple advice.
Everyone that refused to make a prediction just played into his hands by refusing to offer the most basic simple advice.
We use wide gamut colourspaces to make better prints. Is it really so hard to agree on such a basic concept ?
I would say that we use color management to make better prints and yes, by using wide gamut you may be able to print more saturated colors in some areas.
The issue with making a prediction is that it was not evident that the proponent of the test knew or used color management properly.
Sandy, this calculation makes sense on a mathematical level, and, although I could quibble with the terminology, I don't find it too much of a stretch. Whether CIEL*a*b* is the right metric is also open to debate, but it has the advantage of familiarity.
[...]
Image 3 shows the resulting black image from the difference, which means both are equivalent.
I looked closely and there were minor differences, barely perceptible. Image 4 shows the differences after applying a ridiculously aggressive curve to make them extreme.
Well, this was using 8 bits. I repeated the test with 16 bits and the differences were almost nil even with the extreme curve applied.
Initially I thought it would not work, so I was wrong. In any case, I question the usefulness of such a method, since I cannot see what edit I can possibly perform on such a desaturated and dull image other than cropping or straightening a crooked horizon.
FWIW:
In 16-bit, using the example, ColorThink reports them as 144.1/255.0/240.2 and 144.2/255/240.2.
In 8-bit, using the example, ColorThink reports them as 144.0/255.0/241.0 for both.
Save both out as a color list for CT for its dE report. One is 0.06 dE, the other 0.24. That's using dE2000. As such, I think we have to agree, they are the same color.
I'd like to share another experiment I did with Bruce Lindbloom's RGB image with all possible values in 8 bit.
They are the same color only when considered as what I like to call "color for color's sake", rather than as "color in context" of a complex arrangement of colors and tones (i.e. a real photographic image, film or digital) and only when meeting other important requirements of the CIELAB color model, namely using a standardized illuminant, subtending a defined viewing angle (i.e. the 2 degree or the 10 degree observer), and when presented against a neutral gray surrounding scene.I agree, but further, I'd prefer to say we have two different color values rather than two different colors. Both examples fall well below a dE of 1, so color in context or not, can we call them two different colors? We can say they define two different color values. We can say we are able to define 16.7 million color values, but we can't see them all.
Thus, it all really boils down to "use aRGB or Prophoto RGB" when you need to preserve higher levels of color saturation in the image than can be properly encoded in the sRGB color space.Short, sweet and concise!
There is no technical advantage to the "bigger" RGB working spaces except in their ability to encode higher color saturation.Agreed.
How do you know when a color space is "too small" to encode your chosen image color saturation values correctly? One fairly straight-forward way is to use the histogram function in Lightroom, for example, to see if there is R, G, and/or B channel clipping and if it goes away when choosing a different "bigger" RGB color space.Indeed and I illustrate that using the 'boat image' in my Gamut video. Clipping in sRGB disappears when one toggles to ProPhoto RGB.
Quote: There is no technical advantage to the "bigger" RGB working spaces except in their ability to encode higher color saturation.Agreed.
It seems that larger spaces like ProPhotoRGB extend above smaller spaces like sRGB on L* axis suggesting they can encode both more saturated colors as well as lighter colors.I'm not seeing that viewing the two in 3D in ColorThink. Unless I'm misunderstanding what you are saying or how I'm viewing. At least at L*100. But there IS a difference as the larger space moves down from that point.
sRGB's red primary at [255, 0, 0] has a LAB value of 54, 81, 70. ProPhoto will happily accommodate 60, 81, 70 which is out of the sRGB gamut.I get ProPhoto showing a LAB value of 61/128/105 for 255/0/0 so yes, big difference.
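Those Lab readouts can be sanity-checked. Here's a sketch using the D50 Bradford-adapted sRGB matrix that ICC-style tools (Photoshop among them) use for their Lab readouts; matrix and white point are the commonly published values, and the rounding is mine:

```python
import numpy as np

# sRGB -> XYZ, Bradford-adapted to D50 (the ICC profile connection space).
SRGB_TO_XYZ_D50 = np.array([[0.4360747, 0.3850649, 0.1430804],
                            [0.2225045, 0.7168786, 0.0606169],
                            [0.0139322, 0.0971045, 0.7141733]])
D50_WHITE = np.array([0.96422, 1.0, 0.82521])

def srgb_to_lab(v8):
    """8-bit sRGB triplet -> CIELAB (D50), via linear light and XYZ."""
    v = np.asarray(v8, dtype=float) / 255.0
    lin = np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)
    xyz = SRGB_TO_XYZ_D50 @ lin
    t = xyz / D50_WHITE
    f = np.where(t > 216 / 24389, np.cbrt(t), (24389 / 27 * t + 16) / 116)
    return 116 * f[1] - 16, 500 * (f[0] - f[1]), 200 * (f[1] - f[2])

L, a, b = srgb_to_lab([255, 0, 0])
print(round(L), round(a), round(b))  # 54 81 70, matching the quoted readout
```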
Agreed.
Is that really true? It seems that larger spaces like ProPhotoRGB extend above smaller spaces like sRGB on L* axis suggesting they can encode both more saturated colors as well as lighter colors.
LAB is intended for reflected not emitted light and 100 means a 100 % perfectly reflecting and diffusing surface. While we could encode, say, L =100, a* = 1 or higher, b* =1 or higher, it would be an imaginary color since LAB 100,0,0 is defined as reflecting 100% of all light falling on the surface hence hue is undefined, so no chroma either. Brightness is another visual phenomenon, similar to lightness, but not to be confused or used interchangeably with lightness. We can and do perceive pure white specular highlights in a reflection print as being brighter in appearance than the brightness of the diffuse media white point, but the L* value of that highlight if measured would be the same as the media whitepoint even though in the image it looks brighter.
What people do gain from aRGB or ProPhotoRGB is more headroom for color saturation in color values that are very close to L* =100 but not at 100. Same as they do in mid tone and shadow areas as well.
Yes, that's what I was thinking. Or to put another way:
sRGB's red primary at [255, 0, 0] has a LAB value of 54, 81, 70. ProPhoto will happily accommodate 60, 81, 70 which is out of the sRGB gamut.
Yes, but you can produce a red value at L=60 in sRGB as well, and with the same hue, using a different RGB triplet as the starting value. It's just that the color you assign won't be as saturated.
Jim, I was thinking of starting with the results and working backwards.
...I was thinking of starting with the results and working backwards...
Or you can have the same hue and saturation, it just won't be as bright, which is another way of saying a larger gamut buys you volume on every dimension, not just saturation.
Francisco, I found this a while ago, but your post reminded me. Thought you would enjoy it in case you get tired of the Lindbloom image: http://allrgb.com
I understand what you mean, but I would like to specify (sorry for being pedantic)
- in a cylindrical coordinate system, such as Lch, the difference between gamuts is in one dimension only or "c" (chroma). E.g you have two cans of soda of the same height but different diameters
- in an orthogonal coordinate system such as Lab, the change is in two dimensions, "a" and "b" (the base of the parallelepiped) e.g you have two cartons of milk the same height but different bases
- in the RGB coordinate system, the difference is in all three dimensions. This is more difficult to visualize in a way that is meaningful to color
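The relationship between the two coordinate systems above is just a polar/Cartesian change of variables. A quick sketch:

```python
import math

def lab_to_lch(L, a, b):
    """Cartesian Lab -> cylindrical LCh: chroma is the radius in the
    a*b* plane, hue is the angle around the L* axis."""
    return L, math.hypot(a, b), math.degrees(math.atan2(b, a)) % 360

# Same lightness and hue, different chroma: the single axis along which
# a can of larger diameter (the wider gamut) extends.
print(lab_to_lch(50.0, 30.0, 40.0))   # chroma 50
print(lab_to_lch(50.0, 60.0, 80.0))   # chroma 100, identical hue angle
```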
... That is, show the two prints I made using the Gamut Print Test image from ProPhoto RGB and sRGB output to my 3880.Andrew, FWIW I have been following the thread with interest, and although I have to admit that some of the deeper discussion made my head spin, what you are proposing sounds really promising to dispel some of the myths surrounding this complex subject. The real-world examples illustrating the difference will, I am sure, stimulate interest, and I look forward to seeing your work.
... I'd like the viewer to get an idea what to look for and suggest they do their own testing with the file I provide.
... I could show and explain the Assign Profile command and illustrate how that works and how colors 'go dull' when mistreated.
... I'm hoping if I scan or shoot the two prints side by side and show them up close, the first take home will be, there IS a difference in the two working spaces all things being equal.
... I'm thinking that showing how the rubber hits the road first will hook some of the viewers into sticking around for the 'whys' of this improvement in output.
... Of course I'm still running this around in my head. But I'm thinking for most audiences, if they are interested in the practical advantages of a wide gamut working space over a narrow one, from raw, show them the output first. Then explain how to do the testing on their own, then explain what's happening.
Francisco, a cylindrical coordinate system is a three-dimensional system. In Lab's case, the longitudinal axis is L*, the azimuth is hue angle, and the radius is chroma. A three-dimensional surface representing the gamut of a device or working space can be specified by enumerating points which define the surface, just as in RGB or the Cartesian form of Lab, whose axes are L*, a*, and b*.
http://en.wikipedia.org/wiki/Cylindrical_coordinate_system
Or perhaps you mean something else by "dimension", in which case, please enlighten me.
Thanks,
Jim
I might be using the wrong terms, but I think we are on the same page. I should have said axis instead of dimensions
Having said that, unless I'm mistaken, Lab is not a cylindrical coordinate system but a Cartesian one. The cylindrical system is Lch, which corresponds to L*, chroma and hue.
What I tried to explain in my previous post is that, depending on which coordinate system you use, cylindrical or Cartesian, the difference between a larger gamut such as ProPhoto RGB and a smaller one such as sRGB is in one axis (c in Lch) or two axes (a and b in Lab), meaning that L does not vary.
Another way of making your point might be to say: "Chroma may be calculated using one dimension (or axis) of Lch, two of Lab, and all three of RGB." Is that right?
Jim
- in a cylindrical coordinate system, such as Lch, the difference between gamuts is in one dimension only or "c" (chroma). E.g you have two cans of soda of the same height but different diameters
Or you can have the same hue and saturation, it just won't be as bright, which is another way of saying a larger gamut buys you volume on every dimension, not just saturation.
But I think I now understand the point you're making — correct me if I'm wrong: while a larger space extends the upper limit of saturation it doesn't extend the limit on the L* axis. To put it another way, if I have an sRGB color with 100% saturation, ProPhoto RGB might allow a color that is essentially 110% saturation in sRGB, but neither space can exceed 100% L*. I don't think anyone can disagree with that.
My point was that at a particular chromaticity, larger spaces allow higher lightness for that particular chromaticity.
Hi Andrew,
There are many other finer points in your discussion but I thought I’d respond to this last one and see if that clears any of them up or inflames them or….
A long time ago I started to clearly differentiate between color values and device values - as you said above it’s semantics but it helps me be clear about the differences.
So, in that terminology, a few elementary points:
- color values refer to human perception and specifically to colorimetry.
- Lab, Luv, XYZ, Yxy, etc are all color values. delta-E refers to differences in color values.
- spectral values are not technically color values but can obviously be converted into color values so I often treat them as synonymous even though technically they aren’t
- device values are *not* color values and only become color if sent to a device or come from the device. They can be correlated with color values using an ICC profile or in a measurement file of some sort.
So…. (again, using the device vs. color semantics)
- AdobeRGB and sRGB have the same number of device values (in either 8 or 16 bit) but *not* the same number of color values. AdobeRGB can hold more *unique* color values (more below). Another way of thinking about it is to say that if you graphed sRGB and aRGB in 3D RGB space, they would be the same size. But when graphed in 3D color space, aRGB is larger.
- gamut volumes of printing devices also have the same number of device values but different color volumes
- gamut volumes in ColorThink are calculated using color values and refer to the number of unique color values (again, more below)
- unique color values in ColorThink refer to the number of distinct colors (as per human vision) that exist in a color gamut or a color image or color list or whatever.
- In color gamuts we just use cubic Lab values, the idea being that one delta-E76 value away from a color *should* result in a perceptually different color (more on *that* below)
- in color images/lists we roughly round color values when counting unique colors to make two very similar colors (less than 1 dE) become the *same* color before counting the number of unique colors. Note: this is *not* the same thing as unique Lab values, which also has value for other purposes. The process is rough, for speed and doesn’t use delta-E but you get the gist.
- the idea behind these calculations was to provide rough approximations to allow for the comparison of color gamuts, image conversions, etc as well as to reduce the number of duplicate (or near-duplicate) color points when graphing image colors in 3D.
- it could certainly be argued that cubic Lab values and delta-E76 calculations do not represent unique color values. Only if Lab were perceptually uniform as it was intended would this be true. Delta-E 2000 and other differencing equations are attempts to correct for this non-uniformity and might lead us to better ways of calculating unique color values (but they might not, as well)
- the problems with using dE2000 equations for such purposes are that there is no color space defined by (or corrected by) dE2000, it is computationally expensive (29 separate equations for each color difference), and people still don't think it's the best representation of color perception - only better.
- with this in mind I decided to use Lab and dE76 for ColorThink’s initial foray into these types of calculations. Then at least the math would be reasonable and we could all agree on how it might be done, even if we also all agreed that Lab was not the best space for such things and we should pursue better things going forward.
So, if you look at the unique color values that ColorThink gives for your images in sRGB vs aRGB it makes sense that there are fewer in sRGB. ColorThink likely coalesces some sRGB values together as they are bunched tighter together.
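A rough numeric sketch of Steve's "same device values, different unique color values" point (my own illustration, not ColorThink's actual algorithm; the matrices are the commonly published sRGB and Adobe RGB (1998) primaries, and the sRGB transfer curve is simplified to a plain 2.2 power): take the same grid of RGB device values in both spaces, convert each to Lab, snap to 1×1×1 Lab cubes (roughly 1 dE76), and count how many distinct cubes survive.

```python
import numpy as np

# Commonly published RGB -> XYZ (D65) approximations, rounded to 4 places.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2974, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
D65 = np.array([0.9505, 1.0, 1.089])

def xyz_to_lab(xyz):
    # Standard CIE Lab formulas with the (6/29)^3 linear-segment cutoff.
    t = xyz / D65
    f = np.where(t > (6/29)**3, np.cbrt(t), t / (3 * (6/29)**2) + 4/29)
    L = 116 * f[:, 1] - 16
    a = 500 * (f[:, 0] - f[:, 1])
    b = 200 * (f[:, 1] - f[:, 2])
    return np.stack([L, a, b], axis=1)

def unique_lab_cubes(matrix, gamma, steps=128):
    # Identical device-value grid for either space.
    v = np.linspace(0.0, 1.0, steps)
    rgb = np.stack(np.meshgrid(v, v, v), -1).reshape(-1, 3)
    # Linearize (simple power law here), then to XYZ, then Lab.
    lab = xyz_to_lab((rgb ** gamma) @ matrix.T)
    cubes = np.round(lab).astype(int)   # snap to ~1 dE76 cubes
    return len(np.unique(cubes, axis=0))

# Same number of device values in, different counts of distinct ~1 dE colors out:
n_srgb = unique_lab_cubes(SRGB_TO_XYZ, 2.2)
n_adobe = unique_lab_cubes(ADOBE_TO_XYZ, 563/256)
print(n_srgb, n_adobe)   # Adobe RGB typically occupies more cubes
```

Because Adobe RGB spreads the same device grid over a larger Lab volume, fewer grid points collide into the same cube, which is exactly the "coalescing" Steve describes for sRGB.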
Does this make sense? At the very least it’s been what I’ve been thinking and developing toward over the years.
Feel free to quote me on the LL list if you want.
regards,
Steve
A little off topic, but I'm curious about this. I've never seen a gamut plot in this coordinate space. It's clear the space would be perfectly cylindrical, but do the gamuts remain cylindrical in the space as well? I have seen gamuts plotted in HSB, but that's just a device-dependent transformation of RGB — not so useful for comparing gamuts. Do you have an example of a color space plotted in Lch? Is the L the same calculation as L* in LAB?
Going back to ColorThink, I got this useful email from Steve Upton. He said it was OK to post it. I'm still working out the details in my brain!
A gamut plotted in Lch will look exactly the same as the same gamut plotted in Lab. The underlying triplets describing each point will be different. Yes, the L* axis is the same.
Jim
So when somebody talks about Lch they are just talking about the transforms from LAB we find in sources like Wyszecki (i.e. h = arctan(b*/a*)) etc.?
These calculations just describe the same points in terms of angle and distance from the origin. In that case the plots wouldn't look like soda cans. So I guess I missed Francisco's point.
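Jim's answer can be shown in a few lines: LCh is just the polar form of the a*, b* plane, so converting back and forth changes the coordinate triplet but not the point being described (a sketch; the helper names are my own):

```python
import math

def lab_to_lch(L, a, b):
    # C* is the distance from the neutral axis, h the hue angle in degrees.
    C = math.hypot(a, b)
    h = math.degrees(math.atan2(b, a)) % 360
    return L, C, h

def lch_to_lab(L, C, h):
    # Inverse: back from polar to Cartesian a*, b*. L* is untouched either way.
    return L, C * math.cos(math.radians(h)), C * math.sin(math.radians(h))

L, C, h = lab_to_lch(52.0, 40.0, -30.0)
print(C, h)                      # C* = 50.0, h just below 327 degrees
print(lch_to_lab(L, C, h))       # round-trips to (52.0, 40.0, -30.0)
```

Since the points themselves never move, a gamut solid plotted in LCh looks identical to the same solid in Lab, which is why the plots are not cylinders.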
Quote: A long time ago I started to clearly differentiate between color values and device values - as you said above it’s semantics but it helps me be clear about the differences.
Like the formulae on page 168 of the second edition? Yep.
...I'm still working out the details in my brain!
Just make sure you process properly the following from Steve:
I'm sure it works for you; I'm not buying into it yet. I'm still talking with Steve about this. Case in point.
"AdobeRGB can hold MORE *unique* color values... the number of distinct colors (as per human vision)... than sRGB"
Works for me, though.
On Sep 4, 2014, at 5:13 PM, Steve Upton <upton@chromix.com> wrote:
I’d recommend moving from “color value like R89/G87/B255” as you mention above to “device value” - but again that’s my taste.
I'm OK with that once I get all the various flavors figured out. For example, if we are talking about an RGB working space and values, R89/G87/B255 as the example, we can call that a device value? Considering they are based on devices, that's probably kosher.
It would be useful to have a group of names with color behind them to define what we mean.
Color alone I suspect should be what we perceive.
Device Value as you point out has something to do with a device (and based on my question above, perhaps a working space).
Color Values as you point out refer to perception and specifically to colorimetry.
The confusion is what do we call a value, device or otherwise that isn't perceivable? Color Values doesn't seem kosher if again, the definition is based on perception.
Andrew
I do like what Steve suggests, using Device Value for, well devices. I'm still not clear about Color Values however, certainly not values we can't see as a color.
I don't think we should use Device Value for colorimetric working spaces, even if they are nominally based on (idealized) physical devices. I think that term, if we don't use Colorant Value, should be reserved for the data sent to physical devices or their drivers.
I'm OK since they are based on a device (a theoretical one) and I'd hate to have to use another term just for those cases.
Andrew Rodney:The confusion is what do we call a value, device or otherwise that isn't perceivable? Color Values doesn't seem kosher if again, the definition is based on perception.
SU: well, that’s another good point but I think it’s really just a side effect of assigning numbers to colors.
As soon as you do that you realize that numbers have a habit of going on forever and color perception certainly doesn’t. So we end up with a 3D system where a bunch of the numbers clustered near the center correspond to human perception and if you go too far out then they no longer do.
I think it really only comes up in our world as a result of ProPhoto having an imaginary blue value. It makes sense for the reason that it was used, but it does mean that it should probably only be used by advanced users. Certain things, like making a synthetic gradient from 0,0,255 to 255,255,255, mean that some of the gradient is “outside human gamut” and should not be expected to look right or convert to other spaces in a pleasing way. Those who expect it to are probably not ready for ProPhoto.
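Steve's "imaginary blue" remark can be checked directly from the commonly published ProPhoto RGB to XYZ (D50) matrix (an assumption here, rounded to four places, so treat the result as approximate): the blue primary's chromaticity lands at roughly x = 0.037, y = 0.0001, essentially zero luminance, outside anything human vision can produce.

```python
import numpy as np

# Commonly published ProPhoto RGB -> XYZ (D50) matrix, rounded to 4 places.
PROPHOTO_TO_XYZ = np.array([[0.7977, 0.1352, 0.0313],
                            [0.2880, 0.7119, 0.0001],
                            [0.0000, 0.0000, 0.8249]])

# Device blue (0, 0, 255) maps to the third matrix column.
X, Y, Z = PROPHOTO_TO_XYZ @ np.array([0.0, 0.0, 1.0])
x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(f"blue primary chromaticity: x={x:.4f}, y={y:.5f}")  # y is ~0.0001
```

A y chromaticity that close to zero means the primary carries essentially no luminance, which no real (visible) light can do; any gradient anchored on that corner passes through non-colors on its way to white.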
Oh, and also, ColorThink does *not* take this into consideration when calculating gamut volume. It could if we calculated the portion of the ProPhoto gamut that was outside of human perception and subtracted it from the total volume. BUT that would require that we had good, solid data on the edges of human perception and that’s not something I’ve been able to track down either….
Steve
I'm OK since they are based on a device (a theoretical one) and I'd hate to have to use another term just for those cases.
Here's more from Steve:
What Steve is saying is what Graeme said I believe: This is a side effect of assigning numbers to colors.
So I still think the question "does Adobe RGB have more colors than sRGB" can't be answered and it's the wrong question. If we talk about device values, they are the same, Steve has said that. Nearly everyone here agrees on that as well. So Adobe RGB and sRGB have the same number of colors (based on encoding) or using Steve's usage, they have the same device values.
This is a side effect of assigning numbers to colors.
So when somebody talks about Lch they are just talking about the transforms from LAB we find in sources like Wyszecki (i.e. h = arctan(b*/a*)) etc.? These calculations just describe the same points in terms of angle and distance from the origin. In that case the plots wouldn't look like soda cans. So I guess I missed Francisco's point.
Francisco, there are different Delta E functions, but almost all are a square root of squares (of real numbers). So the function is always positive, hence you can never get back to the point of origin.
Sandy
I remained a little confused by this explanation and looked for info about deltaE. It turns out that what you refer to as the square root of squares of real numbers, which is always positive, is nothing more than the simple equation for calculating the "Euclidean" distance in a Cartesian, orthogonal three-dimensional space such as Lab.
I understand the issues about Lab not being really perceptually uniform, the refinements made to the deltaE formula, and that we cannot consider the gamut volume as an exact representation of the number of colors, but I think your explanation is not correct.
Regards
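Francisco's point in code form: delta-E 76 is nothing more than Euclidean distance between two Lab points, which is also why it is always non-negative (a minimal sketch; the function name is my own):

```python
import math

def delta_e76(lab1, lab2):
    # Plain Euclidean distance in Lab: sqrt(dL^2 + da^2 + db^2).
    return math.dist(lab1, lab2)

# sqrt(2^2 + 4^2 + 3^2) = sqrt(29), about 5.385:
print(delta_e76((50, 10, -20), (52, 14, -23)))
```

Being a norm, the result can never be negative; that property alone says nothing about whether Lab is perceptually uniform, which is the separate issue dE2000 tries to address.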
The problems only happen when you conflate color stimuli with the subjective internal experience of perceiving color. If you don't do that, the problems go away and the answer to how many colors can AdobeRGB contain is: AdobeRGB doesn't contain colors — it is a measure of color stimulus which is continuous, not discrete.
Perfectly true, and it does help clarify the discussion here, but it really just kicks the problem down the road. The fundamental question can be phrased a little more carefully: "How many perceivably different colors can be described by AdobeRGB (or sRGB) stimulus values?"
I've mentioned it in passing, and it seems nitpick-y, but in context I think it's worth saying again. We are not assigning numbers to colors. We are assigning numbers to stimuli. That's why the numbers are often referred to as tristimulus values.
It's a very important distinction. We have all this light bouncing around at various wavelengths. We can measure this stimulus with tools that allow much finer precision than we can see. We can take these measurements, manipulate them with equations, then send the numbers to a device that will output some other stimulus. Colorimetry will tell us if these two stimuli — the input and the output — should match under certain conditions even when the spectral components are not the same. If we call these what they are: stimulus values or tristimulus values in the specific case, we will be in agreement with the existing literature and avoid the complications of calling them colors. Colors (as defined by subjective experience) can't be measured, color stimuli can. There's nothing complicated or philosophically problematic with any of this. The stimuli are properties of the outside world, which we can measure and reproduce with as much precision as current technology allows. This is reflected in the continuous nature of the math. The problems only happen when you conflate color stimuli with the subjective internal experience of perceiving color. If you don't do that, the problems go away and the answer to how many colors can AdobeRGB contain is: AdobeRGB doesn't contain colors — it is a measure of color stimulus which is continuous, not discrete.
But perhaps I miss your point. We have exchanged the color scientist's secret handshake -- W&S -- so I have great respect for your opinions. ;)
Francisco, my answer about delta E wasn't really intended as a part of my explanation; it was just in the nature of a direct answer to the question you raised. For the purposes of the explanation that I was trying to make, whether or not the function is conservative doesn't matter. (Perhaps it should, but that's not part of what I was saying).
Sandy
You might then say, yes, but colorimetry also gives us color difference formulas, and we can use those to ask questions such as: given a set of values like the 16.7M 8-bit numbers and a known threshold, how many of those colors fall within this threshold and have a difference smaller than the average person can perceive? You can also ask in a space like ProPhotoRGB how many fall outside the spectral locus? These are easy-to-understand questions and easy to answer.
I'm with you Mark. Especially after my useful conversation with Steve, where he said: Color values refer to human perception and specifically to colorimetry.
Sandy,
On second thoughts it seems to me the issue that led to the discussion is a matter of the way the terms are used (nomenclature).
The term deltaE is being used for two different things, and this leads to confusion.
One use, which I think is the correct one, is the "distance" between perceivable colors.
The second use is to refer to deltaE as the unit of the axes in the Lab space (at least in CIE 76). We could say that the units of those axes are one deltaE apart, but it is not technically correct to call the unit of the axes deltaE.
It would be equivalent to measuring physical distances in meters and, instead of calling the axis units meters and volumes cubic meters, calling the axis units "distances" and volumes "cubic distances".
Naming the volume of a color gamut cubic deltaE would be like calling volumes of physical objects cubic distances.
Since the distance is deltaE, could the unit be "E"?
Regards
- working spaces as virtual devices. I say why not? From a workflow or color calculation point of view why should I care if the ICC profile I'm using corresponds to a real device, an averaged set of devices, or an idealized device? The RGB values still mean the same thing, images convert the same way, etc. The best qualities of idealized devices are that we can make them perfect (gray balanced, no weird issues introduced by measurement flaws) and we can arbitrarily set their primary colors depending on our needs.
While I appreciate that, my copy of W&S is an indication of aspiration rather than knowledge. I'm just a photographer (and hobbyist programmer) — definitely not a color scientist.
In that case, can we agree that colorimetrically characterized virtual emissive displays -- like sRGB & Adobe RGB -- are decidedly on the "color" side of the wall, not the "device" side?
The one in my living room certainly is! Less room for furniture.
But grand pianos is not actually a measure of volume. ;D
This thread has totally flushed my non-nerd-like brain of all color space understanding. PLEASE, someone just look at my simple work flow chart and tell me how far right or wrong it is.
I don't see anything wrong with it per se. It doesn't address a slew of questions we've been trying to nail down.
Also, I'm digging the geek level of this group. As a dog-eared Wyszecki & Stiles owner, I appreciate just how deep the rabbit hole can go.
Desirable characteristics for Device-Independent Interchange Color Spaces
A device-independent color space should see colors the way that color-normal people do; colors that match for such people should map to similar positions in the color space, and colors that don’t appear to match should be farther apart. This implies the existence of exact transforms to and from internationally-recognized colorimetric representations, such as CIE 1931 XYZ. Defining transforms between a color space and XYZ implicitly defines transforms to all other spaces having such transforms. A further implication is that a device-independent color space should allow representation of most, if not all, visible colors.
A device-independent color space should allow compact, accurate representation. In order to minimize storage and transmission costs and improve performance, colors should be represented in the minimum number of bits, given the desired accuracy. Inaccuracies will be introduced by quantizing, and may be aggravated by manipulations of quantized data. In order to further provide a compact representation, any space should produce compact results when subjected to common image-compression techniques. This criterion favors perceptually-uniform color spaces; nonuniform spaces will waste precision quantizing the parts of the space where colors are farther apart than they should be, and may not resolve perceptually-important differences in the portions of the color space where colors are closer together than a uniform representation would place them.
Most image compression algorithms are themselves monochromatic, even though they are used on color images. JPEG, for example, performs compression of color images by compressing each color plane independently. The lossy discrete cosine transform compression performed by the JPEG algorithm works by discarding information rendered invisible by its spatial frequency content. Human luminance response extends to higher spatial frequency than chrominance response. If an image contains high spatial frequency information, only the luminance component of that image must be stored and transmitted at high resolution; some chrominance information can be discarded with little or no visual effect. Effective lossy image compression algorithms such as DCT can take advantage of the difference in visual spatial resolution for luminance and chrominance, but, since they themselves are monochromatic, they can only do so if the image color space separates the two components. Thus, a color space used with lossy compression should have a luminance component.
The existence of a separate luminance channel is necessary, but not sufficient. There also should be little luminance information in the putative chrominance channels, where its presence will cause several problems. If the threshold matrices for the chrominance channels are constructed with the knowledge that those channels are contaminated with luminance information, the compressed chrominance channels will contain more high-frequency information than would the compressed version of uncontaminated chrominance channels. Hence, a compressed image with luminance-contaminated chrominance channels will require greater storage for the same quality than an uncontaminated image. If the threshold matrices for the chrominance channels are constructed assuming that the channels are uncontaminated, visible luminance information in these channels will be discarded during compression. Normal reconstruction algorithms will produce luminance errors in the reconstructed image because the missing luminance information in the chrominance components will affect the overall luminance of each reconstructed pixel. Sophisticated reconstruction algorithms that ignore the luminance information in the chrominance channels and make the luminance of each pixel purely a function of the information in the luminance channel will correctly reconstruct the luminance information, but will be more computationally complex.
A device-independent color space should minimize computations for translations between the interchange color space and the native spaces of common devices. It is unlikely that the interchange color space will be the native space of many devices. Most devices will have to perform some conversion from their native spaces into the interchange space. System cost will be minimized if these computations are easily implemented.
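The luminance/chrominance argument above is easy to see with any luma/chroma transform. Here is a sketch using the BT.601 YCbCr matrix purely as a stand-in (it is not the interchange space the excerpt proposes): once luma is carried separately, chroma can be subsampled or quantized more aggressively with little visual effect.

```python
import numpy as np

# Full-range BT.601 RGB -> YCbCr matrix (illustrative, not the text's proposal).
M = np.array([[ 0.299,     0.587,     0.114    ],
              [-0.168736, -0.331264,  0.5      ],
              [ 0.5,      -0.418688, -0.081312]])

def rgb_to_ycbcr(rgb):
    ycc = rgb @ M.T
    ycc[..., 1:] += 0.5          # center the chroma channels around 0.5
    return ycc

# A neutral gray has zero chroma: all of its information lives in the
# luma channel, which is exactly what makes a clean separation compressible.
print(rgb_to_ycbcr(np.array([0.5, 0.5, 0.5])))  # approximately [0.5, 0.5, 0.5]
```

Conversely, a transform whose "chroma" rows did not sum to zero would leak luminance into Cb/Cr, producing exactly the contamination problems the excerpt describes.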
Specifically, "colorimetrically characterized virtual emissive displays" as you refer to them are not one or the other. They, like any other ICC profile, are both - and the means to convert between them.
Or perhaps I'm misunderstanding you and thinking you were referring to ICC profiles rather than the virtual devices. Then still no, by my reckoning. They're devices right? We send device values to them (for storage in a working space rather than display on a physical device) and we get device values from them (the working space).
Or am I totally missing something?
I think it must be some kind of communication problem, or else I'm about to learn something important. Would you call the triplets in an Adobe RGB file colors? I would.
Good stuff in your history lesson. Great wording of some of the basic building blocks of the architecture - and what it should be like.
Well, to try to stick to the Chinese wall idea of separation, no. The triplets are RGB values so they are device values. They don't become colors until sent to a device OR converted to color values using a profile.
They have the means to be color values by converting through the AdobeRGB profile but if the profile is not saved into the file and it's just another untagged RGB file then it's lost in device land until the appropriate profile is added to the mix.
Steve, let me work through this and try to figure out when a triplet stops being a color.
I guess they are all colors, at least it seems that way to me. It's all about colorimetry right?
If you get to the point where you create a triplet that is intended to be treated as an RGB value and sent off to a display, then I'd say you're now in device space.
That's a good read Jim, it does a nice job using just enough math without being overwhelming.
If you ever revise or publish a version I can save you the trouble of looking up XXX Stiles: Walter Stanley Stiles. He almost always published under WS — I don't think many people even knew his first name. He died in 1985 and in one of the obituaries there is an account of a discussion between him and RWG Hunt (whose full name I actually don't know):
Hunt: "I think Dr Stiles should explain how he can study colours and dispense with sensations."
Stiles: "You put a man down behind a colorimeter, you guide his hand to three knobs and let him go ahead."
Hunt: "This tells you everything?"
Stiles: "Of course not. But you ask him to make certain settings based on the appearance of the colorimeter field. You draw your conclusions from the relations between the stimuli exposed in the fields, and the settings he makes. In expressing these relations it is not necessary to claim one is 'measuring a sensation' or in fact to 'regard a sensation' as having any particular meaning as scientific term. Of course, the word 'sensation' may be used colloquially to explain to the observer what you want him to do."
I think Dr. Stiles is wise to make this distinction and I think it is very close to the distinction I was attempting between color (the sensation) and stimulus the thing colorimetry quantifies.
I am considering a revision to bring the thing forwards in time 20+ years, and I thank you for Stiles' given names. These days, MacAdam seems to get all the ellipsoid credit. Can you point me at something that sorts that out?
There's actually a discussion in WS about this starting on page 665 in the 2nd edition.
A problem with xy chromaticity space is that equal steps at various places on the diagram correspond to different perceptual changes: a large numerical change in the chromaticity of a green color may be barely noticeable, while a small change in that of a blue could dramatically change the perceived color. In 1942, David MacAdam performed a study in which he measured the amount of change in color that produced a just-noticeable difference in a set of observers. He presented his results in the form of a set of ellipsoids in XYZ. Shortly afterward, Walter Stiles predicted the shape of a set of ellipsoids based on other testing. The two sets of ellipsoids are similar, but not identical. If Stiles’ ellipsoids are enlarged by a factor of ten and converted to xy chromaticities, they become ellipses. Plotting the major and minor axes of these ellipses results in the following diagram:
I think so, and you'll notice that I got to the point where the image could have been in sRGB, right?
But what if it's just intended as an RGB editing space, maybe one that doesn't have, and can't ever have, a real-world device that works that way -- like PPRGB?
I'm really uncomfortable saying that if I'm editing in Lab, I'm editing colors, but if I'm editing in PPRGB (or aRGB, for that matter) I'm editing device values.
Why?
I'm not sure why editing in Lab, ProPhoto RGB, or Adobe RGB (1998) doesn't share the same terminology (Device Values, or perhaps Color Values). But these are Steve's terms, so if I'm understanding the distinction:
device values are *not* color values and only become color if sent to a device or come from the device. They can be correlated with color values using an ICC profile or in a measurement file of some sort.
Are the Lab Color Values not Device Values by virtue of editing them using a display, which of course isn't providing us Lab but RGB, nonetheless correlated with color values using an ICC profile?
I'm not sure why editing in Lab, ProPhoto RGB, or Adobe RGB (1998) doesn't share the same terminology (Device Values, or perhaps Color Values). But these are Steve's terms, so if I'm understanding the distinction:
Color Values refer to human perception and specifically to colorimetry. Lab, Luv, XYZ, Yxy, etc. are all color values.
Device Values* refer to the encoding of a pixel with a possible number based on that encoding, but may not refer to human perception.
Is that about right Steve?
If those definitions are OK, when editing an image in Photoshop, in sRGB or Lab, one is certainly a Color Value and I suspect a Device Value and one is only a Device Value.
*SU: Are the Lab Color Values not Device Values by virtue of editing them using a display, which of course isn't providing us Lab but RGB, nonetheless correlated with color values using an ICC profile?
Now, let's say I treat every triplet in the XYZ file as a column vector and multiply all of them by an arbitrary nonsingular 3x3 matrix. Are the triplets still colors? If you say no, what if the matrix is the one that gets from XYZ to the CIE 1931 RGB Color matching functions? They're the basis for XYZ, so they've got to be colors, right?
There is no difference in 'structure' in XYZ and any other color 'space' derived from it by a 3x3 matrix. <snip> They are not different spaces. They are what may be considered different coordinate systems within the same 3D space.
You made that point in another context a month or two ago, and it resonated with me. Upon reflection, I think your use of the word space here is mathematically accurate, although it is considerably different from the usual usage in the color management world. I don't propose that the color management world change its terminology -- I think that would be an uphill struggle to win the hearts and minds of a ship that's already sailed (Excuse me, I love mixing metaphors). Maybe you could propose another word that captures your meaning?
Getting ready to record part one of my video. I've got an option to record on the Wide Gamut NEC or, as I've always done in the past (to be safe), a MacBook display which is "sRGB-ish" ;D
The differences in the output of the Gamut Test Print don't translate that well to video, but nonetheless I do see some useful differences. I was wondering if the application I use (ScreenFlow) would handle the wide gamut display properly, so I did a very quick test showing one freeze frame of the prints side by side (viewed in LR's Compare View). I'm wondering if a few of you can tell me whether the two versions look OK and, as I see them in Safari, basically the same. I've got about 5 seconds of each option. Unfortunately the scaling is a tad different. But it appears to me there is no difference whether I record the presentation on the NEC or the MacBook display, and I'd like to know if others see the same thing.
I don't think I can see any difference myself. Can you see a difference on your display? (I'm viewing it on an iMac display so outer gamut diffs may not show up... or would they...?)
No, I see no difference on either, and using Safari, which is kind of good (it means ScreenFlow is color managed). But I wanted to be on the safe side before moving toward recording.
Also, do you know where you'll be posting your video when it's done? (as in URL).
Low rez will be on YouTube, high rez on my site. Thanks.
I noticed that you had the Send 16 bit Data option ticked. As you know, this option is not available for Windows users.
Right, but it will not make any difference I've ever seen on the print. And on Windows, it will still be an apples-to-apples comparison. I don't think you'll see any difference without that setting. I expect some viewers may not be using Epson printers at all.
OK, got a rough cut here. Video is pretty high rez at this point.
http://digitaldog.net/files/WideGamutPrintVideo.mov
Leaving early tomorrow for a job, and while I'll have the footage and editing software on my MacBook, I'm not sure if I'll have time to do any further editing or 'fixes' (if necessary). That doesn't mean I'm done polishing or that I'm ignoring any comments here; I'm just engaged in 'real work' until early next week ;D.
2. There's quite a bit of video compression induced banding on a few Granger Rainbow comparisons (not all) that makes it hard to distinguish whether it's on the print or in the video.
I can play around a bit more in exporting the video; I'm not real savvy on all the permutations in ScreenFlow for video output. I can make it higher rez (which will make for a much bigger download). It will look even worse on YouTube!
Please correct the spelling of "benefits" in the title page!
Good catch, fixed. Thanks. Not sure why the video doesn't just play through your web browser. It's a .mov and runs fine on this end (Mac).
Also after saving it to the desktop as a webarchive, it will not load and play. I saved it a couple other ways and will try them.
Odd that you don't see improvement in yellow. That's one of the reputed advantages of Pro Photo.
The blue balls show banding on the higher luminance ball, and the lower luminance ball is very dark. The Epson profiles are apparently not as good as your custom profile. What do you think?
Just printed a test using the supplied Epson profile for glossy. I concur with your observations! I think it is a profile issue. If you can build a custom profile, that might be the way to go; this test file shows the 'issues' with the Epson Glossy profile as far as I'm concerned.
From EarthBound Light "What About ProPhoto RGB?" among many others:
It is the only one of the three though that encompasses the entire gamut of the Epson printer shown. sRGB severely clips the Epson gamut in the cyan to green region (bottom left) and yellow-orange region (top). Adobe RGB can still clip some very saturated yellows but does cover all the greens, and green is a very important color being in the middle of the visual spectrum and very prevalent in nature.
But if we do, we have to accept the fact that we also will be encompassing colors we can't even see, never mind print.
And the same is true for sRGB! This is another red herring, like "I can't use a working space whose gamut is larger than my display". As if the output profile doesn't produce colors (yes, actual colors we can see) that can't be seen on the display, with or without soft proofing. They fall outside display gamut (even on displays that can produce the Adobe RGB (1998) gamut).
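The claim that printable colors routinely fall outside display gamut can be sanity-checked with a simplified in-gamut test: convert a Lab value to linear sRGB and see whether any channel leaves [0, 1]. The saturated green below is a hypothetical Lab value chosen only for illustration, and the matrices are the commonly published sRGB/D65 ones.

```python
import numpy as np

# Commonly published XYZ (D65) -> linear sRGB matrix, rounded to 4 places.
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])
D65 = np.array([0.9505, 1.0, 1.089])

def lab_to_linear_srgb(L, a, b):
    # Invert CIE Lab back to XYZ, then matrix to linear sRGB.
    fy = (L + 16) / 116
    fx, fz = fy + a / 500, fy - b / 200
    def finv(f):
        return f**3 if f > 6/29 else 3 * (6/29)**2 * (f - 4/29)
    xyz = D65 * np.array([finv(fx), finv(fy), finv(fz)])
    return XYZ_TO_SRGB @ xyz

def in_srgb_gamut(L, a, b, eps=1e-6):
    # In gamut when every linear channel stays within [0, 1].
    rgb = lab_to_linear_srgb(L, a, b)
    return bool(np.all(rgb >= -eps) and np.all(rgb <= 1 + eps))

# A saturated green an inkjet might render (hypothetical Lab value):
print(in_srgb_gamut(60, -70, 0))   # False: outside sRGB, red channel goes negative
print(in_srgb_gamut(50, 0, 0))     # True: neutral gray is comfortably inside
```

A negative channel simply means the display would need "negative light" from that primary to show the color, which is the precise sense in which such printer colors are invisible on the monitor, soft proofing or not.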
sRGB, Adobe RGB or ProPhoto RGB each have their place: if you only want to use one color space and mostly print then ProPhoto is clearly a good choice; if you only output to the web then sRGB is the obvious choice because that is currently the only supported color space on the web. Adobe RGB is not a bad compromise, and it conveniently fits the newer wide-gamut monitors. If you are printing in black and white it doesn't matter what of the working spaces you use. And so on ...
Robert
Hi Robert,
Two questions:
- Haven't both Adobe RGB and sRGB passed their best-before date, now that we have 4K with the Rec. 2020 colour space?
- Is there a need for a new colour space for 8-bit representation of the Rec. 2020 colour space?
I don't pretend to be an expert in colour management; I'm just asking for your opinion.
Best regards
Erik
sRGB, Adobe RGB or ProPhoto RGB each have their place: if you only want to use one color space and mostly print then ProPhoto is clearly a good choice; if you only output to the web then sRGB is the obvious choice because that is currently the only supported color space on the web.And that's why those images are in ProPhoto RGB: they need to be printed and to go to the web. The myth and misinformation you missed, because you failed to read the posts here, is the suggestion to just use sRGB for everything.
And that's why those images are in ProPhoto RGB: they need to be printed and to go to the web. The myth and misinformation you missed, because you failed to read the posts here, is the suggestion to just use sRGB for everything.
I've had a quick scan of the thread and I don't see where there is a suggestion (or where there is a myth/misinformation) to use sRGB for everything.
I don't know if you're referring to Gary Fong or what ... but I'm not going to pay him money to hear his views on the subject: not because I object to paying, but because I'm quite sure I'll learn nothing from it.
On the other hand, what your demonstration SHOULD also have shown, is that your test image contains colors that are massively outside the gamut of currently available monitors and printersI did show that; another video that goes into detail was referenced, and the results of that wide-gamut data were presented with a file anyone can use to see this. Again, make your own video demonstrating what you think a video on the gamuts of working spaces should show. This thread is many, many pages long; the only one who has come here to suggest the video is flawed is you, sir, and the lack of anyone replying to your posts is telling. As such, time to put on the Ardill filter again; you've proven you simply don't get it!
Unfortunately we often do need to convert to sRGB because that is what the web uses: so advice on what to do to get an image with a wide gamut into the small-gamut sRGB space would be a useful contribution. Could we, for example, construct a table-based sRGB-like profile that would allow us to do a perceptual mapping to this space, followed by an RC mapping to sRGB (thus allowing us to effectively do a perceptual mapping to sRGB)?That you still don't understand that going from ProPhoto RGB to sRGB can and will clip colors regardless of the rendering intent used illustrates that you're unable to understand a very simple concept of color management, and until you do, going down another Ardill rabbit hole is pointless.
As usual, you're having severe difficulties connecting the dots.
1. The video is called "The benefits of wide gamut working space for print output" and that's exactly what it shows.
2. Yes, this is in a way, a direct response to Gary and no, you don't have to pay to hear his flat earth theories on color management.
3. Yes, there is a big world out there where people like Fong, Crockett, Rockwell and others suggest their audience use sRGB for everything! The video shows why that's a dumb idea.
4. Any of the actual images in the test page could have been shot and converted in-camera to sRGB. Or shot raw and encoded into sRGB for all further output. The video shows why this is far from an optimal workflow!
5. You're still obsessed and confused about rendering intents; they all clip OOG colors (though they do it differently), so I'll ignore your latest post about this here, as should others, as yet another of your rabbit holes, and stick to my original idea to put you on the do-not-call/ignore list. But here I'm replying because you've again failed to understand the points. If you so desire and wish to create your own video addressing these ideas, please do so.
Guys, correct me if I'm mistaken, but... Doesn't a conversion from ProPhoto RGB to sRGB map all colors so they reproduce the same, except for out-of-gamut colors, which convert to the nearest in-gamut color?
Which implies that, for example, all very saturated, out-of-gamut pure reds will convert to the same maximum value in sRGB. IOW they clip when using the RC intent.
To avoid any clipping one would do an "assign profile" command and then see desaturated color.
I think the lesson of all these pages here is that we need displays that can reproduce the gamut of all the other output devices!
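The collapse of distinct saturated reds into a single sRGB value can be sketched numerically. A minimal Python sketch, assuming the widely published ProPhoto(ROMM)→XYZ(D50), Bradford D50→D65 adaptation, and XYZ(D65)→sRGB matrices from standard reference tables, and simplifying ProPhoto's transfer curve to a pure 1.8 gamma (ignoring its small linear toe); a real CMM conversion via ICC profiles will differ in detail:

```python
import numpy as np

# Matrix values assumed from standard reference tables:
PROPHOTO_TO_XYZ_D50 = np.array([
    [0.7977, 0.1352, 0.0313],
    [0.2880, 0.7119, 0.0001],
    [0.0000, 0.0000, 0.8249],
])
D50_TO_D65 = np.array([          # Bradford chromatic adaptation
    [ 0.9555766, -0.0230393,  0.0631636],
    [-0.0282895,  1.0099416,  0.0210077],
    [ 0.0122982, -0.0204830,  1.3299098],
])
XYZ_D65_TO_SRGB = np.array([
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
])

def prophoto_to_srgb(rgb8):
    """Relative-colorimetric-style conversion: exact within gamut, clamp outside."""
    lin = (np.asarray(rgb8, dtype=float) / 255.0) ** 1.8   # simplified ProPhoto TRC
    srgb_lin = XYZ_D65_TO_SRGB @ D50_TO_D65 @ PROPHOTO_TO_XYZ_D50 @ lin
    srgb_lin = np.clip(srgb_lin, 0.0, 1.0)                 # the "clip" step
    enc = np.where(srgb_lin <= 0.0031308,                  # sRGB piecewise encoding
                   12.92 * srgb_lin,
                   1.055 * srgb_lin ** (1 / 2.4) - 0.055)
    return np.round(enc * 255).astype(int)

for r in (255, 250, 245):
    print(r, prophoto_to_srgb([r, 0, 0]))  # all three land on sRGB [255 0 0]
```

Three distinct ProPhoto reds all clamp to sRGB 255,0,0, which is the collapse being discussed: the relative differences between those source colors are gone after the clip.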
Yes, with v2 profiles a pure red 255,0,0 in ProPhoto will map to a pure red 255,0,0 in sRGB and as Andrew explained these are not the same colors. So the color is shifted, but not clipped (unless you define this shifting as a clipping).
Robert, I think most people would consider this case to be "clipping", rather than "shifting", for a few reasons:
- It is the color value of the pixel that is being truncated or "clipped" to the border of the smaller gamut. The pixel itself is not being clipped or eliminated.
- The important difference between pixels with "clipped" color values when using relative colorimetric and pixels with "shifted" color values when using perceptual is that the shifted values could be reverse-shifted back to the original gamut (or at least something close). Clipped values cannot be "reverse-clipped" because that relative gamut-positioning information has been lost. In your example, there are a whole range of potential color values that are going to get clipped to 255,0,0 in sRGB and once they are clipped that way, there is no way to re-create that original range of out-of-gamut color values.
- A similar thing occurs when we commonly refer to "clipped" highlights or shadows. It doesn't mean that the pixel turns invisible or is eliminated from the image. It means that the true brightness value was beyond what could be recorded and was therefore recorded as the min or max value. The true brightness was "clipped" to the min or max pixel value. Information was lost that cannot be recovered.
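The clip-versus-shift distinction can be shown with a toy one-dimensional sketch (hypothetical numbers, not a real gamut mapping; a crude global scale stands in for a perceptual-style compression):

```python
# Pretend 1.0 is the gamut boundary.
values = [1.30, 1.20, 1.10, 0.90]

# "Clipping" (relative-colorimetric style) truncates out-of-range values;
# three distinct inputs collapse to 1.0 and cannot be told apart afterwards.
clipped = [min(v, 1.0) for v in values]      # -> [1.0, 1.0, 1.0, 0.9]

# A "shift" (perceptual-style compression, here a crude global scale) keeps
# every value distinct and can be approximately undone.
scale = max(values)
shifted = [v / scale for v in values]        # all distinct, all in range
recovered = [v * scale for v in shifted]     # ~= values: the shift is reversible

print(clipped, recovered)
```

This is the "lost relative gamut-positioning information" point above: the clipped list cannot be mapped back to the original values, while the shifted list can.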
Yes, with v2 profiles a pure red 255,0,0 in ProPhoto will map to a pure red 255,0,0 in sRGB and as Andrew explained these are not the same colors. So the color is shifted, but not clipped (unless you define this shifting as a clipping).
Well, Assign Profile is doing a completely invalid interpretation of the data - the only time it makes sense to use it is when there is no profile attached to the image, and then we make a best guess as to what color space it should have (usually sRGB).
It would certainly make things a whole lot easier if our displays were the widest gamut devices in our armory!!
Robert
If ppRGB 255,0,0 and 250,0,0 and 245,0,0 all map to sRGB 255,0,0 then I would refer to that as clipping.
Erik,
I'm not an expert either, and perhaps Andrew can comment. However, I looked up the Rec. 2020 gamut on Wikipedia (shown below, first image). It does not appear to be much wider than Adobe RGB (second image below, which is taken from a paper (http://docs-hoffmann.de/gamuts08072002.pdf) by Prof. Dr. Hoffmann). What he describes as real-world surface colors are those colors that occur in nature from non-emissive (reflected-light) sources. These are outlined by the dashed line. Some of the greens in his diagram appear to be outside the 2020 gamut. Some knowledgeable sources have said that this real-world surface color gamut is too restrictive. Personally, I don't see any need for a new color space. Properly used, ProPhotoRGB works fine.
Bill