Luminous Landscape Forum

Raw & Post Processing, Printing => Digital Image Processing => Topic started by: brucepercy1 on July 20, 2006, 06:17:32 am

Title: Bit-Depth understanding required
Post by: brucepercy1 on July 20, 2006, 06:17:32 am
Hello All,

For many years, I've been happily using a Mamiya 7 and Coolscan scanner. I always now scan at 14-bits (the max my scanner can capture) when editing in Photoshop.

I've just been experimenting with a Canon 5D, and I've seen what appears to me to be a lot more evidence of banding, in particular in clear areas of sky where there is a very gradual tonal change. The images have been shot in RAW.

So I'd like to ask a question regarding bit-depth. My understanding is that most DSLRs are 12-bit, including the 5D. This means that each channel should have 4096 discrete values possible (0-4095).

My scanner samples at 14-bit, which produces scans where each channel should have 16384 discrete values possible.
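For what it's worth, the level counts follow directly from the bit depth: an n-bit converter produces 2^n codes. A quick illustrative sketch in Python (not tied to any particular camera or scanner):

```python
# An n-bit analog-to-digital converter produces 2**n discrete codes,
# numbered 0 through 2**n - 1.
def levels(bits: int) -> int:
    return 2 ** bits

print(levels(12))                # 4096 codes (0-4095), typical DSLR raw
print(levels(14))                # 16384 codes (0-16383), the scanner's maximum
print(levels(14) // levels(12))  # 4: a 14-bit capture has four times the codes
```

Whether those extra codes translate into visibly smoother gradients depends on how much of that precision the sensor's noise actually supports.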

What I would like to know is - have I got this right? Am I missing something here?

The difference between 12-bit and 14-bit is quite large, and I think this is why I am seeing such a difference - evidence of banding when doing some serious adjustments in Photoshop.

Also, I've been wondering for a while why most DSLRs at the moment are 12-bit.

Any info etc, would be greatly appreciated.

Lastly, my initial impression of the 5D, when it's tripod-mounted, is that the image quality is superb. I'm using primes, and the cheap 24 f/2.8 lens shows little softness at the edges. I've also noticed that opening the RAW files in Canon's DPP provides more neutral tones, but DPP has terrible sharpening, so I transfer to Photoshop and do an initial sharpen whilst retaining the neutral tones. ACR (which admittedly is not listed as supporting the 5D) does not produce the same neutral tones IMHO.

I feel I still have a long way to go, before I even consider selling up the Mamiya and Coolscan. I do love film, like the grain, like the response curves I get from films like Portra for people shots etc. It's an interesting time.

Any thoughts on the Bit-Depth understanding would be greatly appreciated.

Best Wishes,
Bruce Percy
The Light & the Land Photography
http://www.thelightandtheland.com
Title: Bit-Depth understanding required
Post by: bjanes on July 20, 2006, 09:14:35 am
Quote
So I'd like to ask a question regarding bit-depth. My understanding is that most DSLRs are 12-bit, including the 5D... The difference between 12-bit and 14-bit is quite large, and I think this is why I am seeing such a difference - or evidence of banding when doing some serious adjustments in Photoshop.

Bruce,

This thread on DPReview should give you some insight into these matters, although it is more concerned with dynamic range than with the number of levels. Just because your scanner has a 14-bit analog-to-digital converter does not mean that the sensor is capable of the implied level of performance.

http://forums.dpreview.com/forums/read.asp?forum=1021&message=18931888

Norman Koren's website has some good information on tonality and the number of levels in the various zones. Blue sky is about Zone V and a 12 bit capture should have 128 levels in that zone, more than the eye can distinguish. If you are converting your raw images at a bit depth of 16 and are seeing banding in the sky, something is wrong with your workflow. Banding is most likely to show up in the shadows, where there are fewer levels.

http://www.normankoren.com/digital_tonality.html#measure
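The zone arithmetic behind that 128-level figure can be sketched as follows. In a linear raw file each stop down from clipping has half the codes of the stop above it; the helper below is illustrative only, and taking clipping as Zone X is an assumption of this sketch:

```python
# Levels available in the zone sitting `stops` below clipping in a linear
# raw capture: that zone spans codes 2**(bits-stops) .. 2**(bits-stops+1),
# i.e. 2**(bits - stops) distinct levels.
def levels_in_zone(bits: int, stops_below_clipping: int) -> int:
    return 2 ** (bits - stops_below_clipping)

print(levels_in_zone(12, 5))  # 128 levels for Zone V in a 12-bit capture
print(levels_in_zone(12, 9))  # 8 levels deep in the shadows, hence banding there
```

The second figure shows why banding normally appears in shadows first: the level count halves with every stop down.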

BTW, the latest version of ACR does support the 5D. You can download it from the Adobe web site. It does require Photoshop CS2.

Bill
Title: Bit-Depth understanding required
Post by: John Sheehy on July 20, 2006, 01:24:53 pm
Quote
I've just been experimenting with a Canon 5D and I've seen what appears to me to be a lot more evidence of banding, in particular, in clear areas of sky where there is a very gradual tonal change. The images have been shot in RAW

I assume you're talking about contour banding in the gradients (as opposed to vertical and horizontal lines super-imposed on the shadows, which is also referred to as "banding").

Contour banding occurs when noise is low and quantization is coarse.  Blue sky has a very weak red component, because digital cameras are very insensitive to red, and there isn't much red in deep blue sky to begin with.  You really need to expose as high as you can without clipping highlights, and this can't be done properly at ISO 50, so don't use that.  Also, don't make dark conversions to brighten in post-processing, get the levels high in the conversion and do only darkening in PP.  Convert to 16 bits if you're going to do any editing of levels or saturation.
Title: Bit-Depth understanding required
Post by: bjanes on July 20, 2006, 02:50:54 pm
Quote
I assume you're talking about contour banding in the gradients (as opposed to vertical and horizontal lines super-imposed on the shadows, which is also referred to as "banding").

Contour banding occurs when noise is low and quantization is coarse.  Blue sky has a very weak red component, because digital cameras are very insensitive to red, and there isn't much red in deep blue sky to begin with.  You really need to expose as high as you can without clipping highlights, and this can't be done properly at ISO 50, so don't use that.  Also, don't make dark conversions to brighten in post-processing, get the levels high in the conversion and do only darkening in PP.  Convert to 16 bits if you're going to do any editing of levels or saturation.

While I agree with exposing to the right, a blue sky has more red than you might think, since it is not that saturated. For example, Bruce Lindbloom gives a value for the blue sky patch of a ColorChecker in ProPhoto RGB as 95, 102, 134. In one of my photos I measured a clear north blue sky at 104, 119, 184 in ProPhoto RGB. I have a Canon 1D Mark II image of a ColorChecker where the sky patch reads 122, 128, 162 and the white patch reads 253, 252, 248 when converted into ProPhoto RGB with DCRaw. The values of the sky patch in raw are 1365, 2264, and 1365. The normalized pixel value for the red is 1365/4095, or 0.33. With the 12-bit raw, that should give over 512 levels for that zone, plenty to avoid posterization.

http://www.normankoren.com/digital_tonality.html
Title: Bit-Depth understanding required
Post by: jani on July 20, 2006, 04:34:03 pm
Quote
While I agree with exposing to the right, a blue sky has more red than you might think, since it is not that saturated. For example, Bruce Lindbloom gives a value for the blue sky patch of a ColorChecker in ProPhotoRGB as 95, 102, 134. In one of my photos I measured a clear north blue sky at 104, 119, 184 in ProPhotoRGB. I have Canon 1D Mark II image of a ColorChecker where the sky patch reads 122, 128, 162 and the white patch reads 253, 252, 248 when converted into ProPhotoRGB with DCRaw. The values of the sky patch in raw are 1365, 2264, and 1365. The normalized pixel value for the red is 1365/4095 or 0.33. With the 12 bit raw, that should give over 512 levels for that zone, plenty to avoid posterization.
Uhm, I think you misunderstand how RGB works.

For instance, 128,128,255 is a medium blue. There is absolutely no red in it whatsoever.

64,128,255 would be a blue with a green tint.
Title: Bit-Depth understanding required
Post by: dlashier on July 20, 2006, 05:46:29 pm
Quote
For instance, 128,128,255 is a medium blue. There is absolutely no red in it whatsoever.


Huh? r=128 is not no red. Just because it happens to be balanced by green doesn't mean there's no red.

- DL
Title: Bit-Depth understanding required
Post by: bjanes on July 20, 2006, 06:50:03 pm
Quote
Uhm, I think you misunderstand how RGB works.

For instance, 128,128,255 is a medium blue. There is absolutely no red in it whatsoever.

64,128,255 would be a blue with a green tint.

 
Title: Bit-Depth understanding required
Post by: sgwrx on July 20, 2006, 07:39:39 pm
What's interesting is that one of my photos, shot with a circular polarizer, shows about 6 distinct bands in the "gradient" from the darkest blue to the lightest blue in any of my color-managed photo editors. When I soft proof it with a generic Silver Rag profile, it has really bad banding, with about 10 distinct bands. But when I printed it on the SR at 6x9, I can't see those banded areas. This told me that perhaps it's a monitor limitation - or maybe the print is just not big enough?
Title: Bit-Depth understanding required
Post by: John Sheehy on July 20, 2006, 09:22:48 pm
Quote
The values of the sky patch in raw are 1365, 2264, and 1365. The normalized pixel value for the red is 1365/4095 or 0.33.

That's quite impossible.  How could the red be equal to the blue when red is less sensitive in the camera?  That would mean the sky-blue square actually had more red than blue.

You didn't shoot this with a warm light, did you?
Title: Bit-Depth understanding required
Post by: John Sheehy on July 21, 2006, 01:34:34 am
Quote
The values of the sky patch in raw are 1365, 2264, and 1365. The normalized pixel value for the red is 1365/4095 or 0.33. With the 12 bit raw, that should give over 512 levels for that zone, plenty to avoid posterization.

I am looking at the RAW values from a patch of NYC blue sky (which isn't really very blue in the summer, with the humidity), and the RAW values for the sky are 85:255:244. That's without a polarizer; a polarizer in the desert would be much more extreme. In another one, where the sky isn't quite as blue, and the exposure is such that white clouds are just about reaching the maximum RAW value, the values of the bluer corner of sky are about 350:870:740.
Title: Bit-Depth understanding required
Post by: jani on July 21, 2006, 02:11:49 am
Quote
Huh? r=128 is not no red. Just because it happens to be balanced by green doesn't mean there's no red.
Yes, that's exactly what it means.

When green and red are balanced, all you're dealing with is various brightnesses of blue.

If it wasn't, it would be impossible to create brighter blues than 0,0,255, and RGB would be completely useless for imagery.

Edit:

Maybe you're confused by the RGBG pattern of colour filters across the sensor. If you were to access the raw format and find that there is indeed red light, then that's another matter.
Title: Bit-Depth understanding required
Post by: jani on July 21, 2006, 02:16:14 am
Quote
what's interesting is one of my photos with a circular polarizer, shows about 6 distinct bands in the "gradient" of the darkest blue to lightest blue when i see it in any of my color managed photo editors. when i soft proof it with a generic silver rag profile, it has really bad banding but about 10 distinct bands.  when i printed it on the SR, 6x9, i can't seem to see those banded areas.  this told me that perhaps it's a monitor limitation? but maybe it's just not big enough?
Assuming that the rest of your colour-managed workflow is all right:

It does resemble a calibration problem, for instance LUT adjustments that leave you with significantly less than the full 24-bit range. But it could also be that your monitor isn't able to represent all the gradients.
Title: Bit-Depth understanding required
Post by: dlashier on July 21, 2006, 03:10:42 am
Quote
Yes, that's exactly what it means.

When green and red are balanced, all you're dealing with is various brightnesses of blue.

No, what you're dealing with is various saturation levels of blue. The brightness level of blue is determined by the "b" value.

Quote
Maybe you're confused ...
I'm not the one who is confused. No red in a color would be only when r = 0. For a color to be a red, r would need to be the highest value; for a color to be reddish or have a red cast, r would need to be at least the second highest value; but for the color to have no red, r would have to be 0. That's what no means - none, zip, nada, zilch.

- DL
Title: Bit-Depth understanding required
Post by: jani on July 21, 2006, 05:49:54 am
Quote
No, what you're dealing with is various saturation levels of blue. The brightness level of blue is determined by the "b" value.
I'm not the one who is confused. No red in a color would be only when r = 0. For a color to be a red, r would need to be the highest value; for a color to be reddish or have a red cast, r would need to be at least the second highest value; but for the color to have no red, r would have to be 0. That's what no means - none, zip, nada, zilch.
In that case, this is a case of terminology confusion, and I'm to blame.

Yes, I mean that the sky doesn't have a red cast.

Let me try to be a bit more precise about what I meant to say (and keep in mind that I'm not Bruce Fraser or Andrew Rodney ).

The original claim was that "a blue sky has more red than you might think, since it is not that saturated", and bjanes then proceeds to use an RGB representation of the sky as a proof.

This is where I think that he misunderstands the RGB colour model, and forgets that it's just a simulation. That there is a "red" value in RGB doesn't mean that the sky "has red" in it.

Keep in mind that RGB (regardless of colour space) is used to simulate a part of the visible spectrum by using a three-colour composite. Yes, the representation in RGB is using red to simulate the visible spectrum, but you could just as well say that a sky with 128,128,255 "has yellow" (since 128,128,0 is a dark yellow), or that a bright blue sky "has cyan", "has turquoise", "has mauve", "has maroon", "has green", "has violet", "has ultraviolet", "has x-ray", ...

Since it's possible to use an N-colour composite instead of the 3-colour composite RGB and still simulate the visible spectrum, it's possible to have models where red isn't one of the components. CIE L*a*b* is one model that doesn't use red as a component (its three axes are lightness from black to white, green to magenta, and blue to yellow). Or you could've used a two-colour composite of orange and blue.

Would you still claim that -- in that two-colour composite -- the sky "has red"?

Going back to RGB, the value set of 128,128,255 isn't a simulation of a colour with red in it. It's a simulation of a clear blue.
Title: Bit-Depth understanding required
Post by: Ray on July 21, 2006, 06:37:13 am
Quote
Let me try to be a bit more precise about what I meant to say (and keep in mind that I'm not Bruce Fraser or Andrew Rodney ).

Going back to RGB, the value set of 128,128,255 isn't a simulation of a colour with red in it. It's a simulation of a clear blue.

Jani,
As much as I sympathise with you for not having the knowledge of Bruce Fraser and Andrew Rodney, I have to agree with Don that 128, 128, 255 is not a clear blue. It's a blue that contains both red and green, both of which change the hue of the blue dramatically so that one really needs another name for the color.

Below is an example I created. The numbers might have changed slightly in the conversion, but it's clear that 128, 128, 255 is 'bluish' but not blue.

[attachment=834:attachment]
Title: Bit-Depth understanding required
Post by: jani on July 21, 2006, 07:19:01 am
Quote
Below is an example I created. The numbers might have changed slightly in the conversion, but it's clear that 128, 128, 255 is 'bluish' but not blue.
When I save the file and check the colour values with jpegtopnm, I see that it's 129,128,255. (A quick test in the Gimp verifies this.)

Perhaps you're reinterpreting it according to some colour space? That would explain the colour shift.

Try creating the colour with HSL adjustments instead.
Title: Bit-Depth understanding required
Post by: bjanes on July 21, 2006, 07:29:12 am
Quote
That's quite impossible.  How could the red be equal to the blue when red is less sensitive in the camera?  That would mean the sky-blue square actually had more red than blue.

You didn't shoot this with a warm light, did you?

John,

You are correct, as usual. On rechecking my raw readings and converting to 12 bit, the RGB values of the blue patch are 288, 736, 800. The white balance in ACR is 5200, -7, and the white patch is neutral when converted; therefore, the white balance is good and the illumination is daylight. The normalized red is then 288/4095 = 0.07, and that zone would have about 128 levels in the 12-bit file and 20 levels in the gamma 2.2 eight-bit file according to Norman's chart. With much editing, banding could well occur with this number of levels, but using 16 bit for editing would preserve all 128 levels and banding should not occur with reasonable editing.
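As a rough check of those numbers (a sketch only: the one-stop span and the simple power-function gamma are assumptions of this illustration, not exactly Norman's method):

```python
raw_red, raw_max = 288, 4095  # figures from the post above
v = raw_red / raw_max         # normalized linear value
print(round(v, 2))            # 0.07

# Codes spanning the one-stop range v/2 .. v in a 12-bit linear file:
linear_levels = round(v * 4096) - round(v / 2 * 4096)
print(linear_levels)          # 144, the same order as "about 128"

# The same stop after simple gamma 2.2 encoding to 8 bits:
def encode(x: float) -> int:
    return round(255 * x ** (1 / 2.2))

gamma_levels = encode(v) - encode(v / 2)
print(gamma_levels)           # 20 levels, matching the chart's figure
```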

Bill
Title: Bit-Depth understanding required
Post by: Ray on July 21, 2006, 07:36:18 am
Quote
When I save the file and check the colour values with jpegtopnm, I see that it's 129,128,255.

Not surprising! I work in ProPhoto and the image has been converted to sRGB and jpeg compressed. But 128 or 129 is nitpicking. The color is essentially as it was when created. It's no longer a clear blue because of the presence of green and red. You might give it a name containing the word 'blue' because blue is the predominant primary, but the presence of red and green changes the hue whether or not the red and green channels have exactly equal value.
Title: Bit-Depth understanding required
Post by: bjanes on July 21, 2006, 07:54:00 am
Quote
In that case, this is a case about terminology confusion, and I'm at blame.

The original claim was that "a blue sky has more red than you might think, since it is not that saturated", and bjanes then proceeds to use an RGB representation of the sky as a proof.

This is where I think that he misunderstands the RGB colour model, and forgets that it's just a simulation. That there is a "red" value in RGB doesn't mean that the sky "has red" in it.

Well, our eyes and the camera both use RGB tristimulus values to simulate the color of the blue sky. For that color to be recorded by the camera or perceived by a human, it is necessary to have the red sensor activated. As you may recall, the Rho (red), Gamma (green) and Beta (blue) sensors of the eye have considerable overlap, especially for green and red. The camera sensors have less overlap.


http://www.photo.net/photo/edscott/vis00010.htm

In the RGB model, an unsaturated blue must have red and green components. If the red sensor of the camera is activated, this is equivalent to saying that the unsaturated blue has a red component as well as a green component. It may all be a matter of terminology, so why don't we just drop this topic?
Title: Bit-Depth understanding required
Post by: John Sheehy on July 21, 2006, 10:15:59 am
Quote
The normalized red is then 288/4095 = 0.07, and  that zone would have about 128 levels in the 12 bit file and 20 levels in the gamma 2.2 eight bit file according to Norman's chart. With much editing banding could well occur with this number of levels, but using 16 bit for editing would preserve all 128 levels and banding should not occur with reasonable editing.

Normally, noise prevents contour banding.  Perhaps something in the workflow is smoothing it away, or something in the display hardware or calibration is quantizing it further.
Title: Bit-Depth understanding required
Post by: Dennis on July 22, 2006, 09:44:31 am
Quote
...because digital cameras are very insensitive to red,
Digital cameras are very insensitive in the blue spectrum; that's why the blue channel is usually the one containing the most noise. The sensors of digital cameras are highly sensitive to the red spectrum - so sensitive that they need a special IR cut filter to cut the light somewhere around 700nm. So, I'd rather say digital cameras are highly sensitive to red.
Title: Bit-Depth understanding required
Post by: John Sheehy on July 22, 2006, 05:41:56 pm
Quote
Digital cameras are very insensitive in the blue spectrum; that's why the blue channel is usually the one containing the most noise. The sensors of digital cameras are highly sensitive to the red spectrum - so sensitive that they need a special IR cut filter to cut the light somewhere around 700nm. So, I'd rather say digital cameras are highly sensitive to red.

The *sensor*, operating in monochrome mode with no filters, is most sensitive to red. The hot mirror drastically reduces red response, and the entire sensor/CFA/AA filter/hot mirror sandwich is least sensitive to red, and that's all that matters when you're taking a picture.

It is a well-known fact that RAW converters multiply the red channel data by about 2 when doing daylight WB, and by about 1.4 for blue.

If blue is the noisiest channel, it is because the light source or subject is lacking in blue, such as incandescent lighting.  For incandescent lighting, the blue RAW data is usually multiplied by about 4 to achieve incandescent WB (red and green are very close in strength).

This is for Bayer RGB cameras in general. There are individual small-sensor compacts that have blue as the least sensitive channel (2.25x for daylight WB). They must be horrendous in incandescent light.
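The effect of those multipliers can be sketched with the sky-patch figures quoted earlier in the thread (the multiplier values are the approximate ones above, not calibrated for any specific camera):

```python
# White balance scales each raw channel; a channel multiplied by ~2
# effectively has one bit less usable precision than green after balancing.
daylight_wb = {"r": 2.0, "g": 1.0, "b": 1.4}  # approximate daylight multipliers

def apply_wb(raw: dict, wb: dict) -> dict:
    """Scale each raw channel by its white-balance multiplier."""
    return {ch: raw[ch] * wb[ch] for ch in raw}

raw_patch = {"r": 288, "g": 736, "b": 800}    # sky-patch values from the thread
print(apply_wb(raw_patch, daylight_wb))       # {'r': 576.0, 'g': 736.0, 'b': 1120.0}
```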
Title: Bit-Depth understanding required
Post by: jani on July 23, 2006, 07:14:45 am
Quote
Well, our eyes and the camera both use RGB tristimulus values to simulate the color of the blue sky. For that color to be recorded by the camera or perceived by a human, it is necessary to have the red sensor activated. As you may recall, the Rho (red), Gamma (green) and Beta (blue) sensors of the eye have considerable overlap, especially for green and red. The camera sensors have less overlap.
http://www.photo.net/photo/edscott/vis00010.htm
Indeed.

But the human colour management apparatus is a bit more advanced than that.

This article from last year (http://www.sciencedaily.com/releases/2005/10/051026082313.htm) should provide some rather interesting additional insights.

Quote
In the RGB model, an unsaturated blue must have red and green components. If the red sensor of the camera is activated, this is equivalent to saying that the unsaturated blue has a red component as well as a green component. It may all be a matter of terminology, so why don't we just drop this topic?
Well, it's rather obvious that an unsaturated blue must have (equal) red and green components in the RGB colour model, at least if you're following the RGB colour cube. That's how it is by definition, but it doesn't mean that the unsaturated blue has red in it, colour-wise.
Title: Bit-Depth understanding required
Post by: jani on July 23, 2006, 07:20:43 am
Quote
Not surprising! I work in ProPhoto and the image has been converted to sRGB and jpeg compressed. But 128 or 129 is nitpicking. The color is essentially as it was when created. It's no longer a clear blue because of the presence of green and red.
I'm sorry, that's simply not correct in the RGB colour cube model; it is equivalent to saying that e.g. 128,128,128 isn't a neutral grey.

As an experiment, create the inverse colour of a neutral 50% saturated blue, and invert it.

Which colour do you get?

Also, feel free to experiment with a different colour model where you're able to adjust the hue, saturation and brightness levels. Set the hue to pure blue, and then set brightness to 100% and saturation to 50%. Convert to RGB; what RGB values do you get?

This is central to the colour theory document linked to by bjanes; if your claim had merit, changing the R and G values by the same amount would pull the blue colour off the surface of the colour cube.
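That experiment can be run directly with Python's standard colorsys module (a sketch; "pure blue" is taken here as hue 240 degrees, and brightness/saturation are expressed in the HSV model):

```python
import colorsys

# Pure blue hue (240 degrees), 50% saturation, full brightness -> RGB:
r, g, b = colorsys.hsv_to_rgb(240 / 360, 0.5, 1.0)
rgb255 = tuple(round(c * 255) for c in (r, g, b))
print(rgb255)   # (128, 128, 255): equal R and G, a desaturated blue

# The inverse-and-invert experiment: the inverse of (128, 128, 255) is a
# dark yellow, and inverting again returns the original colour.
inverse = tuple(255 - c for c in rgb255)
print(inverse)  # (127, 127, 0)
print(tuple(255 - c for c in inverse))  # (128, 128, 255) again
```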
Title: Bit-Depth understanding required
Post by: Dennis on July 23, 2006, 08:05:12 am
Quote
If blue is the noisiest channel, it is because the light source or subject is lacking in blue, such as incandescent lighting.
In earlier times, Phil Askey's photographic noise tests at dpreview.com were split into the three color channels. The shots were taken in daylight lighting, and the blue channel was always the worst one.
Title: Bit-Depth understanding required
Post by: John Sheehy on July 23, 2006, 08:29:03 am
Quote
In earlier times, Phil Askey's photographic noise tests at dpreview.com were split into the three color channels. The shots were taken in daylight lighting, and the blue channel was always the worst one.

And he also used camera-produced JPEGs in earlier times, didn't he?

Subject color can make a difference, too.
Title: Bit-Depth understanding required
Post by: Ray on July 23, 2006, 11:01:37 am
Quote
I'm sorry, that's simply not correct in the RGB colour cube model; it is equivalent to saying that e.g. 128,128,128 isn't a neutral grey.


Interesting! When is blue not quite blue? When it comes to what's blue and what's not blue, I guess I rely upon my eyeballs.

In the following samples of blue, colors 1-4 are all shades of pure blue, 0, 0, B.

Of the 4 colors on the right, #6 looks bluest to me and that's got more green than red (128, 150, 255). #8 looks least blue to me and that's got more red than green (150, 128, 255). #5, which you describe as a clear blue (128, 128, 255) does not look as blue as #6 to my eyes.

Perhaps your argument is a bit like declaring a banana is not a fruit but a herb. I believe from a strictly botanical perspective, a banana is a herb, although it tastes pretty fruity to me   .

[attachment=836:attachment]
Title: Bit-Depth understanding required
Post by: jani on July 24, 2006, 05:07:22 am
Quote
Interesting! When is blue not quite blue? When it comes to what's blue and what's not blue, I guess I rely upon my eyeballs.
And therein lies the trap.

Keep in mind the visual spectrum, which ranges from near-infrared to near-ultraviolet. Red and blue are at opposite ends, and violet is beyond blue. Yet many people seem to think of violet as a blue with a minor red tint to it. Why is that? I honestly don't know, I don't recall seeing an explanation for it.

Quote
In the following samples of blue, colors 1-4 are all shades of pure blue, 0, 0, B.

Of the 4 colors on the right, #6 looks bluest to me and that's got more green than red (128, 150, 255). #8 looks least blue to me and that's got more red than green (150, 128, 255). #5, which you describe as a clear blue (128, 128, 255) does not look as blue as #6 to my eyes.
To my eyes -- on my monitor at work -- #6 looks markedly blue-green, #7 I'm unsure of, and #8 definitely is redder than the rest. Compared to #6, #5 looks like it's on the other side of blue, but since they're next to each other, it's impossible to tell.

Quote
Perhaps your argument is a bit like declaring a banana is not a fruit but a herb. I believe from a strictly botanical perspective, a banana is a herb, although it tastes pretty fruity to me   .
Yes, it is a bit like that. And the RGB colour cube doesn't give a perfect simulation. In addition we have all the usual variables like monitor, human eye/processing system and language apparatus involved.

Don't take the following as anything but amusing anecdotes.

As a child, I had a few quarrels with other people regarding whether something was more yellow than green or vice versa. A British girl had quarrels with her mom about the colour of some surfaces, which the girl claimed were a deep red, while everyone else saw them as black. The Science Daily article sort of explains why that might have been the case.
Title: Bit-Depth understanding required
Post by: bjanes on July 24, 2006, 10:53:06 am
Quote
Indeed.

But the human colour management apparatus is a bit more advanced than that.

This article from last year (http://www.sciencedaily.com/releases/2005/10/051026082313.htm) should provide some rather interesting additional insights.
Well, it's rather obvious that an unsaturated blue must have (equal) red and green components in the RGB colour model, at least if you're following the RGB colour cube. That's how it is by definition, but it doesn't mean that the unsaturated blue has red in it, colour-wise.

Jani,

You can obfuscate the discussion by bringing in opponency colour models such as CIE LAB, or even theories about colour perception - colours exist only in the brain. However, with our cameras we are measuring light, not colour, and the components of light we are measuring are red, green, and blue, as measured through relatively wide-pass colour filters.

Yellow exists as a pure spectral color with a certain wavelength. If we used very narrow-band RGB filters in the camera, yellow light would be recorded by none of the camera's red, green, or blue sensors, and the result would be black (R = G = B = 0). However, with wider-pass filters, the yellow light is recorded by both the red and green sensors. Combining red and green wavelengths of light does not produce yellow light with its characteristic wavelength, but the color is perceived as yellow, since it activates the red and green sensors in the eye, producing a metameric match.

Now, does yellow light consist of a mixture of red and green light? No, but it can be matched by a combination of red and green. In the RGB tristimulus model (Young-Helmholtz) it has red and green components.

Now in the case of an unsaturated blue, the blue is mixed with white light, which does contain RGB components along with the other colors of the rainbow. That unsaturated blue does have a true red component.
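A toy numerical sketch of Bill's point about metamerism (the Gaussian sensitivities and line positions below are made up for illustration, not real cone or camera data): two broadband "red" and "green" sensors respond almost identically to a single 580 nm line and to a suitably weighted 650 + 530 nm mixture that contains no energy at 580 nm at all.

```python
import math

def gaussian(wl, center, width):
    """Bell-shaped sensitivity curve (illustrative, not real cone data)."""
    return math.exp(-((wl - center) / width) ** 2)

def response(spectrum, center, width):
    """Total sensor response to a spectrum given as {wavelength_nm: power}."""
    return sum(p * gaussian(wl, center, width) for wl, p in spectrum.items())

# Two hypothetical broadband sensors: (center_nm, width_nm)
RED, GREEN = (600, 60), (540, 60)

pure_yellow = {580: 1.0}                 # a single spectral line at 580 nm
red_green_mix = {650: 1.48, 530: 0.60}   # two lines, zero energy at 580 nm

for name, spec in (("pure 580 nm", pure_yellow), ("650+530 nm mix", red_green_mix)):
    r, g = response(spec, *RED), response(spec, *GREEN)
    print(f"{name}: R={r:.2f} G={g:.2f}")
```

Both spectra excite the two sensors almost identically, which is exactly what a metameric match means; with narrow-band (monochromatic) filters instead, the mixture and the pure line would be trivially distinguishable.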
Title: Bit-Depth understanding required
Post by: jani on July 24, 2006, 03:57:33 pm
Quote
Now in the case of an unsaturated blue, the blue is mixed with white light, which does contain RGB components along with the other colors of the rainbow. That unsaturated blue does have a true red component.
Yes, but it's a meaningless statement. That -- plus the fact that you didn't have the RGB caveat in your original post -- is my point.
Title: Bit-Depth understanding required
Post by: bjanes on July 24, 2006, 06:58:52 pm
Quote
And therein lies the trap.

 The Science Daily article sort of explains why that might have been the case.

The Science Daily article explains nothing with regard to this discussion. We are dealing with the recording of RGB values of a blue sky by a sensor, not the perception of color.
Title: Bit-Depth understanding required
Post by: bjanes on July 24, 2006, 07:00:07 pm
Quote
Yes, but it's a meaningless statement. That -- plus the fact that you didn't have the RGB caveat in your original post -- is my point.

Jani,

Your whole argument is meaningless. I will waste no more time replying to your posts. Go argue with the British girl.  
Title: Bit-Depth understanding required
Post by: Ray on July 25, 2006, 11:54:28 pm
I make no claim to special insights into color matters, but the friction between Jani and Bill Janes makes me think there's something of interest to be resolved here.

Why should a red and a green light, mixed together, produce yellow? Off the top of my head, I would say: because the shorter wavelengths of red are yellow and the longer wavelengths of green are also yellow. Combine them both and you get a reinforcement of the frequencies in common, which produces the sensation of yellow in our minds.

Yet, doing some Googling research on the matter, I came across the following reference from the most amazing of all encyclopedias, Wikipedia, which is in a constant state of revision and correction by anyone who thinks he has the knowledge. (Great concept!)

[attachment=840:attachment]

As you will see from the above image, there is no overlap between the red and green frequencies, so the question arises, if red and green light with the exact frequencies as shown in the Wikipedia image were superimposed, or mixed, would yellow result? If so, how?
Title: Bit-Depth understanding required
Post by: Ray on July 26, 2006, 12:34:14 am
To make it clearer for those who might be confused by tables: red and orange are longer wavelengths than yellow, and green is a shorter wavelength than yellow.

How the heck does yellow appear as a result of combining the two?
Title: Bit-Depth understanding required
Post by: Jack Flesher on July 26, 2006, 12:44:38 am
Quote
Why should a red and green light, mixed together, produce yellow?

Huh???  Maybe because of Physics 101?  All colors of light are composed of the three primary colors for light: Red, Green and Blue.  From those three primaries you can make ALL colors of light.  Combine any two of the primaries and you get the secondary colors: Green and Blue combined in equal parts make Cyan; Blue and Red make Magenta; and voilà, Red and Green combined in equal parts make Yellow. Amazing how that works!   (But I suspect you really did already know this and are just teasing us  )


Now for the real kicker -- and IMO why discussions on bit-depth always break down to arguments...  

The simple answer is RGB color models suck    

The longer answer is that numerical RGB (and CMYK, for that matter) models by themselves are meaningless.  IOW, saying 100, 110, 244 is color Bxxx will NEVER be accurate in and of itself.  The reason is the numbers have no reference point unless you first define the size of the space they are representing.  So any RGB numerical model must also have a color space designation to have meaning -- Ah hah!    All bigger bit depth gives us is more numbers to define the colors with, and thus we can define them more accurately, especially as the color space itself gets bigger.  BUT! ALL spaces will still contain an infinite number of colors between neighboring numerical coordinates, and hence colors can only be APPROXIMATELY defined by the numerical representation, regardless of how big it is.  (Though 16 bits per channel can describe a BUNCH of different colors -- and most importantly, far more shades of colors than can be distinguished by the human eye.)

This is the main reason LAB or HSB makes a better model for discussing color, as they are both absolute color designations: having a color space designation for them is irrelevant, because any color we define with LAB or HSB is either inside that space or not. Thus the limits of the space can simply be designated directly by absolute values in either system, and the space definition becomes superfluous.

Cheers,
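To put rough numbers on Jack's two points -- that an RGB triple means nothing without a space, and that more bits only means a finer grid over whichever space you chose -- here is a small Python sketch. The RGB-to-XYZ matrices are the published D65 ones for sRGB and Adobe RGB (1998); for simplicity the example triple is treated as already linear (no gamma), so the numbers are illustrative only.

```python
# Published D65 RGB->XYZ matrices for two common working spaces.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
ADOBE_TO_XYZ = [
    [0.5767, 0.1856, 0.1882],
    [0.2973, 0.6274, 0.0753],
    [0.0270, 0.0707, 0.9911],
]

def to_xyz(matrix, rgb):
    """Matrix-multiply a (linear) RGB triple into CIE XYZ."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in matrix]

rgb = [0.8, 0.4, 0.1]  # the same numerical triple, interpreted two ways
print("as sRGB      ->", [round(v, 3) for v in to_xyz(SRGB_TO_XYZ, rgb)])
print("as Adobe RGB ->", [round(v, 3) for v in to_xyz(ADOBE_TO_XYZ, rgb)])

# More bits just means a finer grid over the chosen space:
for bits in (8, 12, 14, 16):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel, step = 1/{levels - 1}")
```

The same triple lands on two clearly different XYZ (i.e. physical) colors, which is why the space designation has to travel with the numbers.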
Title: Bit-Depth understanding required
Post by: Jack Flesher on July 26, 2006, 01:12:28 am
Quote
To make it clearer for those who might be confused by tables: red and orange are longer wavelengths than yellow, and green is a shorter wavelength than yellow.

How the heck does yellow appear as a result of combining the two?

Uhhhh...  Let's see if I can rephrase for you: red is a longer wavelength than yellow, green is a shorter wavelength than yellow, and you are asking why, if we average the red and green waves together, we get yellow?  The same yellow that has a wavelength part-way between red and green?  

Well Ray, if you are serious this really does explain a *lot* about the nature of your posts  
Title: Bit-Depth understanding required
Post by: 32BT on July 26, 2006, 04:34:20 am
Quote
Uhhhh...  Let's see if I can rephrase for you: red is a longer wavelength than yellow, green is a shorter wavelength than yellow, and you are asking why, if we average the red and green waves together, we get yellow?  The same yellow that has a wavelength part-way between red and green? 

Well Ray, if you are serious this really does explain a *lot* about the nature of your posts  


Well, actually that is a valid question, and I am afraid it has nothing to do with averaging. The reason we perceive a combination of those two frequencies as yellow is because our perception seems to work as a trichromatic scanner. That is: our perception can be thought of as filtering light with three colored filters just as an RGB scanner might do.

And these filters are broadband filters, not monochromatic frequency detectors. In other words: if you excite 2 of the filters, both with a single frequency, it will appear to us as if we see a combination of the filter colors, not the frequency colors, because frequencies do not have the property "color".

And this also means that two other frequencies may result in the same perceived color, because the two filters are excited similarly. Hence: metamerism.


Also, the OP's problem was a relative one, concerning particularly the difference in any single primary over a graduated color range. The sky, for example, might go from an almost-white to a desaturated blue. And even though the red component will likely show the largest difference, it may still be less than necessary for a perceived smooth gradation. The usual perceptual thresholds (delta E = 3, etc.) do not apply, because when we are dealing with gradations of a single color, our perception becomes more sensitive to differences.

Even so, I don't believe the difference between 12-bit and 16-bit is noticeable or relevant in the original problem, so I don't think it is related to bit-depth differences per se; perhaps the OP is shooting JPEG, for example, or the colorspace differences between scanner and camera are such that the screen representation produces more banding issues, etc.
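Oscar's banding point can be illustrated with a quick sketch: quantize the same gentle sky-like ramp at different bit depths and count how many distinct levels survive. The gradient here (a channel drifting by 10% of full scale across 1000 pixels) is made up for illustration.

```python
# A subtle sky gradient: one channel drifting from 0.60 to 0.70 across 1000 px.
# At low bit depth the gradient collapses into a handful of flat bands.

def quantize(x, bits):
    """Snap a 0..1 value onto a uniform grid with 2**bits levels."""
    levels = 2 ** bits - 1
    return round(x * levels) / levels

width = 1000
gradient = [0.60 + 0.10 * i / (width - 1) for i in range(width)]

for bits in (8, 12, 14):
    q = [quantize(v, bits) for v in gradient]
    bands = len(set(q))
    print(f"{bits}-bit: {bands} distinct levels across the gradient")
```

At 8 bits this ramp survives as only a few dozen flat steps (visible banding after heavy edits); at 12 or 14 bits there are hundreds of steps across the same range, which is why banding in a smooth sky is usually a bit-depth-after-processing problem rather than a capture problem.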
Title: Bit-Depth understanding required
Post by: PeterLange on July 26, 2006, 07:05:38 am
Quote
This is the main reason LAB or HSB makes a better model for discussing color, as they are both absolute color designations: Having a color space designation for them is irrelevant...

HSB is a pure RGB derivative,
and therefore dependent on the underlying color space.

At least that’s how Photoshop computes
(see also http://en.wikipedia.org/wiki/HSV_color_space)

Peter

--
Title: Bit-Depth understanding required
Post by: bjanes on July 26, 2006, 08:01:08 am
Quote
I make no claim to special insights into color matters, but the friction between Jani and Bill Janes makes me think there's something of interest to be resolved here.

Why should a red and a green light, mixed together, produce yellow? Off the top of my head, I would say: because the shorter wavelengths of red are yellow and the longer wavelengths of green are also yellow. Combine them both and you get a reinforcement of the frequencies in common, which produces the sensation of yellow in our minds.

Yet, doing some Googling research on the matter, I came across the following reference from the most amazing of all encyclopedias, Wikipedia, which is in a constant state of revision and correction by anyone who thinks he has the knowledge. (Great concept!)

[attachment=840:attachment]

As you will see from the above image, there is no overlap between the red and green frequencies, so the question arises, if red and green light with the exact frequencies as shown in the Wikipedia image were superimposed, or mixed, would yellow result? If so, how?

Ray,

The answer to your question is that mixing red and green light does not produce yellow light, for the reasons you cite, but the mixture is perceived as yellow because the red and green sensors of the eye have broad spectral responses centered on red and green respectively, while actually responding to a wide range of colors. If you look at the reference I posted previously, both the red and green sensors have a strong response to yellow light.

http://www.photo.net/photo/edscott/vis00010.htm

It is interesting that the red and green cone response curves are relatively close together and overlap, but the blue response is considerably to the left on the graph, and there appears to be a gap in sensor response between blue and green. An interesting article in this month's Scientific American magazine (an American magazine for the intelligent layman) explains the reason for this in evolutionary terms.

It turns out that birds have a fourth sensor between the blue and green and have better and more extended color vision than humans. As mammals evolved from the common ancestor, they lost two color sensors since they were operating on the forest floor where light was dim and color vision was not useful. These mammals relied mainly on the responses of the rods.

As primates evolved, they ascended into the trees and needed color vision in order to pick out colored fruits and they regained a sensor. However, there is still a gap where the fourth sensor was originally, and this explains the gap between the blue and green sensors.

For color perception, the brain integrates the responses from all the sensors and determines the color by a process similar to triangulation.

Bill



Title: Bit-Depth understanding required
Post by: Ray on July 26, 2006, 08:04:10 am
Quote
And these filters are broadband filters, not monochromatic frequency detectors. In other words: if you excite 2 of the filters, both with a single frequency, it will appear to us as if we see a combination of the filter colors, not the frequency colors, because frequencies do not have the property "color".



As I imagined. Thanks for the explanation, opgr. If a narrow band of red light (700 nm) is combined with a narrow band of green (550 nm), the wavelengths of those 2 frequencies do not change, or average, and no frequency which we might ascribe to pure yellow (580 nm) necessarily exists as a result of that combination. The visual cortex is so constructed that a single frequency of pure yellow (580 nm) will have the same (or similar) effect to a combination of two quite different frequencies, 700 nm and 550 nm. Interesting!
Title: Bit-Depth understanding required
Post by: Ray on July 26, 2006, 08:37:18 am
Quote
It turns out that birds have a fourth sensor between the blue and green and have better and more extended color vision than humans.


Bill,
Very interesting stuff. Yes, I've read of this fourth primary color sense which some birds are supposed to have. It explains to some extent the brilliant blue plumage that some 'birds of paradise' display.

What struck me as interesting in the photo.net article you refer to is that the 'red' receptor, or cone, is most sensitive to yellow, not red -- that is, light with a wavelength of 580 nm -- hence the argument in favour of yellow fire engines.
Title: Bit-Depth understanding required
Post by: PeterLange on July 26, 2006, 04:19:08 pm
Quote
To make it clearer for those who might be confused by tables....

In addition to Bill’s and Oscar’s posts,
let me try to develop an example:

… by first creating a most saturated Yellow in CIE RGB:
RGB = 255, 255, 0
HSB = 60, 100, 100

… now, referring to the CIE color matching functions
(see page 4: http://www.fho-emden.de/~hoffmann/ciexyz29082000.pdf)
it can be easily seen that the r and g weighting curves meet each other at about 570 nm.

… so IF my humble monitor would be able to show all CIE RGB colors
based on the respective RGB primaries of 700/546.1/435.8 nm
(in the sense of a thought experiment)
the yellow created above would look just the same
as a spectrally pure yellow of 570 nm.

At least this should be valid for the standard human observer; not for birds, nor for people with fancy perception.

Peter

--
Title: Bit-Depth understanding required
Post by: Dennis on July 27, 2006, 08:27:53 am
Quote
It turns out that birds have a fourth sensor between the blue and green and have better and more extended color vision than humans.
It's not a trick, it's a SONY ;-)

 I always thought that SONY invented the RGBE pattern...
Title: Bit-Depth understanding required
Post by: Dennis on July 27, 2006, 08:42:48 am
Quote
All colors of light are composed of the three primary colors for light, Red, Green and Blue.
Aha. Would you say that the yellow light of the sodium emission spectrum is composed of red and green?

(http://library.thinkquest.org/21008/pictures/spectrum3.jpg)
(http://library.thinkquest.org/21008/data/sky/spectroscopy1.htm)

All colors of light can be composed of the three primary colors for light
Title: Bit-Depth understanding required
Post by: Ray on July 27, 2006, 08:14:39 pm
Quote
All colors of light can be composed of the three primary colors for light


I agree, this distinction should be made, and it appears it is sometimes forgotten by many of us, including me. I now wonder what the difference would be between a red and a green light, superimposed, neither of which extended into the yellow part of the spectrum, and a red and a green light which both extended into the yellow region. I presume one would experience a sensation of 'brighter' yellow, but maybe not.
Title: Bit-Depth understanding required
Post by: DiaAzul on July 28, 2006, 04:11:24 am
Quote
All colors of light are composed of the three primary colors for light, Red, Green and Blue.

I agree, this distinction should be made, and it appears it is sometimes forgotten by many of us, including me. I now wonder what the difference would be between a red and a green light, superimposed, neither of which extended into the yellow part of the spectrum, and a red and a green light which both extended into the yellow region. I presume one would experience a sensation of 'brighter' yellow, but maybe not.

Jack's original comment needs some modification - OUR PERCEPTION / VISUAL SYSTEM can be fooled into seeing all colours by careful stimulation of the rods and cones by the three primary colours. The three primary colours being those colours to which the rods/cones each individually are most sensitive.

It is not possible in physics to create a new colour (wavelength) by mixing three separate colours (wavelengths) in the way that is being conveyed in this thread.


The choice of three primary colours is arbitrary in any system and, ideally, the camera sensor, display and printer primary colours (and colour response characteristics) should match those of the average human eye (forget the argument over which of sRGB, aRGB, ProPhoto, etc. gives the best match to our own visual system).
Title: Bit-Depth understanding required
Post by: 32BT on July 28, 2006, 06:00:16 am
Quote
Jack's original comment needs some modification - OUR PERCEPTION / VISUAL SYSTEM can be fooled into seeing all colours by careful stimulation of the rods and cones by the three primary colours. The three primary colours being those colours to which the rods/cones each individually are most sensitive.

It is not possible in physics to create a new colour (wavelength) by mixing three separate colours (wavelengths) in the way that is being conveyed in this thread.
The choice of three primary colours is arbitrary in any system and, ideally, the camera sensor, display and printer primary colours (and colour response characteristics) should match those of the average human eye (forget the argument over which of sRGB, aRGB, ProPhoto, etc. gives the best match to our own visual system).


Nope, forget all of this...

It turns out that, in order to simulate ALL visible colors with a single set of primaries, these primaries have to be defined outside the visible spectrum.

So ALL colors of light are NOT composed of some combination of a single set of REAL primaries, and that sort of eliminates the possibility of an ideal device with human response characteristics.

Think of the colorful shoe sole of the CIE chromaticity diagram. Can we define three points within this set whose triangle completely encompasses the entire set?
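Oscar's "shoe sole" question can even be checked numerically. Taking the published sRGB primaries in CIE xy, and an approximate chromaticity for monochromatic 520 nm light read off the CIE 1931 diagram (so treat that value as illustrative), a standard point-in-triangle test shows the spectral green falling outside the triangle spanned by the real primaries.

```python
def cross(o, a, b):
    """2D cross product of vectors OA and OB; its sign gives the turn direction."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_triangle(p, a, b, c):
    """True if point p lies inside (or on the edge of) triangle abc."""
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = min(d1, d2, d3) < 0
    has_pos = max(d1, d2, d3) > 0
    return not (has_neg and has_pos)

# sRGB primaries in CIE xy chromaticity (from the sRGB spec)
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)

white = (0.3127, 0.3290)      # D65 white point
spectral_green = (0.07, 0.83)  # approx. xy of monochromatic 520 nm light

print("D65 inside sRGB triangle:   ", inside_triangle(white, R, G, B))
print("520 nm inside sRGB triangle:", inside_triangle(spectral_green, R, G, B))
```

The white point sits comfortably inside the triangle, but the saturated spectral green does not; since the spectral locus bulges outward everywhere, no three real (visible) primaries can enclose it all, which is Oscar's point.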
Title: Bit-Depth understanding required
Post by: DiaAzul on July 28, 2006, 01:24:08 pm
Quote
So ALL colors of light are NOT composed of some combination of a single set of REAL primaries, and that sort of eliminates the possibility of an ideal device with human response characteristics.


But we don't need to talk about all colours of light... only those that we can actually see. What is the point of recording colours that will be neither reproducible nor observable? That type of device may be required in a scientific or engineering environment, but for photography leading to viewable images there is no need to go to the obsessive extremes of the perfect device at exorbitant cost.
Title: Bit-Depth understanding required
Post by: jani on July 29, 2006, 06:39:13 am
Quote
But we don't need to talk about all colours of light... only those that we can actually see. What is the point of recording colours that will be neither reproducible nor observable? That type of device may be required in a scientific or engineering environment, but for photography leading to viewable images there is no need to go to the obsessive extremes of the perfect device at exorbitant cost.
 

So you're suggesting that IR and UV photography "isn't necessary", or that photography is always documentary of "what we can actually see"?

I also suggest you read up a bit on human vision; the response curves cited around here are normalised curves based on tests on a selection of humans. It most certainly isn't how everybody sees.

Try looking up e.g. tetrachromacy (http://www.science-writer.co.uk/award_winners/16-19_years/2004/winner.html), or at the very least read the basic articles about human color vision and color blindness on Wikipedia.
Title: Bit-Depth understanding required
Post by: PeterLange on July 29, 2006, 05:49:43 pm
Quote
But we don't need to talk about all colours of light... only those that we can actually see. What is the point of recording colours that will be neither reproducible nor observable? That type of device may be required in a scientific or engineering environment, but for photography leading to viewable images there is no need to go to the obsessive extremes of the perfect device at exorbitant cost.
Imaginary colors & primaries are just a tradeoff to hold more or even all colors of human vision within a ‘triangular’ matrix space.

Monitors are of course limited to real-world phosphors / primaries.
Cameras are not.

Peter

--
Title: Bit-Depth understanding required
Post by: DiaAzul on July 30, 2006, 03:03:44 pm
Quote


So you're suggesting that IR and UV photography "isn't necessary", or that photography is always documentary of "what we can actually see"?

I also suggest you read up a bit on human vision; the response curves cited around here are normalised curves based on tests on a selection of humans. It most certainly isn't how everybody sees.

Try looking up e.g. tetrachromacy (http://www.science-writer.co.uk/award_winners/16-19_years/2004/winner.html), or at the very least read the basic articles about human color vision and color blindness on Wikipedia.

So what is your point, other than trying to be patronising?
Title: Bit-Depth understanding required
Post by: jani on July 30, 2006, 07:20:35 pm
Quote
So what is your point, other than trying to be patronising?
My point was not about trying to be patronising at all.

I was just shocked at your attitude towards IR and UV photography: that you should think it somehow isn't usable for anything other than "scientific or engineering" purposes, and that you seemed so ignorant of the variations in human vision.

While a device doesn't need to be "perfect" -- whatever you mean by that -- there certainly are uses beyond a very basic trichromatic model centered around narrowly chosen primaries: artistic ones, like those of e.g. Bjørn Rørslett or our patron.