Luminous Landscape Forum

Raw & Post Processing, Printing => Digital Image Processing => Topic started by: Guillermo Luijk on August 30, 2009, 06:04:45 am

Title: Is 8-bit enough for OUTPUT quality?
Post by: Guillermo Luijk on August 30, 2009, 06:04:45 am
A couple of times it has been discussed whether 8 bits (i.e. 256 levels per channel) are enough for output quality, even assuming the processing was done entirely in 16 bits and the conversion to 8 bits took place only in the last step.

In areas of flat colour with a good exposure, i.e. where the dithering effect of noise is minimised, some images display banding. Some people claim that when this happens it is only because of bad processing, but the truth is that with increments of just one 8-bit level from one band to the next, I can actually see the bands in the following image (look for the bands in the first version):

(http://img42.imageshack.us/img42/2587/bandas.jpg)

In the second version I enhanced these bands so that they could be easily identified.
The level increment in the first image was the finest achievable with an 8-bit encoding.

This also happens sometimes in skies, which have the same properties as this wall (flat colour and an absence of noise).
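A minimal numpy sketch of the effect (the ramp values are illustrative): a smooth, narrow luminance ramp like this wall, converted to 8 bits as the very last step, still ends up as a staircase of single-code bands:

```python
import numpy as np

# A smooth, softly lit wall rendered as a narrow luminance ramp
# (values are illustrative), converted to 8 bits only in the last step.
ramp = np.linspace(0.35, 0.43, 512)             # high-precision "16-bit" data
ramp8 = np.round(ramp * 255).astype(np.uint8)   # final 8-bit output

steps = np.diff(ramp8.astype(int))
bands = np.count_nonzero(steps)

# Every transition is exactly one code value -- the finest step 8 bits
# allows -- yet on a clean, noise-free area each edge can read as a band.
print(bands, int(steps.max()))
```

The point of the sketch is that no amount of careful processing can make these transitions smaller: one code value is already the floor of an 8-bit encoding.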

What do you think?

1. Is 8-bit (and therefore JPEG) enough for output quality, or does it come dangerously close to the perceivable limit?
2. Is this very slight banding visible when printing?
3. Could I be seeing these bands because of bad calibration of my screen?

Regards.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Graeme Nattress on August 30, 2009, 09:22:11 am
8-bit, properly dithered, is probably good enough, but not optimal. It looks like your image was not dithered; or if it was, the lossy JPEG compression would see the dither as noise and probably remove most of it.
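The dithering Graeme describes can be sketched in numpy (the ramp is illustrative): adding up to half a code value of random noise before rounding breaks the coherent band edges into a mix of adjacent codes while keeping the local mean on target:

```python
import numpy as np

rng = np.random.default_rng(0)

# The same kind of smooth ramp, quantized to 8 bits two ways.
ramp = np.tile(np.linspace(0.35, 0.43, 512), (64, 1))

plain = np.round(ramp * 255)                # hard quantization -> clean bands
noise = rng.uniform(-0.5, 0.5, ramp.shape)  # +/- half a code value of dither
dithered = np.round(ramp * 255 + noise)     # dither added before rounding

# In the plain version every pixel of a column is identical, so each band
# edge is perfectly coherent; the dithered version mixes adjacent codes
# while the column mean stays on the true value.
col = dithered[:, 256]
print(np.unique(plain[:, 256]).size, round(float(col.mean()), 2))
```

This is also why lossy compression can undo the benefit: a codec that smooths away the "noise" restores the coherent steps.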
Title: Is 8-bit enough for OUTPUT quality?
Post by: bjanes on August 30, 2009, 10:10:16 am
Quote from: GLuijk
What do you think?

1. Is 8-bit (and therefore JPEG) enough for output quality, or does it come dangerously close to the perceivable limit?
2. Is this very slight banding visible when printing?
3. Could I be seeing these bands because of bad calibration of my screen?

Regards.

Guillermo,

A thought-provoking post. Your demonstration image appears to have a very steep contrast curve in the areas where the banding is apparent, which accentuates the banding. Whether or not banding would be present with more normal contrast is debatable. Before attempting to answer your question, I performed a Google search and reviewed some literature on the topic.

Norman Koren (http://www.normankoren.com/digital_tonality.html) uses a theoretical approach based on the Weber-Fechner law, which deals with just-perceptible differences in vision, and concludes that an 8-bit workflow for the final image is acceptable, but only just.
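Koren's conclusion can be checked with a small sketch (assumptions: a pure gamma-2.2 encoding rather than the exact sRGB curve, and the common ~1% Weber fraction for a just-noticeable luminance difference): the relative luminance jump between adjacent 8-bit codes hovers right around that threshold over much of the tonal range.

```python
# Relative luminance jump between adjacent 8-bit codes under a gamma-2.2
# encoding (a sketch: real sRGB has a linear toe, and the ~1% Weber
# fraction for a just-noticeable difference is itself an approximation).
GAMMA = 2.2

def rel_step(v):
    """Fractional luminance change from code v to code v + 1, L ~ (v/255)**GAMMA."""
    return ((v + 1) / v) ** GAMMA - 1

for v in (32, 64, 128, 200, 254):
    print(v, f"{100 * rel_step(v):.2f}%")
```

Mid-tone steps come out near 1.7%, above the ~1% threshold, while only the brightest codes fall below it, which is consistent with "acceptable, but only just" (the gamma encoding deliberately spends its codes where shadows need them most).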

A Previous Thread (http://luminous-landscape.com/forum/index.php?s=&showtopic=14241&view=findpost&p=96132) on LL discussed 16 bit printing, and some photographers claimed to see a difference. However, 16 bit printer drivers at that time appeared to be available only on the Mac with more recent versions of the Mac operating system. For us Windows users, the question may be academic unless special drivers supporting 16 bit are available.

Mike Chaney (http://www.steves-digicams.com/knowledge-center/hype-or-hero-take-2-16-bit-printers.html) discusses 16 bit printing with the newer ink jet printers offering an expanded gamut and concludes that 16 bit printing might offer a small advantage for those using 16 bit ProPhotoRGB.

Since I use 16-bit ProPhotoRGB for my more critical work with my Epson 2200 printer, it would make sense to have a 16-bit driver, but I have not noted banding in my output and do not think that 16-bit output is necessary with this printer. With the more recent Epsons with still wider gamuts, it could help, but I can't comment on this. From a literature review, I conclude that the improvement would be small. I would be interested in hearing from users of wider-gamut printers, since I may upgrade printers in the near future. Also, I would like to know if 16-bit output would be possible under Windows 7 with Photoshop.

Title: Is 8-bit enough for OUTPUT quality?
Post by: feppe on August 30, 2009, 10:22:46 am
Very good post, indeed, looking forward to insights from the more technically oriented. My gut feeling is that there are surely theoretical and technical demonstrations to show there is a difference, but in 99.999% of real-world photo cases you won't see a difference.

The banding you show in the original picture is not visible on my non-calibrated* monitor. The banding is clearly visible in the exaggerated version, but I'm not sure what the point of showing that is, since that's not the one you'll print?

I've only seen banding in one of my photos in the sky, but that was on monitor.

* had to reinstall Win7 since my WD Raptor died last week; thanks for the reminder to calibrate.
Title: Is 8-bit enough for OUTPUT quality?
Post by: digitaldog on August 30, 2009, 10:38:51 am
Quote from: GLuijk
1. Is 8-bit (and therefore JPEG) enough for output quality, or does it come dangerously close to the perceivable limit?
2. Is this very slight banding visible when printing?
3. Could I be seeing these bands because of bad calibration of my screen?

1. I suspect it's possible; it depends on the output device, but as mentioned, most are handing off 8 bits of data anyway. 8 good bits have been fine for years, but output devices are getting better. Epson claims that while you'll be hard pressed to see a difference sending more than 8 bits through their newer drivers, it gives them more ammo in the future to deal with the data, dither, etc. And since you started out with more than 8 bits, why truncate the data?

2. I suspect it will be difficult to see, if visible at all, but again, it depends on the printer and, most importantly, on #3: it is possible this is banding from the display, if it's just a screen capture. The farther you calibrate a display from its native behavior, with panels that don't support high bit depth (that use 8 bits through the graphics system), the more likely you are to see banding on screen. The problem is, as we see here, it's unknown whether the banding is in the document or the result of the display.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on August 30, 2009, 11:44:50 am
Guillermo,

My take on this is as follows:

(1) The impact of 8-bit vs 16-bit on image quality occurs mainly at the processing stage, not the printing stage. It's best to have the math performed on high-bit data so you don't get the kind of gaps between levels that would show as banding. It makes a hell of a difference if you lose, say, 20 levels on a 255-level scale versus the same loss on an approximately 32,000-level scale (Photoshop's 16-bit is 15 bits + 1). But once all the editing work is done, and you've resized the image to output specification in 16-bit mode, it's baked into the image at that point, and if you then convert it to 8 bits and do nothing but send it to an output device, you won't lose any smoothness of tonal gradation.

(2) As Bill Janes observed, the lower half of your illustration has extremes of tonal compression and with adjustments of this radical nature you've lost too many levels in 8-bit mode and bust the barriers so to speak. You can only stretch an image so far with these adjustments before the destruction of levels shows very obviously. However, if you repeat the same experiment in 16-bit it could look less bad.

(3) As Andrew says, the printers are 8-bit devices. Added to that, how they print has nothing to do with the bit depth of your image file. The printer driver is figuring out the best way to dither the CMYK ink dots of your RGB image in order to print a photographic-looking image. This technology is unrelated to the bit depth of the image file - it's the screening and dithering technology of the printer driver, or a RIP if you are using one of those. I don't know what a 16-bit printer would do to improve how this technology works, but what we have now in the new generation of Epson printers (all 8 bit devices) delivers truly astounding image quality. It's always hard to imagine much better unless you actually see it when they invent it!
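The level-loss arithmetic in point (1) can be sketched by pushing the same steep curve through both bit depths and counting the distinct output levels that survive (the curve itself is hypothetical, a radical Levels-style stretch):

```python
import numpy as np

# Hypothetical steep curve: expand the 30%..45% band of the range to
# full scale, clipping the rest -- a radical Levels-style stretch.
def stretch(x, top):
    lo, hi = 0.30 * top, 0.45 * top
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

ramp8 = np.arange(256, dtype=np.float64)       # 8-bit working data
ramp16 = np.arange(32769, dtype=np.float64)    # Photoshop-style 15-bit + 1

# Both pipelines hand the printer an 8-bit result in the end.
out_from_8 = np.unique(np.round(stretch(ramp8, 255) * 255))
out_from_16 = np.unique(np.round(stretch(ramp16, 32768) * 255))

# Editing at 8 bits leaves large gaps between surviving levels (banding);
# editing at high bit depth and converting afterwards fills all 256 codes.
print(out_from_8.size, out_from_16.size)
```

The 8-bit pipeline ends up with a few dozen distinct levels spread over the full scale, while the high-bit pipeline still delivers a full 256, which is exactly the "gaps between levels" argument.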

Mark
Title: Is 8-bit enough for OUTPUT quality?
Post by: sandymc on August 30, 2009, 11:48:36 am
Thing is, with a sufficiently steep tone curve in post processing, you could get a 16 bit image to show the same banding.  

The question of whether 8-bits is enough is really two questions - is 8-bits enough for your processing, and is 8 bits enough for your output?

In this case, I think it's clear that 8 bits is not enough for your processing.

Sandy
Title: Is 8-bit enough for OUTPUT quality?
Post by: gcs on August 30, 2009, 12:08:22 pm
Quote from: MarkDS
Guillermo,

My take on this is as follows:

(1) The impact of 8-bit vs 16-bit on image quality occurs mainly at the processing stage, not the printing stage. It's best to have the math performed on high-bit data so you don't get the kind of gaps between levels that would show as banding. It makes a hell of a difference if you lose, say, 20 levels on a 255-level scale versus the same loss on an approximately 32,000-level scale (Photoshop's 16-bit is 15 bits + 1). But once all the editing work is done, and you've resized the image to output specification in 16-bit mode, it's baked into the image at that point, and if you then convert it to 8 bits and do nothing but send it to an output device, you won't lose any smoothness of tonal gradation.

Mark,

Reading your comment (1), I now understand why I had banding in the sky of a 40x40" (100x100 cm) B&W print: I first converted from 16 to 8 bit, then resized. After printing, the Epson technician said that the banding problem came from the image and not the printer. I opened the file at 16 bit, zoomed to 1000% and showed him that my file had no banding, but I never checked the 8-bit version.

I will check it again.

Gonzalo
Title: Is 8-bit enough for OUTPUT quality?
Post by: Guillermo Luijk on August 30, 2009, 12:17:30 pm
Quote from: MarkDS
(2) As Bill Janes observed, the lower half of your illustration has extremes of tonal compression and with adjustments of this radical nature you've lost too many levels in 8-bit mode and bust the barriers so to speak. You can only stretch an image so far with these adjustments before the destruction of levels shows very obviously. However, if you repeat the same experiment in 16-bit it could look less bad.

Hi Mark and all, I am not sure if my images are being correctly understood. The image where I can see the banding is the first one. It's hard to see, but it can still be perceived, at least on my monitor. The second version was just post-processing to clearly display the bands actually found in the first image, where the level jumps could not be smaller on an 8-bit scale.

So I wonder what the solution is: ignore it, since the banding is very slight and hardly seen; add noise to dither the banding away; or forget about it completely because it will not be a problem when printing.

The source of the image is an overexposed RAW file that had to be pulled down by 4 EV to create the final image; the resulting high SNR is the reason for the almost total absence of noise.
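The 4 EV pull is simple arithmetic on linear raw data (a sketch; the sensor counts below are illustrative, not from this file):

```python
# On linear raw data, pulling exposure down by N stops divides by 2**N.
ev = 4
scale = 2 ** ev          # 4 EV -> divide the RGB values by 16

# Photon shot noise grows as sqrt(signal), so a frame exposed hard to the
# right starts with a high SNR, and a pure division preserves that ratio.
signal = 16000.0                     # illustrative near-clipping raw count
noise = signal ** 0.5                # shot-noise estimate
snr_before = signal / noise
snr_after = (signal / scale) / (noise / scale)
print(scale, round(snr_before, 1), round(snr_after, 1))
```

So the expose-to-the-right frame keeps its high SNR through the pull, which is why the wall comes out almost noise-free and the quantization steps have nothing to hide behind.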

Regards.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on August 30, 2009, 12:23:24 pm
Guillermo, do you think the enhancement of the bands also contributed to worsening them?

Anyhow, I think what I suggested to you holds. Stay in 16 bit as long as you can. Good insurance.

Mark
Title: Is 8-bit enough for OUTPUT quality?
Post by: Guillermo Luijk on August 30, 2009, 12:30:52 pm
Quote from: MarkDS
Guillermo, do you think the enhancement of the bands also contributed to worsening them?
No, the bands are the same in both images; only contrast was added in the second. Note that we could never have smoother transitions, since the jump in levels is the minimum for an 8-bit scale: (89,79,70) to (90,80,71) in the sample, just a 1-level shift in all 3 channels.

Reading the comments about improving printing equipment, it's perhaps time for a 16-bit lossy compressed format to start becoming popular for demanding applications.

BR
Title: Is 8-bit enough for OUTPUT quality?
Post by: Panopeeper on August 30, 2009, 09:06:47 pm
Guillermo,

questions:

1. Is what you posted a 100% crop or a reduced size?

2. At which size do you see the banding - at 100%, or displayed in Photoshop at 25% or whatever?

3. If these are reduced sizes: have you tried reducing the size of the image itself (not just the displayed size by zooming out)?

4. How did you "reduce the exposure by 4 EV"?
Title: Is 8-bit enough for OUTPUT quality?
Post by: laughfta on August 31, 2009, 08:34:31 am
I generally approach these types of problems quite "unencumbered by the thought process", so if you could make the file available, I would be happy to print it, which would at least be an indication of where the problem lies. I have printed quite a few files where some banding was visible on the display (at any %), but the print itself was fine. I have also had a file with barely visible banding in the sky that showed colored banding in the print. Reducing the image/file size (radically) eliminated the problem. That was an 8-bit image; I don't remember seeing this problem at all when working in 16-bit.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Guillermo Luijk on August 31, 2009, 03:01:29 pm
Quote from: Panopeeper
1. Is what you posted a 100% crop or a reduced size?
2. At which size do you see the banding - at 100%, or displayed in Photoshop at 25% or whatever?
3. If these are reduced sizes: have you tried reducing the size of the image itself (not just the displayed size by zooming out)?
4. How did you "reduce the exposure by 4 EV"?

1. It's a resized version of an entire image. Unfortunately I don't have the original or the RAW files anymore.
2. I see it at that size. I don't remember how it looked at the original size, but checking similar images I have of the same scene, at full size I cannot distinguish any banding.
4. By dividing the RGB values by a scaling factor close to 16.

I suppose the key you are seeking is in 2. Perhaps because it is a reduced size, noise was averaged out and the SNR improved so much that banding became visible, whereas it was probably not visible in the original image thanks to noise dithering.
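That mechanism can be sketched in numpy (assumptions: Gaussian noise of about 4 code values at full size, and an 8x downsize by block averaging):

```python
import numpy as np

rng = np.random.default_rng(1)

# Full-size: smooth ramp plus sensor-like noise (~4 code values), then
# quantized to 8 bits. The noise dithers the quantization steps.
h, w = 512, 2048
ramp = np.tile(np.linspace(0.35, 0.43, w), (h, 1))
full8 = np.round((ramp + rng.normal(0.0, 4.0 / 255.0, ramp.shape)) * 255)

# Downsizing by averaging 8x8 blocks cuts the noise std by about 8x,
# leaving far less dither to hide the underlying staircase.
small = full8.reshape(h // 8, 8, w // 8, 8).mean(axis=(1, 3))

noise_full = float(full8.std(axis=0).mean())    # per-column noise, full size
noise_small = float(small.std(axis=0).mean())   # per-column noise, downsized
print(round(noise_full, 2), round(noise_small, 2))
```

With the residual noise down around half a code value, the one-code steps of the ramp are no longer reliably dithered, so bands that were invisible at full size can appear in the reduced version.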

Anyway, I wouldn't like to focus on this particular image, but on the general case of problems arising with 8-bit outputs, no matter whether they come from a photograph or any other source, such as a synthetic gradation. Imagine you compose an image with a synthetic background that is a perfect transition of levels, and after converting to 8-bit you see the banding. I think this proves 8 bits may not be enough to cheat our eye in all situations, making some dithering techniques necessary to display it without any noticeable banding.

(http://img27.imageshack.us/img27/8062/bandsl.gif)

BR
Title: Is 8-bit enough for OUTPUT quality?
Post by: cmi on August 31, 2009, 05:15:29 pm
Quote from: GLuijk
A couple of times it has been discussed whether if 8 bits (i.e. 256 values) is enough or not for output quality, even assuming the processing was perfectly done in 16 bits and the conversion to 8 bit took place in the last step.

Very easy issue. That 8-bit imagery can show visible banding, given fine enough colour transitions, doesn't need to be discussed. Whether it is going to be an issue depends on the image and the output medium. When printing fine gradients, I always add noise (error diffusion) to hide the banding; that seems to be the common way. Also, when outputting to TV we often used dithering for some gradients. What I haven't done, but have heard about, is output to film, and the consensus back then was that for the big screen (cinema), 8 bits just won't cut it, so they used 12 (if I remember right) or more bits. So for print, I would personally need to be shown the difference between 8-bit dithered and 16-bit undithered. Maybe it exists, but these cases have to be rare.

Christian


//edit: For very large prints I cannot say; maybe there is a point where 8-bit dithered files are just not adequate. Somebody with actual hands-on experience printing very large sizes could easily clarify that.
Title: Is 8-bit enough for OUTPUT quality?
Post by: NikoJorj on September 01, 2009, 04:09:52 pm
Quote from: GLuijk
In plain colour areas with a good exposure, i.e. where the dithering effect of noise is minimised, some images display banding.
Just to add my 2c (of euro): I can also see some slight banding in the first image, but I'd personally be of the opinion that the dithering introduced by inkjet printing could easily hide it in most cases...
Title: Is 8-bit enough for OUTPUT quality?
Post by: madmanchan on September 01, 2009, 04:27:40 pm
My opinion is that the original question needs to be posed more carefully. Guillermo wants to know whether 8 bits is enough for output, not editing. But what, exactly, does the term "output" mean? To me, the term "output" really means the final space after which no additional color transformations are applied by devices (hardware or software) prior to viewing.

If you post an 8-bit file on a web site, as done here, that's not really an "output" file. That 8-bit image may undergo additional color transformations via the viewing application or OS's color management system (via the display profile), and possibly the video card. These can easily introduce banding and other artifacts.

Similarly, you could take an 8-bit file tagged with a standard RGB working space and say it's ready to be printed, no further editing to be done ... but it's still not really an "output" file. The image data will still undergo a transformation via the output profile into the device space, and again, artifacts can be introduced in this process and can sometimes be visible in the final print.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 01, 2009, 05:00:09 pm
Quote from: madmanchan
My opinion is that the original question needs to be posed more carefully. Guillermo wants to know whether 8 bits is enough for output, not editing. But what, exactly, does the term "output" mean? To me, the term "output" really means the final space after which no additional color transformations are applied by devices (hardware or software) prior to viewing.

If you post an 8-bit file on a web site, as done here, that's not really an "output" file. That 8-bit image may undergo additional color transformations via the viewing application or OS's color management system (via the display profile), and possibly the video card. These can easily introduce banding and other artifacts.

Similarly, you could take an 8-bit file tagged with a standard RGB working space and say it's ready to be printed, no further editing to be done ... but it's still not really an "output" file. The image data will still undergo a transformation via the output profile into the device space, and again, artifacts can be introduced in this process and can sometimes be visible in the final print.

Eric, in your last para above, if I understand correctly, once the file is prepped for print but we haven't pushed the print button yet, even though we see a version in the Print Preview, that's not the final set of data going to the printer. To understand the process a bit better: what happens after we click Print? Let's say it's going to an Epson 3800 and we're not using a RIP - we're using the Epson driver, and before we click Print we're in the 16-bit ProPhoto working colour space. At least three things I'm aware of need to happen: Photoshop (through the PCS) converts the numbers from Photoshop's ProPhoto RGB working colour space to the print colour space, guided by the printer/paper profile; then the Epson driver converts the data from 16 to 8 bit; and it needs to re-interpret the RGB data as CMYK for printing. (1) Is this a correct portrayal of the critical steps? (2) Where in the chain do you think the greatest risk would arise of banding that you could not see on, say, a "10-bit" display but that would show on paper?

Mark
Title: Is 8-bit enough for OUTPUT quality?
Post by: bjanes on September 01, 2009, 05:40:44 pm
Quote from: GLuijk
Hi Mark and all, I am not sure if my images are being correctly understood. The image where I can see the banding is the first one. It's hard to see it, but still can be perceived at least in my monitor. The second version was just a pp to clearly display the bands actually found in the first image, where the level jumps could not be smoother in a 8-bit scale.
Regards.

I, for one, misinterpreted your original intent. I now understand that banding was present in the original image (I can't see it on my screen), but was accentuated by the contrast curve. Since 8 bits is just adequate for most images, there must be some cases where 16-bit output would be better. Whenever editing of an image causes the Weber-Fechner limit to be exceeded, banding might be perceptible. However, if it can be perceived only with special curves and is not visible in the final image, then it is of little concern. With the low noise of current cameras at low ISO, noise dithering may no longer mask posterization. Emil's web site has examples of how noise can dither out posterization.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Guillermo Luijk on September 01, 2009, 05:54:10 pm
Quote from: bjanes
With the low noise of current cameras at low ISO, noise dithering may no longer mask posterization.
I think this is the key to the story. Following this idea, the problem should arise more easily on cameras with high per-pixel SNR, for instance modern full-frame cameras with relatively low resolution like the Nikon D700.

BR
Title: Is 8-bit enough for OUTPUT quality?
Post by: Schewe on September 01, 2009, 06:49:29 pm
Quote from: madmanchan
My opinion is that the original question needs to be posed more carefully. Guillermo wants to know whether 8 bits is enough for output, not editing. But what, exactly, does the term "output" mean? To me, the term "output" really means the final space after which no additional color transformations are applied by devices (hardware or software) prior to viewing.


Output should mean at the print head _AFTER_ all manner of color transforms have occurred, or at the display (same spec). At the moment, only 3rd-party RIPs and certain drivers (the Epson Mac drivers or the Canon plug-ins) can even handle more than 8 bits/channel, because on Windows there is no wider-bit pipeline (talk to MSFT about that if it bothers you).

When it comes down to the display, the question is what the computer is feeding the system's display pipeline with, and what the video card and/or display are doing with the data.

In terms of 8 bits at the print head: yes, I do think that 256 actual levels per channel is enough for pretty much anything other than perhaps synthetic gradient blends that have no photo grain or noise built in. But the real question is what constitutes the actual output. Is it _AFTER_ any and all color space and gamma adjustments have been made? If so, prolly yes for photos...possibly not for Illustrator-type gradations and tints...
Title: Is 8-bit enough for OUTPUT quality?
Post by: madmanchan on September 02, 2009, 04:03:52 pm
Hi Mark, yes, you understood correctly. I think banding can certainly be introduced at both of those steps: i.e., the color transformation via the printer profile, as well as the color math inside the driver. I cannot really comment on the latter, knowing very little about how the internals of modern print drivers work -- though I would guess it is rather unlikely to be problematic.

It is usually quite easy to avoid the former problem. For example, if you take a raw file in LR and hit the Print button, LR will render the raw file to a 16-bit in-memory image (you can think of this as a 16-bit ProPhoto TIFF that is never written to disk as a file) and does the printer profile color math on it, using the 16-bit data in the ICC profile (assuming your printer profile is 16 bits -- many are). I have not seen a case where visible banding in the print was due to this part of the processing pipeline.

Cheers,
Eric
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 02, 2009, 07:30:19 pm
Quote from: madmanchan
It is usually quite easy to avoid the former problem. For example, if you take a raw file in LR and hit the Print button, LR will render the raw file to a 16-bit in-memory image (you can think of this as a 16-bit ProPhoto TIFF that is never written to disk as a file) and does the printer profile color math on it, using the 16-bit data in the ICC profile (assuming your printer profile is 16 bits -- many are). I have not seen a case where visible banding in the print was due to this part of the processing pipeline.

Cheers,
Eric

Thanks Eric, that is most interesting about the LR pipeline - it seems kind of like a RIP? I wonder whether Epson's own profiles, or the ones I roll with my Pulse Elite and accompanying X-Rite software, are indeed 16-bit, and whether there is a way to know for sure. Also I'm wondering whether the process you describe for LR is also valid for PS, e.g. I'm printing from CS4 because I still need to make the final tweaks under softproof. I must say, though, I very seldom have a case of banding on paper, even with sky gradients I make from time to time, which can begin to show slight evidence of banding on the display while in the 16-bit PS ProPhoto working space.
Title: Is 8-bit enough for OUTPUT quality?
Post by: madmanchan on September 03, 2009, 12:20:19 am
Hi Mark, yes, subtle sky gradients do tend to show banding on displays. This is a good example of how 8 bits isn't enough for an intermediate representation (the intermediate representation being the data sent to the video card; the image data then typically undergoes further encodings ...)

Epson's provided profiles are 16-bit. I'm pretty sure all of the stock profiles provided by the popular printer makers are 16-bit, though I haven't personally checked them all.
Title: Is 8-bit enough for OUTPUT quality?
Post by: papa v2.0 on September 03, 2009, 09:00:28 am
Hi
Interesting thread.

Can I ask what the image history is, i.e. from capture to the example posted?

8 bits should be sufficient for most applications. 3 x 8 bits gives 16,777,216 colours (about 16.8 million), and it is suggested that we see only 10 million (Judd and Wyszecki, 1975; Pointer and Attridge, 1998; Pointer, 1998). Having said that, the reproduction of a continuous-tone greyscale has its problems, partly due to the eye's unequal response to different parts of the greyscale compared with, for example, equal-luminance quantization.
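The headline colour count is easy to reproduce, but it is arguably not the number that limits a smooth gradient (a sketch):

```python
# Total 24-bit colours versus levels available along one gradient.
total_colours = 256 ** 3
print(total_colours)        # 16,777,216 -- about 16.8 million

# A neutral (grey) ramp can only pass through 256 distinct values, since
# R, G and B move in lockstep, and a near-neutral ramp like a sky is
# barely better. It is the 256 per-channel levels, not the 2^24 total,
# that decide whether banding appears along a gradient.
levels_along_grey_axis = 256
print(levels_along_grey_axis)
```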

So I wonder where the problem lies.

Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 03, 2009, 09:33:08 am
Quote from: papa v2.0
Hi
Interesting thread.

Can I ask what the image history is, i.e. from capture to the example posted?

8 bits should be sufficient for most applications. 3 x 8 bits gives 16,777,216 colours (about 16.8 million), and it is suggested that we see only 10 million (Judd and Wyszecki, 1975; Pointer and Attridge, 1998; Pointer, 1998). Having said that, the reproduction of a continuous-tone greyscale has its problems, partly due to the eye's unequal response to different parts of the greyscale compared with, for example, equal-luminance quantization.

So I wonder where the problem lies.

The issue isn't how many colours you can see. It is about the number of levels of tonality needed to protect against posterization or banding in the transformation process: levels get lost with tonal transformations, so the more you start with, the lower the risk.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 03, 2009, 10:45:00 am
Quote from: madmanchan
Hi Mark, yes, subtle sky gradients do tend to show banding on displays. This is a good example of how 8 bits isn't enough for an intermediate representation (the intermediate representation being the data sent to the video card; the image data then typically undergoes further encodings ...)

Epson's provided profiles are 16-bit. I'm pretty sure all of the stock profiles provided by the popular printer makers are 16-bit, though I haven't personally checked them all.

Thanks Eric. Sounds right. I should try to find out whether the X-Rite Pulse Elite profiles are also 16 bit - one would think so.

Title: Is 8-bit enough for OUTPUT quality?
Post by: Schewe on September 03, 2009, 11:14:10 am
Quote from: MarkDS
I wonder whether Epson's own profiles or the ones I roll with my Pulse Elite and accompanying X-Rite software are indeed 16-bit and whether there is a way to know for sure.


It's not the profile that determines the bit depth of the color transform, it's the CMM. ColorSync and Windows have CMMs that are 16-bit, but if you do color transforms in Photoshop or Lightroom using the Adobe ACE CMM, the transform is done at 20 bits/channel precision, not limited to 16 bits. So if you are using Photoshop or Lightroom to handle the color profile transforms and you are starting in 16-bit, the profile transforms are done 16>16 at 20 bits, then reduced to the bit depth of the print pipeline AFTER the transform. In the case of Leopard and certain printers, that's 16-bit. On Windows it's currently limited to 8 bits. But it doesn't hit 8 bits until AFTER the color transforms.
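The order of operations here can be sketched with a toy round trip (hypothetical 3x3 mixing matrices stand in for the profile transforms; real CMMs use LUTs, and the matrices below are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical colour-mixing matrix and its inverse: an ideal round trip,
# standing in for forward and inverse profile transforms.
a = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.02, 0.08, 0.90]])
b = np.linalg.inv(a)

rgb = rng.random((10000, 3))          # random colours in [0, 1)
ref = np.round(rgb * 255)             # what a perfect pipeline would emit

# High-precision pipeline: both transforms first, quantize to 8 bits last.
late = np.round((rgb @ a.T) @ b.T * 255)

# Low-precision pipeline: quantize to 8 bits between the two transforms.
mid = np.round(rgb @ a.T * 255) / 255
early = np.round(mid @ b.T * 255)

print(int(np.abs(late - ref).max()), int(np.abs(early - ref).max()))
```

Quantizing only after the chained transforms reproduces the reference exactly, while quantizing in the middle leaves off-by-one errors scattered through the data, which is why dropping to 8 bits AFTER the transforms is the benign place to do it.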
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 03, 2009, 11:44:23 am
Quote from: Schewe
It's not the profile that determines the bit depth of the color transform, it's the CMM. ColorSync and Windows have CMMs that are 16-bit, but if you do color transforms in Photoshop or Lightroom using the Adobe ACE CMM, the transform is done at 20 bits/channel precision, not limited to 16 bits. So if you are using Photoshop or Lightroom to handle the color profile transforms and you are starting in 16-bit, the profile transforms are done 16>16 at 20 bits, then reduced to the bit depth of the print pipeline AFTER the transform. In the case of Leopard and certain printers, that's 16-bit. On Windows it's currently limited to 8 bits. But it doesn't hit 8 bits until AFTER the color transforms.

Hi Jeff, and thanks for those clarifications. I guess from what you are saying it would be fair to conclude that, because any math likely to impact the banding issue is done before the printer driver in 16-bit, we've got that protection; the rest then depends on the printer driver, which is 8-bit on most of our Epsons anyhow, so we needn't worry about the Windows constraint unless we owned one of those 16-bit printers. Is that correct?

Title: Is 8-bit enough for OUTPUT quality?
Post by: JeffKohn on September 03, 2009, 12:25:55 pm
Quote from: MarkDS
Hi Jeff, and thanks for those clarifications. I guess from what you are saying it would be fair to conclude that, because any math likely to impact the banding issue is done before the printer driver in 16-bit, we've got that protection; the rest then depends on the printer driver, which is 8-bit on most of our Epsons anyhow, so we needn't worry about the Windows constraint unless we owned one of those 16-bit printers. Is that correct?
Yes, it's correct in theory, but that doesn't mean the transform to the printer profile will never cause banding. Even 20-bit precision won't guarantee you avoid banding if the printer profile is crummy. The quality of the printer profile is definitely a factor.
Title: Is 8-bit enough for OUTPUT quality?
Post by: Mark D Segal on September 03, 2009, 12:44:33 pm
Quote from: JeffKohn
The quality of the printer profile is definitely a factor.

Sure, I would expect so.

Mark