
Author Topic: 8bit vs. 16bit ICC Profile  (Read 14956 times)

MBehrens

  • Sr. Member
  • ****
  • Offline
  • Posts: 321
8bit vs. 16bit ICC Profile
« on: July 22, 2008, 10:43:08 pm »

I have a new Viewsonic VP2250wb LCD and I cannot find any claim about the bit depth of its LUT, unlike higher-end monitors that boast a 12-bit or higher LUT. Since this is a relatively low-cost LCD, I'm guessing it is an 8-bit device. Is there any way to determine this?

I use a Spyder 2 colorimeter and ColorEyes Display Pro to calibrate. CEDP can generate an 8-bit or 16-bit table profile, or a matrix ICC profile. How important is it to match the bit level of the profile to the device? I've tried both 8 and 16 and there are subtle differences between the two. Frankly I think the 8-bit is better, not as red. But maybe the 16-bit is more accurate despite my personal impression.

Thanks for any guidance.
Logged

MBehrens

  • Sr. Member
  • ****
  • Offline
  • Posts: 321
8bit vs. 16bit ICC Profile
« Reply #1 on: July 26, 2008, 01:42:57 am »

Hmmm... Is this a stupid question?
Logged

Czornyj

  • Sr. Member
  • ****
  • Offline
  • Posts: 1948
    • zarzadzaniebarwa.pl
8bit vs. 16bit ICC Profile
« Reply #2 on: July 26, 2008, 06:35:58 am »

Quote from: MBehrens
How important is it to match the bit level of the profile to the device?

The profile has nothing to do with the monitor's LUT; it doesn't matter whether it matches the LUT's bit depth. The profile contains the colorimetric characterisation of the device, which is used by the application's Color Management Module (CMM).

Naturally, a 16-bit table profile is more accurate than an 8-bit one, but for monitor profiling a simple matrix profile is also recommended: it may be less accurate, but it gives smoother gradations.
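
If you want to check which kind of profile your package actually wrote, you can read the profile's tag table. Here is a minimal sketch in Python (standard library only; the file name "MyDisplay.icc" is hypothetical). Per the ICC format, the file has a 128-byte header followed by a 4-byte big-endian tag count and 12-byte tag entries (signature, offset, size); matrix profiles carry rXYZ/gXYZ/bXYZ primaries, table profiles carry A2B0 lookup tables.

Code:
import struct

def icc_tag_signatures(path):
    # Read the whole profile and decode the tag table that follows the
    # 128-byte ICC header: a big-endian uint32 count, then 12-byte
    # entries of (signature, offset, size).
    with open(path, "rb") as f:
        data = f.read()
    (count,) = struct.unpack_from(">I", data, 128)
    return {data[132 + i * 12 : 136 + i * 12].decode("ascii")
            for i in range(count)}

tags = icc_tag_signatures("MyDisplay.icc")  # hypothetical file name
if "A2B0" in tags:
    print("Table (LUT) profile: contains A2B0 lookup tables")
elif {"rXYZ", "gXYZ", "bXYZ"} <= tags:
    print("Matrix profile: rXYZ/gXYZ/bXYZ primaries plus TRC curves")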
Logged
Marcin Kałuża | zarzadzaniebarwa.pl

tho_mas

  • Sr. Member
  • ****
  • Offline
  • Posts: 1799
8bit vs. 16bit ICC Profile
« Reply #3 on: July 26, 2008, 07:13:35 am »

The bit depth of the profile and the bit depth of the hardware are different things.
The display's LUT lets the display adjust the RGB channels with more or fewer steps in tonal value. With an 8-bit LUT the display can differentiate 256 (2^8) tonal values in each color channel. With a 10-bit LUT the display has 1024 (2^10) steps available for displaying the 256 tonal values of the RGB data.
A DVI connection can only send 8-bit data to the display. So why is a higher-bit LUT in the display a good thing? If you adjust the display to a certain white point (e.g. 5500K), you have to reduce one or two color channels in the display. With an 8-bit LUT, that reduction leaves those channels with, for example, only 200 tonal values. With a 10-bit LUT you can reduce a channel by as much as 50% and still have 512 steps with which to differentiate the 256 tonal values of the RGB data.
So higher bit depth in the display's LUT lets you reach a certain white point (and "gamma", if adjustable in the hardware) without losing tonal values when displaying the 8-bit RGB data sent from the graphics card.
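
To put numbers on that, here is a minimal sketch in Python (the 78% channel scaling is just an arbitrary example of a white point correction, not a measured value):

Code:
def distinct_levels(scale, lut_bits):
    # Count how many of the 256 8-bit input codes remain distinguishable
    # after one channel is scaled down inside a LUT of the given bit depth.
    steps = 2 ** lut_bits - 1
    outputs = {round(v / 255 * scale * steps) for v in range(256)}
    return len(outputs)

for bits in (8, 10, 12):
    print(f"{bits}-bit LUT, channel scaled to 78%:",
          distinct_levels(0.78, bits), "of 256 levels survive")

With the 8-bit LUT only about 200 of the 256 levels survive the correction (the loss described above); at 10 or 12 bits all 256 remain distinct.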

Profiles: there are matrix and LUT profiles. LUT profiles (only here can you choose 8 or 16 bit; matrix profiles can always address 16 bit) are in principle more accurate: they store far more color grid points in 3D tables, which is why their file size is much bigger than a matrix profile's. But they can produce banding. Up to CS2, Photoshop could not handle LUT display profiles correctly; since CS3 it handles them accurately.
I would recommend matrix profiles for smoother transitions and to avoid banding. Use LUT profiles only with CS3, and only in two cases: when the display is so poorly linearised that a LUT profile is the only way to bring it close to a "good" grey axis; or, on the contrary, when the display is very good (and perfectly linearised through hardware calibration) and is used in pre-press production where the highest color accuracy is needed. In either case I would set the LUT profile to 16 bit so that Photoshop can convert to the display profile in the best way.
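
As a rough illustration of that last point, here is a sketch in Python; it deliberately ignores the interpolation a real CMM performs between table entries, so it only demonstrates storage precision, not the full pipeline:

Code:
def levels_through_table(table_bits, ramp_steps=1024):
    # Push a smooth grey ramp through an identity tone table stored at
    # the given bit depth and count how many distinct levels survive.
    q = 2 ** table_bits - 1
    survivors = {round(round(v / (ramp_steps - 1) * q) / q * 65535)
                 for v in range(ramp_steps)}
    return len(survivors)

for bits in (8, 16):
    print(f"{bits}-bit table:", levels_through_table(bits),
          "of 1024 ramp levels remain distinct")

The 8-bit table collapses the 1024-step ramp to 256 levels, which is where banding comes from; the 16-bit table keeps all 1024 distinct.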
« Last Edit: July 26, 2008, 07:19:25 am by tho_mas »
Logged

MBehrens

  • Sr. Member
  • ****
  • Offline
  • Posts: 321
8bit vs. 16bit ICC Profile
« Reply #4 on: July 27, 2008, 02:16:11 pm »

Quote from: tho_mas
I would recommend matrix profiles for smoother transitions and to avoid banding. Use LUT profiles only with CS3, and only in two cases: when the display is so poorly linearised that a LUT profile is the only way to bring it close to a "good" grey axis...

I'm using Lightroom 1.4.1 (awaiting LR2). I expect it can take advantage of LUT profiles, which may explain a message I get in LR ("Rendering: Settings changed...") since switching to the display mentioned above (I'm not sure whether it does this with each profile update).

I'll try the matrix profile, since both you and Czornyj suggest this for smooth gradations.

Thanks! This helps me a lot.
Logged