Luminous Landscape Forum

Raw & Post Processing, Printing => Digital Image Processing => Topic started by: Serge Cashman on May 31, 2006, 12:35:51 am

Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on May 31, 2006, 12:35:51 am
I would like to adjust the white point of an LCD monitor (one without 10-bit or higher internal LUTs) to a certain target.

From what I understand, the monitor's RGB buttons change the monitor's internal LUTs.

Videocard LUT adjustments work the same way gamma adjustments do: the correction curves are loaded into the videocard LUTs at startup.

I understand there are TWO sets of 8-bit LUTs in this case, both of which can be adjusted to reach the target white point, and both suffer from the same 8-bit adjustment problem.

Is this correct or way off?
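
To make the "8-bit problem" concrete, here is a tiny illustrative sketch (the 0.9 scale factor is an arbitrary stand-in for a white point trim, not a measurement from any real monitor): any non-identity 8-bit-in/8-bit-out table collapses some of the 256 input codes.

Code:
# Sketch: a white-point trim expressed as an 8-bit in -> 8-bit out lookup table.
scale = 0.9  # arbitrary example: pull one channel down ~10% to warm the white
lut = [round(v * scale) for v in range(256)]

print("input codes: 256, distinct output codes:", len(set(lut)))
# -> 231 distinct codes: about 25 tonal levels are gone, and it doesn't matter
#    whether this table lives in the video card or in the monitor's own 8-bit LUT.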
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on May 31, 2006, 01:31:59 am
If the display only has an 8-bit LUT, leave both the white point and gamma at their native settings (assuming the monitor enables you to do so) and make the changes in the card ... or just profile it as is. My understanding is that most modern video cards have greater than 8-bit LUTs. The Radeon 9600 in my Mac is 10-bit.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: digitaldog on May 31, 2006, 09:15:30 am
Quote
If the display only has an 8-bit LUT, leave both the white point and gamma at their native settings (assuming the monitor enables you to do so) and make the changes in the card ... or just profile it as is. My understanding is that most modern video cards have greater than 8-bit LUTs. The Radeon 9600 in my Mac is 10-bit.

Problem is the OS and most applications can't use that extra data. Note the post to the ColorSync list by Bill of NEC:

Quote
While the Matrox board has 10 bit DACs (as do ATI's), this is of course only useful for older analog monitors. Modern digital DVI monitors are not able to take advantage of this.

Additionally, the data going into the 10 bit DACs is still 8 bit data from the frame buffer, but going through an 8-bit-in by 10-bit-out LUT. While having a 10 bit LUT on the video card is a great step forward, it is useless on today's digital monitors since single-link DVI is an 8 bit bottleneck.

The ATI Avivo chipset mentioned, which features 10 and 16 bit output, is currently only really useful for motion video, since the video stream data can be processed (gamma, de-interlacing, color conversion, scaling etc.) within the card at higher than 8 bit depth.

However, until the core OSs and applications are updated to support frame buffers of >8 bit depth, this is not of much use to those of us using Photoshop etc.

FYI - there is a tech paper on the LCD2180WG-LED that explains some of the issues with color bit depths as related to the increased gamut size of the LED display, and why it is not quite such an issue as one individual has suggested it to be. Certainly none of the color professionals worldwide who have been involved with the display throughout its development have raised it as a concern.

http://www.necdisplay.com/products/LCD2180WGLED_Techpaper.htm

Will Hollingworth
Manager of OEM Product Design & Development Engineering
NEC Display Solutions of America, Inc.
http://www.necdisplay.com
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on May 31, 2006, 09:48:45 am
Quote
Problem is the OS and most applications can't use that extra data.

Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: bjanes on May 31, 2006, 10:32:21 am
Quote
Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?

If you read the NEC white paper, the 8 bit DVI bottleneck is not so great as it would first seem, since the 8 by 8 bit LUT on the host computer can be set to linear. Gamma correction and white balance can then be done in a 10 bit LUT in the display device.
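
A rough sketch of that workaround (the 0.85 scale factor is purely illustrative): the same white point trim costs levels when it lives in an 8-bit videocard LUT, while a linear card LUT feeding a 10-bit LUT inside the display keeps every one of the 256 DVI codes distinct.

Code:
scale = 0.85  # illustrative white-point trim, not a measured value

card_8bit   = [round(v * scale) for v in range(256)]                  # 8 in -> 8 out
display_10b = [round((v / 255) * scale * 1023) for v in range(256)]   # 8 in -> 10 out

print("distinct codes through the 8-bit card LUT   :", len(set(card_8bit)))    # 218
print("distinct codes through the 10-bit display LUT:", len(set(display_10b))) # 256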

It sounds like a very nice display, but at around US $6,000 it will be used only in high-end applications.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 61Dynamic on May 31, 2006, 01:47:07 pm
Quote
Thanks. I was going on how it used to work. The fact remains that there's no point in twiddling knobs (other than brightness) on displays with 8-bit DACs ... wouldn't you agree?
Bingo. Don't screw with the video card, don't screw with the monitor OSD. The best calibrations come from only adjusting the brightness (the only analog adjustment available on an LCD).

If you need to change the color temp of the display simply set the calibration software to the desired temp and let it build the appropriate profile.

This will net you the best results you can get that way. If your needs demand more, then you should be buying a display that operates at the desired temp to begin with.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 32BT on May 31, 2006, 04:04:04 pm
Quote
Problem is the OS and most applications can't use that extra data. Note the post to the ColorSync list by Bill of NEC:

While having a 10 bit LUT on the video card is a great step forward, it is useless on today's digital monitors since single-link DVI is an 8 bit bottleneck.

Which is why they invented temporal dithering!

ATI Radeon X overview (http://www.ati.com/products/RadeonX1900/specs.html)

Given that the Mac supports 16bit video luts at the OS level, and the GMB match software seems to adjust the lut for some of its measurements, you may even be taking advantage of the extra info as it currently is...

Note that it is better to have a temporally dithered 6-bit panel than an unstable 8-bit panel.
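
A rough sketch of what temporal dithering buys you (numbers invented): a target level that falls between two 8-bit codes is shown as a frame-to-frame mix of the two neighbouring codes, so the time-average the eye integrates lands in between.

Code:
target = 127.25                  # desired level, finer than any single 8-bit code
lo, frac = int(target), target - int(target)

frames, acc = [], 0.0
for _ in range(60):              # one second's worth of frames at 60 Hz
    acc += frac
    if acc >= 1.0:               # emit the higher code just often enough
        frames.append(lo + 1)
        acc -= 1.0
    else:
        frames.append(lo)

print(sorted(set(frames)), sum(frames) / len(frames))   # [127, 128] 127.25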
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on May 31, 2006, 10:44:19 pm
Quote
If you need to change the color temp of the display simply set the calibration software to the desired temp and let it build the appropriate profile.

...Achieving the desired temperature using WHAT exactly? That's kind of the main point of my question. I suppose it would be via the videocard's LUTs, but I don't know.

I do know that "the best thing" is not to mess with the 8-bit LUTs at all, and I understand why that is.

The question is what is involved in changing the color temperature on an 8-bit LCD if one has to do it, and what the best way to go about it is.

The reasons for changing the color temperature on an LCD could be adjusting it to your organization's standard target, matching another LCD's white point, or using software that does not include a Native target option, to name a few.

So far my understanding is that both the monitor and the videocard LUTs can be used for this purpose.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 61Dynamic on June 01, 2006, 12:02:23 am
Set the color temperature you want in the calibration software. The software will build a profile that compensates for the monitor's whitepoint and gives you the desired WP. In other words, it is taken care of in the ICC profile for the display.

This will degrade the image quality as well but not as badly as adjusting the video card LUTs or fiddling with the monitor controls.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 01, 2006, 12:37:56 am
If you work on a Mac, setting a white point target other than Native while "profiling" may seem like a natural thing to do. However, "profiling" does not adjust monitor colors by itself. By assigning a profile to a monitor on a Mac you automatically load the corresponding videocard LUT corrections at startup (via the VCGT tag). Correct me if I'm wrong. PC users need a standalone LUT loader, so it's more obvious to them...

I'm obviously not referring to going into the videocard settings and pulling sliders there, god forbid...
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 01, 2006, 12:44:42 am
Quote
...This will degrade the image quality as well but not as badly as adjusting the video card LUTs or fiddling with the monitor controls.

And this is the kind of statement I'm looking for, only I would like to know more about the rationale for it.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 32BT on June 01, 2006, 03:47:27 am
ICC profiles store RGB gamma values under the corresponding gamma curve tags. This can either be a single value indicating the gamma (1.8, 2.2, etc), or it can be a complete curve.

So this effectively gives you two places to store gamma correction: in these gamma curve tags, as well as in the video LUT tag.

Gamma curve tags are processed by the host application (Photoshop) when preparing the image for display. It can do this at arbitrary precision and apply spatial dithering to work around bit limits. (In Photoshop, if you keep an image in 16-bit it actually displays differently than the same image converted to 8-bit with dithering.)

Video LUTs are set in the video card (possibly involving a special utility to actually load the video LUT data from the profile into the card). The video card can render at the table's precision and apply temporal dithering for intermediate values. (Video cards used to simply map 8-bit to 8-bit for video LUT corrections.)

The question now becomes: what is the optimal method for storing the gamma information in a profile?

In general (the native gamma method):
1. measure the R, G, and B response,
2. calculate the closest gamma values for R, G, and B,
3. store these gamma values in the normal gamma curve tags,
4. store the deviations in the video LUT.

This method ensures that the least amount of correction is applied in the video LUT, which is where you would lose the most information because of the 8-to-8-bit mapping.
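
A compressed sketch of that bookkeeping (the measured response below is invented, roughly gamma 2.28 with a small wobble; a real run would use patches read by the instrument):

Code:
import numpy as np

signal   = np.linspace(0.01, 1.0, 32)                          # test patch levels
measured = signal ** 2.28 * (1 + 0.02 * np.sin(6 * signal))    # fake red-channel response

# steps 1-2: the closest single gamma (slope of a straight-line fit in log-log space)
gamma_fit = np.polyfit(np.log(signal), np.log(measured), 1)[0]

# step 3: gamma_fit is what would go into the profile's red curve tag
# step 4: the video LUT only carries the leftover deviation, M^-1(v ** gamma_fit)
vcgt = np.interp(signal ** gamma_fit, measured, signal)

print("fitted gamma:", round(gamma_fit, 2))
print("largest video-LUT correction:", round(float(np.max(np.abs(vcgt - signal))), 4))
# The correction stays close to zero, which is the point: almost nothing is
# thrown away in the lossy 8-to-8-bit stage.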
 
A different method (gamma 2.2):
1. store 2.2 in the curve tags,
2. adjust the video LUT to convert the real response to its gamma 2.2 equivalent.

This can obviously involve severe deviations in the video LUT, but for untagged, direct display of RGB images that are supposed to be sRGB you will at least see normal contrast behavior, and if your display remotely conforms to sRGB it should show relatively correct color. It also means that system elements are drawn properly on systems that do not apply color management continuously.
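
The same sketch for this second recipe (again with an invented native response, roughly gamma 1.8): the curve tag simply says 2.2 and the video LUT has to bend the whole response to match, which is where distinct 8-bit codes get lost.

Code:
import numpy as np

signal   = np.linspace(0.0, 1.0, 256)
measured = signal ** 1.8                           # pretend the native response is ~1.8

vcgt = np.interp(signal ** 2.2, measured, signal)  # correction = M^-1(v ** 2.2)

lost = 256 - len(np.unique(np.round(vcgt * 255)))
print("8-bit codes lost to the native-to-2.2 remap:", lost)
# The bigger the gap between native and target response, the more codes collapse.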

But what about white point?

Well, it is better to do it in the video LUT for two reasons:
1. if it were in the gamma curve tag, you would only see the correction if the system used the display profile for ALL of its drawing;
2. for subtle, intermediate values you need some kind of dithering. Since the system can only apply spatial dithering (and only if it knows there is spatial room), while a video card can ALWAYS apply per-pixel temporal dithering, the latter will produce better and more consistent results.

Conclusion:
The best approach is a method that stores a single gamma value in the curve tags and has the video LUT correct for the differences. Since video cards now support better-than-8-bit LUTs combined with temporal dithering, you can safely choose gamma 2.2 as the target, along with whatever whitepoint you need. Whether your software actually knows how to find a decent curve is another issue, but I would strongly suggest looking at some of the third-party offerings such as ColorEyes.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: digitaldog on June 01, 2006, 08:32:08 am
If you look at the curves provided in, say, i1 Match after calibration, you'll see the adjustments made based on the target calibration values you asked for. In a perfect world they would be totally straight (linear), but that's never going to happen. Anyway, in theory you can play with different settings (Native/Native) and compare the curves to adjustments made elsewhere in the process (LUTs).
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Tim Lookingbill on June 01, 2006, 06:40:31 pm
Just started reading through here. Interesting discussion. Hi, Andrew and Serge.

opgr,

Your last post indicates a very deep understanding of video systems I wasn't aware of. That mention of spatial and temporal dithering put me at another level of understanding. Never heard of it.

However, Googling those terms brought a truck load of hits on the subject. I guess this IS rocket science going by the NASA pdf that comes up. This page I found is interesting as well:

http://www.extremetech.com/print_article2/0,1217,a=141162,00.asp

At least I've come to the right place where this level of discussion exists.

Are there any visual diagrams that illustrate how video systems operate? I've been looking all over for pictures that show the what, where and how of vLUTs, the differences between color-table and matrix profiles, a simple gamma curve versus separate RGB-adjusted curves for gamma correction, and how all of this affects the final visual.

Look forward to reading many more interesting discussions.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 01, 2006, 08:29:03 pm
Hi Tim. Nice to see you here.

Oscar, thanks a lot for your reply. From what I understand from your post, one tag describes the actual output curves (for use by color-managed applications) and the other corrects the video output to match those described curves. I don't see where I get to control them though (aside from setting targets); I think it's done automatically...

My goal is to correct the non-color-managed (as well as color-managed, obviously) output to the target white point, and I would like to know exactly what's involved. Daniel says this is better done without touching the LCD RGB buttons, for instance.

And I still don't know what the LCD RGB buttons actually affect. (I also have difficulty understanding the differences between LCD DVI and VGA controls, but I will probably post another topic on this.)

I use Spyder2 Pro and it displays all curves - before, after, corrections, target - I find it a very helpful learning tool. I currently don't have a configuration that works with Coloreyes but I'll probably get a Display2 to better understand the subject.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 32BT on June 04, 2006, 06:24:36 am
Quote
From what I understand from your post, one tag describes the actual output curves (for use by color-managed applications) and the other corrects the video output to match those described curves. I don't see where I get to control them though (aside from setting targets); I think it's done automatically...

My goal is to correct the non-color-managed (as well as color-managed, obviously) output to the target white point.

VGA is an analog signal, and because of this an LCD must convert it to a digital signal before it can display the information. Therefore, you can theoretically adjust the analog signal before it is sampled into the digital signal. If the RGB buttons actually let you adjust the pre-amps, then they work akin to setting the correct white balance prior to shooting JPEG.

A good indication that the RGB buttons adjust the analog signal is when they are not available in DVI mode.

DVI is a digital signal, and the LCD could theoretically dump it straight to the panel. But a TFT panel doesn't have a normal gamma response, so the LCD has an internal LUT to compensate for the difference. That way you can feed it a relatively normal RGB signal and it will respond predictably.

Obviously, this then results in two places where the video signal is corrected: first in the video card (loaded from the special profile tag), and second in the LCD itself to make it "behave" normally.

This also means you can "calibrate" the device behavior in two places, provided of course that you can load the correction LUTs into the relevant locations.

So, if you have an LCD with a 14-bit internal LUT, a DDC connection, and compatible software, you can calibrate the internal LUT so it behaves perfectly, and then you can simply leave the video card LUT as a straight curve and not bother with the additional video card profile tags etc...

Displays and software that actually allow you to do this won't leave you in the dark about the RGB buttons.

If you have an LCD with a 10-bit or 12-bit internal LUT and you have the RGB controls available in DVI mode, you could certainly use them. GMB Match allows you to adjust the controls with measurement feedback. However, it usually doesn't do well with anything other than "native" white, because there is usually such a violent difference between pure white and the shades of gray on LCD panels. This is where software like ColorEyes shines.

If you have an LCD with an 8-to-8-bit LUT, then it becomes a complete mystery what to do with the controls. It may be that the LCD is using temporal dithering internally (because it is driving a 6-bit panel, for example) and may therefore actually do something useful with the RGB controls. That's a lot of "mays" though, and in this case it is best to select the option with the least amount of tampering with the internal LUT (probably R=G=B=100 in user mode).

Given your goal to also have uncalibrated RGB displayed with a decent WB, you would want to calibrate to a single gamma curve for all three primaries. So, even if you choose "native" gamma, it should still be one and the same gamma value for all three primaries. I don't know whether your software allows you to control these settings, but software is the only place where you could influence this.
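
A tiny sketch of that last point, with invented per-channel gammas: pick one shared gamma and pre-distort each channel so all three land on the same curve, which keeps grays gray even for output that bypasses color management.

Code:
measured = {"R": 2.12, "G": 2.31, "B": 2.05}      # pretend per-channel measurements
shared   = sum(measured.values()) / 3             # one common target gamma

def correction(v, native_gamma, target_gamma=shared):
    """Video-LUT entry: pre-distort v so the channel's v**native becomes v**target."""
    return v ** (target_gamma / native_gamma)

for ch, g in measured.items():
    v = 0.5
    print(ch, round(correction(v, g) ** g, 4))    # all three land on 0.5**shared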
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 61Dynamic on June 04, 2006, 11:19:22 am
Quote
VGA is an analog signal and because of this, an LCD must convert to digital signal before it can display the information. Therefore, you can theoretically adjust the analog signal prior to sampling for the digital signal. If the RGB buttons actually allow you to adjust the pre-amps, then they work akin to setting the correct white-balance prior to shooting JPG.
You really can't equate adjusting a analog signal to setting WB prior to shooting a jpeg. When shooting jpeg, you start off with quite allot of raw information before the jpeg is made. With the analog signal, you start off with a 24-bit jpeg, it gets converted to analog and then you end up with another 24-bit jpeg after it's converted back in the display.

With the analog signal, you actually have less information than a pure digital signal since there is always a loss during conversion. So even if the RGB controls did adjust the analog signal, it would be even less desirable to do so than adjusting a full digital signal in a DVI connection.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 32BT on June 04, 2006, 12:22:51 pm
Quote
You really can't equate adjusting an analog signal to setting WB prior to shooting a JPEG.

The analogy was meant like this:

Does it help to adjust the RGB controls prior to calibration?

If you're using a VGA signal and the RGB controls adjust the analog signal, then it most certainly does help. It is then equivalent to setting the WB prior to shooting JPG as opposed to shooting at a fixed WB and adjusting it afterwards.

And I do agree with your point: DVI is to be preferred over any analogue connection. Be careful, however, about "loss of information": the conversion may actually improve the perceptual rendering of the data. For example, the sampling of the relatively unstable analogue signal automatically dithers each pixel and produces noise over the entire image and some misalignment, all of which are sometimes perceived as a more desirable rendition.

I would like to add: I am NOT an expert. These are merely my findings from when I was looking for a replacement for my old Hitachi LCD about two years ago. I still haven't replaced it, since the developments under way back then were interesting enough to be worth waiting a little while longer for.

I do believe it is a good time to make a purchase now. There are good offerings, and current developments are not particularly interesting for the digital darkroom, or won't materialize efficiently in the write-off time of a current purchase.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 05, 2006, 02:09:21 pm
That was very informative.

However, I gather that on the subject of 8-bit LUT monitors even Oscar is unsure.

My understanding so far is that even though there are two places to make adjustments, there are no known benefits to using the monitor buttons as opposed to the videocard LUTs for white point adjustment on 8-bit monitors. And there's a solid consensus that the best thing would be not to adjust the white point at all (I knew that).

In practice this means that if the software asks whether your (8-bit) LCD monitor has RGB buttons, it's best to say that it doesn't, so that it does everything through the videocard LUTs, both the white point and the gamma.

The information on higher-bit LUTs was extremely helpful. From what I understood, adjusting the native white point on a monitor with 10-bit or higher internal LUTs using the buttons (in case you can't use DDC) does not lead to a loss of color values. So in that case you'd want to avoid using the videocard LUTs for adjustments as much as possible.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: digitaldog on June 05, 2006, 02:39:07 pm
Just had to ping Dr Karl on all this. His response:
---

As to the issue in this post.

What is optimal, front panel controls or video card LUT, for adjusting white?

The simple answer is you can't really know. Manufacturers don't give you enough information in most cases.

The internal data paths of most LCDs are not published. Modern mid to high end LCDs do not have simple 1D LUTs. Most chip sets today have 3D lookups which are altered by all of the front panel adjustments, and a factory-set final grey balance 1D curve after that. Display features like NEC's "ColorComp" also use up data to make spatial corrections (this is never mentioned in the data sheets).

Using an analog input signal never fixes anything! That input signal is not a smooth curve; it is a stair-step signal created by the video card DAC. The only thing this will do is introduce noise and additional aliasing into the system as it is redigitised. Even if a front panel control did adjust the analog input side (I have never seen this), it would only create worse aliasing due to the misalignment of the stair-step signal and the ADC thresholds.

These things said, here is what you can do.

It is always preferable to keep the white point as close to native as possible. If you only have one display and native is 6800K, use it, don't correct it; your eyes will adapt. If you need to have two displays side by side then you need to correct the white point.

If you are using a modern high-end LCD, use the front panel to adjust the white point and load a neutral LUT in the video card. Look at a grayscale: is it neutral all the way to black, or did it look much better when set to native? This will give you some idea whether your internal path has the two-stage 3D-1D system I described above. If it does, use the front panel to correct the white; if not, use the cal software and the LUT.

Do not adjust "contrast", "brightness" or "gamma" on the front panel. Do adjust "backlight"; on some models the backlight is incorrectly labeled "brightness."

When you calibrate, do not change the TRC of the system ("use native gamma"); changing the TRC will only cause more aliasing.

Do not use a Matrix/TRC based profile. Use a full lookup. This will provide the best soft proof. LCDs are not CRTs; they do not use phosphors that mix in a linear fashion. There is a lot of crosstalk between channels, especially in the darker tones. This can only be properly modeled with a full lookup ICC profile.

Hope this helps,

Karl Lang
----

Yup, it helped me.
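
One way to put rough numbers on Karl's "is the grayscale neutral all the way to black?" check, assuming your puck can report XYZ for a ramp of gray patches (the readings below are invented and D65 is assumed as the reference white): a* and b* near zero means the gray axis is neutral, while a drift toward black means the LUT combination is bending it.

Code:
def xyz_to_lab(X, Y, Z, white=(0.9505, 1.0, 1.089)):     # assumed D65 reference white
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)  # L*, a*, b*

# invented readings for a bright, a mid and a dark gray patch
for X, Y, Z in [(0.92, 0.97, 1.05), (0.19, 0.20, 0.23), (0.011, 0.012, 0.015)]:
    L, a, b = xyz_to_lab(X, Y, Z)
    print(f"L*={L:5.1f}  a*={a:+5.1f}  b*={b:+5.1f}")      # drifting a*/b* = colour cast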
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 05, 2006, 03:38:59 pm
Thank you so much for taking the time to ask Karl Lang. That was a very authoritative response.

Although I don't understand the 3D-1D part, I can run the greyscale gradient test and see what the results are. What's the best way to look at the greyscale gradient? A color-managed application with the monitor profile set as the working space?

Unfortunately, with my software (and most other solutions on the market) not applying TRC (tone response curve) gamma correction is impossible, since a Native Gamma target is not available. Do I understand this point correctly?
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: digitaldog on June 05, 2006, 06:10:39 pm
According to Karl, you'd make R/G/B/C/M/Y and black gradients in, say, Photoshop. What you want to do is view these with the numbers going directly to the display, to check smoothness as you alter the various on-board display settings. However, to do this you need to either view them in a non-ICC-aware application (like most web browsers) OR create a profile that is linear. On the Mac, you could do this using Apple's Calibrator WITHOUT altering any settings on the display: don't move any of the sliders in the software and set it all to native. In theory that creates a profile that sends the data to the screen without any adjustments; you could then Assign it to the ramp you made while viewing it all in Photoshop. Oh, and when you make the ramps, make sure all dither settings in Photoshop are OFF (in both Color Settings and the gradient options).

Seems a bit easier to view this in a non ICC aware application like a browser.
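
If you'd rather script the ramps than build them in Photoshop, a short sketch along the same lines (using Pillow; the file name and band height are arbitrary) writes hard 0-255 steps with no dithering, one band each for R, G, B, C, M, Y plus a gray ramp, which you can then open in a non-ICC-aware viewer.

Code:
import numpy as np
from PIL import Image

ramp = np.arange(256, dtype=np.uint8)          # 0..255, one exact step per column
bands = []
for mask in [(1, 0, 0), (0, 1, 0), (0, 0, 1),  # R, G, B
             (0, 1, 1), (1, 0, 1), (1, 1, 0),  # C, M, Y
             (1, 1, 1)]:                       # gray (the "black gradient")
    band = np.zeros((64, 256, 3), dtype=np.uint8)
    for ch, on in enumerate(mask):
        if on:
            band[:, :, ch] = ramp              # integer steps only, nothing dithered
    bands.append(band)

Image.fromarray(np.vstack(bands)).save("ramps.png")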
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 05, 2006, 07:12:46 pm
Quote
On the Mac, you could do this using Apple's Calibrator WITHOUT altering any settings on the display: don't move any of the sliders in the software and set it all to native. ...

On a PC that could be done using Adobe Gamma I believe. A "no adjustments" profile... I'll look into that.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: 61Dynamic on June 08, 2006, 10:53:54 am
Quote
Just had to ping Dr Karl on all this. His response:
---

[...]

The simple answer is you can't really know. Manufacturers don't give you enough information in most cases.

[...]
That's a big problem with displays these days. The LCD market is still very young and little info is given making it difficult to find decent displays for our use. And, as unnecessarily confusing as people let things get in calibrating LCDs today, just wait until the next technology SED (Surface-conduction Electron-emitter Display) spills into the monitor market in the next year or two. Then after that, OLED (Organic Light-Emitting Diode).

The future promises even better color reproduction and added confusion!
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 09, 2006, 08:26:41 pm
Well, I'm still trying to run the test correctly (without any LUT adjustments). I just got a Display2, hopefully it'll help.

Meanwhile I came across a very convincing post by Sergey Oboguev (at Pro Photo - I don't feel like paying to join) arguing against using RGB button adjustments on 8-bit LUT monitors. One of the points is that by adjusting in two places (buttons for WP, LUTs for gamma) you may drop more than one adjacent RGB value...

http://www.prophotocommunity.com/ubbthreads/showflat.php/Cat/0/Number/422964/an/0/page/0#Post422964
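
A quick check of that point as I read it (the trim and gamma numbers are illustrative only): one 8-bit stage loses some codes, but chaining a button-style white point trim and a separate 8-bit gamma correction loses more than either stage alone.

Code:
def lut(fn):
    return [min(255, max(0, round(fn(v)))) for v in range(256)]

wp_stage    = lut(lambda v: v * 0.88)                        # white-point trim in 8 bits
gamma_stage = lut(lambda v: 255 * (v / 255) ** (2.2 / 1.9))  # gamma bend in 8 bits

cascade = [gamma_stage[wp_stage[v]] for v in range(256)]     # both, one after the other

for name, table in (("white point only", wp_stage),
                    ("gamma only      ", gamma_stage),
                    ("both stages     ", cascade)):
    print(name, len(set(table)), "distinct codes of 256")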
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 09, 2006, 09:04:46 pm
Quote
Meanwhile I came across a very convincing post by Sergey Oboguev (at Pro Photo - I don't feel like paying to join) arguing against using RGB button adjustments on 8-bit LUT monitors. One of the points is that by adjusting in two places (buttons for WP, LUTs for gamma) you may drop more than one adjacent RGB value...

http://www.prophotocommunity.com/ubbthreads/showflat.php/Cat/0/Number/422964/an/0/page/0#Post422964

As I understand it, the average delta-E from the verification phase tells you how well the generated profile models the actual displayed values ... but nothing about gamut, accuracy, smoothness or otherwise. The eyeball test above I think would be more useful.
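
For reference, in its simplest CIE76 form the per-patch delta-E is just the straight-line distance in Lab between what the profile predicts and what the instrument measures (the two triplets below are made up):

Code:
import math

def delta_e76(lab1, lab2):
    return math.dist(lab1, lab2)        # sqrt of summed squared L*, a*, b* differences

predicted = (52.0, -3.1, 12.4)          # invented patch values
measured  = (51.2, -2.5, 13.0)
print(round(delta_e76(predicted, measured), 2))   # 1.17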
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 09, 2006, 10:17:25 pm
Oh, that Delta E is about higher bit monitors... In the first couple of paragraphs he was talking about 8 bit ("dumb" in his words) monitors.

Since I don't know how his software works (I think it's Monaco Pro or whatever highend Xrite is called) I can't tell what color difference he's measuring over there. I think it measures a bunch of patches and gives a list of Delta Es for each one (compared to the target)...
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 09, 2006, 11:12:23 pm
Quote
Oh, that Delta E is about higher bit monitors... In the first couple of paragraphs he was talking about 8 bit ("dumb" in his words) monitors.

I think the moral of all this is that if you're stuck with an 8-bit monitor, just use it as is. Most today seem to be modeled around 6500/2.2 which is a good match for general use, web browsers etc. I've got an Apple Cinema Display 20" with no knobs at all and it does the job ... either profiled natively or tuned to exactly 6500/2.2 in the card. But for much the same price these days you can buy a monitor with smarter electronics and (maybe) better gamut. I've got an NEC 2090 on order as a second monitor (replacing an old Sony CRT) and am keen to see how it compares. Having two smaller monitors means you can upgrade each independently as technology/price-performance improves. If your current monitor isn't meeting your requirements, you may want to consider a similar arrangement. You don't need high-end for palettes etc.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 09, 2006, 11:31:36 pm
I use dual monitors that have different native white points, so I do need to adjust one of them. They are not high-end monitors, so I do see artifacts like banding and non-uniform color temperature on the greyscale ramp.

But the question is also important because most software solutions bundled with colorimeters guide you to go ahead and adjust the LCD buttons if you have them. From what I've learned so far, that only makes sense if you have a monitor with 10-bit or higher internal LUTs that for some reason does not work with your software via DDC.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 15, 2006, 01:53:50 am
Quote
I use dual monitors that have different native white points, so I do need to adjust one of them. They are not high-end monitors, so I do see artifacts like banding and non-uniform color temperature on the greyscale ramp.

But the question is also important because most software solutions bundled with colorimeters guide you to go ahead and adjust the LCD buttons if you have them. From what I've learned so far, that only makes sense if you have a monitor with 10-bit or higher internal LUTs that for some reason does not work with your software via DDC.

My NEC LCD2090UXi arrived today. I initially profiled it native (40% brightness) and didn't really see much difference to my Apple Cinema Display 20" (also native). I got pretty well identical results with Match 3.6 and basICColor display 4. Gray ramps for both displays were very clean and with no banding. The NEC had more contrast but if anything the gamut was slightly smaller than my older Apple. I was a bit disappointed at this stage and wondered whether I should have bought a cheaper Samsung etc.

Anyway, then I bumped the brightness to 65%, set temperature/gamma to 5000/2.4 and profiled it with display 4 in software LUT mode for D50/L*. The video LUT curves came out nearly flat with just a bump in the shadows. I went through image after image from a set of exhibition prints I did recently for a client and they all looked very close. The gray ramp is clean and with no banding. In fact I've only noticed a single case of banding (in a graduated sky area) where the working space gamma didn't match the display. I converted the image to Lab and it went away. I'm doing more of my work in Lab nowadays anyway. So this is where the extra bits in the monitor come in handy. Maybe if you're printing on resin-coated with lots of OBAs you can get away with 6500 ... but 5000 is a much better match to the rag paper I use.

I was still hopeful that the gamut for the new monitor would be slightly better. I haven't seen a lack on screen but compared it with the Apple using the ColorSync utility. I guess the story is if you want better gamut you'll have to pay for it. I was very impressed with the Eizo monitors I saw recently but the closest model to the 2090 would be the L997 at twice the price (at least here). Note that I haven't compared the 2090 side-by-side with the 2180/2190 (same panel) so maybe the larger monitor is better in this department. But again, more dollars. The 2090 will do me for a while.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 15, 2006, 05:20:10 am
More on the 2090 gamut.

[attachment=699:attachment]
[attachment=701:attachment]

Above are plots (as profiled) comparing it to sRGB (white mesh). It's better in cyan, but not quite as good as my Apple in the reds/yellows (but close to sRGB). Note that different settings for the display result in a different gamut.

All in all images look great on the 2090 and have real depth. It's a definite step up from my old Sony CRT.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 15, 2006, 01:44:59 pm
I'm not familiar with Basiccolor. What's software LUT mode? Do Match 3 and Basiccolor have DDC control over the monitor?

If "LUT mode" means adjusting videocard LUTs, are the resulting curves (which I assume are correction curves like in Match 3) close to 1 (run straight down the center)?
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 15, 2006, 07:11:51 pm
Quote
I'm not familiar with Basiccolor. What's software LUT mode? Do Match 3 and Basiccolor have DDC control over the monitor?

If "LUT mode" means adjusting videocard LUTs, are the resulting curves (which I assume are correction curves like in Match 3) close to 1 (run straight down the center)?

More on basICColor display 4 here:

http://www.basiccolor.de/english/Datenblaetter_E/display_E/display_E.htm

Match 3.6 has closed the gap considerably but I think display 4 still has an edge.

In "Software LUT mode" it creates a LUT for the video card. You can of course just profile it as is (namely with a linear LUT).

The monitor supports hardware calibration but there's currently no software to support it ... AFAIK. An updated version of NEC's SpectraView is on the way. I don't think there's much (if anything) that hardware mode supports that you can't do with the buttons on the monitor, but you do miss out on closed-loop calibration. From what I can currently see, I don't think I'll bother with SpectraView.

Here's the video LUT generated for D50/L*:

[attachment=703:attachment]

The green and blue curves are similar. Most of this is because a gamma of 2.4 isn't exactly L*. If all I wanted was a gamma of 2.2 (or whatever) I'd just dial it (and the colour temperature) in on the monitor and profile it as is. The monitor supports a gamma from 0.5 to 4.0 (in 0.1 increments) and a colour temperature of 3000-9600K internally, though obviously you don't want to move too far away from the backlight temperature. This flexibility is what you're paying for.
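
A quick numerical look at "a gamma of 2.4 isn't exactly L*": the two transfer curves nearly coincide through the midtones but part company in the shadows, which fits a near-flat video LUT with a bump at the dark end.

Code:
def lstar_trc(v):                 # relative luminance for an L*-response display
    L = 100 * v
    return ((L + 16) / 116) ** 3 if L > 8 else L / 903.3

for v in (0.05, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"signal {v:4.2f}   L*: {lstar_trc(v):.4f}   gamma 2.4: {v ** 2.4:.4f}")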
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 15, 2006, 08:48:10 pm
Stephen, thanks so much for such a detailed reply.

So the curves do in fact display the LUT adjustments from the vcgt tag.

Even if SpectraView does not work for your monitor, it should come with some kind of utility allowing you to adjust RGB sliders and whatnot from your computer. Also, theoretically, even if you do adjust the RGB buttons on that monitor the resulting quality should be better than with videocard LUT adjustments.

So I suppose that with the L* gamma target and the white point adjusted via either the monitor software or the monitor buttons (as opposed to the calibration software adjusting the LUTs), the curves are supposed to be almost perfectly linear. I suppose the ultimate goal is to achieve a target significantly different from native without videocard LUT adjustments...

I don't say that to nitpick - it appears to follow from what monitors with more than 8-bit LUTs were designed for.

I still don't know how to thoroughly test a display for the negative effects of either videocard or internal LUT adjustments, unfortunately - I'd really like to learn more about it.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Stephen Best on June 15, 2006, 09:14:54 pm
Quote
Even if SpectraView does not work for your monitor, it should come with some kind of utility allowing you to adjust RGB sliders and whatnot from your computer. Also, theoretically, even if you do adjust the RGB buttons on that monitor the resulting quality should be better than with videocard LUT adjustments.

My 2090 came with a copy of NaviSet but this is for Windows (blehhh) only. There's nothing to stop you adjusting the colour temperature with the buttons at the same time as you're measuring what you're getting with a puck. There's also plenty of other monitor settings I haven't looked at yet. The aim should simply be to get close to what you want for whitepoint/gamma and not see any banding from the combination of video card and monitor LUT ... without agonizing over the last ounce of linearity. At some point you have to move on from the curve shapes, gamut plots etc. and just trust what your eyes are telling you.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 15, 2006, 09:23:47 pm
Sure. I'm just being geeky about it.
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: jani on June 18, 2006, 07:45:30 pm
Quote
That's a big problem with displays these days. The LCD market is still very young and little info is given making it difficult to find decent displays for our use. And, as unnecessarily confusing as people let things get in calibrating LCDs today, just wait until the next technology SED (Surface-conduction Electron-emitter Display) spills into the monitor market in the next year or two. Then after that, OLED (Organic Light-Emitting Diode).

The future promises even better color reproduction and added confusion!
Or perhaps we're simply looking in the wrong places?

A friend of mine who works with dentists pointed out that they were purchasing some nice 3 Mpx displays from Eizo, with 10-bit grayscale from a 13-bit palette. Precision is paramount in medical imaging, so I decided to check out Eizo's pages on medical imaging products, and lo and behold:

EIZO RadiForce R31 (http://radiforce.com/en/products/col-r31.html)
3 Mpx
10 bpc
Price: ouch

IIRC, the grayscale GS310 sells for around USD 3,000 plus taxes.  I suppose the colour version is a bit more expensive.

I'd presume that these displays would be useful for photographers, too, but I have no practical way of verifying that.  
Title: (8 bit internal LUTs) LCD whitepoint adjustment
Post by: Serge Cashman on June 18, 2006, 09:25:15 pm
<edit> Deleted by author.