
Author Topic: Monitor calibration sensor evaluations  (Read 145172 times)

shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #140 on: October 26, 2011, 08:02:17 pm »

Joe - If I am correct in that all your sensors other than the i1Display Pro give readings in the vicinity of 110 cd/m2 while the i1D3 reads 97, then something appears amiss. That is a larger offset from reality than we measured. A few questions: (1) Do all the other instruments read near 110? (2)

Well... I don't think I can say yes to this. I was going off of my historical familiarity with each instrument but it's been a while since I've gotten some objective numbers. I've posted them below but here's the quick synopsis (obviously given the ccss file I used Argyll to get these numbers):

Eye One Pro: 113.24 cd/m^2
i1 Display Pro (with ccss calibration): 105 cd/m^2
i1 Display Pro (without ccss calibration): 104.62 cd/m^2
i1 Display 2 (NEC, no ccss): 114.18 cd/m^2
Spyder 3 (no ccss): 98.02 cd/m^2
DTP-94: 104.49 cd/m^2 (the monitor in use is a wide-gamut display, which this particular DTP-94 model can't handle, so I don't know how accurate that reading is... but would it be safe to assume that for measuring white luminance, even if it's reporting things incorrectly in terms of color, it's still measuring the same amount of energy?)

[SPOCK] Fascinating [/SPOCK] So it looks like we have clusters... the i1 Display Pro and DTP-94 agree with one another, the Eye One Pro and i1 Display 2 (MDSVSENSOR II) agree with one another, and the Spyder 3 appears to be lost in space.
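For what it's worth, the clustering is easy to quantify. A quick sketch (Python, using the luminance numbers listed above; the short labels are mine) of each sensor's deviation from the group median:

```python
# Luminance readings from the list above (cd/m^2). Deviation from the
# median makes the two clusters and the Spyder 3 outlier obvious.
readings = {
    "i1 Pro": 113.24,
    "i1 Display Pro (ccss)": 105.00,
    "i1 Display 2": 114.18,
    "Spyder 3": 98.02,
    "DTP-94": 104.49,
}
median = sorted(readings.values())[len(readings) // 2]
deviation = {name: round(100 * (v / median - 1), 1) for name, v in readings.items()}
```

The i1 Display Pro and DTP-94 land within half a percent of the median, the i1 Pro and i1 Display 2 sit roughly 8-9% high, and the Spyder 3 about 6.6% low.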

Quote
Did you choose the correct panel type for the i1Display Pro measurements? Depending upon which calibration is loaded, there can be differences in readings, although not to the extent you see.

When using i1 Profiler, yes. Wide Gamut CCFL. The GF is gonna poke me with a cattle prod if I don't go to the grocery store now so I'll double check the ∆ between the Eye One Pro and i1 Display Pro when I get back.

Quote
The i1D3 units we tested were from a variety of sources. Some arrived directly from X-Rite and did indeed have closely grouped serial numbers. Several units were ones we purchased, while the rest came courtesy of various vendors. In the cumulative probability plot linked here, there was no correlation between sensor accuracy and i1Display Pro serial number. In short, the pucks were consistent - certainly to within the 10 dE level you are seeing!

Excellent. Looks like you have a pretty good sampling then. Here's the complete data (note: this particular screen is out of calibration at the moment; I just measured the same white patch with each instrument to see where they all fell in terms of luminance relative to one another).

i1Display Pro with ccss file
Current calibration response:
Black level = 0.19 cd/m^2
White level = 105.00 cd/m^2
Aprox. gamma = 2.35
Contrast ratio = 539:1
White chromaticity coordinates 0.3140, 0.3320
White    Correlated Color Temperature = 6415K, DE 2K to locus =  5.7
White Correlated Daylight Temperature = 6414K, DE 2K to locus =  1.4
White        Visual Color Temperature = 6216K, DE 2K to locus =  5.5
White     Visual Daylight Temperature = 6370K, DE 2K to locus =  1.3
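An aside on where the color-temperature lines come from: Argyll derives CCT from the measured white chromaticity. McCamy's cubic approximation is not necessarily what Argyll uses internally, but it reproduces the reported numbers almost exactly and makes a handy sanity check:

```python
# McCamy's approximation for correlated color temperature from CIE xy.
# A sketch only; Argyll's own CCT computation may differ slightly.
def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)  # inverse slope relative to the epicenter
    return 449 * n**3 + 3525 * n**2 + 6823.3 * n + 5520.33

cct = cct_mccamy(0.3140, 0.3320)  # the i1D3 + ccss white point above -> ~6415K
```

Plugging in the 0.3140, 0.3320 white point above gives roughly 6415K, matching the report; the DTP-94's 0.3293, 0.3347 comes out near its reported 5642K as well.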

i1 Display Pro without ccss file:
Current calibration response:
Black level = 0.20 cd/m^2
White level = 104.62 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 535:1
White chromaticity coordinates 0.3081, 0.3352
White    Correlated Color Temperature = 6702K, DE 2K to locus = 10.6
White Correlated Daylight Temperature = 6696K, DE 2K to locus =  7.4
White        Visual Color Temperature = 6278K, DE 2K to locus = 10.2
White     Visual Daylight Temperature = 6422K, DE 2K to locus =  7.1

Eye One Pro (straight out of the case, onto the calibration plate, then on to the screen and immediately measured... usually I leave it on the screen for a little while to stabilize)
Current calibration response:
Black level = 0.22 cd/m^2
White level = 113.24 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 525:1
White chromaticity coordinates 0.3195, 0.3375
White    Correlated Color Temperature = 6105K, DE 2K to locus =  5.9
White Correlated Daylight Temperature = 6104K, DE 2K to locus =  1.5
White        Visual Color Temperature = 5925K, DE 2K to locus =  5.6
White     Visual Daylight Temperature = 6063K, DE 2K to locus =  1.4
The instrument can be removed from the screen.


i1 Display 2 (MDSVSENSOR II NEC)
Black level = 0.22 cd/m^2
White level = 114.18 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 515:1
White chromaticity coordinates 0.3261, 0.3504
White    Correlated Color Temperature = 5770K, DE 2K to locus =  9.6
White Correlated Daylight Temperature = 5770K, DE 2K to locus =  6.0
White        Visual Color Temperature = 5502K, DE 2K to locus =  9.3
White     Visual Daylight Temperature = 5617K, DE 2K to locus =  5.8

Spyder 3
Current calibration response:
Black level = 0.25 cd/m^2
White level = 98.02 cd/m^2
Aprox. gamma = 2.33
Contrast ratio = 392:1
White chromaticity coordinates 0.3091, 0.3354
White    Correlated Color Temperature = 6643K, DE 2K to locus = 10.2
White Correlated Daylight Temperature = 6638K, DE 2K to locus =  6.9
White        Visual Color Temperature = 6245K, DE 2K to locus =  9.9
White     Visual Daylight Temperature = 6388K, DE 2K to locus =  6.6

DTP-94
Current calibration response:
Black level = 0.20 cd/m^2
White level = 104.49 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 522:1
White chromaticity coordinates 0.3293, 0.3347
White    Correlated Color Temperature = 5642K, DE 2K to locus =  2.9
White Correlated Daylight Temperature = 5643K, DE 2K to locus =  7.1
White        Visual Color Temperature = 5719K, DE 2K to locus =  2.7
White     Visual Daylight Temperature = 5859K, DE 2K to locus =  6.8
The instrument can be removed from the screen.


Cheers, Joe

tony22

  • Full Member
  • Posts: 105
Re: Monitor calibration sensor evaluations
« Reply #141 on: October 26, 2011, 08:40:22 pm »

Ethan, my question may be out in left field but I'll ask it anyway. ;D How would you evaluate the i1Display Pro compared to the old Chroma5 colorimeter for calibrating a plasma HT display? The Chroma5 (once calibrated to NIST standards) is a particularly good match for that kind of work, with excellent performance at very low luminance. If the new i1DP is better in that area and has great color accuracy, it might be good for this purpose.

Ethan_Hansen

  • Full Member
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #142 on: October 26, 2011, 11:48:47 pm »

Joe: You are correct in that wide-gamut measurement errors in the DTP-94 are mainly in color (the x/y chromaticity coordinates Argyll reports) rather than luminance. The with/without ccss luminance values from the i1D3 are within measurement noise, particularly if the location on the screen shifted even slightly. I don't know what to make of the i1D2 and i1 Pro luminance values. We could not get any reply from NEC about the calibration process they use for their OEM-branded i1D2 units. Some vendors individually calibrated each puck, while others simply added generic correction matrix values. No clue about NEC. I am suspicious here about the cal. on the i1D2 and i1Pro. How long since your i1Pro went back to X-Rite for calibration? If you want to play the mailing game, send me your i1Pro and we'll measure it.

The chromaticity differences between the i1Pro and i1Display Pro + ccss are small: 1.7 dE-2K. You are right about the Spyder 3. It looks best suited to being a futuristic paperweight. Did you buy it before 2010? If so, those were the days of random Spyder performance.



Tony: I haven't a clue about the Chroma5 and plasma displays. The Chroma5 design was way ahead of its time in terms of filter set, thermal stabilization, and low-level resolution. I think the guy now at X-Rite responsible for much of the i1Display Pro design was running things at Sequel in Chroma5 days (Tom Lianza, who also moonlights as the chair of the ICC). The people at SpectraCal could answer your comparison questions. They now support the i1Display Pro for their CalMAN A/V calibration product, and have years of experience with the Chroma5.

shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #143 on: October 27, 2011, 03:39:09 am »

Joe: You are correct in that wide-gamut measurement errors in the DTP-94 are mainly in color (the x/y chromaticity coordinates Argyll reports) rather than luminance. The with/without ccss luminance values from the i1D3 are within measurement noise, particularly if the location on the screen shifted even slightly. I don't know what to make of the i1D2 and i1 Pro luminance values. We could not get any reply from NEC about the calibration process they use for their OEM-branded i1D2 units. Some vendors individually calibrated each puck, while others simply added generic correction matrix values. No clue about NEC. I am suspicious here about the cal. on the i1D2 and i1Pro. How long since your i1Pro went back to X-Rite for calibration?

The i1Pro was due for its annual checkup on 5/4/2011, so it's possible that it needs a tweak. The i1D2 was actually just recently replaced by NEC. The previous puck had started to produce magenta casts. I'm sure it's a refurb but in theory it's been checked (of course that doesn't mean it couldn't have slipped through QA).

Quote
If you want to play the mailing game, send me your i1Pro and we'll measure it.

Let me know what you think at the end of this... the second pass of results only leaves room for a few conclusions...

1) The monitor I used for the tests in the previous post has some CCFLs that are beginning to go whackadoodle (although I think I would notice a 10 cd/m^2 fluctuation, and if the backlight were really fluctuating I wouldn't have expected multiple sequential measurements to agree, yet they did).
2) User error - I thought I was being cautious with placement, attempting to line up the sensor aperture with the middle of the patches that Argyll puts up. Perhaps I wasn't as precise as I thought I was. This time around I created a perfectly centered target in Photoshop (at 2560x1600 and 1920x1200) and made it my background, then used it as a guide to align the sensor apertures (I also ran the tests on my NEC, which is significantly more linear).
3) I had less than a glass of wine with dinner, but perhaps I had more than I thought I did?  ;D

Option 2 would seem to be the most plausible explanation.

Quote
The chromaticity differences between the i1Pro and i1Display Pro + ccss are small: 1.7 dE-2K. You are right about the Spyder 3. It looks best suited to being a futuristic paperweight. Did you buy it before 2010? If so, those were the days of random Spyder performance.

The Spyder 3 was actually recently replaced as well. My original was generating green casts in the shadows. Datacolor sent me a new one (no questions asked) in... maybe May? Clearly this copy is not very good at measuring shadows. The reported contrast ratios across all the other sensors are very consistent, but the Spyder 3 indicates a significantly lower contrast ratio due to the higher black point it's reporting (I ran it multiple times and the numbers coming back were consistent, so it's not like the Eye One Pro, which kicks back different numbers every time you take a single snapshot).

So more results... Here's what i1 Profiler has to say:

1st pass on HP
Eye One Pro = 103 cd/m^2
i1 Display Pro = 102 cd/m^2

2nd pass a few hours later on NEC:
Eye One Pro = 99 cd/m^2
i1 Display Pro = 101 cd/m^2

Well, that isn't what I saw before. Hmmm... I must be going Looney Tunes. I swear there was a pretty big difference between the Eye One Pro and the i1 Display Pro, and indeed there does appear to be in Argyll, at least with the previous tests. Let's try this again in Argyll. This time all the sensors had time to bake on screen for about 20 minutes and I used an NEC 2690 (previous tests were done on an HP LP3065... the NEC's luminance across the screen is much more linear, so that, along with the aid of the target as the background, should help mitigate placement inconsistencies a bit).


Eye One Pro:
Current calibration response:
Black level = 0.48 cd/m^2
White level = 106.24 cd/m^2
Aprox. gamma = 2.14
Contrast ratio = 220:1
White chromaticity coordinates 0.3165, 0.3324
White    Correlated Color Temperature = 6279K, DE 2K to locus =  4.4
White Correlated Daylight Temperature = 6280K, DE 2K to locus =  0.3
White        Visual Color Temperature = 6136K, DE 2K to locus =  4.2
White     Visual Daylight Temperature = 6288K, DE 2K to locus =  0.3


_______________________________

i1 Display Pro (without ccss):
Current calibration response:
Black level = 0.49 cd/m^2
White level = 100.17 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 203:1
White chromaticity coordinates 0.3038, 0.3310
White    Correlated Color Temperature = 6977K, DE 2K to locus = 10.6
White Correlated Daylight Temperature = 6970K, DE 2K to locus =  7.5
White        Visual Color Temperature = 6504K, DE 2K to locus = 10.3
White     Visual Daylight Temperature = 6660K, DE 2K to locus =  7.2

_______________________________

i1 Display Pro (with ccss):
Current calibration response:
Black level = 0.50 cd/m^2
White level = 100.58 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 203:1
White chromaticity coordinates 0.3104, 0.3274
White    Correlated Color Temperature = 6643K, DE 2K to locus =  5.0
White Correlated Daylight Temperature = 6642K, DE 2K to locus =  0.5
White        Visual Color Temperature = 6452K, DE 2K to locus =  4.8
White     Visual Daylight Temperature = 6623K, DE 2K to locus =  0.5

_______________________________

i1 Display 2
Current calibration response:
Black level = 0.57 cd/m^2
White level = 111.27 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 197:1
White chromaticity coordinates 0.3237, 0.3472
White    Correlated Color Temperature = 5883K, DE 2K to locus =  9.1
White Correlated Daylight Temperature = 5882K, DE 2K to locus =  5.5
White        Visual Color Temperature = 5618K, DE 2K to locus =  8.8
White     Visual Daylight Temperature = 5738K, DE 2K to locus =  5.3

_______________________________

Spyder 3
Current calibration response:
Black level = 0.62 cd/m^2
White level = 101.75 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 163:1
White chromaticity coordinates 0.3171, 0.3305
White    Correlated Color Temperature = 6262K, DE 2K to locus =  2.7
White Correlated Daylight Temperature = 6263K, DE 2K to locus =  2.2
White        Visual Color Temperature = 6176K, DE 2K to locus =  2.6
White     Visual Daylight Temperature = 6335K, DE 2K to locus =  2.1

_______________________________

DTP-94
Current calibration response:
Black level = 0.50 cd/m^2
White level = 103.03 cd/m^2
Aprox. gamma = 2.15
Contrast ratio = 206:1
White chromaticity coordinates 0.3340, 0.3388
White    Correlated Color Temperature = 5431K, DE 2K to locus =  2.8
White Correlated Daylight Temperature = 5430K, DE 2K to locus =  7.0
White        Visual Color Temperature = 5500K, DE 2K to locus =  2.7
White     Visual Daylight Temperature = 5625K, DE 2K to locus =  6.8


Alright... I thought I was being pretty careful with placement before (this time I'm being even more careful)...

Okay... let's try something else...

Spectraview II (on the NEC 2690; I probably should have used the 2690 before. Since I'm using a target to align the sensors this time around, even if I'm off, the difference will be less than 0.5 cd/m^2 within the measurement circle):
Eye One Pro - 104.4 cd/m^2 @ 6135ºK CIE Coordinates=0.3192,0.3339
i1 Display Pro - 101.3 cd/m^2 @ 6526ºK CIE Coordinates=0.3124,0.3287
i1 Display 2 (MDSVSENSOR II) - 111.9 cd/m^2 @ 5887ºK CIE Coordinates=0.3239,0.3417 (this right here is a surprise... the Eye One Pro and this sensor were usually within 250ºK of one another... I went for a 2nd opinion on this one with BasICColor Display and it pretty much agreed, 110.88 cd/m^2 @ 5822ºK... a 3rd opinion from Argyll is in that ballpark too)
DTP-94 - 104.7 cd/m^2 @ 5373ºK CIE Coordinates=0.3353,0.3396
Spyder 3 - 102.4 cd/m^2 @ 6061ºK CIE Coordinates=0.3208,0.3326

What to make of it? I'm not sure. Maybe I need to do this a few more times, being more methodical, before I draw any conclusions. As for sending my Eye One Pro over your way, I'd be very interested in knowing how it does compared to everything else you've tested BUT... seeing as it's past its recommended recalibration appointment, I'm not sure if it would be terribly useful for you in terms of adding another instrument to your results. That said, you're welcome to any of my sensors if you need more data points for your comparisons.

Cheers, Joe
« Last Edit: October 27, 2011, 12:53:28 pm by shewhorn »

hjulenissen

  • Sr. Member
  • Posts: 2051
Re: Monitor calibration sensor evaluations
« Reply #144 on: October 27, 2011, 06:51:38 am »

The 2711 was one of the displays we used for evaluations. The average i1D3 measurement error was 1.4 dE-2K on white and 2.6 dE-2K on black. Only the Discus performed better (0.7/1.9 dE-2K). I would, however, recommend getting the full version of the i1D3 rather than the CM Display, as this gives you i1Profiler as well as the ability to use Argyll. A couple of software suppliers are actively working on adding DDC control to Dell Ultrasharps, the 2711 in particular. If they are successful, the resulting profiles will likely surpass Argyll's in quality.
This is very interesting!

I had not even considered that it would be possible to control the display LUT from the computer, seeing as Dell never advertised that feature. Do you have any references to such discussions?

I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?
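The requantization point can be made concrete with a toy LUT: apply the same mild gamma tweak through an 8-bit LUT and through a hypothetical 10-bit monitor LUT, and count how many distinct output codes survive. A sketch, not any real display's behaviour:

```python
# Toy illustration of the 8-bit bottleneck: the same gamma tweak applied in
# an 8-bit LUT merges some input codes together (banding), while a 10-bit
# monitor LUT keeps all 256 input codes distinct. Numbers are illustrative.
def build_lut(levels_out, gamma=2.35 / 2.2, n_in=256):
    return [round((levels_out - 1) * (i / (n_in - 1)) ** gamma) for i in range(n_in)]

codes_8bit = len(set(build_lut(256)))    # fewer than 256 distinct levels survive
codes_10bit = len(set(build_lut(1024)))  # all 256 inputs stay distinct
```

The estimation step (measuring the display with Argyll) is unaffected either way; only where the correction is applied changes the quantization loss.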

Being able to automatically switch between a calibrated sRGB response for day-to-day use and a wide-gamut native response whenever a color-management aware application was running (e.g. Lightroom) would really mean a lot to me. Where do I donate the money? :-)

-h

Edit:
Found a link or two that seems relevant:
http://answerpot.com/showthread.php?712963-10-but+LUTs+and+the+Dell+U2711
http://forums.adobe.com/message/3991911
http://colorhacks.blogspot.com/2008/02/monitors-with-internal-luts.html
http://ddccontrol.sourceforge.net/
http://forums.entechtaiwan.com/index.php?topic=7463.0
http://en.community.dell.com/support-forums/peripherals/f/3529/t/19363520.aspx
« Last Edit: October 27, 2011, 07:18:31 am by hjulenissen »

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: Monitor calibration sensor evaluations
« Reply #145 on: October 27, 2011, 10:22:59 am »

I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?

I had a conversation yesterday with display guru Karl Lang, who produced the PressView and Sony Artisan. He stated that a display LUT is always preferable and that software LUTs are typically not the way to go, for a number of reasons. Hopefully we can convince Karl to come out and discuss this here in more detail. The bottom line he proposed is that units like NEC's, which load a linear LUT in the graphic system and then apply the corrections in the panel LUT, are the way to handle these tasks (for example, he stated that it's impossible to control contrast ratio and black level at the same time using the graphic system LUT, if I understood correctly).

We can discuss differences in dE between devices, but if the process is creating corrections on the graphic system instead of in a high-bit LUT in the panel, the differences are moot.
http://www.digitaldog.net/
Author "Color Management for Photographers".

Ethan_Hansen

  • Full Member
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #146 on: October 28, 2011, 01:37:49 am »

A set of comments:

Joe: I'm not sure what is going on there. The approach we use to collect data is as follows:
  • Mark a location on the screen for the puck(s) to ensure consistent placement. I don't recommend breaking out the Sharpies; we place blue painter's tape on each edge of the bezel with horizontal and vertical alignment marks. A straightedge does the rest.
  • Hang all the pucks on the screen for 20 minutes or so to warm up.
  • Choose a sensor, place it in the marked location, and measure away on a white patch. We record 100 readings and average (hacked versions of Argyll and a tool provided by X-Rite).
  • Repeat the step above for all sensors.
  • Unplug each unit from the USB port, plug back in, and repeat the above two steps twice more.
  • Have a drink. You'll need it.
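The averaging step above is what buys repeatability: with 100 samples, the standard error of the mean is a tenth of the single-reading noise. A minimal sketch, where read_white is a hypothetical stand-in for one Argyll/X-Rite measurement call:

```python
# Average repeated sensor readings; the standard error of the mean shrinks
# as 1/sqrt(n). read_white() stands in for a single spot measurement.
import random
import statistics

def average_readings(read_white, n=100):
    samples = [read_white() for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples) / n ** 0.5

# Simulated puck: true white 105 cd/m^2 with 0.5 cd/m^2 of read noise.
random.seed(0)
mean, sem = average_readings(lambda: random.gauss(105.0, 0.5))
```

With 0.5 cd/m^2 of single-reading noise, the averaged value is good to roughly ±0.05 cd/m^2.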
Your latest data make everything but the i1D2 look reasonable. Luminance readings are fairly close - as expected - and the color temperature variations are certainly within the sensor accuracy values we saw. I would try the NEC sensor with NEC's software. If I remember correctly, SVII applies a correction matrix known only to BasICColor and NEC.


This is very interesting!

I had not even considered that it would be possible to control the display LUT from the computer as long as Dell never advertised that feature. Do you have any references to such discussions?

I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?
Dell makes offhand mention of DDC in the online documentation. From what I have been told, the problem with DDC and Dell stems from Dell's transfer of all firmware development to a new group when the transition from CRT to LCD occurred. Some LCD Dell monitors returned an EDID (the "Extended Display Identification Data" tag used to identify a particular panel) for a previous generation CRT. There are other quirks with Dell's DDC implementation, of which their own engineers may or may not be aware.

DDC offers several advantages. First, adjustments to the monitor usually use a finer scale than you get through the OSD. Second, even with monitors having only 8-bit internal LUTs, making some of the necessary corrections in the monitor LUT often produces fewer calibration artifacts. Finally, for high-bit monitors, adjustments that create visible banding when performed only through the video card are smooth as can be when made in the monitor.
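For the curious, the DDC/CI message carrying such an adjustment is tiny. A sketch of the MCCS "Set VCP Feature" payload (VCP code 0x10 is luminance), assuming the standard XOR checksum over the destination address and message bytes; the actual I2C plumbing (e.g. /dev/i2c-* or a tool like ddcutil) is omitted:

```python
# Build a DDC/CI "Set VCP Feature" message as described by MCCS.
# The checksum is the XOR of the destination address as written on the
# wire (0x6E) and every message byte. Sketch only - no bus access here.
from functools import reduce
from operator import xor

def set_vcp(vcp_code, value):
    msg = [0x51, 0x84, 0x03, vcp_code, (value >> 8) & 0xFF, value & 0xFF]
    msg.append(reduce(xor, msg, 0x6E))  # append checksum byte
    return msg

packet = set_vcp(0x10, 50)  # VCP 0x10 = luminance, set to 50
```

A receiver validates the packet by XORing 0x6E with every byte including the checksum; the result must be zero.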


Quote from: digitaldog
I had a conversation yesterday with display guru Karl Lang, who produced the PressView and Sony Artisan. He stated that a display LUT is always preferable and that software LUTs are typically not the way to go, for a number of reasons. Hopefully we can convince Karl to come out and discuss this here in more detail. The bottom line he proposed is that units like NEC's, which load a linear LUT in the graphic system and then apply the corrections in the panel LUT, are the way to handle these tasks (for example, he stated that it's impossible to control contrast ratio and black level at the same time using the graphic system LUT, if I understood correctly).

I'll agree with Karl for the most part. The hybrid approach used by Frank Herbert and the ICS folks, splitting adjustments between the video card and the monitor, gave smoother gradients and better shadow resolution on an Artisan than GMB/Sony's own code. You can control contrast ratio and black level through the video card alone, although you may not like the resulting posterization.

shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #147 on: October 28, 2011, 03:06:17 am »

We record 100 readings and average (hacked versions of Argyll and a tool provided by X-Rite).

I haven't written a single line of code since September of 2002, when I left tech, but... if it's in C or C++ I'm guessing that would be a fairly straightforward change to make. I might have to dust off gcc (errr... make that install it first... I figured it would be part of the standard UNIX install on OS X).

Cheers, Joe

shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #148 on: October 28, 2011, 03:33:27 am »

Dell makes offhand mention of DDC in the online documentation. From what I have been told, the problem with DDC and Dell stems from Dell's transfer of all firmware development to a new group when the transition from CRT to LCD occurred. Some LCD Dell monitors returned an EDID (the "Extended Display Identification Data" tag used to identify a particular panel) for a previous generation CRT. There are other quirks with Dell's DDC implementation, of which their own engineers may or may not be aware.

On a somewhat related but slightly tangential note, have you ever seen curious behavior in any of Dell's monitors with regard to how the monitor LUTs behave? I tested an Asus PA246Q. All things considered, for a sub-$500 monitor it had some nice features... 12-bit LUT, 10-bit panel... it had two fatal flaws though:

1) It couldn't go lower than 135 cd/m^2
2) ANY change to the monitor LUTs resulted in an alarming increase in ∆E... as in average ∆E in the 10+ range.

I first noticed this when using CEDP to control it via DDC. The validation results could have been the lead character in a horror film. Multiple attempts produced identical results. DDC being what it is, I figured their implementation might not have been standard, so I reset the monitor and figured I'd do it the old-fashioned way, by adjusting the RGB gains and offsets by hand. This produced the same results. If you set the RGB levels to anything other than 0,0,0, the average ∆E would jump to 10+. It didn't matter if it was 1,0,0 or -20,+17,-4... any change resulted in a massive average ∆E.
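For context, the "average ∆E" a validation reports is simply the mean color error over the test patches. CEDP almost certainly uses a more modern formula than CIE76, but a CIE76 sketch with invented Lab values shows the arithmetic:

```python
# Average Delta E (CIE76 for simplicity) over a set of validation patches.
# CIE76 is just Euclidean distance in Lab. Patch values here are invented.
def delta_e76(lab1, lab2):
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

measured = [(52.0, 1.0, -2.0), (70.5, -3.0, 4.0)]
reference = [(50.0, 0.0, 0.0), (70.0, 0.0, 0.0)]
avg_de = sum(delta_e76(m, r) for m, r in zip(measured, reference)) / len(measured)
```

An average ∆E of 10+ over a validation set means the errors are plainly visible on screen, not subtle instrument-level disagreements.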

I found this to be a rather curious behavior. A month later I tested a Dell U2410. SAME EXACT behavior. Similar specifications... 12 bit LUT, 10 bit panel (I believe they both used the same panel)... I kind of have a sneaking suspicion that Dell and Asus OEMed the guts and contracted the engineering from the same place. That's just too much of a coincidence.

Cheers, Joe

hjulenissen

  • Sr. Member
  • Posts: 2051
Re: Monitor calibration sensor evaluations
« Reply #149 on: October 28, 2011, 03:35:38 am »

I'll agree with Karl for the most part. The hybrid approach used by Frank Herbert and the ICS folks, splitting adjustments between the video card and the monitor, gave smoother gradients and better shadow resolution on an Artisan than GMB/Sony's own code. You can control contrast ratio and black level through the video card alone, although you may not like the resulting posterization.
Doing stuff PC-side offers flexibility and adaptivity. Doing stuff display-side means that you (potentially) avoid requantizing to 8 bits.

Seems to me that doing 1D 8->10/12-bit gamma on the native primaries in the monitor LUT, and perhaps large-scale channel gains (brightness, white point), makes sense, while doing complex rotations is probably a job for your favourite CM-aware application (e.g. Photoshop), which has complete access to 1) the characteristics of the camera used and 2) the profile of the subsequent image pipeline.

-h

hjulenissen

  • Sr. Member
  • Posts: 2051
Re: Monitor calibration sensor evaluations
« Reply #150 on: October 28, 2011, 03:38:29 am »

I first noticed this when using CEDP to control it via DDC. The validation results could have been the lead character in a horror film. Multiple attempts produced identical results. DDC being what it is, I figured their implementation might not have been standard, so I reset the monitor and figured I'd do it the old-fashioned way, by adjusting the RGB gains and offsets by hand. This produced the same results. If you set the RGB levels to anything other than 0,0,0, the average ∆E would jump to 10+. It didn't matter if it was 1,0,0 or -20,+17,-4... any change resulted in a massive average ∆E.
What is CEDP?

Could it be that all/most internal processing is applied as a modification of the internal LUT, and that the factory (unknown) LUT is very irregular to compensate for nasty panel behaviour?

-h

shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #151 on: October 28, 2011, 04:12:22 am »

What is CEDP?

Color Eyes Display Pro

Quote
Could it be that all/most internal processing is applied as a modification of the internal LUT, and that the factory (unknown) LUT is very irregular to compensate for nasty panel behaviour?

I don't think so. The panel it uses is the same as in the NEC PA241W, Eizo CG243W, Eizo CG245W and a few others, so it's keeping some pretty good company. That doesn't mean it will perform the same as those monitors. I certainly don't expect it to have the same luminance uniformity as the Eizos and NECs, and the quality of the CCFLs they use could have an impact as well, but I don't think that's enough to throw performance so far off base that it requires drastic corrections like that. I'm speculating of course, so I could be wrong, but it would seem unlikely.

Just an additional note... unlike the Asus, the Dell U2410 was able to get down to quite usable luminance levels and had enough headroom underneath (wouldn't that be footroom?) that you weren't running it at an extreme. If I remember correctly, with the luminance all the way down the monitor was at 89 cd/m^2 (I'd have to double-check that; at any rate, I remember being surprised, as Dells have historically required a welder's mask).

Cheers, Joe

ThDo

  • Newbie
  • Posts: 32
Re: Monitor calibration sensor evaluations
« Reply #152 on: October 30, 2011, 02:47:54 am »

@shewhorn

Here are my Spectraview II values.

Spyder 3: CIE 0.306, 0.338 at 114.24 cd/m²
i1 Pro: CIE 0.313, 0.335 at 116.63 cd/m²
i1 Display Pro: CIE 0.313, 0.330 at 114.33 cd/m²

What did you do to see 4 decimal places?


shewhorn

  • Sr. Member
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #153 on: October 31, 2011, 02:56:43 am »

What did you do to see 4 decimal places?

In target settings, click on edit, then select custom for white point and click edit again. This brings up the measurement screen, which gives you 4 decimal places vs. the colorimeter window, which only gives you 3.

Ethan... Here's another go, all in Spectraview II this time (on the NEC2690).

Eye One Pro = 106.32 cd/m^2 @ 6145ºK
i1 Display Pro = 100.19 cd/m^2 @ 6427ºK
i1 Display 2 (MDSVSENSOR II) = 112.08 cd/m^2 @ 6050ºK
Spyder 3 = 101.69 cd/m^2 @ 6100ºK
DTP-94 = 102.41 cd/m^2 @ 5500ºK

For quick reference here's the results from a few days ago:

Eye One Pro - 104.4 cd/m^2 @ 6135ºK
i1 Display Pro - 101.3 cd/m^2 @ 6526ºK
i1 Display 2 (MDSVSENSOR II) - 111.9 cd/m^2 @ 5887ºK
Spyder 3 - 102.4 cd/m^2 @ 6061ºK
DTP-94 - 104.7 cd/m^2 @ 5373ºK

So for color temperature, the i1 Display Pro is consistently reading a bit cooler than the Eye One Pro, i1 Display 2 (MDSVSENSOR II), and Spyder 3, which are all relatively close to one another (we'll consider the DTP-94's color temperature results invalid).
For luminance, this time around the i1 Display Pro, Spyder 3, and DTP-94 are relatively close, the Eye One Pro falls in the middle of the pack, and the i1 Display 2 (MDSVSENSOR II) reads higher than everything else.
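To put numbers on those clusters, here is a quick sketch (plain Python, readings copied from the two SpectraView runs posted in this thread) that computes the inter-instrument luminance spread and the run-to-run drift per puck:

```python
# Luminance (cd/m^2) and color temperature (K) per instrument,
# as (today's run, run from a few days ago) -- numbers as posted above.
runs = {
    "Eye One Pro":    ((106.32, 6145), (104.4, 6135)),
    "i1 Display Pro": ((100.19, 6427), (101.3, 6526)),
    "i1 Display 2":   ((112.08, 6050), (111.9, 5887)),
    "Spyder 3":       ((101.69, 6100), (102.4, 6061)),
    "DTP-94":         ((102.41, 5500), (104.7, 5373)),
}

# Inter-instrument luminance spread within today's run
lums = [today[0] for today, _ in runs.values()]
print(f"spread today: {max(lums) - min(lums):.2f} cd/m^2")  # 11.89

# Run-to-run drift per instrument
for name, (today, before) in runs.items():
    print(f"{name}: dLum {today[0] - before[0]:+.2f} cd/m^2, "
          f"dCCT {today[1] - before[1]:+d} K")
```

The run-to-run luminance drift stays within a couple of cd/m^2 per instrument, while the color temperature drift between runs is largest for the i1 Display 2 and DTP-94.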

I used a slightly different methodology today... Not only were all the pucks plugged in and on the screen for quite a while, as before, I also ran them in continuous measurement mode until the values stabilized. I figured being plugged in and on the screen for a long time would be enough to get them to operating temperature, but the Eye One Pro's reported color temperature changes quite a bit... it takes maybe five minutes to stabilize, and the reading can change by 150ºK. All of the pucks' color temperature readings changed during continuous measurement, with the exception of the Spyder 3, which pretty much didn't change. The Eye One Pro, i1 Display 2, and DTP-94 reported cooler color temperatures as they heated up, while the i1 Display Pro's reading got warmer (though it changed less than the others, again excepting the aforementioned Spyder 3). The luminance readings were all pretty consistent.
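That "measure continuously until the values settle" step can be automated. Here is a minimal sketch; `wait_until_stable` and `simulated_cct` are hypothetical names, and the simulated sensor (which warms toward 6450ºK, starting about 150ºK away, roughly as described above) stands in for whatever continuous-read call your instrument's SDK actually provides:

```python
import math

def wait_until_stable(read_value, tol=5.0, window=5, max_polls=600):
    """Poll read_value() until the last `window` readings span less
    than `tol` (e.g. 5 K of color temperature), then return their
    mean; give up after max_polls readings."""
    history = []
    for _ in range(max_polls):
        history.append(read_value())
        recent = history[-window:]
        if len(history) >= window and max(recent) - min(recent) < tol:
            return sum(recent) / window
    raise TimeoutError("sensor never stabilized")

# Demonstration: a simulated puck that warms up asymptotically.
t = [0]
def simulated_cct():
    t[0] += 1
    return 6450 - 150 * math.exp(-t[0] / 10)

print(round(wait_until_stable(simulated_cct)))
```

A five-reading window with a 5 K tolerance is just an illustrative choice; with a real puck you would also want a delay between polls so the window spans enough wall-clock time.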

Cheers, Joe
Logged

Alan Goldhammer

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 4344
    • A Goldhammer Photography
Re: Monitor calibration sensor evaluations
« Reply #154 on: October 31, 2011, 08:04:23 am »

Joe, thanks for the work on this. I've noticed that it can take a while for various instruments to stabilize as well. Most of my calibration (NEC P221 with SpectraView) is done with the i1 Display 2 (MDSVSENSOR II), and I find that the first calibration run invariably gives very poor results with a high delta E. Second and third runs repeated right after are stable, and readings seldom differ by more than 1%. Before I sold my ColorMunki, I compared it with the i1 Display 2 and they were pretty much equivalent. Since I'm satisfied with the current calibration approach, I haven't experimented with my i1 Pro, but maybe I will the next time it's due for a calibration.

Alan
Logged

Czornyj

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 1948
    • zarzadzaniebarwa.pl
Re: Monitor calibration sensor evaluations
« Reply #155 on: October 31, 2011, 08:41:57 am »

I used a slightly different methodology today... Not only were all the pucks plugged in and on the screen for quite a while, as before, I also ran them in continuous measurement mode until the values stabilized. I figured being plugged in and on the screen for a long time would be enough to get them to operating temperature, but the Eye One Pro's reported color temperature changes quite a bit... it takes maybe five minutes to stabilize, and the reading can change by 150ºK.

Did you recalibrate the i1Pro after it was stabilized?
Logged
Marcin Kałuża | [URL=http://zarzadzaniebarwa

Czornyj

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 1948
    • zarzadzaniebarwa.pl
Re: Monitor calibration sensor evaluations
« Reply #156 on: October 31, 2011, 09:13:08 am »

I find that the first calibration run invariably gives very poor results with a high delta E.  2nd and third runs repeated right after are stable and readings seldom differ by more than 1%.

It's something different - the CCFL backlight also needs some time to stabilise, so it may drift a bit in the period between calibration and validation. That's why you get consistent results and lower deltas on subsequent runs - you can also check "Extended luminance stabilization time" in SpectraView II Preferences > Calibration tab.
Logged
Marcin Kałuża | [URL=http://zarzadzaniebarwa

digitaldog

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: Monitor calibration sensor evaluations
« Reply #157 on: October 31, 2011, 10:35:32 am »

you can also check "Extended luminance stabilization time" in SpectraView II Preferences > Calibration tab.

Yup, I always have that on. It slows the process way down but who cares. As long as you don’t have a screen saver come on during this longer process, set it and forget it.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Alan Goldhammer

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 4344
    • A Goldhammer Photography
Re: Monitor calibration sensor evaluations
« Reply #158 on: October 31, 2011, 10:45:51 am »

It's something different - the CCFL backlight also needs some time to stabilise, so it may drift a bit in the period between calibration and validation. That's why you get consistent results and lower deltas on subsequent runs - you can also check "Extended luminance stabilization time" in SpectraView II Preferences > Calibration tab.
Thanks for the tip; I just ticked the box and will try it the next time I calibrate.  Andrew's point about the screensaver is a good one (though I always turn mine off while calibrating).
Logged

shewhorn

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 537
    • http://
Re: Monitor calibration sensor evaluations
« Reply #159 on: October 31, 2011, 08:13:27 pm »

Did you recalibrate the i1Pro after it was stabilized?

As fast as I possibly could... I closed the "colorimeter" window (which was taking continuous measurements), put the Eye One Pro on its calibration tile and covered it (not to use the tile, just for the lack of light while it sits on it), quit the application and restarted it, recalibrated the instrument, realigned the sensor, and started the monitor calibration. So yes! Even so, I've noticed that having the Eye One Pro out of continuous measurement mode for even 30 seconds will cause the color temperature to drift a bit. I knew it was sensitive to heat, but I never knew just how sensitive it was!

Cheers, Joe
Logged