Luminous Landscape Forum

Raw & Post Processing, Printing => Colour Management => Topic started by: Ethan_Hansen on April 29, 2011, 06:06:03 pm

Title: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on April 29, 2011, 06:06:03 pm
We have been working on a thorough evaluation (http://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html) of monitor measurement hardware. We used a reference-grade spectroradiometer (Photo Research PR-730 (http://www.photoresearch.com/current/pr730.asp)) to compare sensor performance on a variety of monitors. All were IPS displays, ranging from entry-level sRGB-gamut models with CCFL backlights (e.g. NEC 231wmi, Dell U2311H) to Adobe RGB-capable wide-gamut models with either CCFL backlights (e.g. NEC PA-241W, Eizo CG243W) or RGB LED backlighting (HP LP2480zx). We measured multiple samples of each sensor to quantify inter-instrument agreement on a white background. Each monitor was then measured at a 6500K, 150 cd/m2 white level, at middle gray, and at the lowest usable black level as measured by ColorEyes Display (http://www.integrated-color.com/cedpro/coloreyesdisplay.html); roughly the level where RGB (0, 0, 0) is distinguishable from (1, 1, 1).

Colorimeters measured were the X-Rite DTP-94 and Eye-One Display 2 (http://www.amazon.com/gp/product/B000JLO31M/ref=as_li_ss_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399349&creativeASIN=B000JLO31M), and the Datacolor Spyder 3 Elite (http://www.amazon.com/gp/product/B00372561Q/ref=as_li_ss_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399349&creativeASIN=B00372561Q). These were compared to the X-Rite Eye-One Pro (http://www.amazon.com/gp/product/B001NJ0C96/ref=as_li_ss_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399349&creativeASIN=B001NJ0C96) spectrophotometer and, for the sake of experimentation, to the venerable Spectrolino. We measured a minimum of 10 samples for each of the above sensors. We had access to a single ColorMunki (http://www.amazon.com/gp/product/B00169N0BK/ref=as_li_ss_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399349&creativeASIN=B00169N0BK). While we could not compare unit-to-unit stability for the ColorMunki, the accuracy measurements were interesting.

For the full details, read the article (http://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html) on our site.

Summarizing the results: the DTP-94 and the spectrophotometers had the best inter-instrument agreement. Any given DTP-94 or i1Pro can be counted on to provide consistent performance, be it good or bad. The Spyder 3 showed more variability. We only compared units manufactured after mid-2009; older Spyders were all over the place in performance. Trailing the pack was the Eye-One Display. There was over twice as much average unit-to-unit variability between Eye-One Displays as between the two most different DTP-94 sensors. The two most different i1 Displays (out of 16 total) differed by 14 dE-2000 in the color they reported as monitor white. That's a lot. If you have multiple monitors to calibrate and use an Eye-One Display, be sure to calibrate them all with the same sensor.
(http://www.drycreekphoto.com/images/calibration/SensorVariability.png)
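
If you want to run the same unit-to-unit comparison on your own instruments, here is a minimal sketch of the calculation. It assumes the third-party colour-science Python package, and the XYZ readings are made-up placeholders, not our data:

    # Worst pairwise dE-2000 between the monitor whites reported by
    # several samples of the same instrument model.
    # Requires: pip install colour-science numpy
    # The XYZ readings below are hypothetical placeholders.
    import itertools
    import numpy as np
    import colour

    # Absolute XYZ (cd/m2) of the same monitor white, one row per unit.
    whites_XYZ = np.array([
        [142.1, 150.3, 163.8],
        [140.7, 148.9, 160.2],
        [144.0, 151.8, 168.5],
    ])

    # Normalize so the nominal white maps to Y = 1, then convert to Lab
    # (colour-science's default reference white is D65).
    Y_ref = whites_XYZ[:, 1].mean()
    labs = [colour.XYZ_to_Lab(xyz / Y_ref) for xyz in whites_XYZ]

    # The "two most different" units: the worst pairwise difference.
    worst = max(colour.delta_E(a, b, method='CIE 2000')
                for a, b in itertools.combinations(labs, 2))
    print(f"Max unit-to-unit dE-2000 at white: {worst:.1f}")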

We next measured accuracy as referenced to our PR-730 on each monitor at white, middle gray, and black. The results were nicely summarized by comparing accuracy on a standard gamut (96% sRGB, 66% Adobe RGB coverage) CCFL monitor and a wide-gamut (168% sRGB, 115% Adobe RGB) LED-backlit screen.
(http://www.drycreekphoto.com/images/calibration/SensorAccuracy.png)

Here we can see the strengths and weaknesses of each instrument. Spectrophotometers, with the exception of the ColorMunki, have no problems measuring lighter values on any display type. Problems occur when luminance values drop down into the mud. The standard-gamut monitor had a black point luminance of 0.19 cd/m2; the wide-gamut panel's was a mere 0.11 cd/m2. Most of what the spectrophotometers recorded at these low levels was measurement noise. Monitor profiles made from these devices exhibited color tinges in what should have been neutral shadow areas.

We only had one ColorMunki unit at our disposal; its performance was comparatively poor. I cannot say if the Munki we measured was the runt of the litter or if our results are indicative of less-than-stellar performance for the product line. Given that the ColorMunki is marketed as an entry-level device, the performance we saw may not be surprising. We saw consistently high error levels on all monitors with the Munki. If your livelihood depends on how accurately your monitors render images, the ColorMunki is not the best choice.

Among the colorimeters, the DTP-94 turned in consistently good performance on standard-gamut displays and equally consistently poor performance on all wide-gamut models. The Spyder 3 turned in the best wide-gamut performance of any of the standard colorimeters, essentially equal to what we measured from it on sRGB-gamut monitors. Inter-instrument variability is a problem with the Spyder 3, however: although the average accuracy was good, an individual unit may stray well from that average, and few photographers have a dozen Spyders at hand to average out when calibrating their screens. Overall, I would still rate the Spyder 3 as the best option we evaluated for profiling wide-gamut displays.

The Eye-One Display trails the pack. If we took only the best-performing 50% of the i1 Display units - ones clustered within +/- 2 dE-2000 of each other - the story would be different. All the i1 units were relatively new, the oldest manufactured in 2008, the newest this year. Our first thought was that we might be seeing the effects of the internal plastic filters aging. Filter aging is a real problem for the i1 Display - and any other colorimeter using plastic filters - and another advantage of the DTP-94 and its glass filter set. We had a few early, pre-production i1 Displays courtesy of GretagMacbeth, and the readings from these units were indeed very different from the rest. For the newer pucks, however, there was no correlation between i1 age and color response. Based on our sample size, you have roughly a 50/50 chance of getting a good i1 Display.

The last set of sensors we looked at were OEM-branded Eye-One Displays tuned to the characteristics of a particular monitor. We had one each for the HP LP2480zx and NEC PA-241W. Both of these i1's turned in excellent performance on their respective monitors. Using a tuned sensor on any other monitor, however, was an invitation to disaster; the results were ugly. Again, with a sample size of one device each, I cannot say whether these were particularly good examples of the i1 Display family or whether the OEM tuning tames the unit-to-unit variability seen with standard Eye-One Displays. Quato sells tuned versions of the DTP-94 with their (seriously) high-end Intelli Proof monitors. Having neither a Quato monitor nor one of their sensors, I cannot comment on their performance.

Update May 23, 2011: New data available for OEM-calibrated Eye-One Display 2 sensors. The i1D2 can be individually calibrated using a reference spectroradiometer. Doing so reduces mean unit-to-unit variability by almost half (7.0 dE-2000 for stock i1D2 vs. 3.7 dE for OEM-calibrated units).
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 29, 2011, 06:50:52 pm
Thank you, Ethan. What is one to expect from the i1 Pro if the black point is set not to the lowest neutral, but to 0.5 cd/m2?
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on April 29, 2011, 07:01:33 pm
> Thank you, Ethan. What is one to expect from the i1 Pro if the black point is set not to the lowest neutral, but to 0.5 cd/m2?

Iliah - excellent point, and one I should have mentioned. We looked into this with a couple of monitors. Setting black to 0.5 - 0.8 cd/m2 decreased the mean measurement error from ~10 dE to ~5. That is a big improvement if you are willing to live with lighter black levels. This is a valid option, particularly for anyone printing on fine art papers or other stock with relatively low print contrast.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on April 29, 2011, 08:35:11 pm
Ethan, thank you very much for this informative analysis and the detailed report backing it up on your website. I have one question: do you think your results would have been influenced one way or another by the software used for the tests - specifically, had you used BasicColor rather than ColorEyes Display, or had you used NEC's Spectraview II or Eizo's own software, the latter two of which only make matrix profiles?
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 29, 2011, 08:44:49 pm
Something I think may be worth mentioning:

> BasicColor

BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on April 29, 2011, 08:49:46 pm
Mark,

The end goal of this project is a long-overdue revision to our monitor calibration tools ratings. It seemed a good idea to first determine what the capability of each sensor was, independent of the software used to drive it. Also, we have a really good spectroradiometer on long-term loan, so this was a reason to make use of it.

The only use of commercial software in our tests was in determining an appropriate minimum black point. All the monitors used were capable of darker black levels, but setting them there made for plugged-up shadows. We started by manually adjusting each monitor's output, but ColorEyes gave essentially the same results in an automated manner. Other calibration software certainly can optimize black levels, but ColorEyes proved closest to laboriously hand-tuned black points. The differences were usually not significant to the end results: moving the black luminance by 0.05 cd/m2 (the spread we saw between ColorEyes and other software's targets) did not make any real difference to the sensor error levels we saw.

The actual measurements were made without any color management being performed and using either OEM drivers and home-brew software or hacks into the Argyll code to control the output.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on April 29, 2011, 08:52:58 pm
> Something I think may be worth mentioning:
>
> > BasicColor
>
> BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.

What is a "somewhat different beast"? Does it bite, smile, growl? :-)
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on April 29, 2011, 08:54:37 pm
> Mark,
>
> The end goal of this project is a long-overdue revision to our monitor calibration tools ratings. It seemed a good idea to first determine what the capability of each sensor was, independent of the software used to drive it. Also, we have a really good spectroradiometer on long-term loan, so this was a reason to make use of it.
>
> The only use of commercial software in our tests was in determining an appropriate minimum black point. All the monitors used were capable of darker black levels, but setting them there made for plugged-up shadows. We started by manually adjusting each monitor's output, but ColorEyes gave essentially the same results in an automated manner. Other calibration software certainly can optimize black levels, but ColorEyes proved closest to laboriously hand-tuned black points. The differences were usually not significant to the end results: moving the black luminance by 0.05 cd/m2 (the spread we saw between ColorEyes and other software's targets) did not make any real difference to the sensor error levels we saw.
>
> The actual measurements were made without any color management being performed and using either OEM drivers and home-brew software or hacks into the Argyll code to control the output.

Understood. Thanks Ethan.
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 29, 2011, 08:58:52 pm
You can find basICColor DISCUS Reference Manual here:
http://www.basiccolor.de/assets/Manuals/Manual-DISCUS.pdf
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on April 29, 2011, 09:38:39 pm
Ethan,

Thanks for the review.  If I am reading your post correctly (I have not had a chance to read the full article), i1 Displays sourced from monitor manufacturers (such as NEC) seem to hold a tighter tolerance when used on that company's monitors. I guess this is because of the specifications that were supplied to X-Rite and the specific application. I know the NEC sensors are designed for the wide-gamut monitors and to be used with Spectraview (which I use). What I don't know is what the variance in tolerance is between various OEM-designed sensors. Does NEC do any kind of testing prior to shipping, or do they rely on X-Rite here? I know you didn't test using Spectraview, but I find this all pretty worrying since these are marketed as "precision" instruments designed to give good color control. I'm unsure whether that is the case.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on April 29, 2011, 10:08:12 pm
> Ethan,
>
> Thanks for the review.  If I am reading your post correctly (I have not had a chance to read the full article), i1 Displays sourced from monitor manufacturers (such as NEC) seem to hold a tighter tolerance when used on that company's monitors. I guess this is because of the specifications that were supplied to X-Rite and the specific application. I know the NEC sensors are designed for the wide-gamut monitors and to be used with Spectraview (which I use). What I don't know is what the variance in tolerance is between various OEM-designed sensors. Does NEC do any kind of testing prior to shipping, or do they rely on X-Rite here? I know you didn't test using Spectraview, but I find this all pretty worrying since these are marketed as "precision" instruments designed to give good color control. I'm unsure whether that is the case.

The answer is I don't know. The two OEM-adjusted sensors we have are an NEC unit sold with the Spectraview package and HP's DreamColor unit. Our assumption is that these two Eye-One flavors are adjusted differently. The NEC sensor performs well on the PA monitor, but is grossly inaccurate on sRGB monitors, on the HP (which uses LED backlighting rather than CCFL), and even on wide-gamut Eizo displays that are likewise CCFL-lit. The HP puck shows similar behavior: excellent performance on the monitor it is sold with, lousy on all others.

If these two OEM flavors of i1 are actually identical, that would imply a truly worrying lack of consistency. We asked both HP and NEC for details; none were forthcoming. Without a reasonable sampling of OEM pucks to characterize, there is no way to be sure. X-Rite, to the best of my knowledge, has not published accuracy and resolution specs for the Eye-One colorimeters as they do for the i1Pro spectros.
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on April 29, 2011, 11:09:17 pm
Fun testing Ethan! I'm a testing geek too. I like these hardware tests relative to a reference device. I think it would be interesting to now compare the calibrated results from one device used with a bunch of different applications (DispCal/Argyll, Spyder3Elite, i1Profiler, ColorEyes DisplayPro, etc). Which software produces the best results with that one device?

After that, it would be interesting to look at what combinations of software and hardware are yielding the best results. What combination of devices and software applications produces the best results across a wide variety of displays?

And then start comparing the calibration times (like 1+ hour for DispCal/Argyll high quality vs. under 2 minutes for i1Profiler) and see which combination provides the best calibration for the time spent. Is it worth spending an hour to calibrate a display? Or will a 10-minute calibration provide 99% of the benefit?

As I see it, the question is: when displays are calibrated with a variety of devices and applications which *combination* consistently yields the best results within a reasonable timeframe? This is the question that I've been trying to answer. And I think I've got it.

Obviously the number of possible combinations (calibration devices, calibration apps, displays) is huge. There are some trends that can help us narrow things down. The EyeOneDisplay, for example, sucks - but we've known that for a long time, right? The DTP94 has been a long-time top performer but should have limitations with wide-gamut monitors. The Spyder3 is perhaps the most modern, versatile and affordable colorimeter, with broad support. EyeOnePros are good, commonly available, broadly supported, affordable spectros, but we've seen issues in the past with the shadows. There are other options that are a bit more expensive but don't offer compelling reasons to consider them. IPS displays can more or less be lumped into three categories: low-cost ~sRGB CCFL, higher-cost ~aRGB CCFL, and various LED displays that tend to have larger gamuts. So it's not like we really need to test 100 displays.  Nonetheless, people like myself who travel around the country calibrating displays and other devices will eventually get hands-on with hundreds of displays with whatever calibration system they go with, and look for trends along the way.

We can use QA tools to statistically measure the accuracy of a calibrated monitor. In addition we can visually look for smoothness in gradations, especially in the shadows. Beyond that, I particularly like to visually analyze how well a display profile performs with soft proofing! This last step is often the toughest challenge for a display profile, and some clearly perform better than others.

I've been focusing my testing on the Spyder3Elite and EyeOnePro devices using ColorEyesDisplayPro, DispCal/Argyll, Spyder3Elite, and i1Profiler. From the testing I've done thus far, I have to say I'm pretty blown away and completely surprised by what I'm seeing from i1Profiler with an EyeOnePro. Not only does the process take less than 2 minutes (!!) but the shadows and smoothness are incredible, rivaling even DispCal's hour-long calibration process. i1Profiler's QA analysis reports lower average Delta E variations with i1Profiler's calibration than with the others. I've been frustrated with shadow detail from the i1 Pro in the past, but i1Profiler is doing a better job than I've seen with other solutions.

Ethan you wrote "No software wizardry will improve a spectrophotometer's accuracy at low light levels to be competitive with a colorimeter." But that's exactly what I'm seeing with i1Profiler (to my surprise as much as yours I'm sure!). Suddenly the shadows are all there statistically and visually and the performance with soft proofing is incredible.

So this is fun stuff. I'd love to hear your results should you attack this question above, regardless of methodology. I've got lots of tests to continue expanding upon and perhaps I will come to different conclusions later as my methodology continues to evolve. I hate to say it, but there's more stuff on the horizon that keeps all of this a moving target. But for now, I think there's lots of "under the hood" algorithmic mojo happening in i1Profiler that's worth getting excited about. Let me (us) know what you think!

Cheers and thanks again Ethan!
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on April 30, 2011, 02:03:19 am
Scott,

You understand our motivation perfectly. We are out to update our reviews of monitor profiling systems. Looking at hardware-agnostic packages (ColorEyes, BasICColor, CalPC, etc.) that support a huge range of sensors is daunting. Yes, it is pretty cool to be able to hook a Minolta CS-2000 up to BasICColor or our PR-730 to CalPC, but this doesn't tell you much about real-world performance. (If any of you photographers can financially justify hanging $20K+ of sensor off your monitor for calibration purposes, I'll sell my stake in Dry Creek and work as your gofer in a heartbeat).

What we wanted to establish was which sensors we should be looking at to be able to review systems in a reasonable length of time. Our methodology is straightforward. We use the unbiased eye of the spectroradiometer to check white and black point accuracy, grayscale neutrality and adherence to the desired gamma curve, and accuracy of the RGB primaries. The next judgement is purely subjective. Several of us flip through a series of synthetic and actual images to judge how well a profile performs. Lather, rinse, repeat on the next monitor.

I have been impressed with i1Profiler as well. X-Rite indeed managed to coax commendable performance out of a short measurement cycle. I assume they are using a method better than the simple averaging in Argyll to tease a signal from the measurement noise. (Serious late-night geek digression: The standard error of a measurement sample goes down with the square root of the sample size. In English, this means that averaging 4 measurements cuts your random variability in half vs. 1 sample. To reduce it by half again, you need 16 measurements, then 64 for the next halving, then 256 - every halving of the noise costs four times the measurement time. You can see why DispCal's high quality mode takes an hour to run).
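
To make the digression concrete, here is a quick simulation sketch (plain numpy; the luminance and noise numbers are made up for illustration):

    # Why averaging has diminishing returns: the standard error of a mean
    # of n readings falls as 1/sqrt(n), so each halving of the noise
    # costs 4x as many measurements.
    import numpy as np

    rng = np.random.default_rng(0)
    true_Y = 0.15      # hypothetical "true" black luminance, cd/m2
    noise_sd = 0.05    # hypothetical per-reading instrument noise, cd/m2

    for n in (1, 4, 16, 64, 256):
        # Spread of the n-sample average, estimated over 10,000 trials.
        means = rng.normal(true_Y, noise_sd, size=(10_000, n)).mean(axis=1)
        print(f"n={n:4d}  std error of mean = {means.std():.4f}  "
              f"(theory: {noise_sd / np.sqrt(n):.4f})")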

Back to reality... Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however. We still visually see color crossovers where none exist with a profile made with, for example, ColorEyes and a DTP-94 (highlights and saturated colors are an entirely different matter).

Statistical measurements of a profile's performance with the same sensor and software used to create the profile are problematic. Others have argued that this is an exercise in an activity that probably should not be, shall we say, brought up in a PG-rated forum. My take is that there is useful information to be gleaned from these profile validations, but the results need to be viewed with caution. Yes, comparing dE values can aid in convincing people who insist on calibrating to 5000K that 6500K is more compatible with how modern monitors actually work. The same goes with setting a white luminance too low on any LCD other than an Eizo. Likewise, seeing the blue primary dE increasing steadily is a warning sign that the backlight is failing. Looking at the reported dE values for a grayscale can be misleading. We measured a black point dE-2000 of over 8 when the profiling software reported all values as being less than 1 for i1Pro-generated profiles.

We'll see where this all leads. On one hand, there is the hybrid approach used by Argyll and CalPC of creating a correction matrix with a spectrophotometer for the primaries and white point, which is then applied to the profile built from colorimeter measurements. On the other hand, measurement technology is improving: OEM-tuned sensors are available for specific monitor technologies, while improved sensors such as the Discus are coming out at (semi-)affordable price points.

One last comment. The high quality mode in Argyll/DispCal has a potential gotcha: temperature drift. The i1Pro is fairly sensitive to temperature. Even under controlled ambient temperature there can be shifts from the i1 heating up as it rests on the screen. We saw this firsthand when trying ridiculously long integration times to see what the absolute noise floor was for an i1Pro. We started with a warmed-up monitor, and saw the i1 readings progressively change over the course of 20 minutes. Our reference PR-730 is both temperature-compensated and was being used in non-contact mode. After the first 20 minutes, the i1 readings did not shift for the 24-hour duration of our test. Moral of the story: preheat your i1Pro by hanging it on your screen while you surf the web for 20 minutes or so before you calibrate.
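
If you want to check the warm-up time of your own instrument, here is a sketch of the sort of drift test we ran. The read_white_XYZ() call is a hypothetical stand-in for whatever measurement API your instrument provides, not a real library function:

    # Detect instrument warm-up drift: repeatedly measure a stable white
    # patch and report each reading's dE-2000 against the first one.
    # Requires: pip install colour-science numpy
    import time
    import numpy as np
    import colour

    def read_white_XYZ():
        # Hypothetical: wrap your instrument SDK's emissive measurement here.
        raise NotImplementedError

    Y0 = first_lab = None
    for minute in range(30):
        xyz = np.asarray(read_white_XYZ(), dtype=float)
        if Y0 is None:
            Y0 = xyz[1]                          # anchor on the first reading
            first_lab = colour.XYZ_to_Lab(xyz / Y0)
        lab = colour.XYZ_to_Lab(xyz / Y0)
        drift = colour.delta_E(first_lab, lab, method='CIE 2000')
        print(f"t={minute:2d} min  drift vs first reading: {drift:.2f} dE-2000")
        time.sleep(60)                           # one reading per minute

Once consecutive readings stop moving, the instrument is warmed up.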
Title: Re: Monitor calibration sensor evaluations
Post by: jaapb on April 30, 2011, 02:47:37 am
Ethan, Scott,

Very interesting stuff guys.
I thought I knew it all about profiling monitors and papers, but delving into ArgyllCMS, geeky as it is, raised my level of understanding considerably. Reading threads like this one, I now see there is a whole new dimension to discover.
Thank you guys.

Jaap
Title: Re: Monitor calibration sensor evaluations
Post by: Lilien on April 30, 2011, 06:04:25 am
Ethan,

Thanks for the informative review, although as an owner of ColorMunki (with Argyll) I'm, needless to say, not very happy with your findings.

Have you taken into consideration that the ColorMunki sensor is, according to X-Rite, not calibrated to the same (former GretagMacbeth) standard as e.g. the Eye-One Pro?
The ColorMunki was their first instrument calibrated to the new XRGA standard, so similar readings are not to be expected.

Have you tested whether the high-resolution (3.33 nm) and adaptive (integration time) modes within Argyll give any benefit with the ColorMunki?

I also found that my ColorMunki seems to be sensitive to temperature changes. Even a self-calibration does not bring it back to the same reading after a substantial temperature change (several minutes on the screen).

Thanks again and regards,
 Juergen
Title: Re: Monitor calibration sensor evaluations
Post by: Ernst Dinkla on April 30, 2011, 06:54:06 am
Ethan,

Very nice testing. Gives me a good guide to improve my monitor calibrations/profiling with a mix of displays here.

The Spyder 3 Elite general quality on both monitor gamuts surprised me.

Another thing I have been curious about for some time is whether the colorimeters' ability to measure low luminance values better than spectrometers also shows up in reflective measurements. I gather that the illumination in spectrometers is made bright enough that the reflected light stays out of the danger zone. I recall that the HP Z-model printers' integrated spectrometers measure the dark patches longer to get better readings, with less noise and/or hysteresis affecting them. The distance between the media and the lamp/sensor in that printer-integrated spectrometer has to be considered as well; more light will be used, but maybe that extra time for dark patches was needed too.

met vriendelijke groeten, Ernst

New: Spectral plots of +250 inkjet papers:

http://www.pigment-print.com/spectralplots/spectrumviz_1.htm
Title: Re: Monitor calibration sensor evaluations
Post by: Mark Paulson on April 30, 2011, 09:09:43 am
> Something I think may be worth mentioning:
>
> > BasicColor
>
> BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.

I would be happy to send my DISCUS to Ethan for testing.
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on April 30, 2011, 09:29:03 am
> Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however. We still visually see color crossovers where none exist with a profile made with, for example, ColorEyes and a DTP-94 (highlights and saturated colors are an entirely different matter).

I'll suggest that we both continue testing this very comparison. CEDP with the DTP94 has been a rock-solid component of my business for years, so comparing just that to i1P/i1Pro is a particularly important comparison for me. Unlike you, I'm seeing better results with i1Profiler, so I think you and I should both look further into this.

I agree that statistical measurements should ideally not be done within the same application that performed the calibration. While I'm impressed with i1P's QA with the colorchecker approach I'm open to other suggestions.
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 30, 2011, 10:50:18 am
Dear Ethan,

> it is pretty cool to be able to hook a Minolta CS-2000 up to BasICColor

Now it is the Konica Minolta Display Color Analyzer CA-210. I'm getting reliable measurements with it down to 0.2 cd/m2 (they state 0.1 cd/m2, but I do not see it that way), which raises the same question of black point. Generally I stay at 0.5 cd/m2 (my targets are proofs and press).
Title: Re: Monitor calibration sensor evaluations
Post by: keith_cooper on April 30, 2011, 04:58:05 pm
Interesting to see results similar to the more qualitative ones I got when looking at the Discus recently, where a Spyder 3 topped an i1 Display 2 and an i1 Pro, although all fell behind the Discus.

BTW, I have to take my hat off to those with the wherewithal to test all these multiple devices - my patience runs out far too quickly ;-)
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 30, 2011, 09:33:48 pm
Well, DISCUS is an easy choice :)
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on April 30, 2011, 09:39:14 pm
Is it really worth the price in terms of visible results versus several of the less expensive alternatives?
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 30, 2011, 09:54:20 pm
> Is it really worth the price in terms of visible results versus several of the less expensive alternatives?

Well, it depends what one is after. 85% of what it does I can do with a combination of a DTP-94 or i1 Display 2, an i1Pro, and Argyll. But it takes about an hour to create a ccmx correction matrix for each new display type, and then about 15 minutes each time for calibration and profiling. With the DISCUS I need only one device with me, it takes 15 minutes, and the results are better. The interesting part is that if the temperature in the room changes by 5F, or the monitor is not extra stable, or the ambient light is changing, or one is not wearing a black shirt, all the effort and money spent on extra precision more or less go into the bin.
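
For the curious: a ccmx is conceptually just a 3x3 matrix, fitted so that the colorimeter's XYZ readings land on the spectro's readings for the same display. A rough sketch of the idea (the readings below are made up; Argyll's ccxxmake does the real fitting):

    # Fit a 3x3 ccmx-style correction: M @ colorimeter_XYZ ~= spectro_XYZ,
    # fitted over R, G, B, and white patches measured by both instruments.
    # The readings below are hypothetical placeholders, not real data.
    import numpy as np

    spectro = np.array([           # XYZ per patch (R, G, B, W), reference
        [41.2, 21.3, 1.9],
        [35.8, 71.5, 11.2],
        [18.1, 7.2, 95.4],
        [95.0, 100.0, 108.9],
    ])
    colorimeter = np.array([       # same patches, uncorrected colorimeter
        [43.0, 22.1, 2.4],
        [33.9, 69.8, 12.0],
        [19.5, 7.9, 91.7],
        [96.3, 99.8, 106.0],
    ])

    # Least squares: find X with colorimeter @ X ~= spectro, then M = X.T
    X, *_ = np.linalg.lstsq(colorimeter, spectro, rcond=None)
    M = X.T

    def correct(xyz):
        # Apply the fitted correction to a raw colorimeter reading.
        return M @ np.asarray(xyz)

    print(np.round(correct(colorimeter[3]), 1))  # ~ spectro white reading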
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on April 30, 2011, 10:14:48 pm
Anyone spending the kind of money a DISCUS costs can just as well buy a few black shirts to accompany it, no? :-)
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on April 30, 2011, 10:22:18 pm
> Anyone spending the kind of money a DISCUS costs can just as well buy a few black shirts to accompany it, no? :-)

Yes, if they know to. But more often than not they do not, and the manuals gloss over those things, too.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 01, 2011, 04:30:52 am
With the DISCUS you can also calibrate the display from a distance, so that takes care of the non-black-shirt flare ;)
Title: Re: Monitor calibration sensor evaluations
Post by: Iliah on May 01, 2011, 09:49:12 am
> With the DISCUS you can also calibrate the display from a distance, so that takes care of the non-black-shirt flare

If the operator is not moving at all and the shirt is of uniform colour, yes it does.
Title: Re: Monitor calibration sensor evaluations
Post by: terrywyse on May 01, 2011, 12:53:25 pm
> As I see it, the question is: when displays are calibrated with a variety of devices and applications which *combination* consistently yields the best results within a reasonable timeframe? This is the question that I've been trying to answer. And I think I've got it.

More to the point of what Scott mentioned......is it possible that different applications handle the devices differently...or compensate for some of their inherent disadvantages with regards to integration times?

Reason I ask....I've been using the ColorMunki lately with basICColor Display 4 and like it....but I noticed that the ColorMunki+basICColor takes on the order of about 10 min. to complete a profile relative to the 6 min. it takes with either my RevD i1Pro or my RevA i1Monitor spectro...so is basICColor Display using longer integration times with the ColorMunki and, as a result, achieving similar measurement quality to the i1Pro?

One major advantage I see to the ColorMunki is that it doesn't appear to suffer from the temperature swings that the i1Pro suffers when contacting the surface of the display.

Terry
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on May 01, 2011, 01:36:53 pm
> ...is it possible that different applications handle the devices differently...or compensate for some of their inherent disadvantages with regards to integration times?

My experience is absolutely! Been that way for years too.

The other question is: if the goal is a print-to-display match, is this an issue when instruments or software products are intermixed, as long as one alters the calibration target appropriately? Frankly, I couldn't care less whether an instrument I ask for D65 theoretically gets closer to or farther away from that aim point (which I doubt is fully possible anyway); what matters is that whatever values I ask for produce a visual match.

Ask for D65 or any CCT K value in two different software products. What's the likelihood they will be the same? Does it really matter anyway?

Now instrument variations per model is a big deal! So I’m not letting manufacturers off the hook for vastly different results from the same target requests in the same software. That’s not acceptable for those working in collaboration.
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on May 03, 2011, 09:46:57 am
> Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however. We still visually see color crossovers where none exist with a profile made with, for example, ColorEyes and a DTP-94 (highlights and saturated colors are an entirely different matter).

Ethan, just to follow up. I'm continuing this testing and am finding that on some displays (like a Samsung 245T and Dell 24") I'm seeing dramatically improved results (visually and statistically) using an EyeOnePro device instead of a DTP94, even when the same software is used.

One of the interesting things that I see, but suspect most people don't, is how different two different types of monitors can look when calibrated the same way. Put a Cinema Display and a Samsung monitor on the same Mac, calibrate them both using the exact same settings, and marvel at the disappointing differences you'll see. Better yet, take a client like Whole Foods World HQ, where they've got 50+ designers and video professionals all in one area using a hodgepodge of different brands of displays. I'm finding that if they are all calibrated using a colorimeter (Spyder3 or DTP94), when you stand back and look at all of them in one room it's kind of surprising how much inconsistency there is between them. Calibrate all of them with a spectro (using CEDP or DispCal) and they are visually perfectly consistent! Combine that with i1Profiler, which does a better job of handling the shadows with a spectro than anything else, and you've got a truly superior solution. Problem solved.

While lots of users may only have one or two displays, these are the real-world challenges (10+ different types of displays all side-by-side in one room) that my business faces every day. My hands-on testing is showing that spectros have advantages over even the best colorimeters in some situations, and with i1Profiler I don't see any problems with the shadows like we've seen elsewhere. I'm not seeing any color crossovers or drawbacks with my particular pair of EyeOnes.

I'm going to stick to my guns here and suggest that, for now, i1Profiler with an i1Pro seems to be the answer to the question "when displays are calibrated with a variety of devices and applications which *combination* consistently yields the best results within a reasonable timeframe?"

Title: Re: Monitor calibration sensor evaluations
Post by: Pictus on May 03, 2011, 10:29:18 am
> Ethan, just to follow up. I'm continuing this testing and am finding that on some displays (like a Samsung 245T and Dell 24") I'm seeing dramatically improved results (visually and statistically) using an EyeOnePro device instead of a DTP94, even when the same software is used.

With dispcalGUI + i1Pro create a correction matrix, then calibrate with DTP-94?  :)
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on May 03, 2011, 10:40:46 am
> With dispcalGUI + i1Pro create a correction matrix, then calibrate with DTP-94?  :)
Yes, really good stuff there! It does add a layer of complexity, though. If I need to go around and calibrate 50 different displays, would I do it with DispCal's ~45-minute process using a correction matrix, or do I use i1P's less-than-two-minute process where no correction file is needed? And how much is the display itself likely to change during that ~45-minute process?

Someone who has a colorimeter but no spectro might want to borrow one to make such a correction. But if one owns an i1Pro and i1Profiler, there's not a whole lot of point going back to a colorimeter.

Lots of angles to looking at this! I like to think of myself as an end user's advocate. What's the best solution for different types of end users? There are usually different solutions for different types of users.
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on May 04, 2011, 11:12:54 am
In the FWIW department, I just went through the ArgyllCMS monitor calibration process using a ColorMunki (I must have one of the good batch), creating a profile from 500 color patches.  I used the same standard settings that I use with SpectraView II and the X-Rite wide-gamut puck that NEC distributes.  Visually there was no perceptual difference between the profiles, but since I don't have any software to evaluate them, I cannot provide a definitive answer about how similar they are.  It's clear that ArgyllCMS takes much longer than running SpectraView, and it may give better results since it can be configured in a number of ways SpectraView cannot.  I have an i1 on order (backordered at X-Rite) and will repeat the process when it arrives, since it should provide better readings than the ColorMunki.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on May 04, 2011, 11:24:22 am
> In the FWIW department, I just went through the ArgyllCMS monitor calibration process using a ColorMunki (I must have one of the good batch), creating a profile from 500 color patches.  I used the same standard settings that I use with SpectraView II and the X-Rite wide-gamut puck that NEC distributes.  Visually there was no perceptual difference between the profiles, but since I don't have any software to evaluate them, I cannot provide a definitive answer about how similar they are.  It's clear that ArgyllCMS takes much longer than running SpectraView, and it may give better results since it can be configured in a number of ways SpectraView cannot.  I have an i1 on order (backordered at X-Rite) and will repeat the process when it arrives, since it should provide better readings than the ColorMunki.

Spectraview creates a matrix profile (9 data points) while the process you are using most likely creates a LUT profile with several hundred data points, so perhaps that partly explains the difference in processing time. If you want independent verification of profile quality, you can use PatchTool from Babelcolor (www.babelcolor.com). I used it to evaluate profiles for my NEC 271, and it revealed lower (i.e. better) dE readings for my BasicColor 4.22 LUT profile versus my Spectraview II matrix profile, both generated using the same colorimeter (the NEC-customized i1 Display 2) and the same basic parameters for luminance, gamma and white point.
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on May 04, 2011, 11:38:56 am
> Spectraview creates a matrix profile (9 data points) while the process you are using most likely creates a LUT profile with several hundred data points, so perhaps that partly explains the difference in processing time. If you want independent verification of profile quality, you can use PatchTool from Babelcolor (www.babelcolor.com). I used it to evaluate profiles for my NEC 271, and it revealed lower (i.e. better) dE readings for my BasicColor 4.22 LUT profile versus my Spectraview II matrix profile, both generated using the same colorimeter (the NEC-customized i1 Display 2) and the same basic parameters for luminance, gamma and white point.
I didn't measure the sensor dwell time while it was reading the patches, but it was certainly one second at a minimum and likely a little more, so it takes about 10 minutes to measure those.  I'll look at PatchTool, since I will need something to validate what is going on as I move down the Argyll path.  I've also created a paper profile for Ilford Gold Fiber Silk with Argyll, using the ColorMunki to do the readings.  It's a much better profile than the one done with the ColorMunki software, and I finally got a nice-looking sky in the Atkinson arch shot.  It was pretty much right on, and I could probably tweak it a little bit more.

I presume from what you state that you are using BasicColor to calibrate your monitor, correct?
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on May 04, 2011, 12:39:21 pm
Yes, correct.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 04, 2011, 12:41:13 pm
> Spectraview creates a matrix profile (9 data points) while the process you are using most likely creates a LUT profile with several hundred data points, so perhaps that partly explains the difference in processing time. If you want independent verification of profile quality, you can use PatchTool from Babelcolor (www.babelcolor.com). I used it to evaluate profiles for my NEC 271, and it revealed lower (i.e. better) dE readings for my BasicColor 4.22 LUT profile versus my Spectraview II matrix profile, both generated using the same colorimeter (the NEC-customized i1 Display 2) and the same basic parameters for luminance, gamma and white point.

There's no correction matrix for the i1D2 WG in basICColor 4.2.4, so it basically takes different readings than Spectraview II does (see Ethan's sensor evaluation results). basICColor 5 should support the i1D2 WG sensor (this colorimeter is now part of basICColor's offering) - I bet it will also give different results in PatchTool profile validation.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on May 05, 2011, 02:52:08 am
Sorry for dropping off the face of the Earth for a few days - needed to do work that I was actually being paid for. There have been a number of questions about our results as well as many useful and thoughtful discussions in this thread. I'll do my best to address them. The questions are in three general categories: (1) Testing methodology, (2) Sensors (particularly the ColorMunki), and (3) everything else.

1: Testing Methodology (aka "do you guys know what the hell you are doing?!?")

Q: What software is used to drive the instruments and how does that factor into the results?
A: When possible, we used ArgyllCMS routines. These have the advantage of being completely customizable with the appropriate source code changes. Our intent was to determine the best possible results from each instrument. For spectrophotometers, this meant using very long integration times and heavy averaging when measuring darker values. The results we report, therefore, should be viewed as best-case for each sensor.

Q: How confident are you of the accuracy values reported in shadow levels?
A: That's an easy one. The spectroradiometer we used, a PR-730, has a luminance sensitivity of 0.0003 cd/m2. The minimum luminance we measured on a monitor was ~0.1 cd/m2, or over 300x the resolution of the instrument. The PR-730 is accurate to +/-2% in luminance even at 0.009 cd/m2; to put it in other terms, at our minimum level we might be seeing 0.098 or 0.102 cd/m2 rather than 0.1. Color accuracy is a similarly ridiculous +/-0.0015 in CIE x,y at a luminance 10x lower than any monitor can reach. Our PR-730 was factory calibrated to NIST-traceable standards within a few weeks of our evaluations.

2: Sensor questions

Q: What version of the ColorMunki did you test?
A: The Photo/Design version -- the spectrophotometer capable of emissive measurements for monitors and reflective measurements for prints. The ColorMunki Create is a repackaged Eye-One Display.

Q: What about XRGA? Are some sensors (e.g. ColorMunki) using this and would it make a difference?
A: Not to the best of our knowledge when controlled with Argyll's code. Based on X-Rite's XRGA whitepaper, measured color differences will be minimal for the Munki with or without XRGA.

Q: Are all ColorMunkis inaccurate or is it just the one you guys measured?
A: Only having characterized a single unit, we simply don't know. Until we can measure more samples, I will neither condemn nor exonerate the ColorMunki. Our results were disturbing, showing gross errors in black point measurements, but we might have tested a lemon unit. With the help of a third-party software supplier, we hope to get several more Munkis to evaluate. After verifying our first results showing high error levels in our ColorMunki sample, we emailed X-Rite to ask if they could send a few demo units our way. No response.

Q: For the Eye-One Pro, does using the raw 3nm measurement data help vs. using the default 10nm intervals reported by X-Rite's software?
[Explanation: The i1 Pro samples light at 3nm intervals. The data are noisy, and the default values reported are pre-smoothed to 10nm intervals. The output spectrum of either a CCFL or LED backlight is spiky, with significant spikes being narrower than 10nm. The question comes down to whether the default smoothing obliterates useful data.]
A: Again, Argyll comes to the rescue. It supports measuring at full resolution. The noise levels are indeed high, and feeding the raw values onward can create some pretty strange results. I geeked away on my latest plane trip, running some i1 readings through the FFT filtering routines we use in our profiling code. After suitable tweaks, I could get spectral curves closely approximating the 1nm sampling curves from our PR-730. I do not know if i1Profiler uses a similar technique, but I would not be surprised if it does. The dE measurements we reported used the default 10nm, smoothed data. Given that both the absolute magnitude of error on white with an i1Pro and the unit-to-unit variations were low, the 3nm sampling strategy is a refinement on an already good product. The problem area is in the shadows, where the measurements are noise-limited rather than influenced by spectral spikes.
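
As an illustration of the kind of filtering involved, here is a toy sketch. The spectrum is synthetic, not i1Pro data, and the cutoff is an assumption you would tune:

    # FFT low-pass filtering of a noisy, finely sampled spectrum: keep the
    # low-frequency Fourier coefficients, zero the rest. The spectrum is
    # synthetic: a broad emission hump plus a narrow CCFL-like spike.
    import numpy as np

    wl = np.arange(380, 731, 3)                          # 3 nm sampling
    clean = (np.exp(-((wl - 560) / 80.0) ** 2)           # broad hump
             + 0.8 * np.exp(-((wl - 546) / 4.0) ** 2))   # narrow spike
    rng = np.random.default_rng(1)
    noisy = clean + rng.normal(0, 0.05, wl.size)

    coeffs = np.fft.rfft(noisy)
    cutoff = 25               # assumption: low enough to kill noise,
    coeffs[cutoff:] = 0       # high enough to keep real spectral features
    smoothed = np.fft.irfft(coeffs, n=wl.size)

    print(f"Recovered peak near {wl[np.argmax(smoothed)]} nm")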

Q: Any thoughts on the BasICColor DISCUS (http://www2.chromix.com/colorgear/shop/productdetail.cxsa?toolid=50140&pid=11713)?
A: Aside from the snazzy specs? With thanks to the good folks at CHROMiX, we hope to have one in-house for testing within a couple of weeks. Mark Paulson was kind enough to volunteer his DISCUS as well. Mark: If the offer still stands, I may take you up on it after we get a chance to run the first sample through its paces.

3: Everything else

Q: Which is the more important metric, the absolute sensor accuracy or how much sensor-to-sensor variation is seen? From Andrew: "Now instrument variations per model is a big deal! So I’m not letting manufacturers off the hook for vastly different results from the same target requests in the same software. That’s not acceptable for those working in collaboration."
A: Of the two, I would focus on the inter-unit variability. Different manufacturers calibrate their sensors to slightly different references, so a few dE difference in average readings between sensor types can be attributed to calibration settings. The large unit-to-unit differences we saw in, for example, the Eye-One Display point to a sensor that cannot be relied on for accurate readings. The largest deviation we saw on the i1D2 was 14 dE-2000. To visualize this in Photoshop, fill a square with a middle grey [160, 160, 160] (sRGB or Adobe RGB - doesn't matter). Fill an adjoining square with [213, 213, 213]. That is 14 Delta E-2000, and that is not subtle. The graphic below illustrates this.
(http://www.drycreekphoto.com/images/calibration/14_dE-2000.png)
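
You can double-check that figure yourself; a quick sketch (again assuming the third-party colour-science Python package):

    # Verify that neutral sRGB 160 vs. 213 differ by roughly 14 dE-2000.
    # Requires: pip install colour-science numpy
    import numpy as np
    import colour

    lab_a = colour.XYZ_to_Lab(colour.sRGB_to_XYZ(np.full(3, 160 / 255)))
    lab_b = colour.XYZ_to_Lab(colour.sRGB_to_XYZ(np.full(3, 213 / 255)))
    print(f"{colour.delta_E(lab_a, lab_b, method='CIE 2000'):.1f} dE-2000")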

Scott and Terry pose the question of which combination of software and hardware gives the best results. Determining this is the end goal of our exercise. As a first pass, we aimed to determine the best-case capability of each instrument. We are now cherry-picking the best sensors to use in the software comparisons, trying to eliminate as many variables as possible.

> Ethan, just to follow up. I'm continuing this testing and am finding that on some displays (like a Samsung 245T and Dell 24") I'm seeing dramatically improved results (visually and statistically) using an EyeOnePro device instead of a DTP94, even when the same software is used.
The Samsung 245T is a strange beast. It is a PVA panel with a moderately wide gamut. The expanded gamut is a function of the backlight on the 245T, so I am not surprised that the i1Pro spectrophotometer gives better results than the DTP-94. This correlates with our measurements. The DTP-94 contains a filter set that is farther from the standard observer than the one found in the Spyder 3. Hence, uncorrected readings get progressively less accurate as the difference between a particular panel's backlight spectrum and a standard CCFL's increases. In our tests, the DTP-94 consistently turned in the least accurate white level readings on all wide-gamut displays.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 15, 2011, 04:05:28 am
> I'm going to stick to my guns here and suggest that, for now, i1Profiler with an i1Pro seems to be the answer to the question "when displays are calibrated with a variety of devices and applications which *combination* consistently yields the best results within a reasonable timeframe?"

Damn you Scott. Now I'm going to have to go back and re-evaluate i1Profiler for monitor profiling. :-) During the testing I'd decided that CEDP was still king.

FWIW, I've had a preference for my Eye One Pro for calibrating and profiling my screens. I also have a Spyder 3, DTP 94, and an i1D2, and the profiles produced by the Eye One Pro seem to be far more neutral than those from my other pucks. Using Spectraview in conjunction with the Eye One Pro on the NEC 2690, I am able to discern differences between 000 and 111, so what more can you ask for (bah... I actually have an answer for that, but I'm not permitting myself to say it because it will result in me pulling out the credit card)? In addition, the profile is definitely more neutral than the profiles created with the NEC-tweaked i1D2 that came with the Spectraview package. I did up the black level a smidge as I noticed some green casts in the shadows, but bumping it up to around 0.3 or 0.35 cd/m^2 (can't remember specifically) seemed to take care of the problems I had (or at least... there's nothing I can perceive that bothers me at the moment).

Ethan - THANK YOU for doing all of this work. I'm kind of jealous as I love this type of white box (and um... occasionally black box) testing and would have loved to have had a chance to play with some of those toys.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on May 15, 2011, 01:20:16 pm
> Damn you Scott. Now I'm going to have to go back and re-evaluate i1Profiler for monitor profiling. :-) During the testing I'd decided that CEDP was still king.

I love the iterative nature of CEDP's process. CEDP and i1P with an EyeOnePro is a combination I'm studying on a variety of displays right now. I'm finding that i1P has an edge on some displays while, surprisingly, CEDP can have an edge on others. So there's a need to analyze the broader landscape and make educated, forward-thinking recommendations to clients.
Title: Re: Monitor calibration sensor evaluations
Post by: Pictus on May 15, 2011, 11:50:11 pm
> I love the iterative nature of CEDP's process. CEDP and i1P with an EyeOnePro is a combination I'm studying on a variety of displays right now. I'm finding that i1P has an edge on some displays while, surprisingly, CEDP can have an edge on others. So there's a need to analyze the broader landscape and make educated, forward-thinking recommendations to clients.

Probably with Eizo models, as CEDP can access the monitor's internal LUT.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 12:54:01 am
> I love the iterative nature of CEDP's process. CEDP and i1P with an EyeOnePro is a combination I'm studying on a variety of displays right now. I'm finding that i1P has an edge on some displays while, surprisingly, CEDP can have an edge on others. So there's a need to analyze the broader landscape and make educated, forward-thinking recommendations to clients.

I've been revisiting things a bit. Today I've been playing with BasICColor and CEDP using the i1Pro. A few observations when paired with the i1P... BasICColor's calibrations run much faster, whereas CEDP tends to linger on each patch a little longer. I'm not sure if that's a good thing or not, especially where shadow detail is concerned. Until tonight I had not noticed this: while it's well known that the i1P is noisier when measuring the shadows, I've never observed this to be a problem using Spectraview on my 2690 or CEDP on my other screens, but when doing validations with BasICColor I've noticed DRAMATICALLY different results for the black point and, as such, wildly varying evaluations for contrast ratio. On one pass I'll get 400:1 and on the next... 750:1.

Perhaps this behavior is also present in CEDP, but I'm usually pretty observant and I think I'd catch variations from 0.18 cd/m^2 on one pass to 0.35 cd/m^2 on the next (well... at the very least I'd like to think that I wouldn't miss something like that :) ). That's just not something I've ever noticed with either the US version of Spectraview or CEDP (when using the i1P). I can only theorize as to why this might be, but it would seem that BasICColor might not be taking as many measurements per patch as Spectraview (US version) and CEDP are, thus resulting in erratic black point measurements from validation to validation.

I'll have to pay attention to i1Profiler to see what it does as it obviously whips through the patches quite quickly.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 16, 2011, 02:55:31 am
> While it's well known that the i1P is noisier when measuring the shadows, I've never observed this to be a problem using Spectraview on my 2690 or CEDP on my other screens, but when doing validations with BasICColor I've noticed DRAMATICALLY different results for the black point and, as such, wildly varying evaluations for contrast ratio. On one pass I'll get 400:1 and on the next... 750:1.

The i1pro gives inconsistent results when measuring the shadows, so changing the profiler may give better or worse results, but either way it's still a lottery. Just take a look at this video - I measured my 3090WQXi black point with an i1pro and a basICColor DISCUS simultaneously:
http://dl.dropbox.com/u/19059944/i1pro-vs-dscs.MOV

(The DISCUS measurement is the Y coordinate on the left - it reports 0.275 cd/m^2. The i1pro measurement is the Y coordinate on the right - it changes its mind each time an averaged measurement completes, from 0.19 to 0.27 cd/m^2.)
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 04:06:39 am
> The i1pro gives inconsistent results when measuring the shadows, so changing the profiler may give better or worse results, but either way it's still a lottery. Just take a look at this video - I measured my 3090WQXi black point with an i1pro and a basICColor DISCUS simultaneously:
> http://dl.dropbox.com/u/19059944/i1pro-vs-dscs.MOV
>
> (The DISCUS measurement is the Y coordinate on the left - it reports 0.275 cd/m^2. The i1pro measurement is the Y coordinate on the right - it changes its mind each time an averaged measurement completes, from 0.19 to 0.27 cd/m^2.)

I definitely don't dispute that it's noisy (and I'm seeing the exact same behavior, with luminance jumping all over the place with the i1P when measuring the blackest blacks), but in practical real-world use I've found (and I believe Scott may be of the same opinion) that in many cases the benefits of using the Eye One Pro outweigh its noisy performance on dark patches. I have to double-check, but I believe I also set the black point target on my NEC to 0.3 or 0.35 cd/m^2, so it's possible that might be high enough that I'm not seeing as much flakiness resulting from the i1P's noise floor. With the testing I'm doing right now I have everything set to minimum black levels.

I'd love to have that Discus you have there... if you'd like to send it to me  ;D

Just a side note... while blacker blacks (I feel a Spinal Tap quote coming) are always desirable, when it comes to profiling my monitors I place more importance on being able to see each patch discretely. Of course, increasing your black level above the absolute minimum your monitor can achieve can definitely lop off some of that resolution. Still, my personal experience has been that regardless of whether I'm using a spectro or a colorimeter, there always seems to be a bit of funk in the darkest shadows (funk in this case being a color cast) on monitors with really low black levels, and it's often mitigated by bumping up the black level a smidge. So in real-world use the noise present in spectrophotometers hasn't been a show stopper. As mentioned before, I actually prefer my Eye One Pro over my other pucks (most of the time, but not all of the time) as on the whole it definitely renders profiles that are more neutral. I reserve the right to be fickle and change my mind of course.  ;D

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 16, 2011, 04:34:59 am
Generally, I have also preferred my i1Pro over the other pucks. But knowing its limitations, I simply find it inconsistent when measuring the shadows, and I don't believe any specific profiling software can really help it. Even ArgyllCMS with its own i1Pro driver (which uses longer integration times) doesn't cure the issue, as can be seen in Ethan's evaluation results.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 05:22:10 am
Generally, I have also preferred my i1Pro over the other pucks. But knowing its limitations, I simply find it inconsistent when measuring the shadows, and I don't believe any specific profiling software can really help it. Even ArgyllCMS with its own i1Pro driver (which uses longer integration times) doesn't cure the issue, as can be seen in Ethan's evaluation results.

No doubt... weakest link, etc. I created a correction matrix for the Spyder 3 and am waiting for Argyll to finish up. Lots of hurry up and wait! :) For this much waiting... it had better be good!  ;D

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Mark Paulson on May 16, 2011, 10:30:46 am
Quote
Mark: If the offer still stands, I may take you up on it after we get a chance to run the first sample through its paces.

Just let me know when you want it. Scott is here in San Antonio and he can also test it if desired. I don't have time, as I have another full-time job.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on May 16, 2011, 01:11:44 pm
Joe: What you are seeing with the i1Pro is indeed the limitations of its measurement capabilities. From our results, the i1Pro has a real-world noise floor between 0.3 and 0.4 cd/m2. At lower luminance values, i1Pro numbers are more random than real. When you raise the blackest black from 0.11 (http://www.imdb.com/title/tt0088258/quotes?qt0261726) to 0.35, you are getting more signal, less noise, and the green casts to the shadows go away. (In all likelihood, the true monitor black point is 0.05 - 0.1 cd/m2 higher than what the i1Pro is reporting.)

CEDP increases measurement integration time in darker tones above what other (non-Argyll) programs do. The price you pay for BasICColor Display's speed is inaccurate shadow readings. Measure the same monitor with a DTP-94, and BasICColor should give more consistent black levels. The white point will be all over the place, however.

I spent more time with i1Profiler over the weekend. When choosing "native" contrast ratio, it gives a higher black point than does CEDP's relative black, no matter whether an i1Pro or i1D2 is used. At the black level CEDP chooses, a good monitor (e.g. Eizo CG243W) is capable of differentiating RGB (1, 1, 1) from (0, 0, 0). By setting a higher level, i1Profiler both gets around the limitations of the i1Pro and allows faster measurement times. Whether this works for you depends on how black you like it.

The absence or presence of color casts at the lowest luminance levels appears to be a function of the sensor being used. One calibration suite (CalPC (http://www.spectracal.com/toppage.aspx?ID=20)) we are evaluating supported our PR-730 spectroradiometer. With a monitor that it could talk to via DDC, it produced dead-neutral shadows down to 0.15 cd/m2. BasICColor Display (http://www2.chromix.com/colorgear/shop/productdetail.cxsa?toolid=50141&pid=11713) also supports fancy equipment, just in case anyone has a spare KM CS-2000 floating around. Once we get a Discus in-house, we will check BasICColor's performance down in the mud.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 03:17:31 pm
Joe: What you are seeing with the i1Pro is indeed the limitations of its measurement capabilities. From our results, the i1Pro has a real-world noise floor between 0.3 and 0.4 cd/m2. At lower luminance values, i1Pro numbers are more random than real. When you raise the blackest black from 0.11 (http://www.imdb.com/title/tt0088258/quotes?qt0261726) to 0.35, you are getting more signal, less noise, and the green casts to the shadows go away. (In all likelihood, the true monitor black point is 0.05 - 0.1 cd/m2 higher than what the i1Pro is reporting.)

Well it's good to get some validation that the numbers I've come up with based on observation are grounded with some objective measurements. I think I did my testing with the Eye One Pro about a year ago.

Quote
CEDP increases measurement integration time in darker tones above what other (non-Argyll) programs do. The price you pay for BasICColor Display's speed is inaccurate shadow readings.

It was pretty darn noticeable, with some whackadoodle results on the shadow side of things. Purely from an observation standpoint, CEDP (or i1Profiler) seems like a better match for the Eye One Pro.

Quote
Measure the same monitor with a DTP-94, and BasICColor should give more consistent black levels. The white point will be all over the place, however.

Sigh. Wide gamut monitors offer something I need, but I cried a little when I had to put the DTP-94 away. Although I've spent a fair amount of time evaluating Argyll in the past for print profiling, I hadn't really dug into its display profiling capabilities. I'm curious to know whether creating a correction matrix for the DTP-94, when used in conjunction with a screen that has white LED backlights, will help any. I suspect not, as my guess is that the problems the DTP-94 suffers with white LED backlights have to do with the spectrum of the light produced. I do still have one screen though (with a white LED backlight... and it's a REALLY sucky screen ([COUGH]MacBook Pro[/COUGH])) that can use all the help it can get. It's actually profiled quite well right now with CEDP, but if I didn't try to do better I'd have to turn in my nerd card...

Quote
At the black level CEDP chooses, a good monitor (e.g. Eizo CG243W) is capable of differentiating RGB (1, 1, 1) from (0, 0, 0). By setting a higher level, i1Profiler both gets around the limitations of the i1Pro and allows faster measurement times. Whether this works for you depends on how black you like it.

I know all the reviewers rave about a nice black point but to be honest, to me it's all relative. What's more important to me is seeing as many discrete steps as I possibly can. If I can do that with a lower black point, great; if not, I'm not all that bothered. Of course, raising the black point means you have to sacrifice resolution somewhere else... whether that manifests visibly depends upon the design of the monitor: the resolution of the panel, the resolution of the monitor LUT, and... ahem... in the case of Asus, whether someone who knew what they were doing was responsible for writing the firmware  ::) . With the NEC PA series and the Eizo CG series that's often trivial, as they have the LUT resolution to handle it.

Quote
The absence or presence of color casts at the lowest luminance levels appears to be a function of the sensor being used. One calibration suite (CalPC (http://www.spectracal.com/toppage.aspx?ID=20)) we are evaluating supported our PR-730 spectroradiometer. With a monitor that it could talk to via DDC, it produced dead-neutral shadows down to 0.15 cd/m2. BasICColor Display (http://www2.chromix.com/colorgear/shop/productdetail.cxsa?toolid=50141&pid=11713) also supports fancy equipment, just in case anyone has a spare KM CS-2000 floating around. Once we get a Discus in-house, we will check BasICColor's performance down in the mud.

How I wish :). I'll be interested to see what you find once you get your hands on a few Discuses... (Disci?). I suspect they probably offer the greatest bang for the buck in terms of monitor calibration. What's the next step up from there? Sencore OTC1000/X-Rite Hubble?

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Scott Martin on May 16, 2011, 05:28:56 pm
Just a side note... while blacker blacks (I feel a Spinal Tap quote coming) are always desirable....

That's fantastic - LOL!!
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 16, 2011, 06:11:49 pm
At the black level CEDP chooses, a good monitor (e.g. Eizo CG243W) is capable of differentiating RGB (1, 1, 1) from (0, 0, 0).

It's debatable how useful this is. With a default gamma of 2.2, RGB (1,1,1) represents an L* value of about 0.0045 ... namely something that will print indistinguishably from black.
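
The arithmetic behind that figure is short: an idealized zero-black display maps a code value to relative luminance as (code/255)^2.2, and the CIE L* formula converts that to lightness. A quick Python sketch to verify (the zero-black display is an idealization; a real panel's non-zero black raises these numbers):

Code:
# L* of the darkest code values on an idealized gamma-2.2 display.
def lstar_from_code(code, gamma=2.2):
    Y = (code / 255) ** gamma            # relative luminance, white = 1
    if Y > (6 / 29) ** 3:                # CIE threshold, ~0.008856
        return 116 * Y ** (1 / 3) - 16
    return (29 / 3) ** 3 * Y             # linear toe, ~903.3 * Y

for code in (1, 4, 8):
    print(f"RGB ({code},{code},{code}) -> L* = {lstar_from_code(code):.4f}")

# RGB (1,1,1) -> L* = 0.0046
# RGB (4,4,4) -> L* = 0.0968
# RGB (8,8,8) -> L* = 0.4449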
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 06:55:56 pm
It's debatable how useful this is. With a default gamma of 2.2, RGB (1,1,1) represents an L* value of about 0.0045 ... namely something that will print indistinguishably from black.

In which case it will be reflected accurately on screen when you apply the soft proofing profile and you won't have to wonder whether or not what you're seeing is accurate. The whole point of color management is quality control: having the ability to manufacture a product to a specific set of tolerances consistently and reliably over time. To that end, it is quite useful in my opinion. When something goes wrong, having solid quality control procedures in place reduces the time it takes to identify the problem, which saves time and money. :-)

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 16, 2011, 07:02:17 pm
In which case it will be reflected accurately on screen when you apply the soft proofing profile ...

If you're seeing differences this small, the representation of shadows on your monitor is bogus ... whether soft-proofing is invoked or not.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 07:33:44 pm
If you're seeing differences this small, the representation of shadows on your monitor is bogus ... whether soft-proofing is invoked or not.

I have some papers I've profiled that will show differences down to 8,8,8 and when soft proofing is enabled, this is accurately reflected in the soft proof. If things are set up properly, yes it will work and it will work quite well.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 16, 2011, 07:43:50 pm
I have some papers I've profiled that will show differences down to 8,8,8 and when soft proofing is enabled, this is accurately reflected in the soft proof. If things are set up properly, yes it will work and it will work quite well.

You're missing the point, this has nothing to do with soft-proofing. Soft-proofing goes through the same mapping to the display. If the shadow values are represented inaccurately, they'll be inaccurate whether soft-proofing is invoked or not.
Title: Re: Monitor calibration sensor evaluations
Post by: eronald on May 16, 2011, 08:04:50 pm
So far the best profile I got for my MacBook Pro 2011 17" is ColorMunki.

I don't do print or contract print so I don't need super accurate, though.
I do need something pleasant to look at.

Edmund
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 09:40:53 pm
You're missing the point, this has nothing to do with soft-proofing. Soft-proofing goes through the same mapping to the display. If the shadow values are represented inaccurately, they'll be inaccurate whether soft-proofing is invoked or not.

Just because a print on a particular paper made with a particular printer can or cannot represent detail in a certain area is not a reason to have the monitor mimic that medium exactly. What if the output medium is another monitor and I want to represent as much shadow detail as possible? If you do what you propose (not being able to distinguish between any two given steps) then you're undermining the usefulness of your tools as a reference medium. We profile our screens to a specific known standard. Your post is the first time I've ever seen anyone suggest that maintaining those standards is not useful.

Ethan said "At the black level CEDP chooses, a good monitor (e.g. Eizo CG243W) is capable of differentiating RGB (1, 1, 1) from (0, 0, 0)," and you questioned whether that was useful. You then said "If you're seeing differences this small, the representation of shadows on your monitor is bogus ... whether soft-proofing is invoked or not." If what you're saying is true, then Eizo, NEC, Integrated Color, BasICColor, and X-Rite are all doing their jobs wrong.

An analogy... I have a piano. It has 88 notes and every note is tuned to a specific frequency, so the piano is calibrated. Not every song I play hits the keys in the bottom octave. The fact that that octave is there, though - tuned, accurate, and capable of reproducing those notes - doesn't mean the rest of what I play on the piano is inaccurate when the music doesn't call for those specific notes.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 16, 2011, 10:19:14 pm
Just because a print on a particular paper made with a particular printer can or cannot represent detail in a certain area is not a reason to have the monitor mimic that medium exactly.

You're conflating two different things. I suggest you re-read what I said.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 16, 2011, 11:20:18 pm
You're conflating two different things. I suggest you re-read what I said.

It appears that what you're saying is, if you can see a difference between 0,0,0 and 1,1,1, then your monitor is not capable of accurately representing a print, and that being able to see a difference between 0,0,0 and 1,1,1 is not useful. Is that not what you are saying? If not I may be misunderstanding what you are trying to say.
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 16, 2011, 11:39:48 pm
It appears that what you're saying is, if you can see a difference between 0,0,0 and 1,1,1, then your monitor is not capable of accurately representing a print, and that being able to see a difference between 0,0,0 and 1,1,1 is not useful. Is that not what you are saying?

For a gamma of 2.2 (monitor calibration and working space) ... yes. Not only will the representation of shadows in the soft proof be inaccurate but also that of detail in the image file itself. For L* it's a different story.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 17, 2011, 02:07:15 am
For a gamma of 2.2 (monitor calibration and working space)

The working space doesn't come into play here, at least when Ethan says he can see the difference between 0 and 1 I'm under the assumption that he's talking about sending 0,0,0 and 1,1,1 straight to a profiled monitor.
Title: Re: Monitor calibration sensor evaluations
Post by: eronald on May 17, 2011, 07:59:22 am
What you do to look at the profiled device is switch into the monitor working space, and then you send the numbers - thus they are profiled device numbers.
For whatever that's worth. The experiment shows just about the best the software will be able to do with the profiled device.
I don't know if there is a single reliable experiment you can do to get closer to the hardware.

I think.

Edmund

The working space doesn't come into play here, at least when Ethan says he can see the difference between 0 and 1 I'm under the assumption that he's talking about sending 0,0,0 and 1,1,1 straight to a profiled monitor.
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 17, 2011, 06:49:22 pm
The working space doesn't come into play here, at least when Ethan says he can see the difference between 0 and 1 I'm under the assumption that he's talking about sending 0,0,0 and 1,1,1 straight to a profiled monitor.

A common way of doing so is to set the monitor and working space gamma to the same value ... which is why I mention it. Really you need to read what's said and think about it more rather than just jumping on posts. I still don't understand how this got into a discussion about soft-proofing.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on May 17, 2011, 10:26:50 pm
The working space doesn't come into play here, at least when Ethan says he can see the difference between 0 and 1 I'm under the assumption that he's talking about sending 0,0,0 and 1,1,1 straight to a profiled monitor.

Well that’s how the test should be done. You’d build this document and assign the display profile to it, then use the curves test on a rectangle within the middle of the document in full screen mode.

I agree, it’s useful to be able to have the native calibration of the display show you a visible, albeit subtle, difference between 0/0/0 and 1/1/1.
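
For anyone who wants to try it, here is a minimal sketch of such a test document (using Python and Pillow; the canvas size and rectangle placement are arbitrary choices). You would still assign the display profile and run the Curves test in Photoshop as described:

Code:
# Generate a black canvas with an RGB (1,1,1) rectangle in the middle.
from PIL import Image, ImageDraw

W, H = 1920, 1080
img = Image.new("RGB", (W, H), (0, 0, 0))            # pure black background
draw = ImageDraw.Draw(img)
draw.rectangle([W // 3, H // 3, 2 * W // 3, 2 * H // 3],
               fill=(1, 1, 1))                       # barely-above-black patch
img.save("black_vs_1.png")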
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 18, 2011, 12:45:10 am
I agree, it’s useful to be able to have the native calibration of the display show you a visible, albeit subtle, difference between 0/0/0 and 1/1/1.

As I said elsewhere, most people's monitor calibration gives them an overly optimistic view of shadow detail. It may be handy to see the detail, but it's not accurate and won't be reflected in the print (for a gamma of 2.2).
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on May 18, 2011, 09:09:24 am
As I said elsewhere, most people's monitor calibration gives them an overly optimistic view of shadow detail. It may be handy to see the detail, but it's not accurate and won't be reflected in the print (for a gamma of 2.2).

But the test isn’t considering prints at all (at this point), only how well the display delivers steps throughout the tonal range. Also not discussed: the test expects the user to view the neutrality (visually, of course) of each step in the ramp from black to white as one moves the up arrow on the keyboard using Curves. This test is far from the end-all of such tests, but for what it was designed to illustrate, it’s quite useful.
Title: Re: Monitor calibration sensor evaluations
Post by: gromit on May 18, 2011, 01:41:37 pm
But the test isn’t considering prints at all (at this point), only how well the display delivers steps throughout the tonal range. Also not discussed: the test expects the user to view the neutrality (visually, of course) of each step in the ramp from black to white as one moves the up arrow on the keyboard using Curves. This test is far from the end-all of such tests, but for what it was designed to illustrate, it’s quite useful.

I understand how the test works and what it shows. The issue is that most don't go the extra step and ask themselves whether, for the gamma chosen, they should see these differences. For a gamma of 2.2, a value of (4,4,4) still only corresponds to an L* of 0.1 which, for all practical purposes, is black.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on May 18, 2011, 01:43:48 pm
The test can help decide upon a gamma setting (or we can just select Native and move on).
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 19, 2011, 05:06:42 am
I understand how the test works and what it shows. The issue is that most don't go the extra step and ask themselves whether, for the gamma chosen, they should see these differences. For a gamma of 2.2, a value of (4,4,4) still only corresponds to an L* of 0.1 which, for all practical purposes, is black.

While basically you're right, L* is not the absolute determinant of our perception, as we see lightness in a very relative way. So while it's impossible to see the above-mentioned differences in 2° CIE Standard Observer L* viewing conditions, it's not necessarily that hard when we change the surround luminance, use the simultaneous contrast effect, and make the stimuli larger.

http://www.cis.rit.edu/fairchild/PDFs/AppearanceLec.pdf
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 19, 2011, 11:18:38 am
While basically you're right, L* is not the absolute determinant of our perception, as we see lightness in a very relative way. So while it's impossible to see the above-mentioned differences in 2° CIE Standard Observer L* viewing conditions, it's not necessarily that hard when we change the surround luminance, use the simultaneous contrast effect, and make the stimuli larger.

http://www.cis.rit.edu/fairchild/PDFs/AppearanceLec.pdf

That's a very good point to make. If the surround is white (or if the ambient light is too high) I can't tell the difference between 0,0,0 and 1,1,1.

Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on May 19, 2011, 12:16:06 pm
FWIW, the NEC branded i1 display puck (designed for the NEC wide gamut monitors) works perfectly fine with ArgyllCMS when profiling my NEC P221 monitor.  I was able to use my existing SpectraView built profile as the starting point to create the 500 patch color target for Argyll.  After running the Argyll suite of programs my resulting profile turned out fine with a delta E lower than the ones I usually obtain with SpectraView.  However, as Ethan noted it takes a considerable amount of time to build an Argyll profile and the command line interface is not for everyone.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 19, 2011, 12:57:01 pm
FWIW, the NEC branded i1 display puck (designed for the NEC wide gamut monitors) works perfectly fine with ArgyllCMS when profiling my NEC P221 monitor. 

I've always wondered whether the correction matrix in the puck is available to applications other than Spectraview (in which case it would just work like a regular i1D2), or whether it doesn't matter (i.e., the correction matrix isn't something the software accesses and applies; rather, it's incorporated into the measurements the puck spits out to applications).

Quote
I was able to use my existing SpectraView built profile as the starting point to create the 500 patch color target for Argyll.  After running the Argyll suite of programs my resulting profile turned out fine with a delta E lower than the ones I usually obtain with SpectraView.

Gotta be a little bit careful about comparing ∆E values reported between packages. Just having a different patch set on which to base your measurements can yield dramatically different results, even within the same software package.

Quote
  However, as Ethan noted it takes a considerable amount of time to build an Argyll profile and the command line interface is not for everyone.

Holy crap does it ever!!!  :D  I've been revisiting Argyll over the past 3 days. I'd evaluated it last year for printer profiling but never really explored its monitor profiling capabilities. There are some really neat features in there but WOW... it certainly does take a long time! I really like the ability to make a correction matrix.
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on May 19, 2011, 01:28:13 pm
Gotta be a little bit careful about comparing ∆E values reported between packages. Just having a different patch set on which to base your measurements can yield dramatically different results, even within the same software package.
Yes, I understand, but I also need to note that SpectraView only measures the three primaries, whereas Argyll, even in its basic form, looks at more patches. The key thing was that the initial SpectraView white point, black point, and contrast settings were preserved in Argyll, which meant I didn't have to fool around with the monitor's hardware adjustment panel.

Quote
Holy crap does it ever!!!  :D  I've been revisiting Argyll over the past 3 days. I'd evaluated it last year for printer profiling but never really explored its monitor profiling capabilities. There are some really neat features in there but WOW... it certainly does take a long time! I really like the ability to make a correction matrix.
Yes, you do have to invest time and effort but given the price of the software it's worth it! :D 
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 19, 2011, 05:51:03 pm
Yes, you do have to invest time and effort but given the price of the software it's worth it! :D 

It certainly is. Having been a software engineer in a former life, I'm absolutely floored by the work he's put into this project. A codebase like that is such a HUGE undertaking. I'm really surprised that nobody has put it into a really slick GUI. There's DispCal but... it's pretty rough around the edges. I almost want to dust off the OOP books, maybe learn some Objective-C, and whip together something for the Mac but... time, time, time. How the heck do you find the time for a regular job AND to code outside of that? Graeme (is that the right spelling?) must be superhuman.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on May 19, 2011, 06:36:34 pm
It certainly is. Having been a software engineer in a former life, I'm absolutely floored by the work he's put into this project. A codebase like that is such a HUGE undertaking. I'm really surprised that nobody has put it into a really slick GUI. There's DispCal but... it's pretty rough around the edges. I almost want to dust off the OOP books, maybe learn some Objective-C, and whip together something for the Mac but... time, time, time. How the heck do you find the time for a regular job AND to code outside of that? Graeme (is that the right spelling?) must be superhuman.

Cheers, Joe
I looked at the DispCal GUI but it was more complicated than using the command line and was missing some features. I did some Visual C++ programming for fun in the past, and writing the GUI is the easiest part of all. I think the issue with Argyll is that it is constantly evolving (version 1.3.3 was released within the past week or so with more features), so anyone supporting a GUI would have to make changes as well. I'm as surprised as you are by the one-man-show effort he puts into this; it makes most commercial efforts in color management pale in comparison. I'm waiting on an i1 unit to arrive and I'm going to do some paper profiling using Argyll.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on May 23, 2011, 12:57:22 pm
We now have a better understanding of how OEM-specific Eye-One Display 2 sensors work. Each i1D2 can be calibrated using a reference spectroradiometer. This both allows tuning the readings to be accurate for a particular display panel's backlight characteristics and, perhaps more importantly, significantly reduces the unit-to-unit variability. Our initial sampling of 17 i1D2 sensors showed a mean unit-to-unit variation of 7.0 dE-2000 when measuring a standard gamut panel set to a 6500K white point at 150 cd/m2. Testing 11 individually calibrated pucks on the same display type they were calibrated to showed only 3.7 dE average disagreement between sensors. Not spectacular, but nearly halving the variability between sensors is not trivial. This puts the OEM-tuned i1D2 as less erratic than the Spyder 3, but still more variable than either the i1Pro or DTP-94.
(http://www.drycreekphoto.com/images/calibration/SensorVariability.png)

Accuracy of the bespoke i1D2 sensors is still limited to the monitor they are tuned for. It is possible, however, to use a correction matrix with these pucks to improve results on other panel types. This leads one to the conclusion that X-Rite has a marketing opportunity to sell individually calibrated Eye-One sensors tuned to different generic backlight types. Targeting standard and wide gamut RGB LED and wide gamut CCFL would be a good start. Also, suppliers of monitor profiling software would be well advised to include correction matrices for different panel backlight types.
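
For readers unfamiliar with how such a correction is applied: it is a single 3x3 matrix multiplication on the colorimeter's XYZ readings. A minimal Python sketch (the matrix values here are invented for illustration; a real matrix comes from pairing the puck with a reference instrument on the target backlight type):

Code:
import numpy as np

# Hypothetical 3x3 matrix mapping this puck's XYZ to reference XYZ
# for one particular backlight type.
M = np.array([[ 1.02, -0.03,  0.01],
              [ 0.01,  0.99,  0.00],
              [-0.02,  0.04,  1.05]])

xyz_puck = np.array([95.2, 100.0, 108.9])  # raw colorimeter reading
xyz_corrected = M @ xyz_puck               # corrected toward the reference
print(xyz_corrected)

The catch, as noted above, is that the matrix is only valid for the backlight spectrum it was derived on.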

Thanks to SpectraCAL for pointing me in the right direction and to HP for the generous loan of a box o' pucks.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on May 23, 2011, 03:18:39 pm
Interesting results, Ethan. To me it adds up to negative quality progress from X-Rite between the DTP-94 and the various flavours of i1 Display 2. Their first business opportunity should be to make instruments that behave more consistently. And of course if the consistency were also geared to accuracy it would be yet more helpful.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 23, 2011, 04:40:09 pm
Very interesting - as a logical conclusion it seems that every NEC OEM sensor is individually calibrated...

I'd really like to have some kind of software that could internally tune an i1D2 to a specific display using a spectroradiometer - I suppose even an i1Pro or ColorMunki could make things better, and maybe it could also fix the problem with the i1D2's tendency to degrade over time.

Speaking of stability - any information regarding basICColor DISCUS testing?
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 23, 2011, 04:56:33 pm
I'd really like to have some kind of software that could internally tune an i1D2 to a specific display using a spectroradiometer - I suppose even an i1Pro or ColorMunki could make things better

You can do this in Argyll with ccmxmake.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 23, 2011, 06:31:40 pm
You can do this in Argyll with ccmxmake.

Argyll won't modify the colorimeter's LUT. At least not yet...
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on May 23, 2011, 06:52:56 pm
Argyll won't modify the colorimeter's LUT. At least not yet...

No, definitely not, so it's of no benefit if you want to use another application; but if you're staying within Argyll then you can take advantage of the correction matrix. Unfortunately, even with the DispCal GUI it's not exactly what I'd call an easy-to-use app for the masses (actually I prefer the command line over the DispCal GUI myself, as I find it easier to use, but... a well-done GUI would certainly be a welcome addition).

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on May 25, 2011, 02:06:08 am
Very interesting - as a logical conclusion it seems that every NEC OEM sensor is individually calibrated...

I'd really like to have some kind of software that could internally tune an i1D2 to a specific display using a spectroradiometer - I suppose even an i1Pro or ColorMunki could make things better, and maybe it could also fix the problem with the i1D2's tendency to degrade over time.

Speaking of stability - any information regarding basICColor DISCUS testing?

I have not heard anything from X-Rite directly about how one can calibrate an Eye-One Display. The OEM vendors we spoke to did not know if this was covered under NDA, but the suspicion is that it was. It certainly would be revealing to try tuning an i1D2 to a particular screen and see how it performs.

On to the BasICColor DISCUS (http://www2.chromix.com/colorgear/shop/productdetail.cxsa?toolid=50140&pid=11713). Thanks to BasICColor and CHROMiX, we have a unit in-house. The performance was impressive. For full details, read the updated article (http://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html) on our site or skip ahead to the conclusions (http://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html#BestSensor).

(http://www.drycreekphoto.com/images/calibration/SensorAccuracy.png)

The white measurement error on the wide-gamut display showed the DISCUS at its worst. This panel used RGB LED backlighting. On other wide-gamut monitors with CCFL backlights (e.g. Eizo, NEC PA-series, LaCie) the DISCUS measured between 0.75 and 1.4 dE-2000 from where our spectroradiometer claimed the monitor really was. Those values are simply impressive. The DISCUS' white point error is less than the across-screen uniformity of even most high-end monitors.

All our measurements used the DISCUS release of BasICColor Display (v4.2.4) to drive the instrument. There are many rough edges in this release - video LUTs not flushing consistently, DDC calibrations ending up at D50 no matter what white point was specified, missing i1D2 support, and, of course, the dispute with ICS making L* calibration unavailable unless you state you do not reside in the US. Even so, it did allow accurate measurements of monitor black and white points.

The DISCUS - at least the unit we tested - did show one notable flaw. The puck includes an ambient light measurement diffuser. Although measurements of monitors were very accurate, ambient measurements were not. We consistently obtained readings with too low a color temperature - an error of at least 6 dE-2000. Also, the DISCUS has thermal compensation capability. For monitor measurements, this worked like a charm -- only minimal shifts with a 10C change in temperature. Ambient light measurements did change, however, with the DISCUS reporting the color temperature of a constant light source rising by over 300K as the puck heated up.

I do not know if this indicates a problem in our measurement methodology (light source too bright?), a QA flaw in the puck we measured, contamination or damage to the ambient light diffuser, a systematic problem such as thermochromic diffuser material, or something else entirely. Too soon to say, but if any of you DISCUS owners also have an i1 Beamer or even an i1D2 with the ambient head, try measuring a viewing booth or other constant, 5000 - 6500K light source with the DISCUS and the second instrument. See if there is a consistent offset, particularly if the DISCUS always reads low.
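
For that cross-check, converting each instrument's xy reading to a correlated color temperature makes any offset easy to see. A sketch using McCamy's cubic CCT approximation (the xy readings below are invented examples, not data from our tests):

Code:
# Compare two instruments' readings of the same light source as CCT.
def mccamy_cct(x, y):
    """Approximate CCT in Kelvin from CIE 1931 xy (McCamy 1992)."""
    n = (x - 0.3320) / (0.1858 - y)
    return ((449.0 * n + 3525.0) * n + 6823.3) * n + 5520.33

readings = {"DISCUS": (0.3478, 0.3595),   # hypothetical booth reading
            "i1D2":   (0.3440, 0.3530)}   # hypothetical second instrument

for name, (x, y) in readings.items():
    print(f"{name}: {mccamy_cct(x, y):.0f} K")

# DISCUS: ~4928 K, i1D2: ~5049 K -> the DISCUS reads ~120 K low here.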
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on May 25, 2011, 02:49:19 am
Ethan - the problem with measuring the chromatic coordinates of ambient light is that in basICColor display 4.2.4 there's only a correction matrix for Normlicht CCFL spectra available. So it can't be expected to be reliable with other light sources, I suppose.

(http://members.chello.pl/m.kaluza/dscs3.jpg)

As for the temperature sensor - I've noticed that it's not correcting measurements internally; the correction is applied in the profiling software. It works while profiling, but not while you're measuring the x,y coordinates of the white point (in ambient or contact mode), where a temperature change can cause a drift, so recalibration of the instrument is needed to counteract this behavior.

Anyway, basICColor display 4.2.4 is the first release that supports the DISCUS, so there may be some minor teething issues. And there's basICColor display 5 coming soon...

Thanks for sharing the test results!
Title: Re: Monitor calibration sensor evaluations
Post by: tony22 on May 28, 2011, 08:35:42 am
Quote
I've been focusing my testing on the Spyder3Elite and EyeOnePro devices using ColorEyesDisplayPro, DispCal/Argyll, Spyder3Elite, and i1Profiler. From the testing I've done thus far, I have to say I'm pretty blown away and completely surprised with what I'm seeing from i1Profiler with an EyeOnePro. Not only does the process take less than 2 minutes (!!) but the shadows and smoothness are incredible, rivaling even DispCal's hour-long calibration process. i1Profiler's QA analysis reports lower Average Delta E variations with i1Profiler's calibration than the others. I've been frustrated with shadow detail from the i1 Pro in the past but i1Profiler is doing a better job than I've seen with other solutions.

Ethan you wrote "No software wizardry will improve a spectrophotometer's accuracy at low light levels to be competitive with a colorimeter." But that's exactly what I'm seeing with i1Profiler (to my surprise as much as yours I'm sure!). Suddenly the shadows are all there statistically and visually and the performance with soft proofing is incredible.


Quote
I have been impressed with i1Profiler as well. X-Rite indeed managed to coax commendable performance out of a short measurement cycle. I assume they are using a method better than the simple averaging in Argyll to tease a signal from the measurement noise. (Serious late-night geek digression: the standard error of a measurement sample goes down with the square root of the sample size. In English, this means that averaging 4 measurements cuts your random variability in half vs. 1 sample. To cut it in half again you need 16 measurements, then 64, then 256 - every halving of the noise costs four times as many measurements. You can see why DispCal's high quality mode takes an hour to run.)

Back to reality... Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however.
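
The square-root law in that digression is easy to see in simulation. A small Python sketch (the patch luminance and noise level are made up):

Code:
import numpy as np

rng = np.random.default_rng(0)
true_Y, noise_sd = 0.20, 0.08   # cd/m^2: deep-shadow patch + sensor noise

for n in (1, 4, 16, 64, 256):
    # 10,000 simulated readings, each the average of n raw samples
    means = rng.normal(true_Y, noise_sd, size=(10_000, n)).mean(axis=1)
    print(f"n = {n:3d}   spread of averaged reading = {means.std():.4f}")

# Each 4x increase in samples roughly halves the spread:
# n = 1 -> ~0.080, n = 4 -> ~0.040, n = 16 -> ~0.020, ...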

Ethan and Scott, I have found this discussion fascinating. Given that I'm still using i1Match, however, it leads to some questions. I've read some snippets suggesting that i1Profiler's monitor calibration software is not quite as flexible as i1Match's (setting user selected white and black point, for example). If this is true, ultimately how does the monitor calibration capability compare between the two? With all the good things said above, is there still a price to pay with less adjustability?
Title: Re: Monitor calibration sensor evaluations
Post by: ronker on May 31, 2011, 07:07:30 am
Quote
Both of these i1's turned in excellent performance on their respective monitors. Using the tuned sensor on any other monitor, however, was an invitation to disaster. The results were ugly.

I have a NEC PA301W with a NEC OEM EOD2 (MDSV2) sensor. I've thought about using the sensor on other monitors, especially my NEC 2180UX. Do you think there will also be bad results on a 2180UX? I've read on the NEC homepage that the MDSV2 is backward compatible with the old sensors.
Title: Re: Monitor calibration sensor evaluations
Post by: mjdl on June 02, 2011, 12:13:34 am
[...] Unfortunately, even with the DispCal GUI it's not exactly what I'd call an easy-to-use app for the masses (actually I prefer the command line over the DispCal GUI myself, as I find it easier to use, but... a well-done GUI would certainly be a welcome addition). [...]

I'm just a newbie user of ArgyllCMS, dispcalGUI, and a DTP-94, but I really have to disagree that the dispcalGUI application is difficult to use. Just set the desired display calibration parameters in the upper part of the application, select the kind of display profile you want, and choose one of the provided colour target sets in the lower part of the application (customizing it if necessary to produce a new target set) - and that's pretty much it. Make sure you deselect "Interactive display adjustment" (assuming you've already adjusted the display controls), so that you can go off and do something else for the hour or more it takes a high quality calibrate+profile process to finish - it will finish unattended.

Of course I've had to do a fair amount of reading to understand the general principles of operation and to ensure the options I've selected in dispcalGUI make sense, but the latest version of the application (0.7.x.x) does have a detailed, long page of instructions.

Granted, my ThinkPad matte display is nothing compared to the kinds of LCDs mentioned already in this thread (it only has 2/3rds the colour volume of sRGB), but the verification results of one profile, attached, seem O.K. to me, and the actual visual performance is a vast improvement over the uncalibrated state, i.e. now there is a neutral grayscale, good separation in shadows and highlights on all kinds of photos, etc.

What is really difficult, however, is going beyond the basic Argyll calibration parameters: e.g., I need to figure out how to correct the very slight over-warming of dark greys in the range RGB(10,10,10) to RGB(30,30,30) and of highlights RGB(235,235,235)+ compared to the middle grey values.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on June 06, 2011, 06:36:27 am
there's only a correction matrix for Normlicht CCFL spectra available. So it can't be expected to be reliable with other light sources, I suppose.

I guess I was right:
http://www.basiccolor.de/dry-creek-photo-testet-discus/
Quote
The only point of criticism is the ambient light measurement: the test measured an unspecified ambient light source, but the basICColor DISCUS is calibrated to Normlicht standard illumination and therefore delivers perfect readings only for that.
Title: Re: Monitor calibration sensor evaluations
Post by: stefohl on June 10, 2011, 10:38:33 am
Comparing the basICColor DISCUS with the Konica Minolta Display Color Analyzer CA-210

Today I did a test on the DISCUS by comparing it to a CA-210. I've done several such tests in the past with the EyeOne Pro, Spyder III, and EyeOne Display. The DeltaE was below 1 when we compared the measurements from the DISCUS and the CA-210. When I tested the other instruments, the DeltaE on a calibrator that worked OK was between 3 and 5.

I've tested more than 15 EyeOne Displays now, and I estimate that more than 30% of them showed results that were unacceptable, with an average DeltaE over 10 compared to the measurements we got from our reference calibrator. I haven't tested any that were tuned to a specific monitor, so I don't know how good the results can be in that case. But I don't recommend that any of the photographers we work with buy a generic EyeOne Display.

The results from the DISCUS were very impressive. I'm happy to see that we now have a calibrator that can do a very satisfactory calibration even on a wide gamut monitor.

Best/
Stefan
Title: Re: Monitor calibration sensor evaluations
Post by: trinityss on June 27, 2011, 05:04:29 pm
The only device missing in this test is the new X-Rite i1 Display Pro :-)
I would now like to see a comparison with the DISCUS!

Kr,
Title: Re: Monitor calibration sensor evaluations
Post by: pcunite on August 17, 2011, 10:22:07 am
The only device missing in this test is the new X-Rite i1 Display Pro :-)
I would now like to see a comparison with the DISCUS!

Indeed, we know that the plastic i1 Display 2 was junk, but what about the i1 Display 3? It is getting good reviews here:
http://www.curtpalme.com/ChromaPure_EyeOneDisplay3.shtm
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 17, 2011, 10:32:10 am
Indeed, we know that the plastic i1 Display 2 was junk, but what about the i1 Display 3? It is getting good reviews here:
http://www.curtpalme.com/ChromaPure_EyeOneDisplay3.shtm


I’d be worried by any review that can’t correctly name the product. There is no such thing as an EyeOne Display 3, at least as far as X-Rite is concerned. Nor is it officially called the i1 Display Pro III.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 18, 2011, 02:51:54 am
Quote
Iliah - excellent point, and one I should have mentioned. We looked into this with a couple of monitors. Setting black to 0.5 - 0.8 decreased the mean measurement error from ~10 dE to ~5. Big improvement if you are willing to live with lighter black levels. This is a valid option, particularly for anyone printing on fine art papers or other stock with relatively low print contrast.

AFAIK SV II used with NEC PA monitors writes only a simple matrix profile, and it doesn't even use the probe to set the gray-scale; it simply tunes the color temperature using a brightish white patch, and then the 3D 13-bit LUT and color engine do all the rest. It does use the probe to show you afterwards how it measured the gray-scale, but since it just writes a matrix profile I don't believe it actually writes that to the profile. So the poor dark readings of the i1Pro should have no effect when using SV II and the NEC PA series.

Of course, it is also true that an i1Pro costs $800+ and the NEC i1D2 about $200. When I compared the values each measured for the color temperature of a bright patch and for the primary locations, most values were 0.000-0.001 apart (xy values of xyY); there may have been a couple of .002s, and at most a single .003, but I think .002 was the largest difference I got. So if all you do is profile the NEC and aren't calibrating other things as well, their rebadged, specially calibrated probe is a heck of a lot less expensive way to get the same results.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 18, 2011, 02:55:45 am
Quote
The answer is I don't know. The two OEM-adjusted sensors we have are a NEC sold with the Spectraview package and HP's DreamColor unit. Our assumption is that these two Eye-One flavors are adjusted differently. The NEC sensor performs well on the PA monitor, but is grossly inaccurate on sRGB monitors, on the HP (which uses LED backlighting rather than CCFL), and even on wide-gamut Eizo displays that are also CCFL lit. The HP puck has similar behavior: excellent performance on the monitor it is sold with, lousy on all others.

If these two OEM flavors of i1 are actually identical, that would imply a truly worrying lack of consistency. We asked both HP and NEC for details. None were forthcoming. Without a reasonable sampling of OEM pucks to characterize there is no way to be sure. X-Rite, to the best of my knowledge, has not published accuracy and resolution specs for the Eye-One colorimeters like they do for the i1Pro spectros.

As I stated in another post, I got basically the exact same results on a NEC PA using either an i1Pro or the NEC i1D2 for bright gray color temperature measurements and primary location measurements. When I used the NEC i1D2 on my HDTV, though, the results seemed pretty far off from my DTP94. Only one sample of each here, but these seem to match exactly what you found.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 18, 2011, 03:00:39 am
Quote
I've been focusing my testing on the Spyder3Elite and EyeOnePro devices using ColorEyesDisplayPro, DispCal/Argyll, Spyder3Elite, and i1Profiler. From the testing I've done thus far, I have to say I'm pretty blown away and completely surprised with what I'm seeing from i1Profiler with an EyeOnePro. Not only does the process take less than 2 minutes (!!) but the shadows and smoothness are incredible, rivaling even DispCal's hour-long calibration process. i1Profiler's QA analysis reports lower Average Delta E variations with i1Profiler's calibration than the others. I've been frustrated with shadow detail from the i1 Pro in the past but i1Profiler is doing a better job than I've seen with other solutions.

Ethan you wrote "No software wizardry will improve a spectrophotometer's accuracy at low light levels to be competitive with a colorimeter." But that's exactly what I'm seeing with i1Profiler (to my surprise as much as yours I'm sure!). Suddenly the shadows are all there statistically and visually and the performance with soft proofing is incredible.

Interesting. I didn't try the new i1Profiler yet; the original X-Rite profiling software that came with my i1Pro was awful IMO, way worse than ColorEyes (one unfortunate thing about ColorEyes is that you can't choose an sRGB tone response curve).

One thing you can do about the i1Pro's poor shadow readings in some programs is use the i1Pro on bright colors to train, say, a DTP94b. I know Calman for HDTVs allows that, and some other programs do as well, mostly more HDTV-oriented ones I think. I think the free open-source monitor one now does too, actually.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 18, 2011, 03:09:29 am
Quote
One last comment. The high quality mode in Argyll/DispCal has a potential gotcha: temperature drift. The i1Pro is fairly sensitive to temperature. Even under controlled ambient temperature there can be shifts from the i1 heating up as it rests on the screen. We saw this firsthand when trying ridiculously long integration times to see what the absolute noise floor was for an i1Pro. We started with a warmed-up monitor, and saw the i1 readings progressively change over the course of 20 minutes. Our reference PR-730 is both temperature-compensated and was being used in non-contact mode. After the first 20 minutes, the i1 readings did not shift for the 24 hour duration of our test. Moral of the story: preheat your i1Pro by hanging it on your screen while you surf the web for 20 minutes or so before you calibrate.

Yeah, temperature drift with many of the instruments has scared me a bit about the profilers that take a long time. Hopefully I'm wrong, but it seemed that my i1Pro drifted again and again over time, not just for the first 20 minutes before becoming stable.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 18, 2011, 03:18:07 am
Quote
Ethan, just to follow up. I'm continuing this testing and am finding that on some displays (like a Samsung 245T and Dell 24") I'm seeing dramatically improved results (visually and statistically) using an EyeOnePro device instead of a DTP94, even when the same software is used.

One of the interesting things that I see but suspect most people don't is how different two different types of monitors can look when calibrated the same way. Put a Cinema Display and a Samsung monitor on the same Mac and calibrate them both using the exact same settings and marvel at the disappointing differences you'll see. Better yet, take a client like Whole Foods World HQ where they've got 50+ designers and video professionals all in one area using a hodge podge of different brands of displays. I'm finding that if they are all calibrated using a colorimeter (Spyder3 or DTP94) when you stand back and look at all of them in one room it's kinda surprising how much inconsistency there is between them. Calibrate all of them with a spectro (using CEDP or DispCal) and they are visually perfectly consistent! Combine that with i1Profiler which does a better job at handling the shadows with a spectro than anything else and you've got a truly superior solution. Problem solved.

While lots of users may only have one or two displays, these are the real-world challenges (10+ different types of displays all side-by-side in one room) that my business faces every day. My hands-on testing is showing that spectros have advantages over even the best colorimeters in some situations, and with i1Profiler I don't see any problems with the shadows like we've seen elsewhere. I'm not seeing any color crossovers or drawbacks with my particular pair of EyeOnes.

I'm going to stick to my guns here and suggest that, for now, i1Profiler with an i1Pro seems to be the answer to the question "when displays are calibrated with a variety of devices and applications which *combination* consistently yields the best results within a reasonable timeframe?"

There is also the problem of metamerism: the different spikes in the primaries of different monitors, particularly wide gamut vs. standard gamut, may lead to measured results being identical while the visual results differ. My Samsung 244T and Samsung C650 looked more similar to each other than to my NEC PA241W due to metamerism. (Comparing to a real-world color checker chart under D65 looked different yet again, but the NEC PA wide gamut actually seemed closer to how my eye sees things.) More complicated still, the degree and nature of the metamerism seen in these cases is thought to vary somewhat from person to person.
Title: Re: Monitor calibration sensor evaluations
Post by: Ernst Dinkla on August 18, 2011, 03:55:45 am

One thing you can do about the i1Pro's poor shadow readings in some programs is use the i1Pro on bright colors to train, say, a DTP94b. I know Calman for HDTVs allows that, and some other programs do as well, mostly more HDTV-oriented ones I think. I think the free open-source monitor one now does too, actually.


With a spectrometer, ArgyllCMS can create a correction matrix for a colorimeter, both being used on the same monitor. In profile creation, that correction matrix will adapt and correct the colorimeter to the specific monitor used while keeping the colorimeter's quality in the darker color measurements. It would be nice to see the colorimeter test widened to include tests with that correction applied. Creating the correction matrix is a fast and easy process.

With kind regards, Ernst


Dinkla Gallery Canvas Wrap Actions for Photoshop

http://www.pigment-print.com/dinklacanvaswraps/index.html
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on August 18, 2011, 01:06:32 pm
AFAIK SV II used with NEC PA monitors writes only a simple matrix profile, and it doesn't even use the probe to set the gray-scale; it simply tunes the color temperature using a brightish white patch, and then the 3D 13-bit LUT and color engine do all the rest. It does use the probe to show you afterwards how it measured the gray-scale, but since it just writes a matrix profile I don't believe it actually writes that to the profile. So the poor dark readings of the i1Pro should have no effect when using SV II and the NEC PA series.

SVII measures the native TRC of the display in 8, 16, 32, or 52 steps, and puts correction curves in the high-bit LUT of the display. So the poor dark readings of the i1Pro have an obvious and noticeable effect.
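
In sketch form, that calibration step amounts to sampling the native tone response and inverting it so the net response hits the target curve; noise in the dark probe readings feeds directly into the low LUT entries, which is why the i1Pro's shadow behavior shows up here. A minimal Python illustration (the "measured" native response is synthetic; a real run uses probe readings):

Code:
import numpy as np

steps = np.linspace(0, 1, 16)             # 16-step characterization
native_Y = steps ** 1.9                   # stand-in for measured native TRC

target_Y = np.linspace(0, 1, 256) ** 2.2  # desired gamma-2.2 response

# Invert the native curve by interpolation: find the drive level
# that produces each target luminance.
lut = np.interp(target_Y, native_Y, steps)

print(lut[:8])  # first few LUT entries (normalized drive levels)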
Title: Re: Monitor calibration sensor evaluations
Post by: RichWagner on August 19, 2011, 12:03:45 am
Quote
You understand our motivation perfectly. We are out to update our reviews of monitor profiling systems. Looking at hardware-agnostic packages (ColorEyes, BasICColor, CalPC, etc.) that support a huge range of sensors is daunting. Yes, it is pretty cool to be able to hook a Minolta CS-2000 up to BasICColor or our PR-730 to CalPC, but this doesn't tell you much about real-world performance.

Unfortunately, the number of devices that these excellent software packages support will likely decrease in the near future, thanks to new policies from X-Rite.

A few weeks ago X-Rite announced the availability of two new colorimeters: the i1 Display Pro and the ColorMunki Display.

Here is the current status on third-party software support for these X-Rite products, as noted by a third-party software developer:

ColorMunki Display

As in the past for previous Munki instruments, there is no third-party Software Development Kit (SDK) available. An SDK is essentially the drivers and software libraries that enable third-party developers to communicate with the instrument. This does not mean that the SDK does not exist; it only means it is not available - in other words, this is a business decision! Thus, until such an SDK is available, third-party developers cannot provide support for the Munki series of instruments.

i1 Display Pro

This model replaces the i1 Display2. As with the i1 Display2, there are three Categories of the i1 Display Pro available:

Category (a)- The instruments sold by X-Rite under the X-Rite name (also called the retail version). In the past, for the older i1 Display2, the SDK could be used for instruments in this category; this is NOT possible with the SDK of the i1 Display Pro.

Category (b)- The instruments sold by a third party (called Original Equipment Manufacturer, or OEM). The SDK could, and can still be used for instruments in this category. However, in the past, such instruments had the same communication interface specifications as the X-Rite retail model; this is not the case anymore.

Category (c)- The instruments sold by an OEM, which have communication interface specifications DIFFERENT from the Category (b) model. The SDK cannot be used to communicate with those instruments.

For example, the custom i1 Display2 recommended by NEC for many wide-gamut NEC monitors (called the MDSVSENSOR2) is a Category (b) model while the HP-branded i1 Display2 (HP DreamColor Advanced Profiling Solution, KZ300AA) sold by HP for use with the HP DreamColor monitors is a Category (c) model.

(Note that while the NEC MDSVSENSOR2 has the same communication interface as the standard i1 Display2, it has a different set of sensor filters which is better adapted to the wide gamut primaries of many NEC monitors.)

With the older i1 Display2, the X-Rite SDK could be used with the Category (a) and (b) instruments; this explains, for instance, why third party software can be used with the X-Rite i1 Display2 and the NEC i1 MDSVSENSOR2, but not with the HP DreamColor i1 Display2.

With the new X-Rite policies, the i1 Display Pro sold by X-Rite can ONLY be used with i1 Profiler; it CANNOT be used with the i1 Display Pro SDK. This means that third party developers cannot support this instrument. Third party developers can only support the Category (b) and (c) instruments (Category (c) instruments will also require a separate agreement between the third-party developer and the licensed OEM). In addition, Category (b) and (c) instruments CANNOT be used with the i1 Profiler software (or other X-Rite software); if you want to use i1 Profiler, you need to buy a retail i1 Display Pro from X-Rite.

These new policies severely limit customer benefits since the new colorimeters can only be used with the bundled software they are sold with (from either the OEM or from X-Rite). This really stinks.

These changes do not affect the i1 Pro (a spectrophotometer that provides spectral data) as it is not currently being replaced by a new model, and is still supported by the older third party X-Rite SDK.

For those concerned about the implications of these new X-Rite policies - in particular, if you think that you could benefit from using third party software with an X-Rite branded ColorMunki Display or i1 Display Pro -  I suggest you communicate with your X-Rite representative or file a complaint via the general support page:
http://www.xrite.com/contact_us.aspx?reasonid=3

Personally, I will not buy another X-Rite product until these policies change.

--Rich Wagner
Title: Re: Monitor calibration sensor evaluations
Post by: RichWagner on August 19, 2011, 02:51:11 am
With a spectrometer, ArgyllCMS can create a correction matrix for a colorimeter, both used on the same monitor. In profile creation that correction matrix will adapt and correct the colorimeter to the specific monitor used while keeping the colorimeter's quality in darker color measurements. It would be nice to see the colorimeter test widened with tests that include that correction. Creating the correction matrix is a fast and easy process.


The procedure for deriving a color correction matrix that adapts a colorimeter to a specific display, using a reference instrument such as an Eye-One Pro, is described here:

http://www.astm.org/Standards/E1455.htm

I believe the original research was described in:

Four-Color Matrix Method for Correction of Tristimulus Colorimeters
Yoshihiro Ohno and Jonathan E. Hardis
Proc., IS&T Fifth Color Imaging Conference, 301-305 (1997)

and

Four-Color Matrix Method for Correction of Tristimulus Colorimeters – Part 2
Yoshi Ohno and Steven W. Brown
Proc., IS&T Sixth Color Imaging Conference (1998)

PatchTool can also generate and use a correction matrix. Danny Pascale has written a nice note on the process:

http://babelcolor.com/download/AN-9%20How%20to%20derive%20and%20use%20a%20Color%20Correction%20Matrix.pdf
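
For the curious, here is a minimal Python sketch of the derivation using a plain least-squares fit over readings of the three primaries plus white. All XYZ numbers below are invented; the four-color method in the papers above uses a white-point-constrained solution rather than least squares, and tools like Argyll and PatchTool implement their own variants.

import numpy as np

# XYZ readings of the display's R, G, B primaries and white (one row per
# patch) from the reference instrument and from the colorimeter.
# All numbers are invented for illustration.
ref = np.array([[41.2,  21.3,   1.9],
                [35.8,  71.5,  11.2],
                [18.1,   7.2,  95.0],
                [95.1, 100.0, 108.1]])
col = np.array([[39.9,  20.1,   2.4],
                [36.9,  73.0,  10.5],
                [17.2,   6.8,  91.7],
                [94.0,  99.9, 104.6]])

# Solve col @ M = ref in the least-squares sense.
M, *_ = np.linalg.lstsq(col, ref, rcond=None)

corrected = col @ M  # colorimeter readings pulled toward the reference

Applied to subsequent readings on that display, the matrix corrects the colorimeter's color reporting while keeping its low-noise dark measurements.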

Rich Wagner

Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on August 19, 2011, 09:03:17 am
Unfortunately, the number of devices that these excellent software packages support will likely decrease in the near future, thanks to new policies from X-Rite.
.....................

Personally, I will not buy another X-Rite product until these policies change.

--Rich Wagner

Of course the policies are not that "new" - the die was cast with the GMB merger: the gobbling-up of the market, the killing of perfectly satisfactory, usable products and software, and the huge licensing debacle they got themselves into with i1Profiler. One can only hope that technically and economically superior competition will emerge to put this seriously mismanaged corporation in its place - and until then it would indeed be very nice, if feasible, to avoid buying ANYTHING from X-Rite until the greedy monopolistic behaviour declines and the concern for product quality and customer interests increases.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 19, 2011, 10:28:20 am
Since Richard decided to post this to the ColorSync list, and Tom Lianza of X-Rite replied, I figured it might be a useful data point in the debate to share Tom's reply. FWIW, Tom is the Chair of the ICC and Director of Advanced Development R&D at X-Rite.

Quote
To all,

It's hard to respond to so many erroneous opinions at one time, but I will
make some general comments.

1. OEM products.  When an OEM purchases a product from us, it is their
product.  There may be manufacturing differences, there may be firmware
differences.  Most important, X-rite cannot make a change to the product
without specific agreement with the OEM.  We can, and often do, make changes
in the retail line of products.  OEMs make specific volume commitments and
often invest a significant amount of R&D and supply chain cost to implement
the product in their product line.  They also service the product.  Some
display vendors put the calibration software in the display.  They have
every right to insist on absolute customization. When I designed the Sony
Artisan hardware I had to use a completely different strategy for
suppression of static because their standards exceeded the FCC and European
standards of the time.  We also had to re-engineer the cable so that the
insulation could be physically consumed.  This required a change in the
production techniques of the cable and very specific testing for impurities.
OEM's invest heavily in their products and they need to protect that
investment.  They are not screwing the consumer.  Comments like that
indicate a total ignorance of the position that OEM's are in.

2. The small developer- Independent developers are an important part of our
business.  The question is: who supports the hardware product? A developer
like Graeme Gill "cracks" our products at "arm's length".  In the US, this
is completely legal and has been defended in court many times (the DOS BIOS
is a good example of how this worked).  He applies his technology to a
product that was legally purchased through retail channels.  What would be
illegal would be to make copies of our software and sell it using the
cracked technology or to ship our drivers, unlicensed, to the field. In the
life of a given hardware product, independent software developers will make
many upgrades and not all of them will be free.  We make money on hardware
once, and that piece of hardware is an "enabler" for many other companies
over the course of its lifetime.

I believe that there  will be a mechanism for independent developers to use
the latest technology colorimeter, purchased through retail channels, but I
need to confirm that.  Understand that the latest technology products are
significantly different than the earlier products and do require a certain
amount of training to implement properly. It's not in anyone's best interest
to open the technology to everyone who thinks they are a developer until we
understand the support and training costs.  Not everyone is a Karl or
Graeme.  Is it in the consumers' best interests to have poor software
implementations of new hardware technology?  Should we make exceptions for
certain developers who may have the knowledge and not allow other developers
into the fray?  How do we make sure that an ISV product doesn't kill a
retail product by overlaying different dll's or packages?   How do we cover
release of information that is currently under application for patent?  How
do we inform ISVs of changes that are warranted by changes in operating
systems?  How do we justify the costs of system support to the ISV community
given that we make money only on the sale of the hardware item? There is
nothing nefarious here, it just takes time and resources which are not
available at the moment.

Third-hand comments about internal corporate policy (which, by the way, is
probably covered by NDA), such as those from Mr Wagner, should not generate
the frenzy of nonsense that we have seen on this list.  Snarky and
uninformed comments like those of Mr. Segal add nothing to a solution to the
issues of ISV support. I hope that you all have a better understanding of
the situation and we can end this mindless thread started with third hand
information taken out of the context of reality.....
Tom Lianza

Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on August 19, 2011, 10:38:10 am
Yes I saw that and responded to it:

"Reality" is perceived by different people in different ways depending on, amongst other things, at which end of the spectrum they happen to sit, what their interests are, and how these matters affect them personally. While some peoples' reality is other peoples' snark, we'll see over time which perspective prevails. And if there is really critical misinformation out here, maybe the company needs to do more to correct that. Over the years of corporate history, the demise of bigger fish than XRite started when they just couldn't truly and seriously see beyond themselves and their self-perceived immediate interests.
 
Mark

In the final analysis, my perceptions don't matter. What matters is the judgment of the market in which I am only one in a cast of many thousands. But that cast and other developers around the world will decide the future of colour management alternatives. Time and the market will tell how XRite fares. Of course it's a complex reality - they've made a spectrum of stuff from lower quality to very high quality, and we all know that every product embodies compromises. However, the issues at stake here go beyond that. I'll leave it at that.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on August 19, 2011, 10:45:16 am
Just after returning to my email Inbox, I received the following from the Colorsync List:

"Mailing list removal confirmation notice for mailing list
Colorsync-users

We have received a request for the removal of your email address,
"............" from the colorsync-users@lists.apple.com mailing
list.  To confirm that you want to be removed from this mailing list,
simply reply to this message, keeping the Subject: header intact.  Or
visit this web page........"

I have notified the List administrator that this is a malicious action, I did not request to be removed from the Colorsync List and the perpetrator should be investigated. Nuff said.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on August 19, 2011, 11:16:14 am
Rich, and Andrew,

Thanks for that additional information. It would appear to me that not all the facts are out there yet. The information Rich posted confirms what Jack Bingham at Integrated Color (ColorEyes Display Pro) had said about 3rd party support for a retail version of the i1 Display Pro. However, my email to BasICColor, and the announcement that SpectraView would support the i1 Display Pro (although I'm not sure whether it was specified if it would only support the model that they are going to sell, or retail versions as well), would seem to disagree with the information Rich posted and suggest that X-Rite is indeed making exceptions. I'm not sure. I think we're going to have to wait and see how this pans out.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 19, 2011, 11:28:35 am
... and the announcement that SpectraView would support the i1 Display Pro (although I'm not sure whether it was specified if it would only support the model that they are going to sell, or retail versions as well) would seem to disagree with the information Rich posted and suggest that X-Rite is indeed making exceptions.

I don’t know either but I’m excited that there will be this new instrument available for use in SpectraView. I’m not really that concerned if it will only operate with that software package, especially if the hardware is better mated to the panels.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on August 19, 2011, 11:42:23 am
I don’t know either but I’m excited that there will be this new instrument available for use in SpectraView. I’m not really that concerned if it will only operate with that software package, especially if the hardware is better mated to the panels.

Agreed. I will get one as soon as Spectraview II or Spectraview Profiler releases a version that supports it.
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on August 19, 2011, 11:46:43 am
Andrew, thanks for posting the response from Tom Lianza.  He makes a good point that there is nothing to prevent any third party developer from writing their own driver to make new instrumentation available.  As he noted, Graeme Gill has been doing this for some time (as a one-man shop!) to provide hardware support for ArgyllCMS.  He already supports a bunch of spectros, and as users provide him with the funds to buy new instruments, he will continue this.  I know I'm a bit of a contrarian: I don't believe there is any conspiracy by X-Rite here, but I do believe there are opportunities for developers to come out with both new hardware and software solutions.  I won't be surprised at all if NEC sources some of the newer hardware and mates it to SpectraView.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on August 19, 2011, 11:54:47 am
Alan, just to be clear, I think it's useful to draw a distinction between "conspiracy" and "questionable policies". All of my concerns, and those I've seen expressed by quite a few others, whatever their factual basis, are in the latter area.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 19, 2011, 12:38:39 pm
Andrew, thanks for posting the response from Tom Lianza

There’s more just in:

Quote
As the product manager for our Display Solutions at X-Rite, let me try to
clarify a few things regarding OEM and 3rd party developer support for our
new i1D3-based colorimeters.  There seems to be lots of confusion here.
First and foremost, X-Rite absolutely does allow OEMs and 3rd party
developers access to our i1Display Pro retail instruments.  Karl Koch
already mentioned it here on the ColorSync List that BasICColor will support
our retail product and I now have permission from EIZO to announce that a
soon to be released version of ColorNavigator software will also support our
retail i1Display Pro device.  In addition, while I cannot mention their
names, I can tell you that there are more OEMs & developers currently
working on integrating our channel device into their software solutions.
So, for any other OEMs or third party developers that wish to also support
our retail i1Display Pro device, please contact devsupport@xrite.com and ask
for our “i1Display Pro License Request Form”.  Second, regarding support of
OEM versions of i1D3, this is not an X-Rite issue at all.  Any software
developer who wishes to support an OEM version of the i1D3 device must
contact and obtain this ability from that OEM.  I hope this clarifies things
and I must say it’s nice to see such demand for our new products.

Best regards,
Steve.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 19, 2011, 11:30:17 pm
SVII measures the native TRC of the display in 8, 16, 32, or 52 steps and puts correction curves in the high-bit LUT of the display. So poor dark readings of the i1Pro have an obvious and noticeable effect.

Are you sure about that - that it is actually calibrating at each measured step on the PA series? When I use some random program with some other monitor I may see weird tints in dark colors, but I never see any weird tints in the grayscale with SV II, whether I use the NEC i1D2 or the i1Pro with it.

My feeling is that with the PA series it just measures and reports along the way in those steps, but doesn't use them to send anything to the LUT; it just feeds the system the adjustments to set the color temperature at a fairly bright white, and that is it.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on August 19, 2011, 11:37:19 pm
Rich, and Andrew,

Thanks for that additional information. It would appear to me that not all the facts are out there yet. The information Rich posted confirms what Jack Bingham at Integrated Color (ColorEyes Display Pro) had said about 3rd party support for a retail version of the i1 Display Pro. However, my email to BasICColor, and the announcement that SpectraView would support the i1 Display Pro (although I'm not sure whether it was specified if it would only support the model that they are going to sell, or retail versions as well), would seem to disagree with the information Rich posted and suggest that X-Rite is indeed making exceptions. I'm not sure. I think we're going to have to wait and see how this pans out.

Cheers, Joe

And what about Argyll or Colorimetre HCFR or the independent coder? Locking things down like this, especially with items of this nature, usually ends up biting the company in the end (although Apple has managed OK I guess, but then again Apple is Apple).
Title: Re: Monitor calibration sensor evaluations
Post by: RichWagner on August 20, 2011, 12:10:25 am
Andrew, thanks for posting the response from Tom Lianza.  He makes a good point that there is nothing to prevent any third party developer from writing their own driver to make new instrumentation available.  As he noted, Graeme Gill has been doing this for some time (as a one-man shop!) to provide hardware support for ArgyllCMS.  He already supports a bunch of spectros, and as users provide him with the funds to buy new instruments, he will continue this.

There is a huge difference between software development using an SDK specific to a given device firmware and having the electrical engineering know-how to reverse-engineer the firmware.  The latter requires sophisticated and expensive electronics benchwork and know-how. Few in this field have that expertise.  Graeme's background in electrical engineering gives him an uncommon advantage. Even so, it would require significant labor, and I'm sure Graeme has other things to do that are of greater interest to the Argyll community.

Many software developers depend on the EyeOne SDK.  Their lack of access to a significant subset of EyeOne devices is particularly frustrating, as their expertise is in color science and software development, not reverse-engineering firmware.  The HP-XRite DreamColor colorimeter is a great example. I certainly never expected when I bought it that the HP-XRite DreamColor colorimeter would only function with the HP-XRite software, OR that the HP-XRite DreamColor software would be so poor and feature-deprived.  This expensive colorimeter is not supported by ANY other software - not even X-Rite's Match, ProfileMaker 5, Profiler, etc.  Software support is non-existent.  HP tech support is clueless, and X-Rite says it's HP's problem - even though the software is downloaded from X-Rite's web site. Even searching for the software on X-Rite's web site is futile.  So who wins?  HP... in the short term.  Certainly not the DreamColor monitor owner.

--Rich

Title: Re: Monitor calibration sensor evaluations
Post by: bossanova808 on August 20, 2011, 01:13:20 am
Eizo have announced support for the i1Display Pro in CN 6.1, NEC have said they plan to support it in SV2 (although not quite officially yet, I think), BasICColor have announced support, and X-Rite have clearly and publicly stated, several times now, that they have an SDK available for the X-Rite i1Display Pro that other apps can access if they are willing to license it.

Re-badged ones cannot be used in other software (so if you buy an NEC-badged one you won't be able to use it in i1Profiler, for example).

This would appear to be exactly the same scenario as with the i1Display 2 (and other calibrators for that matter too).

So seriously - what's the issue here?  It's pretty clear and it's just as it was...
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 20, 2011, 10:58:48 am
I certainly never expected when I bought it that the HP-XRite DreamColor colorimeter would only function with the HP-XRite software, OR that the HP-XRite DreamColor software would be so poor and feature-deprived.  This expensive colorimeter is not supported by ANY other software - not even X-Rite's Match, ProfileMaker 5, Profiler, etc.  Software support is non-existent.  HP tech support is clueless, and X-Rite says it's HP's problem - even though the software is downloaded from X-Rite's web site. Even searching for the software on X-Rite's web site is futile.  So who wins?  HP... in the short term.  Certainly not the DreamColor monitor owner.

Clearly X-rite is at fault here and HP as well as the buyer is immune.

Yet another reason I was glad I listened to DreamColor early adopters who knew about this technology (people like Lang and Murphy) and didn’t bite.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on August 20, 2011, 11:02:38 am
So seriously - what's the issue here? 

Good question!
Title: Re: Monitor calibration sensor evaluations
Post by: dchew on August 21, 2011, 05:04:04 pm
I plan to purchase an NEC PA241W-BK.  I currently have only a 17” MBPro, along with the X-Rite i1Pro spectrophotometer.  As discussed ad nauseam, my recent upgrade to Lion renders i1Match useless.  So I wonder if my plan should be:
-Get the NEC with the SVII for monitor profiling.
-Use some other software along with the i1pro for printer-paper profiles.

Some questions:
1.   What software should I choose for running printer-paper profiles?

2.   I figure I will try the SVII on my laptop screen also.  Given the above reported tests, the SVII may not work too well for that.  If not, I will just go back to the legacy profile I created with i1Match just prior to the Lion upgrade.  Does that sound reasonable, assuming the laptop will no longer be the critical monitor?

3.   Does my plan make sense or is there a better path forward?

Dave
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on August 21, 2011, 06:44:08 pm
I plan to purchase an NEC PA241W-BK.  I currently have only a 17” MBPro, along with the X-Rite i1Pro spectrophotometer.  As discussed ad nauseam, my recent upgrade to Lion renders i1Match useless.  So I wonder if my plan should be:
-Get the NEC with the SVII for monitor profiling.
-Use some other software along with the i1pro for printer-paper profiles.

Some questions:
1.   What software should I choose for running printer-paper profiles?

2.   I figure I will try the SVII on my laptop screen also.  Given the above reported tests, the SVII may not work too well for that.  If not, I will just go back to the legacy profile I created with i1Match just prior to the Lion upgrade.  Does that sound reasonable, assuming the laptop will no longer be the critical monitor?

3.   Does my plan make sense or is there a better path forward?

Dave
I suspect that SVII won't run on your laptop screen since it is designed for stand-alone NEC monitors that can be addressed by the software.  Since you are getting an NEC monitor, don't even bother trying to do anything with your laptop screen; it's probably a waste of time.  Depending on how much time you want to invest, ArgyllCMS (http://www.argyllcms.com/index.html) can be used to profile both papers and your screen (and a lot of other things).  A number of us who regularly post here use it, and it's free, so the price is right.  I've profiled all my papers using an i1 Pro and ArgyllCMS and am currently running my NEC monitor off an Argyll-generated profile (I also have SVII).
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on September 03, 2011, 03:50:58 pm
ArgyllCMS 1.3.4 is out, with i1Display Pro/ColorMunki Display support. A couple of very interesting and highly informative insights from Graeme Gill:
http://www.freelists.org/post/argyllcms/Argyll-V134-released

Quote from: Ernst Dinkla
> Could you comment on their quality compared to the old i1 Display II and
> the Spyder III, all used with ArgyllCMS?
Quote from: Graeme Gill
I haven't done any real comparisons with a spectrometer, but my impression is
that (at least on non-refreshed displays such as LCDs) they are pretty
good.
The filters seem very close to the standard observer in shape, and the optical
arrangement captures lots of light, giving good resolution readings, and
(in the case of the i1 Display Pro), very fast readings if the patch is
not too dark. Their precision at low light levels is excellent, although they
slow down a bit.
The optics ensures a narrow acceptance angle, and therefore they seem good
at distant measurements, such as measuring a display including flare,
or measuring projectors.

For refreshed displays such as CRTs, I think the DTP92 and DTP94
are still superior, since they synchronize properly to the refresh rate.

To get some idea about how close the filters are to the standard observer,
I did the following: For a set of display sample XYZ values of the primaries
+ white, for each pair of calibration matrices for the different display
types, I computed the CIEDE2000 between the values predicted by the two
matrices.
Over the 147 combinations, there was an average error of 0.798 DE, and a
maximum of 4.72. I think this hints that the instruments will perform very
well across different types of displays.

Quote from: Roger Breton
> To get some idea about how close the filters are to the standard observer, I
> thought one had to measure the whole 380 to 730 spectrum, using a 1nm (or
> not too larger) spectrograph, to break the spectrum into "lines", so that
> one can get the response of the colorimeter at that particular wavelength
Quote from: Graeme Gill
This information is stored in the instrument.

Quote from: Gerhard Fuernkranz
> just wondering, do these gadgets now have the individually measured
> sensitivity curves stored in their eeprom?
Quote from: Graeme Gill
Indeed they do.
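
As a rough illustration of the pairwise matrix comparison Graeme describes above, here is a small Python sketch. The sensor responses and calibration matrices are invented stand-ins, and the Lab conversion and CIEDE2000 are assumed to come from the colour-science package:

import itertools
import numpy as np
import colour  # assumed: the colour-science package

# Invented raw sensor responses for R, G, B, and white patches.
raw = np.array([[0.35, 0.18, 0.02],
                [0.30, 0.62, 0.10],
                [0.15, 0.06, 0.80],
                [0.80, 0.86, 0.92]])

# Invented per-display-type 3x3 calibration matrices (the real ones are
# derived from the sensitivity curves stored in the instrument).
rng = np.random.default_rng(0)
matrices = [np.eye(3) + 0.02 * rng.standard_normal((3, 3)) for _ in range(4)]

des = []
for A, B in itertools.combinations(matrices, 2):
    lab_a = colour.XYZ_to_Lab(raw @ A.T)  # XYZ predicted by matrix A
    lab_b = colour.XYZ_to_Lab(raw @ B.T)  # XYZ predicted by matrix B
    des.extend(np.atleast_1d(colour.delta_E(lab_a, lab_b, method='CIE 2000')))

print(f"average {np.mean(des):.3f} dE2000, max {np.max(des):.2f}")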

Title: Re: Monitor calibration sensor evaluations
Post by: RichWagner on September 03, 2011, 04:46:22 pm
Graeme also wrote:
 
Quote
Support for the X-Rite i1 Display Pro and ColorMunki Display colorimeters (i1d3).
 While these instruments seem to be quite good without specific display calibration,
 they can benefit from it, so a new type of Argyll colorimeter calibration file has
 been created (CCSS) to support the type of calibration these instruments use.
 ccmxmake has been renamed to ccxxmake and support added for creating CCSS calibration
 files using a spectrometer instrument. A new tool called i1d3ccss allows the calibration
 files that come with the instruments to be translated into CCSS files and installed.
 A CRT calibration is provided, something that is missing from the manufacturers set.
 Non-1931 2 degree observer support has been added for these instruments, taking advantage
 of the type of calibration they use.

///

Marco wrote:
I haven't understood: is it possible to load the correction directly into the instrument so that
it can be used with any software?

Which instrument do you mean?

Most colorimeters use a 3x3 calibration matrix which is stored in the
instrument's EEPROM.

The i1d3 uses a different approach: it uses spectral measurements of the
display to optimize its accuracy for that type of display. Typically
these measurements are stored in a file (.edr files for X-Rite, .ccss for
ArgyllCMS).


and

Quote
Florian Höch wrote:
Graeme, regarding the slower measurement speed of the ColorMunki
Display, do you think this a hardware difference or maybe a difference
in firmware? I'm just curious. I only know from one unofficial source
which previously stated that they should be about the same (in speed)
when used with the same software (ie. when using the ColorMunki with i1
Profiler), but it seems that was just a rumor?

My guess is that the firmware adds about a 1 second delay to each
reading (or maybe imposes a minimum of 1 second per reading). The
instruments otherwise seem to operate identically, including the
integration time set, which is a lot less than 1 second.

So in terms of differentiating the two products, X-Rite have been
smart in making it something that a different driver can't overcome.
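
The spectral approach Graeme describes above can be sketched in a few lines. In the toy Python example below every curve is invented - crude Gaussian stand-ins for the CIE observer, the sensor filters, and the display primaries - and the point is only to show how stored sensitivity curves plus display spectral data yield a per-display correction matrix:

import numpy as np

wl = np.arange(380.0, 731.0, 1.0)  # wavelengths in nm

def gauss(mu, sigma):
    # crude Gaussian stand-in for a spectral curve
    return np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

# Invented approximations of the CIE 1931 observer...
cmf = np.stack([gauss(600, 40) + 0.4 * gauss(440, 25),
                gauss(555, 45),
                1.8 * gauss(450, 25)])

# ...and of the instrument's stored channel sensitivities (the filters
# differ slightly from the observer, which is the problem being corrected).
sensor = np.stack([gauss(605, 45) + 0.3 * gauss(445, 25),
                   gauss(550, 50),
                   1.6 * gauss(455, 30)])

# Spectral power distributions of the display's R, G, B primaries
# (in practice these come from a .ccss/.edr file or a spectrometer).
spd = np.stack([gauss(620, 15), gauss(535, 20), gauss(465, 12)])

XYZ = cmf @ spd.T     # true tristimulus values of each primary (columns)
RGB = sensor @ spd.T  # what the sensor channels actually report

M = XYZ @ np.linalg.inv(RGB)  # matrix correcting this sensor for this display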
Title: Re: Monitor calibration sensor evaluations
Post by: Damir on September 05, 2011, 05:07:18 pm
OT

Ethan, is there any chance you could update the color gamut spaces for digital cameras on your site? It is very old; some scanners would be nice to see too.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on September 06, 2011, 01:58:56 pm
Graeme also wrote(...)

and also:
Quote from: Graeme Gill
Possibly the main reason someone might prefer the i1d3 over the ColorMunki
Photo is the better temperature stabilisation, which could be an advantage
in measuring large sets of test values.
http://www.freelists.org/post/argyllcms/Argyll-V134-released,17

X-Rite should seriously consider making a generous donation to ArgyllCMS. I didn't even suspect that these new sensors are so freaking cool before reading Graeme's comments on the list.

Title: Re: Monitor calibration sensor evaluations
Post by: trinityss on September 07, 2011, 02:34:59 am
I was always wondering if there was any kind of temperature compensation.
I asked X-rite support and never received an answer  :(.

Is the compensation inside the device, or is it done by ArgyllCMS (but then the device must report the temperature...)? I assume it's in the device?


Thx!
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on September 07, 2011, 10:50:14 am
and also:http://www.freelists.org/post/argyllcms/Argyll-V134-released,17

X-Rite should seriously consider making a generous donation to ArgyllCMS. I didn't even suspect that these new sensors are so freaking cool before reading Graeme's comments on the list.

Indeed they should; it was the reason I finally placed an order for an i1Display Pro. I'm not impressed with X-Rite's i1Profiler-based monitor profiling software. It has much potential, but they still need to fix a few things. Argyll, though, is quite the piece of software for profiling my non-NEC monitors, especially when paired with the Eye One Pro to generate a correction matrix. I don't think X-Rite would have anything to lose from it.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Paz on October 05, 2011, 11:38:53 pm
Ethan,

Thanks to you and Dry Creek Photo for taking the time to run these tests and publish your results.

In your article, you state:

Quote
For wide-gamut displays, the best colorimeter option is the Datacolor Spyder 3. It is reasonably accurate, but unit-to-unit performance is not as consistent as could be desired.

How would I be able to tell if a particular Spyder3 was a good one or not?   All I've heard is that 'recent' ones are all good.  If that is so, how would one be able to tell the manufacture date?

thanks,

Paz
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on October 06, 2011, 04:18:07 am
Do yourself a favor and get i1Display Pro instead.
Title: Re: Monitor calibration sensor evaluations
Post by: Paz on October 10, 2011, 01:15:18 pm
Thanks for the advice, but Ethan's research indicates the i1Display Pro would not be the best measurement device for my very bright, wide gamut, backlit RGB LED monitor.

Also, I sent an email to X-Rite a couple of weeks ago inquiring about purchasing their products and they have not replied.  It raises the question of what happens after one has actually bought their products... not to mention that I'm not thrilled that my GretagMacbeth puck has fallen victim to planned obsolescence.
Title: Re: Monitor calibration sensor evaluations
Post by: Mark D Segal on October 10, 2011, 04:48:51 pm

Also, I sent an email to X-Rite a couple of weeks ago inquiring about purchasing their products and they have not replied.  It raises the question of what happens after one has actually bought their products....

Forewarned is forearmed.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on October 10, 2011, 05:31:06 pm
Thanks for the advice, but Ethan's research indicates the i1Display Pro would not be the best measurement device for my very bright, wide gamut, backlit RGB LED monitor.
Actually, Ethan didn't test the i1Display Pro (yet). But I'm pretty sure an RGB LED-backlit display won't be a problem for this sensor.
Title: Re: Monitor calibration sensor evaluations
Post by: Paz on October 10, 2011, 11:31:13 pm
Thank you, Czornyj.  I double checked.  You're right.

I paid more attention to what did work with my type of monitor than to what did not.

Paz
Title: Re: Monitor calibration sensor evaluations
Post by: kf_tam on October 23, 2011, 01:37:21 am
Ethan's test has been updated to include the i1Display Pro, and more samples of the Discus and ColorMunki Photo.
Thanks, Ethan, for the great work  :D.
Title: Re: Monitor calibration sensor evaluations
Post by: Tim Lookingbill on October 23, 2011, 11:46:16 am
Went to Ethan's DryCreekPhoto site and couldn't find any reviews of the i1Display Pro.

Does Ethan have another site?
Title: Re: Monitor calibration sensor evaluations
Post by: alain on October 23, 2011, 12:56:19 pm
Went to Ethan's DryCreekPhoto site and couldn't find any reviews of the i1Display Pro.

Does Ethan have another site?

In the first post there is a link inside the sentence: "For the full details, read the article on our site."
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on October 26, 2011, 01:29:38 am
We finished testing the BasICColor Discus (http://www2.chromix.com/colorgear/shop/productdetail.cxsa?toolid=50140&pid=11713), X-Rite i1Display Pro (http://www.amazon.com/gp/product/B0055MBQOW/ref=as_li_tf_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B0055MBQOW), and the ColorMunki Display (http://www.amazon.com/gp/product/B0055MBQOM/ref=as_li_tf_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B0055MBQOM) a couple of weeks ago. I held off on posting the reviews publicly (although from emails and comments I see that the URLs did go out) to give the manufacturers time to review and comment on our findings. Reviews of monitor calibration software are in progress; we are waiting both for vendor comments and for the i1Display Pro and Discus to be supported by more software packages.

Our full article (http://www.drycreekphoto.com/Learn/Calibration/MonitorCalibrationHardware.html) goes into details on both the new pucks and comparisons to older ones. We have a direct comparison (http://www.drycreekphoto.com/Learn/Calibration/Details/Discus_v_i1D3.html) of the Discus to the i1Display Pro.

In summary, the new pucks are game changers. Starting with how consistent the sensors are: both the i1D3 and Discus trounce all older contenders. As you can see in the table below, the average unit-to-unit variation is, from a visual perspective, invisible, and the worst-case units are far superior to all other products. If you purchase either sensor, we see no evidence that you will get a lemon.
(http://www.drycreekphoto.com/images/calibration/SensorVariability.png)

As for absolute accuracy, both the i1Display Pro and Discus surpass even individually calibrated i1D2 models. Colorimeter or spectrophotometer - the new pucks leave 'em in the dust. The advantage held for all LCD technologies: standard or wide-gamut CCFL, white or RGB LED. If you are in the market for a monitor calibrator, any other option under $25K will be second best.
(http://www.drycreekphoto.com/images/calibration/SensorAccuracy.png)

All of which leads to the question of which sensor is best. In all our tests - unit-to-unit variability, accuracy on each panel type, and thermal stability - the Discus came out ahead of the i1Display Pro. The margin was usually at or below the limits of visibility. From a strict statistical perspective (http://www.drycreekphoto.com/Learn/Calibration/Details/Discus_v_i1D3.html#Math), the Discus was not significantly more accurate than the i1Display Pro. This statement must be taken in context: we only had five Discus samples available, so the error bars on any confidence estimates are large.

There are two areas where the Discus and i1Display Pro differ greatly. The first is cost: You can buy five i1D3 pucks and have enough left over for a decent dinner out for the price of a single Discus.

The other difference is in handling. The Discus is an imposing presence. If portability is a consideration, BasICColor's beast is not for you. Sheer size and weight do have their advantages. Once you place the Discus on the screen, that's that. Ambient light is effectively sealed out and accurate measurements are assured. The high, narrow profile of the i1Display Pro works against it in this regard. The screen, puck, and counterweight need to be adjusted so the puck sits absolutely flush on the monitor surface. Even so, we found getting the most accurate results from the i1D3 required a darkened room.

Both pucks offer excellent thermal stability (http://www.drycreekphoto.com/Learn/Calibration/Details/Discus_v_i1D3.html#Thermal). This matters because a typical CCFL-backlit monitor runs ~15°F over ambient. Measurements from many older instruments will drift during the profiling session if the puck is not placed on the screen to warm up for 15 minutes first. Eliminating such productivity parasites is always fine by me.

Until we have software in hand that drives both instruments well, the most important comparison between the i1Display Pro and Discus will remain incomplete: namely, how well, from a subjective, visual perspective, do the calibrations and profiles they generate perform? Although our measurements only showed the Discus having a slight lead over the i1Display Pro in measurement accuracy, we concentrated on performance at white, black, and a handful of intermediate points. I can make handwaving arguments about why the visual difference between the pucks should be, if not invisible, damned close to it, or the countervailing argument that small, inconsistent errors can introduce visible artifacts in real-world image applications.

For now, the only commercially released software we have to drive the Discus is BasICColor Display 4.2, and the i1Display Pro uses i1Profiler 1.1. X-Rite's software comes up short against BC Display on DDC-capable monitors. This appears to be because i1Profiler does not utilize the monitor LUTs for grayscale and gamma adjustments, relying instead only on the video card LUTs. Most better monitors have high-bit LUTs, while video card adjustments are performed in 8-bit mode. As with image editing in Photoshop, curve adjustments to 8-bit images can create artifacts that do not appear when editing in high-bit mode. Monitor profiling packages such as ColorEyes Display and BasICColor that intelligently balance monitor and video card LUT adjustments hold the upper hand in raw calibration performance over i1Profiler. That said, the actual profiling side of i1Profiler looks good; the underlying calibration is just not up to BCD and CED at their best.
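
To see why 8-bit video card curves lose levels where high-bit monitor LUTs do not, here is a tiny Python sketch (the correction curve is an assumed example, not anything a particular package computes):

import numpy as np

levels = np.arange(256) / 255.0
curve = levels ** (2.2 / 2.4)  # an assumed grayscale correction curve

# Video card path: the curve is quantized to 8 bits, so some adjacent
# input codes collapse onto the same output code - visible banding.
out_8bit = np.round(curve * 255).astype(int)
print("distinct 8-bit outputs:", np.unique(out_8bit).size)    # fewer than 256

# Monitor path: the same curve in a 10-bit LUT keeps every input distinct.
out_10bit = np.round(curve * 1023).astype(int)
print("distinct 10-bit outputs:", np.unique(out_10bit).size)  # all 256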

We should be able to make a more direct comparison shortly. BasICColor is due to release a version of Display that talks to the i1Display Pro, as is ColorEyes. I'll update when they do.

One other puck did stand out: the X-Rite ColorMunki Display (http://www.amazon.com/gp/product/B0055MBQOM/ref=as_li_tf_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B0055MBQOM). It shares the same basic hardware as the i1Display Pro but costs a third less. The software set lacks the more advanced calibration setpoints, validation functionality, and the ability to trend performance over time. The puck itself measures ambient lighting luminance but, unlike the i1D3, not color temperature. Finally, measurements poke along at one-fifth the speed of the i1Display Pro. This is not as bad as it sounds at first glance - the CM Display speed is the same as that of the older i1 Display 2.

X-Rite told us they do not plan on unlocking the CM Display for third-party software. This makes business sense, as the capability comparison to the i1D3 could make the cost differential unattractive. Nevertheless, even with the limited software set, the ColorMunki Display is the clear choice for hobbyists or others for whom cost is a prime consideration.
Title: Re: Monitor calibration sensor evaluations
Post by: 32BT on October 26, 2011, 06:01:29 am
Did you possibly also do some (subjective) tests comparing two different monitors next to each other?

Such as:
1. Given a utility monitor and an image editing monitor
2. Calibrate the editing monitor to its native white point
3. Calibrate the utility monitor to the image-editing-monitor white point

And what about calibrating the image-editing-monitor to a viewing environment whitepoint?

(Note that this could be done independently of the software. In the past I have tested monitor white point measurements in both white and gray, and while the pucks would report a virtually exact match, the visual color appearance wasn't even remotely close.)
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on October 26, 2011, 10:16:26 am
Oscar,

We approached this in two ways. First, we evaluated each sensor on a range of screens, manually adjusting each panel to as close to a 6500K white point and 150 cd/m2 luminance level as possible. These included both CCFL and LED backlight laptops, three standard gamut CCFL, four wide-gamut CCFL, one RGB LED, and two white LED displays. We used our spectroradiometer to guide the adjustments. Depending on whether DDC was available, the screens hit the setpoint to varying degrees of accuracy.

The other approach is part of our profiling software evaluations. Here we test both the ability to hit a specified white level and match to another screen.

On top-quality, highly uniform displays, if the measured difference between two screens was under 1 dE-2000, the visual match was near-perfect. On lesser displays there often are color and luminance shifts across the screen. These variations can easily exceed 5 dE-2000 on cheapo (~$300, 24") IPS panels and go even higher on TN panels. There is no way to get a visual match between such a monitor and a reference, simply because only a small portion of the screen accurately hits the calibration target. Even most high-end displays have a maximum variation of at least 2 dE-2000 edge to edge. You can see that.
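
For anyone wanting to check a cross-screen match themselves, here is a minimal Python sketch of the dE-2000 comparison between two measured whites. The XYZ numbers are invented, and it assumes the colour-science package:

import numpy as np
import colour  # assumed: the colour-science package

# Invented XYZ white measurements (normalized, Y near 1) from two screens
# calibrated to the same target.
white_a = np.array([0.9310, 0.9800, 1.0670])
white_b = np.array([0.9230, 0.9730, 1.0490])

de = colour.delta_E(colour.XYZ_to_Lab(white_a),
                    colour.XYZ_to_Lab(white_b), method='CIE 2000')
print(f"{float(de):.2f} dE2000")  # under ~1 should read as a visual match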
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 26, 2011, 01:44:35 pm
We approached this in two ways. First, we evaluated each sensor on a range of screens, manually adjusting each panel to as close to a 6500K white point and 150 cd/m2 luminance level as possible.

Ethan,

I'm curious to know, relative to the spectroradiometer you're using, where did the various pucks fall in terms of hitting a specific luminance? I've noticed with my own i1 Display Pro that it reports a significantly lower luminance level than the Spyder 3, DTP-94, NEC i1D2, and Eye One Pro. Relative to the Eye One Pro if I measure the luminance at 110 cd/m^2 (in both Argyll and i1 Profiler), the i1 Display Pro will report back somewhere in the range of 96 to 97 cd/m^2, significantly lower than where the other sensors fall. I don't think that's a big deal in the grand scheme of things but it did catch my attention.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: hjulenissen on October 26, 2011, 02:46:42 pm
Thanks for your efforts.

From your findings so far:
Does it seem that purchasing the new i1d3 to use with ArgyllCMS is a good choice, given that:
-One already has a Spyder 3 Express that never gave credible readings, and comparisons with another Spyder 3 were all over the place
-No spectrophotometer is readily available for calibration
-One has a wide-gamut, medium-cost IPS display (Dell U2711)

best regards
h
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 26, 2011, 04:18:56 pm
One more question... How close were the serial numbers on the units you got to test? I'm curious to know how consistent they are across different production runs.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on October 26, 2011, 05:33:25 pm
Quote from: shewhorn
I'm curious to know, relative to the spectroradiometer you're using, where did the various pucks fall in terms of hitting a specific luminance? ...the i1 Display Pro will report back somewhere in the range of 96 to 97 cd/m^2, significantly lower than where the other sensors fall.

One more question... How close were the serial numbers on the units you got to test? I'm curious to know how consistent they are across different production runs.
Joe - If I am correct that all your sensors other than the i1Display Pro give readings in the vicinity of 110 cd/m2 while the i1D3 reads 97, then something appears amiss. That is a larger offset from reality than we measured. A few questions: (1) Do all the other instruments read near 110? (2) Did you choose the correct panel type for the i1Display Pro measurements? Depending upon which calibration is loaded, there can be differences in readings, although not to the extent you see.

The i1D3 units we tested came from a variety of sources. Some arrived directly from X-Rite and did indeed have closely grouped serial numbers. Several units were ones we purchased, while the rest came courtesy of various vendors. In the cumulative probability plot linked here (http://www.drycreekphoto.com/Learn/Calibration/Details/Discus_v_i1D3.html#Math), there was no correlation between sensor accuracy and i1Display Pro serial number. In short, the pucks were consistent - certainly to within the 10 dE level you are seeing!

Quote from: hjulenissen
Does it seem that purchasing the new i1d3 to use with ArgyllCMS is a good choice given that...
The 2711 was one of the displays we used for evaluations. The average i1D3 measurement error was 1.4 dE-2K on white and 2.6 dE-2K on black. Only the Discus performed better (0.7/1.9 dE-2K). I would, however, recommend getting the full version (http://www.amazon.com/gp/product/B0055MBQOW/ref=as_li_tf_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B0055MBQOW) of the i1D3 rather than the CM Display, as this gives you i1Profiler as well as the ability to use Argyll. A couple of software suppliers are actively working on adding DDC control for Dell UltraSharps, the 2711 in particular. If they are successful, the resulting profiles will likely surpass Argyll's in quality.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 26, 2011, 08:02:17 pm
Joe - If I am correct that all your sensors other than the i1Display Pro give readings in the vicinity of 110 cd/m2 while the i1D3 reads 97, then something appears amiss. That is a larger offset from reality than we measured. A few questions: (1) Do all the other instruments read near 110? (2)

Well... I don't think I can say yes to that. I was going off my historical familiarity with each instrument, but it's been a while since I've gotten objective numbers. I've posted them below, but here's the quick synopsis (obviously, given the ccss file, I used Argyll to get these numbers):

Eye One Pro: 113.24 cd/m^2
i1 Display Pro (with ccss calibration): 105 cd/m^2
i1 Display Pro (without ccss calibration): 104.62 cd/m^2
i1 Display 2 (NEC, no ccss): 114.18 cd/m^2
Spyder 3 (no ccss): 98.02 cd/m^2
DTP-94: 104.49 cd/m^2 (the particular monitor in use is a wide-gamut display, which the DTP-94 can't handle, so I don't know how accurate that would be... but would it be safe to assume that for measuring white luminance, even if it's reporting things incorrectly in terms of color, it's still measuring the same amount of energy?)

[SPOCK] Fascinating [/SPOCK] So it looks like we have clusters... the i1 Display Pro and DTP-94 agree with one another, the Eye One Pro and i1 Display 2 (MDSVSENSOR II) agree with one another, and the Spyder 3 appears to be lost in space.

Quote
Did you choose the correct panel type for the i1Display Pro measurements? Depending upon which calibration is loaded, there can be differences in readings, although not to the extent you see.

When using i1 Profiler, yes. Wide Gamut CCFL. The GF is gonna poke me with a cattle prod if I don't go to the grocery store now, so I'll double-check the ∆ between the Eye One Pro and i1 Display Pro when I get back.

Quote
The i1D3 units we tested came from a variety of sources. Some arrived directly from X-Rite and did indeed have closely grouped serial numbers. Several units were ones we purchased, while the rest came courtesy of various vendors. In the cumulative probability plot linked here (http://www.drycreekphoto.com/Learn/Calibration/Details/Discus_v_i1D3.html#Math), there was no correlation between sensor accuracy and i1Display Pro serial number. In short, the pucks were consistent - certainly to within the 10 dE level you are seeing!

Excellent. Looks like you have a pretty good sampling then. Here's the complete data (note: this particular screen is out of calibration at the moment; I just measured the same white patch with each instrument to see where they all fell in terms of luminance relative to one another).

i1Display Pro with ccss file
Current calibration response:
Black level = 0.19 cd/m^2
White level = 105.00 cd/m^2
Aprox. gamma = 2.35
Contrast ratio = 539:1
White chromaticity coordinates 0.3140, 0.3320
White    Correlated Color Temperature = 6415K, DE 2K to locus =  5.7
White Correlated Daylight Temperature = 6414K, DE 2K to locus =  1.4
White        Visual Color Temperature = 6216K, DE 2K to locus =  5.5
White     Visual Daylight Temperature = 6370K, DE 2K to locus =  1.3

i1 Display Pro without ccss file:
Current calibration response:
Black level = 0.20 cd/m^2
White level = 104.62 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 535:1
White chromaticity coordinates 0.3081, 0.3352
White    Correlated Color Temperature = 6702K, DE 2K to locus = 10.6
White Correlated Daylight Temperature = 6696K, DE 2K to locus =  7.4
White        Visual Color Temperature = 6278K, DE 2K to locus = 10.2
White     Visual Daylight Temperature = 6422K, DE 2K to locus =  7.1

Eye One Pro (straight out of the case, onto the calibration plate, then on to the screen and immediately measured... usually I leave it on the screen for a little while to stabilize)
Current calibration response:
Black level = 0.22 cd/m^2
White level = 113.24 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 525:1
White chromaticity coordinates 0.3195, 0.3375
White    Correlated Color Temperature = 6105K, DE 2K to locus =  5.9
White Correlated Daylight Temperature = 6104K, DE 2K to locus =  1.5
White        Visual Color Temperature = 5925K, DE 2K to locus =  5.6
White     Visual Daylight Temperature = 6063K, DE 2K to locus =  1.4
The instrument can be removed from the screen.


i1 Display 2 (MDSVSENSOR II NEC)
Black level = 0.22 cd/m^2
White level = 114.18 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 515:1
White chromaticity coordinates 0.3261, 0.3504
White    Correlated Color Temperature = 5770K, DE 2K to locus =  9.6
White Correlated Daylight Temperature = 5770K, DE 2K to locus =  6.0
White        Visual Color Temperature = 5502K, DE 2K to locus =  9.3
White     Visual Daylight Temperature = 5617K, DE 2K to locus =  5.8

Spyder 3
Current calibration response:
Black level = 0.25 cd/m^2
White level = 98.02 cd/m^2
Aprox. gamma = 2.33
Contrast ratio = 392:1
White chromaticity coordinates 0.3091, 0.3354
White    Correlated Color Temperature = 6643K, DE 2K to locus = 10.2
White Correlated Daylight Temperature = 6638K, DE 2K to locus =  6.9
White        Visual Color Temperature = 6245K, DE 2K to locus =  9.9
White     Visual Daylight Temperature = 6388K, DE 2K to locus =  6.6

DTP-94
Current calibration response:
Black level = 0.20 cd/m^2
White level = 104.49 cd/m^2
Aprox. gamma = 2.34
Contrast ratio = 522:1
White chromaticity coordinates 0.3293, 0.3347
White    Correlated Color Temperature = 5642K, DE 2K to locus =  2.9
White Correlated Daylight Temperature = 5643K, DE 2K to locus =  7.1
White        Visual Color Temperature = 5719K, DE 2K to locus =  2.7
White     Visual Daylight Temperature = 5859K, DE 2K to locus =  6.8
The instrument can be removed from the screen.


Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: tony22 on October 26, 2011, 08:40:22 pm
Ethan, my question may be out in left field but I'll ask it anyway. ;D How would you evaluate the i1Display Pro compared to the old Chroma5 colorimeter for calibration of a plasma HT display? The Chroma5 (once calibrated to NIST standards) is a particularly good match for that kind of work. Excellent performance at very low luminosities. If the new i1DP is better in that area and has great color accuracy it might be good for this purpose.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on October 26, 2011, 11:48:47 pm
Joe: You are correct that wide-gamut measurement errors in the DTP-94 are mainly in color (the x/y chromaticity coordinates Argyll reports) rather than luminance. The with/without ccss luminance values from the i1D3 are within measurement noise, particularly if the location on the screen shifted even slightly. I don't know what to make of the i1D2 and i1 Pro luminance values. We could not get any reply from NEC about the calibration process they use for their OEM-branded i1D2 units. Some vendors individually calibrated each puck, while others simply added generic correction matrix values. No clue about NEC. I am suspicious here about the cal. on the i1D2 and i1Pro. How long since your i1Pro went back to X-Rite for calibration? If you want to play the mailing game, send me your i1Pro and we'll measure it.

The chromaticity differences between the i1Pro and i1Display Pro + ccss are small: 1.7 dE-2K. You are right about the Spyder 3. It looks best suited to being a futuristic paperweight. Did you buy it before 2010? If so, those were the days of random Spyder performance.



Tony: I haven't a clue about the Chroma5 and plasma displays. The Chroma5 design was way ahead of its time in terms of filter set, thermal stabilization, and low-level resolution. I think the guy now at X-Rite responsible for much of the i1Display Pro design was running things at Sequel in the Chroma5 days (Tom Lianza, who also moonlights as the chair of the ICC). The people at SpectraCal (http://store.spectracal.com/) could answer your comparison questions. They now support the i1Display Pro for their CalMAN A/V calibration product and have years of experience with the Chroma5.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 27, 2011, 03:39:09 am
Joe: You are correct that wide-gamut measurement errors in the DTP-94 are mainly in color (the x/y chromaticity coordinates Argyll reports) rather than luminance. The with/without ccss luminance values from the i1D3 are within measurement noise, particularly if the location on the screen shifted even slightly. I don't know what to make of the i1D2 and i1 Pro luminance values. We could not get any reply from NEC about the calibration process they use for their OEM-branded i1D2 units. Some vendors individually calibrated each puck, while others simply added generic correction matrix values. No clue about NEC. I am suspicious here about the cal. on the i1D2 and i1Pro. How long since your i1Pro went back to X-Rite for calibration?

The i1Pro was due for its annual checkup on 5/4/2011, so it's possible that it needs a tweak. The i1D2 was actually just recently replaced by NEC. The previous puck had started to produce magenta casts. I'm sure it's a refurb, but in theory it's been checked (of course, that doesn't mean it couldn't have slipped through QA).

Quote
If you want to play the mailing game, send me your i1Pro and we'll measure it.

Let me know what you think at the end of this... the second pass of results only leaves room for a few conclusions...

1) The monitor I used for the tests in the previous post has some CCFLs that are beginning to go whackadoodle (although I think I would notice a +10 cd/m^2 fluctuation, and after multiple sequential measurements I wouldn't have expected to see any kind of consistency, but I did).
2) User error - I thought I was being cautious with placement, attempting to line up the sensor aperture with the middle of the patches that Argyll puts up. Perhaps I wasn't as precise as I thought I was being. This time around I created a perfectly centered target in Photoshop (with dimensions of 2560x1600 and 1920x1200) and made that my background. I used this target as a guide to align the sensor apertures (I also ran the tests on my NEC, which is significantly more linear).
3) I had less than a glass of wine with dinner, but perhaps I had more than I thought I did?  ;D

Option 2 would seem to be the most plausible explanation.

Quote
The chromaticity differences between the i1Pro and i1Display Pro + ccss are small: 1.7 dE-2K. You are right about the Spyder 3. It looks best suited to being a futuristic paperweight. Did you buy it before 2010? If so, those were the days of random Spyder performance.

The Spyder 3 was actually recently replaced as well. My original was generating some green casts in the shadows. Datacolor sent me a new one (no questions asked) in... maybe May? Clearly this copy is not very good at measuring shadows. The reported contrast ratios across all the other sensors are very consistent, but the Spyder 3 indicates a significantly lower contrast ratio due to the higher black point it's reporting (I ran it multiple times and the numbers coming back were consistent, so it's not like the Eye One Pro, which kicks back different numbers if you take a snapshot at any single point in time).
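To make the arithmetic explicit: contrast ratio is just white luminance divided by black luminance, so a small absolute error in the black reading swings the ratio a lot. A quick Python check with numbers of the same order as the Argyll results below:

# Contrast ratio = white / black, so ~0.1 cd/m^2 of black-point error
# is enough to knock roughly 40 points off the reported ratio.
white = 101.75              # cd/m^2, about the Spyder 3 white level below
for black in (0.50, 0.62):  # DTP-94-ish vs. Spyder 3-ish black readings
    print(f"black = {black:.2f} cd/m^2 -> contrast {white / black:.0f}:1")
# black = 0.50 cd/m^2 -> contrast 204:1
# black = 0.62 cd/m^2 -> contrast 164:1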

So more results... Here's what i1 Profiler has to say:

1st pass on HP
Eye One Pro = 103 cd/m^2
i1 Display Pro = 102 cd/m^2

2nd pass a few hours later on NEC:
Eye One Pro = 99 cd/m^2
i1 Display Pro = 101 cd/m^2

Well that isn't what I saw before. Hmmm.... I must be going Looney Tunes. I swear there was a pretty big difference between the Eye One Pro and the i1 Display Pro, and indeed there does appear to be in Argyll, at least with the previous tests. Let's try this again in Argyll. This time all the sensors had time to bake on screen for about 20 minutes, and I used an NEC 2690 (previous tests were done on an HP LP3065... the NEC's luminance across the screen is much more linear, so that, along with the aid of the target as the background, should help mitigate placement inconsistencies a bit)


Eye One Pro:
Current calibration response:
Black level = 0.48 cd/m^2
White level = 106.24 cd/m^2
Aprox. gamma = 2.14
Contrast ratio = 220:1
White chromaticity coordinates 0.3165, 0.3324
White    Correlated Color Temperature = 6279K, DE 2K to locus =  4.4
White Correlated Daylight Temperature = 6280K, DE 2K to locus =  0.3
White        Visual Color Temperature = 6136K, DE 2K to locus =  4.2
White     Visual Daylight Temperature = 6288K, DE 2K to locus =  0.3


_______________________________

i1 Display Pro (without ccss):
Current calibration response:
Black level = 0.49 cd/m^2
White level = 100.17 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 203:1
White chromaticity coordinates 0.3038, 0.3310
White    Correlated Color Temperature = 6977K, DE 2K to locus = 10.6
White Correlated Daylight Temperature = 6970K, DE 2K to locus =  7.5
White        Visual Color Temperature = 6504K, DE 2K to locus = 10.3
White     Visual Daylight Temperature = 6660K, DE 2K to locus =  7.2

_______________________________

i1 Display Pro (with ccss):
Current calibration response:
Black level = 0.50 cd/m^2
White level = 100.58 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 203:1
White chromaticity coordinates 0.3104, 0.3274
White    Correlated Color Temperature = 6643K, DE 2K to locus =  5.0
White Correlated Daylight Temperature = 6642K, DE 2K to locus =  0.5
White        Visual Color Temperature = 6452K, DE 2K to locus =  4.8
White     Visual Daylight Temperature = 6623K, DE 2K to locus =  0.5

_______________________________

i1 Display 2
Current calibration response:
Black level = 0.57 cd/m^2
White level = 111.27 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 197:1
White chromaticity coordinates 0.3237, 0.3472
White    Correlated Color Temperature = 5883K, DE 2K to locus =  9.1
White Correlated Daylight Temperature = 5882K, DE 2K to locus =  5.5
White        Visual Color Temperature = 5618K, DE 2K to locus =  8.8
White     Visual Daylight Temperature = 5738K, DE 2K to locus =  5.3

_______________________________

Spyder 3
Current calibration response:
Black level = 0.62 cd/m^2
White level = 101.75 cd/m^2
Aprox. gamma = 2.13
Contrast ratio = 163:1
White chromaticity coordinates 0.3171, 0.3305
White    Correlated Color Temperature = 6262K, DE 2K to locus =  2.7
White Correlated Daylight Temperature = 6263K, DE 2K to locus =  2.2
White        Visual Color Temperature = 6176K, DE 2K to locus =  2.6
White     Visual Daylight Temperature = 6335K, DE 2K to locus =  2.1

_______________________________

DTP-94
Current calibration response:
Black level = 0.50 cd/m^2
White level = 103.03 cd/m^2
Aprox. gamma = 2.15
Contrast ratio = 206:1
White chromaticity coordinates 0.3340, 0.3388
White    Correlated Color Temperature = 5431K, DE 2K to locus =  2.8
White Correlated Daylight Temperature = 5430K, DE 2K to locus =  7.0
White        Visual Color Temperature = 5500K, DE 2K to locus =  2.7
White     Visual Daylight Temperature = 5625K, DE 2K to locus =  6.8


Alright... thought I was being pretty careful with placement before (this time I'm being even more careful)....

Okay... let's try something else...

Spectraview II (on the NEC 2690; I probably should have used the 2690 before. I'm using a target to align the sensors this time around, so even if I'm off, the difference will be less than 0.5 cd/m^2 within the measurement circle):
Eye One Pro - 104.4 cd/m^2 @ 6135ºK CIE Coordinates=0.3192,0.3339
i1 Display Pro - 101.3 cd/m^2 @ 6526ºK CIE Coordinates=0.3124,0.3287
i1 Display 2 (MDSVSENSOR II) - 111.9 cd/m^2 @ 5887ºK CIE Coordinates=0.3239,0.3417 (this right here is a surprise... the Eye One Pro and this sensor were usually within 250ºK of one another... I went for a 2nd opinion on this one with BasICColor Display and it pretty much agreed, 110.88 cd/m^2 @ 5822ºK... a 3rd opinion from Argyll is in that ballpark too)
DTP-94 - 104.7 cd/m^2 @ 5373ºK CIE Coordinates=0.3353,0.3396
Spyder 3 - 102.4 cd/m^2 @ 6061ºK CIE Coordinates=0.3208,0.3326
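As a sanity check on those color temperatures: they follow directly from the xy coordinates. McCamy's well-known approximation (the constants below are McCamy's, not anything SpectraView publishes) reproduces the reported values to within a few K. A Python sketch:

# McCamy's CCT approximation from CIE 1931 (x, y); good to a few K
# near the daylight locus, which is where all of these readings sit.
def cct_mccamy(x, y):
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

print(round(cct_mccamy(0.3124, 0.3287)))  # ~6524, i1 Display Pro (reported 6526K)
print(round(cct_mccamy(0.3353, 0.3396)))  # ~5376, DTP-94 (reported 5373K)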

What to make of it? I'm not sure. Maybe I need to do this a few more times, being more methodical, before I draw any conclusions. As for sending my Eye One Pro over your way, I'd be very interested in knowing how it does compared to everything else you've tested BUT... seeing as it's past its recommended recalibration appointment, I'm not sure if it would be terribly useful for you in terms of adding another instrument to your results. That said, you're welcome to any of my sensors if you need to get more data points for your comparisons.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: hjulenissen on October 27, 2011, 06:51:38 am
The 2711 was one of the displays we used for evaluations. The average i1D3 measurement error was 1.4 dE-2K on white and 2.6 dE-2K on black. Only the Discus performed better (0.7/1.9 dE-2K). I would, however, recommend getting the full version (http://www.amazon.com/gp/product/B0055MBQOW/ref=as_li_tf_tl?ie=UTF8&tag=drycreekphoto-20&linkCode=as2&camp=217145&creative=399373&creativeASIN=B0055MBQOW) of the i1D3 rather than the CM Display, as this gives you i1Profiler as well as the ability to use Argyll. A couple of software suppliers are actively working on adding DDC control to Dell Ultrasharps, the 2711 in particular. If they are successful, the resulting profiles will likely surpass Argyll's in quality.
This is very interesting!

I had not even considered that it would be possible to control the display LUT from the computer as long as Dell never advertised that feature. Do you have any references to such discussions?

I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?
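To make that concrete, here is a toy Python sketch (not Argyll's actual pipeline) of why the same mild 1D correction loses levels through an 8-bit link but survives in a 10/12-bit display-side LUT:

import numpy as np

# A mild gamma tweak pushed through an 8-bit LUT ahead of the DVI link
# collapses distinct input codes (banding); the identical curve in a
# 12-bit display-side LUT keeps all 256 inputs distinct.
codes = np.arange(256)
curve = (codes / 255.0) ** (2.2 / 2.4)     # hypothetical 2.2 -> 2.4 correction

lut8  = np.round(curve * 255).astype(int)  # video-card LUT, 8-bit output
lut12 = np.round(curve * 4095).astype(int) # display LUT, 12-bit output

print("distinct outputs, 8-bit: ", len(np.unique(lut8)))   # < 256 -> banding
print("distinct outputs, 12-bit:", len(np.unique(lut12)))  # 256, none lost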

Being able to automatically switch between a calibrated sRGB response for day-to-day use and a wide-gamut native response whenever a color-management aware application was running (e.g. Lightroom) would really mean a lot to me. Where do I donate the money? :-)

-h

Edit:
Found a link or two that seems relevant:
http://answerpot.com/showthread.php?712963-10-but+LUTs+and+the+Dell+U2711
http://forums.adobe.com/message/3991911
http://colorhacks.blogspot.com/2008/02/monitors-with-internal-luts.html
http://ddccontrol.sourceforge.net/
http://forums.entechtaiwan.com/index.php?topic=7463.0
http://en.community.dell.com/support-forums/peripherals/f/3529/t/19363520.aspx
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on October 27, 2011, 10:22:59 am
I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?

In a conversation yesterday with display guru Karl Lang, who produced the PressView and Sony Artisan, he stated that a display LUT is always preferable and that software LUTs are typically not the way to go for a number of reasons. Hopefully we can convince Karl to come out and discuss this here in more detail. The bottom line he proposed is that the approach used by units like NEC's, which set a linear LUT in the graphic system and then apply the corrections in the panel LUT, is the way to handle these tasks (for example, he stated that it's impossible to control contrast ratio and black at the same time using the graphic system LUT, if I understood correctly).

We can discuss differences in dE between devices, but if the process is creating corrections on the graphic system instead of in a high-bit LUT in the panel, the differences are moot.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on October 28, 2011, 01:37:49 am
A set of comments:

Joe: I'm not sure what is going on there. The approach we use to collect data is as follows: we record 100 readings and average (hacked versions of Argyll and a tool provided by X-Rite).
Your latest data make everything but the i1D2 look reasonable. Luminance readings are fairly close - as expected - and the color temperature variations are certainly within the sensor accuracy values we saw. I would try the NEC sensor with NEC's software. If I remember correctly, SVII applies a correction matrix known only to BasICColor and NEC.


This is very interesting!

I had not even considered that it would be possible to control the display LUT from the computer as long as Dell never advertised that feature. Do you have any references to such discussions?

I am guessing that the "increased quality" over Argyll would be in _applying_ the correction in >8 bit precision display firmware instead of prior to an 8-bit DVI link, not so much in _estimating_ the display behaviour? Given that, it should (in principle) still be possible to estimate the display response using Argyll, and then somehow upload the correction using DDC and some other software?
Dell makes offhand mention of DDC in the online documentation (http://support.dell.com/support/edocs/MONITORS/U2711b/en/ug/about.htm). From what I have been told, the problem with DDC and Dell stems from Dell's transfer of all firmware development to a new group when the transition from CRT to LCD occurred. Some LCD Dell monitors returned an EDID (the "Extended Display Identification Data" tag used to identify a particular panel) for a previous generation CRT. There are other quirks with Dell's DDC implementation, of which their own engineers may or may not be aware.

DDC offers several advantages. First, adjustments to the monitor usually use a finer scale than you get through the OSD. Second, even with monitors having only 8-bit internal LUTs, making some of the necessary corrections in the monitor LUT often produces fewer calibration artifacts. Finally, for high-bit monitors, adjustments that create visible banding when performed only through the video card are smooth as can be when made in the monitor.


Quote from: digitaldog
Having a conversation yesterday with display guru Karl Lang who produced the Pressview and Sony Artisan, he stated that a display LUT is always preferable and that software LUTs are typically not the way to go for a number of reasons. Hopefully we can convince Karl to come out and discuss this here in more detail. The bottom line he proposed is that units like NEC which provide a linear LUT in the graphic system then apply the corrections in the panel LUT is the way to handle these tasks (for example, he stated that its impossible to control contrast ratio and black at the same time using the graphic system LUT if I understood correctly).

I'll agree with Karl for the most part. The hybrid approach used by Frank Herbert and the ICS folks of splitting adjustments between both the video card and monitor gave smoother gradients and better shadow resolution on an Artisan than GMB/Sony's own code. You can control contrast ratio and black level through the video card alone, although you may not like the resulting posterization.
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 28, 2011, 03:06:17 am
We record 100 readings and average (hacked versions of Argyll and a tool provided by X-Rite).

Haven't written a single line of code since September of 2002 when I left tech but... if it's in C or C++ I'm guessing that would be a fairly straightforward change to make. I might have to dust off gcc (errr... make that, install it first... I figured it would be part of the standard UNIX install on OS X).
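For what it's worth, the averaging itself is trivial in any language. A sketch of the idea in Python, with read_xyz() as a hypothetical stand-in for whatever measurement call the instrument SDK actually exposes:

# Average N readings to beat down instrument noise.
def average_xyz(read_xyz, n=100):
    total = [0.0, 0.0, 0.0]
    for _ in range(n):
        total = [t + s for t, s in zip(total, read_xyz())]
    return tuple(t / n for t in total)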

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 28, 2011, 03:33:27 am
Dell makes offhand mention of DDC in the online documentation (http://support.dell.com/support/edocs/MONITORS/U2711b/en/ug/about.htm). From what I have been told, the problem with DDC and Dell stems from Dell's transfer of all firmware development to a new group when the transition from CRT to LCD occurred. Some LCD Dell monitors returned an EDID (the "Extended Display Identification Data" tag used to identify a particular panel) for a previous generation CRT. There are other quirks with Dell's DDC implementation, of which their own engineers may or may not be aware.

On a somewhat related but slightly tangential note, have you ever seen curious behaviors in any of Dell's monitors with regard to how the monitor LUTs behave? I tested an Asus PA246Q. All things considered, for a sub-$500 monitor it had some nice features... 12-bit LUT, 10-bit panel... it had two fatal flaws though:

1) It couldn't go lower than 135 cd/m^2
2) ANY change to the monitor LUTs resulted in an alarming increase in ∆E... as in average ∆E in the 10+ range.

I first noticed this when using CEDP to control it via DDC. The validation results could have been the lead character in a horror film. Multiple attempts produced identical results. DDC being what it is, I figured their implementation may not have been standard, so I reset the monitor and decided to do it the old-fashioned way, by adjusting the RGB gains and offsets by hand. This produced the same results. If you set the RGB levels to anything other than 0,0,0, the average ∆E would jump to 10+. Didn't matter if it was 1,0,0 or -20,+17,-4... any change would result in a massive average ∆E.

I found this to be a rather curious behavior. A month later I tested a Dell U2410. SAME EXACT behavior. Similar specifications... 12-bit LUT, 10-bit panel (I believe they both used the same panel)... I have a sneaking suspicion that Dell and Asus OEMed the guts and contracted the engineering from the same place. That's just too much of a coincidence.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: hjulenissen on October 28, 2011, 03:35:38 am
I'll agree with Karl for the most part. The hybrid approach used by Frank Herbert and the ICS folks of splitting adjustments between both the video card and monitor gave smoother gradients and better shadow resolution on an Artisan than GMB/Son'y own code. You can control contrast ratio and black level through the video card alone, although you may not like the resulting posterization.
Doing stuff PC-side offers flexibility and adaptivity. Doing stuff display-side means that you (potentially) avoid requantizing to 8 bits.

Seems to me that doing 1D 8->10/12-bit gamma on the native primaries in the monitor LUT, and perhaps large-scale channel gains (brightness, white point), makes sense, while doing complex rotations is probably a job for your favourite CM-aware application (e.g. Photoshop), which has complete access to 1) the characteristics of the camera used and 2) the profile of the subsequent image pipeline.
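Roughly, that split looks like the following toy Python sketch. The matrix is made up for illustration; in practice it would be derived from the measured primaries of the actual panel.

import numpy as np

# Illustrative sRGB -> native-primaries matrix (invented numbers).
M = np.array([[0.8225, 0.1774, 0.0000],
              [0.0332, 0.9669, 0.0000],
              [0.0171, 0.0724, 0.9108]])

def cmm_stage(rgb_linear):
    # Full 3D colour "rotation" - PC-side, in the CM-aware application.
    return M @ rgb_linear

def display_lut_stage(rgb_linear, gamma=2.2, bits=12):
    # Per-channel 1D tone curve - display-side, applied at high bit depth.
    return np.round(np.clip(rgb_linear, 0, 1) ** (1 / gamma) * (2**bits - 1))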

-h
Title: Re: Monitor calibration sensor evaluations
Post by: hjulenissen on October 28, 2011, 03:38:29 am
I first noticed this when using CEDP to control it via DDC. The validation results could have been the lead character in a horror film. Multiple attempts produced identical results. DDC being what it is, I figured their implementation may not have been standard so I reset the monitor and figured I'd do it the old fashioned way, by adjusting the RGB gains and offsets by hand. This produced the same results. If you set the RGB levels to anything other than 0,0,0, the average ∆E would jump to 10+. Didn't matter if it was 1,0,0 or -20,+17,-4... any change would result in massive average ∆E.
What is CEDP?

Could it be that all/most internal processing is applied as a modification of the internal LUT, and that the factory (unknown) LUT is very irregular to compensate for nasty panel behaviour?

-h
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 28, 2011, 04:12:22 am
What is CEDP?

ColorEyes Display Pro

Quote
Could it be that all/most internal processing is applied as a modification of the internal LUT, and that the factory (unknown) LUT is very irregular to compensate for nasty panel behaviour?

I don't think so. The panel it uses is the same as the NEC PA241W, Eizo CG243W, Eizo CG245W and a few others, so it's keeping some pretty good company. That doesn't mean it will perform the same as those monitors. I certainly don't expect it to have the same luminance uniformity as the Eizo and NECs, and the quality of CCFL they use could certainly have an impact as well, but I don't think that's enough to cause the performance to be so off base that it requires drastic corrections like that. I'm speculating of course, so I could be wrong, but it would seem unlikely.

Just an additional note... unlike the Asus, the Dell U2410 was able to get down to quite useable luminance levels and had enough headroom underneath (wouldn't that be footroom?) that you weren't running it at an extreme. If I remember correctly, with the luminance all the way down the monitor was at 89 cd/m^2 (I'd have to double-check that; at any rate I remember being surprised, as Dells have historically required a welder's mask).

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: ThDo on October 30, 2011, 02:47:54 am
@shewhorn

Here are my SpectraView II values.

Spyder 3: CIE 0.306, 0.338 at 114.24 cd/m²
i1 Pro: CIE 0.313, 0.335 at 116.63 cd/m²
i1 Display Pro: CIE 0.313, 0.330 at 114.33 cd/m²

What did you do to see 4 decimal places?

Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 31, 2011, 02:56:43 am
What did you do to see 4 decimal places?

In target settings, click on edit, and then select custom for white point, then click edit. This brings up the measurement screen. This screen gives you 4 decimal places vs. the colorimeter window, which only gives you 3.

Ethan... Here's another go, all in Spectraview II this time (on the NEC 2690).

Eye One Pro = 106.32 cd/m^2 @ 6145ºK
i1 Display Pro = 100.19 cd/m^2 @ 6427ºK
i1 Display 2 (MDSVSENSOR II) = 112.08 cd/m^2 @ 6050ºK
Spyder 3 = 101.69 cd/m^2 @ 6100ºK
DTP-94 = 102.41 cd/m^2 @ 5500ºK

For quick reference here's the results from a few days ago:

Eye One Pro - 104.4 cd/m^2 @ 6135ºK
i1 Display Pro - 101.3 cd/m^2 @ 6526ºK
i1 Display 2 (MDSVSENSOR II) - 111.9 cd/m^2 @ 5887ºK
Spyder 3 - 102.4 cd/m^2 @ 6061ºK
DTP-94 - 104.7 cd/m^2 @ 5373ºK

So for color temperature, the i1 Display Pro is consistently reading a bit cooler than the Eye One Pro, i1 Display 2 (MDSVSENSOR II), and Spyder 3 (which are all relatively close to one another; we'll consider the results of the DTP-94 invalid for color temp).
For luminance, this time around the i1 Display Pro, the Spyder 3, and the DTP-94 are relatively close, the Eye One Pro falls in the middle of the pack, and the i1 Display 2 (MDSVSENSOR II) is reading higher than everything else.

I used a slightly different methodology today... Not only were all the pucks plugged in and on the screen for quite a while like they were before, I also ran them in continuous measurement mode until the values stabilized. I figured being plugged in and on the screen for a long time would be enough to get them to operating temperature, but the Eye One Pro's reported color temp changes quite a bit... it takes maybe 5 minutes to stabilize, and the reading can change by 150ºK. All of the pucks' color temp readings changed while taking continuous measurements, with the exception of the Spyder 3, which pretty much didn't change. With the Eye One Pro, the i1 Display 2 and the DTP-94, the reported color temperature got cooler as the pucks heated up, while the i1 Display Pro's reading got warmer (although its change was smaller than the others'). The luminance readings were all pretty consistent.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on October 31, 2011, 08:04:23 am
Joe, thanks for the work on this.  I've noticed that it can take a while for various instruments to stabilize as well. Most of my calibration is done (NEC P221 with Spectraview) with the i1 Display 2 (MDSVSENSOR II), and I find that the first calibration run invariably gives very poor results with a high delta E. Second and third runs repeated right after are stable, and readings seldom differ by more than 1%. Before I sold my ColorMunki, I compared it with the i1 Display 2 and they were pretty much equivalent. Since I'm satisfied with the current calibration approach, I have not experimented with my i1 Pro but maybe will do so the next time it is due for a calibration.

Alan
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on October 31, 2011, 08:41:57 am
I used a slightly different methodology today... Not only were all the pucks plugged in and on the screen for quite a while like they were before, I also ran them in continuous measurement mode until the values stabilized. I figured being plugged in and on the screen for a long time would be enough to get them to operating temperature but the Eye One Pro's reported color temp changes quite a bit... it maybe takes about 5 minutes for it to stabilize and the reading can change by 150ºK.

Did you recalibrate the i1Pro after it was stabilized?
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on October 31, 2011, 09:13:08 am
I find that the first calibration run invariably gives very poor results with a high delta E. Second and third runs repeated right after are stable, and readings seldom differ by more than 1%.

It's something different - the CCFL backlight also needs some time to stabilize, so it may drift a bit in the time between calibration and validation. That's why you get consistent results and lower deltas in subsequent runs. You can also check "Extended luminance stabilization time" in the SpectraView II Preferences>Calibration tab.
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on October 31, 2011, 10:35:32 am
You can also check "Extended luminance stabilization time" in the SpectraView II Preferences>Calibration tab.

Yup, I always have that on. It slows the process way down but who cares. As long as you don’t have a screen saver come on during this longer process, set it and forget it.
Title: Re: Monitor calibration sensor evaluations
Post by: Alan Goldhammer on October 31, 2011, 10:45:51 am
It's something different - the CCFL backlight also needs some time to stabilize, so it may drift a bit in the time between calibration and validation. That's why you get consistent results and lower deltas in subsequent runs. You can also check "Extended luminance stabilization time" in the SpectraView II Preferences>Calibration tab.
Thanks for the tip; I just ticked the box and will try this next time I calibrate. Andrew's point about the screensaver is a good one (though I always turn mine off while calibrating).
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 31, 2011, 08:13:27 pm
Did you recalibrate the i1Pro after it was stabilized?

As fast as I possibly could.... Closed the "colorimeter" window (which was taking continuous measurements), put the Eye One Pro on the calibration tile and covered it (not using the tile... just using the lack of light when on the tile), quit the application and restarted it, did the cal, realigned the sensor, and started calibration. So yes! Even so, I've noticed that having the Eye One Pro out of continuous measurement mode, even for 30 seconds, will cause the color temp to drift a bit. I knew it was sensitive to heat but I never knew just how sensitive it was!

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 31, 2011, 08:15:46 pm
It's something different - CCFL backlight also needs some time for stabilisation, so it may drift a bit in a period of time between calibration and validation. That's why you get consistent results and lower deltas in next runs - you can also check "Extended luminance stabilization time" in SpectraView II Preferences>Calibration tag.

I've noticed most larger CCFL-lit monitors can take around 45 minutes to stabilize. For this reason I never bother calibrating and profiling in the morning. When I need to cal, it's the very last thing I do at the end of the day before I turn the machine off; that way I never have to guess whether or not the monitor is warmed up.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: shewhorn on October 31, 2011, 08:18:46 pm
Joe, thanks for the work on this. 

My pleasure, although... Ethan deserves all the thanks here. When you consider all the pucks and monitors involved, I'm guessing the number of hours he's put into this is probably around the 160-hour mark if not more. This information is of tremendous value.

Cheers, Joe
Title: Re: Monitor calibration sensor evaluations
Post by: ThDo on November 01, 2011, 02:52:21 am
Any clues why you get such a huge difference between i1 Pro and i1 Display Pro?

I have tried your approach and get the following values.

i1 Display Pro
6509K at 107.94 cd/m²

i1 Pro
6482K at 109.32 cd/m²

Spyder 3
6976K at 108.54 cd/m²

Title: Re: Monitor calibration sensor evaluations
Post by: Ernst Dinkla on November 01, 2011, 04:28:16 am
Ethan deserves all the thanks here. This information is of tremendous value.

Cheers, Joe

Yes, the testing and this thread are the best information available for anyone who needs color-correct monitors. Ethan has done a brilliant job.


with kind regards, Ernst

330+ paper white spectral plots including the Canon US catalog:

http://www.pigment-print.com/spectralplots/spectrumviz_1.htm
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on November 01, 2011, 09:52:36 am
Thanks to all for the kind words. Much appreciated.

Joe: I'll look into your latest data later - typing on my phone while waiting for a flight isn't conducive to number crunching. Taking a SWAG at it, the numbers look reasonable for luminance. Color temp I am not so sure about. The variation you see between the i1Pro and i1D3 is certainly within the range we have seen. The NEC puck just plain looks bad. The luminance value should be more in line with the rest. Shoot me a PM with your snail-mail and I will send you an i1D3 we measured as being in the middle of the pack.

Other quick comments: The huge offset of the i1d2 from the other instruments is not surprising. See the chart in the first post of this lengthy thread. The unit-to-unit variability we measured on the i1d2 is large. As far as I can ascertain, NEC does not calibrate each i1d2 individually.

An i1 Pro is one of the more temperature sensitive instruments. Letting it stabilize at monitor temp and then calibrating is essential for accuracy.
Title: Re: Monitor calibration sensor evaluations
Post by: fetish on November 11, 2011, 03:00:40 pm
This thread is one of the most informative and wonderful threads I've read. Enjoyed it immensely.


Dell makes offhand mention of DDC in the online documentation (http://support.dell.com/support/edocs/MONITORS/U2711b/en/ug/about.htm). From what I have been told, the problem with DDC and Dell stems from Dell's transfer of all firmware development to a new group when the transition from CRT to LCD occurred. Some LCD Dell monitors returned an EDID (the "Extended Display Identification Data" tag used to identify a particular panel) for a previous generation CRT. There are other quirks with Dell's DDC implementation, of which their own engineers may or may not be aware.

Slightly off the topic, but yes, it seems that quite a few Dell UltraSharp monitors have DDC built in, though they possess nowhere near the implementation quality of the NECs and Eizos.
To be honest, it was such a hit-and-miss (more miss than hit) affair every time I tried to take advantage of DDC using CEDP that I dismissed using it entirely.
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on November 11, 2011, 10:56:46 pm
Slightly off the topic, but yes, it seems that quite a few Dell UltraSharp monitors have DDC built in, though they possess nowhere near the implementation quality of the NECs and Eizos.
To be honest, it was such a hit-and-miss (more miss than hit) affair every time I tried to take advantage of DDC using CEDP that I dismissed using it entirely.

The last I heard, the guys at Integrated Color (makers of CEDP) gave up on talking to Dell. At least two other monitor profiling software vendors I am aware of are working actively on cracking DDC on Dell UltraSharp displays. As I mentioned above, there is serious confusion within Dell regarding DDC.

On Windows, CEDP uses a DDC driver from Portrait Displays. It is outdated and tied all too much to individual graphics cards. The most recent video cards specifically supported were introduced in 2006 (NVIDIA 8800). No clue whether an update to the DDC code will be forthcoming. We can hope.
Title: Re: Monitor calibration sensor evaluations
Post by: Tim Lookingbill on November 13, 2011, 12:31:13 pm
Ethan,

I'd be interested in your thoughts on this thread I started...

http://www.luminous-landscape.com/forum/index.php?topic=59388.msg479174#msg479174

Do you think my buying the i1Display Pro would fix these issues I've been having and deliver better Delta E numbers?
Title: Thanks! (Re: Monitor calibration sensor evaluations)
Post by: ComputerDork on November 28, 2011, 08:51:23 pm
I don't have anything to add at this point, but my eternal thanks for both performing such thorough evaluations and then making them publicly available. This is incredibly useful! I'm now kind of wishing that I had bought a ColorMunki first (for printer profiling and reasonably OK monitor profiling), then an i1 Display Pro when I bought my NEC PA241 (instead of the OEM MDSVSENSOR2).
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on January 03, 2012, 04:48:51 am
There's a new Spyder4 colorimeter out there.

Ethan, when can we count on hearing something about the new device from you?
Title: Re: Monitor calibration sensor evaluations
Post by: mac_paolo on January 03, 2012, 06:19:38 am
There's a new Spyder4 colorimeter out there.
Any link?  ???
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on January 03, 2012, 07:32:09 am
http://shop.colourconfidence.com/section.php/10684/1/new-spyder4-range
Title: Re: Monitor calibration sensor evaluations
Post by: uwitberg on January 06, 2012, 06:30:07 am
There's a new Spyder4 colorimeter out there.

Datacolor has now updated its website:
http://spyder.datacolor.com/product-mc.php (http://spyder.datacolor.com/product-mc.php)
Title: Re: Monitor calibration sensor evaluations
Post by: mac_paolo on January 08, 2012, 04:17:35 am
Datacolor has now updated its website:
http://spyder.datacolor.com/product-mc.php (http://spyder.datacolor.com/product-mc.php)
Oh boy... I bought the i1Display Pro 3 days ago. It cost 50€ more than the recommended price for the Spyder4 Elite.
Should I return the former to get the latter and keep the difference?  ;)
Title: Re: Monitor calibration sensor evaluations
Post by: digitaldog on January 08, 2012, 12:13:06 pm
The i1D Pro is an excellent piece of hardware. Unless you have data to prove the other product is as good or better, I’d stick with what you have.
Title: Re: Monitor calibration sensor evaluations
Post by: Czornyj on January 08, 2012, 12:15:12 pm
Oh boy... I bought the i1Display Pro 3 days ago. It cost 50€ more than the recommended price for the Spyder4 Elite.
Should I return the former to get the latter and keep the difference?  ;)

I bet the i1Display Pro is still better than the new Spyder4; it's a very smartly designed, well-made sensor with advanced profiling software.
Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on January 09, 2012, 03:36:50 pm
On a somewhat related but slightly tangential note, have you ever seen curious behaviors in any of Dell's monitors with regard to how the monitor LUTs behave? I tested an Asus PA246Q. All things considered, for a sub-$500 monitor it had some nice features... 12-bit LUT, 10-bit panel... it had two fatal flaws though:

1) It couldn't go lower than 135 cd/m^2
2) ANY change to the monitor LUTs resulted in an alarming increase in ∆E... as in average ∆E in the 10+ range.

I first noticed this when using CEDP to control it via DDC. The validation results could have been the lead character in a horror film. Multiple attempts produced identical results. DDC being what it is, I figured their implementation may not have been standard, so I reset the monitor and decided to do it the old-fashioned way, by adjusting the RGB gains and offsets by hand. This produced the same results. If you set the RGB levels to anything other than 0,0,0, the average ∆E would jump to 10+. Didn't matter if it was 1,0,0 or -20,+17,-4... any change would result in a massive average ∆E.

I found this to be a rather curious behavior. A month later I tested a Dell U2410. SAME EXACT behavior. Similar specifications... 12-bit LUT, 10-bit panel (I believe they both used the same panel)... I have a sneaking suspicion that Dell and Asus OEMed the guts and contracted the engineering from the same place. That's just too much of a coincidence.

Cheers, Joe

I believe one monitor review site said that the color engine in the U2410 is broken and produces twisted, messed-up results if you use the custom mode, and that it's better to leave it in the presets and adjust externally.
It would be odd if ASUS used the same driving electronics/firmware with the same bug, though.
Apparently the Dell U2710 has no such bug.

Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on January 09, 2012, 03:52:06 pm
@shewhorn

Here are my SpectraView II values.

Spyder 3: CIE 0.306, 0.338 at 114.24 cd/m²
i1 Pro: CIE 0.313, 0.335 at 116.63 cd/m²
i1 Display Pro: CIE 0.313, 0.330 at 114.33 cd/m²

What did you do to see 4 decimal places?




I generally get extremely close readings from my custom NEC i1D2 and my i1Pro when using SV II to measure my NEC PA241W. After a year they don't agree quite as much as at the start (when most values were exactly the same or .001 off; I think only one or two had been .002 off), but even now it's pretty close.

I did find that I had to let the NEC i1D2 warm up for a long time; 15 minutes was NOT enough for it to closely match my i1Pro. I let it warm for 45 minutes and then, with the monitor in native gamut mode/photo editing, measured:

i1pro:
WP .313,.330 6503K
R .679, .309
G .214, .690
B .152, .056

and the custom NEC i1D2 (and I believe they DO custom calibrate each copy: they have implied that, I'd have to be awfully lucky to get such close matching by chance, and the difference in price between the stock i1D2 and theirs is about what many places charge for a custom calibration, so it would be a remarkable rip-off otherwise):
WP .315, .330 6376K
R .683, .306
G .214, .691
B .152, .057

and comparing sRGB emulation mode:
i1pro:
.639, .330
.301, .599
.152 , .061
.312, .329
while the i1D2 said:
.642, .328
.303, .600
.152,.061
.315,.331

So the differences have certainly grown over the past year. When both were new, as I said, many values were .000 and the rest .001, with a couple at .002, while now there are a number of .003 differences and one .004. That said, there are still many at .000 or .001, so the match is not too far off.

In fact, it's actually only in how each reads red that any difference lies at all, and that is where the difference increased a little bit over the year. For how each reads the blue and green signal, it's essentially exactly the same for both, and still so a year later.

I didn't compare luminance or gamma tracking. I didn't compare black point since the i1pro is sketchy there and tends to return a different luminance, often quite different, on every reading from SV II since it doesn't average them much.

(Side note: I think when comparing probes and seeing how they do on wide gamuts in particular, the chromaticities of the primaries need to be measured and not just luminance and white point! For some probes it is the primaries where the measurements go really crazy on wide gamut, not just the WP.)

Title: Re: Monitor calibration sensor evaluations
Post by: WombatHorror on January 09, 2012, 04:00:34 pm
Oscar,

We approached this in two ways. First, we evaluated each sensor on a range of screens, manually adjusting each panel to as close to a 6500K white point and 150 cd/m2 luminance level as possible. These included both CCFL and LED backlight laptops, three standard gamut CCFL, four wide-gamut CCFL, one RGB LED, and two white LED displays. We used our spectroradiometer to guide the adjustments. Depending on whether DDC was available, the screens hit the setpoint to varying degrees of accuracy.

The other approach is part of our profiling software evaluations. Here we test both the ability to hit a specified white level and match to another screen.

On top-quality, highly uniform displays, if the measured difference between two screens was under 1 dE-2000, the visual match was near-perfect. On lesser displays there often are color and luminance shifts across the screen. These variations can easily exceed 5 dE-2000 on cheapo (~$300, 24") IPS panels and go even higher on TN panels. There is no way to get a visual match between such a monitor and a reference, simply because only a small portion of the screen accurately hits the calibration target. Most high end displays have maximum variation of at least 2 dE-2000 edge to edge. You can see that.

Am I correct that you don't test them for measuring the primaries though? Only the luminance and white point?
For instance, I saw some guys using a Spyder3 with the new software getting a semi-decent WP reading from it on a wide gamut, and they thought all was good, but when I had them plot the gamut, the green primary was plotted way out of line.

But many thanks for all the measurements taken, very useful stuff!!!!
Title: Re: Monitor calibration sensor evaluations
Post by: Ethan_Hansen on January 13, 2012, 05:02:57 pm
We measured white (6500K, 150 cd/m2), lowest usable black (roughly the level at which [0, 0, 0] could be differentiated from [1, 1, 1]), middle gray, and the three primaries. For simplicity we only reported the results at black and white. Middle gray showed nothing of interest, and while the primary color measurements could diagnose which color a particular sensor had difficulty with (and, yes, the Spyder 3 was at its worst with green), the white point value encompassed all the primaries. After all, white is made with all three primaries turned on full.
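That additivity is easy to demonstrate: on an additive display, the white-point XYZ is (ideally) the sum of the three full-on primary XYZs, so an error in any primary lands in white as well. A quick Python check using textbook sRGB/D65 numbers:

import numpy as np

# White XYZ = sum of the full-on primary XYZs (sRGB values at 100 cd/m^2).
XYZ_r = np.array([41.24, 21.26,  1.93])
XYZ_g = np.array([35.76, 71.52, 11.92])
XYZ_b = np.array([18.05,  7.22, 95.05])
print(XYZ_r + XYZ_g + XYZ_b)  # -> [ 95.05 100.   108.9 ], i.e. D65 white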