We have been working on a thorough evaluation of monitor measurement hardware. We used a reference-grade spectroradiometer (Photo Research PR-730) to compare sensor performance on a variety of monitors. All were IPS displays, ranging from entry-level sRGB-gamut models with CCFL backlights (e.g. NEC 231wmi, Dell U2311H) to Adobe RGB-capable wide-gamut models with either CCFL backlights (e.g. NEC PA-241W, Eizo CG243W) or RGB LED backlighting (HP LP2480zx). We measured multiple samples of each sensor to quantify inter-instrument agreement on a white background. Each monitor was then measured at a 6500K, 150 cd/m2 white level, at middle gray, and at the lowest usable black level as measured by ColorEyes Display: roughly the level where RGB (0, 0, 0) is distinguishable from (1, 1, 1).
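As an aside for readers who want to reproduce the comparison math: here is a minimal sketch, assuming each reading arrives as an absolute CIE XYZ triple in cd/m2, of how a sensor's disagreement with the PR-730 on a given patch reduces to a dE-2000 number. It uses the open-source colour-science Python package; the function name and example values are illustrative, not taken from our actual tooling.

    import numpy as np
    import colour  # the colour-science package

    def de2000_vs_reference(xyz_sensor, xyz_reference, xyz_white):
        """CIEDE2000 between a sensor reading and the PR-730 reading of the
        same patch, both scaled so monitor white maps to L* = 100."""
        xyz_white = np.asarray(xyz_white, dtype=float)
        white_xy = colour.XYZ_to_xy(xyz_white)
        lab_sensor = colour.XYZ_to_Lab(
            np.asarray(xyz_sensor, dtype=float) / xyz_white[1], white_xy)
        lab_reference = colour.XYZ_to_Lab(
            np.asarray(xyz_reference, dtype=float) / xyz_white[1], white_xy)
        return colour.delta_E(lab_sensor, lab_reference, method='CIE 2000')

    # Example with made-up numbers for a 6500K, 150 cd/m2 white patch:
    # de2000_vs_reference([142.1, 149.3, 161.9], [142.8, 150.0, 163.2],
    #                     xyz_white=[142.8, 150.0, 163.2])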
The colorimeters measured were the X-Rite DTP-94, the X-Rite Eye-One Display 2, and the Datacolor Spyder 3 Elite. These were compared to the X-Rite Eye-One Pro spectrophotometer and, for the sake of experimentation, to the venerable Spectrolino. We measured a minimum of 10 samples of each of the above sensors. We also had access to a single ColorMunki; while we could not compare unit-to-unit stability for it, the accuracy measurements were interesting.
For the full details,
read the article on our site.
To summarize: the DTP-94 and the spectrophotometers had the best inter-instrument agreement. Any given DTP-94 or i1Pro can be counted on to provide consistent performance, be it good or bad. The Spyder 3 showed more variability, and that is among units manufactured after mid-2009, the only ones we compared; older Spyders were all over the place in performance. Trailing the pack was the Eye-One Display. On average, there was over twice as much unit-to-unit variability between Eye-One Displays as between the two most different DTP-94 sensors. The two most different i1 Displays (out of 16 total) differed by 14 dE-2000 in the color they reported as monitor white. That's a lot. If you have multiple monitors to calibrate and use an Eye-One Display, be sure to calibrate them all with the same sensor.
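These inter-instrument figures boil down to simple pairwise statistics. A sketch, assuming each unit's reading of monitor white has already been converted to Lab as in the earlier snippet; lab_per_unit is a hypothetical list with one entry per sensor:

    import itertools
    import numpy as np
    import colour

    def pairwise_de2000(lab_per_unit):
        """Mean and worst-case CIEDE2000 over all pairs of units that
        measured the same white patch."""
        des = [colour.delta_E(a, b, method='CIE 2000')
               for a, b in itertools.combinations(lab_per_unit, 2)]
        return np.mean(des), np.max(des)

    # For our 16 i1 Displays, the worst pair differed by 14 dE-2000 at white.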
We next measured accuracy as referenced to our PR-730 on each monitor at white, middle gray, and black. The results were nicely summarized by comparing accuracy on a standard gamut (96% sRGB, 66% Adobe RGB coverage) CCFL monitor and a wide-gamut (168% sRGB, 115% Adobe RGB) LED-backlit screen.
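Figures like "96% sRGB" or "168% sRGB" can be approximated in two dimensions by intersecting primaries triangles in CIE xy, as in the sketch below using the shapely geometry library. Published coverage numbers are often computed as 3D gamut volumes in a uniform color space, so treat this as a rough approximation; the monitor primaries passed in would be whatever your profiling package reports.

    from shapely.geometry import Polygon

    SRGB_XY = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B

    def gamut_vs_reference(monitor_xy, reference_xy=SRGB_XY):
        """Returns (coverage %, relative area %). Coverage can never exceed
        100; relative area is how figures like '168% sRGB' arise."""
        mon, ref = Polygon(monitor_xy), Polygon(reference_xy)
        coverage = mon.intersection(ref).area / ref.area * 100.0
        relative = mon.area / ref.area * 100.0
        return coverage, relative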
Comparing accuracy across these two displays shows the strengths and weaknesses of each instrument. Spectrophotometers, with the exception of the ColorMunki, have no problems measuring lighter values on any display type. Problems occur when luminance values drop down into the mud. The standard gamut monitor had a black point luminance of 0.19 cd/m2; the wide-gamut panel's was a mere 0.11 cd/m2. Most of what the spectrophotometers recorded at these low levels was measurement noise, and monitor profiles made from these devices exhibited color tinges in what should have been neutral shadow areas.
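To see why shadow readings go muddy, consider a toy model with an assumed noise floor: add a fixed absolute error to the Z channel of a neutral patch and compare the dE-2000 shift it produces at white versus at our 0.11 cd/m2 black. The 0.02 cd/m2 noise figure and the white point are invented for illustration; real noise floors vary by instrument.

    import numpy as np
    import colour

    WHITE_XYZ = np.array([142.8, 150.0, 163.2])   # illustrative 6500K white
    WHITE_XY = colour.XYZ_to_xy(WHITE_XYZ)

    def de_from_z_noise(patch_y, noise_z=0.02):
        """CIEDE2000 shift caused by a fixed absolute error on Z alone."""
        patch = WHITE_XYZ * (patch_y / WHITE_XYZ[1])   # neutral gray patch
        noisy = patch + np.array([0.0, 0.0, noise_z])
        lab_true = colour.XYZ_to_Lab(patch / WHITE_XYZ[1], WHITE_XY)
        lab_noisy = colour.XYZ_to_Lab(noisy / WHITE_XYZ[1], WHITE_XY)
        return colour.delta_E(lab_true, lab_noisy, method='CIE 2000')

    # de_from_z_noise(150.0) is negligible; de_from_z_noise(0.11) is more
    # than an order of magnitude larger - and real instrument noise at such
    # light levels is larger still, hence the tinged shadows.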
We only had one ColorMunki unit at our disposal; its performance was comparatively poor. I cannot say whether the Munki we measured was the runt of the litter or whether our results are indicative of less-than-stellar performance for the product line. Given that the ColorMunki is marketed as an entry-level device, the performance we saw may not be surprising. We saw consistently high error levels on all monitors with the Munki. If your livelihood depends on how accurately your monitors render images, the ColorMunki is not the best choice.
Among the colorimeters, the DTP-94 turned in consistently good performance on standard gamut displays and equally consistent poor performance on all wide-gamut models. The Spyder 3 turned in the best wide-gamut performance of any of the standard colorimeters, essentially equal to what we measured on sRGB-gamut monitors. Inter-instrument variability is a problem with the Spyder 3, however: although the average accuracy across our units was good, a photographer buys one Spyder, not a dozen, and any single unit may stray from that average. Overall, I would still rate the Spyder 3 as the best option we evaluated for profiling wide-gamut displays.
The Eye-One Display trails the pack. If we took only the best performing 50% of the i1 Display units - those clustered within +/- 2 dE-2000 of each other - the story would be different. All the i1 units were relatively new, the oldest manufactured in 2008, the newest this year. Our first thought was that we might be seeing the effects of the internal plastic filters aging. Filter aging is a real problem with the i1 Display - and any other colorimeter using plastic filters - and another advantage of the DTP-94 and its glass filter set. We had a few early, pre-production i1 Displays courtesy of GretagMacbeth, and the readings from these units were indeed very different from the rest. Among the newer pucks, however, there was no correlation between age and color response. Based on our sample, you have roughly a 50/50 chance of getting a good i1 Display.
The last set of sensors we looked at were OEM-branded Eye-One Displays tuned to the characteristics of a particular monitor. We had one each for the HP LP2480zx and NEC PA-241W. Both of these i1s turned in excellent performance on their respective monitors. Using a tuned sensor on any other monitor, however, was an invitation to disaster; the results were ugly. Again, with a sample size of one device per monitor, I cannot say whether these were particularly good examples of the i1 Display family or whether OEM tuning tames the unit-to-unit variability seen with standard Eye-One Displays. Quato sells tuned versions of the DTP-94 with their (seriously) high-end Intelli Proof monitors. Having neither a Quato monitor nor one of their sensors, I cannot comment on their performance.
Update May 23, 2011: New data is available for OEM-calibrated Eye-One Display 2 sensors. The i1D2 can be individually calibrated against a reference spectroradiometer. Doing so cuts mean unit-to-unit variability almost in half (7.0 dE-2000 for stock i1D2s vs. 3.7 dE for OEM-calibrated units).
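For the curious, this kind of per-unit calibration amounts to fitting a small correction transform. A minimal sketch, assuming you have matched XYZ readings from the colorimeter and from a reference spectroradiometer for a set of patches on the target display; it is the same idea behind ArgyllCMS's .ccmx correction matrices, though the function names here are our own.

    import numpy as np

    def fit_correction_matrix(colorimeter_xyz, reference_xyz):
        """Least-squares 3x3 matrix M such that M @ reading approximates
        the spectroradiometer's XYZ for the same patch."""
        A = np.asarray(colorimeter_xyz, dtype=float)   # shape (n, 3)
        B = np.asarray(reference_xyz, dtype=float)     # shape (n, 3)
        X, *_ = np.linalg.lstsq(A, B, rcond=None)      # solves A @ X ~= B
        return X.T

    def corrected(xyz_reading, M):
        return M @ np.asarray(xyz_reading, dtype=float)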