We finished testing the BasICColor Discus, X-Rite i1Display Pro, and ColorMunki Display a couple of weeks ago. I held off on posting the reviews publicly (although from emails and comments I see that the URLs did go out) to give the manufacturers time to review and comment on our findings. Reviews of monitor calibration software are in progress; we are waiting both for vendor comments and for the i1Display Pro and Discus to be supported by more software packages.
Our full article goes into detail on the new pucks and compares them to older ones. We also have a direct comparison of the Discus to the i1Display Pro.
In summary, the new pucks are game changers. Starting with how consistent the sensors are, both the i1D3 and Discus trounce all older contenders. As you can see in the table below, the average unit-to-unit variation is, from a visual perspective, invisible, and the worst-case units are far superior to all other products. If you purchase either sensor, we see no evidence that you will get a lemon.
As for absolute accuracy, both the i1Display Pro and Discus surpass even individually calibrated i1D2 models. Colorimeter or spectrophotometer - the new pucks leave 'em in the dust. The advantage held for all LCD technologies: standard or wide-gamut CCFL, white or RGB LED. If you are in the market for a monitor calibrator, any other option under $25K will be second best.
All of which leads to the question of which sensor is best. In all our tests - intra-unit variability, accuracy on each panel type, and thermal stability - the Discus came out ahead of the i1Display Pro. The margin was, usually, at or below the limits of visibility. From a strict statistical perspective, the Discus was not significantly more accurate than the i1Display Pro. This statement must be taken in context: we only had five Discus samples available, so the error bars on any confidence estimates are large.
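To put rough numbers on that caveat, here is a minimal sketch of the confidence interval involved. The readings below are made-up placeholders, not our measured data; the point is only how wide a 95% interval gets when you have five samples.

```python
import math
from statistics import mean, stdev

from scipy import stats

# Hypothetical accuracy readings from five units -- placeholder values,
# NOT our measured data.
samples = [0.6, 0.9, 0.7, 1.1, 0.8]

n = len(samples)
m = mean(samples)
s = stdev(samples)                      # sample standard deviation
t_crit = stats.t.ppf(0.975, df=n - 1)   # ~2.78 for n = 5
half_width = t_crit * s / math.sqrt(n)

print(f"mean = {m:.2f}, 95% CI = ({m - half_width:.2f}, {m + half_width:.2f})")
# With only five units the interval stays wide, so a small measured
# advantage is hard to call statistically significant.
```

The t critical value for four degrees of freedom is about 2.78, versus roughly 1.96 for a large sample, which is where those large error bars come from.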
There are two areas where the Discus and i1Display Pro differ greatly. The first is cost: for the price of a single Discus, you can buy five i1D3 pucks and still have enough left over for a decent dinner out.
The other difference is in handling. The Discus is an imposing presence. If portability is a consideration, BasICColor's beast is not for you. Sheer size and weight do have their advantages. Once you place the Discus on the screen, that's that. Ambient light is effectively sealed out and accurate measurements are assured. The high, narrow profile of the i1Display Pro works against it in this regard. The screen, puck, and counterweight need to be adjusted so the puck sits absolutely flush on the monitor surface. Even so, we found getting the most accurate results from the i1D3 required a darkened room.
Both pucks offer excellent thermal stability. This matters because a typical CCFL-backlit monitor runs ~15°F over ambient. Measurements from many older instruments will drift during the profiling session if the puck is not placed on the screen to warm up for 15 minutes first. Eliminating such productivity parasites is always fine by me.
Until we have software in hand that drives both instruments well, the most important comparison between the i1Display Pro and Discus will remain incomplete: namely, how well do the calibrations and profiles they generate perform from a subjective, visual perspective? Our measurements showed the Discus with only a slight lead over the i1Display Pro in accuracy, but they concentrated on performance at white, black, and a handful of intermediate points. I can make handwaving arguments for why the visual difference between the pucks should be, if not invisible, damned close to it, or the countervailing argument that small, inconsistent errors can introduce visible artifacts in real-world image applications.
For now, the only commercially released software we have to drive the Discus is BasICColor Display 4.2, and the i1Display Pro uses i1Profiler 1.1. X-Rite's software comes up short against BC Display on DDC-capable monitors. This appears to be because i1Profiler does not use the monitor's LUTs for grayscale and gamma adjustments, relying instead on the video card LUTs alone. Most better monitors have high-bit LUTs, while video card adjustments are performed in 8-bit mode. As with image editing in Photoshop, curve adjustments to 8-bit images can create artifacts that do not appear when editing in high-bit mode. Monitor profiling packages such as ColorEyes Display and BasICColor Display, which intelligently balance monitor and video card LUT adjustments, hold the upper hand in raw calibration performance over i1Profiler. That said, the actual profiling side of i1Profiler looks good; it is the underlying calibration that is not up to BCD and CED at their best.
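To make the 8-bit concern concrete, here is a toy example; it is not code from any of these packages, just the quantization argument in miniature. Apply the same simple gamma-correction curve through an 8-bit LUT and a 12-bit LUT and count how many distinct gray levels survive.

```python
import numpy as np

def surviving_levels(lut_bits, native_gamma=2.2, target_gamma=1.8):
    """Apply a simple gamma-correction curve in a LUT of the given bit
    depth and count how many distinct output levels remain for the 256
    8-bit input codes. Real calibration curves are more involved; this
    only shows the quantization effect."""
    lut_max = 2 ** lut_bits - 1
    x = np.arange(256) / 255.0                 # 8-bit grayscale ramp
    y = x ** (target_gamma / native_gamma)     # grayscale/gamma adjustment
    lut = np.round(y * lut_max)                # quantize to the LUT's depth
    return len(np.unique(lut))

print("8-bit video card LUT:", surviving_levels(8), "of 256 levels")   # some levels merge -> banding
print("12-bit monitor LUT:  ", surviving_levels(12), "of 256 levels")  # all 256 survive
```

The merged codes in the 8-bit case are the banding you see when an aggressive curve is pushed through the video card alone; a high-bit monitor LUT has headroom to spare.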
We should be able to make a more direct comparison shortly. BasICColor is due to release a version of Display that talks to the i1Display Pro, as is ColorEyes. I'll update when they do.
One other puck did stand out: the X-Rite ColorMunki Display. It shares the same basic hardware as the i1Display Pro but costs a third less. The software set lacks the more advanced calibration setpoints, validation functionality, and the ability to trend performance over time. The puck itself measures ambient light luminance but, unlike the i1D3, not its color temperature. Finally, measurements poke along at one-fifth the speed of the i1Display Pro. This is not as bad as it sounds at first glance - the CM Display's speed is the same as that of the older i1 Display 2.
X-Rite told us they do not plan on unlocking the CM Display for third-party software. This makes business sense: with third-party support, the two pucks would be close enough in capability that the i1D3's price premium would be a hard sell. Nevertheless, even with the limited software set, the ColorMunki Display is the clear choice for hobbyists and others for whom cost is a prime consideration.