Pages: 1 ... 5 6 [7] 8 9

Author Topic: Monitor calibration sensor evaluations  (Read 141765 times)

Damir

  • Full Member
  • ***
  • Offline
  • Posts: 232
Re: Monitor calibration sensor evaluations
« Reply #120 on: September 05, 2011, 05:07:18 pm »

OT

Ethan, is there any chance you could update the color gamut charts for digital cameras on your site? They are very old, and some scanners would be nice to see too.
Logged

Czornyj

  • Sr. Member
  • ****
  • Offline
  • Posts: 1914
    • zarzadzaniebarwa.pl
Re: Monitor calibration sensor evaluations
« Reply #121 on: September 06, 2011, 01:58:56 pm »

Graeme also wrote (...)

and also:
Quote from: Graeme Gill
Possibly the main reason someone might prefer the i1d3 over the ColorMunki
Photo is the better temperature stabilisation, which could be an advantage
in measuring large sets of test values.
http://www.freelists.org/post/argyllcms/Argyll-V134-released,17

X-Rite should seriously consider making a generous donation to ArgyllCMS. I didn't even suspect that these new sensors are so freaking cool before reading Graeme's comments on the list.

« Last Edit: September 06, 2011, 02:00:48 pm by Czornyj »
Logged

trinityss

  • Newbie
  • *
  • Offline
  • Posts: 36
Re: Monitor calibration sensor evaluations
« Reply #122 on: September 07, 2011, 02:34:59 am »

I was always wondering if there was any kind of temperature compensation.
I asked X-Rite support and never received an answer  :(.

May I assume the compensation happens inside the device, or is it done by Argyll CMS (but then the device would have to report the temperature...)? I assume it's in the device?


Thx!
Logged

shewhorn

  • Sr. Member
  • ****
  • Offline
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #123 on: September 07, 2011, 10:50:14 am »

and also: http://www.freelists.org/post/argyllcms/Argyll-V134-released,17

X-Rite should seriously consider making a generous donation to ArgyllCMS. I didn't even suspect that these new sensors are so freaking cool before reading Graeme's comments on the list.

Indeed they should; it was the reason I finally placed an order for an i1Display Pro. I'm not impressed with X-Rite's i1Profiler-based monitor profiling software. It has much potential, but they still need to fix a few things. Argyll, though, is quite the piece of software for profiling my non-NEC monitors, especially when paired with the Eye One Pro to generate a correction matrix. I don't think X-Rite would have anything to lose from it.

Cheers, Joe
Logged

Paz

  • Newbie
  • *
  • Offline Offline
  • Posts: 27
Re: Monitor calibration sensor evaluations
« Reply #124 on: October 05, 2011, 11:38:53 pm »

Ethan,

Thanks to you and Dry Creek Photo for taking the time to run these tests and publish your results.

In your article, you state:

Quote
For wide-gamut displays, the best colorimeter option is the Datacolor Spyder 3. It is reasonably accurate, but unit-to-unit performance is not as consistent as could be desired.

How would I be able to tell if a particular Spyder3 was a good one or not?   All I've heard is that 'recent' ones are all good.  If that is so, how would one be able to tell the manufacture date?

thanks,

Paz
Logged

Czornyj

  • Sr. Member
  • ****
  • Offline
  • Posts: 1914
    • zarzadzaniebarwa.pl
Re: Monitor calibration sensor evaluations
« Reply #125 on: October 06, 2011, 04:18:07 am »

Do yourself a favor and get i1Display Pro instead.

Paz

  • Newbie
  • *
  • Offline
  • Posts: 27
Re: Monitor calibration sensor evaluations
« Reply #126 on: October 10, 2011, 01:15:18 pm »

Thanks for the advice, but Ethan's research indicates the i1Display Pro would not be the best measurement device for my very bright, wide gamut, backlit RGB LED monitor.

Also, I sent an email to X-Rite a couple of weeks ago inquiring about purchasing their products, and they have not replied.  It raises the question of what happens after one has actually bought their products... not to mention that I'm not thrilled that my GretagMacbeth puck has fallen victim to planned obsolescence.
Logged

Mark D Segal

  • Contributor
  • Sr. Member
  • *
  • Offline
  • Posts: 12512
    • http://www.markdsegal.com
Re: Monitor calibration sensor evaluations
« Reply #127 on: October 10, 2011, 04:48:51 pm »


Also, I sent an email to X-Rite a couple of weeks ago inquiring about purchasing their products, and they have not replied.  It raises the question of what happens after one has actually bought their products....

Forewarned is forearmed.
Logged
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

Czornyj

  • Sr. Member
  • ****
  • Offline
  • Posts: 1914
    • zarzadzaniebarwa.pl
Re: Monitor calibration sensor evaluations
« Reply #128 on: October 10, 2011, 05:31:06 pm »

Thanks for the advice, but Ethan's research indicates the i1Display Pro would not be the best measurement device for my very bright, wide gamut, backlit RGB LED monitor.
Actually, Ethan didn't test the i1Display Pro (yet). But I'm pretty sure an RGB LED backlit display won't be a problem for this sensor.

Paz

  • Newbie
  • *
  • Offline Offline
  • Posts: 27
Re: Monitor calibration sensor evaluations
« Reply #129 on: October 10, 2011, 11:31:13 pm »

Thank you, Czornyj. I double-checked. You're right.

I paid more attention to what did work with my type of monitor than to those that did not.

Paz
Logged

kf_tam

  • Newbie
  • *
  • Offline
  • Posts: 1
Re: Monitor calibration sensor evaluations
« Reply #130 on: October 23, 2011, 01:37:21 am »

Ethan's tests have been updated to include the i1Display Pro, plus more samples of the Discus and ColorMunki Photo.
Thanks, Ethan, for the great work  :D.
Logged

Tim Lookingbill

  • Sr. Member
  • ****
  • Offline
  • Posts: 2436
Re: Monitor calibration sensor evaluations
« Reply #131 on: October 23, 2011, 11:46:16 am »

Went to Ethan's DryCreekPhoto site and couldn't find any reviews of the i1Display Pro.

Does Ethan have another site?
Logged

alain

  • Sr. Member
  • ****
  • Offline
  • Posts: 455
Re: Monitor calibration sensor evaluations
« Reply #132 on: October 23, 2011, 12:56:19 pm »

Went to Ethan's DryCreekPhoto site and couldn't find any reviews of the i1Display Pro.

Does Ethan have another site?

In the first post there is a link inside the sentence: "For the full details, read the article on our site."
Logged

Ethan_Hansen

  • Full Member
  • ***
  • Offline
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #133 on: October 26, 2011, 01:29:38 am »

We finished testing the BasICColor Discus, X-Rite i1Display Pro, and the ColorMunki Display a couple of weeks ago. I held off on posting the reviews publicly (although from emails and comments I see that the URLs did go out) to give the manufacturers time to review and comment on our findings. Reviews of monitor calibration software are in progress; we are waiting for both vendor comments and for the i1Display Pro and Discus to be supported by more software packages.

Our full article goes into details on both the new pucks and comparisons to older ones. We have a direct comparison of the Discus to the i1Display Pro.

In summary, the new pucks are game changers. Starting with how consistent the sensors are, both the i1D3 and Discus trounce all older contenders. As you can see in the table below, the average unit-to-unit variation is, from a visual perspective, invisible, and the worst-case units are far superior to all other products. If you purchase either sensor, we see no evidence that you will get a lemon.

As for absolute accuracy, both the i1Display Pro and Discus surpass even individually calibrated i1D2 models. Colorimeter or spectrophotometer - the new pucks leave 'em in the dust. The advantage held for all LCD technologies: standard or wide-gamut CCFL, white or RGB LED. If you are in the market for a monitor calibrator, any other option under $25K will be second best.

All of which leads to the question of which sensor is best. In all our tests - unit-to-unit variability, accuracy on each panel type, and thermal stability - the Discus came out ahead of the i1Display Pro. The margin was usually at or below the limits of visibility. From a strict statistical perspective, the Discus was not significantly more accurate than the i1Display Pro. This statement must be taken in context: we only had five Discus samples available, so the error bars on any confidence estimates are large.

There are two areas where the Discus and i1Display Pro differ greatly. The first is cost: You can buy five i1D3 pucks and have enough left over for a decent dinner out for the price of a single Discus.

The other difference is in handling. The Discus is an imposing presence. If portability is a consideration, BasICColor's beast is not for you. Sheer size and weight do have their advantages. Once you place the Discus on the screen, that's that. Ambient light is effectively sealed out and accurate measurements are assured. The high, narrow profile of the i1Display Pro works against it in this regard. The screen, puck, and counterweight need to be adjusted so the puck sits absolutely flush on the monitor surface. Even so, we found getting the most accurate results from the i1D3 required a darkened room.

Both pucks offer excellent thermal stability. This matters because a typical CCFL-backlit monitor runs ~15°F over ambient. Measurements from many older instruments will drift during a profiling session unless the puck is first placed on the screen to warm up for 15 minutes. Eliminating such productivity parasites is always fine by me.

Until we have software in hand that drives both instruments well, the most important comparison between the i1Display Pro and Discus will remain incomplete: namely how well, from a subjective, visual perspective do the calibrations and profiles they generate perform? Although our measurements only showed the Discus having a slight lead over the i1Display Pro in measurement accuracy, we concentrated on performance at white, black, and a handful of intermediate points. I can make handwaving arguments about why the visual difference between the pucks should be, if not invisible, damned close to it or the countervailing argument that small, inconsistent errors can introduce visible artifacts in real-world image applications.

For now, the only commercially released software we have to drive the Discus is BasICColor Display 4.2, and the i1Display Pro uses i1Profiler 1.1. X-Rite's software comes up short against BC Display on DDC-capable monitors. This appears to be because i1Profiler does not utilize the monitor LUTs for grayscale and gamma adjustments, relying instead only on the video card LUTs. Most better monitors have high-bit LUTs, while video card adjustments are performed in 8-bit mode. As with image editing in Photoshop, curve adjustments to 8-bit images can create artifacts that do not appear when editing in high-bit mode. Monitor profiling packages such as ColorEyes Display and BasICColor that intelligently balance monitor and video card LUT adjustments hold the upper hand in raw calibration performance over i1Profiler. That said, the actual profiling side of i1Profiler looks good; the underlying calibration is just not up to BCD and CED at their best.
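The 8-bit quantization issue described above is easy to demonstrate. The sketch below is a toy illustration (not i1Profiler's actual pipeline): it builds the same gamma curve as an 8-bit LUT and as a 10-bit LUT, then counts how many of the 256 input codes survive as distinct output codes - collapsed codes are what show up on screen as banding or posterization.

```python
def gamma_lut(levels, bits, gamma=2.2):
    """Build a gamma-correction LUT quantized to the given output bit depth."""
    maxv = 2 ** bits - 1
    return [round(((i / (levels - 1)) ** (1 / gamma)) * maxv)
            for i in range(levels)]

lut8 = gamma_lut(256, 8)    # video-card style: 8-bit output
lut10 = gamma_lut(256, 10)  # monitor-internal style: 10-bit output

# Count distinct output codes for 256 distinct input codes.
# The 8-bit LUT merges neighboring codes in the compressive (bright)
# region of the curve; the 10-bit LUT keeps all 256 apart.
print("8-bit LUT distinct codes: ", len(set(lut8)))
print("10-bit LUT distinct codes:", len(set(lut10)))
```

The high-bit path keeps every input level distinguishable, which is why packages that push the curve into the monitor's own LUT calibrate more cleanly.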

We should be able to make a more direct comparison shortly. BasICColor is due to release a version of Display that talks to the i1Display Pro, as is ColorEyes. I'll update when they do.

One other puck did stand out: the X-Rite ColorMunki Display. It shares the same basic hardware as the i1Display Pro but costs a third less. The software set lacks the more advanced calibration setpoints, validation functionality, and the ability to trend performance over time. The puck itself measures ambient light luminance but, unlike the i1D3, not color temperature. Finally, measurements poke along at one-fifth the speed of the i1Display Pro. This is not as bad as it sounds at first glance - the CM Display's speed is the same as that of the older i1 Display 2.

X-Rite told us they do not plan on unlocking the CM Display for third-party software. This makes business sense, as the capability comparison to the i1D3 could make the cost differential unattractive. Nevertheless, even with the limited software set, the ColorMunki Display is the clear choice for hobbyists or others for whom cost is a prime consideration.

32BT

  • Sr. Member
  • ****
  • Offline
  • Posts: 3095
    • Pictures
Re: Monitor calibration sensor evaluations
« Reply #134 on: October 26, 2011, 06:01:29 am »

Did you possibly also do some (subjective) tests comparing two different monitors next to each other?

Such as:
1. Given a utility monitor and an image editing monitor
2. Calibrate the editing monitor to its native white point
3. Calibrate the utility monitor to the image-editing-monitor white point

And what about calibrating the image-editing-monitor to a viewing environment whitepoint?

(Note that this could be done independent from the software. In the past I have tested monitor white point measurements in both white and gray, and while the pucks would report a virtual exact match, the visual color-appearance wasn't even remotely close.)
Logged
Regards,
~ O ~
If you can stomach it: pictures

Ethan_Hansen

  • Full Member
  • ***
  • Offline
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #135 on: October 26, 2011, 10:16:26 am »

Oscar,

We approached this in two ways. First, we evaluated each sensor on a range of screens, manually adjusting each panel to as close to a 6500K white point and 150 cd/m2 luminance level as possible. These included both CCFL and LED backlight laptops, three standard gamut CCFL, four wide-gamut CCFL, one RGB LED, and two white LED displays. We used our spectroradiometer to guide the adjustments. Depending on whether DDC was available, the screens hit the setpoint to varying degrees of accuracy.

The other approach is part of our profiling software evaluations. Here we test both the ability to hit a specified white level and match to another screen.

On top-quality, highly uniform displays, if the measured difference between two screens was under 1 dE-2000, the visual match was near-perfect. On lesser displays there often are color and luminance shifts across the screen. These variations can easily exceed 5 dE-2000 on cheapo (~$300, 24") IPS panels and go even higher on TN panels. There is no way to get a visual match between such a monitor and a reference, simply because only a small portion of the screen accurately hits the calibration target. Even most high-end displays have a maximum variation of at least 2 dE-2000 edge to edge. You can see that.
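For readers unfamiliar with the dE numbers being quoted: a color difference is the distance between two measurements in L*a*b* space. The figures above use the CIEDE2000 formula, which weights that distance perceptually; as a simpler illustration of the idea, here is the older CIE76 difference (plain Euclidean distance). The two screen readings are hypothetical values chosen only for the example.

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two L*a*b* triples.
    (The article's figures use the more elaborate CIEDE2000 formula.)"""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical white measurements from the centers of two calibrated screens.
screen_a = (96.2, -0.3, 1.1)   # (L*, a*, b*)
screen_b = (95.6,  0.2, 0.4)

print(round(delta_e_76(screen_a, screen_b), 2))  # -> 1.05
```

A difference around 1 is near the threshold of visibility for side-by-side patches, which is why a sub-1 dE match between screens looks near-perfect while 5 dE panel non-uniformity is plainly visible.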

shewhorn

  • Sr. Member
  • ****
  • Offline
  • Posts: 537
Re: Monitor calibration sensor evaluations
« Reply #136 on: October 26, 2011, 01:44:35 pm »

We approached this in two ways. First, we evaluated each sensor on a range of screens, manually adjusting each panel to as close to a 6500K white point and 150 cd/m2 luminance level as possible.

Ethan,

I'm curious to know, relative to the spectroradiometer you're using, where did the various pucks fall in terms of hitting a specific luminance? I've noticed with my own i1 Display Pro that it reports a significantly lower luminance level than the Spyder 3, DTP-94, NEC i1D2, and Eye One Pro. Relative to the Eye One Pro if I measure the luminance at 110 cd/m^2 (in both Argyll and i1 Profiler), the i1 Display Pro will report back somewhere in the range of 96 to 97 cd/m^2, significantly lower than where the other sensors fall. I don't think that's a big deal in the grand scheme of things but it did catch my attention.
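For what it's worth, the discrepancy Joe describes works out to roughly a 12% offset. A trivial sketch of the arithmetic, using the numbers from the post above (110 cd/m^2 per the Eye One Pro, ~96.5 cd/m^2 per the i1 Display Pro):

```python
def pct_offset(reference, reading):
    """Percentage deviation of one instrument's reading from a reference."""
    return 100.0 * (reading - reference) / reference

# Eye One Pro as the reference; midpoint of the 96-97 range Joe reports.
print(round(pct_offset(110.0, 96.5), 1))  # -> -12.3
```

An offset that large between two instruments on the same patch does suggest a wrong panel-type calibration or a unit problem rather than normal inter-instrument spread.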

Cheers, Joe
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: Monitor calibration sensor evaluations
« Reply #137 on: October 26, 2011, 02:46:42 pm »

Thanks for your efforts.

From your findings so far:
Does it seem that purchasing the new i1d3 to use with ArgyllCMS is a good choice given that:
-One already has a Spyder 3 Express that never gave credible readings, and comparisons with another Spyder 3 were all over the place
-No spectrophotometer is readily available for calibration
-One has a wide-gamut, medium-cost IPS display (Dell U2711)

best regards
h
« Last Edit: October 26, 2011, 04:45:11 pm by hjulenissen »
Logged

shewhorn

  • Sr. Member
  • ****
  • Offline
  • Posts: 537
    • http://
Re: Monitor calibration sensor evaluations
« Reply #138 on: October 26, 2011, 04:18:56 pm »

One more question... How close were the serial numbers on the units you got to test? I'm curious to know how consistent they are across different production runs.

Cheers, Joe
Logged

Ethan_Hansen

  • Full Member
  • ***
  • Offline
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #139 on: October 26, 2011, 05:33:25 pm »

Quote from: shewhorn
I'm curious to know, relative to the spectroradiometer you're using, where did the various pucks fall in terms of hitting a specific luminance? ...the i1 Display Pro will report back somewhere in the range of 96 to 97 cd/m^2, significantly lower than where the other sensors fall.

One more question... How close were the serial numbers on the units you got to test? I'm curious to know how consistent they are across different production runs.
Joe - If I am correct that all your sensors other than the i1Display Pro give readings in the vicinity of 110 cd/m2 while the i1D3 reads 97, then something appears amiss. That is a larger offset from reality than we measured. A few questions: (1) Do all the other instruments read near 110? (2) Did you choose the correct panel type for the i1Display Pro measurements? Depending upon which calibration is loaded, there can be differences in readings, although not to the extent you see.

The i1D3 units we tested came from a variety of sources. Some arrived directly from X-Rite and did indeed have closely grouped serial numbers. Several units were ones we purchased, while the rest came courtesy of various vendors. In the cumulative probability plot linked here, there was no correlation between sensor accuracy and i1Display Pro serial number. In short, the pucks were consistent - certainly to within the 10 dE level you are seeing!

Quote from: hjulenissen
Does it seem that purchasing the new i1d3 to use with ArgyllCMS is a good choice given that...
The 2711 was one of the displays we used for evaluations. The average i1D3 measurement error was 1.4 dE-2K on white and 2.6 dE-2K on black. Only the Discus performed better (0.7/1.9 dE-2K). I would, however, recommend getting the full version of the i1D3 rather than the CM Display, as this gives you i1Profiler as well as the ability to use Argyll. A couple of software suppliers are actively working on adding DDC control to Dell Ultrasharps, the 2711 in particular. If they are successful, the resulting profiles will likely surpass Argyll's in quality.