Luminous Landscape Forum

Raw & Post Processing, Printing => Colour Management => Topic started by: stefohl on October 04, 2010, 04:38:34 pm

Title: Test of monitor calibrators
Post by: stefohl on October 04, 2010, 04:38:34 pm
I've seen so many examples of bad monitor calibrators lately that I decided to do a test. I have a friend working at Eizo Sweden, so I could test my calibrators against their Minolta CA-210. As basICColor Display supports this instrument, we could calibrate a monitor and then verify the profile with that software. We then verified the monitor with my EyeOne Pro; the results were about the same. We then invited 15 photographers to come to us with their calibrators, and the results were not impressive. Out of these 15 calibrators, 5 gave really bad results. The worst showed an average Delta E of 30 and a max of 50; the others showed results more like an average Delta E of 15 and a max of 25. Of the rest, three gave an average around 10 and a max of 15. So, out of 15 calibrators, 7 showed good results and 8 gave results ranging from not so good to really bad.

The EyeOne Pros that showed up all gave good results. Maybe not so surprising, considering that the profile we validated was created with an EyeOne Pro. Of the 8 that didn't pass the test, four were EyeOne Displays, three were Spyder IIIs and one was a Monaco Optix. Not a very scientific test, but rather distressing.
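
The Delta E figures above are straightforward to reproduce: for each displayed patch, both instruments report a Lab value, and the CIE76 colour difference is simply the Euclidean distance between the reference reading and the test reading. A minimal sketch in Python (the Lab triples here are hypothetical, not measured data from this test):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two Lab triples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical readings for three patches: reference instrument vs. unit under test
reference  = [(95.0, 0.2, -0.5), (52.1, 48.3, 39.0), (31.8, -22.4, 10.1)]
under_test = [(94.1, 1.0, -1.2), (50.0, 45.9, 37.2), (30.5, -20.8, 9.0)]

diffs = [delta_e_76(r, t) for r, t in zip(reference, under_test)]
print(f"average dE: {sum(diffs) / len(diffs):.1f}, max dE: {max(diffs):.1f}")
```

An average of 30, as reported for the worst unit, means the two instruments disagree wildly across the whole patch set, not just on one or two colours.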
Title: Re: Test of monitor calibrators
Post by: Mark D Segal on October 04, 2010, 05:12:00 pm
You know, there are a slew of considerations at play in all of this. Firstly, there is variance between instruments. Secondly, instruments can degrade over time. Thirdly, they should all be at the same temperature when making comparative readings, because temperature affects their performance. Fourthly, some older colorimeters may not be well adapted to evaluating the latest generation of wide-gamut displays. Fifthly, is the parameter setting done using DDC or OSD controls? That can affect the consistency of the test between instruments. Sixthly, turning to the software: while the internal validation of an application such as basICColor or ColorEyes Display is fine as far as it goes, it does not go far enough. One wants an independent application generating a completely alternative set of values to be displayed and measured, with the dE calculated, in order to validate "from the outside" whether the internal validation of the calibration packages is reliable. This can be done with BabelColor's PatchTool application, which I highly recommend, having used it intensively when I had display management issues, now resolved. It's a very useful analytic application. Of course, I should add that profile quality itself is obviously a key factor in the outcomes. Generally one prefers LUT over matrix profiles for achieving more accurate results, unless the display has very linear performance.
Title: Re: Test of monitor calibrators
Post by: stefohl on October 04, 2010, 05:34:03 pm
Of course there is variance between instruments, and they do degrade over time. That is exactly what we wanted to test: can we trust the calibrators we are using? That is also why we didn't test new instruments, but invited users to bring their own calibrators. Some of these were quite new; others were 3-4 years old.

And we didn't test which instrument gave the best profile; we just compared the validation results from one calibrator that we knew gave good results with the validation results we got from the other instruments. Of course we could have used a tool like PatchTool or UDACT, but I don't think that would have changed anything. We would still probably have found that, out of 15 instruments, 8 weren't good.

Title: Re: Test of monitor calibrators
Post by: Mark D Segal on October 04, 2010, 05:35:34 pm
From the sound of it, probably more than eight aren't really trustworthy. It's an interesting exercise.
Title: Re: Test of monitor calibrators
Post by: probep on October 05, 2010, 06:22:13 am
I've seen so many examples of bad monitor calibrators lately that I decided to do a test. I have a friend working at Eizo Sweden, so I could test my calibrators against their Minolta CA-210. As basICColor Display supports this instrument, we could calibrate a monitor and then verify the profile with that software. We then verified the monitor with my EyeOne Pro; the results were about the same. We then invited 15 photographers to come to us with their calibrators, and the results were not impressive. Out of these 15 calibrators, 5 gave really bad results. The worst showed an average Delta E of 30 and a max of 50; the others showed results more like an average Delta E of 15 and a max of 25. Of the rest, three gave an average around 10 and a max of 15. So, out of 15 calibrators, 7 showed good results and 8 gave results ranging from not so good to really bad.

The EyeOne Pros that showed up all gave good results. Maybe not so surprising, considering that the profile we validated was created with an EyeOne Pro. Of the 8 that didn't pass the test, four were EyeOne Displays, three were Spyder IIIs and one was a Monaco Optix. Not a very scientific test, but rather distressing.
Very interesting. But there are some questions.
What display(s) did you calibrate? Wide gamut, or an sRGB CCFL-backlit display?
Could you name 7 good sensors?
Title: Re: Test of monitor calibrators
Post by: stefohl on October 05, 2010, 07:52:15 am
We used an Eizo CG221, which is a wide-gamut monitor with CCFL backlighting, and an Eizo CG211, which doesn't cover the whole Adobe RGB gamut. But it made no difference which monitor we used; the results were about the same.

Title: Re: Test of monitor calibrators
Post by: stefohl on October 05, 2010, 08:07:12 am

Quote
Could you name 7 good sensors?


No, and that is perhaps the worst problem. The only calibrator where all samples passed the test was the EyeOne Pro. We had 4 of those during the test. We didn't have a ColorMunki, but I will test one during the week.
Title: Re: Test of monitor calibrators
Post by: probep on October 05, 2010, 10:20:18 am
No, and that is perhaps the worst problem. The only calibrator where all samples passed the test was the EyeOne Pro. We had 4 of those during the test. We didn't have a ColorMunki, but I will test one during the week.
Thank you. The ColorMunki's accuracy is a highly interesting subject.
Title: Re: Test of monitor calibrators
Post by: Czornyj on October 05, 2010, 03:24:02 pm
Thank you. The ColorMunki's accuracy is a highly interesting subject.
To my eye it seems to be as good as my i1Pro. I've been playing with a few ColorMunki units, and all of them gave results similar to my spectro.
Title: Re: Test of monitor calibrators
Post by: stefohl on October 06, 2010, 06:10:45 pm
Thank you. The ColorMunki's accuracy is a highly interesting subject.

Today I tested two ColorMunkis, and I'm happy to report that both of them were good, with an average Delta E < 2 and a max < 4 when I tested them on an Eizo CG221, a monitor with a gamut about the same as Adobe RGB. It seems the spectros are showing good results, both EyeOne Pros and ColorMunkis. The colorimeters are much less stable.
Title: Re: Test of monitor calibrators
Post by: neil snape on October 07, 2010, 03:32:46 am
I find the CM very reliable and consistent.

I am currently testing some new software, and the results are marred on the HP 2480zx.

I have the X-Rite HP APS for this monitor and it just plain doesn't work.

Somehow I suspect that this monitor doesn't play well with the video card. Sometimes the samples are sent and it flashes between samples. Same for power on, wake up etc.

So your test above is a fine example of how the variables can play.

It doesn't prove anything conclusive, though; the variables noted above tell me that there will continue to be bad monitor calibrations for reasons of drivers and hardware that don't work.

I wish I had Eizo, they still are striving for a standard. They also allow multiple devices with their software, a very good thing.
Title: Re: Test of monitor calibrators
Post by: Baxter on October 08, 2010, 05:27:18 pm
I've been using an Eye One Display since March 2004 and recently wondered what sort of improvement I'd see by replacing it with a more modern device. I have an Eizo SX2761W and a 3.5-year-old MacBook Pro to calibrate. Primary images come from a P45+.

Printer-wise I use an Epson 7800 and a Mitsubishi 9550 dye-sub, for which I have had custom profiles made. Making printer profiles for paper isn't really something I need to do.

Two Q's
1. Will a current device produce noticeably better results?
2. If so, which ones should be on my shortlist, please?

Many thanks

Bax
Title: Re: Test of monitor calibrators
Post by: stefohl on October 09, 2010, 03:31:21 am

Quote
Two Q's
1. Will a current device produce noticeably better results?

Of the 8 or 9 EyeOne Displays that I have tested, 50% had problems, with an average Delta E over 10. This isn't to say that 50% of all EyeOne Displays are bad, as many of them were chosen precisely because they had problems. I've seen older EyeOne Displays that gave good results and new ones that didn't. It seems they are sensitive to both humidity and heat.

Quote
2. If so, which ones should be on my shortlist please?

The spectros that we've tested have shown better results than the colorimeters. Of the colorimeters, the DTP-94 has shown the best results. A colorimeter I haven't tested is the new basICColor Discus, but it looks very promising.

Title: Re: Test of monitor calibrators
Post by: Lednam on October 09, 2010, 12:20:15 pm
This is interesting! The question ringing in my head right now is whether I can do some visual tests to confirm that my calibration is at least roughly right. Are there any good evaluation images out there? Thanks in advance!
Title: Re: Test of monitor calibrators
Post by: WombatHorror on October 28, 2010, 12:22:05 am
Today I tested two ColorMunkis, and I'm happy to report that both of them were good, with an average Delta E < 2 and a max < 4 when I tested them on an Eizo CG221, a monitor with a gamut about the same as Adobe RGB. It seems the spectros are showing good results, both EyeOne Pros and ColorMunkis. The colorimeters are much less stable.

None of the colorimeters you used work on wide-gamut displays (at least not without varying degrees of custom compensation). You need to test them on a standard-gamut display.

There is a link floating around where someone tested a whole slew of probes. The i1Pro was fairly decent (aside from shadows) on all monitor types. The DTP94b was fairly decent on CCFL sRGB screens. The early Spyder3 and the off-the-shelf i1D2 were brutal on CCFL sRGB, and forget it on other types. The later Spyder3s were mediocre.

(The special NEC-version i1D2s are factory calibrated; they are not the same as the off-the-shelf i1D2.)

Title: Re: Test of monitor calibrators
Post by: Steve Weldon on October 28, 2010, 05:26:12 am
I just want to be sure: when measuring Delta E, what software are you using, and how are you measuring it?

I don't understand this measurement as well as I'd like. I've always understood it to be the difference between target and actual colors. I usually look at my SVII Information Display window and take the number directly off the window in the White Point area where it's listed: target, calibrated, and then Delta E.

I ask because with the NEC colorimeter my numbers range from 0.03 for sRGB emulation to 0.65 at the other end, using the widest gamut, such as for photo editing. It seems the further I get from sRGB, the more the number rises. With the i1D2 colorimeter I get very close to the same, in no case above 0.8.

Am I reading this wrong?  Your numbers of 30-50 have got me asking..
Title: Re: Test of monitor calibrators
Post by: Czornyj on October 28, 2010, 12:10:22 pm
There's a difference between SpectraviewII and basICColor Display validation - SVII only validates the greyscale, while basICColor also validates a bunch of colors:
(http://www.colourmanagement.net/IMG/Display4-Validation.jpg)
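
That difference matters when comparing numbers: a greyscale-only validation can report sub-1 averages while saturated colours drift badly. A toy illustration in Python, using CIE76 and hypothetical (target, measured) Lab pairs rather than either vendor's actual patch set:

```python
import math

def de76(p, q):
    """CIE76 colour difference: Euclidean distance in Lab."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical (target, measured) Lab pairs
grey_pairs = [((20.0, 0.0, 0.0), (20.3, 0.1, -0.1)),
              ((50.0, 0.0, 0.0), (50.2, 0.0, 0.2)),
              ((80.0, 0.0, 0.0), (80.1, -0.2, 0.1))]
colour_pairs = [((52.0, 48.0, 39.0), (49.0, 41.0, 33.0)),
                ((32.0, -22.0, 10.0), (29.0, -16.0, 6.0))]

grey_avg = sum(de76(t, m) for t, m in grey_pairs) / len(grey_pairs)
colour_avg = sum(de76(t, m) for t, m in colour_pairs) / len(colour_pairs)
print(f"greyscale avg dE: {grey_avg:.2f}, colour avg dE: {colour_avg:.2f}")
```

A greyscale-only report could call this display a pass; only the broader patch set exposes the colour error.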
Title: Re: Test of monitor calibrators
Post by: digitaldog on October 28, 2010, 12:27:50 pm
There are a couple of things that would make the testing more solid (scientific). Did you find out the revision of each unit? They differ in how they measure data; IOW, a Rev A device and a Rev D device may not correlate. For example, we found that a Rev E iSis produced quite different data from all previous iSis units because it has been updated. Did you know that different companies come up with the final data differently? It seems shocking at first, but this is one reason why we recently saw X-Rite introduce a new metric (the XRGA standard). So while it was useful to use a better reference-grade unit (the Minolta), there could be some differences here just due to the way two companies report the data. Having a higher-end X-Rite reference instrument would make the test results a bit more accurate.
Quote
There's a difference between SpectraviewII and basICColor Display validation - SVII only validates the greyscale, while basICColor also validates a bunch of colors:
Can you specify the colors, or are they a fixed set? Being able to specify the colors is useful. Many “validation” processes we see are fixed, and often the companies select colors for which it is easy to produce low deltas.

Lastly, while we all hope our instruments produce “accurate” and, more importantly, consistent data, the ultimate goal for a display, one that's very difficult to measure in terms of success, is how well the prints and the display visually match.
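
On the fixed-set point: an outside check only needs a patch list the calibration software never saw. A minimal sketch of generating such a set (hypothetical, not how PatchTool or any vendor builds its targets), seeded so the identical patches can be re-displayed for each instrument:

```python
import random

def random_patch_set(n, seed=42):
    """Generate n random 8-bit RGB validation patches plus a neutral grey
    ramp, so the set isn't limited to 'easy' colours."""
    rng = random.Random(seed)  # seeded: every run yields the same set
    patches = [(rng.randint(0, 255), rng.randint(0, 255), rng.randint(0, 255))
               for _ in range(n)]
    # Neutral ramp: greyscale errors dominate any perceived colour cast.
    patches += [(v, v, v) for v in range(0, 256, 51)]
    return patches

patches = random_patch_set(20)
print(len(patches), "patches, e.g.", patches[0])
```

Each instrument then measures the same displayed patches, and the dE between its readings and the reference instrument's readings is computed externally.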
Title: Re: Test of monitor calibrators
Post by: stefohl on October 28, 2010, 12:30:04 pm
None of the colorimeters you used work on wide-gamut displays (at least not without varying degrees of custom compensation). You need to test them on a standard-gamut display.

There is a link floating around where someone tested a whole slew of probes. The i1Pro was fairly decent (aside from shadows) on all monitor types. The DTP94b was fairly decent on CCFL sRGB screens. The early Spyder3 and the off-the-shelf i1D2 were brutal on CCFL sRGB, and forget it on other types. The later Spyder3s were mediocre.

We have done the test on both the Eizo CG211 and the CG221. The 221 has a gamut about the same as Adobe RGB; the 211 has a smaller gamut. We have now tested more than 20 calibrators, new and old: EyeOne Displays, Spyder IIIs, ColorMunkis, DTP94s and EyeOne Pros. On average, about 30-40% of the Displays and Spyders failed, and they failed on both the small-gamut and the wide-gamut monitors. It would be interesting to read the link you are referring to.

Stefan
Title: Re: Test of monitor calibrators
Post by: stefohl on October 28, 2010, 12:34:53 pm
I just want to be sure: when measuring Delta E, what software are you using, and how are you measuring it?


We used ColorNavigator's simple validation and compared the values we got from our reference calibrator with the values we got from the calibrator being tested. We also did the same thing in basICColor's software, because it supports the Minolta CA-210 that we used as our reference.
Title: Re: Test of monitor calibrators
Post by: stefohl on October 28, 2010, 12:44:13 pm
Did you know that different companies come up with the final data differently? It seems shocking at first, but this is one reason why we recently saw X-Rite introduce a new metric (the XRGA standard). So while it was useful to use a better reference-grade unit (the Minolta), there could be some differences here just due to the way two companies report the data. Having a higher-end X-Rite reference instrument would make the test results a bit more accurate. Can you specify the colors, or are they a fixed set? Being able to specify the colors is useful. Many “validation” processes we see are fixed, and often the companies select colors for which it is easy to produce low deltas.
I'm quite aware of the difficulties of doing this with a perfect result. I tried to find a way to spot the calibrators that gave a decent result and those that failed badly, meaning that they showed an average of more than Delta E 5 or a maximum of more than Delta E 10. We have tested EyeOne Pros of Rev A, B and D, and they all passed. And when we compared a monitor profiled with one of the calibrators that failed against a monitor profiled with one that passed, there was no problem deciding which was profiled with the bad calibrator.
Title: Re: Test of monitor calibrators
Post by: Czornyj on October 28, 2010, 12:45:57 pm
Can you specify the colors, or are they a fixed set? Being able to specify the colors is useful. Many “validation” processes we see are fixed, and often the companies select colors for which it is easy to produce low deltas.

Lastly, while we all hope our instruments produce “accurate” and, more importantly, consistent data, the ultimate goal for a display, one that's very difficult to measure in terms of success, is how well the prints and the display visually match.

It's a fixed set. My understanding of such a validation process is that it's only meant to verify that the calibration didn't go wrong for some trivial reason (a screen saver kicking in, etc.).

It would be interesting to read the link you are referring to.
http://lists.apple.com/archives/colorsync-users/2009/Nov/msg00175.html
Title: Re: Test of monitor calibrators
Post by: stefohl on October 28, 2010, 03:11:15 pm
It's a fixed set. My understanding of such a validation process is that it's only meant to verify that the calibration didn't go wrong for some trivial reason (a screen saver kicking in, etc.).
http://lists.apple.com/archives/colorsync-users/2009/Nov/msg00175.html

I normally use ColorNavigator's Fogra39 validation, as this gives me an idea of the quality not only of the calibration but also of the monitor's gamut. But here I used the simple validation because I just wanted a comparison of the measurements from the reference calibrator and the calibrator being tested. And I think the link you provided showed pretty much the same results that I have found.
Title: Re: Test of monitor calibrators
Post by: Steve Weldon on October 28, 2010, 11:18:57 pm
Thanks.. looking at the chart you posted, you're also well under 1 for most measurements. Curious what colorimeter you're using, and whether you think there's any way to relate the measurements I'm getting from SVII to basICColor's? There just seems to be a huge difference between what I'm getting and the measurements the OP is getting.

Would the SVII colorimeter being filtered to work with the NECs affect its Delta E? It generally measures less than my i1D2, but we're still talking about a measurement under 1, so I suppose not, but I thought I'd ask.

Is any of this software free, or does any of it include a 30-day trial? For fun I'd like to see what my devices get using the same testing software. Though really, I don't see what all the hoopla is about if a person is getting consistent measurements and the screen matches output; but in the area of color it seems I always have more to learn.