
Author Topic: Monitor calibration sensor evaluations  (Read 145156 times)

Ethan_Hansen

Monitor calibration sensor evaluations
« on: April 29, 2011, 06:06:03 pm »

We have been working on a thorough evaluation of monitor measurement hardware. We used a reference-grade spectroradiometer (Photo Research PR-730) to compare sensor performance on a variety of monitors. All were IPS displays, ranging from entry-level sRGB-gamut models (e.g. NEC 231wmi, Dell U2311H) with CCFL backlights to Adobe RGB-compatible wide-gamut models with either CCFL backlights (e.g. NEC PA-241W, Eizo CG243W) or RGB LED backlighting (HP LP2480zx). We measured multiple samples of each sensor to quantify inter-instrument agreement on a white background. Each monitor was then measured at a 6500K, 150 cd/m2 white level, at middle gray, and at the lowest usable black level as measured by ColorEyes Display: roughly the level where RGB (0, 0, 0) is distinguishable from (1, 1, 1).
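(For anyone who wants to sanity-check the white target: converting the 6500K / 150 cd/m2 aim point into XYZ is simple arithmetic. A minimal Python sketch, assuming the standard D65 chromaticity of x = 0.3127, y = 0.3290 - our actual targets came from the instrument software, not from this code:)

    # Convert a CIE xyY white target into XYZ tristimulus values.
    # Assumes the standard D65 chromaticity; illustrative only.
    def xyY_to_XYZ(x, y, Y):
        X = x * Y / y
        Z = (1.0 - x - y) * Y / y
        return X, Y, Z

    print(xyY_to_XYZ(0.3127, 0.3290, 150.0))
    # -> approximately (142.6, 150.0, 163.3)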

The colorimeters measured were the X-Rite DTP-94, the X-Rite Eye-One Display 2, and the Datacolor Spyder 3 Elite. These were compared to the X-Rite Eye-One Pro spectrophotometer and, for the sake of experimentation, to the venerable Spectrolino. We measured a minimum of 10 samples of each of the above sensors. We had access to only a single ColorMunki; while that meant we could not compare unit-to-unit stability, the accuracy measurements were interesting.

For the full details, read the article on our site.

To summarize: the DTP-94 and the spectrophotometers had the best inter-instrument agreement. Any given DTP-94 or i1Pro can be counted on to provide consistent performance, be it good or bad. The Spyder 3 showed more variability. We only compared units manufactured after mid-2009; older Spyders were all over the place in performance. Trailing the pack was the Eye-One Display. There was over twice as much average unit-to-unit variability between Eye-One Displays as between the two most different DTP-94 sensors. The two most different i1 Displays (out of 16 total) differed by 14 dE-2000 in the color they reported as monitor white. That's a lot. If you have multiple monitors to calibrate and use an Eye-One Display, be sure to calibrate them all with the same sensor.
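To make "inter-instrument agreement" concrete: for each monitor, take the white reading from every sensor sample and look at the worst pairwise dE-2000. A minimal sketch of that bookkeeping using the colormath package - the Lab values below are invented for illustration, not our data:

    from itertools import combinations
    from colormath.color_objects import LabColor
    from colormath.color_diff import delta_e_cie2000

    # Lab values each unit reported for the same monitor white
    # (invented numbers, for illustration only).
    whites = [
        LabColor(100.0, -0.2, 0.4),
        LabColor(99.6, 1.1, -2.3),
        LabColor(100.1, 0.5, 3.1),
    ]

    # Worst-case disagreement between any two units.
    worst = max(delta_e_cie2000(a, b) for a, b in combinations(whites, 2))
    print(f"worst pairwise dE-2000: {worst:.1f}")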

We next measured accuracy as referenced to our PR-730 on each monitor at white, middle gray, and black. The results were nicely summarized by comparing accuracy on a standard gamut (96% sRGB, 66% Adobe RGB coverage) CCFL monitor and a wide-gamut (168% sRGB, 115% Adobe RGB) LED-backlit screen.

Here we can see the strengths and weaknesses of each instrument. Spectrophotometers, with the exception of the ColorMunki, have no problems measuring lighter values on any display type. Problems occur when luminance values drop down into the mud. The standard-gamut monitor had a black-point luminance of 0.19 cd/m2; the wide-gamut panel's was a mere 0.11 cd/m2. Most of what the spectrophotometers recorded at these low levels was measurement noise. Monitor profiles made from these devices exhibited color tinges in what should have been neutral shadow areas.

We only had one ColorMunki unit at our disposal; its performance was comparatively poor. I cannot say whether the Munki we measured was the runt of the litter or whether our results are indicative of less-than-stellar performance for the product line. Given that the ColorMunki is marketed as an entry-level device, the performance we saw may not be surprising. We saw consistently high error levels on all monitors with the Munki. If your livelihood depends on how accurately your monitors render images, the ColorMunki is not the best choice.

Among the colorimeters, the DTP-94 turned in consistently good performance on standard-gamut displays and equally consistent poor performance on all wide-gamut models. The Spyder 3 turned in the best wide-gamut performance of any of the standard colorimeters, essentially equal to what we measured on sRGB-gamut monitors. Inter-instrument variability is a problem with the Spyder 3, however: although the average accuracy was good, any individual unit can stray from that average, and few photographers have a dozen Spyders at hand to average out the differences. Overall, I would rate the Spyder 3 as the best option we evaluated for profiling wide-gamut displays.

The Eye-One Display trails the pack. If we took only the best-performing 50% of the i1 Display units - ones clustered within +/- 2 dE-2000 of each other - the story would be different. All the i1 units were relatively new, the oldest being manufactured in 2008, the newest this year. Our first thought was that we might be seeing the effects of the internal plastic filters aging. This is a problem with the i1 Display - and any other colorimeter using plastic filters - and another advantage of the DTP-94 with its glass filter set. We had a few early, pre-production i1 Displays courtesy of GretagMacbeth, and the readings from these units were indeed very different from the rest. For the newer pucks, however, there was no correlation between i1 age and color response. Based on our sample size, you have a 50/50 chance of getting a good i1 Display.

The last set of sensors we looked at were OEM-branded Eye-One Displays tuned to the characteristics of a particular monitor. We had one each for the HP LP2480zx and NEC PA-241W. Both of these i1s turned in excellent performance on their respective monitors. Using a tuned sensor on any other monitor, however, was an invitation to disaster; the results were ugly. Again, with a sample size of one device each, I cannot say whether these were particularly good examples of the i1 Display family or whether the OEM tuning tames the unit-to-unit variability seen with standard Eye-One Displays. Quato sells tuned versions of the DTP-94 with their (seriously) high-end Intelli Proof monitors. Having neither a Quato monitor nor one of their sensors, I cannot comment on their performance.

Update May 23, 2011: New data available for OEM-calibrated Eye-One Display 2 sensors. The i1D2 can be individually calibrated using a reference spectroradiometer. Doing so reduces mean unit-to-unit variability by almost half (7.0 dE-2000 mean unit-to-unit variation for stock i1D2s vs. 3.7 dE for OEM-calibrated units).
« Last Edit: May 23, 2011, 12:37:26 pm by Ethan_Hansen »

Iliah

Re: Monitor calibration sensor evaluations
« Reply #1 on: April 29, 2011, 06:50:52 pm »

Thank you, Ethan. What is one to expect with the i1 Pro if the black point is set not to the lowest neutral, but to 0.5 cd/m2?

Ethan_Hansen

Re: Monitor calibration sensor evaluations
« Reply #2 on: April 29, 2011, 07:01:33 pm »

> Thank you, Ethan. What is one to expect with the i1 Pro if the black point is set not to the lowest neutral, but to 0.5 cd/m2?

Iliah - excellent point, and one I should have mentioned. We looked into this with a couple of monitors. Setting black to 0.5 - 0.8 cd/m2 decreased the mean measurement error from ~10 dE to ~5. That is a big improvement if you are willing to live with lighter black levels. This is a valid option, particularly for anyone printing on fine art papers or other stock with relatively low print contrast.
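The trade-off is easy to put numbers on. A back-of-the-envelope sketch, assuming the 150 cd/m2 white point from our tests and the black levels mentioned in this thread:

    # Contrast ratio and CIE L* for several black-point choices
    # at a 150 cd/m2 white point. Illustrative arithmetic only.
    white = 150.0
    for black in (0.11, 0.19, 0.5, 0.8):
        t = black / white
        # CIE L*; the linear segment applies at these low levels.
        L = 116 * t ** (1 / 3) - 16 if t > (6 / 29) ** 3 else 903.3 * t
        print(f"black {black:.2f} cd/m2: contrast {white / black:.0f}:1, L* = {L:.1f}")

Raising black from 0.11 to 0.5 cd/m2 drops the contrast ratio from roughly 1360:1 to 300:1, but only costs a couple of L* steps - which is why it can be a sensible trade against ~5 dE of spectro noise.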

Mark D Segal

Re: Monitor calibration sensor evaluations
« Reply #3 on: April 29, 2011, 08:35:11 pm »

Ethan, thank you very much for this informative analysis and the detailed report backing it up on your website. I have one question: do you think your results would have been influenced one way or another by the software used for the tests - specifically, had you used BasicColor rather than ColorEyes Display, or had you used NEC's SpectraView II or Eizo's own software, both of which only make matrix profiles?
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

Iliah

Re: Monitor calibration sensor evaluations
« Reply #4 on: April 29, 2011, 08:44:49 pm »

Something I think may be worthwhile mentioning:

> BasicColor

BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.

Ethan_Hansen

Re: Monitor calibration sensor evaluations
« Reply #5 on: April 29, 2011, 08:49:46 pm »

Mark,

The end goal of this project is a long-overdue revision to our monitor calibration tools ratings. It seemed a good idea to first determine what the capability of each sensor was, independent of the software used to drive it. Also, we have a really good spectroradiometer on long-term loan, so this was a reason to make use of it.

The only use of commercial software in our tests was in determining an appropriate minimum black point. All the monitors used were capable of darker black levels, but setting them there made for plugged-up shadows. We started out manually adjusting each monitor's output, but ColorEyes gave essentially the same results in an automated manner. Other calibration software certainly can optimize black levels, but ColorEyes proved closest to laboriously hand-tuned black points. The differences were usually not significant to the end results: moving the black luminance by 0.05 cd/m2 (the spread we saw between ColorEyes' and other software's targets) did not make any real difference to the sensor error levels we saw.

The actual measurements were made without any color management being performed and using either OEM drivers and home-brew software or hacks into the Argyll code to control the output.

Mark D Segal

Re: Monitor calibration sensor evaluations
« Reply #6 on: April 29, 2011, 08:52:58 pm »

> Something I think may be worthwhile mentioning:
>
> > BasicColor
>
> BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.

What is a "somewhat different beast"? Does it bite, smile, growl? :-)
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

Mark D Segal

Re: Monitor calibration sensor evaluations
« Reply #7 on: April 29, 2011, 08:54:37 pm »

> Mark,
>
> The end goal of this project is a long-overdue revision to our monitor calibration tools ratings. It seemed a good idea to first determine what the capability of each sensor was, independent of the software used to drive it. Also, we have a really good spectroradiometer on long-term loan, so this was a reason to make use of it.
>
> The only use of commercial software in our tests was in determining an appropriate minimum black point. All the monitors used were capable of darker black levels, but setting them there made for plugged-up shadows. We started out manually adjusting each monitor's output, but ColorEyes gave essentially the same results in an automated manner. Other calibration software certainly can optimize black levels, but ColorEyes proved closest to laboriously hand-tuned black points. The differences were usually not significant to the end results: moving the black luminance by 0.05 cd/m2 (the spread we saw between ColorEyes' and other software's targets) did not make any real difference to the sensor error levels we saw.
>
> The actual measurements were made without any color management being performed and using either OEM drivers and home-brew software or hacks into the Argyll code to control the output.

Understood. Thanks Ethan.
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

Iliah

Re: Monitor calibration sensor evaluations
« Reply #8 on: April 29, 2011, 08:58:52 pm »

You can find the basICColor DISCUS Reference Manual here:
http://www.basiccolor.de/assets/Manuals/Manual-DISCUS.pdf

Alan Goldhammer

Re: Monitor calibration sensor evaluations
« Reply #9 on: April 29, 2011, 09:38:39 pm »

Ethan,

Thanks for the review.  If I am reading your post correctly (I have not had a chance to read the full article), sourced i1 Displays from monitor manufacturers (such as NEC) seem to have better tolerances when used on that company's monitor.  I guess this is because of the specifications that were supplied to X-Rite and the specific application.  I know the NEC sensors are designed for the wide-gamut monitors and to be used with SpectraView (which I use).  What I don't know is what the variance in tolerance is between various OEM-designed sensors.  Does NEC do any kind of testing prior to shipping, or do they rely on X-Rite here?  I know you didn't test using SpectraView, but I find this all pretty worrying since these are marketed as "precision" instruments designed to give good color control.  I'm unsure whether that is the case.

Ethan_Hansen

Re: Monitor calibration sensor evaluations
« Reply #10 on: April 29, 2011, 10:08:12 pm »

> Ethan,
>
> Thanks for the review.  If I am reading your post correctly (I have not had a chance to read the full article), sourced i1 Displays from monitor manufacturers (such as NEC) seem to have better tolerances when used on that company's monitor.  I guess this is because of the specifications that were supplied to X-Rite and the specific application.  I know the NEC sensors are designed for the wide-gamut monitors and to be used with SpectraView (which I use).  What I don't know is what the variance in tolerance is between various OEM-designed sensors.  Does NEC do any kind of testing prior to shipping, or do they rely on X-Rite here?  I know you didn't test using SpectraView, but I find this all pretty worrying since these are marketed as "precision" instruments designed to give good color control.  I'm unsure whether that is the case.

The answer is: I don't know. The two OEM-adjusted sensors we have are an NEC unit sold with the SpectraView package and HP's DreamColor unit. Our assumption is that these two Eye-One flavors are adjusted differently. The NEC sensor performs well on the PA monitor, but is grossly inaccurate on sRGB monitors, on the HP (which uses LED backlighting rather than CCFL), and even on wide-gamut Eizo displays that are also CCFL-lit. The HP puck behaves similarly: excellent performance on the monitor it is sold with, lousy on all others.

If these two OEM flavors of the i1 are actually identical, that would imply a truly worrying lack of consistency. We asked both HP and NEC for details; none were forthcoming. Without a reasonable sampling of OEM pucks to characterize, there is no way to be sure. X-Rite, to the best of my knowledge, has not published accuracy and resolution specs for the Eye-One colorimeters as they do for the i1Pro spectros.
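For what it's worth, the usual way a colorimeter is matched to a specific backlight type - and, I would guess, the mechanism behind the OEM tuning, though neither vendor confirmed it - is a 3x3 correction matrix fitted against a reference instrument. A minimal numpy sketch with invented readings:

    import numpy as np

    # Paired XYZ measurements of the same patches (rows: R, G, B, white):
    # what a colorimeter reported vs. what a reference spectroradiometer
    # reported. All values invented for illustration.
    colorimeter = np.array([
        [35.1, 19.2,  1.8],
        [28.9, 60.3,  8.1],
        [17.2, 10.5, 88.4],
        [81.0, 90.2, 98.1],
    ])
    reference = np.array([
        [36.0, 19.8,  1.7],
        [28.0, 59.0,  7.9],
        [18.1, 11.0, 91.2],
        [82.1, 89.8, 100.8],
    ])

    # Least-squares fit of M such that colorimeter @ M ~= reference.
    M, *_ = np.linalg.lstsq(colorimeter, reference, rcond=None)

    corrected = colorimeter @ M  # apply M to all subsequent readings

A matrix fitted to one spectral power distribution is simply wrong for another, which would explain the ugly cross-monitor results. If I understand it correctly, Argyll's colorimeter correction matrices work along these same lines.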

Scott Martin

Re: Monitor calibration sensor evaluations
« Reply #11 on: April 29, 2011, 11:09:17 pm »

Fun testing, Ethan! I'm a testing geek too. I like these hardware tests relative to a reference device. I think it would be interesting to now compare the calibrated results from one device used with a bunch of different applications (DispCal/Argyll, Spyder3Elite, i1Profiler, ColorEyes Display Pro, etc.). Which software produces the best results with that one device?

After that, it would be interesting to look at what combinations of software and hardware are yielding the best results. What combination of devices and software applications produces the best results across a wide variety of displays?

And then start comparing the calibration times (like 1+ hour for DispCal/Argyll's high-quality mode vs. under 2 minutes for i1Profiler) and see which combination provides the best calibration for the time spent. Is it worth spending an hour to calibrate a display? Or will a 10-minute calibration provide 99% of the benefit?

As I see it, the question is: when displays are calibrated with a variety of devices and applications, which *combination* consistently yields the best results within a reasonable timeframe? This is the question that I've been trying to answer. And I think I've got it.

Obviously the number of possible combinations (calibration devices, calibration apps, displays) is huge. There are some trends that can help us narrow things down. The EyeOneDisplay, for example, sucks - but we've known that for a long time, right? The DTP94 has been a long-time top performer but should have limitations with wide-gamut monitors. The Spyder3 is perhaps the most modern, versatile and affordable colorimeter, with broad support. EyeOnePros are good, commonly available, broadly supported, affordable spectros, but we've seen issues in the past with the shadows. There are other options that are a bit more expensive but don't offer compelling reasons to consider them. IPS displays can more or less be lumped into three categories: low-cost ~sRGB CCFL, higher-cost ~aRGB CCFL, and various LED displays that tend to have larger gamuts. So it's not like we really need to test 100 displays. Nonetheless, people like myself who travel around the country calibrating displays and other devices will eventually get hands-on with hundreds of displays with whatever calibration system they go with, and look for trends along the way.

We can use QA tools to statistically measure the accuracy of a calibrated monitor. In addition, we can visually look for smoothness in gradations, especially in the shadows. Beyond that, I particularly like to visually analyze how well a display profile performs with soft proofing! This last step is often the toughest challenge for a display profile, and some clearly perform better than others.
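If anyone wants to try the gradation check themselves, a synthetic ramp is trivial to generate. A quick sketch using numpy and Pillow (any image editor can produce the same thing):

    import numpy as np
    from PIL import Image

    # Full-range horizontal gray ramp for eyeballing shadow smoothness
    # after calibration; look for banding or color tinges near black.
    width, height = 1024, 256
    ramp = np.linspace(0, 255, width).astype(np.uint8)
    Image.fromarray(np.tile(ramp, (height, 1)), mode="L").save("gray_ramp.png")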

I've been focusing my testing on the Spyder3Elite and EyeOnePro devices using ColorEyes Display Pro, DispCal/Argyll, Spyder3Elite, and i1Profiler. From the testing I've done thus far, I have to say I'm pretty blown away and completely surprised by what I'm seeing from i1Profiler with an EyeOnePro. Not only does the process take less than 2 minutes (!!) but the shadows and smoothness are incredible, rivaling even DispCal's hour-long calibration process. i1Profiler's QA analysis reports lower average Delta E variations with i1Profiler's calibration than with the others. I've been frustrated with shadow detail from the i1 Pro in the past, but i1Profiler is doing a better job than I've seen from other solutions.

Ethan, you wrote: "No software wizardry will improve a spectrophotometer's accuracy at low light levels to be competitive with a colorimeter." But that's exactly what I'm seeing with i1Profiler (to my surprise as much as yours, I'm sure!). Suddenly the shadows are all there, statistically and visually, and the performance with soft proofing is incredible.

So this is fun stuff. I'd love to hear your results should you attack the question above, regardless of methodology. I've got lots of tests to continue expanding upon, and perhaps I will come to different conclusions later as my methodology continues to evolve. I hate to say it, but there's more stuff on the horizon that keeps all of this a moving target. For now, though, I think there's lots of "under the hood" algorithmic mojo happening in i1Profiler that's worth getting excited about. Let me (us) know what you think!

Cheers and thanks again Ethan!
Scott Martin
www.on-sight.com

Ethan_Hansen

  • Full Member
  • ***
  • Offline Offline
  • Posts: 114
    • Dry Creek Photo
Re: Monitor calibration sensor evaluations
« Reply #12 on: April 30, 2011, 02:03:19 am »

Scott,

You understand our motivation perfectly. We are out to update our reviews of monitor profiling systems. Looking at hardware-agnostic packages (ColorEyes, BasICColor, CalPC, etc.) that support a huge range of sensors is daunting. Yes, it is pretty cool to be able to hook a Minolta CS-2000 up to BasICColor or our PR-730 to CalPC, but this doesn't tell you much about real-world performance. (If any of you photographers can financially justify hanging $20K+ of sensor off your monitor for calibration purposes, I'll sell my stake in Dry Creek and work as your gofer in a heartbeat).

What we wanted to establish was which sensors we should be looking at to be able to review systems in a reasonable length of time. Our methodology is straightforward. We use the unbiased eye of the spectroradiometer to check white and black point accuracy, grayscale neutrality and adherence to the desired gamma curve, and accuracy of the RGB primaries. The next judgement is purely subjective. Several of us flip through a series of synthetic and actual images to judge how well a profile performs. Lather, rinse, repeat on the next monitor.
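The gamma-adherence part of that boils down to a power-law fit on the measured gray ramp. A minimal sketch of the idea - the readings below are fabricated to follow a 2.2 curve between 0.2 and 150 cd/m2, not actual data:

    import numpy as np

    # Normalized gray-ramp input levels and the luminance (cd/m2)
    # reported for them (fabricated values).
    signal = np.array([0.1, 0.2, 0.35, 0.5, 0.7, 0.9])
    measured = np.array([1.15, 4.5, 15.1, 32.8, 68.5, 119.0])
    Y_white, Y_black = 150.0, 0.2

    # Fit Y = Y_black + (Y_white - Y_black) * signal**gamma by taking
    # the slope in log-log space after removing the black offset.
    norm = (measured - Y_black) / (Y_white - Y_black)
    gamma = np.polyfit(np.log(signal), np.log(norm), 1)[0]
    print(f"effective gamma: {gamma:.2f}")  # ~2.2 for these numbers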

I have been impressed with i1Profiler as well. X-Rite indeed managed to coax commendable performance out of a short measurement cycle. I assume they are using a method better than the simple averaging in Argyll to tease a signal from the measurement noise. (Serious late-night geek digression: the standard error of a measurement sample goes down with the square root of the sample size. In English, this means that averaging 4 measurements cuts your random variability in half vs. 1 sample. To reduce it by half again, you need 16 measurements, then 64, then 256 - a factor of four for every further halving. You can see why DispCal's high-quality mode takes an hour to run.)
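(If you don't believe the square-root law, it is easy to simulate - a minimal sketch:)

    import numpy as np

    # Averaging n noisy readings shrinks the random error by sqrt(n),
    # so each halving of the error costs 4x the measurements.
    rng = np.random.default_rng(0)
    true_value, noise_sd = 100.0, 1.0
    for n in (1, 4, 16, 64, 256):
        means = [rng.normal(true_value, noise_sd, n).mean()
                 for _ in range(10_000)]
        print(f"n={n:>3}: observed SE = {np.std(means):.3f}, "
              f"theory = {noise_sd / np.sqrt(n):.3f}")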

Back to reality... Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however. We still visually see color crossovers where none exist with a profile made with, for example, ColorEyes and a DTP-94 (highlights and saturated colors are an entirely different matter).

Statistical measurements of a profile's performance made with the same sensor and software used to create the profile are problematic. Others have argued that this is an exercise in an activity that probably should not be, shall we say, brought up in a PG-rated forum. My take is that there is useful information to be gleaned from these profile validations, but the results need to be viewed with caution. Yes, comparing dE values can aid in convincing people who insist on calibrating to 5000K that 6500K is more compatible with how modern monitors actually work. The same goes for setting a white luminance too low on any LCD other than an Eizo. Likewise, seeing the blue primary dE increase steadily is a warning sign that the backlight is failing. But looking at the reported dE values for a grayscale can be misleading: we measured a black-point dE-2000 of over 8 when the profiling software reported all values as being less than 1 for i1Pro-generated profiles.

We'll see where this all leads. On one hand, there is the hybrid approach used by Argyll and CalPC: creating a correction matrix for the primaries and white point with a spectrophotometer, which is then applied to the profile built from colorimeter measurements. On the other hand, measurement technology is improving. OEM-tuned sensors are available for specific monitor technologies, while improved sensors such as the Discus are coming out at (semi-)affordable price points.

One last comment. The high-quality mode in Argyll/DispCal has a potential gotcha: temperature drift. The i1Pro is fairly sensitive to temperature. Even under controlled ambient temperature there can be shifts from the i1 heating up as it rests on the screen. We saw this firsthand when trying ridiculously long integration times to see what the absolute noise floor was for an i1Pro. We started with a warmed-up monitor and saw the i1 readings progressively change over the course of 20 minutes. Our reference PR-730 is both temperature-compensated and was being used in non-contact mode. After the first 20 minutes, the i1 readings did not shift for the remainder of our 24-hour test. Moral of the story: preheat your i1Pro by hanging it on your screen while you surf the web for 20 minutes or so before you calibrate.
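If you want to verify when your own unit has settled, the logic is nothing fancier than watching consecutive white readings until they stop moving. A sketch - read_lab() is a hypothetical stand-in for whatever your instrument's SDK provides, not a real API call:

    import time

    from colormath.color_objects import LabColor
    from colormath.color_diff import delta_e_cie2000

    def read_lab():
        """Hypothetical stand-in: return monitor white as a LabColor
        from your measurement SDK. Not a real API call."""
        raise NotImplementedError

    previous = read_lab()
    while True:
        time.sleep(60)  # one reading per minute
        current = read_lab()
        drift = delta_e_cie2000(previous, current)
        print(f"drift over the last minute: {drift:.2f} dE-2000")
        if drift < 0.2:  # readings have settled
            break
        previous = current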

jaapb

Re: Monitor calibration sensor evaluations
« Reply #13 on: April 30, 2011, 02:47:37 am »

Ethan, Scott,

Very interesting stuff, guys.
I thought I knew it all about profiling monitors and papers, but since delving into ArgyllCMS - quite geeky though it is - my level of understanding has risen considerably. When reading threads like these, I now see there is a whole new dimension to discover.
Thank you, guys.

Jaap

Lilien

Re: Monitor calibration sensor evaluations
« Reply #14 on: April 30, 2011, 06:04:25 am »

Ethan,

Thanks for the informative review, although as an owner of a ColorMunki (with Argyll) I'm, needless to say, not very happy with your findings.

Have you taken into consideration that the ColorMunki sensor is, according to X-Rite, not calibrated to the same (former GretagMacbeth) standard as, e.g., the Eye-One Pro? The ColorMunki was their first instrument calibrated to the new XRGA standard, so similar readings are not to be expected.

Have you tested whether Argyll's high-resolution (3.33 nm) and adaptive (integration time) modes give any benefit with the ColorMunki?

I also found that my ColorMunki seems to be sensitive to temperature changes. Even a self-calibration does not bring it back to the same reading after a substantial temperature change (e.g. several minutes on the screen).

Thanks again and regards,
 Juergen
« Last Edit: April 30, 2011, 07:32:52 am by Lilien »

Ernst Dinkla

Re: Monitor calibration sensor evaluations
« Reply #15 on: April 30, 2011, 06:54:06 am »

Ethan,

Very nice testing. It gives me a good guide to improving my monitor calibration and profiling with the mix of displays here.

The Spyder 3 Elite's general quality on both monitor gamuts surprised me.

Another thing I have been curious about for some time is whether the colorimeters' ability to measure low luminance values better than spectrophotometers also shows up in reflective measurements. I gather that the illumination in spectrophotometers is made high enough that the reflected light still stays out of the danger zone. I recall that the spectrophotometers integrated in the HP Z-series printers measure the dark patches longer to get better readings, with less noise and/or hysteresis affecting them. The distance between the media and the lamp/sensor in that printer-integrated spectrophotometer has to be considered - more light will be used - but maybe that extra time for dark patches was needed as well.

With kind regards, Ernst

New: spectral plots of 250+ inkjet papers:

http://www.pigment-print.com/spectralplots/spectrumviz_1.htm

Mark Paulson

Re: Monitor calibration sensor evaluations
« Reply #16 on: April 30, 2011, 09:09:43 am »

> Something I think may be worthwhile mentioning:
>
> > BasicColor
>
> BasICColor display should be on version 5 soon, with many changes and improvements compared to version 4. Their DISCUS colorimeter is a somewhat different beast compared to i1 Display 2 and DTP-94.

I would be happy to send my DISCUS to Ethan for testing.

Scott Martin

Re: Monitor calibration sensor evaluations
« Reply #17 on: April 30, 2011, 09:29:03 am »

> Our tests of i1Profiler do indeed give more neutral and open shadows with an i1Pro than most other software. The calibration ain't perfect, however. We still visually see color crossovers where none exist with a profile made with, for example, ColorEyes and a DTP-94 (highlights and saturated colors are an entirely different matter).

I'll suggest that we both continue testing this very comparison. CEDP with the DTP94 has been a rock-solid component of my business for years, so comparing just that to i1Profiler with the i1Pro is a particularly important comparison for me. Unlike you, I'm seeing better results with i1Profiler, so I think you and I should both look further into this.

I agree that statistical measurements should ideally not be done within the same application that performed the calibration. While I'm impressed with i1Profiler's QA using the ColorChecker approach, I'm open to other suggestions.
Scott Martin
www.on-sight.com

Iliah

Re: Monitor calibration sensor evaluations
« Reply #18 on: April 30, 2011, 10:50:18 am »

Dear Ethan,

> it is pretty cool to be able to hook a Minolta CS-2000 up to BasICColor

Now it is the Konica Minolta Display Color Analyzer CA-210. I'm getting reliable measurements with it down to 0.2 cd/m2 (they state 0.1 cd/m2, but I do not see it that way), which raises the same question about the black point. Generally I stay at 0.5 cd/m2 (my targets are proofs and press).

keith_cooper

Re: Monitor calibration sensor evaluations
« Reply #19 on: April 30, 2011, 04:58:05 pm »

Interesting to see results similar to the more qualitative ones I got when looking at the Discus recently, where a Spyder 3 topped an i1 Display 2 and an i1 Pro, although all fell behind the Discus.

BTW, I have to take my hat off to those with the wherewithal to test all these multiple devices - my patience runs out far too quickly ;-)
bye for now -- Keith