I'll answer as many queries as I can:
Pherold: I'm in the UK, so I'm using SpectraView Profiler (a rebadged basICColor product), not SpectraView II, which I don't have access to.
Mark DS: I initially tried DVI and the problem was the same; I switched to DisplayPort because the cable I had was longer and more convenient. It's the NEC cable that shipped with the monitor, used with the Monoprice converter. The computer is a Mac Pro 2008 (3,1) with a Radeon 4870. And yeah, I was disappointed about the 10-bit thing too.
Czornyj: Ordinarily I would agree with that, but it's a bit of a grey area to me with this particular monitor, given its internal LUTs.
If it were consistent, I would understand, but it isn't. Sometimes the colour looks right (in ANY app), sometimes it does not. Basically, things look OK in Lightroom or Photoshop but not in Firefox, and yes, I have enabled colour management in the browser options and set it to assume sRGB for untagged images. Having been accustomed to a profiled screen for many years, you know when something isn't right; it's not merely a question of my perceiving the extra capability of the newer screen.

Admittedly not a graphics app, but the most immediately obvious candidate is iTunes: the Cover Flow artwork is very badly oversaturated. Yet the other day it was perfect again, just like on my old monitor, without my having changed anything.
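For anyone wanting to double-check the Firefox setting I mean: the relevant hidden preferences live in about:config (or a user.js file). This is just the configuration I believe applies to Firefox builds of this era; the pref names are the standard gfx.color_management ones, not anything specific to my setup:

```javascript
// In about:config or user.js:
user_pref("gfx.color_management.mode", 1);  // 1 = colour-manage ALL images; untagged images treated as sRGB
// The default value, 2, only manages images that carry an embedded ICC profile.
user_pref("gfx.color_management.display_profile", "");  // empty string = use the OS display profile
```

A full browser restart is needed before mode changes take effect.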
I kind of had an expectation that, because this high-end monitor has the LUTs built into the monitor itself, the colour would look right in ANY app, since the conversion would happen on the basic video output rather than in the OS or the app. Maybe that expectation was wrong.
What is a grey area to me is whether this "hardware" calibration still requires a software profile in order to work, which option I should choose in the software, and what the real differences between them are. With the SpectraView Profiler software, the calibration/profiling options are:
- Hardware
- Hardware and Software
- Software
- No Calibration (profile only)
I've tried both options one and two, with no difference. Although I haven't tried it (yet), I have no doubt that if I were to profile this monitor with my old method (i1Match, equivalent to option 3 I guess) it would be fine, but I balk at the idea of having spent all this money on a high-end monitor with built-in 14-bit 3D hardware LUTs only to not use them...
I'm sure it's something I'm doing or not doing, or an incompatibility somewhere. I don't think the hardware is faulty; I just can't seem to get a handle on where the problem is.
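One diagnostic that might narrow it down: when an app oversaturates some images and not others, it's worth confirming whether the problem files are actually tagged with an ICC profile, since untagged images are exactly the ones whose handling varies between apps. Here is a rough sketch (my own, not from any calibration tool) that checks a PNG for an embedded profile using only the Python standard library; in a PNG the profile lives in an iCCP chunk:

```python
import struct
import zlib

def png_chunks(data):
    """Yield (chunk type, payload) for each chunk in a PNG byte stream."""
    pos = 8  # skip the 8-byte PNG signature
    while pos < len(data):
        length, ctype = struct.unpack(">I4s", data[pos:pos + 8])
        yield ctype, data[pos + 8:pos + 8 + length]
        pos += 12 + length  # 4 (length) + 4 (type) + payload + 4 (CRC)

def has_icc_profile(data):
    """True if the PNG embeds an ICC profile (an iCCP chunk)."""
    return any(ctype == b"iCCP" for ctype, _ in png_chunks(data))

def make_chunk(ctype, payload):
    """Assemble one PNG chunk with its CRC."""
    crc = zlib.crc32(ctype + payload) & 0xFFFFFFFF
    return struct.pack(">I", len(payload)) + ctype + payload + struct.pack(">I", crc)

# Build a minimal 1x1 untagged PNG entirely in memory to demonstrate.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 2, 0, 0, 0)   # 1x1, 8-bit RGB
idat = zlib.compress(b"\x00\xff\x00\x00")             # filter byte + one red pixel
png = (b"\x89PNG\r\n\x1a\n"
       + make_chunk(b"IHDR", ihdr)
       + make_chunk(b"IDAT", idat)
       + make_chunk(b"IEND", b""))

print(has_icc_profile(png))  # False: this file is untagged
```

If a file that looks wrong turns out to be untagged, the fault is in how that app assumes a colour space, not in the monitor's LUTs.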
Thanks for your replies.