We have put a lot of thought and research into this idea. The lab grade equipment used during the factory characterization is almost certainly more accurate than consumer level devices are at measuring the RGB primaries. While we could certainly modify the software to adjust the gamut to what your sensor thinks is the correct RGB primaries, chances are the factory calibration is more accurate, even after the monitor has been in use for thousands of hours.
Thanks.
I was wondering a bit about that.
sRGB gamut:
One curious thing is how the readouts from an i1 Pro and a NEC i1D2 compare in the sRGB emulation mode, which I custom tuned according to what my i1 Pro said:
i1pro:
R .639, .330
G .301, .599
B .152, .061
while the i1d2 said:
R .642, .328
G .303, .600
B .152, .061
For blue, both probes read back exactly the same results.
For green there is a .002 difference in x, and in y they give basically the same result, just .001 off.
For red things are a bit more in disagreement, at .003 and .002.
But it seems surprising that they would agree almost completely on the G and B primaries if the probes were really so poor at reading primaries (although it could just be chance, of course).
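Just to make that probe-vs-probe comparison concrete, here is a quick sketch in plain Python (the numbers are simply the readings quoted above, nothing more):

```python
# sRGB-emulation primary readouts, as quoted above.
i1pro = {"R": (0.639, 0.330), "G": (0.301, 0.599), "B": (0.152, 0.061)}
i1d2  = {"R": (0.642, 0.328), "G": (0.303, 0.600), "B": (0.152, 0.061)}

# Per-primary (|dx|, |dy|) between the two probes, rounded to 3 places.
delta = {p: (round(abs(i1pro[p][0] - i1d2[p][0]), 3),
             round(abs(i1pro[p][1] - i1d2[p][1]), 3))
         for p in "RGB"}

print(delta)  # red disagrees the most; blue agrees exactly
```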
To get these values I had to switch from the factory programming of .640,.330 .300,.600 .150,.060 to:
R .638, .333
G .291, .602
B .150, .057
So the i1 Pro and i1D2 read the blue the same, but differ by .003 in y compared to the factory value.
For green they read nearly the same (a .002,.001 delta from each other), but the deltas to the factory values are .010,.003 and .012,.002.
And for red they differ from each other by .003,.002, and from the factory by .001,.003 and .004,.005.
So the i1 Pro reads a bit closer to the factory settings than the NEC i1D2 does, the difference really only showing up for red. Overall it reads a bit closer to the i1D2 than to the factory settings, mostly due to the x-coordinate of green (the probes differ by .002 from each other but by .010 and .012 from the factory setting). Does that imply that even tuned consumer probes simply tend to misread the x-coordinate of green primaries badly, and in exactly the same way? Or, since they both miss it in almost exactly the same way, that the set drifts over time more than thought, or that my set's factory measurement of the green x-coordinate was botched?
I guess it's hard to say with so few data points. I will measure it with an i1 Display Pro later this week and see what it says. Not that it will answer the question, but it would be interesting.
Native Gamut:
Another odd thing I noticed: in native gamut mode, while the factory settings, the i1 Pro, and the NEC i1D2 were at least in the same ballpark for most values, for some reason the probes measured the x of G way differently from the factory, and, again, in exactly the same way as each other. Is it likely that both would read so far off in exactly the same way?
i1pro:
R .679, .309
G .214, .690
B .152, .056
NEC i1D2:
R .683, .306
G .214, .691
B .152, .057
So we have deltas of .004,.003 for R, .000,.001 for G, and .000,.001 for B, so the probes are in remarkable agreement with each other on G and B, even for the native gamut coordinates, though they are starting to differ somewhat in their assessment of R.
the factory settings give:
R .678,.312
G .200,.694
B .152,.054
The deltas to the i1pro and i1d2: .001,.003 and .005,.006 for R; .014,.004 and .014,.003 for G; .000,.002 and .000,.003 for B.
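Those deltas fall straight out of the quoted numbers; a small sketch just to show the arithmetic:

```python
# Native-gamut chromaticities as quoted above: factory values and probe readings.
factory = {"R": (0.678, 0.312), "G": (0.200, 0.694), "B": (0.152, 0.054)}
i1pro   = {"R": (0.679, 0.309), "G": (0.214, 0.690), "B": (0.152, 0.056)}
i1d2    = {"R": (0.683, 0.306), "G": (0.214, 0.691), "B": (0.152, 0.057)}

def delta(a, b):
    """Per-component (|dx|, |dy|) between two xy chromaticity pairs."""
    return tuple(round(abs(p - q), 3) for p, q in zip(a, b))

pro_vs_factory = {p: delta(i1pro[p], factory[p]) for p in "RGB"}
d2_vs_factory  = {p: delta(i1d2[p],  factory[p]) for p in "RGB"}
```

`pro_vs_factory["G"]` comes out as (0.014, 0.004): the green-x disagreement dwarfs every other component.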
So once again the i1 Pro reads closer to the factory settings, and it agrees reasonably closely for B and R, but once again the x-coordinate of G is different, way different this time, and both probes are in exact agreement on the degree of difference. (When the set and probes were newer I thought the i1 Pro and i1D2 read a little more closely on average, so perhaps my i1D2 is starting to drift a touch.) Here the i1 Pro agrees more with the factory than with the i1D2 for R, but more with the i1D2 for B, and much more so for G.
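As an aside (my own back-of-the-envelope, not anything from the factory data): xy distances aren't perceptually uniform, so one way to gauge how big that green-x disagreement really is would be to convert both chromaticities to the more uniform CIE 1976 u'v' plane using the standard conversion:

```python
import math

def xy_to_uv(x, y):
    """Standard CIE 1931 xy -> CIE 1976 u'v' chromaticity conversion."""
    d = -2 * x + 12 * y + 3
    return 4 * x / d, 9 * y / d

# Native green primary: factory value vs i1 Pro reading, as quoted above.
u1, v1 = xy_to_uv(0.200, 0.694)  # factory
u2, v2 = xy_to_uv(0.214, 0.690)  # i1 Pro
duv = math.hypot(u1 - u2, v1 - v2)
print(round(duv, 4))
```

That works out to a bit under 0.006 in u'v' terms; if the commonly quoted ~0.004 just-noticeable-difference yardstick is anywhere near right, that's a real but not enormous shift.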
So does this mean the factory setting for the x-coord of G got mis-measured on my set at the factory?
That the set does drift over time, noticeably for at least some components of some primaries?
That consumer probes are really poor at reading the x-coordinate of green primaries? (In that case, if they all have a bias toward over-reading the x component of G, why don't the manufacturers apply a negative correction to green x readings? Well, I suppose the probe would then need a lot of logic, and some guessing about what it was measuring, otherwise the correction would throw off all the other x measurements. They could add driver code so the probe is told when it is reading a green primary, or perhaps scale the correction up the closer the measured location gets to the greens.) I wonder if I can find my old data from when I first got the set, and whether the factory readings for the x of G were closer then or whether it was the same story.
I wonder if the closer agreement between the probes than of either with the factory settings means that the i1 Pro at least, and maybe a few others, actually reads the primary locations (other than perhaps the x-coordinate of G) better than the factory settings do, at least after this much usage?
Anyway, I obviously have very few data points to go on, and I'm doing a lot of perhaps rather wild, if at least slightly educated, speculation here.
Have you noticed any tendency for probes to mess up readings of the x-coordinate of the green primary? Any drift in the coordinates as the monitor ages?
(I actually would expect the native primary locations, at the least, to remain rather constant over time, but I don't really know much about how backlight spectra change with age, or whether the colored filters over the subpixels change much over time.)
Once again thanks and sorry for all the babbling here.