Yes, quite a number do, including SpectraView and at least the last several generations of X-Rite products.
Oops, you are correct.
Which is why we have the Native Gamma option. Why alter the behavior to sRGB, as you propose, when you can leave it alone and just characterize it with the profile?
Why not leave it at native gamma? Maybe because it's nice to have the 95% of programs that are not color aware look right? It is certainly nicer when using the monitor to watch TV and movies, play games, edit in Premiere Pro, watch videos, browse in IE instead of Firefox, or work on the desktop and in 3D programs. So why shouldn't someone calibrate to the sRGB TRC for some of that stuff and to gamma 2.2 for the rest? Why leave it to just a few color aware programs and have to trust the CMM in all of them?
And in this case, he has a wide gamut monitor, and most likely the only reason he is even in the sRGB emulation mode is to deal with non-managed software. For the most part, when you use an sRGB emulation mode on a wide gamut monitor it is because you are dealing with non-color-aware programs; otherwise you would probably want to leave it in native gamut mode.
Of course it will.
Hmm, yeah, I guess it can also be stored in table form, which would work.
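If the point here is that the piecewise sRGB curve can't be captured by a single gamma number but can be stored as a sampled table, here's a minimal sketch (my own illustration, not anything from the thread; the 256-entry size is an arbitrary choice) of what such a table would hold:

```python
# Minimal sketch: sample the piecewise sRGB transfer curve into a 256-entry
# table, the kind of 1D lookup a profile can store when a single gamma
# number won't do.

def srgb_to_linear(v):
    """sRGB-encoded value in 0..1 -> linear light in 0..1 (piecewise sRGB EOTF)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

table = [srgb_to_linear(i / 255.0) for i in range(256)]

# The low end makes it obvious this isn't a pure power curve:
# the first few entries follow the linear segment (v / 12.92), not v ** gamma.
print(table[:4])
```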
I will have to look into it more for native gamut photo editing modes. I'm not sure it makes any difference on my PA, though, since I don't see any banding thanks to the high bit internal LUT anyway. Perhaps it might improve contrast ratio or something.
Yes, as I said, you can see the difference in all kinds of calibration targets. So what? Calibration should be set to provide a visual match to a print in most cases (that is what most users desire).
The gamma isn’t going to play a role here in ICC aware applications.
Yes, but what about for all the non-ICC aware stuff?
(and in the OP's case, if he is in the sRGB emulation mode, he very well may be in that mode specifically because he plans to use non-ICC aware programs)
For years, prior to OS X, Mac users calibrated to a gamma of 1.8 because outside ICC aware applications, the OS made that silly assumption. But inside ICC aware applications, it made no difference. Smart users calibrated their displays to 2.2 because that is far closer to the native gamma of the display, there was no visual difference in ICC aware applications, and closer to native meant less banding (the reason Native has been an option for years in better software products). IF you cared about how non-ICC aware apps previewed their inaccurate color (because they are not ICC aware), you used 1.8; otherwise everything appeared a tad dark (yes, there was a visual difference, so what?). But IN ICC aware applications, 1.8 or 2.2 produced the same previews, the former however introducing more banding in many display systems.
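To make the banding point concrete, here is a rough sketch (my own, using the 1.8 vs. 2.2 figures from the discussion above, not anyone's actual calibration data) of what happens when an 8-bit video card LUT has to bend a native ~2.2 panel toward a 1.8 target: some input codes collapse onto the same output code and some output codes get skipped entirely, which is where the visible banding comes from.

```python
# Rough sketch: build the 8-bit correction LUT that bends a native gamma-2.2
# response toward a target gamma, then count how many distinct output codes
# survive. The further the target sits from native, the more codes are lost.

def correction_lut(native_gamma=2.2, target_gamma=1.8):
    lut = []
    for code in range(256):
        desired = (code / 255.0) ** target_gamma               # light we want for this input code
        sent = round((desired ** (1.0 / native_gamma)) * 255)  # code the panel must actually receive
        lut.append(sent)
    return lut

for target in (2.2, 1.8):
    lut = correction_lut(target_gamma=target)
    print(f"target gamma {target}: {len(set(lut))} distinct output levels out of 256")
```

When the target equals native, the LUT is a straight pass-through and all 256 levels survive; push it to 1.8 and a chunk of them are gone.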
Again, what about when someone flips on a TV show or a movie, switches to a game, or does some browsing and looks at other people's images while using one of the many browsers that are not fully color aware?
And I don't think it is a "so what" if IE shows things with the wrong TRC. You can start telling people to lift their shadows a little and end up giving bogus advice.
As for banding and such, it depends on whether it's a monitor with an internal high bit LUT like the OP has, or someone relying on an 8-bit graphics card LUT. And if you are dealing with non-ICC aware software it's all moot, since you need it to be calibrated and not merely profiled.
You are preaching to the choir! If you carefully re-read what I wrote, you'll see I've gone out of my way to separate the language for a simple gamma curve from that for a TRC when discussing the two.
That's what I thought, but then you said something about how I insisted things need to be set to the 2.2 sRGB gamma instead, so....
I question, in this post, whether you'd even really want it (or can actually hit it). You seem to feel otherwise for reasons I don't yet understand.
I thought you were the one insisting it had to be hit since, if not, then why even bother with what the TRC is, etc.; maybe we misunderstood each other.
And you can hit it with some monitors. I have mine in sRGB emulation mode at the sRGB TRC and 85 cd/m^2 right at this moment, and it is just about at its minimum black level. The contrast ratio is a trace, TRACE, lower than at 120 cd/m^2, but so what? The black level is much deeper, my eyes are not getting burned out, and the calibration actually tested out a bit more uniform than the one made at 120 cd/m^2, so no, it has not gone all whacked out. I'm in a dim room right now; why would I want a high black level, a faded-looking image, and my eyes burned out?
First off, seeing a before and after difference tells us nothing about the quality of the "after" calibration: whether it meets sRGB specifications or matches a print. I don't know why you keep going back to seeing a difference between two settings. That is totally to be expected. Other than showing that there are differences, the results thus far say nothing further.
I keep going on about how you can see the difference between an sRGB image displayed with the sRGB TRC vs. a gamma 2.2 TRC because you said the difference was nothing, like dancing on the head of a pin, and that he shouldn't even bother setting his sRGB emulation mode to the sRGB TRC for general work. I was just saying that, to me at least, it is a lot more noticeable than that. Now all of a sudden you seem to say that the difference is noticeable.
But to propose that using that setting in some way makes your LCD display produce sRGB-specified output is not something I'm buying.
I'm saying that if you are switching from photo editing mode to sRGB emulation mode to do, say, some browsing on the web using IE, then setting it to the sRGB TRC does get you proper sRGB image display, while setting it to gamma 2.2 makes it a little bit off, and that with many, if not all, images the difference can definitely be seen easily enough even if it is not radically extreme.
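For anyone who wants to put a number on "a little bit off," here is a quick sketch (mine, not from either of us) comparing the piecewise sRGB curve against a pure 2.2 power law. The two track closely through the midtones and highlights but diverge quite a bit in the deep shadows, which is exactly where the difference tends to show up on screen.

```python
# Quick sketch: how far apart are the sRGB transfer curve and a pure 2.2
# power law? Compare the linear light each predicts for a few 8-bit codes.

def srgb_to_linear(v):
    """Piecewise sRGB EOTF: encoded 0..1 -> linear light 0..1."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(v):
    """Pure gamma 2.2: encoded 0..1 -> linear light 0..1."""
    return v ** 2.2

for code in (8, 16, 32, 64, 128, 192, 255):
    v = code / 255.0
    s, g = srgb_to_linear(v), gamma22_to_linear(v)
    print(f"code {code:3d}: sRGB {s:.5f}  gamma 2.2 {g:.5f}  ratio {s / g:.2f}")
```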