Since a rationale for "Native" targets is to minimize or completely eliminate the use of video card LUTs, it's hard to understand why you would complain. Obviously if your monitor's native state is unusable then you can't use Native targets...
Could you explain in more detail how you evaluate the profiles?
Minimize? Yes. Eliminate? Maybe in a perfect world, but then you wouldn't need calibration in the first place, no?
Just to recap:
TRC = Tonal Response Curve
Display profiles contain a gamma curve entry that software can use to prepare data for proper display. This gamma curve can either be a single point designating a pure gamma value, or multiple points describing a curve. Precision can be 16-bit.
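To make the "multiple points" case concrete, here is a minimal sketch of what a TRC curve entry conceptually holds: a pure gamma function sampled into 16-bit integer points. This is illustration only, not actual ICC file I/O; the gamma value and point count are assumptions.

```python
# Sketch: a TRC entry storing a gamma curve as sampled 16-bit points
# instead of a single gamma value. Illustration only -- no ICC I/O.

GAMMA = 2.2    # assumed target gamma
POINTS = 256   # assumed sample count; real profiles vary

def trc_curve(gamma=GAMMA, points=POINTS):
    """Sample y = x^gamma into 16-bit integers (0..65535)."""
    curve = []
    for i in range(points):
        x = i / (points - 1)
        curve.append(round((x ** gamma) * 65535))
    return curve

curve = trc_curve()
# Endpoints map 0 -> 0 and 1 -> 65535; everything in between follows x^2.2.
```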
VCGT = Video Card Gamma Type
In addition, profiles can contain a video card LUT entry. While it can describe a single gamma value, this entry usually describes an actual LUT. Precision can be 16-bit (as stored in the profile; the video card itself may or may not honor this precision).
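A minimal sketch of what such a vcgt-style entry holds: one 16-bit LUT per channel that remaps the input so each channel ends up behaving like the target gamma. The measured per-channel gammas below are hypothetical, and approximating a channel's response by a single gamma value is itself an assumption for illustration.

```python
# Sketch of a vcgt-style entry: one 16-bit LUT per channel.
# Each channel's measured response is approximated here by a single
# gamma value (hypothetical numbers); the LUT remaps input so the
# channel behaves like the target gamma.

TARGET_GAMMA = 2.2
MEASURED = {"r": 2.35, "g": 2.2, "b": 2.05}  # hypothetical measurements
SIZE = 256

def vcgt_lut(measured_gamma, target_gamma=TARGET_GAMMA, size=SIZE):
    """16-bit LUT such that lut(x)^measured ~= x^target."""
    lut = []
    for i in range(size):
        x = i / (size - 1)
        y = x ** (target_gamma / measured_gamma)
        lut.append(round(y * 65535))
    return lut

luts = {ch: vcgt_lut(g) for ch, g in MEASURED.items()}
# Green already matches the target, so its LUT is a straight identity ramp.
```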
There are a couple of possible scenarios for proper display preparation. The two relevant cases are:
1. Set a single gamma value in the TRC, and set the video card LUT such that the separate channels actually behave like that pure gamma value.
2. Set a specific channel behavior in the TRC, and leave the video card LUT linear.
For GMB, "native everything" apparently means scenario 2, except that GMB seems to put just a single gamma value in the TRC. The device obviously doesn't respond like a pure gamma curve, hence the bad calibration results, and hence the ill advice.
A variation on 1:
3. Find the closest gamma value for each individual channel, set this in the TRC, then adjust the video card LUT accordingly.
This is what I would call "Native Gamma", and it minimizes video card LUT corrections.
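Scenario 3 can be sketched as a per-channel curve fit: find the single gamma value closest to each channel's measured response (that goes into the TRC), so the video card LUT only has to carry the small residual. The measurement points below are hypothetical, and the log-log least-squares fit is just one reasonable way to find the "closest" gamma.

```python
import math

# Sketch of scenario 3: fit the closest pure gamma value per channel.
# The fitted value would go into the TRC; the video card LUT then only
# has to correct the residual. Measured points are hypothetical.

MEASURED = {
    # (input, measured output) pairs per channel, normalized to 0..1
    "r": [(0.25, 0.042), (0.5, 0.195), (0.75, 0.510)],
    "g": [(0.25, 0.047), (0.5, 0.218), (0.75, 0.531)],
    "b": [(0.25, 0.052), (0.5, 0.230), (0.75, 0.545)],
}

def fit_gamma(samples):
    """Least-squares fit of y = x^g, i.e. log y = g * log x."""
    num = sum(math.log(x) * math.log(y) for x, y in samples)
    den = sum(math.log(x) ** 2 for x, _ in samples)
    return num / den

gammas = {ch: fit_gamma(pts) for ch, pts in MEASURED.items()}
# Each fitted gamma lands near the channel's true behavior, leaving only
# small residual corrections for the video card LUT -- the point of
# "Native Gamma".
```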
A variation on 2:
4. Completely describe the device behavior in 3D luts, leave the video card lut linear.
You would think GMB does this for "native everything" with LUT-based profiles, but it doesn't. For some reason, the B2A tables (the relevant tables for the job at hand) are completely linear, and the matrix entry is (ab)used for normal matrix-based conversions. The gamma curves are pure gamma curves again, except now they are stored in the 3D LUT curve entry, and storing gamma curves in the 3D LUT curve can actually harm precision.

In addition, the tables are XYZ based, which is totally ridiculous for the task at hand, since the video pipeline behaves more or less perceptually. A Lab table would be more appropriate, and trying to use linear gamma data in profile tables to correct for possible perceptual pipeline deviations is simply IMPOSSIBLE!
(This is similar to the linear RAW data ETTR discussions we all have every now and then, except the linear RAW data is now described with 5-bit precision. Imagine that...)
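The precision analogy above can be demonstrated in a few lines: round-trip a deep shadow tone through 5 bits of linear encoding versus 5 bits of gamma (perceptual-ish) encoding. The specific tone value and gamma are assumptions chosen to make the point visible.

```python
# Sketch of the precision analogy: a dark tone stored in 5 bits of
# linear data vs 5 bits of gamma-encoded data. Values are illustrative.

LEVELS = 2 ** 5 - 1  # 5-bit: levels 0..31
GAMMA = 2.2

def roundtrip_linear(v):
    """Quantize a linear value to 5 bits and back."""
    return round(v * LEVELS) / LEVELS

def roundtrip_gamma(v):
    """Gamma-encode, quantize to 5 bits, decode back to linear."""
    enc = v ** (1 / GAMMA)
    q = round(enc * LEVELS) / LEVELS
    return q ** GAMMA

dark = 0.002  # a deep shadow tone in linear light
lin = roundtrip_linear(dark)  # collapses to 0 -- the tone is lost entirely
gam = roundtrip_gamma(dark)   # survives with a usable value
```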
So, unless I'm doing something seriously wrong, selecting LUT-based profiles in GMB is simply misleading, as far as I can tell. Thoroughly not recommended...