But calibration is not really about accuracy, I think. It is about matching something - points in a standard colour space, prints from a printer, etc. After all, we deal mostly in a reflective real world, not a transmissive one, so a screen is very different from what we see around us.
Isn't it mostly about getting the screen to match the print device's output?
Please do not speculate in writing without a clear understanding of, and better yet some hands-on experience with, the subject in question. Doing so risks spreading incorrect information.
Monitor calibration is just adjusting the black-to-white (grey) axis to given targets:
1. White point - defines the color and brightness of white (255 in each of the R/G/B channels of the 8-bit range)
2. Consistency of that white point color from white to black
3. The gamma curve - how uniformly brightness increases from black to white, subjectively judged by where middle grey falls along the grey axis; in more consumer terms, the contrast curve (see the sketch after this list)
4. In some cases, the black point, both in terms of hue and brightness. In relation to the white point brightness, it defines contrast.
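To make the gamma target concrete, here is a minimal Python sketch of what a power-law target curve implies for the grey axis. The gamma value, the white luminance of 120 cd/m^2, and the simple power-law form are assumptions for illustration only; real calibration targets may instead use the sRGB tone curve or L*.

```python
# Sketch: luminance a power-law gamma target implies for each grey level.
# Assumes a plain gamma-2.2 target and a 120 cd/m^2 white (both made up).

def target_luminance(level, gamma=2.2, white_cdm2=120.0, black_cdm2=0.0):
    """Absolute luminance target for an 8-bit grey level."""
    norm = level / 255.0
    rel = norm ** gamma                      # power-law tone response
    return black_cdm2 + rel * (white_cdm2 - black_cdm2)

for level in (0, 64, 128, 192, 255):
    print(f"grey {level:3d} -> {target_luminance(level):6.2f} cd/m^2")
```

For a gamma-2.2 target, middle grey (level 128) lands at roughly 22% of white luminance; that is essentially what a calibration tool checks when judging whether the contrast curve hits the target.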
You can and should adjust the white point and gamma curve as close to the target values as possible using the OSD menu controls before calibrating the display with a spectrophotometer/colorimeter. The calibration performed with these devices then "equalizes" each channel so that it matches the target white point and gamma curve. A large gap between the display's actual white point and gamma curve and the targets therefore requires larger correction coefficients, which costs bit resolution; the result is non-linear behavior (read: bad gradients) in any non-grey (i.e. hue-carrying) color, and ultimately color inaccuracy.
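To illustrate the bit-resolution point, here is a rough Python sketch (using NumPy) of how an 8-bit correction curve merges input levels. The gamma values are invented for the example, and a real calibration LUT is built per channel from measurements rather than from a single power law, but the counting of surviving levels shows the effect.

```python
# Sketch: why large calibration corrections cost bit resolution.
# An 8-bit LUT that pushes a display from its native gamma toward the
# target collapses some input levels onto the same output level, so
# gradients band. Gamma values here are illustrative, not measured.
import numpy as np

def corrected_levels(gamma_actual, gamma_target=2.2, bits=8):
    """8-bit LUT output correcting a gamma_actual display toward gamma_target."""
    levels = np.arange(2 ** bits)
    norm = levels / (2 ** bits - 1)
    corrected = norm ** (gamma_target / gamma_actual)   # correction curve
    return np.round(corrected * (2 ** bits - 1)).astype(int)

for g_actual in (2.1, 1.8, 1.4):
    out = corrected_levels(g_actual)
    print(f"native gamma {g_actual}: {len(np.unique(out))} of 256 output levels survive")
```

The further the display's starting behavior is from the target, the more distinct levels collapse together, which is exactly the "deficiency of bit resolution" described above; pre-adjusting via the OSD keeps the correction, and therefore the loss, small.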