Update on L* versus gamma 2.2 calibration:
I just purchased the Datacolor Spyder3 StudioSR kit at a generous discount at the Photo Plus East show in NY. As a current owner of ColorEyes Display Pro running a Monaco Optix XR colorimeter (aka X-Rite DTP94), the newly purchased Spyder3 Elite colorimeter (also supported by ColorEyes Display Pro) gave me a chance to revisit the L* versus gamma 2.2 discussion using the same software but with a different sensor. It also gave me a chance to try a different calibration software package on my system.
Previously in this thread, I reported better calibration using the L* aimpoint on my system compared to G2.2 (MacBook Pro driving an Apple Cinema Display, i.e., the previous fluorescent ACD, not the latest LED version). I suggested that, all theory aside, real-world hardware/software compatibility may dictate what is best. So, now that I have a Spyder3 Elite instrument, I used ColorEyes on the same hardware setup, with a D50 aimpoint as before, and revisited the G2.2 versus L* aimpoint results. Surprise... L* didn't work as well with the Spyder3 unit. G2.2 was the best overall calibration, as validated by the MonitorChecker target I provided URL access to earlier in this thread. This result indicates that the optimum calibration settings are both hardware and software dependent, notwithstanding all the theoretical constructs, and that they even depend on one's choice of calibration device.
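For anyone following along who wants to see what the two aimpoints actually ask of the display, here is a minimal Python sketch of the two target tone-response curves. It assumes standard 8-bit input levels and the usual CIE L* constants; it isn't taken from either software package, just an illustration of the math behind the debate:

```python
# Minimal sketch of the two tone-response aimpoints discussed in this thread.
# Both map an 8-bit input level (0-255) to relative luminance (white = 1.0).
# The crossover constants are the standard CIE ones; nothing here comes from
# ColorEyes or the Spyder software.

def gamma22_luminance(level):
    """Target relative luminance under a gamma 2.2 calibration."""
    return (level / 255.0) ** 2.2

def lstar_luminance(level):
    """Target relative luminance under an L* calibration: lightness L*
    rises linearly with input, and luminance follows the inverse CIE
    L* function."""
    L = 100.0 * level / 255.0
    if L > 8.0:
        return ((L + 16.0) / 116.0) ** 3
    return L / 903.3  # 903.3 = 24389/27, the CIE low-end slope

for level in (32, 64, 128, 192, 255):
    print(f"{level:3d}  gamma2.2={gamma22_luminance(level):.4f}"
          f"  L*={lstar_luminance(level):.4f}")
```

Running that shows the crux of the debate: the L* curve opens up the deep shadows (finite slope near black, where gamma 2.2 flattens out) and sits slightly darker through the midtones, so the two aimpoints really do produce visibly different grayscales on the same display.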
Next, I installed the Datacolor Spyder3 Pro software, which in advanced mode also supports both G2.2 and L* calibration. This is a more affordable package than ColorEyes. The result? It was unable to achieve an excellent calibration of my ACD to either L*/D50 or G2.2/D50, but resetting to a G2.2/native-whitepoint calibration did produce an excellent result, albeit not at the whitepoint I desired. The more expensive ColorEyes Display Pro software could calibrate my ACD to either D50 or the native whitepoint with excellent results using the Monaco XR colorimeter, but produced lesser-quality results with the Spyder3 unit and L* calibration, whereas G2.2 was excellent with the Spyder3 colorimeter on my system.
My conclusion: at least for 8-bit video technology, the best calibration state is a delicate interaction among display, video card, calibration software, calibration sensor, and desired gamma/whitepoint. This optimum state must be determined empirically using a good independent test target. Relying on the monitor calibration software to "validate" itself doesn't get you there: both software packages returned excellent validation results in all cases, whereas the real impact on my MonitorChecker target showed that very real differences existed. The "best" calibration therefore requires some experimentation; not all calibration aimpoints produce equally accurate results.
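To make concrete what I mean by an independent check rather than the calibrator's own report, here is a rough sketch of the idea: compare measured gray-patch luminances directly against the aimpoint, in lightness terms. The readings below are placeholders, not real measurements from my setup, and the function names are mine, not from any of the packages above:

```python
# Independent validation sketch: how far are measured gray patches from
# a gamma 2.2 aimpoint, expressed as a lightness (delta-L*) error?
# The 'readings' values are placeholders -- substitute luminances measured
# off an independent target such as the MonitorChecker file.

def luminance_to_lstar(Y):
    """CIE relative luminance (white = 1.0) to L* lightness."""
    if Y > 216.0 / 24389.0:
        return 116.0 * Y ** (1.0 / 3.0) - 16.0
    return Y * 24389.0 / 27.0

def delta_lstar_g22(level, measured_Y):
    """Lightness error of a measured patch versus its G2.2 target."""
    target_Y = (level / 255.0) ** 2.2
    return abs(luminance_to_lstar(measured_Y) - luminance_to_lstar(target_Y))

# (input level, measured relative luminance) -- placeholder data
readings = [(64, 0.048), (128, 0.230), (192, 0.540)]
for level, Y in readings:
    print(f"level {level:3d}: delta L* = {delta_lstar_g22(level, Y):.2f}")
```

A check like this, done against the same target regardless of which software or sensor produced the profile, is what exposed the differences that both packages' built-in validation happily glossed over.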