However, running the same test on the Canon 40D's RAW files, I saw no loss of IQ at 12-bit. That camera is noisy enough to make 14 bits unnecessary.
I suspect that would be true. I tried similar tests with my D700 whilst my D7k was in for repair, and was very surprised to find virtually no image-quality difference in the 11th stop between 12-bit and 14-bit, except for a slight additional graininess in the 12-bit image, a bit like the difference between an ISO 200 film and an ISO 400 film, though not as pronounced.
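For anyone wondering why the difference only shows up that deep in the shadows, here's a back-of-the-envelope sketch (my own arithmetic, not measured data): with a linear sensor, each stop down from clipping halves the number of raw code values available, so by the 11th stop a 12-bit file has only a couple of levels left, while a 14-bit file still has around eight.

```python
# Illustrative arithmetic only: for a linear sensor, the Nth stop below
# clipping spans raw values [full_scale / 2**N, full_scale / 2**(N-1)),
# so the code values available in that stop halve with each stop down.

for bits in (12, 14):
    full_scale = 2**bits - 1           # 4095 or 16383
    for stop in (11, 12):
        levels = full_scale / 2**stop  # distinct codes in that stop
        print(f"{bits}-bit, stop {stop}: ~{levels:.1f} levels")

# 12-bit, stop 11: ~2.0 levels  -> quantisation shows as extra 'grain'
# 12-bit, stop 12: ~1.0 levels
# 14-bit, stop 11: ~8.0 levels
# 14-bit, stop 12: ~4.0 levels
```

Of course, on a noisy sensor like the 40D's, the noise dithers away those coarse steps anyway, which fits what we're both seeing.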
I admit I was expecting to see a greater difference.
However, the D7k is a different kettle of fish. Not only is that slightly greater graininess apparent in the 12-bit shots in the 11th and 12th stops, but there also seems to be an unavoidable clipping of the blacks in 12-bit mode. No such clipping is apparent in the 14-bit shots from the D7k.
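One hypothetical mechanism for that (pure speculation on my part; Nikon may well use a separate 12-bit ADC readout rather than anything like this) is that if the 12-bit values were derived by simply discarding the two least-significant bits of a 14-bit signal, the faintest shadow codes would collapse to zero:

```python
import numpy as np

# Hypothetical illustration only: assume the 12-bit data is made by
# dropping the two least-significant bits of a 14-bit signal. Anything
# in the bottom few 14-bit codes then collapses downward, which would
# read as clipped or crushed blacks.

rng = np.random.default_rng(0)
deep_shadow = rng.integers(0, 8, size=10)  # faint 14-bit signal, codes 0-7

as_14bit = deep_shadow                     # kept at full precision
as_12bit = (deep_shadow >> 2) << 2         # truncate to 12-bit step size

print("14-bit:", as_14bit)
print("12-bit:", as_12bit)                 # codes 0-3 all become 0
```

Whatever the actual cause, the clipped blacks are visible in the files themselves, as the conversions below show.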
It goes without saying that I've applied no sharpening or noise reduction in the following ACR conversions. The tone curve is linear, blacks are at zero, and contrast is either 0 or -50, depending on the image.