Mark and Andrew,
I think you may have to explain this in more detail. It's not making much sense to me.
You seem to be implying that a monitor with a high contrast ratio will automatically bestow a high contrast upon the image being displayed, as though the contrast of the image is wholly dependent upon the CR spec of the monitor.
As I understand it, this is only partly the case, but I'm always open to a persuasive argument. If an image has an inherently low contrast and a low dynamic range, it will appear as such on the monitor irrespective of whether the monitor has a high or a low contrast ratio, provided the monitor's CR is high enough to accommodate the dynamic range of the image.
However, the opposite is not the case. If a monitor has too low a contrast ratio, after adjusting brightness to the recommended level for calibration, say 100 nits, then blacks will likely not be black and images will tend to appear washed out.
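The arithmetic behind that washed-out effect is easy to sketch. With white pinned at a calibration target, the black level a panel can reach is simply white divided by its static contrast ratio. The numbers below are illustrative, not taken from any particular monitor:

```python
import math

def black_level(white_nits: float, contrast_ratio: float) -> float:
    """Black luminance implied by a given white level and static contrast ratio."""
    return white_nits / contrast_ratio

def usable_stops(contrast_ratio: float) -> float:
    """Display dynamic range expressed in photographic stops (log base 2)."""
    return math.log2(contrast_ratio)

white = 100.0  # nits, a common calibration target
for cr in (300, 1000, 3000):
    print(f"CR {cr}:1 -> black = {black_level(white, cr):.3f} nits, "
          f"~{usable_stops(cr):.1f} stops")
```

At 300:1 the "black" is a third of a nit, which in a dim room reads as dark grey; at 3000:1 it drops an order of magnitude, which is where the deep shadows stop looking milky.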
If two monitors have the same maximum brightness level, the one with the higher contrast ratio would be preferred, all else being equal. It's better to have a contrast ratio which is unnecessarily high than one which is not high enough, just as it's better to have a camera with a high DR capability even though for some, or even most, applications that high DR might not be needed.
It is understood that all images have to be processed before printing in order to fit the gamut and the contrast within the limits of the print. Both camera and monitor generally have a much higher contrast and DR capability than ink and paper.
The inherent weakness of the LCD has always been the presence of a backlight, which makes it difficult to achieve a good black. For this reason only the best and most expensive LCDs could match the qualities of a moderately priced CRT, in which individual phosphors can be switched off completely to render a truer black.
The difference between a monitor with a high CR and one with a low CR, but both having equal maximum brightness, is the ability to separate subtle shades of near black. If that capability of the monitor with the higher CR is of no practical use because of the ambient lighting conditions of your working environment, then no harm done. The issue is, does your monitor lend itself to accurate calibration?
Can either of you give me an example of a monitor which cannot be accurately calibrated because its actual contrast ratio is too high? It's understood that there's often a lot of hyperbole in CR figures for sales purposes, and that one should not always believe such inflated numbers.