I'd like to focus on just this point, for the sake of clarity. Specifically, what might be meant by "properly".
A colour profile/ICC-aware application on a wide-gamut profiled display will indeed display your image properly. By which I mean the greens, reds, etc. will display as accurately as the camera captured them, i.e. close to real life.
What it can't do is display colours that were never captured in the first place. Such as a shade of green or purple that sRGB cannot represent, because it is outside of sRGB gamut.
If that last sentence seems mysterious, please google "color gamut".
Note that Windows itself is not colour aware, only individual applications such as Photoshop, Lightroom etc.
cheers, martin
I perfectly understand everything you say, Martin. I have read dozens of articles totalling several hundred pages, and have a solid scientific/engineering background as well, so by now I would claim to at least know what a colour space and a gamut are. However, let's just say I know enough to know that this is a highly complex subject, so I am treading carefully. I also never ask a question unless I have done at least some basic research and have a fair idea of the answer in advance.
Now let's go back to my original question, and my particular situation and needs.
The majority of the general public shoot in sRGB JPEG. For the last 10 years I have also shot in sRGB JPEG, though I do care about image quality, take the trouble to own a reasonable-quality camera, and use it properly. For the purposes of this thread, I am deliberately wearing the hat of everyone who shoots in sRGB JPEG, which is in fact the majority.
My old screen has died, so I need to buy a new one, and on the face of it, it would be nice to buy one of the new extended gamut monitors, such as the Dell U2410.
You are exactly right to question what I mean by displaying my sRGB JPEGs "properly", and in retrospect I can see how my use of the term was confusing.
The key issue here for me, is whether I will be able to exploit the benefit of an extended gamut monitor when viewing sRGB JPEGs, either those that I have taken in the past, or in the future.
Of course, if I set the monitor to emulate the sRGB colour space, then everything will be "OK" in the sense that the restricted gamut of colours in my files will be displayed correctly, exactly as they were on my old monitor. But in that case I am completely wasting my time and money buying the extended gamut monitor.
If the extended gamut monitor is not set to sRGB, and the software is totally dumb (like any part of Windows), then the software assumes (correctly) that my image file is sRGB, but does not know that it will be sending that image data to an extended gamut monitor. Presumably the dumb software then happily sends the image data to the monitor, exactly as it would for any other monitor. The result is that my restricted-gamut image data is mapped onto the full gamut of the monitor. To Joe Public the result may even look impressive, but the vivid, saturated colours being displayed will bear little resemblance to the colours in the original scene that was photographed. For anyone who cares a fig about colour fidelity, using the extended gamut of the display in this way would be a truly awful thing to do, almost a crime.
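To make that "crime" concrete, here is a small Python sketch using the published sRGB and Adobe RGB (1998) conversion matrices. I am assuming, purely for illustration, that the wide-gamut panel's primaries are close to Adobe RGB; the U2410 is not exactly that, but it is in the same ballpark. The sketch shows where a fully saturated sRGB green actually lands on such a panel when dumb software passes the pixel values through untouched:

```python
# Sketch: colour-unaware software sends sRGB pixel values straight to a
# wide-gamut panel. Assuming (for illustration) the panel's primaries are
# roughly Adobe RGB (1998), a fully saturated sRGB green gets lit up by
# the panel's far more saturated green primary.

# Standard linear RGB -> XYZ matrices (D65); the columns are the primaries.
SRGB_TO_XYZ = [[0.4124564, 0.3575761, 0.1804375],
               [0.2126729, 0.7151522, 0.0721750],
               [0.0193339, 0.1191920, 0.9503041]]
ADOBE_TO_XYZ = [[0.5767309, 0.1855540, 0.1881852],
                [0.2973769, 0.6273491, 0.0752741],
                [0.0270343, 0.0706872, 0.9911085]]

def xy_chromaticity(matrix, rgb):
    """CIE xy chromaticity of a linear RGB triplet in the given space."""
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in matrix)
    s = X + Y + Z
    return (X / s, Y / s)

green = (0.0, 1.0, 0.0)                            # the pixel value in the file
intended  = xy_chromaticity(SRGB_TO_XYZ,  green)   # what the file means
displayed = xy_chromaticity(ADOBE_TO_XYZ, green)   # what the panel shows

print("intended  xy: (%.2f, %.2f)" % intended)     # (0.30, 0.60)
print("displayed xy: (%.2f, %.2f)" % displayed)    # (0.21, 0.71)
```

The chromaticity jumps from (0.30, 0.60) to (0.21, 0.71), i.e. every green in the image is shown more saturated than the scene ever was.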
If the software is smart, and knows that the image file is sRGB, and also knows that the monitor has an extended gamut (and knows it is not set to emulate sRGB), then the situation is more interesting. However you look at it though, colours in the original scene that were out of the sRGB gamut, cannot be displayed correctly on the monitor, because the sRGB file simply does not contain the information about these out-of-gamut colours in the first place.
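For comparison, here is a sketch of what the smart (colour-managed) path effectively does, again assuming an Adobe-RGB-like panel purely for illustration: convert the sRGB values through XYZ into the monitor's space so the displayed colour matches the original. Note how the converted values sit comfortably inside [0, 1]; the panel's extra headroom is simply never called upon:

```python
# Sketch: what a colour-managed application effectively does with an sRGB
# pixel on an (assumed) Adobe RGB-like panel: convert through XYZ so the
# displayed colour matches the sRGB original.

SRGB_TO_XYZ = [[0.4124564, 0.3575761, 0.1804375],
               [0.2126729, 0.7151522, 0.0721750],
               [0.0193339, 0.1191920, 0.9503041]]
XYZ_TO_ADOBE = [[ 2.0413690, -0.5649464, -0.3446944],
                [-0.9692660,  1.8760108,  0.0415560],
                [ 0.0134474, -0.1183897,  1.0154096]]

def apply(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# Fully saturated sRGB red, in linear light.
srgb_red = [1.0, 0.0, 0.0]
adobe = apply(XYZ_TO_ADOBE, apply(SRGB_TO_XYZ, srgb_red))
print([round(c, 3) for c in adobe])   # roughly [0.715, 0.0, 0.0]
```

The monitor's red channel is deliberately driven at only about 71% so that the on-screen red matches the sRGB red of the file. Faithful, yes; but the wide gamut sits idle.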
"A colour profile/ICC aware application on a wide gamut profiled display will indeed display your image properly. By which I mean the greens, reds etc will display as accurately as the camera that captured them ie close to real life."
Therefore this statement is not, strictly speaking, true. The camera can and does capture a gamut exceeding even the best display, but the captured gamut is then compressed or clipped into the sRGB colour space when producing the sRGB JPEG with the result that the wide gamut of the display is wasted. The fault here is not with the camera, the JPEG compression, the display or the editing/viewing software. As far as I can see, the fault lies in the absurdly restrictive sRGB colour space. In a previous post I asked why Microsoft/HP elected to standardize on such an obviously restrictive colour space in the first place, and apparently no one knows. I don't know either, but I curse that they did.
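Here is a quick sketch of that compression/clipping step, again using the published matrices and treating Adobe RGB (1998) purely as a stand-in for the camera's wider gamut: a green the sensor can record comes out of the sRGB conversion with negative channels, and clipping them throws that saturation away for good:

```python
# Sketch: why an sRGB JPEG cannot carry the camera's full gamut. Take a
# green at the Adobe RGB (1998) green primary -- standing in for a colour
# a good sensor can record -- and convert it to sRGB. The red and blue
# channels come out negative, i.e. the colour lies outside sRGB, so the
# camera must clip (or compress) it before writing the JPEG. That step
# is lossy: after clipping, the extra saturation is gone for good.

ADOBE_TO_XYZ = [[0.5767309, 0.1855540, 0.1881852],
                [0.2973769, 0.6273491, 0.0752741],
                [0.0270343, 0.0706872, 0.9911085]]
XYZ_TO_SRGB = [[ 3.2404542, -1.5371385, -0.4985314],
               [-0.9692660,  1.8760108,  0.0415560],
               [ 0.0556434, -0.2040259,  1.0572252]]

def apply(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

wide_green = [0.0, 1.0, 0.0]   # Adobe RGB green primary, linear light
srgb = apply(XYZ_TO_SRGB, apply(ADOBE_TO_XYZ, wide_green))
print([round(c, 3) for c in srgb])                 # roughly [-0.398, 1.0, -0.043]

clipped = [round(min(1.0, max(0.0, c)), 3) for c in srgb]
print(clipped)                                     # [0.0, 1.0, 0.0]
```

Once the file holds (0, 1, 0), no amount of smart software downstream can recover the original, more saturated green.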
Every way you look at it then, the extended gamut of the monitor cannot be usefully exploited when displaying an sRGB JPEG, and the problem has nothing to do with how smart or colour aware the software/viewer is. I would like to be wrong on this point, but unfortunately everything I have said appears to be correct.
Given that the overall majority of people shoot in sRGB JPEG, and will never be interested in going RAW, this is surely a very significant problem. As things stand, the overall majority of people have no incentive to buy an extended gamut monitor, because they will not be able to exploit the extended gamut. FWIW, the majority who shoot sRGB JPEG won't be able to use the full gamut of the better inkjet printers either, for the same reasons. What a pathetic situation. I reiterate my previous conviction that the industry is in a total mess, which is hardly anything new if you look back through the evolution of Microsoft Windows, for example.
If anything I have said here is factually wrong, then please, please, tell me.
Colin