
Author Topic: Choosing a monitor  (Read 11765 times)

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« on: March 08, 2010, 03:22:31 am »

I am in the process of upgrading my home PC, and choosing a new monitor.

A 24" monitor seems like a good size, and I don't mind spending a bit more for a better quality consumer-grade monitor.

I shoot in JPEG, so am neither a fanatic nor professional, but nonetheless appreciate good image quality, as viewed on my monitor.

I almost never print my images, preferring to view them on my monitor.

The very cheapest monitors don't appeal because of the poor viewing angle, so I am leaning towards one of the better consumer models such as the Dell 2408WFP or U2410, which are both quite affordable.

These are both "extended gamut" monitors, which at first seemed like an advantage, with the ability to display a wider gamut, or range of colours. Granted, the colour accuracy may be terrible (by professional standards) straight out of the box, but a low-cost colorimeter can fix that up, to a standard that is very good by consumer standards.

I would like some informed advice as to whether my conclusions below are correct.

On further reading, it appears that the extended gamut is a complete waste of time for me (and the majority of consumers), because my JPEGs are encoded in the standard sRGB colour space. The much wider range of colours which my camera (Canon G10) is capable of recording is irreversibly thrown away when the image is stored as a JPEG. If I set the monitor to display an extended gamut, the result will presumably be that the restricted sRGB gamut of my JPEGs is mapped to the extended gamut, producing a horribly unreal, though doubtless vivid, image. If I set the monitor to emulate the sRGB colour space, then there is no benefit in buying a monitor with extended gamut.

Did I get that right? If I got that right, then I'm extremely pissed off with Microsoft and HP for developing the short-sighted and low quality sRGB standard, which has the effect of preventing the average consumer from exploiting the low cost extended gamut monitors that are now becoming available.

Would I also be right in saying that consumer grade inkjet printers have now developed to the stage where their gamut also often exceeds the sRGB gamut? If this is true, then we have the same problem, where the average photographic consumer who prefers to work with JPEGs cannot exploit the full colour gamut of their printer.  

Would I also be right in saying that this mess has come about through industry (largely Microsoft) short-sightedness, and almost complete disregard for consumer image quality? Would not most of this mess be avoided if Microsoft and everyone else adopted a universal and significantly wider colour space,  encompassing extended gamut screens and printers? Sure, this may require another couple of data bits, and our JPEGs would be slightly bigger, but negligibly so I would suggest, considering advances in hardware. We have apparently accepted an order of magnitude increase in megapixels and image file sizes over the last few years, and RAW files are MUCH bigger, so I don't buy the argument that a universal, larger colour space is impractical or undesirable because a few more data bits would be necessary.        

OK people, please tear me apart and tell me I am completely wrong, because right now I'm feeling annoyed in my belief that average photographers such as myself, who prefer to work with JPEGs, cannot exploit the benefits of today's extended gamut monitors.
Logged

walter.sk

  • Sr. Member
  • ****
  • Offline
  • Posts: 1433
Choosing a monitor
« Reply #1 on: March 08, 2010, 10:00:05 am »

Quote from: col
I shoot in JPEG, so am neither a fanatic nor professional, but nonetheless appreciate good image quality, as viewed on my monitor.  

OK people, please tear me apart and tell me I am completely wrong, because right now I'm feeling annoyed in my belief that average photographers such as myself, who prefer to work with JPEGs, cannot exploit the benefits of today's extended gamut monitors.
You don't make clear what you do when you "work with jpegs."  Do you mean that you import them from your camera and then view them on your monitor?  Or do you mean that you optimize the color, tonality, contrast and other parameters of the images?

Relegating those who use RAW to "fanatics or professionals"  suggests either disdain for "average" photographers who shoot RAW, or a gross misunderstanding of what shooting RAW is or isn't.

I suspect that your work with jpegs includes adjusting exposure, lightening up shadows to bring out detail, bringing some more color to washed out skies, adjusting overall contrast, straightening tilted horizons, cropping images and possibly toning down noise and sharpening the images.

It also sounds as if you don't know that you can shoot RAW and do no more than the above adjustments in the RAW processor if you so choose.

When you can get a terabyte of storage for $100, the size of a TIFF file, especially when reduced in pixel dimensions, should not make a big difference in a decision to shoot RAW or JPEG, but the ability to recover highlight detail and have a richer color and tone palette should.

I think that having used a wide gamut monitor to work on my images I would not go back to an sRGB monitor.  I believe that if you only use your images to view on a monitor, you would still enjoy them more in a larger color space and on a monitor that could do them justice.

Many photographers who shoot JPEG and know that there is a difference somehow have phobias about shooting RAW, or remember when RAW conversion included making choices about the curves and profiles involved in conversion.  Once they see that processing a RAW image can be as easy as adjusting a JPEG yet give better results, many photographers become RAW enthusiasts.
Logged

Paul Sumi

  • Sr. Member
  • ****
  • Offline
  • Posts: 1217
Choosing a monitor
« Reply #2 on: March 08, 2010, 10:38:28 am »

I use a wide gamut monitor with my computer system.  I *do* shoot RAW and I do make prints at home.  I would never go back to an sRGB screen for my main display.

That said, one of the disadvantages of using a wide gamut monitor is that it renders colors properly only in color managed applications. This issue comes up most often with web browsers, since not all are color managed.

One other thing - are you calibrating your current monitor? You will get the most out of either sRGB or wide gamut monitors with hardware/software calibration.

Paul
« Last Edit: March 08, 2010, 11:01:09 am by Paul Sumi »
Logged

LucDelorme

  • Newbie
  • *
  • Offline
  • Posts: 13
Choosing a monitor
« Reply #3 on: March 08, 2010, 11:06:36 am »

Does your G10 have the option of saving JPEGs in Adobe RGB?  Some cameras have that option.  If so, the extended gamut may be of value to you, and you will be able to view your pictures in a colour gamut that is closer to the one your camera can capture.  As mentioned previously, applications that are not colour-managed will not show colours accurately, however.  This will be a problem mostly in Internet Explorer, office applications and games.  The colours aren't horribly off, but the difference is visible.  This is a bit less of a problem with a Mac, since Mac applications are typically more colour-aware.

On the standard gamut front, take a look at the NEC EA231WMi.  It's a nice, affordable IPS monitor. Colour accuracy is good once calibrated.  Some have reported issues with panel uniformity, but the Dell units also seem to have the problem, as most consumer-level IPS panels do.

Cheers,

Luc
Logged

Paul Sumi

  • Sr. Member
  • ****
  • Offline
  • Posts: 1217
Choosing a monitor
« Reply #4 on: March 08, 2010, 11:11:19 am »

Quote from: LucDelorme
Does your G10 have the option of saving JPEGs in Adobe RGB?  Some cameras have that option.

I don't believe that the G10 has an Adobe RGB option for JPGs.  I'll have to check my G10 owner's manual to be sure.

Quote from: LucDelorme
The colours aren't horribly off, but the difference is visible.

True - this is seen most with flesh tones which are definitely more saturated.

Paul
« Last Edit: March 08, 2010, 11:14:37 am by Paul Sumi »
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Choosing a monitor
« Reply #5 on: March 08, 2010, 03:33:33 pm »

Quote from: col
On further reading, it appears that the extended gamut is a complete waste of time for me (and the majority of consumers), because my JPEGS are encoded in the standard sRGB colour space. The much wider range of colours which my camera (Canon G10) is capable of recording is irreversibly thrown away when the image is stored as a JPEG. If I set the monitor to display an extended gamut, the result will presumably be that the restricted sRGB gamut of my JPEGs is mapped to the extended gamut, producing a horribly unreal, though doubtless vivid, image.


You are correct on all points except that bit about the sRGB on a wide gamut display looking unreal. They will preview fine in an ICC aware application. Not so much outside them. If you are sure you’ll stick with sRGB JPEGs, there’s little reason to move to an extended gamut display.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« Reply #6 on: March 08, 2010, 08:44:23 pm »

Quote from: walter.sk
You don't make clear what you do when you "work with jpegs." [...]

Firstly thanks to everyone for your information and advice.

I'll answer the questions above, because I probably did not make my needs clear, and I also think it important to realize that the majority of consumers who own and use digital cameras have no interest in generating or working with RAW files. I realize that people on this site are enthusiasts or professionals, but I am deliberately looking at the bigger picture, where the majority of the general public do not and never will want to shag around with RAW.

Quote
You don't make clear what you do when you "work with jpegs."  Do you mean that you import them from your camera and then view them on your monitor?  Or do you mean that you optimize the color, tonality, contrast and other parameters of the images?

I always shoot in JPEG, and archive my pictures in JPEG. In 95% of cases I don't alter the out-of-camera picture in any way. In the 5% of cases that I do, the change will be minor, such as rotation, cropping or resizing to send on the net. It is very rare indeed that I either feel the need, or could be bothered, to change colour, contrast etc., and the limited and technically inferior methods for making such changes to a JPEG are just fine for me, given how infrequently I do it. Of course, I always keep the original JPEG, because any changes made irreversibly degrade the quality. Arguably I am barely an enthusiast, but being a scientist/engineer type, I do care about image quality more than your average happy-family-snaps photographer.

Sure, I know about RAW and the advantages it offers the enthusiast but, as explained, I would prefer not to be shagged with it. What I was NOT aware of until a few days ago was that standard JPEGs throw away the wide colour gamut captured by the camera, which arguably mattered little in the past, because in the past no one except an enthusiast or professional would own (or could afford) an extended gamut monitor. I was also not aware until recently that at least some of the gamut potentially available in the better consumer inkjet printers is not usable when printing sRGB JPEGs with their restricted gamut.


Quote
I think that having used a wide gamut monitor to work on my images I would not go back to an sRGB monitor.  I believe that if you only use your images to view on a monitor, you would still enjoy them more in a larger color space and on a monitor that could do them justice.
Unfortunately, now that such monitors are cheaply available, I'm fairly sure that I also would prefer to view my images in a larger colour space that could do them justice. Even more unfortunately, it is impossible for me to do so for the thousands of JPEG images that I have taken over the last 10 years since digital cameras have been available - thank you to Microsoft and HP for your crap sRGB standard.    

My harsh criticisms are not directed at JPEGs as such. I think most people know that JPEG compression produces some artifacts and loss of detail, but it's actually pretty darned good, and in any event you have the camera option of selecting the JPEG quality (compression factor), and thus choosing your own compromise between file size and image quality.

I realize that enthusiasts and professionals can get the results they require by shooting RAW, using colour profiles and ICC aware applications. However, what the general public (and I) surely would want is to be able to happily take JPEG photos just as they do now, and then be able to display them optimally (meaning with the full gamut that the display is capable of) on any monitor, with any software, or print them optimally (meaning with the full gamut that the printer is capable of) on any printer. Is that asking too much? You bet it is!!! The industry is (IMHO) in a complete mess in this matter, and perhaps some debate about how things could be improved might be useful. I suspect that in practice Microsoft holds most of the cards here, dominating operating systems, browsers, image viewers and so on, and little is likely to happen without will and cooperation from them. Nonetheless, it might be interesting to discuss how, at least in principle, things could be done better than they are now.

 


Logged

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« Reply #7 on: March 08, 2010, 10:58:05 pm »

Quote from: digitaldog
You are correct on all points except that bit about the sRGB on a wide gamut display looking unreal. They will preview fine in an ICC aware application. Not so much outside them. If you are sure you’ll stick with sRGB JPEGs, there’s little reason to move to an extended gamut display.


Sadly you confirm my suspicion that a wide gamut monitor is a complete waste of time for me, and the majority of the general public, if I only shoot sRGB JPEGs.

Agreed that sRGB JPEGs will preview without looking "unreal" in an ICC aware application. At best, this means that they will preview no better than with a standard monitor - you can't get back gamut that was never stored in the JPEG in the first place. At worst, as most "consumer" image viewers (including any part of Windows) are not ICC aware, the sRGB JPEG will be displayed incorrectly with gross colour distortion.
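That "gross colour distortion" in the unmanaged case can be sketched numerically. The following is my own rough Python illustration, not anything from this thread: it assumes the wide gamut panel has Adobe RGB (1998)-like primaries, uses the published D65 conversion matrices, and expresses the misread colour back in sRGB terms so the two can be compared. The example pixel values are arbitrary.

```python
# Sketch: sRGB bytes sent unmanaged to a display with Adobe RGB (1998)
# primaries get decoded as if they WERE Adobe RGB. Below we express the
# resulting (wrong) colour back in sRGB code values for comparison.
# Matrices are the published D65 Adobe RGB->XYZ and XYZ->sRGB conversions.

ADOBE_TO_XYZ = [[0.57667, 0.18556, 0.18823],
                [0.29734, 0.62736, 0.07529],
                [0.02703, 0.07069, 0.99134]]
XYZ_TO_SRGB = [[ 3.2406, -1.5372, -0.4986],
               [-0.9689,  1.8758,  0.0415],
               [ 0.0557, -0.2040,  1.0570]]

def mat_mul(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def srgb_encode(v):
    """Linear light -> 8-bit sRGB code value (with clamping)."""
    v = max(0.0, min(1.0, v))
    v = 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055
    return round(v * 255)

def misread_as_adobe(rgb):
    """Colour actually shown (expressed as sRGB code values) when sRGB
    bytes are decoded by an Adobe RGB panel with no colour management."""
    linear = [(v / 255.0) ** (563 / 256) for v in rgb]  # Adobe RGB gamma
    xyz = mat_mul(ADOBE_TO_XYZ, linear)
    return [srgb_encode(c) for c in mat_mul(XYZ_TO_SRGB, xyz)]

print(misread_as_adobe((200, 80, 80)))   # the red channel climbs: over-saturated
print(misread_as_adobe((128, 128, 128))) # neutrals survive (shared white point)
```

Reddish tones get pushed toward heavier saturation while greys stay grey, which is consistent with Paul Sumi's earlier remark about flesh tones looking more saturated.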

I'm really pissed off that the industry is in such a mess. I believe that Firefox are doing their best to improve matters by being aware of and correctly displaying Adobe RGB files. That is a good start, and hopefully might convince Microsoft and others that the present unsatisfactory situation needs to be addressed.

The irony is that I believe the issue of "full gamut colour" is a marketing dream, if the industry could all pull in the same direction. Of course Joe Public would want, nay need, "Full Gamut Colour" if they knew they were presently missing out on it, just as everyone wants and needs "Full HD". Sooner or later the market will wake up to this, but the practical problem is that a universal extended gamut standard needs to be defined, needs to be backward compatible, and needs to be universally adopted by the industry in the way that sRGB JPEGs are at present. It's a beautiful dream, but I know of no theoretical reason why it can't happen.

I have little idea what I am talking about, so regard my suggestions as a stimulus for discussion. One key question is how much bigger would JPEG files need to be to store existing wider colour space standards such as Adobe RGB, Adobe Wide Gamut, or ProPhoto, compared to sRGB, for the same spacing between adjacent colours? I'm sure there are experts here that could answer that question, but given that each extra bit doubles the number of colour values that can be stored, I would have thought that the files would not be much bigger at all. In any event, storage is so cheap these days that a modest increase in file sizes would not be an issue anyway.

Why was the sRGB space defined so damned tightly in the first place anyway, without any "extra space" whatsoever to accommodate likely gamut expansion of monitors and printers? Was it just because the standard was conceived in days long past, when every wasted byte mattered, and everything possible was done to reduce file size? Again, there must be experts that can answer that question.

   


Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Choosing a monitor
« Reply #8 on: March 09, 2010, 08:59:32 am »

Quote from: col
I'm really pissed off that the industry is in such a mess. I believe that Firefox are doing their best to improve matters by being aware of and correctly displaying Adobe RGB files. That is a good start, and hopefully might convince Microsoft and others that the present unsatisfactory situation needs to be addressed.
sRGB needs to go the way of the dodo bird. It was based on display behaviors of the early 1990s and on CRT displays, which have gone as well. The issue isn’t the displays, it’s the brain-dead apps that don’t treat the data correctly.
Quote
The irony is, that I believe the issue of "full gamut colour" is a marketing dream, if the industry could all pull in the same direction. Of course Joe Public would want, no need, "Full Gamut Colour", if they knew they were presently missing out on it, just as eveyone wants and needs ...
Quote
I have little idea what I am talking about, so regard my suggestions as a stimulus for discussion.
They can certainly use and would prefer wider gamut technology if the browsers and OS (Windows) had some intelligence and didn’t assume everything was sRGB. You are throwing the baby out with the bath water with these kinds of beliefs about wide gamut displays.
Quote
One key question is how much bigger would JPEG files need to be to store existing wider colour space standards such as Adobe RGB, Adobe Wide Gamut, or ProPhoto, compared to sRGB
No bigger. An 8-bit per color document is the same size, the encoding color space doesn’t change that.
Quote
Why was the sRGB space defined so damned tightly in the first place anyway, without any "extra space" whatsoever to accommodate likely gamut expansion of monitors and printers? Was it just because the standard was conceived in days long past, when every wasted byte mattered, and everything possible was done to reduce file size? Again, there must be experts that can answer that question.
Because it was designed nearly 20 years ago.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« Reply #9 on: March 10, 2010, 01:13:44 am »

Hi all,

I don't understand some of Andrew's answers so, as I am here to learn, I will challenge them in the hope that he (or anyone else) can explain where I am wrong.

Quote
sRGB needs to go the way of the dodo bird.
On that we both totally agree.


Quote
The issue isn’t the displays ....
From my reading, sRGB was usually just fine for CRT and "standard" LCD displays. However, as I see it, there is a very real issue indeed with the newer "extended gamut" displays, in that sRGB image files simply can't properly exploit the extended gamut of these displays. As the majority of the public shoot in sRGB JPEG, this means that the majority of people who own cameras can't exploit the benefits of these improved displays. That sounds like a very significant issue to me.  


Quote
... its the brain dead app’s that don’t treat the data correctly.
Most applications are "brain dead", yes, and that is another problem. However, it doesn't matter how aware or smart the application is, it still can't properly display an sRGB file on an extended gamut monitor, can it?


Quote
They can certainly use and would prefer wider gamut technology if the browsers and OS (Windows) had some intelligence and didn’t think everything were sRGB. You are throwing the baby out with the bath water with these kinds of beliefs about wide gamut displays.
As far as I can see, it doesn't matter how aware or smart the application is, it still can't properly display an sRGB file on an extended gamut monitor. If I am wrong on this point then I'll be extremely happy, and go out and buy an extended gamut monitor tomorrow, so I can view my existing 10 years worth of sRGB JPEGS in glorious extended gamut. We need to resolve this point.


Quote
I wrote:
One key question is how much bigger would JPEG files need to be to store existing wider colour space standards such as Adobe RGB, Adobe Wide Gamut, or ProPhoto, compared to sRGB, for the same spacing between adjacent colours?
Quote
No bigger. An 8-bit per color document is the same size, the encoding color space doesn’t change that.
I think you did not read my question properly. One of the results of using a wider colour space with the same bit depth is that the spacing between colours is increased, leading to a risk of posterization. Reading the original question again carefully, can anyone answer?
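One way to put numbers on this question, as my own sketch rather than anything from the thread: at 8 bits both encodings carry 256 levels per channel, so the file is indeed no bigger, but a wider space stretches those levels over a longer perceptual distance. Below I use the published XYZ coordinates of the sRGB and Adobe RGB (1998) green primaries and the CIE L*a*b* formula; the division by 255 is just the average per-step size from neutral to full green.

```python
# Both spaces store 8 bits (256 levels) per channel, so the file size is the
# same; but Adobe RGB's green primary sits much further from neutral in
# CIE L*a*b*, so the same 255 code steps are stretched over a longer
# perceptual distance: coarser steps, hence the posterization risk at 8 bits.

def lab_f(t):
    """CIE L*a*b* cube-root helper."""
    return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116

def xyz_to_lab(xyz, white=(0.95047, 1.0, 1.08883)):  # D65 reference white
    fx, fy, fz = (lab_f(v / w) for v, w in zip(xyz, white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def chroma(xyz):
    """Distance from the neutral axis in the a*b* plane."""
    _, a, b = xyz_to_lab(xyz)
    return (a * a + b * b) ** 0.5

SRGB_GREEN = (0.3576, 0.7152, 0.1192)      # XYZ of the sRGB green primary
ADOBE_GREEN = (0.18556, 0.62736, 0.07069)  # XYZ of the Adobe RGB green primary

# Same 255 code steps, longer path: the average step grows with the gamut.
print(chroma(SRGB_GREEN) / 255, chroma(ADOBE_GREEN) / 255)
```

So the short answer to the question as asked: keeping the *same* spacing between adjacent colours in a wider space would need more bits per channel, not a different JPEG container; at the same 8 bits the spacing simply gets coarser.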


Quote
I wrote:
Why was the sRGB space defined so damned tightly in the first place anyway, without any "extra space" whatsoever to accommodate likely gamut expansion of monitors and printers? Was it just because the standard was conceived in days long past, when every wasted byte mattered, and everything possible was done to reduce file size?
Quote
Because it was designed nearly 20 years ago.
I realize that, but it in no way answers my question. Maybe my suggested answer is partly correct, though I suspect there is more to it.

Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Choosing a monitor
« Reply #10 on: March 10, 2010, 09:01:18 am »

Quote
Most applications are "brain dead", yes, and that is another problem. However, it doesn't matter how aware or smart the application is, it still can't properly display an sRGB file on an extended gamut monitor, can it?
Which is why nearly all such wide gamut displays provide an sRGB emulation for all those stupid applications.
Quote
As far as I can see, it doesn't matter how aware or smart the application is, it still can't properly display an sRGB file on an extended gamut monitor.
Actually that is possible.
Quote
I think you did not read my question properly. One of the results of using a wider colour space with the same bit depth is that the spacing between colours is increased, leading to a risk of posterization. Reading the original question again carefully, can anyone answer?
You should always be working in high bit. Lots of users don’t, even in ProPhoto (which, according to Kodak, who developed it, and Bruce Fraser, who did early testing, worked fine with 8-bit data). But you should be working in high bit no matter the working space. And the working space has nothing to do with the display color space or gamut.
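A toy illustration of why high-bit editing matters, independent of working space: quantising an edit's intermediate result at 8 bits merges tonal levels (posterisation), while 16 bits preserves nearly all of them. This is my own sketch, not from the thread, and the gamma-curve "edit" is an arbitrary stand-in for any tonal adjustment.

```python
# Posterization demo: apply a gamma "edit", quantise the intermediate result
# at a given bit depth, undo the edit, and count how many of the 256
# original 8-bit levels are still distinct afterwards.

def surviving_levels(depth):
    scale = 2 ** depth - 1
    out = set()
    for v in range(256):
        y = round(((v / 255) ** 2.2) * scale) / scale  # edit, then quantise
        out.add(round((y ** (1 / 2.2)) * 255))         # inverse edit, back to 8-bit
    return len(out)

print(surviving_levels(8), surviving_levels(16))
# An 8-bit intermediate merges many shadow levels; 16 bits keeps nearly all.
```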
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

martinreed22

  • Jr. Member
  • **
  • Offline
  • Posts: 50
Choosing a monitor
« Reply #11 on: March 10, 2010, 09:48:27 am »

Quote from: col
As far as I can see, it doesn't matter how aware or smart the application is, it still can't properly display an sRGB file on an extended gamut monitor. If I am wrong on this point then I'll be extremely happy, and go out and buy an extended gamut monitor tomorrow, so I can view my existing 10 years worth of sRGB JPEGS in glorious extended gamut. We need to resolve this point.
I'd like to focus on just this point, for the sake of clarity. Specifically, what might be meant by "properly".

A colour profile/ICC aware application on a wide gamut profiled display will indeed display your image properly. By which I mean the greens, reds etc. will display as accurately as the camera that captured them, i.e. close to real life.

What it can't do is display colours that were never captured in the first place. Such as a shade of green or purple that sRGB cannot represent, because it is outside of sRGB gamut.

If that last sentence seems mysterious, please google "color gamut".

Note that Windows itself is not colour aware, only individual applications such as Photoshop, Lightroom etc.

cheers, martin
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Choosing a monitor
« Reply #12 on: March 10, 2010, 09:50:19 am »

Quote from: martinreed22
I'd like to focus on just this point, for the sake of clarity. [...]

Well said!
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Mike Bailey

  • Newbie
  • *
  • Offline
  • Posts: 24
    • http://bluerockphotography.com
Choosing a monitor
« Reply #13 on: March 10, 2010, 10:24:36 am »

Beware of the Dell 2408WFP.  I have one I bought last August and debated returning it, but ended up living with the shortcomings it has.  The worst is that it has very uneven brightness across the screen.  The left side, for 2 or 3 inches, is much brighter than the center.  The right side shows similar problems, but not quite as badly.  This is a very frequent problem among many people who have bought that monitor.  Some think it is resolved by different firmware fixes.  Some know better than that.  If you go to the forums on Dell's web site and search on this, you'll find enough to probably convince you.

http://en.community.dell.com/forums/p/1916...1.aspx#19400841

Just a fair warning.  Best to take that one off your list.  I use it now as a backup monitor and not on my main work machine.  There are other problems, perhaps related to operating system and graphics cards, but it often will go into standby (black screen) mode soon after booting up and/or randomly after being on for some time.

Mike

______________
http://BlueRockPhotography.com
Logged

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« Reply #14 on: March 10, 2010, 05:18:12 pm »

Quote from: Mike Bailey
Beware of the Dell 2408WFP.  I have one I bought last August and debated returning it, but ended up living with the shortcomings it has. [...]



Thanks for the heads up on that, because that monitor was top of my list, influenced by a recent massive price drop from AUD$800 to $550. I believe the price drop is not because the model is no good as such, but to make way for the newer model U2410, priced at AUD$800.

This monitor was reviewed and accurately measured here http://www.tftcentral.co.uk/reviews/dell_2408wfp.htm and panel uniformity was found to be good, even better than average.

The problem with reviews such as this, and with the experience of any individual, is that the product quality varies considerably from unit to unit, to the point that I sometimes wonder whether such reviews based on a single sample really tell us anything useful at all. You really need to measure maybe four samples, taken evenly throughout the most recent 6 months of production, but I have drifted onto another topic ....

You are clearly aware of the dangers of drawing conclusions from a single sample, but as quite a number of users have experienced similar problems, I concur that this monitor is best avoided.

I now lean towards current model Dell U2410 - comments anyone? The tftcentral review found that their particular sample was poorly calibrated out-of-the-box, but an excellent performer once calibrated.

Again, thanks.

Logged

col

  • Jr. Member
  • **
  • Offline
  • Posts: 52
Choosing a monitor
« Reply #15 on: March 10, 2010, 07:10:07 pm »

Quote from: martinreed22
I'd like to focus on just this point, for the sake of clarity. Specifically, what might be meant by "properly". [...]

I perfectly understand everything you say, Martin. I have read dozens of articles totalling several hundred pages, and have a solid scientific/engineering background as well, so by now would claim to at least know what a colour space and a gamut are. However, let's just say I know enough to know that this is a highly complex subject, so I am treading carefully. I also never ask a question unless I have done at least some basic research, and have a fair idea of the answer in advance.

Now let's go back to my original question, and my particular situation and needs.

The majority of the general public shoot in sRGB JPEG. For the last 10 years I have also shot in sRGB JPEG, though I do care about image quality, and take the trouble to own a reasonable quality camera, and use it properly. For the purpose of this thread, I am deliberately wearing the hat of all who shoot in sRGB JPEG, which is in fact the overall majority.

My old screen has died, so I need to buy a new one, and on the face of it, it would be nice to buy one of the new extended gamut monitors, such as the Dell U2410.

You are exactly right to question what I mean by displaying my sRGB JPEGs "properly", and in retrospect I can see how my use of the term was confusing.

The key issue here for me, is whether I will be able to exploit the benefit of an extended gamut monitor when viewing sRGB JPEGs, either those that I have taken in the past, or in the future.

Of course, if I set the monitor to emulate sRGB color space, then everything will be "OK" in the sense that the restricted gamut of colours in my files will be displayed correctly, exactly as they were on my old monitor, but in that case I am completely wasting my time and money buying the extended gamut monitor.

If the extended gamut monitor is not set to sRGB, and the software is totally dumb (like any part of Windows) then the software assumes (correctly) that my image file is sRGB, but does not know that it will be sending that image data to an extended gamut monitor. Presumably then, the dumb software happily sends the image data to the monitor, identically to how it would for any other monitor. The result will be that my restricted gamut image data is mapped to the full gamut of the monitor. To Joe public the result may even look impressive, but the vivid and saturated colours being displayed will bear little resemblance to the colours in the original scene that was photographed. For anyone that cares a fig about colour fidelity, using the extended gamut of the display in this way would be a truly awful thing to do, almost a crime ....

If the software is smart, and knows that the image file is sRGB, and also knows that the monitor has an extended gamut (and knows it is not set to emulate sRGB), then the situation is more interesting. However you look at it though, colours in the original scene that were outside the sRGB gamut cannot be displayed correctly on the monitor, because the sRGB file simply does not contain the information about these out-of-gamut colours in the first place.
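The two cases above can be sketched numerically. This is a rough illustration only, assuming the idealised matrix and gamma definitions from the sRGB and Adobe RGB (1998) specifications, and using Adobe RGB as a stand-in for the monitor's native wide gamut; a real panel's primaries will differ.

```python
import numpy as np

# Linear RGB -> CIE XYZ (D65) matrices from the sRGB and Adobe RGB (1998) specs.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2973, 0.6274, 0.0753],
                         [0.0270, 0.0707, 0.9911]])
ADOBE_GAMMA = 2.19921875  # Adobe RGB's simple power-law transfer function

def srgb_decode(code):
    """8-bit sRGB code values -> linear light (piecewise sRGB curve)."""
    v = np.asarray(code, dtype=float) / 255.0
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def chromaticity(xyz):
    """CIE xy chromaticity; distance from the white point roughly tracks saturation."""
    return xyz[:2] / xyz.sum()

pixel = [30, 200, 40]                           # a saturated green from an sRGB JPEG
intended_xyz = SRGB_TO_XYZ @ srgb_decode(pixel)  # the colour the file actually encodes

# Dumb path: same code values fed straight to the wide-gamut panel.
naive_xyz = ADOBE_TO_XYZ @ ((np.asarray(pixel, dtype=float) / 255.0) ** ADOBE_GAMMA)

# Colour-managed path: sRGB -> XYZ -> Adobe RGB code values, then displayed.
adobe_linear = np.clip(np.linalg.inv(ADOBE_TO_XYZ) @ intended_xyz, 0.0, 1.0)
managed_codes = np.round(255.0 * adobe_linear ** (1.0 / ADOBE_GAMMA))
managed_xyz = ADOBE_TO_XYZ @ ((managed_codes / 255.0) ** ADOBE_GAMMA)

white = np.array([0.3127, 0.3290])               # D65 white point
for label, xyz in [("intended", intended_xyz), ("unmanaged", naive_xyz),
                   ("managed", managed_xyz)]:
    print(f"{label:>9}: xy = {chromaticity(xyz)}, "
          f"distance from white ~ {np.linalg.norm(chromaticity(xyz) - white):.3f}")
```

The unmanaged row lands noticeably further from the white point (the oversaturated "Joe Public" result), while the managed row reproduces the intended chromaticity almost exactly. Colour management makes the sRGB colours display correctly on the wide-gamut panel, but nothing can bring back the out-of-gamut colours the JPEG never contained.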

Quote
A colour profile/ICC aware application on a wide gamut profiled display will indeed display your image properly. By which I mean the greens, reds etc will display as accurately as the camera that captured them ie close to real life.
Therefore this statement is not, strictly speaking, true. The camera can and does capture a gamut exceeding even the best display, but the captured gamut is then compressed or clipped into the sRGB colour space when producing the sRGB JPEG with the result that the wide gamut of the display is wasted. The fault here is not with the camera, the JPEG compression, the display or the editing/viewing software. As far as I can see, the fault lies in the absurdly restrictive sRGB colour space. In a previous post I asked why Microsoft/HP elected to standardize on such an obviously restrictive colour space in the first place, and apparently no one knows. I don't know either, but I curse that they did.    

Every way you look at it then, the extended gamut of the monitor cannot be usefully exploited when  displaying an sRGB JPEG, and the problem has nothing to do with how smart or colour aware the software/viewer is. I would like to be wrong on this point, but unfortunately everything I have said appears to be correct.

Given that the overall majority of people shoot in sRGB JPEG, and will never be interested in going RAW, this is surely a very significant problem. As things stand, the overall majority of people have no incentive to buy an extended gamut monitor, because they will not be able to exploit the extended gamut. FWIW, the majority who shoot sRGB JPEG won't be able to use the full gamut of the better inkjet printers either, for the same reasons. What a pathetic situation. I reiterate my previous conviction that the industry is in a total mess, which is hardly anything new if you look back through the evolution of Microsoft Windows, for example.

If anything I have said here is factually wrong, then please, please, tell me.

Colin
« Last Edit: March 10, 2010, 08:59:47 pm by col »
Logged

col

  • Jr. Member
  • **
  • Offline Offline
  • Posts: 52
Choosing a monitor
« Reply #16 on: March 11, 2010, 01:05:59 am »

I wrote:
Quote
One key question is how much bigger would JPEG files need to be to store existing wider colour space standards such as AdobeRGB, AdobeWideRGB, or PhotoProRGB, compared to sRGB, for the same spacing between adjacent colours?  ...  given that each extra bit doubles the number of colour values that can be stored, I would have thought that the files would not be much bigger at all.
Quote from: digitaldog
You should always be working in high bit. Lots of users don’t, even in ProPhoto (which according to Kodak who developed it and Bruce Fraser who did early testing, worked fine with 8-bit data). But you should be working in high bit no matter the working space. And the working space has nothing to do with the display color space or gamut.

Agreed that all your "working" (editing and generally manipulating the image, using your favourite software such as Photoshop, Lightroom etc.) should be done in 16 bits; crazy not to.

But still the original question has not been answered. The reason I asked the question is because there seems to be good agreement that sRGB is crap and "should go the way of the dodo". OK, so in an ideal world we would presumably prefer that EVERYONE standardized on a wider colour space, such as AdobeRGB, or maybe even PhotoPro, but this has possible implications for the size of our image files, because a greater bit depth may be required to avoid posterization issues due to the gradations between colours becoming too coarse. Keep in mind that this is an exercise of the mind, concerned with what we might ideally and theoretically LIKE to have, rather than what we actually have. I'm a purist who enjoys discussing how things should be done, while most sensible people concentrate more on what is actually available to them, even though it may be crap.  

To answer the question requires looking at the relative sizes of the commonly available colour spaces, which Wikipedia gives as below :-

sRGB covers 35% of the visible colours specified by CIE
AdobeRGB covers 50%
WideAdobeRGB covers 76%
PhotoPro covers 90%

The numbers are interesting in their own right, with even the well regarded AdobeRGB colour space capable of representing only 50% of the colours that humans are capable of seeing, and the ubiquitous sRGB representing a mere 35%. Fortunately, human vision is very forgiving.

Getting back to the original question: if the presently adopted 8 bits per channel (for the image file) is regarded as satisfactory for sRGB, then we see that the AdobeRGB colour space is only about 43% bigger (50/35), so 8 bits is probably just fine for AdobeRGB as well, and conceivably could stretch to WideAdobeRGB and PhotoPro too, though I personally reckon that would be pushing things. One extra bit (9 bits) would easily handle WideAdobeRGB, and two extra bits (10 bits, a minor increase of 25%) would be the most ever required, giving finer colour gradations in PhotoPro than we presently have with 8-bit sRGB.
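That back-of-envelope arithmetic can be sketched as follows. It is a crude upper bound, assuming the Wikipedia coverage figures quoted above and treating the coverage ratio as a direct ratio of representable colours; in reality one extra bit on each of three channels multiplies the total colour count by 8, so the true requirement is smaller still.

```python
from math import ceil, log2

# CIE coverage figures quoted above (fraction of visible colours).
coverage = {"sRGB": 0.35, "AdobeRGB": 0.50, "WideAdobeRGB": 0.76, "PhotoPro": 0.90}

for name, c in coverage.items():
    ratio = c / coverage["sRGB"]
    # Upper bound: give the whole extra gamut log2(ratio) more bits per channel.
    extra = ceil(log2(ratio)) if ratio > 1 else 0
    print(f"{name:>12}: {ratio:.2f}x sRGB -> at most {extra} extra bit(s) per channel")
```

Even on this pessimistic accounting, two extra bits per channel cover the largest space, so file size really is a non-issue.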

My original suspicion appears to be confirmed. Required image file size is essentially a non-issue in the choice of a wider colour space. Which raises two obvious questions, to my mind.
 
(1) Why did Microsoft/HP decide to make the sRGB colour space so damn narrow and restrictive? (There must be an answer, to be sure ...)  

(2) In principle, would we all (meaning the entire industry) be much better off to universally adopt a wider colour space to replace sRGB?
Logged

digitaldog

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 20646
  • Andrew Rodney
    • http://www.digitaldog.net/
Choosing a monitor
« Reply #17 on: March 11, 2010, 08:59:20 am »

Quote from: col
(1) Why did Microsoft/HP decide to make the sRGB colour space so damn narrow and restrictive? (There must be an answer, to be sure ...)  

(2) In principle, would we all (meaning the entire industry) be much better off to universally adopt a wider colour space to replace sRGB?

1. Because it's based on a theoretical emissive display, and at the time the majority of such devices were CRT displays with a certain phosphor set etc. That's why I said that, 20 years ago, it made sense.
2. I don’t know how anyone defines “the industry” or how any group can define a universally accepted color space (we can’t even agree on a universal raw file format which is far more important IMHO).

What will likely happen is that more and more displays will have an extended gamut, the price will come down, and sRGB will, I hope, become extinct. That's exactly what we saw happen with CRT displays, many of which had the potential to produce sRGB.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Rocco Penny

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 483
Choosing a monitor
« Reply #18 on: March 11, 2010, 09:51:57 am »

Ultimately for me,
a beginner and uneducated,

I want to enjoy looking at images and art.
There is a huge difference between my old laptop screen @ 17" and 1 million colors
and my entry level IPS @ 24" to make the change more than worth it.
I wish I could've skipped the 8 bit screen, but $
So yeah it's pretty darn neat to see a jpeg as good as it gets.
You'll probably get other stuff later
500 isn't too much for me but 2000 is
« Last Edit: March 11, 2010, 09:52:52 am by Rocco Penny »
Logged

joofa

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 544
Choosing a monitor
« Reply #19 on: March 11, 2010, 03:00:36 pm »

Quote from: digitaldog
I don’t know how anyone defines “the industry” or how any group can define a universally accepted color space (we can’t even agree on a universal raw file format which is far more important IMHO).

Progress is happening in related industries such as Digital Cinema, where efforts are underway to standardize a color space; the DCI initiative recommends the XYZ space. Parametric RGB was considered, but the idea was dropped over concerns about what would happen if the metadata specifying the primaries were lost.

Logged
Joofa
http://www.djjoofa.com
Download Photoshop and After Effects plugins