So I have ordered a Dell U2410. It is the one that claims 96% of the Adobe RGB color gamut. It also claims to come calibrated from the factory. I am trying to figure out what that means, exactly. I haven't seen much difference in the calibration of my current smaller Dell LCD. I understand that LCD calibration is less critical because they have less color variability than CRTs had.
So, should I just use the standard Adobe RGB color profile and trust that the display is calibrated from the factory? (I have an Eye-One Display 2.) I have never gotten through the calibration of my current Dell LCD without the calibration software telling me I am way out of spec, but nothing has been able to bring it within spec. I think I have been working with the wrong gamma settings in the software, but it really hasn't been critical to me and I haven't had the patience to work it through. I always use paper profiles, some of them custom.
Any advice, suggestions?
Mark, I am using a wide gamut Samsung XL24 LED illuminated LCD. And, FWIW, it is a real joy to be able to really calibrate this monitor. I use Eye-One Pro but I think the software is pretty much the same with the Eye-One 2 monitor calibrator. The gamut of this monitor is 120% of the NTSC standard. Believe me if you can use two monitors and view the same image on both, you will see a dramatic difference.
I would never accept a "canned" monitor calibration. Regardless of the illumination type, it will change over time. If you are a serious printer, you have to calibrate.
Don
If it's not critical to you and you don't have the patience to do some reading and learning, my advice would be "don't bother".
Hi MarkDS,
it is hard not to take that response as a bit condescending. Try not to assume everyone who questions the value of calibration to a gnat's @ss is inexperienced and lacking in knowledge.
I am interested in precision, and I have done a significant amount with calibration of my old CRT, but with my current LCD, I haven't seen the benefit I saw with the CRT. I have been wondering whether, with better factory calibration of hardware, the monitor calibration puck will go the way of the light meter: a quaint but mostly obsolete relic of a bygone era. (I know the obsolescence of light meters is a controversial issue, so let's not get into that one here.)
thanks,
Any advice, suggestions?

Calibrate in a REALLY dark room - or with a towel over the monitor when you do it. I've found that to be key to getting a good result with my wide gamut LCD.
Hi Fike,
I have also reacted to Mark's comment and tried to write a response, but it's not easy. You give very little detail about the problem you have. There are a couple of issues with LCDs:
1) Almost all are too bright; this is a big problem that makes prints look dark.
2) Color temperature should be set to around 6500 K.
3) Regarding wide color gamut, there used to be settings for different gamuts, possibly not easy to find.
4) If you don't use sRGB you probably need to understand quite a few things about CM. sRGB is often used as a workaround for not having CM.
Finally, I know that you make good pictures and they are certainly worth some effort learning color management.
Best regards
Erik
Thanks for the advice, Erik. I have had trouble getting through the contrast settings in the Eye One Match 3 calibration routine. It would always tell me it was way out of spec. So, after getting frustrated that I couldn't bring it into the acceptable range with the monitor menus, I would just skip ahead and calibrate anyway. Your suggestion of using 6500K got me through that step. Yippee! The new profile is slightly different, enough to notice and care about.
I shoot Adobe RGB, so I calibrate to that target.
Some have mentioned that there are other better options than the Dell, but I have been pretty happy with the Dell LCD I have now, and the prices of the higher-end 24" devices are really in a different stratum. The H-IPS U2410 LCD monitor arrives later this week; I'll report back on how I like it. It will also be interesting to compare it to the 3-year-old Dell 20" LCD that was considered a bargain photo-worthy display at the time (it shared its LG panel with the Apple display of the same size).
I shoot Adobe RGB, so I calibrate to that target.
I'm not sure what this is supposed to mean ... you shoot RAW, right?
'AdobeRGB' shouldn't enter into the monitor calibration equation except as a point of reference.
It also claims to come calibrated from the factory.
Calibrated to what? It's kind of a ridiculous claim, at least taken at the face value provided.
They claim it is calibrated to Adobe RGB. Reviewers say it isn't really that great, but for better or worse, here is a description: http://www.tftcentral.co.uk/reviews/content/dell_u2410.htm#factory
The Dell U2410 features a dynamic contrast ratio (DCR) control, which boasts a spec of 80,000:1.
They say “The Dell U2410 comes factory calibrated to some extent,” which again is a ridiculous comment (enough to bypass a sale IMHO). And no, it's not “calibrated” to Adobe RGB (1998), which alone is meaningless, and like every other CCFL wide gamut it can't hit sRGB with a profile, I'll bet.
Then they say something equally silly (“every unit is shipped incorporating pre-tuned sRGB and AdobeRGB settings and with an average DeltaE of <5”) but don't tell us which formula is used (again, sloppy and apparently written by a marketing person with little understanding of what they just wrote). And is that deltaE in the center? What about the corners (which are always worse)?
But wait, this is a review! It's not from Dell. It should be dismissed at this point alone.
Great, and my print is what, 250:1? That makes soft proofing a bit difficult.
It's too bad that they don't have better factory calibration. If factory calibration were done well, I can see them obsoleting calibration equipment. I wonder if the manufacturers are under any pressure to improve their calibration quality. Sometimes early in product development and manufacturing cycle they are still making improvements to quality and yield. I'm not going to hold out too much hope that this will be the case here.
Yup, think of the differing effects from just the video system in varying systems.
The Dell U2410 features a dynamic contrast ratio (DCR) control, which boasts a spec of 80,000:1.
Great, and my print is what, 250:1? That makes soft proofing a bit difficult.
A high contrast ratio is a disadvantage because color (among other aspects of images) is perceived differently depending on contrast. If I am editing an image, it is best that I edit on a display with a relatively close contrast ratio to the predicted print output.
I'm currently editing on a 400:1 display, with an 1100:1 display beside it. Quite simply, the adjustment for soft-proofing is too extreme on the higher-contrast display and I cannot predict as accurately what my prints will look like.
As yet, I remain unconvinced that an ultra-high contrast ratio would be a bad thing, except for the fact that I personally cannot adjust to such variances quickly enough. That said, I am indeed the one who must use this system, and so I wish the contrast-ratio shift from screen to print to be relatively mild.
I think you might be confusing the capability of the monitor with the characteristics of the image. If the CR of the monitor is too high for a successful calibration, you can always reduce it. However, the reverse is not possible.
Please explain why a high contrast ratio in a monitor could be a disadvantage. Even if we assume that the 80,000:1 figure is a meaningless exaggeration within the context of practical ambient light conditions, more is better than less is it not?
You might want to check out this thread on the Dell U2410: http://luminous-landscape.com/forum/index.php?showtopic=39891
Without repeating this long thread concerning this monitor, I purchased the monitor and thought it was junk and just mailed it back to Dell. See the thread referenced above for more details and a fairly detailed discussion of alternatives.
If it has a flaw, it is that it is too bright. ... it is gloriously bright and contrasty.
This is exactly the condition which would make me VERY nervous if the end product of my photographic output were a print - even on the highest DR papers we now have.
When you say it looks good, do you mean what you see on the print with the matte paper looks faithful to what you see on the display? That's the critical issue of course. With matte paper and its lower DR, one needs to be especially careful about display brightness and contrast.
Mark and Andrew,
I think you may have to explain this in more detail. It's not making much sense to me.
You seem to be implying that a monitor with a high contrast ratio will automatically bestow a high contrast upon the image being displayed, as though the contrast of the image is wholly dependent upon the CR spec of the monitor.
As I understand, this is only partly the case, but I'm always open to a persuasive argument. If an image has an inherently low contrast and a low dynamic range, it will appear as such on the monitor irrespective of whether the monitor has a high or a low contrast ratio, provided the monitor has a sufficiently high CR to accommodate the dynamic range of the image.
However, the opposite is not the case. If a monitor has too low a contrast ratio, after adjusting brightness to the recommended level for calibration, say 100 nits, then blacks will likely not be black and images will tend to appear washed out.
If two monitors have the same maximum brightness level, the one with the higher contrast ratio would be the one preferred, all else being equal. It's better to have a contrast ratio which is unnecessarily high than one which is not high enough, just as it's better to have a camera with a high DR capability even though for some, or even most applications, that high DR might not be needed.
It is understood that all images have to be processed before printing in order to fit the gamut and the contrast within the limits of the print. Both camera and monitor generally have a much higher contrast and DR capability than ink and paper.
The inherent weakness of the LCD has always been the presence of a backlight, which makes it difficult to achieve a good black. For this reason only the best and most expensive LCDs could match the qualities of a moderately priced CRT, in which individual phosphors are able to be switched off completely to render a truer black.
The difference between a monitor with a high CR and one with a low CR, but both having equal maximum brightness, is the ability to separate subtle shades of near black. If that capability of the monitor with the higher CR is of no practical use because of the ambient lighting conditions of your working environment, then no harm done. The issue is, does your monitor lend itself to accurate calibration?
Can either of you give me an example of a monitor which cannot be accurately calibrated because its real and actual contrast ratio is too high? It's understood that there's often a lot of hyperbole going on with CR figures for sales purposes and that one should not always believe such inflated figures.
Ray, as far as I'm concerned, the basic point - especially with an LCD display - is to characterize the device white point, contrast curve and luminosity in a manner that provides the most reliable soft-proof of the final print. To do this properly, the display needs to be capable of handling the optimal settings. For example, as I'm printing mostly to Ilford Gold Fibre Silk paper, I find that a white point in the range of 5500-5700, gamma of 2.2 or L* and luminance of no more than 110 cd/m2 does the job nicely. My display has adequate bit depth to show the differentials in shadow detail that will appear on paper. The quality of black on my LaCie 321 is pretty good, and it's not the most expensive display in the neighbourhood, as you most likely know. The key issue in terms of differentiating shadow detail is whether the bit depth is adequate to provide a smooth tonal gray scale.
Yes, the color balance, highlights, and shadows are faithful between the soft-proof and the paper version. I closely evaluated (subjectively of course) the highlights and shadows, considering exactly where the shadows finally dissolve fully into black and the last detail I can find. I did the same with snowy white highlights. I haven't done close color work with it yet, but the winter brown of tree bark can be difficult to reproduce in a neutral way. The monitor and the paper both reproduce identical and neutral browns. The print I was working with is here: http://www.trailpixie.net/general/pointy_knob_tra_1.htm
So I would chalk-up one satisfied U2410 buyer, though I may not be as OCD as my other photographer/printer brethren, though don't ask my wife about my OCD photographic tendencies. I carry hyperfocal depth of field charts in my photo bag, and that is just too much for her.
When one turns down the brightness or luminance to the recommended level of 80-120 nits for calibration, the CR that originally seemed adequate at full brightness suddenly becomes inadequate and the images tend to lack shadow detail. Is this not the case?
No, from my experience it is not the case provided you use a display with enough bit depth to produce a smooth tonal ramp. And it need not cost the sky, but it won't be an el-cheapo either, unfortunately.
6 bit gives you 64 levels of tonality per channel. 8 bit gives you 256 levels per channel. That's 262 thousand colours versus 16.77 million colours. Seems pretty clear which spec will give you more tonal separation, and tonal separation is what we want for seeing detail anywhere in the luminosity scale.
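The arithmetic here is easy to verify; a minimal Python sketch of it (illustrative only, not from the thread):

```python
# Levels per channel for a panel of a given bit depth, and the total
# number of RGB combinations those levels allow (levels cubed).
def panel_levels(bits_per_channel):
    levels = 2 ** bits_per_channel   # tonal steps per channel
    return levels, levels ** 3       # (per-channel levels, total colours)

for bits in (6, 8):
    levels, colors = panel_levels(bits)
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colours")
# 6-bit: 64 levels/channel, 262,144 colours
# 8-bit: 256 levels/channel, 16,777,216 colours
```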
6 bit gives you 64 levels of tonality per channel. 8 bit gives you 256 levels per channel. That's 262 thousand colours versus 16.77 million colours. Seems pretty clear which spec will give you more tonal separation, and tonal separation is what we want for seeing detail anywhere in the luminosity scale.
I would amend Mark's statement to read 6 bits gives a possible 64 levels of tonality, but that is merely the width of the channel and that does not mean that the device can actually reproduce those levels. The situation is similar with cameras. A camera may have a 14 bit ADC, but that does not mean that its dynamic range is greater than in 12 bit mode. With many cameras, the extra bit depth is wasted and is used mainly to digitize noise.
The number of bits determines the potential number of levels, but the distribution of levels in the various zones is also critical. With the linear capture used in digital sensors, most of those bits are wasted in the highlight zones and the shadows are relatively impoverished in levels. The use of a gamma 2.2 tone curve redistributes some of those levels to the shadows where they are needed. For an explanation, see the table on Norman Koren's (http://www.normankoren.com/digital_tonality.html) web site. Some users calibrate their monitors to an L* TRC with equal visual perceptual steps between the levels, and this may be slightly better than a gamma 2.2 TRC. In any case, 6 bits is insufficient to reproduce good gradation in the shadows.
Bill
-> (1) Is a 6 bit per channel monitor with a high contrast ratio better than a 6 bit monitor with a low CR?
Depends on your use. Most monitors achieve high contrast ratios by having the maximum luminance so high that you need to wear sunglasses. LCD contrast ratio is governed by how little light leaks through when all the filters are active (black level) and how bright the backlight is when the filters are turned off (white level). Hitting 1000:1 or higher contrast ratios may well entail a white luminance of at least 200 cd/m2. That's bright.
-> (2) Do manufacturers of monitors match the bit depth to the contrast ratio so that monitors with a 6 bit output will always have a 'real' CR which is lower than that of a monitor with an 8 bit output?
No. 6-bit displays are used instead of 8-bit because it reduces response time. Making higher bit depth displays that are also visually appealing for watching video or action games gets expensive.
-> (3) Do manufacturers of monitors spend resources in producing a higher 'real' CR than is useful for photographic purposes, as a sales technique?
Photography is a niche market. Video games that have extensive dark, muddy scenes can benefit from a screen running at blazingly bright levels. Likewise, working in a brightly lit office environment is easier with a display set to higher luminance than one wants for an extended photo editing session. That said, there certainly is a sales factor at work as well. Judging from the contents of my junk mail folder, bigger numbers are the key to a happy, fulfilling life.
-> (4) If so, does the higher than useful CR present a disadvantage for calibration purposes in relation to print output?
It could well. Dialing down the backlight of an LCD too far usually results in a smaller color gamut, increased banding, and other such goodies. Note, however, that the ISO spec for print viewing calls for an illuminance of 500 lux. This translates into a white level of 160 cd/m2 on your monitor for exact matching.
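The lux-to-luminance conversion behind that 160 cd/m2 figure is just illuminance divided by pi, assuming an ideal (100% reflective) Lambertian surface; a quick check in Python:

```python
import math

def lambertian_luminance(illuminance_lux, reflectance=1.0):
    """Luminance (cd/m2) reflected by an ideal Lambertian surface."""
    return illuminance_lux * reflectance / math.pi

# ISO-recommended 500 lux print-viewing illuminance on a perfect white:
print(round(lambertian_luminance(500)))  # 159, i.e. the ~160 cd/m2 above
```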
Thanks for the added dimensions Bill. Extracted above is basically what I was getting at.
For the same reason having a 12 stop scene range and a 6 stop capture device is problematic. Or a 10,000:1 display contrast ratio trying to soft proof a print that has a 250:1 ratio.

On the other hand, one would have no trouble reproducing a 6 stop scene with a 12 stop capture device, provided that the 12 stop device uses enough bits to get smooth gradation of tones. This reverse analogy is more appropriate to the case being discussed here. Certainly, 8 bits would be insufficient to prevent banding with a CR of 10,000:1. In the case of a 12 stop capture device, one would likely want to use some type of HDR Encoding (http://www.anyhere.com/gward/hdrenc/hdr_encodings.html). A CR of 10,000:1 is about 13.3 stops.
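The stops figure follows from contrast ratio being a log2 relationship (each stop is one doubling of luminance); a small sketch, not from the thread:

```python
import math

def contrast_to_stops(ratio):
    # One photographic stop = one doubling of luminance,
    # so stops = log2(white luminance / black luminance).
    return math.log2(ratio)

print(round(contrast_to_stops(10_000), 1))  # 13.3 stops
print(round(contrast_to_stops(250), 1))     # 8.0 stops for a typical print
```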
Hey! Mark,
We all know here that 6 bits per channel is not ideal for photography. The issue is 'contrast ratio' and any disadvantages a specific and particularly high contrast ratio may have for photography and calibration purposes.
Hitting 1000:1 or higher contrast ratios may well entail a white luminance of at least 200 cd/m2. That's bright.

That's usually the problem. Turns out, though, that my couple-years-old MVA Westinghouse 24" is calibrated to 1300:1 at 124 cd/m^2. Utterly unexpected, and I didn't have the equipment to measure it until recently. IPS panels don't do nearly as well for contrast.
I don’t know that it would appear twice as contrasty, but I’m pretty darn sure they wouldn’t match.
Currently we have two ways to attempt to simulate the contrast ratio on screen. Soft proof using the simulate options, which are problematic because Adobe (no one) can as yet control the “paper white” or “ink black” of the UI. Meaning you’re going to have to view in full screen mode with everything but the image under simulation blacked out (or, as LR does, using Lights Out). The other way, or in combination with the above, which I suspect is necessary, is to alter the contrast ratio of the display itself. My take is, the more you do in the latter, the less needs to be accomplished in the former. So having the ability to control the display contrast ratio seems useful, and is probably why the high-end reference displays have provided this since (if memory serves me) the Sony Artisan (I don’t recall being able to do this in the old days on my Barco).
Better still, the ability to calibrate numerous contrast ratios for the type of print work you are currently soft proofing, and being able to update this (and the matching ICC display profile) on the fly.
Let us assume as Andrew says that the contrast ratio of a print from a specific printer/paper combination is 250:1. Your display has a contrast ratio of 500:1.

It is not necessary to use the full contrast ratio of the monitor. For example, consider an idealized print such as used for the ICC PRMG (http://www.color.org/v4_prmg.xalter). That print has a 288:1 dynamic range, having a neutral reflectance of 89% and a darkest printable colour having a neutral reflectance of 0.30911%. If the print is viewed under the recommended illuminance of 500 lux and behaves as a Lambertian reflector, the paper base would have a reflected luminance of 500*0.89/Pi = 141 cd/m2. The darkest printed color would have a reflected luminance of 0.5 cd/m2.
To reproduce the print as best as possible on the screen, one could calibrate the white point of the monitor to 141 cd/m2 and the black point to 0.5 cd/m2. The effective contrast ratio of the monitor would then be 282:1. Reflective and emissive sources may be perceived somewhat differently, but I would think the match would be reasonably close.
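Bill's numbers can be reproduced with a few lines of Python; the reflectance figures are the ones he quotes from the ICC PRMG page, and the Lambertian assumption is the same as in his post:

```python
import math

ILLUMINANCE = 500.0      # lux, recommended print-viewing level
PAPER_WHITE = 0.89       # neutral reflectance of the paper base
DARKEST = 0.0030911      # neutral reflectance of darkest printable colour

# Lambertian reflector: luminance = illuminance * reflectance / pi
white = ILLUMINANCE * PAPER_WHITE / math.pi   # ~141.6 cd/m2
black = ILLUMINANCE * DARKEST / math.pi       # ~0.49 cd/m2

print(f"white {white:.1f} cd/m2, black {black:.2f} cd/m2, "
      f"contrast {PAPER_WHITE / DARKEST:.0f}:1")   # ~288:1
```

Note the contrast ratio depends only on the two reflectances; the 500 lux choice scales both luminances but cancels out of the ratio.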
It is not necessary to use the full contrast ratio of the monitor. For example, consider an idealized print such as used for the ICC PRMG (http://www.color.org/v4_prmg.xalter). That print has a 288:1 dynamic range, having a neutral reflectance of 89% and a darkest printable colour having a neutral reflectance of 0.30911%. If the print is viewed under the recommended illuminance of 500 lux and behaves as a Lambertian reflector, the paper base would have a reflected luminance of 500*0.89/Pi = 141 cd/m2. The darkest printed color would have a reflected luminance of 0.5.

Interesting. I have a NEC P-221 and SpectraView software. The software allows you to set the white point and contrast ratio but not the black point. I can adjust the contrast ratio, and perhaps that would affect the black point. When I first got the monitor and software I didn't pay much attention to the contrast ratio, allowing it to be set at the maximum. I found that there was a pretty significant mismatch when using the full contrast, and changed the setting to 450:1. From the ICC site it looks like I might be able to go a bit lower. I'll try that the next time I calibrate and see what the black point ends up being. I do have a lower white point than above because of the lighting in my "work room." Very useful reference, and thanks for posting.
To reproduce the print as best as possible on the screen, one could calibrate the white point of the monitor to 141 cd/m2 and the black point to 0.5 cd/m2. The effective contrast ratio of the monitor would then be 282:1. Reflective and emissive sources may be perceived somewhat differently, but I would think the match would be reasonably close.
The software allows you to set the white point and contrast ratio but not the black point.
That sounds like a very useful approach, Bill. Reading that ICC reference, it appears to be a "virtual print", the file for which I could not find. How representative do you think this would be given the large variety of image characteristics and printer/paper combinations we need to deal with - would you say this is a good portrayal of approximate boundary conditions? And from what you are saying here, we need not worry about the contrast ratio of our displays as long as they are at least 288:1 and we calibrate properly according to these calculations?
When you ask for a specific contrast ratio, the software is adjusting the black point (and luminance) to hit that desired target (as close as possible while still maintaining the other target calibration aim points like the cd/m2 you asked for).

That's what I figured after reading the documentation. Right now I have the contrast set at 450:1, which makes the black point about 1/2 of what is suggested in the ICC article previously posted. I'll back the contrast down some and see what happens.
Mark, if you look at the white paper by Karl Lang (http://wwwimages.adobe.com/www.adobe.com/products/photoshop/family/prophotographer/pdfs/pscs3_renderprint.pdf) on the Adobe site, the best photographic or inkjet prints can have a CR of 275:1, but more typically 250:1. Of course, matt papers will have a lower DMax. My post was more in the order of a thought experiment and I haven't done such a calibration as I can not adjust the black point of my monitor. I would think that a monitor that displays an accurate rendering of the image at a CR of 288:1 would be adequate for soft proofing, but I would like to hear from Ethan Hansen or others who have actual experience in this area. Until someone convinces me otherwise, I think that a high CR is an advantage for a monitor.
Semi-related note to Andrew: Somebody goofed with the PixelGenius profile labeling. The 7800 and 9800 profiles are for the same printer and the 7880 and 9880 profiles are identical as well. The other profiles all look reasonable.
Yes, you can use the 7800 on the 9800 or vise versa but we were asked to supply a single named profile for each from the client.
Print contrast ratios depend on the ink set, printer (and driver), and paper used. Typical values for pigment ink printers range from 200:1 for high gloss stocks to 150:1 for lustre, semi-gloss, and satin, down to 40:1 and below for fine art papers. A few specialized stocks - usually polyester surfaces made for point-of-sale displays - can hit 400:1 or above. Dye ink printers achieve slightly higher DMax and, therefore, print contrast values than do pigment inks.
Silver-halide printers offer print contrast ranging between ~100:1 to 50:1 on standard surfaces, depending on how much of the manufacturer's color management software is enabled. Specialized papers such as metallic stocks are usually 40:1 or below because of the less-than-white paper surface.
Interesting that Andrew did not contest these figures. If the figures are true, then I'm seriously wondering if print media is the best method of displaying one's artistic efforts.
This plasma set boasts a native contrast ratio of 40,000:1, a 'dynamic' contrast ratio (whatever that means) of 'greater than' 2,000,000:1, and 6144 gradations per channel amounting to 231 billion possible colors.
<<SNIP>>
As to whether a print is the best output medium, this is a debate that has raged since the advent of photography. In the pre-digital era, many a photographer's head shook in dismay when comparing a dull print to the glory of a slide viewed with a loupe on a light box. Prints make sharing easier, however.
I cropped this image to 16:9 ratio, converted to sRGB, reduced the file size to 6mb, transferred the image to an SD card and displayed it on my new plasma TV (which has a slot to accept SD cards). It was evening and I turned off all lights.
Wow! Wow! Wow! Phwoar! I've never seen such delicious and detailed blacks before. Just amazing! If I wasn't so modest I'd claim this is the best photograph I've ever seen. Sorry that most of you will not be able to appreciate it in it's full glory as I can.
Here's the full 2 megapixel HD shot. If you don't like it, tough!
[attachment=18824:01_3723_...filtered.jpg]
Ray,
That is a striking and beautiful shot. Is the moon in this image real, or taken with a telephoto lens and added in? The rays of light emanating from the moon look real.
Bill
I noticed it too, have had the same kind of problem, knew you had most probably layered it in, and I have absolutely no qualms about doing stuff like that - it is simply overcoming a technical limitation in order to portray the scene as the photographer saw it. The only requirement is that it be done unobtrusively so that it would be "un-noticed" as much as possible. In this regard, I think a bit more smoothing around the circle (so that it all blends more seamlessly) would be a nice finishing touch.
I made a version of this shot using the same technique. I spent a lot of time with HDR and blending layers and such, but those techniques just didn't work. The easiest/best thing to do was take another moon shot and scale it to fit on top. I love your shot, but I just couldn't publicly present my shot with the layered moon effect. It didn't fit with my photographic style.
I don't think the noise in the shadows is bad. It makes it feel a bit painterly. Otherwise, I think if you darkened that section, it would make it easier to blend the noise away. I presume that the noise comes from boosting the shadows a lot.
Marc,
That's pretty much how the moon looked in my shot. I could darken the shadows, but this was a moonlit night and those shadows were not completely devoid of detail.
I have noticed on my plasma display that the noise in the shadows seems less prominent, I presume because the true blacks (ie. areas devoid of detail) really are blacker than they appear on my calibrated monitor.
Try making a Hue/Saturation adjustment layer and desaturating the noisy areas in the red, magenta, and blue channels. Mask the adjustment and paint it in with a soft-edged brush at low transparency. That will take some of the edge off the chroma noise without eliminating the detail.
As for toning down the moon, take the layer that the moon is in and set the transparency to something like 80%. That will blend it a bit better by allowing some of the bright background layer to slip through.
Hi,
One thing to consider is that your screen should not be too bright. If you measure a white area on your screen and a white paper on your wall, they should have similar brightness. A screen that is too bright causes you to print dark.
Best regards
Erik
Obviously the Dell marketing people got a little excitable on this one, but if you read further in the review you will see that they do debunk this absurd claim, albeit in a considerate and non-inflammatory way.
[blockquote]While the DCR obviously worked to some extent, I've no idea where Dell got the figure of 80,000:1 from! ... I don't know where Dell picked this spec from?![/blockquote]
In the conclusion:
[blockquote]The dynamic contrast ratio was nowhere near reaching its supposed specification...[/blockquote]
As the review points out, the LCD color accuracy is good with custom, at home, calibration, even if the Adobe RGB and sRGB presets are substandard.
So after asking the question here about the necessity of LCD calibration, I have read what people have to say and some more reviews and the consensus is that, particularly with the more economically priced 24" displays, calibration does result in substantial improvements in color accuracy and consistency.
FWIW, the DELL U2410 uses a native 8-bit panel plus (temporal) dithering to obtain 10-bit performance. All but the earliest firmware revisions make the dithering invisible to the eye unless you literally pixel-peep.
(Apparently this unit didn't quite live up to expectations in other aspects; never used one.)

It was a major disaster, at least the first iteration. Many of us were allowed to purchase the new product at a discount to help HP spread the word about this awesome technology. I decided to pass after having a conversation with Karl Lang of PressView and Artisan fame about it before making up my mind. Glad I passed. Those on Mac OS were bitching and moaning on the ColorSync list for months.