I read some responses to a post on this forum regarding calibrating a Dell 2407WFP monitor. I have the 2408WFP monitor and am confused by some of the responses that were left under that post. Namely, one poster says that the lower you set your RGB values, the fewer colors your monitor will display (I believe jani said that). I was just talking with some of the color management gurus over at the Adobe forums, and they say this is not true. As they explained it, adjusting the RGB values of a monitor adjusts its gain. This is like adjusting the volume on your stereo: just as you wouldn't listen to music with the volume turned all the way up because of distortion, you're not going to get the optimum viewing experience with your monitor's RGB values set to 100. In other words, adjusting the RGB values adjusts the strength of the colors, not the number of colors the monitor can display.
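To make sure I'm describing their explanation the way I understood it, here's a rough sketch of the model I have in mind. The gamma value and the luminance figure are made-up numbers for illustration, not specs for the 2408WFP:

```python
# A rough per-channel model of "gain" as I understand it: each RGB
# control scales that channel's maximum output, like a volume knob.
# The gamma and max_luminance values here are hypothetical.

def channel_output(level, gain, max_luminance=400.0, gamma=2.2):
    """Luminance of one channel for an input level of 0-255.

    gain is the monitor's RGB control as a fraction (100 -> 1.0).
    """
    return (level / 255.0) ** gamma * gain * max_luminance

# Every input level still maps to its own output value; lowering the
# gain just compresses the whole curve toward zero, the way turning
# down a stereo scales the waveform without removing any notes.
for gain in (1.0, 0.7):
    print([round(channel_output(l, gain), 2) for l in (0, 64, 128, 255)])
```

If that model is right, the gain control changes the intensity of each channel, not the count of distinct colors.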
Furthermore, during the calibration process one of the steps involves adjusting the RGB values of your monitor to get a more accurate calibration. Why would we be doing this as part of our calibration process if the very act of adjusting the RGB values would reduce the number of colors displayed by our monitors? Seems counterproductive, does it not?
I was able to get a good calibration of my monitor by reducing the RGB levels to 70%, then fine-tuning them during calibration, in addition to lowering the brightness until the white point measured 120 cd/m2. With the RGB levels all the way at 100, I could not have gotten down to 120 cd/m2 using the brightness control alone.
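For what it's worth, here's the back-of-the-envelope arithmetic I had in mind. The native white level and the brightness control's floor are hypothetical numbers I picked to show the idea, not measurements of my panel:

```python
# Rough arithmetic, assuming white luminance scales roughly linearly
# with both the RGB gain and the brightness control. The 300 cd/m2
# native white and the 50% brightness floor are hypothetical.

native_white = 300.0      # hypothetical white at RGB = 100, brightness at max
brightness_floor = 0.50   # hypothetical minimum the brightness control allows

# Brightness alone bottoms out above the 120 cd/m2 target:
print(native_white * brightness_floor)    # 150.0 -- still too bright

# Dropping the RGB gains to 70% first brings the target within reach:
after_gain = native_white * 0.70          # 210.0
print(120 / after_gain)                   # ~0.57 -- above the floor, so reachable
```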
If somebody can set the record straight and tell me whether I'm misunderstanding or misinterpreting something, that would be helpful.
Thanks,
Brad