I downloaded the image and looked at it in Camera Raw, and noted that it was overexposed and required a large negative exposure correction.
The raw data is absolutely not overexposed. Are you familiar with ETTR? “Negative exposure” (which is what I rendered in the DNG) is exactly the correct way to normalize ETTR data. It looks overexposed at your default ACR settings. But like the WB, that’s simply a starting point. It is not necessarily correct or incorrect, accurate or inaccurate, any more than you can look at a color neg and tell us it’s accurate for an undefined filter pack in an enlarger. The warm cast of the WB, much like the appearance you saw initially, is not correct for a proper visual appearance of this image. And that can be changed, which is the reason we shoot and deal with raw data!
ACR wouldn't let me WB on the white patch.
Well, LR did, so take that up with Adobe. And you can’t alter Tint/Temp either? Cause I can.
I looked at the image in Rawnalize and the green channel is heavily clipped. That is not a good square for white balance. The other neutral patches are intact and give reasonable WB values of about 4150K. As I suspected, your WB was screwed up. Why didn't you use a nonclipped area for the WB?
Yes, the WB is screwed up in that it produces the wrong color appearance. I got a warm appearance because, like the initial exposure, those aren’t the correct render settings. And it illustrates again that WB doesn’t “fix” color. It doesn’t ensure pleasing or desired color (or tone). So much for the simplistic notion that WB fixes all and provides “accurate” color, a point many of us have tried to illustrate to you for a few days now!
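For what it’s worth, the clipped-green-patch problem is easy to sketch. Assuming a simplified WB model where per-channel gains are computed from a neutral patch’s channel averages with green as the reference (real converters do more, but the arithmetic is the point), a channel pinned at the clip value skews the gains:

```python
# Sketch: why a clipped patch gives bad WB multipliers.
# Simplified model: gains scale R and B so the patch averages match G.
# All raw values below are made up for illustration.

CLIP = 16383  # assumed 14-bit sensor full scale

def wb_gains(r_avg, g_avg, b_avg):
    """Per-channel gains that would make this patch neutral (G as reference)."""
    return g_avg / r_avg, 1.0, g_avg / b_avg

# Intact neutral patch: plausible raw channel averages
print(wb_gains(6000, 11000, 7500))

# Same kind of patch pushed up until green clips: the recorded G is capped
# at CLIP, while its "true" value would have been higher. The gains computed
# from the capped G come out too low, which shifts the whole render warm.
true_g = 22000
recorded_g = min(true_g, CLIP)
print(wb_gains(12000, recorded_g, 15000))
```

The numbers are hypothetical, but the mechanism isn’t: once a channel saturates, its ratio to the other channels no longer reflects the scene, so any WB taken from that patch is garbage in, garbage out.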
As I understand Melissa, it uses ProPhotoRGB primaries and linear encoding, but reports the RGB values according to an sRGB tone curve. The percentage values can easily be converted to 8 bit notation.
Yes, so what? That’s output referred. You continue to ignore this has nothing to do with scene colorimetry. Suppose we open this in another converter that doesn’t use ProPhoto TRC 1.0 for processing and reports a totally different scale for numbers? How does that in any way tell you about the accuracy of the scene data and thus the capture? It doesn’t.
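To make the “different scale, same data” point concrete: here is a sketch (not any actual converter’s code) of one and the same linear value read out under three reporting conventions. The 18% figure is just the familiar mid-gray example:

```python
# One linear value, three reporting scales (sketch, not any converter's code).

def srgb_encode(linear):
    """Standard sRGB OETF (gamma portion plus linear toe)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

linear = 0.18  # classic mid-gray reflectance, as a linear fraction of full scale

print(f"linear percent:     {linear * 100:.1f}%")               # 18.0%
print(f"sRGB-curve percent: {srgb_encode(linear) * 100:.1f}%")  # ~46%
print(f"sRGB-curve 8-bit:   {round(srgb_encode(linear) * 255)}")
```

Same capture data, three different numbers, and none of them says anything about whether the capture matched the scene.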
However, I would simply export as a TIFF in ProPhotoRGB.
Until you can answer the question above about the scene colorimetry, it doesn’t matter what space you use. The export provides a set of numbers. Great. Now, are they accurate? Without providing numbers for the scene, you can’t answer that, and you haven’t from day 1. This continued discussion is really simple; we don’t need to go into color geekdom. You say WB produces accurate color. You have neither described what that means nor how you can prove what is accurate. Example: I can take what you say is an accurate neutral RGB value (40/40/40) and convert it to an output color space for an Epson, and the numbers will be far different, not identical RGB values. So if I provide you just those numbers, can you tell us it’s neutral or not without knowing the original color space? No, you can’t. If I send 40/40/40 directly to the Epson, is that a neutral color on the print? Nope. But wait, 40/40/40 is an “accurate” neutral value. Not with the limited data provided, looking solely at this one color space. So tell us how the values you see in your raw converter are accurate to the scene. Or how to transform the original scene colors into this color space to prove they are accurate.
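A concrete (hypothetical) instance of the “numbers alone tell you nothing” point: the same neutral gray, carried losslessly from one working-space encoding to another, gets a different 8-bit value. The sketch assumes a pure gamma-1.8 curve for the ProPhoto-style encoding and ignores its small linear toe:

```python
# Same neutral gray, two working-space encodings, two different 8-bit values.
# Neutrality (R=G=B) survives because both curves apply per channel to the
# same gray; only the numbers change.

def srgb_to_linear(v8):
    """Invert the sRGB encoding for an 8-bit value."""
    c = v8 / 255
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def prophoto_encode(linear, gamma=1.8):
    # ProPhoto actually has a small linear toe; ignored for a mid-range gray
    return linear ** (1 / gamma)

gray_srgb = 40                      # "neutral" 40/40/40 in sRGB encoding
linear = srgb_to_linear(gray_srgb)  # the actual light ratio behind that number
gray_pp = round(prophoto_encode(linear) * 255)

print(gray_srgb, "->", gray_pp)  # same gray, different number in the other encoding
```

So “40/40/40 is neutral” is a statement about an encoding, not about the scene; without the color space attached, the triplet is meaningless.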
If we want to talk about the accuracy of a measurement, say the time at this very second, you can use a sundial, your wrist watch, or an atomic clock. We can argue about the better accuracy of the atomic clock versus your wrist watch, but we’ll be pretty close without splitting hairs. I’m OK being accurate to within a 10th of a second here. But we have values we can use to discuss this time accuracy. We can agree that the sundial may be off X number of minutes, and agree on what level of accuracy we will accept. We can’t do this with your WB belief system because we don’t even have a value like days, let alone hours or seconds, to define accuracy. And we don’t have a method to even gauge the process (the sundial, the wrist watch). You just want us to believe in some level of accuracy, and disbelieve there is any level of subjectivity, but you refuse to even define the beginnings of a process. If you don’t like me giving you shit about using the term accuracy, then define the accuracy, much as we could discuss the accuracy of gauging the time of day to the second! If you can’t do that, then fine, say so. We’ll move on and ignore your mangling of the term accuracy.
With this approach all this confusion about scene colorimetry is circumvented. If I have an accurate capture of the color checker and print it out using an accurate profile, I should have a good match to the target. If I use a bad white balance, I suspect the match would be poor.
Accurate how compared to what? What instrument with which illuminant should I use to measure the Macbeth which then perfectly matches the scene illuminant using any specified transformation? If I have the actual spectral sensitivities of the chip AND the scene illuminant, I might
be able to do this. But neither you nor I have that. So we’re back to you saying WB produces accurate color with absolutely no way to back it up. Can and will you describe how you came up with this accuracy theory?
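For the record, the kind of metric being asked for does exist in color work: a ΔE difference between measured and reference Lab values, with an agreed threshold. A minimal CIE76 sketch (the Lab numbers here are made up for illustration):

```python
import math

# CIE76 delta-E: the classic "how far apart are two colors" distance in Lab.
def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical reference vs. measured Lab for one neutral patch
reference = (50.0, 0.0, 0.0)
measured = (50.0, 3.0, 4.0)   # a slight color cast

print(delta_e_76(reference, measured))  # 5.0 — a clearly visible difference
```

A ΔE around 1 is often quoted as roughly a just-noticeable difference. With a metric like this, plus an agreed instrument and illuminant for the reference measurements, “accurate” would actually mean something; that is exactly what’s missing from the WB claim.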
Why don't you try using a proper white balance?
What do you mean? You told us earlier to WB on white (you used clouds as an example) and also said quite clearly: “For accurate colors, it is usually best to take a WB reading from a neutral card such as a WhiBal.”
Now you’re bringing in exposure (which isn’t overexposed) and suggesting there is something wrong with the white on the Macbeth. But that’s not really worth going over; it’s your continuing language that WB produces an accurate rendering which has yet to be explained, let alone proven. If you care to enlighten us on how this is produced, we can dig into white cards, exposure, and the like. We’re far from that point yet.
The others comprise one or two at most.
I see, then they are as wrong as I am. That’s your take. Look, I’m sure I speak for the others: if you can come up with a step-by-step, scientific process and a metric for accuracy, we’re all ears. So far, you continue to insist that WB isn’t subjective but have provided no methodology to prove that is the case. My DNG illustrates that what I admit is not the proper white to WB on disproves your simplistic idea, and I can say this from a totally subjective POV (because I saw the scene and the rendering, and it’s way off). You’ve got a raw file that can have an almost unlimited degree of alteration to the final numbers and rendering based on the sliders in your raw converter. So prove to us how we move ’em about to get “accurate” color, once you define what the accuracy metric is and how you got there. Otherwise, you’re wasting everyone’s time here.