How to model lens imperfections for camera profiling?
torger:
Hello camera experts, armchair-bound or not;
I'm developing some new technical camera profiling software already readily usable at http://www.ludd.ltu.se/~torger/dcamprof.html , and I'm currently looking into improving measurement accuracy by modeling how the lens (or other factors) distort the photo of the colorchecker test chart.
I've already made a flatfield algorithm for targets speckled with white patches so one can even out lens vignetting and uneven light (not yet released).
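A minimal sketch of what such a flat-field step could look like (this is a hypothetical helper, not the actual dcamprof algorithm): sample the raw values at the white-patch centers, fit a smooth 2D polynomial surface to them, and divide that surface out of the image.

```python
import numpy as np

def flatfield_from_white_patches(img, xs, ys, deg=2):
    """Fit a smooth 2D polynomial to the raw values sampled at the
    white-patch centers (xs, ys) and divide it out of the image.
    Hypothetical helper, assuming linear raw data."""
    h, w = img.shape
    # normalized coordinates keep the least-squares fit well-conditioned
    xn, yn = np.asarray(xs) / w, np.asarray(ys) / h
    # design matrix of monomials x^i * y^j with i + j <= deg
    terms = [(i, j) for i in range(deg + 1) for j in range(deg + 1) if i + j <= deg]
    A = np.stack([xn**i * yn**j for i, j in terms], axis=1)
    v = img[np.asarray(ys), np.asarray(xs)].astype(float)
    coef, *_ = np.linalg.lstsq(A, v, rcond=None)
    # evaluate the fitted illumination surface over the full frame
    gy, gx = np.mgrid[0:h, 0:w]
    gxn, gyn = gx / w, gy / h
    surface = sum(c * gxn**i * gyn**j for c, (i, j) in zip(coef, terms))
    return img / (surface / surface.max())  # normalize so peak gain is 1.0
```

A low-degree surface is usually enough here, since vignetting and uneven light vary slowly across the frame.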
Now I'm considering contrast loss, and possibly other lens factors. So far I've assumed that contrast loss is negligible, that is, you get a linear reading without contrast loss when you look at the raw file. I've therefore assumed that the grayscale steps found in most test targets are a waste, useful only for calibrating non-linear systems.
If that's true I'm fine with it, but I'm not really sure it is true. A quick comparison between camera raw RGB values and the XYZ reference values indicates I get roughly 7% contrast loss with the lens (comparing G against Y); that is, the span between the darkest and lightest neutral patches is smaller in the camera data than in the spectrometer-measured reference values. There could be some other error, though.
Lens vignetting is clearly understood and documented, but when it comes to contrast loss I don't really know where to look. MTF charts only show contrast for fine detail, and there are no small details here; this is about global contrast. If I make a correction for contrast, can I assume the reproduction between darkest and lightest is linear, and can I assume all three RGB channels are affected equally, or are there some nasty non-linear effects to consider?
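One way to make that 7% figure concrete is to assume the simplest model consistent with linear raw data: a constant flare offset plus a gain, fitted over the neutral patches. A nonzero offset compresses the black-to-white span, which is exactly what reads as contrast loss. This is only an illustrative sketch (the function name and the uniform-flare assumption are mine, not from dcamprof):

```python
import numpy as np

def contrast_loss_and_flare(cam_g, ref_y):
    """Fit cam = gain * ref + flare over the neutral patches.
    A nonzero flare offset compresses the black-to-white span,
    which shows up as 'contrast loss'. Illustrative linear model only."""
    cam = np.asarray(cam_g, float)
    ref = np.asarray(ref_y, float)
    gain, flare = np.polyfit(ref, cam, 1)
    # compare spans after normalizing each scale to its white patch
    span_cam = (cam.max() - cam.min()) / cam.max()
    span_ref = (ref.max() - ref.min()) / ref.max()
    loss = 1.0 - span_cam / span_ref
    corrected = (cam - flare) / gain  # undo the fitted linear model
    return loss, flare, corrected
```

If the residuals of this fit are small, the "linear between darkest and lightest" assumption holds and a per-channel offset-plus-gain correction is enough; large or patterned residuals would point at the nastier non-linear effects.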
eronald:
Presumably flare is situation- and image-related, and not quite repeatable.
AlterEgo:
--- Quote from: torger on May 21, 2015, 09:02:03 am ---I've already made a flatfield algorithm for targets speckled with white patches so one can even out lens vignetting and uneven light (not yet released).
--- End quote ---
Speckled as in custom-made targets (where you can place white patches more or less evenly across the target), or ColorCheckerSG-like (where you really only have the borders), or QPCard (where the material between the patches can be used)? Or is it actually possible to use the non-white/grey patches of the measured target as well, so that all patches contribute to flatfielding — can you detect a trend in how even the illumination is even if the patches are far from neutral chromaticity-wise?
The people behind RawDigger were absolutely against any such approach, taking the position that if you're doing this kind of work then you can do real flatfielding: use a proper uniform patch covering the whole area of the target, take a second shot with the same light, camera position and stand position, and then use that second shot's data.
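The two-shot approach they advocate is straightforward to sketch (assuming linear raw data and identical geometry between the two shots; the function name is mine):

```python
import numpy as np

def flatfield_with_flat_frame(target, flat):
    """Two-shot correction: divide the chart shot by a flat-frame shot
    of a uniform surface taken with identical light, camera position
    and stand position."""
    flat = np.asarray(flat, float)
    gain = flat / flat.mean()  # per-pixel relative illumination
    return np.asarray(target, float) / gain
```

The trade-off versus the white-patch method is exactly the one debated here: the flat frame measures the illumination at every pixel rather than interpolating between patches, at the cost of a second, carefully repeated exposure.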
NancyP:
Torger, consult some astrophotographers with higher-level processing skills than mine. I have made a vignetting profile for the Samyang 14mm for use in the Nebulosity program, but it didn't involve contrast. The vignetting profile went like this: get a large sheet of thin white plexiglass/perspex/whatever. Position the lens so it is pointing straight at the sun in a cloudless sky, with no obstructions in the field of view. Hold the perspex parallel to the image plane so it fills the field of view. Snap 10 or so frames at each f-stop you may wish to use. Each f-stop set gets averaged and used as a lens correction template. I did it this way because I was also loading sets of dark frames at various ambient temperatures into this program. I am sure there are better ways. I assume contrast correction would involve a different type of target but the same physical procedure, then a different mathematical algorithm (about which I have no clue; that's why you need the experienced astrophotographer).
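The frame-averaging step described above can be sketched like this (a minimal version; the function names are mine, and Nebulosity's actual processing may differ):

```python
import numpy as np

def vignetting_template(frames):
    """Average a stack of flat frames shot at one f-stop and normalize
    to the brightest pixel, yielding a per-pixel correction template."""
    stack = np.stack([np.asarray(f, float) for f in frames])
    mean = stack.mean(axis=0)  # averaging suppresses shot noise
    return mean / mean.max()   # 1.0 at the brightest point

def apply_template(img, template):
    """Undo the falloff by dividing the image by the template."""
    return np.asarray(img, float) / template
```

Averaging ~10 frames cuts the shot-noise standard deviation by roughly sqrt(10), which is why a set per f-stop is worth the effort.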
eronald:
Of course, I have very little understanding of color theory, but when people sent me images, the issues they thought were bad profiles often turned out to be in-camera flare. A follow-up thought: any profiling target image, just like any real-world image, will be subject to flare, whereas profiles made directly from abstract data or monochromator work will not be subject to flare error. Flare is very much image dependent.
Edmund