Luminous Landscape Forum

Raw & Post Processing, Printing => Digital Image Processing => Topic started by: Jonathan Wienke on November 22, 2009, 04:19:35 am

Title: New Lens Correction Software
Post by: Jonathan Wienke on November 22, 2009, 04:19:35 am
I've started work on a lens correction program. The idea is to duplicate all of the features of DXO, PTLens, and other similar programs to correct the following: lens blur, barrel/pincushion distortion, chromatic aberration, vignetting, and lens cast.

The main difference between my program and DXO is the handling of lens blur profiles. DXO offers a limited selection of "canned" blur profiles for various camera/lens combinations. If your particular combination isn't on their list, or if your lens behaves differently than the one used to create DXO's profile (either better or worse), you're SOL. I'm designing a method for users to create their own custom blur profiles specific to their own equipment, regardless of how common or obscure it may be. I expect the benefits of this approach to be similar in magnitude to the difference between using "canned" printer profiles compared to custom profiles, especially when using third-party papers.

The proposed workflow at this point is to use Bridge or Lightroom to convert the RAWs to linear DNGs. This demosaics Bayer-matrix images to 16-bit linear RGB, but does not convert the RAW data from the camera color space, so you still have complete flexibility to select a white balance or output color space after the DNG is processed through my program. After processing, the corrected DNG is saved and can then be opened in any DNG-compatible image editor (LR, ACR, etc.) for final processing.

If there is sufficient interest, future versions of the program may offer bracketed focus/exposure stacking, or possibly panorama stitching capability.

If anyone has feature requests or ideas regarding how you'd like the program to work, please post them here. Thanks in advance.
Title: New Lens Correction Software
Post by: feppe on November 22, 2009, 05:42:15 am
This sounds promising! I've occasionally looked at DXO, but they don't have the camera/lens combos I have, so making my own profiles would be a killer feature.

I think one of the key things for this is to have a seamless, easy, and quick way to integrate into an existing workflow. DNG sounds like a good start, but batch processing based on EXIF data would take it a step further. I'm not sure how many of the features can be applied automatically, and how many need manual tweaking for each image, though. This could be batched as well by first running the images through the tweaking dialogs, then doing the actual CPU-intensive adjustments on a second pass based on the earlier inputs.

I know you already have quite a few features to implement, but I'll propose a lens/lighting calibration feature using common color targets. I think this would complement the feature set, making it a pretty full lens and camera calibration suite - only focus calibration would be left for the hardware end.
Title: New Lens Correction Software
Post by: alain on November 22, 2009, 06:13:01 am

Why restrict yourself to an Adobe-only format? This seems rather silly to me.

Title: New Lens Correction Software
Post by: feppe on November 22, 2009, 06:17:53 am
Quote from: alain
Why restrict yourself to an Adobe-only format? This seems rather silly to me.

AFAIK there is no working non-proprietary RAW format. The only one a quick googling turns up is OpenRAW, but their website hasn't been updated since 2006. While DNG is far from open, and as much as I'd like to see a truly open format free from Adobe's yoke (or any one entity's), it's less locked down than CR2 or NEF, for example.
Title: New Lens Correction Software
Post by: alain on November 22, 2009, 07:18:21 am
Quote from: feppe
AFAIK there is no working non-proprietary RAW format. The only one a quick googling turns up is OpenRAW, but their website hasn't been updated since 2006. While DNG is far from open, and as much as I'd like to see a truly open format free from Adobe's yoke (or any one entity's), it's less locked down than CR2 or NEF, for example.

Hi,

Jonathan is using the demosaiced data; only the color space and white balance have not been applied at that point. For quite a few corrections those aren't that important.

It would be nice to have a flexible backbone which could be format-independent. Jonathan could then make it possible to support several formats, or even a sort of plugin architecture. I suppose that quite a few makers of RAW software and/or image editors could be buyers.

Bibble Labs uses Noise Ninja, for example, and has a plugin architecture.


Alain
Title: New Lens Correction Software
Post by: Jonathan Wienke on November 22, 2009, 10:33:04 am
Quote from: feppe
This sounds promising! I've occasionally looked at DXO, but they don't have the camera/lens combos I have, so making my own profiles would be a killer feature.

I think one of the key things for this is to have a seamless, easy, and quick way to integrate into an existing workflow. DNG sounds like a good start, but batch processing based on EXIF data would take it a step further. I'm not sure how many of the features can be applied automatically, and how many need manual tweaking for each image, though.

The design is based heavily on reading the EXIF data to automate adjustment parameters. The most important EXIF data are camera make/model/serial, lens make/model/serial, focal length, and aperture. The data needed to really correct lens blur properly is far too complex to adjust manually.
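To illustrate, here's a minimal Python sketch of that kind of EXIF-driven lookup key, using the third-party exifread package (the tag names are standard EXIF names, but treat the details as assumptions rather than the program's actual code):

```python
import exifread  # third-party EXIF reader, assumed available

def profile_key(path):
    # Sketch: pull the EXIF fields used to select a correction profile.
    with open(path, 'rb') as f:
        tags = exifread.process_file(f, details=False)
    return (str(tags.get('Image Model', '')),       # camera model
            str(tags.get('EXIF LensModel', '')),    # lens model, if recorded
            str(tags.get('EXIF FocalLength', '')),  # focal length
            str(tags.get('EXIF FNumber', '')))      # aperture
```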

I'm sampling 32-64 points from the center to the edge of the image circle. Each data point has an array of values representing blur amount in various directions and at various distances from the sample point, as well as distortion and vignette correction parameters. There are separate arrays for each color channel. A user interface for manually tweaking the data would literally be a screen filled with text boxes or sliders, with no room for labels or captions to explain what they all did. The only manual interaction with the program will be specifying which DNG files are to be processed and the output folder, filling in data missing from EXIF (if you use a lens that doesn't communicate electronically with the body), and entering rise/fall/shift data when using a lens with shift capability.
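To give a feel for why manual tweaking is hopeless, here's a hypothetical sketch of one profile entry's layout in NumPy (the dimensions are illustrative, not the actual ones):

```python
import numpy as np

# One camera/lens/focal-length/aperture entry, illustrative dimensions:
# 48 sample points from center to edge of the image circle, blur measured
# at 16 angles and 32 distances around each point, per color channel.
N_POINTS, N_ANGLES, N_DISTS, N_CHANNELS = 48, 16, 32, 3
blur = np.zeros((N_POINTS, N_ANGLES, N_DISTS, N_CHANNELS))  # spill fractions
distortion = np.zeros((N_POINTS, N_CHANNELS))  # per-channel radial shift (CA)
vignette = np.zeros((N_POINTS, N_CHANNELS))    # per-channel gain correction
print(blur.size + distortion.size + vignette.size)  # roughly 74,000 numbers
```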

Quote
Why restrict yourself to an Adobe-only format? This seems rather silly to me.

DNG is an open file format; the specifications for creating and reading DNG files are freely available, and I can create an application that reads & writes DNG without having to pay license fees to anyone. DNG has already been adopted by several camera manufacturers as their RAW format, and DNGs can be read by many non-Adobe programs. Like it or not, it is the closest thing to a universal open RAW format out there.

Quote
Jonathan could then make it possible to support several formats, or even a sort of plugin architecture.

I'm keeping the guts of the program separate from the user interface, to make it easier to integrate into a plugin or whatever for a RAW converter or image editor. I'm going to get the standalone version working first before trying to make plugin versions though.

Quote
I know you already have quite a few features to implement, but I'll propose a lens/lighting calibration feature using common color targets.

All of the corrections I'm doing take place in the camera's native space, so if you're using custom DNG camera profiles, they will work exactly the same whether the file has been run through my program or not, unless your camera/lens combination has significant vignetting or lens cast issues (different color balance in the center of the image circle vs. the edge). In that case, you'll want to run your profiling target image through my program before feeding it to Passport or whatever.

The target for profiling the lens corrections will be completely different than a target used for color profiling; it will be an array of regularly-spaced small white dots on a black background. I'm still working on the design.
Title: New Lens Correction Software
Post by: alain on November 22, 2009, 11:36:09 am
Quote from: Jonathan Wienke
...


DNG is an open file format; the specifications for creating and reading DNG files are freely available, and I can create an application that reads & writes DNG without having to pay license fees to anyone. DNG has already been adopted by several camera manufacturers as their RAW format, and DNGs can be read by many non-Adobe programs. Like it or not, it is the closest thing to a universal open RAW format out there.

I'm keeping the guts of the program separate from the user interface, to make it easier to integrate into a plugin or whatever for a RAW converter or image editor. I'm going to get the standalone version working first before trying to make plugin versions though.
...

The target for profiling the lens corrections will be completely different than a target used for color profiling; it will be an array of regularly-spaced small white dots on a black background. I'm still working on the design.
Hi Jonathan

If you keep in mind that the majority of users aren't using DNG... I find it strange that you're not using the original RAW information, because that records the complete color information...

Remembering that those targets need to be rather large (Imatest recommends at least 24" on the short side; PTLens recommends buildings), I think a mostly white background would be far more economical. Making a test setup will be some work too, and those need space.
Title: New Lens Correction Software
Post by: Jonathan Wienke on November 22, 2009, 10:39:30 pm
Quote from: alain
I find it strange that you're not using the original RAW information, because that records the complete color information...

You have no clue what you're talking about here. The linear RGB DNG has all of the original RAW data in it; it just has the two missing color values per pixel added by interpolation alongside the original uninterpolated channel value. Adding the interpolated values does not destroy or degrade the original uninterpolated values. If you process a linear RGB DNG and the original RAW side by side with the same conversion settings, the results are an exact match. The linear DNG conversion step has zero effect on converted image quality or one's ability to adjust color or tonality.

Quote
Remembering that those targets need to be rather large (Imatest recommends at least 24" on the short side; PTLens recommends buildings), I think a mostly white background would be far more economical.

Unless your lens blurs significantly differently at close focus distances than at infinity, the target does not need to be building-sized. But the target has to be white dots on a black background, or the image analysis routine that generates the PSF data from the target images can't make accurate calculations. There are some fundamental mathematical principles involved that can't be ignored without seriously compromising the results; it's a signal-to-noise ratio issue. Making the target background solid black isn't that big of a deal; you only need one to make all your profiles. If we're talking about printing your own target, the cost of paper and ink for a black-background target is not going to be a deal-breaker. Even if the ink cost $15 (highly doubtful), it would be well worth the investment. Ever heard the phrase "penny wise, pound foolish"?
Title: New Lens Correction Software
Post by: alain on November 23, 2009, 01:05:33 pm
Quote from: Jonathan Wienke
You have no clue what you're talking about here. The linear RGB DNG has all of the original RAW data in it; it just has the two missing color values per pixel added by interpolation alongside the original uninterpolated channel value. Adding the interpolated values does not destroy or degrade the original uninterpolated values. If you process a linear RGB DNG and the original RAW side by side with the same conversion settings, the results are an exact match. The linear DNG conversion step has zero effect on converted image quality or one's ability to adjust color or tonality.



Unless your lens blurs significantly differently at close focus distances than at infinity, the target does not need to be building-sized. But the target has to be white dots on a black background, or the image analysis routine that generates the PSF data from the target images can't make accurate calculations. There are some fundamental mathematical principles involved that can't be ignored without seriously compromising the results; it's a signal-to-noise ratio issue. Making the target background solid black isn't that big of a deal; you only need one to make all your profiles. If we're talking about printing your own target, the cost of paper and ink for a black-background target is not going to be a deal-breaker. Even if the ink cost $15 (highly doubtful), it would be well worth the investment. Ever heard the phrase "penny wise, pound foolish"?

The problem is identifying the original pixels and separating them from the interpolated ones, which are not to be used if you're after maximum resolution.

All the info on barrel, pincushion, and mustache distortion correction that I've seen says that distance plays a role. Try shooting a target with a 17mm lens, and also think about the flatness of the target versus its size. If you need white inside black, it's easy to use only small black patches. People who don't have a 24" or even 44" wide printer may have access to A0 plotters, but those won't plot a completely black surface. A 70x100cm photo print on foamcore is about 75 euros here, but I doubt they will print a completely black one for that price.

Another problem is lighting a completely black surface; I suppose it needs to be very evenly lit, without reflections.




Title: New Lens Correction Software
Post by: Piero on November 23, 2009, 04:49:06 pm
So basically you seem to want to extract the point spread function of the lens at different locations across the frame, then do deconvolution, right?

You mention in your last post that you assume the PSF will not change with focus distance... Have you tested this assumption before going straight to programming? Because I'm very much afraid that the PSF WILL depend a lot on focus distance.
You should test this! As for targets, I think that setting up your camera in the dark and photographing an LED seen through a small aperture could be a good way to get it.

Do you think you'll have to get different PSFs for different apertures too? This will be a very long procedure!
Title: New Lens Correction Software
Post by: Misirlou on November 23, 2009, 06:13:58 pm
The version of DxO that came out last week includes a new ability to remove distortions from non-DXO-profiled lens/camera combinations, or so it is written. I'd be surprised if it were much more than a barrel/pincushion stretch, but I haven't tried it yet.

At any rate, Jonathan's effort sounds like it has a lot of potential to me.
Title: New Lens Correction Software
Post by: tived on November 23, 2009, 10:09:02 pm
Hi Jonathan,

this sounds great! When will you have a trial version ready? What are you expecting the program to cost?

keep us informed

Henrik
Title: New Lens Correction Software
Post by: Eric Myrvaagnes on November 24, 2009, 11:34:10 am
Quote from: tived
Hi Jonathan,

this sounds great! When will you have a trial version ready? What are you expecting the program to cost?

keep us informed

Henrik
 Put me on the potential early adopters list, too.

Eric


Title: New Lens Correction Software
Post by: Jonathan Wienke on November 25, 2009, 12:05:39 pm
Quote from: Piero
As for targets, I think that setting up your camera in the dark and photographing an LED seen through a small aperture could be a good way to get it.

Do you think you'll have to get different PSFs for different apertures too? This will be a very long procedure!

A complete blur profile will need shots spanning at least the entire range of focal lengths and apertures, but not necessarily every possible combination. If the EXIF data shows a combination of settings that isn't in the PSF database, the set of PSFs used to process the image will be interpolated from the nearest available data. For example, if you have data for f/4 and f/8, and the shot was taken at f/5.6, then the f/4 and f/8 data will be interpolated to make a custom PSF set for that image. Focal length adds an additional dimension to the algorithm.
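In code terms, that's just a blend in stop (log-aperture) space; a minimal Python sketch with dummy array-valued PSF data (not the real datasets):

```python
import numpy as np

def blend_psf(psf_lo, psf_hi, f_lo, f_hi, f_shot):
    # Interpolate two stored PSF datasets in log2(f-number) space, i.e.
    # in stops, so f/5.66 falls exactly halfway between f/4 and f/8.
    t = (np.log2(f_shot) - np.log2(f_lo)) / (np.log2(f_hi) - np.log2(f_lo))
    return (1.0 - t) * psf_lo + t * psf_hi

psf_f4 = np.full((16, 32), 0.02)  # dummy spill-fraction data
psf_f8 = np.full((16, 32), 0.05)
custom = blend_psf(psf_f4, psf_f8, 4.0, 8.0, 5.6)  # t is ~0.49 here
```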

I don't doubt that focus distance may affect blur and distortion characteristics to some degree, but my experience is that the focal length (of a zoom lens) and aperture setting have a much greater effect on lens blur than focus distance. For example, my 17-40/4 L lens is not very sharp in the corners at 17mm at any focus distance, but at 40mm it is sharp in the corners regardless of focus distance. I'm focusing (pun intended) on the most significant factors affecting blur first.

As a practical matter, focal length and aperture are pretty much always included in EXIF data, but focus distance is only rarely included. This means that dealing with it would have to be a manual process of entering the focus distance for each image processed.

Profiling a lens won't be too onerous. Set the camera to aperture priority mode, and adjust exposure compensation so the image is exposed to the right but the RAW data is not clipped. Select a focal length, position the target to fill the frame, and shoot a series of frames covering the entire aperture range. For zoom lenses, increment the focal length, reposition the target, and repeat as needed. When you're done, simply point the program at the folder containing the images, and it will automatically process them and add the resulting PSF datasets to its database for future use. The processing might take a while, but it will be completely automated.
Title: New Lens Correction Software
Post by: Jonathan Wienke on November 25, 2009, 12:34:01 pm
Quote from: alain
The problem is identifying the original pixels and separating them from the interpolated ones, which are not to be used if you're after maximum resolution.

I'm designing the program to work on target dots that are 10 pixels in diameter or larger. This avoids Bayer interpolation issues, and also allows PSF data to be calculated at any arbitrary angle from the source pixel--something that cannot be done if the target dot is a single-pixel point source.
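For the curious, here's a simplified sketch of the directional sampling a multi-pixel dot makes possible: a bilinear lookup along one angle outward from the dot's centroid (plain NumPy, not the program's actual analysis code):

```python
import numpy as np

def spill_profile(img, cy, cx, angle, max_r, step=0.5):
    # Sample intensity outward from a dot's centroid along one angle;
    # assumes the sampled path stays inside the image bounds.
    r = np.arange(0.0, max_r, step)
    y, x = cy + r * np.sin(angle), cx + r * np.cos(angle)
    y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
    fy, fx = y - y0, x - x0
    vals = (img[y0, x0] * (1 - fy) * (1 - fx) + img[y0, x0 + 1] * (1 - fy) * fx
            + img[y0 + 1, x0] * fy * (1 - fx) + img[y0 + 1, x0 + 1] * fy * fx)
    return r, vals / vals[0]  # spill as a fraction of the center intensity
```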

Quote
If you need white inside black, it's easy to use only small black patches. People who don't have a 24" or even 44" wide printer may have access to A0 plotters, but those won't plot a completely black surface. A 70x100cm photo print on foamcore is about 75 euros here, but I doubt they will print a completely black one for that price.

Small black patches around the white dots are no good; that limits the number of target dots the image can contain, and having large white areas will reduce contrast and skew the results. The farther the PSF analyzer can process image data from a target spot before reaching the noise floor, the better the PSF will be able to compensate for diffraction and other large-radius blur and contrast reduction phenomena.

I don't see why a 97% black print would be a problem. It won't use much more ink than a normal photo printed the same size; it will just use the dark black ink exclusively instead of a mix of color inks. The cost difference is not the big deal you seem to think it is. I'd definitely recommend a matte surface to prevent stray reflections from affecting the PSF calculations.

Quote
Another problem is lighting a completely black surface; I suppose it needs to be very evenly lit, without reflections.

Yes, just like shooting any other target, whether for color profiling or whatever. It will need to be as evenly lit as possible, and as perpendicular to the camera as possible, and rigid and flat.

Quote
this sounds great! When will you have a trial version ready? What are you expecting the program to cost?

I doubt a full-featured beta version will be ready until February or March of next year. As to cost, I'm envisioning something in the vicinity of $40, assuming the user prints their own target. I figure at that price it will be much easier for casual shooters to justify than the $199 or whatever DXO is charging, and I'll sell enough additional units to make up the price difference. Obviously, the beta will be free to use, but time-limited.
Title: New Lens Correction Software
Post by: Brad Proctor on November 25, 2009, 02:27:30 pm
Quote from: alain
If you keep in mind that the majority of users aren't using DNG... I find it strange that you're not using the original RAW information, because that records the complete color information...

Using DNG seems to be the most logical solution to me.  The DNG is the original RAW information, just stored in a different format.
Title: New Lens Correction Software
Post by: deejjjaaaa on November 25, 2009, 02:39:20 pm
Quote from: bproctor
Using DNG seems to be the most logical solution to me.  The DNG is the original RAW information, just stored in a different format.

OFFTOPIC

no, it is not, as a matter of how it is obtained... read this thread - http://forums.adobe.com/thread/528900?tstart=0 (http://forums.adobe.com/thread/528900?tstart=0) - DNG files converted by Adobe's own DNG Converter from original raw files do not have all the original information... the DNG Converter just strips some... who knows what it will strip tomorrow without much public fanfare.
Title: New Lens Correction Software
Post by: Jonathan Wienke on November 25, 2009, 03:17:50 pm
Deja, you are totally wrong. My read of that thread is that the initial "unofficial support" of the 7D didn't convert all of the metadata from the Canon RAW to DNG correctly, but the issue was fixed when Adobe updated Camera RAW and DNG Converter. The RAW image data was not affected, only the metadata. The sky isn't falling...
Title: New Lens Correction Software
Post by: deejjjaaaa on November 25, 2009, 03:44:20 pm
Quote from: Jonathan Wienke
Deja, you are totally wrong. My read of that thread is that the initial "unofficial support" of the 7D didn't convert all of the metadata from the Canon RAW to DNG correctly, but the issue was fixed when Adobe updated Camera RAW and DNG Converter. The RAW image data was not affected, only the metadata. The sky isn't falling...

no, I am not, and the sky is in fact falling, as usual - please read what I am referring to:

was that PEF-to-DNG issue w/ the DNG Converter addressed by Adobe?

http://forums.dpreview.com/...forums/read....essage=32904790 (http://forums.dpreview.com/...forums/read.asp?forum=1036&message=32904790)


"...The program will not work with DNG files converted from PEF's by the current version of the Adobe DNG Converter application due to this program stripping out the necessary black masked-to-light photosites at the right and bottom borders of the sensor in landscape orientation which are used by the correction algorithm..."

if the claim is correct, then people who converted PEF to DNG did in fact lose some "raw data", right?

---

reply by the author of  Rawnalyze ( http://www.cryptobola.com/PhotoBola/Rawnalyze.htm (http://www.cryptobola.com/PhotoBola/Rawnalyze.htm) )

That conversion of K20 PEF files is still erroneous (with 5.6). This is an example of the necessity of keeping the original raw file: not only is the conversion wrong, but it removes data from the "image" (the masked area). This should never happen.

and

This "we know better what is useful and what is not" attitude of Adobe comes up time and again. For example, the masked area is removed from Nikon D90 and D300 files as well; although the pixel values of the image are already black-level corrected, that data still should not be removed. The fact that one doesn't know any usefulness of that data is not the same as there being no use for it.

---

The Adobe DNG Converter strips raw data that can be used by other software (see the ref'd thread @ dpreview about the program written by GordonBGood for Pentax raw files @ high ISO).

Do you object? Does the DNG Converter irreversibly strip data during conversion or not? Very simple question.
Title: New Lens Correction Software
Post by: Brad Proctor on November 25, 2009, 04:03:15 pm
Quote from: deja
no, I am not, and the sky is in fact falling, as usual - please read what I am referring to:

was that PEF-to-DNG issue w/ the DNG Converter addressed by Adobe?

http://forums.dpreview.com/...forums/read....essage=32904790 (http://forums.dpreview.com/...forums/read.asp?forum=1036&message=32904790)


"...The program will not work with DNG files converted from PEF's by the current version of the Adobe DNG Converter application due to this program stripping out the necessary black masked-to-light photosites at the right and bottom borders of the sensor in landscape orientation which are used by the correction algorithm..."

if the claim is correct, then people who converted PEF to DNG did in fact lose some "raw data", right?

---

reply by the author of  Rawnalyze ( http://www.cryptobola.com/PhotoBola/Rawnalyze.htm (http://www.cryptobola.com/PhotoBola/Rawnalyze.htm) )

That conversion of K20 PEF files is still erroneous (with 5.6). This is an example of the necessity of keeping the original raw file: not only is the conversion wrong, but it removes data from the "image" (the masked area). This should never happen.

and

This "we know better what is useful and what is not" attitude of Adobe comes up time and again. For example, the masked area is removed from Nikon D90 and D300 files as well; although the pixel values of the image are already black-level corrected, that data still should not be removed. The fact that one doesn't know any usefulness of that data is not the same as there being no use for it.

---

The Adobe DNG Converter strips raw data that can be used by other software (see the ref'd thread @ dpreview about the program written by GordonBGood for Pentax raw files @ high ISO).

Do you object? Does the DNG Converter irreversibly strip data during conversion or not? Very simple question.

I've got a simple solution:  Don't use Jonathan's program.  The rest of us will remain interested and supportive of his efforts.
Title: New Lens Correction Software
Post by: deejjjaaaa on November 25, 2009, 04:19:13 pm
Quote from: bproctor
I've got a simple solution:  Don't use Jonathan's program.  The rest of us will remain interested and supportive of his efforts.
this is not about Jonathan's program, if you did not understand - this is about the DNG Converter and claims that nothing is lost... you just do not know what else was lost by people who did not save their original raw files, is being lost now by the same people, or will be lost in the future, that's it
Title: New Lens Correction Software
Post by: Jonathan Wienke on November 25, 2009, 06:53:03 pm
Quote from: deja
no, I am not, and the sky is in fact falling, as usual - please read what I am referring to:

...

Do you object? Does the DNG Converter irreversibly strip data during conversion or not? Very simple question.

Citing bugs specific to converting the RAWs from a few particular camera models to DNGs does not mean the entire DNG concept is a bad idea or that the "no RAW data is lost when converting to DNG" principle is generally false. Given the hundreds of undocumented, proprietary RAW formats Adobe has had to reverse-engineer to get DNG to where it is now, what's surprising is that such glitches aren't far more common than they are.

I am a single individual, and do not have the time or inclination to learn how to properly read hundreds of different undocumented and proprietary RAW formats. DNG allows me to shift my focus as a programmer from dealing with RAW format hell to the actual core functionality of the program--correcting lens aberrations and distortions. If you have any realistic suggestions for alternative input file formats that will allow me to continue focusing on the actual program instead of properly parsing hundreds of different input file formats (which would probably put YOUR camera on the "not supported" list), I'm all ears. But if not, STFU and quit wasting my time and LL's bandwidth. DNG may not be a perfect solution, but IMO it's telling that the DNG denigrators have yet to offer a realistic alternative input file format...

And BTW, converting to DNG doesn't mean you need to erase or alter the original RAW. So if a particular version of DNG Converter doesn't convert said file properly, and the bug is fixed in a later version, why yes, the improper conversion IS reversible--simply re-convert the RAW with the new version of ACR or DNG converter. Very simple answer.
Title: New Lens Correction Software
Post by: deejjjaaaa on November 26, 2009, 11:37:10 am
Quote from: Jonathan Wienke
Citing bugs specific to converting the RAWs from a few particular camera models to DNGs

how do you know that it is only a few? Did you test the rest?

Quote from: Jonathan Wienke
does not mean the entire DNG concept is a bad idea

communism is a nice idea too... theoretically.

Quote from: Jonathan Wienke
or that the "no RAW data is lost when converting to DNG" principle is generally false.

well, the problem is that the implementation was always flawed before, is flawed now, and still Adobe along w/ some DNG fans are claiming that conversion does not miss anything... while in real life DNG conversions are losing data, and you just do not know what else is lost, as it is closed source.


Quote from: Jonathan Wienke
Given the hundreds of undocumented, proprietary RAW formats Adobe has had to reverse-engineer to get DNG to where it is now, what's surprising is that such glitches aren't far more common than they are.

well, that is one reason why people should stay away from buggy software like the DNG Converter

Quote from: Jonathan Wienke
I am a single individual, and do not have the time or inclination to learn how to properly read hundreds of different undocumented and proprietary RAW formats. DNG allows me to shift my focus as a programmer from dealing with RAW format hell to the actual core functionality of the program--correcting lens aberrations and distortions. If you have any realistic suggestions for alternative input file formats that will allow me to continue focusing on the actual program instead of properly parsing hundreds of different input file formats (which would probably put YOUR camera on the "not supported" list), I'm all ears. But if not, STFU and quit wasting my time and LL's bandwidth. DNG may not be a perfect solution, but IMO it's telling that the DNG denigrators have yet to offer a realistic alternative input file format...

the source code of dcraw is open; it is mirrored and enhanced by http://www.libraw.org (http://www.libraw.org)


Quote from: Jonathan Wienke
And BTW, converting to DNG doesn't mean you need to erase or alter the original RAW. So if a particular version of DNG Converter doesn't convert said file properly, and the bug is fixed in a later version, why yes, the improper conversion IS reversible--simply re-convert the RAW with the new version of ACR or DNG converter. Very simple answer.

well, you in fact just do not know if the DNG Converter converts it properly or not, so you should never erase the original raw... not yesterday, not today, not tomorrow... which simply means that DNG is unsuitable for archiving, unless you are archiving the original raw file as well.

Title: New Lens Correction Software
Post by: Jonathan Wienke on November 26, 2009, 07:29:39 pm
Quote from: deja
how do you know that it is only a few? Did you test the rest?

It works perfectly for every camera I've tried: 4 Canon camera models, 4 or 5 Nikon models, a Hasselblad MFDB, and a couple of digicams.

Quote
problem is that the implementation was always flawed before, is flawed now, and still Adobe along w/ some DNG fans are claiming that conversion does not miss anything... while in real life DNG conversions are losing data, and you just do not know what else is lost, as it is closed source.

You are full of crap. On the cameras I've tested, there is zero difference between converting the original RAW and converting a DNG; comparing converted images gives a pixel-for-pixel match. If RAW data were being lost, there would be a detectable difference somewhere. And DNG is not closed source; you can download all of the specifications, as well as the source code needed to read and write DNG files, for free from Adobe. If you have questions about what is happening to the data, you have the ability to look at the code and see exactly what it is doing to your images.

Quote
well, that is one reason why people should stay away from buggy software like the DNG Converter

It works just fine for most of the cameras out there, or people wouldn't be using Adobe software.

Quote
the source code of dcraw is open; it is mirrored and enhanced by http://www.libraw.org (http://www.libraw.org)

Thanks, but no thanks. It's not as easy to integrate into external projects, and it would require me to update my software every time a new camera is released. With the freely downloadable DNG SDK, I only need to update the file parsing code when a new version of the DNG spec is released, which is far less frequent than the release of new cameras.

Quote
well, you in fact just do not know if the DNG Converter converts it properly or not, so you should never erase the original raw... not yesterday, not today, not tomorrow... which simply means that DNG is unsuitable for archiving, unless you are archiving the original raw file as well.

Only for the few cameras that don't get converted properly. Whenever you do a file conversion or any sort of copying, you should always verify the copied/converted files are good before deleting the originals. It's not that hard to do. If you're really paranoid, you can step through the operation of the DNG SDK source code and verify with whatever level of detail you desire how correctly your RAWs are being converted. The fact is, I've tested numerous cameras from several different manufacturers, and had zero problems with DNG. BTW, the Library of Congress disagrees with you, and recommends DNG for long-term image archiving.

I'm done discussing this subject with you. Your arguments are based on fearmongering and falsehoods, and you don't have any constructive alternative input file format suggestion to offer. You are hereby added to my ignore list.
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 06, 2009, 02:41:43 pm
Getting back to the original focus of the thread, I've been working on the interpolation algorithms used to correct barrel/pincushion distortion and chromatic aberrations. I posted a demonstration program that allows you to open a JPEG, TIF, or BMP file and view it rescaled from 6-6400% of its original size. It's just a tech demo, so it doesn't have any of the correction features discussed above; it only demonstrates the resampling interpolation.

Here is a screen shot:
(http://visual-vacations.com/media/PixelClarity.jpg)

You can download the ZIPped setup folder here:
http://visual-vacations.com/media/PixelClarity.zip (http://visual-vacations.com/media/PixelClarity.zip)

You'll need the latest .Net framework on your machine for this to work.

Known issues:
Error on startup due to a missing database file. Click continue and all should be well. The missing database will eventually be used to store PSF data.

Some very minor aliasing is sometimes visible at magnifications around 50%.

I'm looking for feedback on the quality of the interpolation. I've designed things to maximize sharpness and minimize aliasing, jaggies, and other artifacts. Please let me know how well you think I've achieved these goals, and why or why not. Thanks in advance!
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 07, 2009, 08:17:40 pm
I've fixed the missing database file error, and a few small bugs in the interpolation code. I also added a check box so the barrel/pincushion adjustments are optional.

The download link is the same, simply download the updated ZIP file, extract, and then run the new installer.
Title: New Lens Correction Software
Post by: stewarthemley on December 08, 2009, 03:48:16 am
Continue to ignore the negative comments, Jonathan. If you get it to work it will be a worthwhile program. Is it Mac or PC?
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 08, 2009, 09:25:41 am
Quote from: stewarthemley
Continue to ignore the negative comments, Jonathan. If you get it to work it will be a worthwhile program. Is it Mac or PC?

PC for now. Have you downloaded the interpolation demo?
Title: New Lens Correction Software
Post by: Jeremy Payne on December 08, 2009, 09:35:24 am
Quote from: Jonathan Wienke
I've fixed the missing database file error, and a few small bugs in the interpolation code. I also added a check box so the barrel/pincushion adjustments are optional.

The download link is the same, simply download the updated ZIP file, extract, and then run the new installer.

It worked for me on both Vista 64/Business and W7/RC ... I didn't get to rigorously compare to other interpolations, but will compare against CS4 on my nice monitor tonight.
Title: New Lens Correction Software
Post by: stewarthemley on December 08, 2009, 10:45:31 am
Quote from: Jonathan Wienke
PC for now. Have you downloaded the interpolation demo?

No, I'm Mac. Guess I'll have to be patient.
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 08, 2009, 12:00:11 pm
Quote from: stewarthemley
No, I'm Mac. Guess I'll have to be patient.

Or you could try one of the Windows-on-a-Mac options...

I uploaded a new version that decreases memory requirements and gracefully handles out-of-memory errors that may occur if you open a very large image file. The link is the same: http://www.visual-vacations.com/media/PixelClarity.zip (http://www.visual-vacations.com/media/PixelClarity.zip)
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 08, 2009, 10:26:55 pm
I'm shifting focus to the actual deconvolution stuff now, so it will probably be a while before I post any more updates. But in the meantime, if anyone could post feedback on the interpolation, I'd appreciate it.
Title: New Lens Correction Software
Post by: minnend on December 09, 2009, 02:27:49 am
Quote from: Jonathan Wienke
I'm shifting focus to the actual deconvolution stuff now, so it will probably be a while before I post any more updates. But in the meantime, if anyone could post feedback on the interpolation, I'd appreciate it.

Hi Jon.  Good luck with your program!  It's quite ambitious but would be a very useful tool.  I'd love to see you open-source the project and/or collaborate with other projects that have already addressed some of these challenges (hugin comes to mind).

I grabbed your program and ran it on my XP box.  It appears to work properly and the interpolation looks good.  I compared it to results from ImageMagick using the Mitchell and Lanczos filters.  Honestly, it's hard to tell a difference.  I'm generally of the opinion that box and bilinear filters are bad, but once you get past that level of complexity, you immediately enter the realm of (rapidly) diminishing returns, especially for a general-purpose filter.

The one major distinguishing factor is that your program was very slow.  I'm running a Core 2 2.1GHz machine, so not the fastest, but hardly a slouch.  I tested with an 800x533 image and it took several seconds to interpolate.  ImageMagick is *not* known for its speed, but it was significantly quicker on the same image.  I'm assuming you're focusing on quality and not speed/optimization at this point, but I thought I would be remiss not to mention it.

I'm interested in which approach to deconvolution you're planning to implement.  There's been some *very* interesting research on single-image blind deconvolution lately.  I know you're not going that route from the previous discussion, but you may want to search for those papers if only because they're fascinating.  Are you aiming for spatially-variant PSF estimation?  You did mention a target with multiple point sources, but I wasn't sure if that was to help get a more robust single PSF estimate or if you wanted multiple estimates across the image.  If the latter, are you thinking of region-based deconvolution, or will you interpolate the PSF for fully continuous variation?  Finally, there's the algorithm itself... Richardson-Lucy?  It seems to add too much ringing unless your PSF is *perfect*.  Again, there's been some nice research in recent years (SIGGRAPH, CVPR, etc.) on using natural priors, edge-preserving filters, and multiscale methods to improve the results, sometimes dramatically.
Title: New Lens Correction Software
Post by: Jeremy Payne on December 09, 2009, 03:24:02 pm
Quote from: Jeremy Payne
It worked for me on both Vista 64/Business and W7/RC ... I didn't get to rigorously compare to other interpolations, but will compare against CS4 on my nice monitor tonight.
I compared it to the default bicubic interpolation in the main CS4 window.

At 50%, it was hard to have a preference ... the lack of color management made it harder, but I might give CS4 a small edge.

At 200%, again - hard to pick, but I would say here there was more of a clear edge to CS4.

At 400%, they are VERY different.  I think I prefer yours - looks a bit more natural and certainly less "pixely".

Hope that helps ... I could devise a more rigorous set of tests ... but if you have a test script in mind, I'd do some more ...
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 09, 2009, 09:07:32 pm
Yes, it is slow; the design is biased more toward quality than performance. That said, I've been working on ways to speed things up without compromising output quality. For upsizing, I'm using a cubic spline based algorithm, but it uses splines going vertically, horizontally, and crisscrossed diagonally to reduce the appearance of pixelation and jaggies. Part of the reason for this approach is to build something that should work well for Bayer interpolation, so that I have an alternative to the interpolation done by ACR. For downsizing, I'm using a weighted-averaging scheme tuned to maximize detail without crossing the line into aliasing. I have run across some instances where the jaggie suppression isn't working right on heavily sharpened images or text that isn't anti-aliased, so I'll probably chase that down and beat it into submission before shifting gears.

Deconvolution is based on an array of PSFs labeled by camera, lens, focal length, and distance from what I'm calling the "logical center" of the image. When using a non-shift lens, the logical center and the physical center of the image are the same. But when using a shift lens, the logical center moves away from the physical center of the image in the direction and amount of shift. Each PSF is a set of splines. Each spline is tagged with an angle (deviation from logical center), and points on the spline are tagged with a distance from the "master pixel" and a  percentage of signal from the "master pixel" that is expected to spill over into a "blur pixel" at that angle/distance. PSF data is generated by analyzing an image of a target consisting of small white dots (or possibly small light sources) on a black background, arranged in a rectangular grid so that the distortion, CA and vignetting characteristics of the lens can be analysed as well as blur.
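In data-structure terms, that maps to something like the following sketch (the names are illustrative, not the program's actual schema):

```python
from dataclasses import dataclass

@dataclass
class SpillKnot:
    distance: float  # pixels from the master pixel
    spill: float     # fraction of the master pixel's signal landing here

@dataclass
class BlurSpline:
    angle: float     # direction, measured relative to the logical center
    knots: list      # list of SpillKnot, the spline's control points

@dataclass
class StoredPSF:
    camera: str
    lens: str
    focal_length: float
    aperture: float
    radius: float    # distance from the logical center of the image
    channel: int     # 0=R, 1=G, 2=B
    splines: list    # list of BlurSpline, one per sampled angle
```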

Deconvolution is a two-step process: first, a custom PSF is interpolated for the region around each pixel; second, that PSF is used to reverse the blur it describes.

During deconvolution, each master pixel has a custom PSF interpolated from the nearest PSFs stored in the database. Not necessarily EVERY pixel, but interpolation is done often enough to avoid any noticeable borders or changes in the image where deconvolution switches from one PSF to another.

I suppose there's already a name for this general algorithm, but right now I don't know what it is.
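For reference, the region-based variant of this idea is often built on Richardson-Lucy deconvolution (which minnend mentioned earlier). A rough sketch, where psf_for(y, x) is a hypothetical stand-in for the database interpolation:

```python
import numpy as np
from skimage.restoration import richardson_lucy  # scikit-image

def deconvolve_tiled(img, psf_for, tile=128, iters=20):
    # Interpolate a PSF for each tile's center and deconvolve that tile;
    # a real implementation would overlap tiles to hide the seams.
    out = np.empty_like(img)
    for y in range(0, img.shape[0], tile):
        for x in range(0, img.shape[1], tile):
            block = img[y:y + tile, x:x + tile]
            psf = psf_for(y + tile // 2, x + tile // 2)
            out[y:y + tile, x:x + tile] = richardson_lucy(block, psf, iters)
    return out
```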
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 09, 2009, 09:32:12 pm
Quote from: Jeremy Payne
I compared it to the default bicubic interpolation in the main CS4 window.

At 50%, it was hard to have a preference ... the lack of color management made it harder, but I might give CS4 a small edge.

At 200%, again - hard to pick, but I would say here there was more of a clear edge to CS4.

At 400%, they are VERY different.  I think I prefer yours - looks a bit more natural and certainly less "pixely".

Hope that helps ... I could devise a more rigorous set of tests ... but if you have a test script in mind, I'd do some more ...

That is helpful. When downsizing, I'm trying to retain as much detail as possible without aliasing, and when upsizing, I'm trying to maintain maximum sharpness, detail, and contrast without causing halos or clipping, and to give heavily upsized areas a smooth, natural-looking, "out of focus" appearance without any obvious pixel-based artifacts. The goal is to be able to go all the way to 6400% without getting any "digital looking" artifacts. It's not quite there yet, but fairly close.

The interpolation has to be able to handle upsizing and downsizing simultaneously and seamlessly. With barrel distortion, pixels that are halfway between the center and corners need to be moved toward the center, so when correcting this, the middle of the image is being downsized (pixels packed more closely together) and the edge of the image is being upsized (pixels stretched farther apart).
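To make the geometry concrete, here's a minimal one-coefficient radial remap (the classic r' = r(1 + k1*r^2) model; grayscale and bilinear for brevity, not the spline-based resampler described above):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def correct_radial(img, k1):
    # Each output pixel samples the source at r_src = r * (1 + k1 * r^2),
    # with coordinates normalized to [-1, 1] per axis. With k1 > 0 this
    # corrects barrel distortion: the center is packed tighter (downsized)
    # while the edges are stretched apart (upsized).
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    ny, nx = (yy - cy) / cy, (xx - cx) / cx
    scale = 1.0 + k1 * (nx ** 2 + ny ** 2)
    return map_coordinates(img, [cy + ny * scale * cy, cx + nx * scale * cx],
                           order=1, mode='nearest')
```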
Title: New Lens Correction Software
Post by: ejmartin on December 14, 2009, 08:48:43 am
Quote from: Jonathan Wienke
I'm shifting focus to the actual deconvolution stuff now, so it will probably be a while before I post any more updates. But in the meantime, if anyone could post feedback on the interpolation, I'd appreciate it.


You might be interested in Bart van der Wolf's investigations of downsampling methods if you weren't already aware:

http://www.xs4all.nl/~bvdwolf/main/foto/do...le/example1.htm (http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/example1.htm)
http://www.xs4all.nl/~bvdwolf/main/foto/do...down_sample.htm (http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm)

Also there is some good discussion in the IM webpages:

http://www.imagemagick.org/Usage/resize/ (http://www.imagemagick.org/Usage/resize/)
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 14, 2009, 10:19:15 am
Interesting stuff in IM's web page. What I'm doing for downsizing is a heavily modified box filter; if a pixel falls completely within the "box" it contributes fully to the box value, but if it intersects the edge of the box, then the pixel's value is split between the adjacent boxes. I'm doing a bit of weighting so that if a pixel is not perfectly centered on the edge of the box (which would evenly split the pixel value between boxes) the split gets exaggerated somewhat, so that a 60/40 split might get increased to ~70/30. By tuning the "exaggeration factor", you can significantly increase sharpness without causing too much aliasing, eliminating the need for a separate sharpening step.
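Here's a 1-D sketch of that modified box filter (the "exaggeration" parameter name is mine, and the real implementation is 2-D):

```python
import numpy as np

def sharp_box_downsample(signal, factor, exaggeration=1.4):
    # Each source pixel spans 1/factor of an output "box". A pixel that
    # straddles a box edge has its value split between the two boxes,
    # and the split is pushed away from 50/50 to boost sharpness.
    n_out = int(len(signal) // factor)
    acc, wgt = np.zeros(n_out), np.zeros(n_out)
    for i, v in enumerate(signal):
        lo, hi = i / factor, (i + 1) / factor  # span in output coordinates
        b = int(lo)
        if b >= n_out:
            break
        if int(hi) == b or b == n_out - 1:
            acc[b] += v  # pixel falls entirely within one box
            wgt[b] += 1.0
        else:
            f = (b + 1 - lo) / (hi - lo)  # fraction falling in the left box
            f = min(1.0, max(0.0, 0.5 + exaggeration * (f - 0.5)))
            acc[b] += f * v
            wgt[b] += f
            acc[b + 1] += (1.0 - f) * v
            wgt[b + 1] += 1.0 - f
    return acc / np.maximum(wgt, 1e-12)
```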

For upsizing, I'm using a modified natural cubic spline function. Each pixel is a "knot" for splines running vertically, horizontally, and diagonally. To interpolate a pixel, I'm doing something similar to bilinear interpolation, except I'm blending the spline values from the 4 surrounding pixels instead of the pixel values themselves, and I'm blending the diagonal spline values as well as horizontal/vertical. I'm still fine-tuning the blending function to give the most natural "out-of-focus" look to heavy enlargement and the least jaggies and other obvious pixel-based artifacts.
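And a 1-D slice of the upsizing idea, using SciPy's natural cubic spline (the real version blends horizontal, vertical, and diagonal splines per output pixel, so this only conveys the flavor):

```python
import numpy as np
from scipy.interpolate import CubicSpline

def upsample_row(row, factor):
    # Each source pixel is a knot of a natural cubic spline; evaluate the
    # spline on a finer grid to upsample.
    x = np.arange(len(row), dtype=float)
    spline = CubicSpline(x, row, bc_type='natural')
    xi = np.linspace(0.0, len(row) - 1.0, int(round(len(row) * factor)))
    return spline(xi)
```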

It's nice to see that I'm not really "reinventing the wheel" all that much...
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 20, 2009, 07:50:40 pm
I uploaded a new version with minor tweaks to interpolation and major changes to the under-the-hood design to significantly reduce memory use and speed things up a bit. It's still not super-fast, but will handle much larger image files before running out of memory.
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 26, 2009, 08:29:44 am
Quote from: Jonathan Wienke
I uploaded a new version with minor tweaks to interpolation and major changes to the under-the-hood design to significantly reduce memory use and speed things up a bit. It's still not super-fast, but will handle much larger image files before running out of memory.

Hi Jonathan,

First of all, thanks for the initiative and for making the first trial available. I wanted to give your software a try with my torture test (http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm (http://www.xs4all.nl/~bvdwolf/main/foto/down_sample/down_sample.htm)). Unfortunately I ran into a problem with your 0_0_1_8 version: errors referring to DotNet at startup, but I have the latest ones installed (.Net Framework V1.1 and 3.5 SP1), and I have no complaints from other software (including Visual Studio). I can't find a reference to version 2 being installed any more; is that what your application depends on?

If you want to try and clear the issue, feel free to send me a PM so we don't clutter this thread.

Cheers,
Bart
Title: New Lens Correction Software
Post by: Guillermo Luijk on December 26, 2009, 06:14:46 pm

Hi Bart, I was reading your resizing tests, and from my experience non-RGB mode image files are always rescaled using nearest neighbour, no matter which algorithm is selected, so I think it is not worth even considering or mentioning PS resizing capabilities in non-RGB mode.

Regards
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 26, 2009, 06:49:40 pm
Quote from: Guillermo Luijk
Hi Bart, I was reading your resizing tests, and from my experience non-RGB mode image files are always rescaled using nearest neighbour, no matter which algorithm is selected, so I think it is not worth even considering or mentioning PS resizing capabilities in non-RGB mode.

Hi Guillermo,

Thanks for the observation. At the time (in 2004) I was not aware of the aberrant behavior of Photoshop; since then I know better. I thought it important enough to mention, because I usually allow others to verify/duplicate my findings when practical. For those who would like to verify my findings, it is important to mention the caveats (in this case shortcomings of Photoshop), to avoid confusion. That's why I pointed it out instead of avoiding it.

It is also important to understand that Photoshop doesn't offer anything better than bicubic resampling for downsampling (which is a common procedure for web publishing), while it's been known for some time now that e.g. a Sinc based downsampling filter provides better results. Downsampling offers a different Digital Signal Processing (DSP) challenge than upsampling does.

That's why I wanted to see how Jonathan's adaptive method does, expecting more than necessary aliasing artifacts, but unfortunately I was unable to verify it.

Cheers,
Bart
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 26, 2009, 07:12:31 pm
I updated the installer and tweaked the upsizing algorithm a bit. Perhaps the new version will fix the file error, please try it and let me know.
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 27, 2009, 06:29:42 am
Quote from: Jonathan Wienke
I've fixed the missing database file error, and a few small bugs in the interpolation code. I also added a check box so the barrel/pincushion adjustments are optional.

The download link is the same, simply download the updated ZIP file, extract, and then run the new installer.

Hi Jonathan,

Thanks for the update. I can get this 0_0_1_9 release to work, but I have to ignore an error message when I start it:
[attachment=18915:JW_PixelClarity.png]
Setup installed it in the "c:\Users\Bart\AppData\Local\Apps\2.0\..." directory (Vista Ultimate 64-bit).


I'll do some testing with images and let you know my findings.

Cheers,
Bart
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 27, 2009, 02:39:05 pm
Quote from: BartvanderWolf
Hi Jonathan,

Thanks for the update. I can get this 0_0_1_9 release to work, but I have to ignore an error message when I start it:

For some reason, you're missing the ADODB database DLLs needed to open the PSF database file. It's not really being used right now, but will be needed for deconvolution.
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 28, 2009, 03:31:13 am
I updated the installer again; I made a few more bug fixes and changed the comparison interpolation to nearest neighbor. This makes it easier to tell whether the spline interpolation is clipping/ringing/haloing or if the artifacts are in the original image.

http://www.visual-vacations.com/media/PixelClarity.zip (http://www.visual-vacations.com/media/PixelClarity.zip)
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 28, 2009, 09:56:55 pm
Yet another update which tunes the blending between the upsized and downsized areas, and should also fix the database file open error issue. Bart, if you could re-download and see if you still get the startup error, I'd appreciate it.
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 29, 2009, 09:24:59 pm
Quote from: Jonathan Wienke
Yet another update which tunes the blending between the upsized and downsized areas, and should also fix the database file open error issue. Bart, if you could re-download and see if you still get the startup error, I'd appreciate it.

Jonathan,

Thanks for looking into what is apparently my specific issue (I haven't heard anybody else mention it). Unfortunately the same error box appears (with version 0_0_1_12). The application does start, and it does its resampling. The downsampling generates aliasing errors, but for images with less critical content it does produce crisp results. The upsampling looks a bit gritty (not smooth) around edges; it seems (if I recall correctly) that the earlier versions were smoother in the upsampling.

Cheers,
Bart
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 30, 2009, 09:07:53 am
Quote from: BartvanderWolf
The downsampling generates aliasing errors, but for images with less critical content it does produce crisp results. The upsampling looks a bit gritty (not smooth) around edges; it seems (if I recall correctly) that the earlier versions were smoother in the upsampling.

Can you post a sample of the "grittiness" you're referring to? I'm not quite sure what you mean.

The aliasing is there, especially between 50% and 100%, but it's a trade-off between aliasing and sharpness, so I biased the parameters more toward sharpness than aliasing suppression. Perhaps I should add a sharpness or aliasing prevention control.

To resolve your startup problem, you may need to download and reinstall the MS Jet database engine. Try this link:
http://www.microsoft.com/downloads/details...;displaylang=en (http://www.microsoft.com/downloads/details.aspx?familyid=2deddec4-350e-4cd0-a12a-d7f70a153156&displaylang=en)
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 31, 2009, 09:16:04 am
Quote from: Jonathan Wienke
Can you post a sample of the "grittiness" you're referring to? I'm not quite sure what you're referring to.

Hi Jonathan,

The grittiness is caused by pixelization/blocking artifacts. Here's an example based on my 'torture' test target:
[attachment=19049:PC_02.png]
Aliasing/moiré is visible in downsampled images; it is just a matter of luck whether the image detail can mask it:
[attachment=19050:PC_01.png]

Quote
The aliasing is there, especially between 50% and 100%, but it's a trade-off between aliasing and sharpness, so I biased the parameters more toward sharpness than aliasing suppression. Perhaps I should add a sharpness or aliasing prevention control.

In general I think it's better to do the downsizing properly (i.e. after removing high spatial frequency content with a filter), and add an option for sharpening the result.
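In sketch form, that's a low-pass filter followed by decimation. The Gaussian sigma heuristic below is mine, and a windowed-sinc filter would be the more rigorous choice:

```python
from scipy.ndimage import gaussian_filter

def prefiltered_downsample(img, factor):
    # Remove spatial frequencies above the new Nyquist limit, then
    # decimate by an integer factor; sharpen afterwards if desired.
    return gaussian_filter(img, sigma=0.5 * factor)[::factor, ::factor]
```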

Quote
To resolve your startup problem, you may need to download and reinstall the MS Jet database engine. Try this link:
http://www.microsoft.com/downloads/details...;displaylang=en (http://www.microsoft.com/downloads/details.aspx?familyid=2deddec4-350e-4cd0-a12a-d7f70a153156&displaylang=en)

I looked at that, but it seems to address a Windows XP problem; I'm not going to try that on my Vista Ultimate 64-bit version unless there's no other way. The missing DLL seems to be a part of Visual Studio 6 and Office 2003. I'm not sure how that will play out on newer systems (e.g. Win 7 and Office 2007/2010); perhaps there is an alternative DLL that can be invoked? Just thinking aloud, to future-proof your application. Thanks for your suggestions anyway.

Cheers, and a happy new year,
Bart
Title: New Lens Correction Software
Post by: ejmartin on December 31, 2009, 12:48:11 pm
Bart, what are we looking at in these two sets of side-by-side images?
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 31, 2009, 06:56:50 pm
Quote from: ejmartin
Bart, what are we looking at in these two sets of side-by-side images?

The image on the left is resampled with my resampling algorithms, and the one on the right with a simple nearest-neighbor algorithm. I include the nearest-neighbor version purely for comparison: it makes it easier to see whether the "real" upsizing algorithm is having issues with ringing or clipping, or whether those things are present in the original image and therefore not the fault of the resizing. For downsizing, the nearest-neighbor version offers a reference for judging how well the downsizing algorithm is filtering out aliasing artifacts.

The image on the left is the one to compare to other resampling algorithms, NOT the one on the right. "Grittiness" and aliasing are to be expected on the right side; my only concern is whether you find that sort of thing on the left side.
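
For anyone who wants to build this kind of test image themselves, here's a rough Pillow sketch (Lanczos stands in for my algorithm, which isn't callable outside the program; the file names are placeholders):

Code:
from PIL import Image

def comparison_pair(path, new_size):
    # Left half: a "real" resampler (Lanczos here). Right half:
    # nearest-neighbor, which cannot ring or halo, so anything visible
    # on both sides was already in the original image.
    src = Image.open(path)
    left = src.resize(new_size, Image.LANCZOS)
    right = src.resize(new_size, Image.NEAREST)
    pair = Image.new(src.mode, (new_size[0] * 2, new_size[1]))
    pair.paste(left, (0, 0))
    pair.paste(right, (new_size[0], 0))
    return pair

comparison_pair("original.png", (1200, 800)).save("side_by_side.png")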
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 31, 2009, 07:04:59 pm
Quote from: BartvanderWolf
I looked at that, but it seems to address a Windows XP problem, I'm not going to try that on my Vista Ultimate 64-bit version unless there's no other way. The missing DLL seems to be a part of Visual Studio 6 and Office 2003. I'm not sure how that will play out with newer computer systems (e.g. Win 7 and Office 2007/2010), perhaps there is an alternative DLL that can be invoked? Just thinking aloud, to future proof your application. Thanks for your suggestions anyway.

The DLL coexists peacefully with Office 2007 on 32-bit Vista, so I doubt you'd have issues. It's an older MS database engine (Jet/ADODB), but the newer one that is supposed to replace it is slower, has fewer capabilities, AND is harder to code for, so there's been considerable resistance to switching--even more than the resistance to giving up XP in favor of Vista. It's still supported in Office 2007 and VB.Net, so I doubt it's going away any time soon.

BTW, I've been using your ring images for torture testing and fine-tuning the balance between aliasing and sharpness.
Title: New Lens Correction Software
Post by: jjlphoto on December 31, 2009, 08:14:33 pm
I read some articles recommending conversion to a custom RGB space with a gamma of 1 when you are doing serious interpolation; it keeps color artifacts in check. If you are using a Photoshop 32-bit file, it will already be in a gamma 1 space, but otherwise you have to do the conversion manually.
Title: New Lens Correction Software
Post by: Bart_van_der_Wolf on December 31, 2009, 08:56:35 pm
Quote from: Jonathan Wienke
The image on the left is resampled with my resampling algorithms, and the one on the right is using a simple nearest-neighbor algorithm.

Ah, thanks for clarifying that! I must have somehow missed that info; I thought it was the other way around.

In that case you'll notice that the edge pixels need special treatment: add enough virtual pixels to accommodate the filter's support size. Personally I prefer mirrored virtual pixels.
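
In numpy terms, this is all I mean by mirrored virtual pixels (pad by at least the filter's support radius before filtering):

Code:
import numpy as np

# 'reflect' mirrors without repeating the edge sample; 'symmetric'
# repeats it. Either avoids the fringe you get from padding with zeros.
row = np.array([10.0, 20.0, 30.0, 40.0])
print(np.pad(row, pad_width=3, mode='reflect'))
# [40. 30. 20. 10. 20. 30. 40. 30. 20. 10.]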

Cheers,
Bart
Title: New Lens Correction Software
Post by: Jonathan Wienke on December 31, 2009, 08:58:47 pm
Quote from: jjlphoto
I read some articles about converting the file to a custom RGB space with a gamma of 1 when you are doing serious interpolation. Keeps color artifacts in check.

That is incorrect. Using a linear gamma causes perceptually non-linear luminance transitions: a single white pixel on a black background will be upsized differently than a black pixel on a white background. For resizing interpolation, you need to work with the image data at a gamma of ~2 to get the resized white pixel and the resized black pixel to come out the same size.
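
If anyone wants to see the asymmetry for themselves, here's a rough Python/Pillow sketch (bicubic stands in for a real resampler, the source data is assumed to be gamma 2.2 encoded, and the exact numbers will vary with the resampler). It upsizes a white-dot-on-black image and its negative in two working gammas, then measures how far the results are from being negatives of each other:

Code:
import numpy as np
from PIL import Image

def resize_in_gamma(img8, size, working_gamma):
    # Decode 8-bit data (assumed gamma 2.2 encoded) into the working
    # gamma, resize there, then re-encode back to gamma 2.2.
    v = (np.asarray(img8, np.float64) / 255.0) ** (2.2 / working_gamma)
    im = Image.fromarray(np.uint8(np.round(v * 255))).resize(size, Image.BICUBIC)
    w = (np.asarray(im, np.float64) / 255.0) ** (working_gamma / 2.2)
    return np.round(w * 255).astype(int)

dot = np.zeros((9, 9), np.uint8)
dot[4, 4] = 255                        # one white pixel on black
for g in (1.0, 2.2):                   # linear vs. ~monitor gamma
    up = resize_in_gamma(dot, (90, 90), g)
    dn = resize_in_gamma(255 - dot, (90, 90), g)
    # With symmetric behavior, up and the negative of dn would match:
    print("working gamma", g, "max asymmetry:", np.abs(up - (255 - dn)).max())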
Title: New Lens Correction Software
Post by: Jonathan Wienke on January 01, 2010, 11:14:03 am
Quote from: BartvanderWolf
Ah, thanks for clarifying that! Must have somehow missed that info, I thought it was the other way around.

Yeah, that makes a difference...

Quote
In that case you'll notice that the edge pixels need special treatment by adding enough virtual pixels to accommodate the filter's support size.

I'm doing something a little different that eliminates the need to do a standalone filtration process before downsampling; the downsampling and low-pass filtration are combined into a single step.

The edge oddness you see with some images comes from extrapolating a few extra pixels past the edges of the image. With spline interpolation, the original pixels are essentially points at the centers of 1-pixel-wide "boxes" (before rescaling). You have to extrapolate at least half a pixel (pre-rescaling) around the edges of the image, or you end up slightly cropping the image when you resize--you lose 1 pixel horizontally and 1 pixel vertically. To solve this, I allow the algorithm to extrapolate that half-pixel around the edges, plus an additional pixel.
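
To make the half-pixel point concrete, here's a tiny numpy illustration of the coordinate mapping (just the arithmetic, not the program's actual code):

Code:
import numpy as np

def output_centers(n_in, scale):
    # Map output pixel centers back into input coordinates, treating
    # each original pixel as the center of a 1-pixel-wide box.
    n_out = int(round(n_in * scale))
    return (np.arange(n_out) + 0.5) / scale - 0.5

print(output_centers(4, 2.0))
# [-0.25  0.25  0.75  1.25  1.75  2.25  2.75  3.25]
# The input samples sit at 0..3, so the first and last output centers
# fall outside the data (approaching half an input pixel for large
# scale factors), which is why some edge extrapolation is unavoidable.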
Title: New Lens Correction Software
Post by: Jonathan Wienke on January 01, 2010, 07:50:27 pm
Quote from: Jonathan Wienke
The image on the left is resampled with my resampling algorithms, and the one on the right is using a simple nearest-neighbor algorithm. I use the nearest-neighbor simply for comparison, it makes it easier to see if the "real" upsizing algorithm is having issues with ringing or clipping, or if those things are present in the original image and therefore not the fault of the resizing. For downsizing, the nearest-neighbor version offers a comparison to see how well the downsizing algorithm is filtering out aliasing artifacts.

Here's some samples of what I'm referring to:

Upsizing:
[attachment=19102:Upsize.png]

Looking at the left-side image alone, one might wonder whether the halos around the rocks in the water (on the right side of the frame) are present in the original image or are caused by upsizing artifacts. By comparing with the nearest-neighbor image on the right (nearest-neighbor resizing cannot cause ringing/haloing problems), you can see that the halos are present in the original image and therefore are not upsizing artifacts.

Downsizing:
[attachment=19103:Downsize.png]

Here, we can compare the intensity of the aliasing patterns to see how well the "real" downsizing algorithm on the left suppresses aliasing compared to the nearest-neighbor image on the right. As you can see, some aliasing remains, but most of it is filtered out.
Title: New Lens Correction Software
Post by: Jonathan Wienke on January 03, 2010, 12:12:59 am
Another update. New features include:


[attachment=19132:AliasDn.png]
Title: New Lens Correction Software
Post by: Jonathan Wienke on January 07, 2010, 07:06:40 pm
Another update, 0.0.1.14--mostly internal redesign to allow floating-point storage from start to finish through all stages of processing (to avoid rounding errors), while still allowing large images to be processed without out-of-memory exceptions. The progress-indication system has been tweaked a bit as well.
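
Schematically, the memory-management idea is something like this (a much-simplified Python/numpy sketch, not the actual implementation; fn stands for any shape-preserving float filter):

Code:
import numpy as np

def process_in_strips(img, fn, strip_height=256, overlap=16):
    # Apply fn strip by strip so only a small float32 working set is in
    # memory at once. overlap must cover the filter's support so the
    # strip seams don't show. fn must return an array of the same shape
    # it was given.
    h = img.shape[0]
    out = np.empty(img.shape, np.float32)
    for y0 in range(0, h, strip_height):
        y1 = min(h, y0 + strip_height)
        a, b = max(0, y0 - overlap), min(h, y1 + overlap)
        strip = fn(img[a:b].astype(np.float32))
        out[y0:y1] = strip[y0 - a : strip.shape[0] - (b - y1)]
    return out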
Title: New Lens Correction Software
Post by: minnend on January 10, 2010, 03:00:33 am
Regarding jjlphoto's remark, here's some more info on one potential gamma/interpolation-related problem: http://www.all-in-one.ee/~dersch/gamma/gamma.html

Title: New Lens Correction Software
Post by: Jonathan Wienke on January 10, 2010, 08:35:47 pm
Quote from: minnend
Regarding jjlphoto's remark, here's some more info on one potential gamma/interpolation-related problem: http://www.all-in-one.ee/~dersch/gamma/gamma.html

The side effects of the "cure" (interpolating in linear) are much worse than the "disease". Converting to linear gamma doesn't fix the stripe between the green and magenta areas; with the algorithms I'm using, ringing artifacts become a big problem in dark areas, and a black pixel on a white background upsizes much differently than a white pixel on a black background:

[attachment=19349:int_linear.jpg]

Doing the distortion correction in a non-linear gamma works far better than in linear.
Title: New Lens Correction Software
Post by: Jonathan Wienke on April 21, 2010, 08:05:16 am
OK, it's been a long time since the last update. I've got the PSF generation and blur correction code working now, but it still needs a lot of fine-tuning before it will be ready for public consumption. Stay tuned...
Title: Re: New Lens Correction Software
Post by: Guldsmed on December 04, 2012, 06:28:47 am
Is there any news about your project? Still alive?
Title: Re: New Lens Correction Software
Post by: francois on December 04, 2012, 07:18:11 am
Quote from: Guldsmed
Is there any news about your project? Still alive?

Jonathan hasn't shown any activity here in more than a year…
Title: Re: New Lens Correction Software
Post by: Guldsmed on December 04, 2012, 08:00:49 am
Oh... I C - thx for the info. Not much chance for a completed project then, I guess...
Title: Re: New Lens Correction Software
Post by: francois on December 04, 2012, 08:40:25 am
Quote from: Guldsmed
Oh... I C - thx for the info. Not much chance for a completed project then, I guess...

You might try to PM Jonathan and see whether he's still alive or not! I wouldn't be too optimistic, though.
Title: Re: New Lens Correction Software
Post by: jeremypayne on December 07, 2012, 02:50:22 pm
Quote from: francois
You might try to PM Jonathan and see whether he's still alive or not! I wouldn't be too optimistic, though.

I think he may be back in the employ of the government once again...