
Author Topic: A free high quality resampling tool for ImageMagick users  (Read 250968 times)

Pictus

Re: A free high quality resampling tool for ImageMagick users
« Reply #160 on: July 28, 2014, 07:43:16 pm »

Okay, attached is version 1.1.4. This time I've only made some cosmetic script changes, like implementing default responses that can be accepted by just hitting the <Enter> key. It unfortunately doesn't display those defaults at the prompt (unless I create a lot more verbose input text), but it always uses the same ones. I've chosen the default inputs to be: 800x800 as the 'fit within' output size maxima, the optimized down-sampling algorithm, and a sharpening Amount of 100.

You hit the Bull's eye!

Wish:
Users: Instead of just giving thumbs up or thumbs down, could you describe what you like or don't like, specify which of the three main options you use, and possibly the context of your use?

Almost any time I need to resize, it will be for the web, so sharpened downsampling is the main thing for me.

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #161 on: July 29, 2014, 02:33:06 am »

Bart:
There is a possible improvement of the tuning of the EWA Lanczos 3 deblur that I've been thinking of trying out for a long time but never found the time to do carefully.
It goes like this:
Create a synthetic image which is light gray on the left and dark gray on the right, with a perfectly sharp vertical interface between the two.
Now, for all deblurs between the LanczosRadius one and the LanczosSharp one, enlarge this image a lot (128 times, say) with the corresponding EWA Lanczos (with -define filter:blur=value, where value ranges roughly from .91 to 1).
Crop the result so boundary effects don't matter, and measure the largest undershoot (or overshoot).
Question: Which deblurs are local minimizers? (There may be more than one.)
A similar question concerns the second overshoot (or undershoot).
I expect, without having any solid basis for it, that there is a local minimizer at, roughly, .94 or .95.
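For what it's worth, a rough 1D analog of this measurement can be sketched in pure Python: a tensor (sinc-based) Lanczos 3 stand-in for the 2D EWA Jinc case, reconstructing a unit step with a blur-scaled, weight-normalized kernel and scanning for the largest overshoot. The kernel handling below is my approximation of what -distort Resize does, not IM's actual code:

```python
import math

def lanczos3(x):
    # 3-lobe Lanczos (sinc-windowed sinc); the 1D analog of EWA's Jinc-windowed Jinc
    if x == 0.0:
        return 1.0
    if abs(x) >= 3.0:
        return 0.0
    px = math.pi * x
    return 3.0 * math.sin(px) * math.sin(px / 3.0) / (px * px)

def upsample_step(x, blur):
    # Reconstruct a unit step (0 for n < 0, 1 for n >= 0) at position x,
    # scaling the kernel by 'blur' and normalizing the weights.
    lo, hi = int(math.floor(x - 3.0 * blur)), int(math.ceil(x + 3.0 * blur))
    num = den = 0.0
    for n in range(lo, hi + 1):
        w = lanczos3((x - n) / blur)
        den += w
        if n >= 0:
            num += w
    return num / den

def max_overshoot(blur, samples=2048):
    # Probe finely around the edge (mimicking a large enlargement factor)
    return max(upsample_step(i / samples * 8.0 - 4.0, blur)
               for i in range(samples + 1)) - 1.0

# Plain EWA-style Lanczos 3 (blur=1) vs the sharpest end of the sweep
print(max_overshoot(1.0), max_overshoot(0.88549061701764))
```

Sweeping blur over the [0.88549061701764, 1] range with this and recording the overshoot would be the 1D version of the local-minimizer hunt; the real test still needs the 2D EWA machinery.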
-----

Hi Nicolas,

I'll see what I can do. Just to be sure, when you talk about 'deblur', you have the IM expert option (like '-define filter:blur=0.95') in mind, correct? That blur expert option does have a nice auto-scaling of the kernel support size, although it is probably less effective than a (slower) deconvolution at actually restoring some detail.

The only reservation I have about the type of target you describe is that, in practice, such a sharp edge can only be provided by a vector / CGI image source. A digital camera will always create a small amount of blur (I've rarely seen an image with less than a 0.7 Gaussian blur with very good lenses, unless already sharpened to death). That is due to the residual lens aberrations, the IR and optional optical low-pass filters, and the sensor element's area sampling (it's not a point sampler). Then there is usually Bayer CFA demosaicing required as well, to reconstruct a full RGB image from undersampled colors.

So with that in mind, while a sharp discontinuity makes a nice torture test, I'll probably also do the same test with a, say, Gaussian 0.39 pre-blurred edge (a theoretical 1 pixel wide sharp edge transition from an area sampling device). That's not a point-sampled Gaussian blur, but one with an adjusted shape, like my PSF generator tool can produce. That would allow us to design an operation that does not attempt to overcompensate for artifacts that are never encountered in real images. Cartoons and other line drawings are perhaps better vectorized before resampling anyway, although it would be nice if they too could be handled reasonably well with the same resampling method.

Quote
There is also a similar optimization that is linked to how I determined that the Keys splines are optimal, among BC-splines, for EWA resampling. It has to do with reproducing affine gradients as closely as possible, a very attractive property from a numerical analysis viewpoint. I'll keep that in mind.
The tricky thing is that infinite deblur is most likely the global winner in both cases. It's really local minimizers I'm looking for.

I'm not sure I fully understand; could you explain? I'm always interested in optimizations, so I want to understand exactly what can be done, to see if it can be implemented in a practical way ...

Cheers,
Bart
« Last Edit: July 30, 2014, 02:28:46 pm by BartvanderWolf »
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #162 on: July 29, 2014, 04:41:01 am »

Bart: Forget about the two optimizations I suggested, at least for now. I should not offload such things without being willing to do a very clear write up.
-----
I would like to suggest a change to your script that sort of addresses the same issues:
Right now, you have variable sharpening for downsampling. Could we have variable sharpening, through -define filter:blur=value, for the "generic" (for upsampling, mostly) luminance weighted gamma blended resampling (as opposed to deconvolution) method?
This would mean, for example, that you'd always use -filter Lanczos instead of -filter LanczosRadius, and that:
sharpening = 0 sets -define filter:blur=1  <- regular EWA Lanczos
sharpening = 100 sets -define filter:blur=0.88549061701764  <- the deblur for EWA LanczosSharpest 3, slightly different from the LanczosSharpest 4 value
default sharpening would reproduce LanczosRadius, obtained (since we use -filter Lanczos across the board) by setting -define filter:blur=.9264075766146068  <- the deblur that defines LanczosRadius in terms of plain Lanczos
References: http://web.cs.laurentian.ca/nrobidoux/misc/AdamTurcotteMastersThesis.pdf Section 4.8.6.7 and http://www.imagemagick.org/Usage/filter/nicolas/#upsampling
Then, hopefully, users would let us know what works for them. But they also could adjust the upsampling to the blurriness of the source material, or taste, and actually could turn up anti-aliasing by setting the sharpening level to a low value (that corresponds, for example, to EWA LanczosSharp, which I quite like).
« Last Edit: July 29, 2014, 05:09:49 am by NicolasRobidoux »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #163 on: July 29, 2014, 05:15:37 am »

...
So with that in mind, while a sharp discontinuity makes a nice torture test, I'll probably also do the same test with a, say, Gaussian 0.39 pre-blurred edge (a theoretical 1 pixel wide sharp edge transition from an area sampling device). That's not a point-sampled Gaussian blur, but one with an adjusted shape, like my PSF generator tool can produce. That would allow us to design an operation that does not attempt to overcompensate for artifacts that are never encountered in real images. Cartoons and other line drawings are perhaps better vectorized before resampling anyway, although it would be nice if they too could be handled reasonably well with the same resampling method...
Good point. I do tend to optimize based on absolute worst case scenarios.
On the other hand, put a premium lens on a medium format camera and push the result through a top-of-the-line demosaicing method, and you'd be surprised how sharp things can be.
« Last Edit: July 29, 2014, 05:24:33 am by NicolasRobidoux »

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #164 on: July 29, 2014, 06:23:14 am »

Good point. I do tend to optimize based on absolute worst case scenarios.
On the other hand, put a premium lens on a medium format camera and push the result through a top-of-the-line demosaicing method, and you'd be surprised how sharp things can be.

Nicolas, I have done that, and none of them get much below a 0.7 sigma Gaussian blur (with a 100% fill-factor assumption), at their best aperture, in the center of the image circle. Of course, most medium format cameras also alias like hell, due to the absent OLPF. While it is good to keep an eye on the absolute worst case scenario (for robustness of calculations), it can also lead to sub-optimal results for more realistic cases. Attached, as plots, is the current upsampling compromise as implemented in the script: a linear-light blend of a linear and a gamma 2 EWA LanczosRadius upsample of your suggested test target (attached in ZIP), by a factor of 128x.

Here is the blur kernel I used on the second target (after first normalizing to a sum of 1.0):
8.240474063989279e-21, 6.8060100125474815e-15, 1.1327893594730693e-11, 9.077705692513543e-11, 1.1327893594730693e-11, 6.8060100125474815e-15, 8.240474063989279e-21
6.8060100125474815e-15, 5.6212509051295795e-9, 0.0000093559856663739, 0.00007497500186815324, 0.0000093559856663739, 5.6212509051295795e-9, 6.8060100125474815e-15
1.1327893594730693e-11, 0.0000093559856663739, 0.015572062031516103, 0.12478806846616428, 0.015572062031516103, 0.0000093559856663739, 1.1327893594730693e-11
9.077705692513543e-11, 0.00007497500186815324, 0.12478806846616428, 1, 0.12478806846616428, 0.00007497500186815324, 9.077705692513543e-11
1.1327893594730693e-11, 0.0000093559856663739, 0.015572062031516103, 0.12478806846616428, 0.015572062031516103, 0.0000093559856663739, 1.1327893594730693e-11
6.8060100125474815e-15, 5.6212509051295795e-9, 0.0000093559856663739, 0.00007497500186815324, 0.0000093559856663739, 5.6212509051295795e-9, 6.8060100125474815e-15
8.240474063989279e-21, 6.8060100125474815e-15, 1.1327893594730693e-11, 9.077705692513543e-11, 1.1327893594730693e-11, 6.8060100125474815e-15, 8.240474063989279e-21
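Incidentally, the posted kernel is exactly separable: it is the outer product of its middle row with itself. A quick Python check of that, and of the normalization step (my own verification code, not part of the PSF generator):

```python
# The middle row of the 7x7 pre-blur kernel posted above (center value 1,
# whole kernel normalized to a sum of 1.0 afterwards).
row = [9.077705692513543e-11, 0.00007497500186815324, 0.12478806846616428, 1.0,
       0.12478806846616428, 0.00007497500186815324, 9.077705692513543e-11]

# Hypothesis: kernel[i][j] == row[i] * row[j]
kernel = [[row[i] * row[j] for j in range(7)] for i in range(7)]

# Spot-check against a few of the posted entries
assert abs(kernel[0][0] - 8.240474063989279e-21) / 8.240474063989279e-21 < 1e-6
assert abs(kernel[2][2] - 0.015572062031516103) / 0.015572062031516103 < 1e-6
assert abs(kernel[1][2] - 0.0000093559856663739) / 0.0000093559856663739 < 1e-6

# Normalize to a sum of 1.0, as described
total = sum(sum(r) for r in kernel)
normalized = [[v / total for v in r] for r in kernel]
assert abs(sum(sum(r) for r in normalized) - 1.0) < 1e-12
```

Note that row[1] differs from row[2]**4, so this is not a plain Gaussian profile, consistent with the "adjusted shape" described above.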

The no-blur CGI worst case shows a significant overshoot (53928.6562 / 51199, or 5.33%) in the highlights after upsampling by 128x, but the slightly blurred one (IMHO blurred less than actual cameras/lenses will produce without additional sharpening) is very well behaved (51318.7344 / 51199, or 0.23% overshoot). Anything below 1% over/undershoot can probably be considered perceptually insignificant under normal viewing conditions. It's only the shadow tones undershoot that could be improved a bit, e.g. by raising the gamma from 2.0 to, say, 2.5 (because 3 had you worried about color separation).

So, the worst case scenario would suggest that a significant reduction of highlight halos is needed, while in practice (for real images) it's just fine (with this particular blend), even for the best production cameras/lenses.

I'll do some simulations without blends and different deblurs on both targets later.

Cheers,
Bart
« Last Edit: July 29, 2014, 07:24:47 am by BartvanderWolf »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #165 on: July 29, 2014, 07:10:15 am »

Suggestions:
A) Are we really sure that gamma 3 is bad when enlarging? One thing I like about gamma 3 (besides the connection to L*a*b* and perceptual models) is that the C cbrt function can recover easily from negative values without clamping. <- rather pedestrian
B) Instead of using the auto-level luminance as is as weighting function, push it through gamma after conversion to greyscale! We want the weight of linear to grow more slowly than it already does, so something like -gamma .5 (corresponds to Pow 2, or maybe .333333333333333) should work?
P.S. Off the top of my head warning.
« Last Edit: July 29, 2014, 07:19:50 am by NicolasRobidoux »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #166 on: July 29, 2014, 07:18:42 am »

Bart:
Well done and points taken.

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #167 on: July 29, 2014, 07:23:43 am »

Suggestions:
A) Are we really sure that gamma 3 is bad when enlarging? One thing I like about gamma 3 (besides the connection to L*a*b* and perceptual models) is that the C cbrt function can recover easily from negative values without clamping. <- rather pedestrian

I personally prefer a slightly higher gamma than the current 2.0 for upsampling. No real reason other than that I like simple numbers, and that at this point in the testing a gamma 2.5 / 0.4 roundtrip seems to produce better luminance results, even on e.g. the Magick logo and similarly rasterized vector image content.

Quote
B) Instead of using the auto-level luminance as is as weighting function, push it through gamma after conversion to greyscale! We want the weight of linear to grow more slowly than it already does, so something like -gamma .5 (corresponds to Pow 2, or maybe .333333333333333) should work?

Sure, the code is currently easy enough to adapt. So a -gamma 0.5 (or whatever) on the luminance based blending function would be worth a try (with or without -auto-level).

Cheers,
Bart
« Last Edit: July 29, 2014, 07:27:50 am by BartvanderWolf »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #168 on: July 29, 2014, 07:29:10 am »

TO DO:
At some point, check if -evaluate Pow 2 gives similar results to -gamma .5.
Most likely, one is done through a LUT (how dense? I have not checked recently) and the other through actual function evaluation.
-evaluate Pow is probably more accurate at high/low gamma. And possibly not much slower (the memory/computation trade-off does not favor LUTs as much as it used to).
Warning: I don't know if -evaluate carefully matches ranges. Maybe -evaluate Pow requires more plumbing.
In HDRI...
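For reference: on normalized [0, 1] values, -gamma g raises to the power 1/g, while -evaluate Pow p raises to the power p, so -gamma .5 and -evaluate Pow 2 should agree exactly, up to quantization. A quick check of that identity (pure function evaluation, no LUT; the quantization line is just illustrative):

```python
# IM's -gamma g maps a normalized value v to v**(1/g);
# -evaluate Pow p maps v to v**p.  So -gamma .5 should equal -evaluate Pow 2.
def gamma_op(v, g):
    return v ** (1.0 / g)

def pow_op(v, p):
    return v ** p

for i in range(101):
    v = i / 100.0
    assert abs(gamma_op(v, 0.5) - pow_op(v, 2.0)) < 1e-15

# Where they can differ in practice is quantization: a 16-bit LUT
# implementation first rounds v to the nearest of 65536 levels.
q = round(0.123456 * 65535) / 65535
print(abs(gamma_op(q, 0.5) - gamma_op(0.123456, 0.5)))  # LUT-induced error
```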

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #169 on: July 29, 2014, 08:07:13 am »

How about we go back to gamma 3 instead of gamma 2 for upsampling and use that until problems occur?

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #170 on: July 29, 2014, 08:59:31 am »

How about we go back to gamma 3 instead of gamma 2 for upsampling and use that until problems occur?

Fine with me. I'll change the (Windowese with batch variable 'Msize') code line
( -clone 0 -gamma 2 -filter LanczosRadius -distort Resize %Msize% -gamma 0.5 ) ^
into
( -clone 0 -gamma 3 -filter LanczosRadius -distort Resize %Msize% -gamma 0.3333333333333333 ) ^

In Unixese that would probably become (with the shell variable referenced as "$Msize"):
\( -clone 0 -gamma 3 -filter LanczosRadius -distort Resize "$Msize" -gamma 0.3333333333333333 \) \

If nothing else transpires before the end of the day (UTC+2 here), I'll upload a new full script version later.

Cheers,
Bart

P.S. I've attached the resulting plots of a sharp edge step, and an ever so slightly blurred one, gamma 3, upsampled 128x. At least the halos are much more balanced. The highlight overshoot halo has increased a bit (51341.3438 / 51199, or 0.28%, versus 0.23% on the blurred target version, but still well below significant), but the main improvement is the shadow halo reduction. There is a bit more low-amplitude ringing.
« Last Edit: July 29, 2014, 09:48:22 am by BartvanderWolf »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #171 on: July 29, 2014, 09:02:07 am »

BTW, these plots really make it clear that this can be a very effective halo suppression approach.
The other thing is that halo suppression has generally been performed with "anti-ring" limiters. They can't give results as smooth as these, at least not as cheaply.
There are two goals: suppress halos, but don't introduce aliasing.
...
The no-blur CGI worst case shows a significant overshoot (53928.6562 / 51199, or 5.33%) in the highlights after upsampling by 128x, but the slightly blurred one (IMHO blurred less than actual cameras/lenses will produce without additional sharpening) is very well behaved (51318.7344 / 51199, or 0.23% overshoot). Anything below 1% over/undershoot can probably be considered perceptually insignificant under normal viewing conditions. It's only the shadow tones undershoot that could be improved a bit, e.g. by raising the gamma from 2.0 to, say, 2.5 (because 3 had you worried about color separation).

So, the worst case scenario would suggest that a significant reduction of highlight halos is needed, while in practice (for real images) it's just fine (with this particular blend), even for the best production cameras/lenses.

I'll do some simulations without blends and different deblurs on both targets later.

Cheers,
Bart
« Last Edit: July 29, 2014, 09:04:27 am by NicolasRobidoux »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #172 on: July 29, 2014, 09:05:40 am »

Bart:
Don't worry about conversion back to unixese. Easy enough.

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #173 on: July 29, 2014, 10:07:54 am »

Bart:
The right scale to measure overshoot and undershoot over is the difference between the value on the "high side" and the "low side" (instead of the value on the "high side", or the "low side", separately).
In other words, compare the overshoot to the size of the "jump".

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #174 on: July 29, 2014, 10:36:56 am »

Bart:
The right scale to measure overshoot and undershoot over is the difference between the value on the "high side" and the "low side" (instead of the value on the "high side", or the "low side", separately).
In other words, compare the overshoot to the size of the "jump".

Nicolas, is there such a thing as a right scale?

In my non-academic experience, a reduction in the shadows plus an increase in the highlights might (partially) cancel or amplify each other, which to me could hide the real effect on a perceptual level (the Human Visual System is more sensitive to 'detail' in the bright tonal ranges, hence to the highlight ripples at this scale of magnification). The total amplitude also changes with local contrast, so experiments need to be calibrated to the same contrast range, which can be tricky if local shadow and highlight contrast differ. That's why I prefer to differentiate between the two tone levels.

I do understand that a total amplitude approach makes more sense at a denser/higher spatial resolution (smaller magnification scale), because the level of detail is closer to the peak of the human contrast sensitivity function (which peaks around 6-8 cycles/degree for luminance, and is more like a 'plateau' from 1 to 6 cycles/degree for chroma). So the 'right scale' varies with viewing distance and magnification factor.

Maybe doing both would satisfy all?

Cheers,
Bart

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #175 on: July 29, 2014, 11:12:39 am »

Bart:
Indeed, I'm asking for a presentation of information that makes most sense in a linear context, when we are dealing with a nonlinear context.
Nonetheless, characterizing an overshoot of 5000 (in 16-bit) when the "flat values" are 15000 and 50000 as 100*5000/(50000-15000) = 14.3% is informative to me, even though it says nothing relative to JND. So if you could call this "relative measure of the overshoot" or something like this, and report it, I'd be thankful. But this is certainly not necessary. My apologies for putting "right" into this.
P.S. Maybe we misunderstood each other. Preferably, I'd like an overshoot to 55000 when the "flat values" are 15000 and 50000 to be reported as 14.3%, and an undershoot to 10000 with the same "flat values" to be reported likewise. But this is a clumsily stated wish, not a "right".
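The two reporting conventions, applied to the numbers already in this thread, just to make the definitions concrete (helper names are mine):

```python
def overshoot_vs_level(peak, high):
    # Bart's convention: overshoot relative to the flat value it exceeds
    return 100.0 * (peak - high) / high

def overshoot_vs_jump(peak, high, low):
    # Nicolas's convention: overshoot relative to the size of the edge "jump"
    return 100.0 * (peak - high) / (high - low)

# Nicolas's example: flats at 15000 and 50000, overshoot to 55000 (16-bit)
print(round(overshoot_vs_jump(55000, 50000, 15000), 1))  # -> 14.3

# Bart's no-blur CGI worst case: peak 53928.6562 over a flat of 51199
print(round(overshoot_vs_level(53928.6562, 51199), 2))   # -> 5.33
```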
« Last Edit: July 29, 2014, 11:21:00 am by NicolasRobidoux »

Bart_van_der_Wolf

Re: A free high quality resampling tool for ImageMagick users
« Reply #176 on: July 29, 2014, 12:29:57 pm »

I would like to suggest a change to your script that sort of addresses the same issues:
Right now, you have variable sharpening for downsampling. Could we have variable sharpening, though -define filter:blur=value, for the "generic" (for upsampling, mostly) luminance weighted gamma blended resampling (as opposed to deconvolution) method?

Let me see if I got the gist of that, all referring to the EWA -distort Resize:
LanczosRadius (the current generic/upsampling filter) is the same as Lanczos and LanczosSharpest 3; they are all 3-lobe filters (Jinc-windowed Jinc types?) that differ only in the amount of (de)blur?

What is LanczosSharp? Does it materially differ from the above scheme?

Another thing, and I know I should go look up the details, but maybe you can get me ahead faster with a direct response: in what way are the blur defines special for the above-named filters? Are they just optimized for different levels of sharpening, or is there a specific relationship between the blur radius and the lobes (e.g. aligned with orthogonal or diagonal pixels)?

The reason I'm asking is that I'm thinking of a simple/flexible way to set this up within the limitations of a basic script file. Something like a fine-grained lookup, in pseudo-code, with simple-to-add intermediate levels:
If 'SharpAmount' is less than or equal to 0, then set a 'BlurVal' variable to '1' and go to 'convert'
If 'SharpAmount' is less than or equal to 64, then set a 'BlurVal' variable to '0.9264075766146068' and go to 'convert'
If 'SharpAmount' is less than or equal to 100, then set a 'BlurVal' variable to '0.88549061701764' and go to 'convert'

With some levels in between for finer control.

All converts would then use '-define filter:blur=%BlurVal% -filter Lanczos -distort Resize'

Cheers,
Bart

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #177 on: July 29, 2014, 12:49:55 pm »

Bart:
EWA LanczosSharp 3 can be obtained with
... -filter LanczosSharp -distort Resize ...
or with
... -filter Lanczos -define filter:blur=0.9812505644269356 -distort Resize ...
(Note that there is another LanczosSharp with a slightly different deblur floating around. It's not the ImageMagick one.)
EWA LanczosRadius 3 can be obtained with
... -filter LanczosRadius -distort Resize ...
or with
... -filter Lanczos -define filter:blur=0.9264075766146068 -distort Resize ...
EWA LanczosSharpest 3 can be obtained with
... -filter Lanczos -define filter:blur=0.88549061701764 -distort Resize ...
All of them are Jinc-windowed Jinc 3 lobe, different from each other only in the chosen deblur (the value, smaller than 1, set with -define filter:blur=...). Nothing more, nothing less.
Just like you can "sweep" through the Keys splines by setting the b parameter in the range from 1 down to the smallest usable value (slightly below 0), the sweep going from very blurry to very sharp, you can sweep through the EWA Jinc-windowed Jinc 3-lobe filters by going from 1 down to 0.88549061701764 in the blur value.
What I was suggesting is that you use a percent sharpening to set the deblur, as follows:
Let SharpAmount be the parameter, from 0 (little) to 100 (a lot) that controls the sharpening amount. Then,
DEBLUR = 0.88549061701764 + (1 - .01*SharpAmount)*(1-0.88549061701764)
with default SharpAmount set, when using the generic (upsampling) method, to
100*(1-(0.9264075766146068-0.88549061701764)/(1-0.88549061701764))
Passing DEBLUR to -define filter:blur=DEBLUR then makes sharpening "tighten" the radius of the Jinc-windowed Jinc 3-lobe, and the default value reproduces EWA LanczosRadius.
No need for cases.
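Nicolas's mapping is easy to sanity-check numerically. A small Python sketch (constant and variable names are mine):

```python
LANCZOS = 1.0                 # -define filter:blur=1, plain EWA Lanczos 3
SHARPEST = 0.88549061701764   # EWA LanczosSharpest 3
RADIUS = 0.9264075766146068   # EWA LanczosRadius 3

def deblur(sharp_amount):
    # Linear sweep: 0 -> plain Lanczos, 100 -> LanczosSharpest
    return SHARPEST + (1.0 - 0.01 * sharp_amount) * (1.0 - SHARPEST)

# The default SharpAmount that reproduces LanczosRadius
default = 100.0 * (1.0 - (RADIUS - SHARPEST) / (1.0 - SHARPEST))

assert abs(deblur(0) - LANCZOS) < 1e-15
assert abs(deblur(100) - SHARPEST) < 1e-15
assert abs(deblur(default) - RADIUS) < 1e-12
print(default)  # the default sharpening level, somewhere in the mid-60s
```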
« Last Edit: July 29, 2014, 01:17:32 pm by NicolasRobidoux »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #178 on: July 29, 2014, 01:03:15 pm »

To give an idea of how these values are arrived at. (There are other characterizations in terms of operator and matrix norms which I can't track down just now.)
http://www.imagemagick.org/discourse-server/viewtopic.php?f=22&t=19636&p=89068&hilit=Axiom+code#p89068
http://www.imagemagick.org/discourse-server/viewtopic.php?f=22&t=19636&start=30#p78347
EWA LanczosRadius 3 does not come from optimization. It is a pragmatic choice, simply obtained by setting the deblur to the ratio of 3 to the location of the third root of the Jinc function. This fixes things so that the radius of the EWA disc used when enlarging is exactly 3, which means it is the largest deblur such that EWA Lanczos 3's "zone of influence" is fully contained inside tensor (orthogonal) Lanczos 3's "zone of influence" (which is the image of the result of "infinite ratio" upsampling of an impulse with the filter, that is, the continuous support of the impulse response).
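For the record, that ratio can be reproduced from scratch: the Jinc zeros are the zeros of the Bessel function J1 divided by pi, so the LanczosRadius deblur is 3 divided by the third such zero. A self-contained numerical check (my own numerics, using the integral representation of J1):

```python
import math

def bessel_j1(x):
    # Integral representation: J1(x) = (1/pi) * integral_0^pi cos(t - x*sin(t)) dt,
    # approximated with Simpson's rule.
    n = 2000  # even number of subintervals
    h = math.pi / n
    s = math.cos(0.0) + math.cos(math.pi - x * math.sin(math.pi))
    for k in range(1, n):
        t = k * h
        s += (4 if k % 2 else 2) * math.cos(t - x * math.sin(t))
    return s * h / (3 * math.pi)

# Bisect for the third positive zero of J1, known to lie between 10 and 10.5
lo, hi = 10.0, 10.5
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if bessel_j1(lo) * bessel_j1(mid) <= 0:
        hi = mid
    else:
        lo = mid

third_jinc_zero = 0.5 * (lo + hi) / math.pi   # third root of Jinc, ~3.2383154841662362
deblur = 3.0 / third_jinc_zero                # the LanczosRadius blur value
print(deblur)  # ~0.9264075766146068
```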
« Last Edit: July 29, 2014, 01:16:25 pm by NicolasRobidoux »

NicolasRobidoux

Re: A free high quality resampling tool for ImageMagick users
« Reply #179 on: July 29, 2014, 01:48:50 pm »

Basically, EWA LanczosSharp is obtained with the same optimization principle that gives EWA Robidoux (the two variants of EWA LanczosSharp have to do with two variants of the optimization principle that give the same result in the case of Keys cubic splines), EWA LanczosSharpest is obtained with the same optimization principle that gives EWA RobidouxSharp, and EWA LanczosRadius is obtained through a reasonable but nonetheless ad hoc specification that the radius of the EWA disc should match its number of lobes.
EWA LanczosSharp basically minimizes the no-op modification of an image that has constant columns (or rows). Same with Robidoux.
EWA LanczosSharpest basically minimizes the no-op modification of an arbitrary image. Same with RobidouxSharp.
Optimization is done in a minimax sense (minimize the worst case). It's actually "the other" EWA LanczosSharp that is the minimizer in this sense. The ImageMagick EWA LanczosSharp satisfies a slightly different criterion: a vertical (or horizontal) line (with constant pixel value) does not modify its immediate neighbour under no-op. (It modifies the neighbour of its neighbour, but very little.) Robidoux satisfies both criteria.
« Last Edit: July 29, 2014, 02:06:23 pm by NicolasRobidoux »