Pages: 1 ... 18 19 [20] 21 22 ... 24   Go Down

Author Topic: A free high quality resampling tool for ImageMagick users  (Read 252551 times)

alain

  • Sr. Member
  • Offline
  • Posts: 465
Re: A free high quality resampling tool for ImageMagick users
« Reply #380 on: October 06, 2014, 06:41:16 pm »

Keeping things soft is a good way to stay out of trouble :|


This reminded me that I have to recheck my capture sharpening after the upgrade to Capture One 8.
I checked it quite a while ago with Bart's tool.

But if I understand it correctly, it's better to keep it a tad soft and sharpen a bit more later in the pipeline.


NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #381 on: October 07, 2014, 12:42:23 am »

...
But if I understand it correctly, it's better to keep it a tad soft and sharpen a bit more later in the pipeline.
This is my general recommendation. But in any case it's how things fit together from end to end that matters.
« Last Edit: October 07, 2014, 01:15:27 am by NicolasRobidoux »

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #382 on: October 07, 2014, 04:04:26 am »

This reminded me that I have to recheck my capture sharpening after the upgrade to Capture One 8.
I checked it quite a while ago with Bart's tool.

But if I understand it correctly, it's better to keep it a tad soft and sharpen a bit more later in the pipeline.

Hi Alain,

If upsampling is going to happen, then we do not want to upsample artifacts, so indeed use Capture sharpening with restraint (either exactly right, as determined with my tool for your lens at various apertures with your Raw converter, or a tad under). Current Raw converters are pretty poor at assisting the user with that aspect of the workflow.

When we do the Capture sharpening exactly right, there is no real downside to it, because it will not create halos; it just restores resolution lost to capture blur. The only possible downside is that the resampling algorithms may not cope very well with sharp (but still slightly blurry) real image content. So when the tools are not good, it may be necessary to under-sharpen a bit. The goal of this thread is to develop better methods, which also cope well with good quality input.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #383 on: October 07, 2014, 04:43:12 am »

Btw. When doing things inside prophoto RGB, I assumed that I would use the correct gamma, not the sRGB one.

Alain, this is a bit complex, and to a certain extent ImageMagick specific.

In general, we get better down-sampling quality (accurate color and luminance, as in the original scene) when we use weighted averaging in a (near) linear gamma space to blend the various RGB pixel values of a larger sized region into one destination pixel at the smaller size. The downside is the risk of the dark-side halo being more prominent than the light-side halo once we return to the destination gamma space.
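To make the linear-light argument concrete, here is a small numeric sketch (a hypothetical illustration only, not code from Bart's tool; the exponent 2.2 is just an example gamma) that blends a black and a white pixel with and without linearization:

```python
def to_linear(v, gamma):
    """Decode a gamma-encoded value in [0, 1] to linear light."""
    return v ** gamma

def to_gamma(v, gamma):
    """Encode a linear-light value in [0, 1] back to gamma space."""
    return v ** (1.0 / gamma)

def average_in_gamma(a, b):
    # Naive blend: average the encoded values directly.
    return (a + b) / 2.0

def average_in_linear(a, b, gamma=2.2):
    # Blend the way the script does: decode, average in linear light, re-encode.
    lin = (to_linear(a, gamma) + to_linear(b, gamma)) / 2.0
    return to_gamma(lin, gamma)

# Blend a black pixel (0.0) with a white pixel (1.0):
print(average_in_gamma(0.0, 1.0))   # → 0.5 (displays too dark)
print(average_in_linear(0.0, 1.0))  # → ≈0.73, the photometrically correct blend
```

The naive gamma-space blend lands at 0.5, which displays darker than the true photometric average; this is the same effect, at pixel scale, that shifts colors and luminance during down-sampling.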

ProPhoto RGB has, besides different color primaries and white point, a native gamma of 1/1.8. ImageMagick has built-in conversion support between the sRGB gamma space and linear gamma space (probably because much of the image content in the scientific community is in one of those two spaces). AFAIK it does not change the assumed color primaries.

So a proper conversion would require a full-blown color-management system (I believe IM uses at least parts of LCMS) to convert between color spaces including rendering intents (various flavors available) and chromatic adaptation (e.g. Bradford).

However, we only temporarily need the gamma linearization for color/luminance blending. So to do a more accurate job we would need a linear-gamma version of every possible source profile we might encounter. Instead, we currently cut some corners by using the built-in IM functions. That means we convert a 1/1.8 gamma to an approx. 2.2/1.8 = 1.2222 gamma space instead of linear gamma 1.0. The question is: how bad is that? Well, it's still better than blending in a 1/1.8 or a 1/2.2 gamma space (and it even suppresses some of the dark-side halo), because 1.2222 is closer to 1.0, but it could be better still (in the sense of being calibrated, and thus more predictable and accurate for color blends).
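The corner-cutting described above is easy to check numerically (a sketch; the exponent 2.2 here is a stand-in for ImageMagick's built-in sRGB linearization, which is actually a piecewise curve):

```python
# A ProPhoto-style encoded value is v = L ** (1 / 1.8).
# Applying an approximate sRGB linearization (raise to ~2.2) gives
# v ** 2.2 == L ** (2.2 / 1.8) == L ** 1.2222..., i.e. blending then
# happens in a ~1.22 exponent space rather than true linear (1.0).
L = 0.18                           # an arbitrary linear-light test value
encoded = L ** (1 / 1.8)           # ProPhoto-style gamma encoding
after_linearize = encoded ** 2.2   # approximate sRGB linearization
assert abs(after_linearize - L ** (2.2 / 1.8)) < 1e-9

# 1.2222 is much closer to the ideal exponent 1.0 than the raw encoding
# is, which is why the shortcut still improves blending:
print(abs(2.2 / 1.8 - 1.0))        # ≈ 0.22
print(abs(1 / 1.8 - 1.0))          # ≈ 0.44
```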

So while it's not ideal, it's not all that bad either. And we can circumvent some of the issues by converting to the destination profile before resampling. Not ideal, but not the end of the world either; after all, we are already creating totally new pixels in the process ...

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #384 on: October 07, 2014, 06:08:29 am »

Alain:
Going through the sRGB to RGB conversion (and back) is unnecessarily inaccurate when you are dealing with a gamma space at both the input and output of the script.
For example, if what you are feeding to the script is ProPhoto RGB (which is very well approximated by gamma 1.8) and what you are getting out of it is also ProPhoto RGB, you should be getting better results, without using a fully color-managed toolchain, by converting to linear RGB with -evaluate Pow 1.8 and converting back out of linear RGB with -gamma 1.8. AFAIK ImageMagick will leave the primaries alone provided you don't use profiles (just copying the profiles from the input file to the output file does not do anything to pixel values).
The above works with any color space that is close to a gamma space: if what you feed the script and what you get out of it share the same gamma and the same primaries, just use -evaluate Pow and -gamma. (sRGB is close to being a gamma space but actually not that close. It's worthwhile to go through the standard in that case.)
I'll try to post a version of V1.2.2 of the script that does exactly that, hopefully later today. It will exploit the fact that pow(pow(a,b),c) = pow(a,b*c) to squeeze out more accuracy.
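The exponent identity mentioned here can be checked directly; the sketch below only illustrates the arithmetic, not the script itself, and the 16-bit rounding step is a hypothetical stand-in for an intermediate image buffer:

```python
a = 0.42   # any channel value in [0, 1]

# pow(pow(a, b), c) == pow(a, b * c): folding two gamma operations
# into a single power skips the intermediate result entirely.
assert abs((a ** 1.8) ** 2.0 - a ** (1.8 * 2.0)) < 1e-12

# Why folding helps in practice: if the intermediate is stored in a
# 16-bit buffer, it gets quantized, and the two-step path drifts.
quantized = round((a ** 1.8) * 65535) / 65535
two_step = quantized ** 2.0
one_step = a ** (1.8 * 2.0)
print(abs(two_step - one_step))   # small quantization error, avoided by folding
```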
« Last Edit: October 07, 2014, 06:10:18 am by NicolasRobidoux »

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #385 on: October 07, 2014, 07:06:00 am »

P.S. Could someone post what recent Lightroom does?

Attached: Lightroom 5.6 conversions at 100% JPEG quality, with no sharpening and with screen sharpening (Low, Standard, High).

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #386 on: October 07, 2014, 08:52:15 am »

Thank you Bart.
Do you happen to know the cause of the tone mismatch in the bricks of the house and windmill? I remember a lengthy discussion of exactly this issue (but not where the discussion occurred; dpreview?).
I hope that these most glaring differences are easily trackable (and are not caused by something like mismatched use of sRGB v4 profiles with black point compensation or something arcane like that).
P.S. I meant between LR and the "secret" prototype (which I assume shows more or less the same as your script).
P.S. What a mess: http://ninedegreesbelow.com/photography/srgb-profile-comparison.html.
« Last Edit: October 14, 2014, 05:47:51 am by NicolasRobidoux »

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #387 on: October 07, 2014, 11:00:52 am »

Thank you Bart.
Do you happen to know the cause of the tone mismatch in the bricks of the house and windmill?

They are a different type and color of brick. The windmill has traditional-style (more reddish/yellow colored) bricks, because it is a monument; it functions as a museum and flour mill. For the houses, a darker colored and smaller sized brick was chosen.

So the colors are rendered accurately, given the lighting conditions at that moment of the day.

Attached are 2 crops from the down-sampled windmill sample at 800% magnification; the differences are subtle but illustrative. Left is the Lightroom version with Standard screen sharpening; right is the regular (Version 1.2.2) LWGB script version with deconvolution sharpening at 100.

Look at the branches against the sky: they are marginally less blocky and a bit more 'organic' in the LWGB version. The sky color that shines through the support beams is preserved a bit better (thanks to linear light resampling). The beams themselves are also slightly less blocky and better anti-aliased against the sky (possibly due to EWA resampling). The mishmash of tree branches at the bottom is slightly more visible in the LWGB version, and the arc above the door is slightly less blocky.

In other areas, such as the lighter leaves and branches above the boat, there is also a bit more differentiation in the LWGB version.

The differences are really subtle; they will change with adjustments in sharpening, and it requires serious pixel peeping to notice what is different.

Cheers,
Bart
« Last Edit: October 07, 2014, 02:24:42 pm by BartvanderWolf »
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #388 on: October 14, 2014, 07:51:54 am »

Given that I am going for punch, and in light of subtle side effects regarding tone preservation in high-frequency areas, I've decided to remove from my current "top downsampling prototype" the code that explicitly tries to mitigate haloing. This makes the "secret prototype" something like this. As the chart makes quite obvious, I'm not pulling any punches in the sharpness department (although I limit myself to adding a single halo, and I make sure that the added halo is one pixel wide).
Warning: Some subtle color toolchain issues make the tones different from what was shown by Bart as coming out of Lightroom. I have decided not to address this, since it would require, among other things, figuring out what Lightroom does, what my doctored ImageMagick does, and what my viewer (nip2) does, and nailing the cause(s) of the mismatch. When I compare with the result of resampling with ImageMagick's correctly implemented bilinear (a.k.a. triangle) through linear RGB, I can tell that my "secret scheme" does not mess tones up. For my purposes, this is good enough.
P.S. It appears that Bart, LR and I are hunting for very roughly the same ballpark look. We're talking apples and oranges, not pumpkins and raisins.
P.S. Using the same method for enlarging is absolutely terrible :( One of the worst image enlargement schemes I've ever put together.
« Last Edit: October 14, 2014, 02:01:48 pm by NicolasRobidoux »

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #389 on: October 15, 2014, 04:57:02 am »

As it is, I am not only looking for good downsampling. I am also looking for good "near identity transformations", the simplest such transformation being a rotation by a small angle.
The first attachment is the result of rotating the "chart" about its center using the default general-purpose ImageMagick method, performing the filtering in linear light:
convert input.png -evaluate Pow 2 +distort SRT 1 -gamma 2 IMdefault.png
The second attachment is obtained using the "secret" scheme inspired, in part, by Bart van der Wolf and Jim Kasson's comments.
P.S. The difference is immediately obvious with the star target suggested by Bart van der Wolf in http://www.luminous-landscape.com/forum/index.php?topic=91754.msg767344#msg767344. Since I did not know the color space, I simply used
convert input.png -distort SRT 1 IMdefault.png
P.S.2 Comparing the "secret" scheme with the default ImageMagick "distort" scheme is cheating, since the default ImageMagick scheme was purposely chosen to be somewhat "soft" (Elliptical Weighted Averaging with the Robidoux filter, a Keys cubic roughly equivalent to Mitchell-Netravali). This being said, I think you'd be hard pressed to find a general-purpose scheme that keeps things this sharp and artifact-free. (Hubris?)
« Last Edit: October 15, 2014, 10:21:26 am by NicolasRobidoux »

Jack Hogan

  • Sr. Member
  • Offline
  • Posts: 798
    • Hikes -more than strolls- with my dog
Re: A free high quality resampling tool for ImageMagick users
« Reply #390 on: October 15, 2014, 10:39:09 am »

Looks good.  Are we allowed to play with the secret prototype sauce recipe?

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #391 on: October 15, 2014, 10:40:08 am »

Jack:
Unfortunately not: some things are amenable to being done fully transparently; others are not.
The "secret" ingredient will remain a secret.
« Last Edit: October 15, 2014, 10:46:48 am by NicolasRobidoux »

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #392 on: October 15, 2014, 10:58:18 am »

As it is, I am not only looking for good downsampling. I am also looking for good "near identity transformations", the simplest such transformation being a rotation by a small angle.

Hi Nicolas,

Just a gentle word of caution: these maximum-amplitude charts hide the degree of clipping. So while the image looks better for line art and other content that is allowed to clip to black and white, the results do not necessarily look as good on lower-amplitude versions or other continuous-tone image content.

I do appreciate that rotation and other types of (non-linear) scaling may benefit from a different type of filtering, more focused on preservation of sharpness. Since down-sampling and aliasing are commonly found together, aliasing may be mitigated by upsampling first, before the rotation or before e.g. a keystone correction (a poor man's alternative to oversampling), optionally followed by an optimal down-sampling.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #393 on: October 15, 2014, 11:17:34 am »

...
Just a gentle word of caution. These maximum amplitude charts hide the degree of clipping. So while the image looks better for line art and things that are allowed to clip to black and white, they do not necessarily look as good on lower amplitude versions or other continuous tone image content.
When I read your word of caution, I was just about done looking at the results of rotating binary images in which 0 is mapped to, say, 255/3, and 255 is mapped to 2*255/3. That is, dark grey on light grey instead of black on white. For example, the "grey" star looks just fine.
I take note of your other piece of advice.
Thank you. :)
« Last Edit: October 15, 2014, 11:44:03 am by NicolasRobidoux »

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: A free high quality resampling tool for ImageMagick users
« Reply #394 on: October 15, 2014, 01:40:22 pm »

Another quick set, based on DeltaE_8bit_gamma2.2.tif from http://www.brucelindbloom.com. First, the "secret" scheme; then the ImageMagick default (through linear light); and then a scheme that can be achieved with ImageMagick and comes reasonably close to the secret scheme (a nudge too sharp, I would guess). Note that the images are gamma 2.2.
convert DeltaE_8bit_gamma2.2.tif -evaluate Pow 2.2 -define filter:c=0.48146869222618521 -distort Resize 15\% -gamma 2.2 IMKeysC.48146869222618521.png
You may want to compare with http://blog.kasson.com/?p=7244
« Last Edit: October 16, 2014, 03:32:07 pm by NicolasRobidoux »

JRSmit

  • Sr. Member
  • Offline
  • Posts: 922
    • Jan R. Smit Fine Art Printing Specialist
Re: A free high quality resampling tool for ImageMagick users
« Reply #395 on: October 18, 2014, 02:42:29 am »

As I mentioned earlier, I would report my findings with uprezzing and image development, all primarily for better large-size prints.
Well, that has become an exercise in frustration.
First the good part: the uprezzing with script 1.2.2 works fine. The sharpening choices both work fine; the only thing missing is direct feedback like in PS or LR.
However, working out the best settings for an image takes quite some time.
As a general rule I would say to start from an image with sharpening reduced, or in LR literally set to 0. You then have more room for sharpening after uprezzing without risk of halos.
Then the bad part:
I also tried to use Topaz Detail and Focus Magic. Focus Magic has a standalone option, but then it cannot handle the large files (its 32-bit memory limit?); as a 64-bit plugin it has limited functionality, so it is not of any use for me.
Topaz Detail is only a plugin in PS. Fine so far, but I first need to load the image in PS and then in TD, thus holding the image in memory twice, and 16GB is then not really enough. But if you stay within the memory limit, TD provides a lot of options for sharpening, although with a steep learning curve.
Where I am now is that I cannot get an advantage over LR with respect to sharpening after uprezzing with FM or with TD, unless I spend a lot of my time tinkering around. And then the question still remains: does it show in the print?
So I will continue looking for ways to improve images for large-size prints, but at a slower pace.
The uprezzing works fine, that is for sure. And the sharpening in the script is quite adequate to start with.
Fine art photography: janrsmit.com
Fine Art Printing Specialist: www.fineartprintingspecialist.nl


Jan R. Smit

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: A free high quality resampling tool for ImageMagick users
« Reply #396 on: October 18, 2014, 03:36:20 am »

As I mentioned earlier, I would report my findings with uprezzing and image development, all primarily for better large-size prints.
Well, that has become an exercise in frustration.

Hi Jan,

Thanks for the feedback, with a focus on the upsampling side of things. I can understand the frustration a bit, although it's also part of any learning curve: gaining experience through trial and error.

Quote
First the good part: the uprezzing with script 1.2.2 works fine. The sharpening choices both work fine; the only thing missing is direct feedback like in PS or LR.
However, working out the best settings for an image takes quite some time.
As a general rule I would say to start from an image with sharpening reduced, or in LR literally set to 0. You then have more room for sharpening after uprezzing without risk of halos.

Yes, cascading sharpening runs is usually not a good idea, unless extreme caution is used in the early stages. Also, upsampling will magnify any issues that were created earlier in the workflow, making them more visible.

Quote
Then the bad part:
I also tried to use Topaz Detail and Focus Magic. Focus Magic has a standalone option, but then it cannot handle the large files (its 32-bit memory limit?); as a 64-bit plugin it has limited functionality, so it is not of any use for me.

But its deconvolution quality is praised throughout the photographic community. It also works great on upsampled images to remove some of the upsampling blur.

Quote
Topaz Detail is only a plugin in PS. Fine so far, but I first need to load the image in PS and then in TD, thus holding the image in memory twice, and 16GB is then not really enough. But if you stay within the memory limit, TD provides a lot of options for sharpening, although with a steep learning curve.

Topaz Detail can also be launched by itself, directly from Lightroom. You need to install the free stand-alone utility called Fusion Express. That allows you to right-click on an image and process it outside of LR with any of the Topaz Labs plugins.

They additionally have a plugin called "photoFXlab" that can also run as a standalone application, as a control hub for all the other plugins, and it offers additional functionality like masking and layer blending.
 
Quote
Where I am now is that I cannot get an advantage over LR with respect to sharpening after uprezzing with FM or with TD, unless I spend a lot of my time tinkering around. And then the question still remains: does it show in the print?
So I will continue looking for ways to improve images for large-size prints, but at a slower pace.
The uprezzing works fine, that is for sure. And the sharpening in the script is quite adequate to start with.

Again, thanks for taking the time to give feedback.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Jack Hogan

  • Sr. Member
  • Offline
  • Posts: 798
    • Hikes -more than strolls- with my dog
V1.22: Too Much Michelson Contrast Lost in the Transfer
« Reply #397 on: October 27, 2014, 04:26:17 pm »

Hello Bart, Nicolas et al,

I've been thinking about v1.2.2 because I really like its idea, although I do not like what it does to my pictures...yet :)  I don't think it's just perceptual, and I may have an idea as to where that loss of local contrast and saturation may be coming from when downsizing: it appears that v1.2.2 smears the incoming data too far beyond the final Nyquist frequency, thereby giving up too much information - before then applying (by my standards) aggressive sharpening to attempt to recover it.

The strategy may work with black-and-white test charts or similar images, because the viewer may not perceive that a fair amount of the final observed 'sharpness' has been re-built artificially.  But it seems to fall apart with natural images where, for instance, no sharpening is ever going to be able to figure out whether what arrived as a bland yellow from the downsizing process was actually supposed to be a saturated orange from a high local-contrast area in the original image.

Here is an example with two screen captures, one the original 11500px image fit to the screen by CS5 (bicubic, the benchmark), and the other the same image downsized to 16.37% by v1.2.2, D option and no sharpening, displayed at 100%.  The loss of local contrast and saturation is evident, as emphasized by the histograms of a highly saturated, high-frequency patch (the square marquee near the top of the saturated yellow/orange tree to the left).  The histogram of the original looks [similar to] that of the bicubic.


http://i.imgur.com/7CAb9aX.gif

It seems to me that the combined filtering and downsizing algorithm in v1.2.2 blurs the original data to the point where too much information gets averaged out, giving up more Michelson contrast than needed (as confirmed by comparing with the benchmark). Contrast is transferred accurately where it changes slowly. Where it changes quickly it is not: the deep shadows get averaged up (they get brighter) and saturation is lost. Here is a full-resolution image of the difference between the two screen captures, with a curve for emphasis:


http://i.imgur.com/UDgLibu.png

If my intuition is correct, it seems to me that we would need to use a smaller 'radius' for the pre-filter+algorithm in order to let more local (Michelson) contrast through. The sharpening would also probably be more effective, as it would have more to bite on.
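For reference, Michelson contrast is (Imax - Imin) / (Imax + Imin); the toy 1-D sketch below (hypothetical; a simple box blur merely stands in for v1.2.2's actual pre-filter) shows how a wider pre-filter eats local contrast that later sharpening cannot reliably restore:

```python
def michelson(samples):
    """Michelson contrast: (Imax - Imin) / (Imax + Imin)."""
    lo, hi = min(samples), max(samples)
    return (hi - lo) / (hi + lo)

def box_blur(samples, radius):
    """Simple 1-D box blur, clamped at the edges."""
    n = len(samples)
    out = []
    for i in range(n):
        window = samples[max(0, i - radius): min(n, i + radius + 1)]
        out.append(sum(window) / len(window))
    return out

# A high-frequency alternating pattern, like the saturated tree patch:
pattern = [0.1, 0.9] * 8

print(michelson(pattern))               # ≈ 0.8: full swing survives
print(michelson(box_blur(pattern, 1)))  # ≈ 0.27: the wider the pre-filter,
                                        # the more local contrast is averaged away
```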

What do you think?

Jack
« Last Edit: October 30, 2014, 05:45:56 am by Jack Hogan »

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8915
Re: V1.22: Too Much Michelson Contrast Lost in the Transfer
« Reply #398 on: October 27, 2014, 08:00:27 pm »

Here is an example with two screen captures, one the original 11500px image fit to the screen by CS5 (bicubic, the benchmark), and the other the same image downsized to 16.37% by v1.2.2, D option and no sharpening, displayed at 100%.

Hi Jack,

I'm not going to jump to conclusions (although I have some ideas about cause and effect, namely gamma-related effects and clipping), but I do want to first understand what we are really looking at.

To avoid a comparison with Photoshop screen resampling (which might be something like bilinear or worse), it would help if you could make a crop of the original image available at 100%, e.g. the general region around the tree where you sampled the histogram, something like 1500x1500 pixels in size. After down-sampling to 16.37% that would result in something like 250 pixels square, still large enough to be meaningful for (visual) inspection.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

NicolasRobidoux

  • Sr. Member
  • Offline
  • Posts: 280
Re: V1.22: Too Much Michelson Contrast Lost in the Transfer
« Reply #399 on: October 28, 2014, 04:07:18 am »

...
I've been thinking about v1.2.2 because I really like its idea, although I do not like what it does to my pictures...yet :)  I don't think it's just perceptual, and I may have an idea as to where that loss of local contrast and saturation may be coming from when downsizing: it appears that v1.2.2 smears the incoming data too far beyond the final Nyquist frequency, thereby giving up too much information - before then applying (by my standards) aggressive sharpening to attempt to recover it.
...
Here is an example with two screen captures, one the original 11500px image fit to the screen by CS5 (bicubic, the benchmark), and the other the same image downsized to 16.37% by v1.2.2, D option and no sharpening, displayed at 100%.
...
It is a defensible position that one should not apply very strong antialiasing and then sharpen back. Although it may be a good approach, it certainly does not come without cost. But nothing's for free.

Now, just to make sure that there is no confusion: You are aware that Bart's script, when used with the "D" option and sharpening set to 0, uses a method that is used at no other time (EWA with a Quadratic B-spline kernel)?
Among the options presented by the script, "D" with sharpening 50 or 100, or the generic scheme with sharpening 50 or 100, should be preferable, unless you really really do not want the downsampler to introduce any halo.
« Last Edit: November 03, 2014, 01:12:57 pm by NicolasRobidoux »