
Author Topic: Sharpening for the Web  (Read 4380 times)

rabanito

  • Sr. Member
  • Offline
  • Posts: 1577
Re: Sharpening for the Web
« Reply #40 on: March 17, 2019, 11:32:29 am »

In case somebody finds this useful:
I sent Bart a file (again my cat) at original size.
He returned it to me downsized to 800 px, together with a sharpened copy for comparison:

1. original2_800x.jpg is downsampled from the original file with a method that's very similar to the one used in Lightroom.
https://www.dropbox.com/s/xkjsok4a8gjncxd/original2_800x.jpg?dl=0
2. original2_800x-sharpen-030-010-010.jpg is that same file, with some sharpening added (I used Topaz Sharpen AI version 1.1.0, in Sharpen mode).
https://www.dropbox.com/s/67kco5s7k9oravc/original2_800x-sharpen-030-010-010.jpg?dl=0


I sharpened the original with the default values (Remove Blur 0.50, Suppress Noise 0.50, Add Grain 0) of Topaz Sharpen AI version 1.1.0, in Sharpen mode.
The result was (for my taste) oversharpened.
I repeated this using the values from Bart (Remove Blur 0.30, Suppress Noise 0.30, Add Grain 10) and obtained (of course) the same result as Bart: a pleasant sharpening.
Using PhotoKit with default values for Output Sharpening produces a similar (a tad sharper) result.

Bart_van_der_Wolf

  • Sr. Member
  • Offline
  • Posts: 8913
Re: Sharpening for the Web
« Reply #41 on: March 18, 2019, 06:17:56 am »

Rabanito's original full-size image is very sharp, but with shallow depth of field and some noise.

As was obvious to me from the start, the original reduced-size image suffered from the downsampling process, with ringing and aliasing/stair-stepping artifacts as the tell-tale signs. So some improvement in that initial step was possible. Using a better downsampling method produced a more robust image for further processing/sharpening.
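Bart's point about why naive downsampling aliases can be illustrated with a toy 1-D NumPy sketch (this is not his actual resampling method): keeping every Nth sample folds frequencies above the new Nyquist limit back into the small image as false detail, while even a crude box average before decimating acts as a low-pass filter that tames them.

```python
import numpy as np

def decimate_naive(x, factor):
    # Keep every Nth sample: cheap, but high frequencies alias.
    return x[::factor]

def downsample_box(x, factor):
    # Average each block of `factor` samples before decimating;
    # the averaging is a crude low-pass filter that suppresses aliasing.
    n = len(x) // factor * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

# A sine too fast to survive 4x decimation (above the new Nyquist limit).
t = np.arange(1024)
signal = np.sin(2 * np.pi * 200 * t / 1024)

naive = decimate_naive(signal, 4)   # aliases: near full-amplitude false detail
boxed = downsample_box(signal, 4)   # same frequency, strongly attenuated
print(np.std(naive), np.std(boxed))
```

Real resamplers (Lanczos and friends) use far better low-pass kernels than a box average; the point is only that some pre-filtering has to happen before decimation, or stair-stepping and false detail appear.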

The amount of subsequent sharpening remains a matter of personal preference. But the sharpening method should IMHO preferably be either deconvolution sharpening (FocusMagic is still a great tool for that), to restore some of the resampling blur, or, as is now starting to become available, detail replacement by AI.

Deconvolution and AI both allow minimizing the risk of amplifying the artifacts that were already there, or of introducing new ones.
It was an interesting exercise.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Alan Klein

  • Sr. Member
  • Offline
  • Posts: 15850
Re: Sharpening for the Web
« Reply #42 on: March 18, 2019, 08:35:38 am »

I think what's happening is that trying to correct a too narrow DOF with sharpening just doesn't work.  You have to get it right in the camera first. 

rabanito

  • Sr. Member
  • Offline
  • Posts: 1577
Re: Sharpening for the Web
« Reply #43 on: March 18, 2019, 01:01:31 pm »

Quote from: Alan Klein
I think what's happening is that trying to correct a too narrow DOF with sharpening just doesn't work.  You have to get it right in the camera first.

The intention was not to "correct" the DOF but "to restore some of the resampling blur", quoting Bart.
The DOF is OK with me  ;)

Lightsmith

  • Full Member
  • Offline
  • Posts: 197
Re: Sharpening for the Web
« Reply #44 on: March 24, 2019, 02:32:58 pm »

Look at the cat's body in the different pictures and you will see some where all the detail is missing. I would start with a Levels-type adjustment, then adjust the contrast of the full image, then resize for the Web, and lastly sharpen the image. The smaller the Web file size, the more sharpening I need to use.

That is why I always start with the original file, resize to the output size needed, and then sharpen. I do not resize to 1000 x 1000, sharpen, and then resize that file to 400 x 400 and sharpen again, as the resulting image quality will be much lower.
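Lightsmith's order of operations, resize to the final output size and then sharpen once, can be sketched with a 1-D unsharp mask in NumPy (illustrative radius/amount values, not a recommendation):

```python
import numpy as np

def box_downsample(x, factor):
    # Average each block of `factor` samples: a crude but safe resize.
    n = len(x) // factor * factor
    return x[:n].reshape(-1, factor).mean(axis=1)

def unsharp_mask(x, radius=2, amount=1.0):
    # Blur with a box kernel, then push the signal away from its blur.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.convolve(x, kernel, mode="same")
    return x + amount * (x - blurred)

edge = np.r_[np.zeros(64), np.ones(64)]
small = box_downsample(edge, 4)     # 1. resize to the web output size first...
sharpened = unsharp_mask(small)     # 2. ...then sharpen once, at that size

# Sharpening steepens the edge transition in the small image.
print(np.abs(np.diff(sharpened)).max(), np.abs(np.diff(small)).max())
```

Sharpening halos are a fixed number of pixels wide, so sharpening at an intermediate size and resizing again rescales and re-blurs those halos, which is one way to see why the sharpen-resize-sharpen route degrades quality.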