I suppose, sometimes, if I start feeling a bit pleased with myself and think that I know quite a lot about photography, it is probably good for me to be taken down a peg or two and realise that there are people in this world with whom I would actually be unable to communicate, except on the level of "Would you like a cup of tea?".

John

A more technical note: The deconvolution problem is typically ill-posed, at least initially. In the continuous domain the usual distortion due to blurring acts as an integral operator, and the problem statement boils down to a Fredholm integral equation of the first kind. In the discrete domain, in which we usually operate due to digitization, the inherent ill-posedness is inherited, while some of the problems are ameliorated. More well-behaved solutions can be obtained by introducing some sort of "smoothness" or regularization criterion at this stage. Richardson-Lucy deconvolution converges to maximum-likelihood (ML) estimation. Maximum-likelihood techniques analyse only the image data and hence, in general, may not be smooth enough. However, some regularization is imparted by incorporating some notions regarding the a priori (default) distribution of the image data, converting the problem to maximum a posteriori (MAP) estimation, which might provide more acceptable results. Under the assumption of Gaussianity of certain image parameters (NOTE: not necessarily Gaussianity of the blur function) some equivalence of minimum mean square error (MMSE) estimation, linearity, and MAP estimation can be obtained. Further optimizations can be introduced by using a more realistic nonstationary form of the blur function and variations in the image data distribution and noise distribution - the drawback being that one might have to forgo some quick operations in the form of fast Fourier transforms (FFTs) embedded in many deconvolution techniques.
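For anyone curious what Richardson-Lucy actually computes, here is a minimal 1-D sketch in Python using only NumPy. Everything in it (the kernel, the test signal, the iteration count) is an illustrative assumption, not something from this thread.

```python
# Minimal 1-D Richardson-Lucy deconvolution sketch (NumPy only).
# The PSF, signal and iteration count below are made up for illustration.
import numpy as np

def richardson_lucy(blurred, psf, n_iter=50):
    """Iteratively estimate the unblurred signal from `blurred`,
    given the point-spread function `psf` (standard RL update rule)."""
    estimate = np.full_like(blurred, blurred.mean())  # flat initial guess
    psf_mirror = psf[::-1]                            # flipped PSF for the correction step
    for _ in range(n_iter):
        conv = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(conv, 1e-12)     # guard against divide-by-zero
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: blur two spikes with a small smoothing kernel, then deconvolve.
psf = np.array([0.05, 0.25, 0.4, 0.25, 0.05])
truth = np.zeros(64)
truth[20] = 1.0
truth[40] = 0.5
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

Note that the update is multiplicative, so a nonnegative starting guess stays nonnegative throughout, which is one reason the method is popular for light intensities.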

I sympathise with your frustration here, John, but let's not be intimidated by poor expression. Here's my translation, for what it's worth, sentence by sentence.

(1) The deconvolution problem is typically ill-posed.

*means: The sharpening problem is often poorly defined. (That's easy).*

(2) In the continuous domain the usual distortion due to blurring acts as an integral operator, and the problem statement boils down to a Fredholm integral equation of the first kind.

*means: The analog world, which is a smooth continuum, is different from the digital world with discrete steps. You need complex mathematics to deal with this problem, such as a Fredholm integral equation. (Whatever that is).*

(3) In the discrete domain, in which we usually operate due to digitization, the inherent ill-posedness is inherited, while some of the problems are ameliorated.

*means: We're now stuck with the digital domain. There's a hangover from the analog world with incorrect definitions, but we can fix some of the problems. There's hope.*

(4) More well-behaved solutions can be obtained by introducing some sort of "smoothness" or regularization criterion at this stage.

*means: We can achieve a balanced result by sacrificing detail for smoothness.*

(5) Richardson-Lucy deconvolution converges to maximum-likelihood (ML) estimation.

*means: The Richardson-Lucy method attempts to provide the best result, in terms of detail.*

(6) Maximum-likelihood techniques analyse only the image data and hence, in general, may not be smooth enough.

*means: The best result may introduce noise.*

(7) However, some regularization is imparted by incorporating some notions regarding the a priori (default) distribution of the image data, converting the problem to maximum a posteriori (MAP) estimation, which might provide more acceptable results.

*means: With a bit of experimentation we might be able to fix the noise problem.*

(8) Under the assumption of Gaussianity of certain image parameters (NOTE: not necessarily Gaussianity of the blur function) some equivalence of minimum mean square error (MMSE) estimation, linearity, and MAP estimation can be obtained.

*means: Gaussian mathematics is used to get the best estimate for sharpening purposes. (Gauss was a German mathematical genius, considered to be one of the greatest mathematicians who has ever lived. Far greater than Einstein, in the field of mathematics).*

(9) Further optimizations can be introduced by using a more realistic nonstationary form of the blur function and variations in the image data distribution and noise distribution - the drawback being that one might have to forgo some quick operations in the form of fast Fourier transforms (FFTs) embedded in many deconvolution techniques.

*means: You can get better results if you take more time and have more computing power.*

Okay! Maybe I've missed a few nuances in my translation. No-one's perfect. Any improved translation is welcome.
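For anyone who wants to see sentences (8) and (9) in action: under those Gaussian assumptions the MMSE/MAP answer becomes a linear filter, the Wiener filter, and it is fast precisely because it can be computed with FFTs, as long as the blur is stationary (the same everywhere in the image). A minimal 1-D sketch, with an illustrative kernel and noise-to-signal ratio of my own choosing:

```python
# Sketch of Wiener deconvolution in the frequency domain (NumPy only).
# The PSF, test signal and noise-to-signal ratio are illustrative assumptions.
import numpy as np

def wiener_deconvolve(blurred, psf, noise_to_signal=1e-3):
    """Frequency-domain Wiener filter: W = H* / (|H|^2 + NSR)."""
    n = len(blurred)
    H = np.fft.fft(psf, n)          # transfer function of the blur
    G = np.fft.fft(blurred)         # spectrum of the blurred signal
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft(W * G))

psf = np.zeros(64)
psf[:5] = [0.05, 0.25, 0.4, 0.25, 0.05]
truth = np.zeros(64)
truth[20] = 1.0
# Blur via circular convolution so the FFT model matches exactly.
blurred = np.real(np.fft.ifft(np.fft.fft(truth) * np.fft.fft(psf)))
restored = wiener_deconvolve(blurred, psf)
```

The noise-to-signal term keeps the division from blowing up at frequencies where the blur wipes out almost everything; raising it trades detail for smoothness, which is sentence (4) all over again.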