Luminous Landscape Forum

Equipment & Techniques => Medium Format / Film / Digital Backs – and Large Sensor Photography => Topic started by: eronald on January 10, 2012, 03:52:19 pm

Title: Best downsize to reduce noise?
Post by: eronald on January 10, 2012, 03:52:19 pm
I have a D3x. Nice camera. But I'd like to be able to get results like from a D3 or D3s :)
In other words, I would like to be able to get good 6MP or so images at high ISO.
Discuss.

Edmund
Title: Re: Best downsize to reduce noise?
Post by: MichaelEzra on January 10, 2012, 04:35:28 pm
The interpolation method used for preview rendering in RawTherapee seems to remove all noise on smaller zoom scales. This is available in preview only, however.
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 10, 2012, 04:42:57 pm
There should be some interesting stuff one could do here - maybe not enough to turn a D3x into a D3s, but at least enough to not make it worthwhile to buy a D3s :)

Edmund

The interpolation method used for preview rendering in RawTherapee seems to remove all noise on smaller zoom scales. This is available in preview only, however.
Title: Re: Best downsize to reduce noise?
Post by: Bart_van_der_Wolf on January 10, 2012, 07:15:51 pm
I have a D3x. Nice camera. But I'd like to be able to get results like from a D3 or D3s :)
In other words, I would like to be able to get good 6MP or so images at high ISO.
Discuss.

Hi Edmund,

It depends on the spectral noise profile whether downsampling really helps (you will lose resolution when downsampling). The best approach that always works is to use a dedicated noise reduction application first (and then downsample if needed).

The noise spectrum can be estimated by using e.g. ImageJ and a (radial profile) Plug-in. It's ideally based on the result of 2 subtracted images of a uniformly lit featureless surface, but one can also follow the simple empirical method of trial and error.
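For those who prefer a script to ImageJ, a rough NumPy sketch of the same two-frame idea might look like the following (the file names are placeholders, and it assumes a single-channel image; this is only an illustration, not a prescription):

Code:
import numpy as np
from PIL import Image

# Two registered, identically exposed shots of a uniformly lit, featureless surface.
# File names are placeholders.
a = np.asarray(Image.open("flat1.tif").convert("F"), dtype=np.float64)
b = np.asarray(Image.open("flat2.tif").convert("F"), dtype=np.float64)

# Subtracting the frames cancels the fixed pattern; /sqrt(2) restores the single-frame noise scale.
diff = (a - b) / np.sqrt(2.0)

# 2-D noise power spectrum
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(diff))) ** 2

# Radial average (the "radial profile" step), binned by integer radius from the centre
h, w = spectrum.shape
y, x = np.indices((h, w))
r = np.hypot(y - h / 2, x - w / 2).astype(int)
radial = np.bincount(r.ravel(), weights=spectrum.ravel()) / np.bincount(r.ravel())

for freq, power in enumerate(radial[: min(h, w) // 2]):
    print(freq, power)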

Cheers,
Bart
Title: Re: Best downsize to reduce noise?
Post by: ejmartin on January 10, 2012, 07:42:22 pm
Bart has it right.  Straight downsampling (the preview in RT is a simple pixel binning operation) doesn't use any of the image information on scales finer than the target resolution.  A dedicated NR program will use that information and thereby lead to a better result (sharper, less noise) after downsampling than simply blurring (to remove fine scale noise and suppress aliasing) followed by downsampling, or downsampling without blurring.
Title: Re: Best downsize to reduce noise?
Post by: theguywitha645d on January 10, 2012, 07:50:31 pm
My understanding is that downsizing just masks the noise through binning, but since you are really just reducing artifacts by making bigger pixels, and the pixels themselves are still unresolved by the viewer, I don't really know if you are doing anything to the final perceived image. If noise is like granularity (you never perceive the grain itself, just its effects), then two prints at the same dimensions may not actually look any different.
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 10, 2012, 08:11:27 pm
Hi Edmund,

It depends on the spectral noise profile whether downsampling really helps (you will lose resolution when downsampling). The best approach that always works is to use a dedicated noise reduction application first (and then downsample if needed).

The noise spectrum can be estimated by using e.g. ImageJ and a (radial profile) Plug-in. It's ideally based on the result of 2 subtracted images of a uniformly lit featureless surface, but one can also follow the simple empirical method of trial and error.

Cheers,
Bart

Bart,

 Please give references.

EJMartin,
 
  The "dedicated" noise reduction algorithm I am looking for is one which "knows" that I am satisfied with a picture with 1/4 of the number of pixels in the end.

  And BTW, surely such things should be done on the Raw, and not on the debayered tonemapped Tiff?

Edmund
Title: Re: Best downsize to reduce noise?
Post by: madmanchan on January 10, 2012, 08:53:52 pm
Hi Edmund,  you can downsize your image to 1 pixel.  Guaranteed to have no noise.   ;D
Title: Re: Best downsize to reduce noise?
Post by: ejmartin on January 10, 2012, 10:33:30 pm
  And BTW, surely such things should be done on the Raw, and not on the debayered tonemapped Tiff?

One has to be sure not to obliterate the correlations among the color channels that assist in demosaicing the resulting image.  Without that, denoising the color channels separately prior to demosaic will typically yield a worse result.
Title: Re: Best downsize to reduce noise?
Post by: ErikKaffehr on January 11, 2012, 12:08:48 am
Hi,

What I don't really understand is why the D3S has less noise than the D3X when the D3X is downscaled to the same size. It seems that the D3S has less shot noise, which may indicate that its high ISO performance is achieved due to better quantum efficiency? Would be interesting to find out.

Regarding noise reduction and downscaling, it's my impression that it would be best done in raw conversion. Noise reduction in LR3 works pretty well.
What used to be Bibble Pro could integrate Noise Ninja at an early stage.

Best regards
Erik

I have a D3x. Nice camera. But I'd like to be able to get results like from a D3 or D3s :)
In other words, I would like to be able to get good 6MP or so images at high ISO.
Discuss.

Edmund
Title: Re: Best downsize to reduce noise?
Post by: EricWHiss on January 11, 2012, 12:19:54 am
Edmund,
I'm just doing this with LR3 or C1 with NR set higher than I'd like for full size and then outputting at 25%, with decent results. In general I dislike using Luminance NR because it seems to muddle as much as it cleans, but this magically goes away at 25%. What kind of noise is the problem? Shadows probably, but pattern, or splotches, or purple-green stuff?
Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 11, 2012, 01:06:56 am
Hi,

What I don't really understand is why the D3S has less noise than the D3X when the D3X is downscaled to the same size. It seems that the D3S has less shot noise, which may indicate that its high ISO performance is achieved due to better quantum efficiency? Would be interesting to find out.

Regarding noise reduction and downscaling, it's my impression that it would be best done in raw conversion. Noise reduction in LR3 works pretty well.
What used to be Bibble Pro could integrate Noise Ninja at an early stage.

Best regards
Erik


If the electronics around the pixel are a required thickness for electrical reasons, smaller pixels will have a higher percentage of lost light-gathering area. This would not be the case with Canon's claimed 100% coverage microlenses or Sony's Exmor with the electronics under the pixel.

Yes, Bibble's integrated NN at raw conversion was effective. They really hyped it as an important advantage. Too bad Bibble's colors started to suck on newer cameras. It was awesome with the A100.

Regarding the OP's question:
Noise is an overused parameter, perhaps because it is easy to measure. If noise is removed with competent software, a higher resolution image almost always looks more realistic than a lower one. The only reason people don't always see that is the 2MP screens we use. Images at 50% always look much better than at 100% due to oversampling. Same deal.
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 11, 2012, 02:16:17 am
Re: downsampling reduces noice! - uh noise! (http://forums.dpreview.com/forums/readflat.asp?forum=1018&message=39778828)
Re: downsampling reduces noice! - uh noise! - con't.
 (http://forums.dpreview.com/forums/readflat.asp?forum=1018&message=39856801)
Quote
Here is a crop from a plenty-noisy Imaging Resource D3s 102400 ISO shot. It has been processed as follows:

1 Original

2 Lowpass filtered, everything above Nyquist/2 zeroed using DFT. Not downsampled.

3 Original downsampled by factor 2, without filtering. Equivalent to "Nearest-neighbor" or true "decimation by 2." Includes aliases due to lack of lowpass filter.

4 Lowpass filtered as in 2, then downsampled as in 3. "Properly downsampled."

5 Image 4 subtracted from image 3 to reveal noisy aliases resulting from lack of lowpass filter. (offset 128 has been added)
(http://g4.img-dpreview.com/D2A80F8338634A538D1E37AAB5582654.jpg)

My take:
If you are to downsample a noisy image anyway, you should select a scaling algorithm that does plenty of averaging (smoothing everything above fs/2).
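For illustration, a small NumPy/SciPy sketch of the difference on a synthetic noisy frame (the Gaussian blur here is only a stand-in for a proper decimation lowpass):

Code:
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
img = 1000.0 + rng.normal(0.0, 50.0, size=(512, 512))   # flat "scene" plus Gaussian noise

# 1) Naive decimation: keep every second pixel; noise above the new Nyquist simply aliases down
naive = img[::2, ::2]

# 2) Smooth first (a Gaussian standing in for a lowpass at the new Nyquist), then decimate
proper = ndimage.gaussian_filter(img, sigma=1.0)[::2, ::2]

print("original noise sigma:", img.std())       # ~50
print("naive decimation:    ", naive.std())     # still ~50
print("lowpass + decimate:  ", proper.std())    # clearly lower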

I don't see the point in downsampling to remove noise; use your favorite noise reduction algorithm instead.

-h
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 11, 2012, 06:06:22 am
Look, I'm a scientist -or was- but at this point I am trying to be purposely vague.

For me, the point of the exercise, as I see it, is to start with the Raw images of a pretty good camera with lots of pixels (24MP), and determine the best way to process those Raws in order to gain say 1-2 stops of subjective ISO, shooting at  6400 instead of 1600, and end up with enough pixels for a nice rather than ugly magazine page.

A typical application would be taking the camera to a fashion show and hitting conditions that force one to use a higher shutter speed and more DOF, e.g. because it is desirable to shoot the model as she is walking and not posed at the end of the runway. The D3x landed me in that dilemma more than once.

More prosaically, when I pick up my huge SLR to take an image of my toddler playing with his mum in the very dimly lit living room, I am quite certain that I won't need 24MP, but I still want that fullframe look, or I'd pick up some other camera :)

Pros who face low-light action situations usually have a dedicated camera, but for those who do not, it may be interesting to know how a studio camera can cope. This may be interesting to the MF crowd too because an 80MP camera has a lot of room for downsizing ;)

Edmund

 
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 11, 2012, 07:25:41 am
Look, I'm a scientist -or was- but at this point I am trying to be purposely vague.

For me, the point of the exercise, as I see it, is to start with the Raw images of a pretty good camera with lots of pixels (24MP), and determine the best way to process those Raws in order to gain say 1-2 stops of subjective ISO, shooting at  6400 instead of 1600, and end up with enough pixels for a nice rather than ugly magazine page.
In other words: you want to reduce the visible noise, and are willing to sacrifice details/sharpness to get there? Isn't that noise-reduction in a nutshell?

-h
Title: Re: Best downsize to reduce noise?
Post by: mediumcool on January 11, 2012, 08:49:48 am
The scaling options available in Photoshop can help, if further software purchases are not wanted.
Title: Re: Best downsize to reduce noise?
Post by: bjanes on January 11, 2012, 09:07:01 am
What I don't really understand is why the D3S has less noise than the D3X when the D3X is downscaled to the same size. It seems that the D3S has less shot noise, which may indicate that its high ISO performance is achieved due to better quantum efficiency? Would be interesting to find out.

Erik,

One reason why a larger pixel performs better than smaller pixels that produce a higher resolution image which is then downsized is the difference between hardware and software binning. Consider 4:1 binning (http://www.photometrics.com/resources/learningzone/binning.php) of a monochrome image in hardware. The SNR due to shot noise is improved by a factor of two. However, the binned superpixel would be read out with the same read noise as a single unbinned pixel. With software binning, 4 read noises would be combined.
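A toy simulation of that difference, with made-up numbers (monochrome sensor, Poisson shot noise, Gaussian read noise), might look like this:

Code:
import numpy as np

rng = np.random.default_rng(1)
n_superpixels = 200_000
mean_signal = 100.0     # electrons per small pixel (made up)
read_noise = 10.0       # electrons RMS per readout (made up)

photons = rng.poisson(mean_signal, size=(n_superpixels, 4))   # shot noise, 4 small pixels per superpixel

# Hardware binning: charge from 4 pixels summed, then read once (one read noise)
hardware = photons.sum(axis=1) + rng.normal(0, read_noise, n_superpixels)
# Software binning: each pixel read separately (four read noises), then summed
software = (photons + rng.normal(0, read_noise, (n_superpixels, 4))).sum(axis=1)

for name, data in (("hardware binning", hardware), ("software binning", software)):
    print(name, "SNR =", data.mean() / data.std())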

With color sensors, hardware binning is considerably more complicated. See this post (http://www.phaseone.com/en/Digital-Backs/IQ180/IQ180-Tutorials.aspx) on the Phase One site.

Regards,

Bill
Title: Re: Best downsize to reduce noise?
Post by: theguywitha645d on January 11, 2012, 09:16:24 am
Downsampling, in and of itself, does not address noise. I don't think noise has simply a pixel level effect where averaging neighboring pixel luminance values takes care of the problem (won't Bayer interpolation do the same?)--noisy images still look noisy at different magnifications other than 100%. So noise is like waves on an ocean, at different scales, you are still able to perceive noise in the frame just like you can still see the surface of the water is not flat.

Why do you think this would work and what have you been trying?
Title: Re: Best downsize to reduce noise?
Post by: bjanes on January 11, 2012, 02:48:57 pm
Downsampling, in and of itself, does not address noise. I don't think noise has simply a pixel level effect where averaging neighboring pixel luminance values takes care of the problem (won't Bayer interpolation do the same?)--noisy images still look noisy at different magnifications other than 100%. So noise is like waves on an ocean, at different scales, you are still able to perceive noise in the frame just like you can still see the surface of the water is not flat.

Why do you think this would work and what have you been trying?

Downsizing does affect noise, and that is the basis for the normalization that DXO does to compare a higher resolution sensor to a lower resolution one. See the explanation by DXO here (http://www.dxomark.com/index.php/Publications/DxOMark-Insights/Detailed-computation-of-DxOMark-Sensor-normalization). Those familiar with statistics will recognize that this is the same as the calculation of standard error (http://en.wikipedia.org/wiki/Standard_error_%28statistics%29). By collecting more photons, the sampled mean of a picture element will be a better representation of the true mean. A similar principle is with polling (as with the recent elections). To get a more accurate statistic, the pollster takes a larger sample size.

Regards,

Bill

Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 11, 2012, 03:02:35 pm
Downsampling, in and of itself, does not address noise.
Reasonable downsampling includes a lowpass filter. A lowpass filter reduces the energy of any signal/noise within the stopband. SNR tends to be poor at high spatial frequencies.
Quote
I don't think noise has simply a pixel level effect where averaging neighboring pixel luminance values takes care of the problem (won't Bayer interpolation do the same?)-
High-quality Bayer reconstruction usually tries to keep sharp edges (guess image information that is unknown), meaning that it tends to be vulnerable to sensor noise (even amplify it) unless great care is taken.
Quote
-noisy images still look noisy at different magnifications other than 100%. So noise is like waves on an ocean, at different scales, you are still able to perceive noise in the frame just like you can still see the surface of the water is not flat.

Why do you think this would work and what have you been trying?
I like extreme examples. I am not saying that this is what happens, but I hope that this will make you think through your claims.

Imagine that a line of pixels should have had luminance values of:
[17 17 17 17 17 17 17 17]
(signal has no high-frequency components, only DC)

Then imagine that the pixels have been read out with an error ("noise") turning it into:
[16 18 16 18 16 18 16 18]
(noise has only a single frequency component, and seems to be deterministic)

What do you think averaging would do?

In the real world, signal and noise are not perfectly separated in the frequency domain, and it is extremely hard to change one without affecting the other. It seems that practically usable algorithms exist, though, where some (small) loss of signal quality is accepted for a significant reduction in perceived noise.
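To make the toy example concrete, a few lines of NumPy (just pairwise averaging of the values above):

Code:
import numpy as np

clean = np.array([17, 17, 17, 17, 17, 17, 17, 17], dtype=float)
noisy = np.array([16, 18, 16, 18, 16, 18, 16, 18], dtype=float)

# 2:1 downsample by averaging adjacent pairs: noise sitting exactly at Nyquist cancels completely
averaged = noisy.reshape(-1, 2).mean(axis=1)
print(averaged)   # [17. 17. 17. 17.] -- identical to the clean signal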

-hj
Title: Re: Best downsize to reduce noise?
Post by: madmanchan on January 11, 2012, 03:10:21 pm
My earlier example about producing an image with only 1 pixel was partly a joke, but also partly serious.  We know that such an image has no noise.  The original image may have a lot of noise.  As you transition from the original image to the smallest possible image (1 pixel), you transition from the original amount of noise to zero noise.  In other words, noise will go down.  The exact nature of that transition will depend a lot on the resampling method.
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 11, 2012, 03:22:51 pm
My earlier example about producing an image with only 1 pixel was partly a joke, but also partly serious.  We know that such an image has no noise.
"Noise" corresponds more or less measurement error, right? Or the non-linear, non-deterministic part of the error or whatever (I am sure Joofa have something to say on that).

If you took 2 shots of the same scene and averaged both down to single-pixel size, chances are that those two single-pixel images would differ ever so slightly. I vote that this difference is an indication that both images contain some noise (granted, only a DC component).

I think that some light can be shed on this topic by studying Wiener filters. Basically a Wiener filter is the solution to the problem "what linear, time (or space)-invariant filter will minimize the residual error after filtering a signal corrupted by noise"
(http://www.dspguide.com/graphics/F_17_7.gif)
http://www.dspguide.com/ch17/3.htm

If total SNR were the only thing that affected our perception, LTI filters were the only tool we had at our disposal to improve it, and the signal/noise characteristics were known, I believe that Wiener filters would be the solution to all noise reduction problems. Clearly, this is not the case.
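For what it's worth, SciPy ships a simple local (adaptive) Wiener filter; a minimal sketch on synthetic data, not a claim about what any raw converter does:

Code:
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(2)
signal = np.tile(np.linspace(0.0, 255.0, 256), (256, 1))   # smooth horizontal ramp
noisy = signal + rng.normal(0.0, 20.0, signal.shape)

# scipy's wiener() is a local adaptive Wiener filter (local mean/variance estimates),
# not the full frequency-domain design, but it illustrates the trade-off
filtered = wiener(noisy, mysize=5)

print("RMS error before:", np.sqrt(((noisy - signal) ** 2).mean()))
print("RMS error after: ", np.sqrt(((filtered - signal) ** 2).mean()))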

-h
Title: Re: Best downsize to reduce noise?
Post by: theguywitha645d on January 11, 2012, 03:45:41 pm
Reasonable downsampling includes a lowpass filter. A lowpass filter reduces the energy of any signal/noise within the stopband. SNR tends to be poor at high spatial frequencies. High-quality Bayer reconstruction usually tries to keep sharp edges (guess image information that is unknown), meaning that it tends to be vulnerable to sensor noise (even amplify it) unless great care is taken. I like extreme examples. I am not saying that this is what happens, but I hope that this will make you think through your claims.

Imagine that a line of pixels should have had luminance values of:
[17 17 17 17 17 17 17 17]
(signal has no high-frequency components, only DC)

Then imagine that the pixels have been read out with an error ("noise") turning it into:
[16 18 16 18 16 18 16 18]
(noise has only a single frequency component, and seems to be deterministic)

What do you think averaging would do?

In the real world, signal and noise are not perfectly separated in the frequency domain, and it is extremely hard to change one without affecting the other. It seems that practically usable algorithms exist, though, where some (small) loss of signal quality is accepted for a significant reduction in perceived noise.

-hj


As you said, it is an extreme example. I am not claiming I know the answer, but down sampling does not eliminate noise from my image as noise is random--not having a cycle like your example--and you still need to preserve the signal--what is the difference between an image of a sandy beach and a noisy sky? And what is the difference between viewing two images that are 300dpi+ with one down sampled and the other at native resolution--do we perceive the images as the same? So does the inability to resolve smaller pixels have the same visual effect as down sampling?

Perhaps it would help to know more details from the OP and what tests the OP has done on this.
Title: Re: Best downsize to reduce noise?
Post by: bjanes on January 11, 2012, 04:05:17 pm
... but down sampling does not eliminate noise from my image as noise is random--not having a cycle like your example--and you still need to preserve the signal--what is the difference between an image of a sandy beach and a noisy sky? And what is the difference between viewing two images that are 300dpi+ with one down sampled and the other at native resolution--do we perceive the images as the same? So does the inability to resolve smaller pixels have the same visual effect as down sampling?

Noise is random, but downsampling can reduce the randomness and thus the noise. Consider two flat frames taken with the Nikon D3 at ISO 3200. The standard deviation is the noise. If I measure the standard deviation of a 400 x 400 pixel area and compare it to the standard deviation of a 200 x 200 pixel downsize using bicubic in Photoshop, the standard deviation (the noise) decreases. This is confirmed by visual inspection of the images.
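The same kind of measurement is easy to reproduce outside Photoshop; here is a rough sketch with a synthetic flat frame and Pillow's bicubic resampling (the numbers are arbitrary):

Code:
import numpy as np
from PIL import Image

rng = np.random.default_rng(3)
flat = rng.normal(128.0, 12.0, size=(400, 400))              # synthetic noisy "flat frame"

img = Image.fromarray(np.clip(flat, 0, 255).astype(np.uint8))
small = img.resize((200, 200), resample=Image.BICUBIC)       # Image.Resampling.BICUBIC on newer Pillow

print("400x400 std:", np.asarray(img, dtype=float).std())    # ~12
print("200x200 std:", np.asarray(small, dtype=float).std())  # noticeably lower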

Regards,

Bill
Title: Re: Best downsize to reduce noise?
Post by: theguywitha645d on January 11, 2012, 06:11:40 pm
Bill, here is my question. If I have two 16x20 prints viewed at 20 inches and from the same image, but one was from the original file and one was down sampled, would I see a difference in noise? Would the visual system do the basic averaging work of down sampling?
Title: Re: Best downsize to reduce noise?
Post by: madmanchan on January 11, 2012, 06:30:54 pm
I think of noise as variance, or standard deviation, or some statistical measure of variation in image values (take your pick).  The 1-pixel image has zero variation.  Hence zero noise.

Same thing happens in the real world when you view something with texture (like a rock or brick wall) and then move away from it.  Eventually you will not see the texture anymore and you'll just see a flat tone.
Title: Re: Best downsize to reduce noise?
Post by: bjanes on January 11, 2012, 07:00:29 pm
Bill, here is my question. If I have two 16x20 prints viewed at 20 inches and from the same image, but one was from the original file and one was down sampled, would I see a difference in noise? Would the visual system do the basic averaging work of down sampling?

That is an interesting question, and it would depend in part on the visual acuity of the observer and on the characteristics of the printing process. Error diffusion printing, as with an inkjet, softens the image somewhat compared to a continuous tone device, and some of the noise might be smoothed out by the printing process. However, if you are making a 16 by 20 print, you would likely want all the resolution that you can get (even with an IQ180) and it would not make sense to downsize merely to reduce noise. A dedicated NR program such as Noiseware would be a better choice if noise were a problem. Why downsize to reduce noise and lose resolution, only to upsize again (either in Photoshop or the printer driver) when printing?

Image normalization for resolution (as done by DXO) makes more sense when comparing two sensors with differing resolution. If you have resolution to spare with the higher resolution sensor when making a given print size, downsizing would be necessary either in Photoshop or the printer driver and this would improve the signal to noise ratio. Whether or not the difference would be noticeable in a print depends on many variables.

Regards,

Bill
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 12, 2012, 12:47:22 am
I think of noise as variance, or standard deviation, or some statistical measure of variation in image values (take your pick).  The 1-pixel image has zero variation.  Hence zero noise.
So what is the cause of slight differences between two single-pixel images taken under identical conditions?

-h
Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 12, 2012, 02:10:37 am
So what is the cause of slight differences between two single-pixel images taken under identical conditions?

-h

Have you tried the experiment with some old images downsampling to 1 pixel? Probably moving clouds, maybe some from sensor temp.
Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 12, 2012, 02:17:08 am

For me, the point of the exercise, as I see it, is to start with the Raw images of a pretty good camera with lots of pixels (24MP), and determine the best way to process those Raws in order to gain say 1-2 stops of subjective ISO, shooting at  6400 instead of 1600, and end up with enough pixels for a nice rather than ugly magazine page.


Edmund

 

Run noise software to remove the noise, wiping out some fine detail, then downsample to hide the mush. I've done this with a 75% downsample ratio. I had no way to know if there was an optimum ratio; I assumed the wavelets left intact would fit in 75% of the pixels. Yes, it works.
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 12, 2012, 02:45:22 am
Bill, here is my question. If I have two 16x20 prints viewed at 20 inches and from the same image, but one was from the original file and one was down sampled, would I see a difference in noise? Would the visual system do the basic averaging work of down sampling?
It is good scientific practice to try to isolate variables. If you scale your image at some point in the pipeline, but keep the absolute print size constant, your printer or some other component will scale the image up to compensate. Knowing how it does that and predicting the quality of the end-result is hard.

My suggestion is to experiment with different lowpass filters (the primary noise-altering component in a scaler is a lowpass filter) while keeping the pixel grid constant.

-h
Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 12, 2012, 03:48:10 am
Original from one of the software vendor's websites vs NR then downsampled.
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 12, 2012, 06:53:49 am
Original from one of the software vendor's websites vs NR then downsampled.
The second one looks less noisy to me, hard to conclude as they are rendered at different sizes on my display... What am I supposed to conclude?

-h
Title: Re: Best downsize to reduce noise?
Post by: Bart_van_der_Wolf on January 12, 2012, 07:45:34 am
Noise is random, but downsampling can reduce the randomness and thus the noise. Consider two flat frames taken with the Nikon D3 at ISO 3200. The standard deviation is the noise. If I measure the standard deviation of a 400 x 400 pixel area and compare it to the standard deviation of a 200 x 200 pixel downsize using bicubic in Photoshop, the standard deviation (the noise) decreases. This is confirmed by visual inspection of the images.

Hi Bill,

Indeed, downsampling will (by weighted averaging) cancel some of the highest-frequency noise. But the important thing is that the high spatial frequency (HSF) signal is also reduced, often by quite a similar amount (so there is no real improvement). Of course the noise, which is mostly random, will be reduced and there will usually be some remaining signal at the new resolution, so there is some increase in S/N ratio at the expense of a loss of detail.

Therefore the question becomes: do we measure image quality as lower noise but with lost HSF signal, or as a high signal-to-noise ratio (with HSF detail to spare)? I would favor the latter, which can be achieved quite effectively with a dedicated noise reduction application, with only minimal reduction of HF signal. This keeps the larger output option intact, which is important to many.

If the OP's goal is to also reduce file size, and not only by lossless compression, then reducing noise before downsampling will provide superior results. In the absence of noise the down-sampling filter becomes very important if aliasing artifacts are to be avoided.
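As a rough sketch of that order of operations (the median filter below is only a placeholder for a dedicated NR tool, and the file names are hypothetical):

Code:
import numpy as np
from PIL import Image
from scipy import ndimage

# "noisy_shot.tif" is a placeholder; the median filter only stands in for a real NR tool
img = Image.open("noisy_shot.tif").convert("RGB")
arr = np.asarray(img, dtype=np.float32)

# 1) Noise reduction at full resolution, per channel
denoised = np.stack([ndimage.median_filter(arr[..., c], size=3) for c in range(3)], axis=-1)

# 2) Downsample with a windowed-sinc (Lanczos) filter, which also limits aliasing
out = Image.fromarray(np.clip(denoised, 0, 255).astype(np.uint8))
out = out.resize((img.width // 2, img.height // 2), resample=Image.LANCZOS)
out.save("downsized.tif")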

Cheers,
Bart
Title: Re: Best downsize to reduce noise?
Post by: theguywitha645d on January 12, 2012, 10:57:13 am
It is good scientific practice to try to isolate variables. If you scale your image at some point in the pipeline, but keep the absolute print size constant, your printer or some other component will scale the image up to compensate. Knowing how it does that and predicting the quality of the end-result is hard.

My suggestion is to experiment with different lowpass filters (the primary noise-altering component in a scaler is a lowpass filter) while keeping the pixel grid constant.

-h

Well, I can just do it on my printer, but as you pointed out, there are systemic factors. But my question was a little more fundamental. Assuming a lossless system, given the same image, one down sampled and one native, when actually viewed at the same size and same distance, would the human visual system perceive the difference? After all, none of the pixels in either image can be resolved and some sort of averaging would be done in situ, so to speak.
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 12, 2012, 07:15:18 pm

If the OP's goal is to also reduce file size, and not only by lossless compression, then reducing noise before downsampling will provide superior results. In the absence of noise the down-sampling filter becomes very important if aliasing artifacts are to be avoided.

Cheers,
Bart

Bart,

 The OP's goal is to obtain a nice file that is very substantially downsized, eg to 1/4 the pixels.

Edmund
Title: Re: Best downsize to reduce noise?
Post by: BJL on January 12, 2012, 08:01:27 pm
What is the significance of 6MP target?

Is it because some use case forbids files of higher pixel count, like a web site for which that is the maximum allowed pixel count?

Is it because you believe that this will give better IQ at high ISO speeds than, for example, applying NR but keeping the full pixel count, or just printing at the size you intend to use with the 6MP but at twice the PPI?

It is hard to judge alternative approaches without knowing the objective. (I mean the goal, not the lens.)

Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 12, 2012, 08:08:14 pm
The second one looks less noisy to me, hard to conclude as they are rendered at different sizes on my display... What am I supposed to conclude?

-h
Conclude what you will, the images have to tell the tale.

Like I said, the first one is the poster shot from a noise reduction software website. The second shot has had noise removed by me in competing software. It is downsampled so that it looks like it has detail at the pixel level.

Which looks better?
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 12, 2012, 08:22:11 pm
What is the significance of 6MP target?

Is it because some use case forbids files of higher pixel count, like a web site for which that is the maximum allowed pixel count?

Is it because you believe that this will give better IQ at high ISO speeds than, for example, applying NR but keeping the full pixel count, or just printing at the size you intend to use with the 6MP but at twice the PPI?

It is hard to judge alternative approaches without knowing the objective. (I mean the goal, not the lens.)



I am interested in the case where I simply don't need 24MP. Assume I want 6MP, or even 3MP, because I am going into a BADLY LIT locale with an absolute NECESSITY to print a single magazine page or get a good web shot; that I need every bit of shutter speed and DOF I can squeeze out of the body, and so I'm stuck with 6400 ISO. Now what is the best workflow to postprocess the 14-bit 6400 ISO Raw image into a colorful, sharp, small destination file?

Edmund
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 13, 2012, 01:05:49 am
Conclude what you will, the images have to tell the tale.
Images can be selected that aid generalized conclusions, and ones can be selected that do not. Comparing two images of different sizes on my display surely looks visibly different, but so does an image of a horse compared to an image of a cat...

My point being that you should resize image#2 so that it matches image#1. Then one can form sensible conclusions about which is "better". I could do this for myself, but my scaling might not be the same as yours, and then it would be difficult to exchange opinions.

-h
Title: Re: Best downsize to reduce noise?
Post by: Thomas Krüger on January 13, 2012, 02:42:54 am
Sizimg is a command line program for resizing an image with anti-aliasing, works great:
http://www.realitypixels.com/products/sizimg.html
Title: Re: Best downsize to reduce noise?
Post by: Fine_Art on January 13, 2012, 10:29:54 pm
Images can be selected that aid generalized conclusions, and ones can be selected that do not. Comparing two images of different sizes on my display surely looks visibly different, but so does an image of a horse compared to an image of a cat...

My point being that you should resize image#2 so that it matches image#1. Then one can form sensible conclusions about which is "better". I could do this for myself, but my scaling might not be the same as yours, and then it would be difficult to exchange opinions.

-h

You seem to have forgotten what the OP was about. Can downscaling a noisy image improve the perceived quality? How much would be optimum? It's not "is your converter better than mine?" The comparison is downscaled vs. original. My reply was: use noise software before downscaling. The best percentage I don't know; I think it depends on your noise removal software, which is mostly a black box with sliders.

An exact measure would be to use a wavelet-based noise reduction based on the noise sampled in the picture. From that you would know which frequencies are removed the most, so you would know what downscaling to use to present the detail that is left.

It sounds like you want a software comparison instead. That should be a different thread.
Title: Re: Best downsize to reduce noise?
Post by: eronald on January 14, 2012, 03:40:52 am
You seem to have forgotten what the OP was about. Can downscaling a noisy image improve the perceived quality?

That wasn't quite the question. The question was: IF I KNOW that I will be downscaling a RAW very substantially, is there a better way to process it than going through a standard Raw converter, then through noise reduction, and then through "standard" downscaling?

Edmund
Title: Re: Best downsize to reduce noise? As opposed to compressing for a smaller file?
Post by: BJL on January 14, 2012, 12:45:03 pm
I am interested in the case where I simply don't need 24MP. Assume ... I am going into a BADLY LIT locale with an absolute NECESSITY to print a single magazine page or get a good web shot; that I need every bit of shutter speed and DOF I can squeeze out of the body, and so I'm stuck with 6400 ISO. Now what is the best workflow to postprocess the 14-bit 6400 ISO Raw image into a colorful, sharp, small destination file?
If small file size and low visible noise in the final smallish displayed image are the goals, then even though you do not _need_ 24MP, might the best solution be a combination of NR processing and, for file size, compressing the final JPEG? My sense is that both NR processing and JPEG compression algorithms can be far smarter than the "brute force" approach of downsizing, by choosing what information to discard and what to keep, likely giving more "IQ per MB" in the final file.
Title: Re: Best downsize to reduce noise? As opposed to compressing for a smaller file?
Post by: hjulenissen on January 14, 2012, 03:08:37 pm
If small file size and low visible noise in the final smallish displayed image are the goals, then even though you do not _need_ 24MP, might the best solution be a combination of NR processing and, for file size, compressing the final JPEG? My sense is that both NR processing and JPEG compression algorithms can be far smarter than the "brute force" approach of downsizing, by choosing what information to discard and what to keep, likely giving more "IQ per MB" in the final file.
I don't think that either JPEG compression or downscaling is the route to go for noise reduction.

1. Decide what output size/pixel-grid you are going to use
2. Scale, sharpen and NR into that grid in whatever order makes your tools produce visually pleasing results with minimal effort.

-h
Title: NR processing for noise reduction, then JPEG compression for size reduction
Post by: BJL on January 14, 2012, 06:18:00 pm
I don't think that either JPEG compression or downscaling is the route to go for noise reduction.
Agreed, and apologies for my poorly worded subject line. I proposed JPEG compression only if file size reduction is needed, after NR processing. My understanding is that this is safer against moire and such than just downsampling, and keeps the option open of zooming in on the details. And just printing the high MP file at high PPI is already useful for reducing visible noise.
Title: Re: NR processing for noise reduction, then JPEG compression for size reduction
Post by: eronald on January 14, 2012, 07:32:52 pm
what about jpeg2000

Edmund

Agreed, and apologies for my poorly worded subject line. I proposed JPEG compression only if file size reduction is needed, after NR processing. My understanding is that this is safer against moire and such than just downsampling, and keeps the option open of zooming in on the details. And just printing the high MP file at high PPI is already useful for reducing visible noise.
Title: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: BJL on January 15, 2012, 11:38:11 am
what about jpeg2000
Indeed, why do we not see a lot more of JPEG2000, which seems unequivocally superior to the older JPEG, and is used quite a bit for archival image storage, using its lossless compression mode? Maybe a thread on "going beyond JPEG" would be interesting. JPEG2000 does seem to offer full respect for the bit-depth of the source data, and a graceful trade-off of IQ vs file size by discarding the higher-frequency parts of its wavelet-encoded image.

But then again, I have read several comparisons arguing that both PNG and the intra-frame compression mode of H.264 out-perform JPEG2000, and since H.264 intra-frame is used in some higher-level video cameras, there is a tantalizing convergence for a video-plus-stills workflow.
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: Fine_Art on January 15, 2012, 03:00:23 pm
Indeed, why do we not see a lot more of JPEG2000, which seems unequivocally superior to the older JPEG, and is used quite a bit for archival image storage, using its lossless compression mode? Maybe a thread on "going beyond JPEG" would be interesting. JPEG2000 does seem to offer full respect for the bit-depth of the source data, and a graceful trade-off of IQ vs file size by discarding the higher-frequency parts of its wavelet-encoded image.

But then again, I have read several comparisons arguing that both PNG and the intra-frame compression mode of H.264 out-perform JPEG2000, and since H.264 intra-frame is used in some higher-level video cameras, there is a tantalizing convergence for a video-plus-stills workflow.

Agree. H.264 should be the new standard for replacing jpg as most cameras that do video should have the standard in hardware. TVs are now going 4x 1920x1080p resolution. I can't wait to see my pictures in a 7680x4320 frame on the wall.
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: hjulenissen on January 15, 2012, 04:18:37 pm
TVs are now going 4x 1920x1080p resolution. I can't wait to see my pictures in a 7680x4320 frame on the wall.
4k is 4x the pixels of 1080p, or about 4000x2000 pixels. And one 80" monster was revealed at CES a few days ago at an unknown price.

No content is available or suggested, the benefit is supposed to be upscaling 1080p.

With any luck, we will end up with tvs that do 4k, 120fps internally but with a HDMI input (and content) that is limited to 1080p24 and 1080p60.

-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: hjulenissen on January 15, 2012, 04:25:44 pm
Indeed, why do we not see a lot more of JPEG2000,
Used as a JPEG replacement it seems to offer too little at too high a price.

Say that you have an existing jpeg-based digital camera. You want less compression artifacts.

Option #1. Implement/purchase software/hardware that does jpeg2k encoding (using wavelets and very different tech from all other image/video coding). Deal with patent/ip issues. Deal with customers that are unable to read their files on the PC/mac/tv with usb image reader/...

Option #2. Increase the jpeg file-size by 30% and ask the user to buy a bigger memory card.

Which sounds like the easier sell? Add to this that the most demanding customers are probably using raw anyways.

-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: ErikKaffehr on January 16, 2012, 01:40:31 am
Hi,

36 MP DSLR shots would be quite nice on 8K, don't you think? I'm using a projector by the way, so I can project any size ;-)

Best regards
Erik


4k is 4x the pixels of 1080p, or about 4000x2000 pixels. And one 80" monster was revealed at CES a few days ago at an unknown price.

No content is available or suggested, the benefit is supposed to be upscaling 1080p.

With any luck, we will end up with tvs that do 4k, 120fps internally but with a HDMI input (and content) that is limited to 1080p24 and 1080p60.

-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: eronald on January 16, 2012, 02:57:24 am
Used as a JPEG replacement it seems to offer too little at too high a price.

Say that you have an existing jpeg-based digital camera. You want less compression artifacts.

Option #1. Implement/purchase software/hardware that does jpeg2k encoding (using wavelets and very different tech from all other image/video coding). Deal with patent/ip issues. Deal with customers that are unable to read their files on the PC/mac/tv with usb image reader/...

Option #2. Increase the jpeg file-size by 30% and ask the user to buy a bigger memory card.

Which sounds like the easier sell? Add to this that the most demanding customers are probably using raw anyways.

-h

That doesn't explain why it is not being used as a software save option, e.g. as a compression scheme integrated into TIFF.

Also, maybe Jpeg2000 or fractal compression does what I want, naturally.

Edmund
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: Fine_Art on January 16, 2012, 03:59:47 am
4k is 4x the pixels of 1080p, or about 4000x2000 pixels. And one 80" monster was revealed at CES a few days ago at an unknown price.

No content is available or suggested, the benefit is supposed to be upscaling 1080p.

With any luck, we will end up with tvs that do 4k, 120fps internally but with a HDMI input (and content) that is limited to 1080p24 and 1080p60.

-h

There is 8k along with 4k

UHDTV's main tentative specifications:[3]
Number of pixels: 7,680 × 4,320 (33.2 megapixels)
Aspect ratio: 16:9
Viewing distance: 0.75 H
Viewing angle: 100°
Colorimetry: under discussion
Frame rate: 120 FPS progressive scan
Bit depth: 12-bit per channel
Audio system: 22.2 surround sound
Sampling rate: 48 kHz, 96 kHz


There are also already IMAX movies, which are about the same resolution. More IMAX material is needed.
Maybe this is why Sony, as an electronics company, keeps pushing more pixels!

Added:
(http://www.ultrahdtv.net/wp-content/uploads/2008/02/ultra-hdtv.gif)

see also
http://www.ultrahdtv.net/sharp-ultrahdtv-prototype/ (http://www.ultrahdtv.net/sharp-ultrahdtv-prototype/)
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: Fine_Art on January 16, 2012, 04:02:11 am
Hi,

36 MP DSLR shots would be quite nice on 8K, don't you think?

Best regards
Erik



Exactly.
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: hjulenissen on January 16, 2012, 04:36:27 am
There is 8k along with 4k
Sure, there might even be 16k, 32k or some other 2^N k spec down the line. Your post could be read as if this were a technology that is available now, or one that would certainly be available at sensible prices reasonably soon.

My post was a claim that it is not so.

As long as the digital cinema spec only does 2k@48fps and 4k@24fps (incidentally using jpeg2000), I don't see home cinema going any further soon (they would not dare having better specs at home than in theatres). Still-image is a different segment, and I could see them driving higher resolutions.

http://en.wikipedia.org/wiki/Digital_Cinema_Initiatives#Image_and_audio_capability_overview
"2048x1080 (2K) at 24 frame/s or 48 frame/s, or 4096x2160 (4K) at 24 frame/s
In 2K, for Scope (2.39:1) presentation 2048x858 pixels of the imager is used
In 4K, for Scope (2.39:1) presentation 4096x1716 pixels of the imager is used
Stereoscopic 3D Image:
2048x1080 (2K) at 48 frame/s - 24 frame/s per eye (4096x2160 4K not supported)
"
-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: hjulenissen on January 16, 2012, 04:40:02 am
That doesn't explain why it is not being used as a software save option, e.g. as a compression scheme integrated into TIFF.
If most customers don't want it, researchers think that it does not bring enough of a benefit, marketers don't know how to advertise it, and economists worry about licensing costs, it probably won't be introduced.

I think that the combination of jpeg for small file-size delivery and lossless formats for archiving works very well for me and many others.
Quote
Also, maybe Jpeg2000 or fractal compression does what I want, naturally.
What do you mean?

-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame
Post by: hjulenissen on January 16, 2012, 04:48:13 am
36 MP DSLR shots would be quite nice on 8K, don't you think? I'm using a projector by the way, so I can project any size ;-)
Yes. I think it is the only feasible way we may someday be able to view the full DR of new sensors/HDR without excessive tonemapping. Probably direct-lit displays and not projectors, though.

-h
Title: Re: JPEG, JPEG2000, PNG, H.264 intra-frame ... and JPEG-XR
Post by: BJL on January 16, 2012, 11:25:29 am
Used as a JPEG replacement it seems to offer too little at too high a price.
...
Add to this that the most demanding customers are probably using raw anyways.
That sounds right: JPEG and TIFF are ubiquitous and between them meet most people's needs ... and those who want more go for raw.

And there is no commercial motive to drive JPEG2000 adoption, whereas several other recent alternatives have big corporations behind them while being "open" in the limited sense of the format definitions being published and usable free of license fees.

1. Microsoft has had its "HD Photo" format adopted as the JPEG-XR standard, with a promise of fee-free patent licensing, and is adding support for that to its software. JPEG-XR does not use wavelets like JPEG2000, but has the option of a two-layer system of 4x4 and 16x16 blocks, which achieves some of the wavelet virtues without the memory bandwidth needs of the inherently global wavelet transform.

2. Google is pushing yet another alternative, WebP, but maybe just for web-page display.

3. As someone said, hardware support for H.264 is becoming common, and one trend these days is that video drives technological changes more than still photography, so maybe the attraction of full still-quality frame grabs from an H.264-intra video will favor that format.
Title: Re: Best downsize to reduce noise?
Post by: ondebanks on January 17, 2012, 12:25:03 pm
Bill, here is my question. If I have two 16x20 prints viewed at 20 inches and from the same image, but one was from the original file and one was down sampled, would I see a difference in noise? Would the visual system do the basic averaging work of down sampling?

Yes, you would see a reduction in noise - especially if the down sampling used something like the median of values within each block of pixels, implicitly rejecting outlying values, which in most cases means rejecting noise.
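A small NumPy sketch of that kind of median binning (2x2 blocks; the test image is synthetic):

Code:
import numpy as np

def median_bin_2x2(channel):
    """Downsample one channel by 2 in each direction using the median of each 2x2 block."""
    h, w = channel.shape
    h, w = h - h % 2, w - w % 2                              # trim to even dimensions
    blocks = channel[:h, :w].reshape(h // 2, 2, w // 2, 2)
    return np.median(blocks, axis=(1, 3))

rng = np.random.default_rng(4)
noisy = 100.0 + rng.normal(0.0, 15.0, size=(400, 400))
noisy[rng.random(noisy.shape) < 0.01] = 255.0                # sprinkle in some outlier "hot" pixels

binned = median_bin_2x2(noisy)
print("std before:", noisy.std(), " std after:", binned.std())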

But more obviously than that, you would see the reduction in resolution.

And if you are not standing close enough (or the photo is not enlarged enough) to see the reduction in resolution, then you are not able to perceive the full resolution of the original either...nor its higher noise at the pixel level - I guess this is what you mean by "would the visual system do the basic averaging work of down sampling".

It's always possible to trade pixel resolution for noise, and vice-versa (the D3X designers knew they were doing this, in comparison to the D3).

Ray
Title: Re: Best downsize to reduce noise?
Post by: ondebanks on January 17, 2012, 12:59:14 pm
I think of noise as variance, or standard deviation, or some statistical measure of variation in image values (take your pick).  The 1-pixel image has zero variation.  Hence zero noise.

Same thing happens in the real world when you view something with texture (like a rock or brick wall) and then move away from it.  Eventually you will not see the texture anymore and you'll just see a flat tone.

"The 1-pixel image has zero variation.  Hence zero noise." - Actually, it has noise.
A measurement does not have to have spatial extent (more than 1 pixel, i.e. sampling in image space) for there to be both noise and signal present. In fact, information theory says that there must be noise present.

If I mask off all except 1 pixel on an imaging sensor (let's keep it simple and give it perfect 100% quantum efficiency), and then fire exactly 10,000 photons at that pixel, and it spits out a count of 10,003 electrons...is there still "zero noise"? No...readout noise has added a random extra component, 3 electrons in this case.

But you are right to "think of noise as variance, or standard deviation, or some statistical measure of variation in image values". Where is the variation here? It's temporal, not spatial. If I keep repeating the 10,000 photons experiment, I will keep getting different values - in a statistical distribution, the standard deviation of which is the camera's readout noise.

More realistically, I won't be able to release precisely 10,000 photons every time. Any light source will emit photons per Poisson statistics, so there will also be variations (of the order of the square root of 10,000 = 100) in the photon count reaching the sensor each time. Now I will get output counts like 9913, 10045, 10024, 9935, 10000, 9989, ...a broader distribution, due to the two sources of noise, one external and one internal to the single pixel.
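A quick simulation of that broader distribution (mean flux and read noise as in the example above, both hypothetical):

Code:
import numpy as np

rng = np.random.default_rng(5)
mean_flux = 10_000      # true photons per exposure
read_noise = 3.0        # electrons RMS, as in the example above

# Ten single-pixel exposures: Poisson shot noise plus Gaussian read noise
counts = rng.poisson(mean_flux, size=10) + rng.normal(0.0, read_noise, size=10)
print(np.round(counts).astype(int))

# Over many trials the scatter is ~sqrt(10000 + 3^2) ~ 100, dominated by shot noise
many = rng.poisson(mean_flux, 100_000) + rng.normal(0.0, read_noise, 100_000)
print("standard deviation over many trials:", many.std())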


Noise simply means that you cannot exactly measure the correct amount of a signal. There are plenty of light detectors (like photomultiplier tubes) which give no spatial information - they are essentially single-pixel devices. If these gave measurements which were truly noiseless, then using one of these on a telescope would allow me to perfectly measure the brightness of every source in the night sky, right out to the dimmest and furthest galaxies. Such magic would make doing astronomy trivial! Alas, physics says it ain't so...

Ray
Title: Re: Best downsize to reduce noise?
Post by: hjulenissen on January 17, 2012, 02:47:27 pm
"The 1-pixel image has zero variation.  Hence zero noise." - Actually, it has noise.
A measurement does not have to have spatial extent (more than 1 pixel, i.e. sampling in image space) for there to be both noise and signal present. In fact, information theory says that there must be noise present.

If I mask off all except 1 pixel on an imaging sensor (let's keep it simple and give it perfect 100% quantum efficiency), and then fire exactly 10,000 photons at that pixel, and it spits out a count of 10,003 electrons...is there still "zero noise"? No...readout noise has added a random extra component, 3 electrons in this case.

But you are right to "think of noise as variance, or standard deviation, or some statistical measure of variation in image values". Where is the variation here? It's temporal, not spatial. If I keep repeating the 10,000 photons experiment, I will keep getting different values - in a statistical distribution, the standard deviation of which is the camera's readout noise.

More realistically, I won't be able to release precisely 10,000 photons every time. Any light source will emit photons per Poisson statistics, so there will also be variations (of the order of the square root of 10,000 = 100) in the photon count reaching the sensor each time. Now I will get output counts like 9913, 10045, 10024, 9935, 10000, 9989, ...a broader distribution, due to the two sources of noise, one external and one internal to the single pixel.


Noise simply means that you cannot exactly measure the correct amount of a signal. There are plenty of light detectors (like photomultiplier tubes) which give no spatial information - they are essentially single-pixel devices. If these gave measurements which were truly noiseless, then using one of these on a telescope would allow me to perfectly measure the brightness of every source in the night sky, right out to the dimmest and furthest galaxies. Such magic would make doing astronomy trivial! Alas, physics says it ain't so...

Ray

You put it more clearly than me: temporal and spatial components of noise.

Is non-zero-mean measurement error noise? Is a deterministic bias noise?

Can we make strict definitions that clearly separate errors that are a linear function of input (e.g. diffraction), from non-linear errors (sensor saturation), from noise (photon shot-noise)?

-h
Title: Re: Best downsize to reduce noise?
Post by: ejmartin on January 17, 2012, 03:16:28 pm
"The 1-pixel image has zero variation.  Hence zero noise." - Actually, it has noise.


Yes, but it doesn't look noisy -- even if you look close up    ;D
Title: Re: Best downsize to reduce noise?
Post by: BJL on January 17, 2012, 03:24:34 pm
And if you are not standing close enough (or the photo is not enlarged enough) to see the reduction in resolution, then you are not able to perceive the full resolution of the original either...nor its higher noise at the pixel level - I guess this is what you mean by "would the visual system do the basic averaging work of down sampling".
This is the scenario of interest to me: the OP was about situations where 24MP is more than enough, and I will take that to mean situations where the proposed lower pixel count of 6MP or whatever provides sufficient resolution. For example, something like viewing the original 6000x4000 and down sampled 3000x2000 versions with 12"x8" prints from a distance of 20", or 6"x4" prints from 10", so with still a healthy 5000 "pixels per viewing distance" for the down-sampled version.

So my follow-up question is whether, once the lower pixel count versions has as much resolution as the eye can use, would our visual system's blurring of the finer detail in the 24MP image do about as good a job of noise reduction?


Aside: One subtle point: our eye-balls jiggle slightly for purposes like edge-detection, making it unreliable to judge the eye's  resolving power from rod-cone density alone. That is, the human eye operates in a way that enhances _edge-sharpness_, possibly at the expense of _resolving power_ with lower contrast details.
Title: With apologies to Dretske
Post by: LKaven on January 17, 2012, 06:20:21 pm
"The 1-pixel image has zero variation.  Hence zero noise." - Actually, it has noise.

You've also moved into the /semantic/, or /representational/ theory of information in an interesting way here.  Since that single bit is not reliably caused to be "1" by signal alone, it may be said to have a conjunctive content.

The question is framed this way: if you take a photograph of a homogeneous black field with a single-pixel, one-bit camera, and the camera registers a "1", then what is the semantic or representational content of the "1"?

Framed this way, you can explain the fact that "it doesn't look noisy" (as Emil observes) with recourse to the semantic content.  The "1" bit does not mean just "black expanse" because of the underlying noise content which only by chance did not predominate.
Title: Re: Best downsize to reduce noise?
Post by: LKaven on January 17, 2012, 06:42:50 pm
Aside: One subtle point: our eye-balls jiggle slightly for purposes like edge-detection, making it unreliable to judge the eye's  resolving power from rod-cone density alone. That is, the human eye operates in a way that enhances _edge-sharpness_, possibly at the expense of _resolving power_ with lower contrast details.

Yes, feature detectors in the brain for edge-and-orientation detection were the subject of Hubel and Wiesel's Nobel Prize-winning paper.  Edges in various orientations are some of the most salient things in our visual field, and singular disruptions to a pattern are salient.  This was tested in the cat striate cortex, but one expects the same is true for humans. 

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1363130/
Title: Re: With apologies to Dretske
Post by: ondebanks on January 17, 2012, 07:00:10 pm
You've also moved into the /semantic/, or /representational/ theory of information in an interesting way here.  Since that single bit is not reliably caused to be "1" by signal alone, it may be said to have a conjunctive content.

The question is framed this way: if you take a photograph of a homogeneous black field with a single-pixel, one-bit camera, and the camera registers a "1", then what is the semantic or representational content of the "1"?

Framed this way, you can explain the fact that "it doesn't look noisy" (as Emil observes) with recourse to the semantic content.  The "1" bit does not mean just "black expanse" because of the underlying noise content which only by chance did not predominate.

Hey Luke, go easy on me. I'm just a physicist. We don't do semantics. ;) 

Ray
Title: Re: Best downsize to reduce noise?
Post by: ErikKaffehr on January 19, 2012, 12:50:52 am
Hi,

So what you say is that 9913 = 10045 with some probability ;-)

You may also miss the issue, we can very effectively reduce noise by binning all pixels into one pixel. With 24 MP and 10000 photons per pixel SNR will be around 489000, pretty good. Of course some MTF will be lost in the process. Honestly, all MTF will be lost, but in marketing terms it's just a minor loss, like a few percent.

We then expand the pixel using a decent algorithm like GF, that actually creates new detail, and print at A2 at 720PPI.

The resulting image will be a great piece of art with a wide range of possible interpretations. I cannot see any problems, except some singularities.

We have seen very nice pictures produced with similar algorithm from NASA clearly indicating that the technique actually works:

(http://www.johnnyronnberg.com/astrowebb/rymdfart/sonder/teleskop/kepler/bilder/111206/kepler22b.jpg)

The above image correctly depicts the planet Kepler 22b with a probability in the interval [0,1].

Best regards
Erik


"The 1-pixel image has zero variation.  Hence zero noise." - Actually, it has noise.
A measurement does not have to have spatial extent (more than 1 pixel, i.e. sampling in image space) for there to be both noise and signal present. In fact, information theory says that there must be noise present.

If I mask off all except 1 pixel on an imaging sensor (let's keep it simple and give it perfect 100% quantum efficiency), and then fire exactly 10,000 photons at that pixel, and it spits out a count of 10,003 electrons...is there still "zero noise"? No...readout noise has added a random extra component, 3 electrons in this case.

But you are right to "think of noise as variance, or standard deviation, or some statistical measure of variation in image values". Where is the variation here? It's temporal, not spatial. If I keep repeating the 10,000 photons experiment, I will keep getting different values - in a statistical distribution, the standard deviation of which is the camera's readout noise.

More realistically, I won't be able to release precisely 10,000 photons every time. Any light source will emit photons per Poisson statistics, so there will also be variations (of the order of the square root of 10,000 = 100) in the photon count reaching the sensor each time. Now I will get output counts like 9913, 10045, 10024, 9935, 10000, 9989, ...a broader distribution, due to the two sources of noise, one external and one internal to the single pixel.


Noise simply means that you cannot exactly measure the correct amount of a signal. There are plenty of light detectors (like photomultiplier tubes) which give no spatial information - they are essentially single-pixel devices. If these gave measurements which were truly noiseless, then using one of these on a telescope would allow me to perfectly measure the brightness of every source in the night sky, right out to the dimmest and furthest galaxies. Such magic would make doing astronomy trivial! Alas, physics says it ain't so...

Ray

Title: Re: Best downsize to reduce noise?
Post by: ondebanks on January 19, 2012, 05:44:18 am
You may also miss the issue, we can very effectively reduce noise by binning all pixels into one pixel. With 24 MP and 10000 photons per pixel SNR will be around 489000, pretty good. Of course some MTF will be lost in the process. Honestly, all MTF will be lost, but in marketing terms it's just a minor loss, like a few percent.

We then expand the pixel using a decent algorithm like GF, that actually creates new detail, and print at A2 at 720PPI.

The resulting image will be a great piece of art with a wide range of possible interpretations. I cannot see any problems, except some singularities.

We have seen very nice pictures produced with similar algorithm from NASA clearly indicating that the technique actually works:

The above image correctly depicts the planet Kepler 22b with a probability in the interval [0,1].

Best regards
Erik

 :D :D Very good, Erik! Of course there are also those CSI-style TV/movie blunders where they can somehow magically take a low-res still of a crowd and "enhance...ok, zoom in to that person...enhance...zoom in on his hand...enhance...yes, just that knuckle...enhance...enhance more...zoom again...there! see? there's at least a milligram of the victim's blood dried onto that hair follicle!"

Unfortunately this goes back to "Bladerunner", which is otherwise an outstanding movie.

So what you say is that 9913 = 10045 with some probability ;-)

Not quite: 9913 can only equal 9913. What one can say is that 9913 and 10045 are both independent attempts (samples) to measure the underlying mean flux rate of 10000 per exposure. Or that 9913 and 10045 are both drawn from an approximately Poisson distribution with a mean of 10000 and a standard deviation of slightly more than 100 (including some low amount of readnoise).

In normal circumstances, you don't know that the true underlying flux rate is 10000 photons per exposure (if you knew, why would you bother trying to measure it and keep getting it slightly wrong?). But you really want to come as close as possible to finding out that rate; the closer you get, the more you've reduced the effect of noise. And you do know that the maximum-likelihood estimate (MLE) of this rate is simply the average of all the samples you've measured - so the more samples you obtain, the closer your MLE gets to the true value; the better the signal to noise.
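A short simulation of that last point (the scatter of the sample mean shrinking roughly as 1/sqrt(N)):

Code:
import numpy as np

rng = np.random.default_rng(6)
true_rate = 10_000

for n_samples in (1, 4, 16, 64, 256):
    # 2000 independent experiments, each averaging n_samples exposures
    means = rng.poisson(true_rate, size=(2000, n_samples)).mean(axis=1)
    print(n_samples, "exposures: scatter of the mean =", round(means.std(), 1))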

Ray