Luminous Landscape Forum

Site & Board Matters => About This Site => Topic started by: Dave Millier on February 15, 2012, 02:18:06 pm

Title: Naked sensor
Post by: Dave Millier on February 15, 2012, 02:18:06 pm
Interesting article from Sean.

However, I feel he has understated the problems that come from omitting the AA filter. As a (not so proud) owner of a Kodak 14n and several Sigmas, it's very obvious to me that there is quite a bit more to it than colour moire. The Kodak is afflicted by a number of pixel-level distortions, the well-known "christmas tree lights" being an obvious example (it takes the form of coloured speckles, points, lines and threads that occur on regularly repeating patterns). Those who think it only occurs on fabric or fly screens have obviously not tried to do landscapes with a Kodak - it happens on rock, brickwork, twigs, even shingle beaches.

Sean's claim about the Foveon sensor seems wrong to me too. Luminance aliasing occurs just as readily, even if it isn't so immediately obvious. Jaggies, diagonal hatching and rope-like artifacts on thin lines are readily discernible on some images. This test shot of mine has a beautiful demonstration in the spacing of the gaps in the balustrade as you look from right to left, and also in the diagonal tiling pattern on the orange roof. You'll have to trust me on this one, but the tiling is actually a conventional horizontal pattern; the sensor has invented the diagonal stripes (indeed, I suspect that this false data often, by pure fluke, makes the Foveon resolution seem higher than it actually is):

(http://www.whisperingcat.co.uk/alias1.jpg) 

One thing I've wondered for some time, since we have had cameras with in-camera stabilisation, is why the IS can't be used in reverse, to give the sensor a little shake just sufficient to blur the optical image to match the sensor resolution. Then we could safely do away with AA filters without getting images full of jaggies.
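As a toy illustration of the idea (a sketch of my own, not any camera's actual implementation): a blur of roughly one coarse-pixel width applied before sampling, standing in for the proposed sensor shake, suppresses the false low-frequency pattern that naive sampling invents.

```python
import numpy as np

# Fine "optical image": a sinusoid well above the Nyquist limit of the
# coarse sampling grid below, so naive sampling must alias.
x = np.arange(4000)
pattern = np.sin(2 * np.pi * 0.43 * x)   # 0.43 cycles per fine pixel

step = 10                                # the "sensor" samples every 10th point
naive = pattern[::step]                  # no pre-filter: aliases to a false
                                         # low-frequency pattern at full swing

# "Shaken" sensor: each sample integrates over one coarse pixel width,
# i.e. a box blur applied before sampling.
kernel = np.ones(step) / step
filtered = np.convolve(pattern, kernel, mode="valid")[::step]

# The aliased samples keep nearly the full +/-1 swing; the pre-blurred
# samples collapse toward the mean.
print(np.ptp(naive), np.ptp(filtered))
```

The detail is gone either way; the difference is whether it is replaced by grey or by an invented pattern.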
Title: Re: Naked sensor
Post by: deejjjaaaa on February 15, 2012, 02:25:49 pm
One thing I've wondered for some time, since we have had cameras with in-camera stabilisation, is why the IS can't be used in reverse, to give the sensor a little shake just sufficient to blur the optical image to match the sensor resolution.

well, Pentax was rumored to provide different vertical/horizontal AA blurring strengths (through the AA filter) in the K10D to account for sensor shake caused by shutter-induced vibration in its less-than-perfect SR implementation... trying to make use of that deficiency, so to say.
Title: Re: Naked sensor
Post by: BJL on February 15, 2012, 02:31:27 pm
Thanks for the tile example of color-less aliasing, and your other comments, which make sense to me. [Added: in particular, all digital (all discrete sampling) has aliasing unless suitable low pass filtering is applied, it is only color aliasing like moiré that is related to specific color sampling strategies like the Bayer CFA.]

On this point
One thing I've wondered for some time, since we have had cameras with in-camera stabilisation, is why the IS can't be used in reverse, to give the sensor a little shake just sufficient to blur the optical image to match the sensor resolution. Then we could safely do away with AA filters without getting images full of jaggies.
check out this thread that I started recently: Ultrasonic sensor jiggling for AA effect (http://www.luminous-landscape.com/forum/index.php?topic=62429.0)
Title: Re: Naked sensor
Post by: theguywitha645d on February 15, 2012, 04:26:07 pm
After reading the article, I think photographers don't understand significant figures. They seem to think that keeping every decimal place during a calculation is required.

This obsession with keeping every possible piece of detail, even when that detail can only be perceived at home in front of a monitor at 100%, is getting a little silly. It has even come to the point where photographers are afraid to open up or stop down a lens to get the ideal DoF for an image rather than lose "pixel-level detail" at the lens's "sweet spot", even when this detail will never be perceived in a print. Perhaps manufacturers can now save some money by just putting a fixed f/11 aperture in each lens and be done with it.
Title: Re: Naked sensor
Post by: peterzpicts on February 15, 2012, 06:04:00 pm
Great article, Sean. Losing the AA filter makes lots of sense; I have been living AA-free with my SD14 for almost 4 years. Although I think the 800 lb gorilla in the corner being ignored is that the Foveon needs none of the lateral colour reconstruction required by CFA sensors. This is the second part of what gives the X3 its look, or "presence" as I like to call it.
I am not in denial of the quirks of the Sigma line, if Sigma could get them under control they could really make some headway.
Pete
Title: Re: Naked sensor
Post by: Graeme Nattress on February 15, 2012, 08:23:09 pm
Talking about chroma moire as if it were the only issue when losing an OLPF on a Bayer CFA misses the point that luma aliasing can also be a pretty nasty problem, and unlike chroma moire, it is practically impossible to remove. Luma moire affects all types of sensor systems - the three-chip system used in video cameras, the RGB stripe pattern used by Sony, the Foveon system, as well as the Bayer CFA traditionally used in digital cameras.

If you have sufficiently high resolution on your sensor that you don't need an OLPF, then that only means that there is sufficient optical filtering occurring elsewhere in the system, be it the lens MTF, or diffraction from the aperture.

These concerns apply doubly to motion, where any aliasing or moire is more visible because movement in the system causes the patterns to move in the opposite direction. This causes issues for motion-adaptive codecs on the distribution side, and the bad effects of the aliasing are even harder to remove because of the motion.

Title: Re: Naked sensor
Post by: BernardLanguillier on February 15, 2012, 08:33:20 pm
Nikon surely has to be praised for being the first camera manufacturer to:

1. Provide both options,
2. Openly speak about the possible issues with the AA-filter-less version.

For those photographers who think that having a credible backup is an essential part of their operations, getting one of each might be the best of both worlds.  ;D

Cheers,
Bernard
Title: Re: Naked sensor
Post by: bjanes on February 16, 2012, 09:35:41 am
Great article, Sean. Losing the AA filter makes lots of sense; I have been living AA-free with my SD14 for almost 4 years. Although I think the 800 lb gorilla in the corner being ignored is that the Foveon needs none of the lateral colour reconstruction required by CFA sensors. This is the second part of what gives the X3 its look, or "presence" as I like to call it.
I am not in denial of the quirks of the Sigma line, if Sigma could get them under control they could really make some headway.

Sean makes a big deal of the fact that with the M9 Leica did not want to ruin the resolution of their excellent lenses with a low-pass filter. However, I understand that because of the short flange-to-sensor distance of the rangefinder (no retrofocus design needed to allow for mirror movement), it would be very difficult to include a low-pass filter. Indeed, with the M8 they couldn't even include an infrared filter. That short distance also creates problems with lens cast, especially with wide-angle lenses.

Regards,

Bill
Title: Re: Naked sensor
Post by: KirbyKrieger on February 16, 2012, 11:41:56 am
After reading the article, I think photographers don't understand significant figures. They seem to think that keeping every decimal place during a calculation is required.

This obsession with keeping every possible piece of detail, even when that detail can only be perceived at home in front of a monitor at 100%, is getting a little silly. It has even come to the point where photographers are afraid to open up or stop down a lens to get the ideal DoF for an image rather than lose "pixel-level detail" at the lens's "sweet spot", even when this detail will never be perceived in a print. Perhaps manufacturers can now save some money by just putting a fixed f/11 aperture in each lens and be done with it.

I'm going to clumsily wade into what I half-discern to be newly roiled waters plashing on the sands surrounding Michael's luminous lake.  (Partial disclosure  ;) : my degree is in rhetoric; my advanced degree is in painting; and that winky emoticon needs a skilled make-over.)

theguywitha645d makes a good point, poorly.  The point made is that the differences being discussed may not be perceivable in a print.  The way it was made was to:
 - demean the author by generalizing to "photographers",
 - demean the author by presenting the criticism as a failure to understand basic science
 - falsely strengthen the claim by presenting the straw man "the afraid photographer"
 - and then lather the false argument with the pepper-jelly of "knowing" sarcasm

I see this unmannered bullying behavior regularly.  I see it more and more frequently on LuLa.  It has no place in public discussion.  Imho.

Sean Reid carefully specifies that the differences he discusses are perceivable.  (I don't recall whether he differentiates between on-screen and printed.)  theguywitha645d's disputation might be better expressed as "IME, I don't see these differences in my prints.  Could you confirm that you do, at what size, and for what audience is this important?"

In general, I encourage all on-line participants to not make attempts to win arguments, but rather to refine knowledge.  Our tribal beavering is best devoted to chipping away the bark to reveal the pith.  Ask and answer specific questions.  Use your intellectual bite to make sense, not scents.

Title: Re: Naked sensor
Post by: welder on February 16, 2012, 12:07:51 pm
Hmm. I love detailed photos as much as any photographer. But my gut feeling is this whole business about removing the AA filter is a bit overhyped. From what I've seen in comparisons that actually use the same camera with the only difference being the AA filter removed in one, the gain in resolution is rather small, and the resulting artifacts, when they appear, are a bit garish to my eye. The moire patterns are maybe not such a big deal because they can be corrected, but the aliasing and jagged edges are harder to deal with. The images are sharper, sure, but they tend to feel less organic. Or maybe I'm just looking at the wrong examples  :-\
Title: Re: Naked sensor
Post by: billh on February 16, 2012, 03:01:37 pm
Kirby,
I love this! Both the “rhetoric” and the very welcome points made by your beautiful prose. I was a science major and hadn’t a clue a major like rhetoric existed.
I used the Ricoh A12 for a week and compared the images to those I took with a Sony NEX-5n and 7. The images from the Ricoh convinced me to order the D800E instead of the D800. I’ve read various explanations saying what we are seeing are artifacts, not actual image detail, but for me the bottom line is that the images made without AA filters look as if they have more detail in them than those from sensors with AA filters. There seems to be an inherent desire lurking within many photographers to extract the most detail possible. In the past we used slow films, experimented with various developers, and when possible used a large format camera. Perhaps these new cameras sans AA filters are the fine-grain film of the past.
Title: Re: Naked sensor
Post by: theguywitha645d on February 16, 2012, 04:21:33 pm
Use your intellectual bite to make sense, not scents.



My degree is not in rhetoric, but imaging. But let me try again. At least let me try to frame the issues.

The problem of significant figures is applicable in imaging. Just because an effect can be measured does not mean it is significant, just as being able to do the math does not mean every decimal place should be kept.

Let's go into some basics. Image quality is subjective in that it is based on a viewer - there is no absolute frame in which to judge image qualities outside the reference of a human observer. Because of this, the idea of a standard viewing distance is important. The standard viewing distance is defined as a distance equal to the diagonal of the print/display image. You can use different definitions depending on your criteria, but the standard viewing distance works well for how folks view an image and should be sufficient for this topic - I see nothing in Reid's essay where he applies any special criteria to a photograph.

A 300dpi 8x10 inch print viewed at about 10 inches will appear as a continuous-tone reproduction - the pixels will not be resolved. Because of the standard viewing distance, a 150dpi 16x20 print viewed at 20 inches or a 75dpi 32x40 print viewed at 40 inches will appear the same, and the pixels will still not be resolved. So the issue with image quality is really not that you need more pixels when you print larger; rather, once you have reached a certain number of pixels, the angular size of the pixels falls below what an observer can resolve. If you do the math, when you reach about 7.2MP you have a photo-quality image that can be printed at any size.
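The arithmetic behind that 7.2MP figure can be written out in a few lines (the helper names are mine, purely illustrative; the constants are the ones above):

```python
def megapixels(width_in, height_in, ppi):
    """Pixel count needed to print width x height inches at a given ppi."""
    return width_in * ppi * height_in * ppi / 1e6

def apparent_ppi(ppi, viewing_distance_in, reference_distance_in=10):
    """ppi normalized to the 10-inch reference distance: doubling the
    viewing distance makes a print look twice as finely sampled."""
    return ppi * viewing_distance_in / reference_distance_in

print(megapixels(8, 10, 300))   # 7.2 -- the photo-quality "limit"
print(apparent_ppi(150, 20))    # 300.0 -- 16x20 at 20 in matches
print(apparent_ppi(75, 40))     # 300.0 -- 32x40 at 40 in matches
```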

Are there reasons to have more than 7.2MP? Yes, simply because the human visual system can work beyond its resolving-power limits. Detection, for example, is very powerful in the visual system - think power cables in the sky or cell towers on the horizon. This is very different from resolving line pairs, which means separating a pattern at a given frequency; detection is just noticing a line feature in a uniform field. So by going to high-resolution sensors we can perceive finer details, and those details will have better contrast. But like anything, the human visual system is limited.

Another good reason for a standard viewing distance is that subjective image qualities such as depth of field and sharpness can be judged - both qualities change with the ratio of viewing distance to print size. Because images from different formats are enlarged by different amounts to reach a display size, the circle of confusion (CoC), which defines both sharpness and depth of field, is based on format and not on pixel pitch. An 18MP image that looks sharp at the standard viewing distance will still look sharp if I divide each pixel into four, creating a 72MP image. Whether an image looks sharp at the pixel level does not matter (the 18MP image will look sharper than the up-rezzed 72MP image at 100%); what matters is whether the image looks sharp when viewed.

Let's look at Reid's basic hypothesis for not wanting an AA filter. His argument is that you should get everything from your lens. Why? There is nothing to suggest that the human visual system can perceive infinite amounts of detail. There is a limit to our perception. Already a 7.2MP image will make a photo-quality image where the pixels are unresolved.

Secondly, is it even possible? The resolving power of the lens is not simply a property of the lens: target contrast directly impacts resolving power - the lower the contrast, the lower the resulting resolving power. Only the plane of focus would carry the information he is talking about; everything else will be something less. But DoF is a real quality, and something less sharp than the plane of focus can still be acceptably sharp. And if I want a certain amount of DoF by either opening up (where aberrations impact detail) or stopping down (where diffraction impacts detail), how is anyone supposed to get "the most" out of their optics? How is the AA filter going to have a significant impact when so many variables are in the system? How can you spot the impact of the AA filter from single images?

What is more important, the smallest details or the perception of the image in its entirety? People want a pleasing image, and will not and cannot see the stuff that you can see at a 100% monitor view.

Here is some of my experience with this. People were really impressed by the wonderful 6'-high Darwin poster I created for his birthday - he did not come, you understand. The print was made from a 700-pixel-tall web image. I routinely print MFD files on 44" roll paper; no one has been able to spot the difference between a 22MP sensor and a 40MP sensor. 20" prints from m4/3 files look beautiful. Visitors to the Imaging Center I work at are amazed at the detail of a 24" microscope image taken with a 4MP camera - they are always disappointed it is 4MP because it does not look that way to them. And when I print my MFD files at 44x58, I cannot even see the detail that is in the file unless I put my face into the image - hardly a proper viewing condition, and one I have never seen tried by a viewer.

Now, some of these cameras have an AA filter and some do not. Looking at large prints (20+"), I see no evidence of an AA filter - all the images are sharp and detailed. I am not saying these prints/images were all the same, but the images in and of themselves are sharp and detailed.

I think Reid is making a fundamental error in his evaluation: he is evaluating images at 100% on a monitor. That is not a real-world viewing condition, and as the pixel count goes up, it moves further and further away from one. The variability in the photographic process is going to be far more significant than an AA filter - having an AA filter does not result in a soft image, just some loss of frequencies near the Nyquist limit. Reid's own argument suggests he really does not know; he is simply arguing that because he thinks he sees detail more clearly at 100% without an AA filter, that detail is significant. I have never read that this is true, nor do I have experience that it is true - my experience fits the theories I have studied. Reid himself does not really offer any support for his claim.
Title: Re: Naked sensor
Post by: Dave Millier on February 16, 2012, 05:33:14 pm
This is really interesting stuff, thank you.

I have something to add that may slightly contradict one aspect of what you say.

On the DPReview Sigma forum, where I spend too much time, there is (unsurprisingly) a great deal of attention spent on considering images on screen at 100%. Some Foveon fans claim that they are fans because the characteristics of the Foveon mean you can enlarge a great deal more than you can with a CFA sensor and achieve bigger, more detailed prints. But many more aren't interested in prints, only in viewing on screen at 100% (where the Foveon usually looks good).

I've questioned this habit in some depth, and it appears that quite a few people get their enjoyment of photographs not from admiring how skilful composition and lighting come together in the whole image to make a great picture, but rather from viewing at 100% and exploring all the details close up, scrolling through the image section by section.

I'm a print man myself and can't understand how someone can willingly give up the pleasure of viewing beautiful artistic composition in exchange for forensic examination of detail.  But this habit does seem surprisingly prevalent. The "wow" factor (in terms of sharpness and clarity) is seemingly valued more highly than composition.

This may well go far to explain why so few good photographs are shot on Foveon based cameras  ;)

Title: Re: Naked sensor
Post by: hjulenissen on February 16, 2012, 05:46:01 pm
Are there reasons to have more than 7.2MP? Yes, simply because the human visual system can work beyond its resolving-power limits. Detection, for example, is very powerful in the visual system - think power cables in the sky or cell towers on the horizon. This is very different from resolving line pairs, which means separating a pattern at a given frequency; detection is just noticing a line feature in a uniform field. So by going to high-resolution sensors we can perceive finer details, and those details will have better contrast. But like anything, the human visual system is limited.
I have a problem understanding how a system might work "beyond its resolving power limits". Surely detection cannot detect features so small that the human eye's optics smear them into a lump, or that cannot be "read" by at least two photosites with sufficient contrast?

I do get that the HVS may be non-linear, and that estimates of "resolving power" using a swept sine may or may not be easily reinterpreted to judge whether a power line will be shown with sufficient detail so as not to cause visible errors.

As a first approximation, I would assume that any camera system that does "proper" pre-filtering before sampling would be able to render a given range of spatial frequencies/edge rise-times with sufficient accuracy/contrast so as to be transparent (in this regard) up to a given display size/viewing distance.

-h
Title: Re: Naked sensor
Post by: billh on February 16, 2012, 07:11:47 pm
There sure seems to be a lot of energy going into explaining why cameras without AA filters should not be selected over those with AA filters.

I see information (detail) in images from sensors without AA filters that is absent from those with AA filters, when both images were taken with the same lens and aperture. If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as the one with the detail when printed? In the case where the AA-filtered image is simply blurred slightly, or less sharp, can this difference (the non-AA-filter advantage) be negated by post-processing (USM)?

Medium format digital cameras come without AA filters, and now cameras with smaller sensors are appearing without them. Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website (http://imaging.nikon.com/lineup/dslr/d800/features01.htm):

Optical low-pass filter optimized for sharpness on the D800

Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
The ultimate attention to detail — the D800E
Nikon engineers have developed a unique alternative for those seeking the ultimate in definition. The D800E incorporates an optical filter with all the anti-aliasing properties removed in order to facilitate the sharpest images possible.

Thanks!
Title: Re: Naked sensor
Post by: ErikKaffehr on February 17, 2012, 12:34:29 am
Hi,

An AA-filtered image needs more sharpening than an unfiltered image.

The point you make, that an unfiltered image has more detail, may depend on it producing fake detail. For instance, the structures on a feather (are those called "barbs"?) may be finer than the native resolution of the sensor (the Nyquist limit). In a correctly filtered image, detail beyond the sensor's resolution would be a gray mass, while an unfiltered image would "invent" detail at a lower frequency.

See the enclosed image; the red line shows the Nyquist limit, and all detail to the right of Nyquist is "fake". What you see is that detail "mirrors around" the Nyquist limit. The lower image is the same as the upper one, converted to monochrome.
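That "mirrors around the Nyquist limit" behaviour is just frequency folding, and is easy to check numerically (the frequencies here are made up for illustration, not taken from the test shots):

```python
import numpy as np

fs = 100.0          # sampling rate (stand-in for the sensor's pixel pitch)
nyquist = fs / 2

def alias_of(f, fs):
    """Frequency actually recorded when a sinusoid at f is sampled at fs:
    detail beyond Nyquist folds ("mirrors") back into the 0..fs/2 band."""
    f = f % fs
    return fs - f if f > fs / 2 else f

# A 60-cycle pattern, 10 past Nyquist, records as a 40-cycle pattern,
# 10 below Nyquist -- mirrored around the limit:
print(alias_of(60, fs))    # 40.0

# Verify by actually sampling: the two sampled waveforms are identical.
t = np.arange(0, 1, 1 / fs)
print(np.allclose(np.cos(2 * np.pi * 60 * t), np.cos(2 * np.pi * 40 * t)))
# True
```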

These images were shot with a Sony Alpha 55 SLT which probably has some AA-filtering but it seems to be weak. I have noticed aliasing in real images so I made these test shots to find out.

I also added a screen dump from Nikon's demo page for the Nikon D800/D800E, where extra sharpening was applied to the filtered image (as an OLPF-filtered image needs more sharpening). I don't think the non-AA-filtered image has better detail than the left one. Now you could also sharpen the right image a bit more and really have a "sharpening race".

Best regards
Erik



There sure seems to be a lot of energy going into explaining why cameras without AA filters should not be selected over those with AA filters.

I see information (detail) in images from sensors without AA filters that is absent from those with AA filters, when both images were taken with the same lens and aperture. If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as the one with the detail when printed? In the case where the AA-filtered image is simply blurred slightly, or less sharp, can this difference (the non-AA-filter advantage) be negated by post-processing (USM)?

Medium format digital cameras come without AA filters, and now cameras with smaller sensors are appearing without them. Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website (http://imaging.nikon.com/lineup/dslr/d800/features01.htm):

Optical low-pass filter optimized for sharpness on the D800

Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
The ultimate attention to detail — the D800E
Nikon engineers have developed a unique alternative for those seeking the ultimate in definition. The D800E incorporates an optical filter with all the anti-aliasing properties removed in order to facilitate the sharpest images possible.

Thanks!

Title: Re: Naked sensor
Post by: hjulenissen on February 17, 2012, 12:58:48 am
Why would the camera companies do this if there were not advantages that potentially override the disadvantages?
I am a cynic. I expect companies to introduce those products that (according to their predictions) cost the least to manufacture, can sell in the biggest numbers, and command the highest selling price.

Now, having the best possible "image quality" (whatever that is) certainly may help, but it is probably not the only thing that major camera manufacturers think about. For many products, it seems that it is more profit-efficient to include e.g. a swiveling lcd-screen than to spend money on long-term sensor research (or to buy the most expensive model from those who did).

-h
Title: Re: Naked sensor
Post by: theguywitha645d on February 17, 2012, 10:56:17 am
I have a problem understanding that a system might work "beyond its resolving power limits". Detection cannot detect features that are small enough that the human eye optics smears it into a lump, or that cannot be "read" with at least two photo sites with sufficient contrast?

I do get that the HVS may be non-linear, and that estimates of "resolving power" using a swept sine may or may not be easily reinterpreted to judge if a power-line will be sufficiently detailed shown so as to not cause visible errors.

As a first approximation, I would assume that any camera system that does "proper" pre-filtering before sampling, would be able to render a given range of spatial frequencies/edge rise-times with sufficient accuracy/contrast so as to be transparent (in this regard) up to a given display size/vieweing distance.

-h

This is complicated, and I am not sure I understand it systematically enough to go into fine technical detail. However, there are roughly two issues: what the camera "sees" and what the viewer of the print sees. Here I am talking about why a file from a camera that exceeds the 7.2MP "limit" can make a difference - there is no question that high-MP sensors impact the image.

The test target you choose influences the result. If I use a lines-per-mm target and a dots-per-mm target, I get different results. With lines per mm, I have a square wave that at the threshold of resolution turns into a sine wave. As we approach the limit of resolving power, not only does the frequency increase, but the amplitude decreases, from both the top and the bottom - the blacks and whites go gray. What if I put a single black line on a white field? This is the idea of detection. The amplitude also changes, but not in the same way: the white is still there, and only the black is making an amplitude change. Basically, I can see a single black line more easily than I can resolve a patch of lines of the same width.
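That asymmetry between detection and resolution can be sketched numerically, with a Gaussian blur as a crude stand-in for the eye's optics (all numbers are illustrative, not measured):

```python
import numpy as np

def blur(signal, sigma=4.0):
    """Gaussian blur as a crude stand-in for the eye's point-spread function."""
    r = int(4 * sigma)
    k = np.exp(-np.arange(-r, r + 1) ** 2 / (2 * sigma ** 2))
    return np.convolve(signal, k / k.sum(), mode="same")

n = 400
# Resolution target: fine grating of alternating 2-px black/white bars.
grating = np.where((np.arange(n) // 2) % 2 == 0, 0.0, 1.0)
# Detection target: a single 2-px black line on a uniform white field.
line = np.ones(n)
line[199:201] = 0.0

g, d = blur(grating), blur(line)
interior = slice(50, -50)             # ignore edge effects

# The grating's modulation collapses to near-uniform gray ...
print(np.ptp(g[interior]))            # ~0: the bars are no longer resolved
# ... while the lone line still leaves a clearly visible dip:
print(1.0 - d[interior].min())        # ~0.2: the line is still detectable
```

The same blur that erases the grating entirely leaves the single line with about 20% contrast against the white field.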
Title: Re: Naked sensor
Post by: BJL on February 17, 2012, 11:07:30 am
If the detail is present in one image and not the other, how can we expect the image with the missing detail to look the same as that with the detail when printed?
No one is saying that they will be the same; the debate is whether the combination of extra detail plus contamination by aliasing artifacts (not just moiré) is better or worse. As a teacher, I prefer an honest "I don't know" to a wrong answer, but that is getting too philosophical. The choice is clearly one where different photographers and different photographic situations will lead to different choices.

Quote
In the case where the AA-filtered image is simply blurred slightly, or less sharp, can this difference (the non-AA-filter advantage) be negated by post-processing (USM)?
To a good extent, yes, but I will let others demonstrate and debate that. Looking at "straight, minimally processed raw" might appeal to some puritanical sense of "Photographic Correctness" (PC), but it is most often not the best way to make use of the information recorded by the sensor.
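For the curious, the usual USM formula is sharpened = original + amount × (original − blurred); a toy one-dimensional edge (my own illustrative sketch, with made-up numbers) shows the edge steepness coming back, along with the overshoot (halo) that USM introduces:

```python
import numpy as np

def gaussian_blur(s, sigma):
    r = int(4 * sigma)
    k = np.exp(-np.arange(-r, r + 1) ** 2 / (2 * sigma ** 2))
    return np.convolve(s, k / k.sum(), mode="same")

def unsharp_mask(s, sigma=2.0, amount=1.5):
    """USM: add back 'amount' times the detail the blur removed."""
    return s + amount * (s - gaussian_blur(s, sigma))

# An ideal edge, softened by a weak blur standing in for an AA filter:
edge = np.where(np.arange(200) < 100, 0.2, 0.8)
soft = gaussian_blur(edge, 2.0)
sharp = unsharp_mask(soft)

def max_slope(s):
    return np.abs(np.diff(s)).max()   # steepest local step = edge acutance

# USM steepens the softened edge, but also overshoots past the original
# 0.8 level -- the familiar halo:
print(max_slope(soft), max_slope(sharp), sharp.max())
```

The slope comes back only partially, and the price is the overshoot: true pre-sampling detail that the blur destroyed cannot be recreated, only the edge contrast exaggerated.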

Quote
Why would the camera companies do this if there were not advantages that potentially override the disadvantages? From Nikon’s website ... Reducing false color and moiré is the main job of the optical low-pass filter located in front of the image sensor. However, this benefit is generally gained with a small sacrifice of sharpness.
And on the other hand, "why do the vast majority of cameras have an OLPF, which increases manufacturing costs, if there were not advantages that potentially override the disadvantages?"


But beyond the obvious cost advantage, in most cases, of omitting the OLPF, I will float several other possibilities:

1. In some cases, aliasing could be the lesser of two evils. Or, for some photographers, the post-processing effort to erase moiré could be less than that needed for careful sharpening, if a small enough fraction of their images need moiré removal.

2. Once enough customers ask for it, and are willing to pay a sufficient premium for it, the free market will provide it. And that demand might be driven by factors other than pure reason and evidence: see the book "Thinking, Fast and Slow" on how far we fall short of perfectly rational decision making.

3. It could be the judgment of MF marketing people that resolution, and the sharpness seen in 100% pixel-peeping, is a significant factor in many customers' choice between DMF and 35mm-format DSLRs. (As evidence, look at all the people judging the D800 inferior to DMF on the basis of viewing a few crops at 100% on-screen, without even considering appropriate sharpening as part of a "giving it your best shot" comparison, and without seeing comparisons at equal image size and at the higher PPI that is likely to be used when displaying high-MP images.) So some very sharp example images, avoiding ones with visible aliasing artifacts, can help persuade some people to pay the premium for DMF over 35mm, and Nikon might likewise think it can push some decisions back the other way with the D800E.

Title: Re: Naked sensor
Post by: Ray on February 17, 2012, 11:13:15 am
I think Reid is making a fundamental error in his evaluation: he is evaluating images at 100% on a monitor. That is not a real-world viewing condition, and as the pixel count goes up, it moves further and further away from one. The variability in the photographic process is going to be far more significant than an AA filter - having an AA filter does not result in a soft image, just some loss of frequencies near the Nyquist limit. Reid's own argument suggests he really does not know; he is simply arguing that because he thinks he sees detail more clearly at 100% without an AA filter, that detail is significant. I have never read that this is true, nor do I have experience that it is true - my experience fits the theories I have studied. Reid himself does not really offer any support for his claim.

I would agree that the increase in detail and microcontrast that is probably visible in a D800E shot compared with a D800 shot of the same scene using the same lens, after both images have been appropriately sharpened, is probably trivial and probably not noticeable when viewing large prints from a recommended viewing distance.

It would be revealing to do some comparisons involving stepping back slightly when using the D800E, so that after cropping the D800E file to the same FoV as the D800 shot one could compare say a 33mp D800E shot with the full 36mp shot from the D800, then a 30mp shot, then 27mp etc. It might be better to use 2-dimensional subjects to avoid any confusion resulting from different perspectives.  ;D

With such a procedure one could get a clearer idea of just how much that extra resolution, resulting from the lack of an AA filter, might be worth in terms of pixel count. Is it equivalent to a 5% increase in pixel count or a 10% or 20% etc?

I never get excited by small increases in pixel count when I'm upgrading a camera. My first upgrade from my first DSLR (the 6mp Canon D60) was the 8mp 20D, which represented a 33% increase in pixel numbers. My main reason for the upgrade was the significantly improved noise characteristics of the 20D. I considered the extra pixels as icing on the cake, but not significant.

I recall doing some resolution comparisons with those two cameras some years ago, and came to the conclusion that any increase in pixels numbers of less than 50% is probably not worth bothering with, and therefore I would never upgrade a camera based solely on an increase in pixel numbers of less than 50%, and perhaps not even 50%.
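For what it's worth, the square-law bookkeeping behind those percentages is easy to sketch in Python (the figures are illustrative, not measurements from any camera):

```python
# Pixel count scales with the square of linear resolution, so a given
# percentage increase in pixels buys a much smaller percentage gain in
# linear (edge-to-edge) detail.
def linear_gain(pixel_ratio):
    """Linear resolution ratio corresponding to a pixel-count ratio."""
    return pixel_ratio ** 0.5

for mp_increase in (0.05, 0.10, 0.20, 0.33, 0.50):
    g = linear_gain(1 + mp_increase) - 1
    print(f"+{mp_increase:.0%} pixels -> +{g:.1%} linear resolution")
```

On this arithmetic, even a 50% increase in pixel count is only about a 22% gain in linear resolution, which is one way to justify the "50% or nothing" rule of thumb.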

However, as regards comparing images at 100% on screen, I think that's quite legitimate. The issue of whether or not differences seen at 100% on screen may or may not be visible on certain size prints viewed from a certain distance, is a separate concern.

The first step is to establish whether or not there are any differences, and to do that one usually has to pixel peep.

Having established that there are noticeable differences at 100% on the monitor, one should then address the real-world, practical circumstances whereby such differences could be apparent.

The first example that springs to mind is the habit of viewers, not only photographers, to inspect a large print at a close distance, when possible, and for no other reason than to appreciate and take pleasure in the observation of any fine detail and texture that is not visible from the 'appropriate' or 'correct' viewing distance.

I believe this is a matter of normal inquisitiveness. When we view the real world, we expect to see greater detail the closer we look at any object. If our eyesight is not up to the task, we know that we will be able to see more and more detail as the power of the magnifying glass increases.

The photograph has the reputation of 'capturing' reality, albeit in a 2-dimensional format most of the time. However, if one approaches a large print to view the fine detail and instead discovers blurriness, it's a disappointment. Have you never experienced that before, theguywitha645d?  ;D

The second example that springs to mind is the opportunity for cropping. When one inspects a small portion of the total image at 100% on the monitor and finds that it is significantly sharper or more detailed than another 100% view of the same scene taken with a different camera, one knows that any print of a crop that is the same size as the image on the screen, will also reveal such differences, when viewed from the same distance as one views one's monitor.

That's worth something, surely.

Title: Re: Naked sensor
Post by: welder on February 17, 2012, 11:29:02 am
I believe this is a matter of normal inquisitiveness. When we view the real world, we expect to see greater detail the closer we look at any object. If our eyesight is not up to the task, we know that we will be able to see more and more detail as the power of the magnifying glass increases.

The photograph has the reputation of 'capturing' reality, albeit in a 2-dimensional format most of the time. However, if one approaches a large print to view the fine detail and instead discovers blurriness, it's a disappointment.
For me personally, it is even more disappointing to examine a large print up close and discover jagged edges from aliasing artifacts.
Title: Re: Naked sensor
Post by: theguywitha645d on February 17, 2012, 12:00:07 pm
The photograph has the reputation of 'capturing' reality, albeit in a 2-dimensional format most of the time. However, if one approaches a large print to view the fine detail and instead discovers blurriness, it's a disappointment. Have you never experienced that before, theguywitha645d?  ;D

Yes, but that is usually a result of bad technique, not specifications of the media. (And I have never thought to myself that it comes down to an AA filter.)

But I have never known anyone to expect an image not to change with viewing distance. It is a pleasant surprise when the detail is robust, but neither is it unexpected that an image does not have endless detail. And here again I would argue that the AA filter is not the determining factor.

I certainly pixel peep--it is fun and it provides some useful information. The question is what kind of information and the significance of that. I have been told that I should not stop down past f/11 with my 645D because the effects of diffraction can be seen. So one of the first things I did was to shoot a test at f/22 and make a 3 foot print. The idea that diffraction is really a function of format size rather than pixel pitch still holds true (someone even suggested to me that blurring by diffraction at small apertures would simply offset the additional DoF, which of course is not true).

Now, every photographer is going to define acceptable image quality in their own way. Reid certainly has. My point is that an extreme view that every bit of resolution that you can wring out of your system is important is not really a very useful position nor really taking into account all the factors in image making/formation and the perception of that image. I am pretty sure that you know how to round and work with significant figures in math, I think photographers simply need to understand the equivalent process in imaging. OCD is a psychological condition, not a photographic technique.
Title: Re: Naked sensor
Post by: Rob C on February 17, 2012, 01:25:50 pm
You can imagine my disappointment when I actually read the first post! Such an inviting title, I thought: news of a promising bit of new technology - but I guess I should have known about books and their covers; I'm wondering if there are grounds for misrepresentation...

; -)

Rob C
Title: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: BJL on February 17, 2012, 01:43:51 pm
Ray,

    Two comments:

1) Any discussion of the pros and cons of AA filtering that never mentions the raison d'être of the OLPF, namely the demonstrated image degradation due to aliasing (including but not limited to moiré), is rather pointless. It is like so many of the endless gear-head debates where some people declare that one single measure of virtue is of paramount importance and should always be maximized, ignoring sacrifices made in order to increase that one virtue. A famous example: arguing for the superiority of one sensor in a given format size over another solely on the basis of higher resolution despite worse noise did not work out well for the Kodak 14N vs the Canon 1Ds.

Edit: another example: there is a very easy way to get improved resolution at equal or better ISO speed, and even to do it with no risk of moiré: switch from color to monochrome. The moral, as with the OLPF, is that getting the color right can be important too! Maybe the market for unfiltered sensors will be comparable to that for the monochrome sensors that are repeatedly called for.


2) The filmic equivalent of 100% viewing is examining the negative/transparency under a microscope, enlarged enough to see the individual grains or dye clouds. Both are extreme magnification, beyond what viewers will see at any sane print size/viewing distance combination, and though both might possibly have some technical value in comparisons where all other relevant factors are roughly equal, say when comparing two lenses on the same camera, they are mostly rather remote from any worthwhile judgement of image quality as it will be perceived in end usage.
Title: Re: Naked sensor
Post by: Ray on February 18, 2012, 12:50:06 am
Any discussion of the pros and cons of AA filtering that never mentions the raison d'être of the OLPF, namely the demonstrated image degradation due to aliasing (including but not limited to moiré), is rather pointless.

BJL,
Any discussion of the pros and cons of AA filtering would have to mention the cons of image degradation, otherwise it wouldn't be a discussion of pros and cons. (Although, in the case of the D800E the cons could relate to non-image factors such as price.)  ;D

However, my post wasn't specifically about the pros and cons of an AA filter but the significance of small improvements in some aspect of technical performance, such as resolution, which may be given undue attention by people who only judge image quality at 100% size on their monitor, a size which is usually representative of a much, much larger print than such people may ever produce from their images.

For example, if one compares 100% crops of D800 and D800E images on an average 24" monitor with HD resolution of 1920x1080 pixels, and one sees a minor degree of extra crispness in the D800E image, one should be aware that such differences would only be apparent (hopefully) on a print from the same crop that one sees on the monitor, and at least the same size as the crop one sees on the monitor. If one were to make a print of the entire uncropped image it would be approximately 7ft wide and 4ft 7" high, and to see the same differences one noticed on the monitor at 100% one would need to view the 7ft wide print from the same distance one views a monitor.

(I think my maths is correct here. I'm working on the basis that the D800 image after conversion is about 109MB in 8 bit mode, that a 24" monitor is about 20-21" wide, and that the resolution of such a monitor, if it's HD, is about 90 ppi.)
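For anyone who wants to check that arithmetic, here is a rough Python sketch; the D800's 7360 x 4912 pixel dimensions and a 24-inch 1920x1080 monitor are the assumed inputs:

```python
import math

# Assumed figures: a D800 frame of 7360 x 4912 pixels viewed at 100%
# on a 24-inch 1920x1080 monitor.
img_w, img_h = 7360, 4912
mon_diag_in, mon_w_px, mon_h_px = 24, 1920, 1080

# Physical width of the monitor, and hence its pixel density.
mon_w_in = mon_diag_in * mon_w_px / math.hypot(mon_w_px, mon_h_px)
ppi = mon_w_px / mon_w_in

# Size the full image would be if every pixel were shown at 100%.
print_w_ft = img_w / ppi / 12
print_h_ft = img_h / ppi / 12
print(f"{ppi:.0f} ppi monitor -> {print_w_ft:.1f} ft x {print_h_ft:.1f} ft")
```

This comes out at roughly 92 ppi and a virtual print about 6.7 ft by 4.5 ft, in line with the figures quoted above.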

So far, the differences I've seen in the few comparison crops that are available, in respect of the D800 and D800E (refer Erik's example in reply #15), are not a cause for excitement, especially when one considers the disadvantages of more frequent and more severe aliasing artifacts.

I can understand a manufacturer of expensive MFDBs omitting the AA filters to keep costs down. The customer can then be positive about the slight advantage of a lack of AA filter, as MFDB owners generally are, and downplay the disadvantages of moire. But it doesn't make the same sense for a manufacturer to charge extra for the privilege of not having an AA filter when the advantages are so slight and the disadvantages sometimes significant.

Title: Re: Naked sensor
Post by: ErikKaffehr on February 18, 2012, 02:25:37 am
Hi,

I actually think it is OK to pixel peep at 1:1. Those are the pixels that actually go into the image.

On the other hand, it is seldom intended that we look at an image at actual pixels, unless we have a 60" wide screen at 100PPI. That would be nice, by the way.

So we discuss prints. When we print, the image, consisting of all the pixels we see at 1:1, is converted by a smart rescaling algorithm to 360 PPI, 720 PPI, or whatever happens to be the printer's native resolution. In the next step output sharpening will be applied. The printer driver will translate the RGB pixels to a CMYK dithered pattern that will be sprayed on the ink receiving layer of the paper. Finally the ink droplets may diffuse and dry. So the actual-pixels image is going through a lot of processing before it ends up on paper.

Another factor is that human vision is very sensitive to contrast; essentially, a contrasty image is perceived as sharper. That is a very good reason to include a color checker in all tests. If we ensure that grey fields match between compared samples, the probability that tonal differences would affect our comparison is greatly reduced.

Regarding the D800/D800E we will see. I guess that it takes careful work to take those systems to the limit, although we have been at the same limit with 16MP APS-C for a relatively long time.

Best regards
Erik

I think Reid is making a fundamental error in his evaluation. He is evaluating images at 100% on a monitor view. That is not a real-world condition. And as the pixel count goes up, that condition moves further and further away from a real viewing condition. The variability in the photographic process is going to be far more significant than an AA filter--having an AA filter does not result in a soft image, just some loss of frequencies near the Nyquist limit. Reid's own argument suggests he really does not know; he is simply making an argument that because he thinks he sees detail at 100% more clearly without an AA filter, that that is significant detail. I have never read that is true, nor do I have experience that that is true--my experience fits the theories I have studied. Reid himself does not really offer any support for his claim.
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: ErikKaffehr on February 18, 2012, 02:53:44 am
Hi,

Although color moiré is the most obvious artifact, there also is monochrome moiré and that cannot be filtered out.

The issue I see is that an unfiltered sensor creates fake detail. All detail right of the red line in the enclosed image is fake. It may show up and enhance the impression of sharpness, like fake goosebumps on skin, or fake details in feathers.

The enclosed image was made with an OLP-filtered camera that has a weak filter. The experiment was set up to investigate/demonstrate aliasing. The reason I made it was because I have seen a lot of issues in my images that looked like aliasing to me.


Best regards
Erik




Edit: another example: there is a very easy way to get improved resolution at equal or better ISO speed, and even to do it with no risk of moiré: switch from color to monochrome. The moral, as with the OLPF, is that getting the color right can be important too! Maybe the market for unfiltered sensors will be comparable to that for the monochrome sensors that are repeatedly called for.



Title: Re: Naked sensor
Post by: Ray on February 18, 2012, 03:10:36 am

So we discuss prints. When we print, the image, consisting of all the pixels we see at 1:1, is converted by a smart rescaling algorithm to 360 PPI, 720 PPI, or whatever happens to be the printer's native resolution. In the next step output sharpening will be applied. The printer driver will translate the RGB pixels to a CMYK dithered pattern that will be sprayed on the ink receiving layer of the paper. Finally the ink droplets may diffuse and dry. So the actual-pixels image is going through a lot of processing before it ends up on paper.


Erik,
I'm assuming that what one sees on the monitor can be faithfully rendered on a print of the same size as the crop on the monitor, with the right technique. At least that's what I would consider the goal of a good printing technique. One doesn't expect to see colors and detail on a print which are invisible at 100% on the monitor. Likewise, it would be frustrating if the print could not deliver the detail one sees on the monitor, when the size of the print is large enough to accommodate the detail.

Of course, there are always distinguishing differences between a reflective medium and a transmissive medium, but I'm sure you know what I mean. If one can discern the tiny eye of an insect on a leaf in the middle of a large composition, at 100% on the monitor, one expects to be able to see that eye on a print of a crop of that portion of the image.

Whether or not the lack or presence of an AA filter would make the difference between being able to discern that eye or not, is the sort of thing that needs to be determined. ;D
Title: Re: Naked sensor
Post by: ErikKaffehr on February 18, 2012, 04:03:06 am
Hi,

Actually I don't think it is possible, because they are very different devices. A monitor used to have around 100 PPI, and a printer uses something like 360 PPI res implemented as a 1440 or 2880 DPI dither pattern. The screen is continuous tone while inkjet has only a few tones. So there will be a lot of transforms on the way, and it will take quite a lot of dots in the print to represent a single pixel.

Best regards
Erik




Erik,
I'm assuming that what one sees on the monitor can be faithfully rendered on a print of the same size as the crop on the monitor, with the right technique. At least that's what I would consider the goal of a good printing technique. One doesn't expect to see colors and detail on a print which are invisible at 100% on the monitor. Likewise, it would be frustrating if the print could not deliver the detail one sees on the monitor, when the size of the print is large enough to accommodate the detail.

Of course, there are always distinguishing differences between a reflective medium and a transmissive medium, but I'm sure you know what I mean. If one can discern the tiny eye of an insect on a leaf in the middle of a large composition, at 100% on the monitor, one expects to be able to see that eye on a print of a crop of that portion of the image.

Whether or not the lack or presence of an AA filter would make the difference between being able to discern that eye or not, is the sort of thing that needs to be determined. ;D
Title: Re: Naked sensor
Post by: Ray on February 18, 2012, 08:35:08 am
Hi,

Actually I don't think it is possible, because they are very different devices. A monitor used to have around 100 PPI, and a printer uses something like 360 PPI res implemented as a 1440 or 2880 DPI dither pattern. The screen is continuous tone while inkjet has only a few tones. So there will be a lot of transforms on the way, and it will take quite a lot of dots in the print to represent a single pixel.

Best regards
Erik


Sorry! I can't follow this, Erik. Are you saying that in your experience you actually lose detail that you see on the monitor, when you make a print? This is not my experience.

In my example of a 109MB D800 image viewed at the pixel level on a 24" HD monitor, the full image that's viewed would be 7ft wide (or 2.1 metres) at the monitor's resolution of 90 or 91 ppi.

If one were to print such an image at that size, of course one would first upres it to at least 240 ppi, if not 300 ppi, and apply an appropriate amount of sharpening for that output size. Generally, large prints need to be more contrasty, so one would probably apply some additional 'local contrast' enhancement.

The resulting file size would be about 770MB at 240ppi and over 1GB at 300ppi. What's the problem?
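A quick sketch of that file-size arithmetic in Python, assuming the 7360 x 4912 D800 frame, a roughly 92 ppi monitor, and 3 bytes per pixel for 8-bit RGB:

```python
# Uncompressed 8-bit RGB file size after upressing the monitor-sized
# "print" to common inkjet output resolutions.
img_w, img_h, screen_ppi = 7360, 4912, 92
width_in, height_in = img_w / screen_ppi, img_h / screen_ppi

sizes_bytes = {}
for target_ppi in (240, 300):
    px = (width_in * target_ppi) * (height_in * target_ppi)
    sizes_bytes[target_ppi] = px * 3  # 3 bytes per pixel for 8-bit RGB
    print(f"{target_ppi} ppi: {sizes_bytes[target_ppi] / 1e6:.0f} MB uncompressed")
```

That lands near 740 MB at 240 ppi and about 1.15 GB at 300 ppi, consistent with the figures above.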

Cheers!
Title: Re: Naked sensor
Post by: ErikKaffehr on February 18, 2012, 08:51:34 am
Hi,

No, I didn't say anything like that. What I said is that it is not possible to exactly reproduce a crop on screen on print media.

Best regards
Erik


Sorry! I can't follow this, Erik. Are you saying that in your experience you actually lose detail that you see on the monitor, when you make a print? This is not my experience.

In my example of a 109MB D800 image viewed at the pixel level on a 24" HD monitor, the full image that's viewed would be 7ft wide (or 2.1 metres) at the monitor's resolution of 90 or 91 ppi.

If one were to print such an image at that size, of course one would first upres it to at least 240 ppi, if not 300 ppi, and apply an appropriate amount of sharpening for that output size. Generally, large prints need to be more contrasty, so one would probably apply some additional 'local contrast' enhancement.

The resulting file size would be about 770MB at 240ppi and over 1GB at 300ppi. What's the problem?

Cheers!
Title: The extremely limited value of viewing at pixel level
Post by: BJL on February 18, 2012, 10:44:51 am
Ray and Erik,

I am mostly on Ray's side on the low value of 100% pixel on-screen viewing. I never felt the need to view my negatives enlarged enough to see the individual light detecting elements (silver grains, the pixels of traditional monochrome film), and see little or no need to do so in order to evaluate the image quality that can be got from a digital file. It is, as someone else said, the difference between "measurable" and "significant". This is well-known to working scientists and probably even more so to successful engineers and product designers, but is often missed by us dilettantes, who are these days faced with a flood of easily acquired data but sometimes have a deficit of ability and judgement to draw practical conclusions from the data.

On the other hand, I agree that it is unclear how to use on-screen viewing as a completely reliable surrogate for print image quality: color gamuts and dynamic range differences also complicate things.
Title: Re: Naked sensor
Post by: hjulenissen on February 18, 2012, 06:26:05 pm
That isn't cynicism, it's basic economics.
What is the difference? :-)

-h
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: 32BT on February 18, 2012, 06:49:17 pm
Hi,

Although color moiré is the most obvious artifact, there also is monochrome moiré and that cannot be filtered out.

The issue I see is that an unfiltered sensor creates fake detail. All detail right of the red line in the enclosed image is fake. It may show up and enhance the impression of sharpness, like fake goosebumps on skin, or fake details in feathers.

The enclosed image was made with an OLP-filtered camera that has a weak filter. The experiment was set up to investigate/demonstrate aliasing. The reason I made it was because I have seen a lot of issues in my images that looked like aliasing to me.


Best regards
Erik


This has nothing to do with unfiltered sensors. This looks more like a really really bad RAW converter, which might explain why you have seen a lot of issues. Which RAW converter did you use?
Title: Monochrome sensor or film, not monochrome conversion
Post by: BJL on February 18, 2012, 07:01:03 pm
Sorry Erik,
I didn't make myself clear: I meant switching to a monochrome sensor with the same cell size, or from color to monochrome film. Both can increase resolution, but at an often unacceptable price. My point was only a somewhat extreme example of why we should not fetishize resolution alone when evaluating image quality.
Title: Re: Naked sensor
Post by: nma on February 18, 2012, 09:42:54 pm
This question, to AA or not, has been raging for a very long time. Photographers who are experts in evaluating images often argue that the camera without the AA filter makes superior images. I stipulate that these experts really know what they are looking at, with respect to the popular and fine-art appeal of images. This AA argument is reminiscent of the “golden ears” of the high-fidelity audio era, who claimed that amplifiers with tubes sounded better than transistor amps and that CDs did not sound as good as analog recordings on vinyl.

Some “experts” say things like the per pixel sharpness of the AA-less sensor is superior. There are many statements like that, offered with religious certainty but no scientific basis. How, for example, can a term like per pixel sharpness have any meaning when it takes more than one digital sample to represent a line-width? Put another way, the line intensity has a characteristic width in any imaging system and it takes at least two digital samples to represent that variation. If you had higher digital sampling, the image fidelity should improve, but these experts would see that as lower per pixel sharpness.

The question is: are experts like Reid, Reichmann and others religiously sure, or do they really understand the mathematics and engineering of digital imaging? Are we sure that the differences they see are due solely to the lack of an AA filter? Are we sure that the images without AA filters exhibit higher spatial resolution? We know there are some distortions in the images with AA-less sensors. Of course the AA-filtered images are not perfect either. They have different distortions, due to the imperfect roll-off of the AA filter and the properties of real lenses.

Understanding the formation and sampling of the Bayer-array sensor is very complicated. The reason it is so complicated is that the blue and red elements have different sampling (lower Nyquist frequency) than the green elements. If the RGB elements were considered independently and the resolution was set by the blue and red, it would be easier to analyze. But in order to wring the last drop of image quality out of the Bayer array, the Nyquist sampling theory is replaced with more aggressive algorithms that exploit the spatial correlations in the images.

But as a general rule, I would say that when you push a system like that and remove the AA filter, you are asking for trouble. There will be situations where removing the AA filter will yield higher apparent sharpness. But I argue that this appearance is due to aliasing. Those “experts” who think that aliasing only occurs when there are repetitive structures in the image are wrong. On the contrary, whenever there is high frequency information above Nyquist from the lens which is not removed BEFORE image formation, it will be incorrectly rendered at lower frequency.

Maybe we will like the appearance of a little aliasing, just like many liked the distortions of tube amplifiers and vinyl records. But before we drive the field down what may very well be the wrong road, the science and engineering should be clarified. We should not be ruled by those with the biggest megaphone who are arguing on the basis of experience and intuition.
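The frequency folding described here can be reduced to one line of arithmetic. A minimal Python sketch, using an illustrative sampling rate rather than any real sensor's pitch:

```python
# Any tone above the Nyquist limit is recorded as a lower, wrong frequency.
sample_rate = 200.0              # samples per mm (illustrative only)
nyquist = sample_rate / 2        # 100 lp/mm

def recorded_frequency(f, fs=sample_rate):
    """Apparent frequency of a tone of frequency f after sampling at fs."""
    f = f % fs
    return fs - f if f > fs / 2 else f

for f in (60, 90, 110, 150, 190):
    note = "aliased!" if f > nyquist else "ok"
    print(f"{f:>3} lp/mm is recorded as {recorded_frequency(f):.0f} lp/mm ({note})")
```

Below Nyquist the tone comes through unchanged; above it, a 110 lp/mm tone comes back as a plausible-looking 90 lp/mm, which is exactly the kind of false detail at issue.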
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: ErikKaffehr on February 18, 2012, 11:42:21 pm
Hi,

I used Lightroom 3.6. I forgot to mention that the screen dump was made at 2:1 (twice actual pixels). The attached samples are shot at f/5.6 and f/16.

The test pattern is Norman Koren's MTF test target: http://www.normankoren.com/Tutorials/MTF5.html#using

Best regards
Erik



This has nothing to do with unfiltered sensors. This looks more like a really really bad RAW converter, which might explain why you have seen a lot of issues. Which RAW converter did you use?

Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: 32BT on February 19, 2012, 12:13:00 am
Hi,

I used Lightroom 3.6. I forgot to mention that the screen dump was made at 2:1 (twice actual pixels). The attached samples are shot at f/5.6 and f/16.

The test pattern is Norman Koren's MTF test target: http://www.normankoren.com/Tutorials/MTF5.html#using

Best regards
Erik


From the original ACR plugin to the current LR I don't recall ever seeing the errors you show in these samples. The colorized edge errors are characteristic of a RAW converter with inadequate edge detection, such as Canon's DPP, Iridient RAW Developer, or Silkypix, to name just a few I tested recently.

Would you have the original RAW file available for us?
 
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: ErikKaffehr on February 19, 2012, 12:53:31 am
Hi,

http://echophoto.dnsalias.net/ekr/images/Articles/Aliasing/_DSC0963.ARW

Best regards
Erik


From the original ACR plugin to the current LR I don't recall ever seeing the errors you show in these samples. The colorized edge errors are characteristic of a RAW converter with inadequate edge detection, such as Canon's DPP, Iridient RAW Developer, or Silkypix, to name just a few I tested recently.

Would you have the original RAW file available for us?
 

Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: 32BT on February 19, 2012, 02:26:08 am
Hi,

http://echophoto.dnsalias.net/ekr/images/Articles/Aliasing/_DSC0963.ARW

Best regards
Erik

Okay, thanks.
Attached _DSC0963a
The actual RAW data as a grayscale
contrast and brightness changed for viewing ease.

As you can see in that image, there is nothing to suggest that the sensor is to blame for fabricated detail.
Having said that, I have to admit that it is indeed LR fabricating the errors, which may be due to the relatively dark exposure. Also the data seems slightly out-of-focus for that part of the image, which one might think helps mitigate aliasing, but in this case, sharper is probably better...

Attached _DSC0963b
For reference, I have also attached a more adequate RAW conversion through a proprietary algorithm, which shows that it is certainly not necessary to fabricate as much detail while still preserving the soft aliasing. The algorithm works equally well on natural images as on these types of test charts.

In that respect it is interesting to see how DXO performs, because that converter works well on artificial test charts, but hopelessly messes up on natural images.
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: ErikKaffehr on February 19, 2012, 02:42:24 am
Hi,

Your conversion is better than mine. It also renders the fine lettering under the image readably.

On the other hand, your conversion also contains detail above, say, 120 lp/mm: a line pattern with decreasing frequency. In my view a clear example of fake detail. The test target has increasing frequency up to 200 lp/mm.

Best regards
Erik


Okay, thanks.
Attached _DSC0963a
The actual RAW data as a grayscale
contrast and brightness changed for viewing ease.

As you can see in that image, there is nothing to suggest that the sensor is to blame for fabricated detail.
Having said that, I have to admit that it is indeed LR fabricating the errors, which may be due to the relatively dark exposure. Also the data seems slightly out-of-focus for that part of the image, which one might think helps mitigate aliasing, but in this case, sharper is probably better...

Attached _DSC0963b
For reference, I have also attached a more adequate RAW conversion through a proprietary algorithm, which shows that it is certainly not necessary to fabricate as much detail while still preserving the soft aliasing. The algorithm works equally well on natural images as on these types of test charts.

In that respect it is interesting to see how DXO performs, because that converter works well on artificial test charts, but hopelessly messes up on natural images.
Title: Re: judging IQ solely by resolution at extreme magnification is usually misleading
Post by: 32BT on February 19, 2012, 04:02:06 am
Hi,

Your conversion is better than mine. It also renders the fine lettering under the image readably.

On the other hand, your conversion also contains detail above, say, 120 lp/mm: a line pattern with decreasing frequency. In my view a clear example of fake detail. The test target has increasing frequency up to 200 lp/mm.

Best regards
Erik



I suppose you could call it "fake detail". The data clearly shows moiré: an undulating pattern of increasing frequency that changes to an undulating pattern of lower frequency once the Nyquist frequency is reached.

But I always thought of images as having a kind of "flow" for lack of a better term. In a picture of a tree for example the flow of edges goes from the stem, along the branches, to the twigs and the leaves. Or the feathers of a bird, perfectly combed and groomed. There is a certain directionality, along which the eye expects to find edges, and any deviations may even be detectable on subconscious levels. So as long as the "faked" detail at least has a reasonably correct directionality, it may improve the overall viewing experience, while at the same time improving apparent detail.

This is opposed to software that makes significant errors in relatively large detail like Canon's DPP. This creates a noisy, visible or at least detectable grid in the flow, and makes post-processing sharpening a nightmare.

Perhaps there is a more reasonable compromise for anti-aliasing filters: something that lets the image reconstruction infer directionality while not creating excessive contrast edges. As a RAW-converter developer, I don't want the anti-aliasing filter to try to cope with color aliasing:

1) You can get away with a lot of softening in color information, which allows anti-aliasing in post-processing.
2) Detail errors primarily occur in the green channel, and these then propagate to the color channels.
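The first point is easy to demonstrate in a sketch. The snippet below (my illustration, not anything from the posts above; the `blur_chroma` helper and the BT.601-style weights are my own choices) blurs only the chroma planes of an image while leaving luma untouched, which is exactly the kind of aggressive color smoothing that usually goes unnoticed:

```python
import numpy as np

def blur_chroma(rgb, k=5):
    """Box-blur only the chroma planes of an RGB image, leaving luma intact.

    A sketch of the claim that heavy chroma smoothing is far less
    objectionable than the same blur applied to luminance.
    """
    # Rough luma/chroma split using BT.601-style luma weights.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = b - y   # blue-difference chroma
    cr = r - y   # red-difference chroma

    kern = np.ones(k) / k
    def box(plane):
        # Separable box blur: convolve each column, then each row.
        plane = np.apply_along_axis(lambda v: np.convolve(v, kern, 'same'), 0, plane)
        return np.apply_along_axis(lambda v: np.convolve(v, kern, 'same'), 1, plane)

    cb, cr = box(cb), box(cr)

    # Recombine blurred chroma with the original (sharp) luma.
    r2 = cr + y
    b2 = cb + y
    g2 = (y - 0.299 * r2 - 0.114 * b2) / 0.587
    return np.stack([r2, g2, b2], axis=-1)
```

A flat gray image passes through unchanged, since its chroma planes are zero everywhere; only colored detail near edges gets softened.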

Instead I want it to cope with the lack of continuous samples in green, as follows:

For every green pixel, you would want to sample the slightly larger area that includes part of the direct neighbors, typically a 45°-rotated square that exactly encompasses the original green pixel. But no larger! My gut feeling tells me that would be the optimal anti-aliasing blur required, though the actual physics involved probably won't allow such sampling. I believe this is one of the reasons the original Fuji Super CCD worked so well, as opposed to the new layout they came up with, which is not going to solve anything and will likely introduce unwanted color artifacts. (I suspect they will soon find all kinds of color smearing on thin dark lines against light backgrounds, as in trees against a sky, or colored text on a contrasting background…)
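For what it's worth, the rotated-square aperture described above has a simple discrete approximation. Assuming (my reading of the proposal, not something stated in the post) a square of diagonal two pixel pitches, it covers the centre green pixel fully plus a quarter of each of its four orthogonal neighbours; normalising those coverage areas gives a small diamond-shaped kernel:

```python
import numpy as np

# Hypothetical discretisation of the proposed 45°-rotated square aperture.
# The rotated square (diagonal = 2 pixel pitches, total area = 2) covers the
# centre pixel with area 1 and each orthogonal neighbour with area 0.25.
# Dividing by the total area 2 gives the weights below.
diamond = np.array([
    [0.0,   0.125, 0.0  ],
    [0.125, 0.500, 0.125],
    [0.0,   0.125, 0.0  ],
])
```

Convolving the green plane with this kernel would then be the software equivalent of the optical sampling being proposed.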






Title: Re: Naked sensor
Post by: Dale Villeponteaux on February 19, 2012, 03:22:09 pm
As ignorant as I am, I should be barred from reading technical threads.  Wasn't it one of Walt Kelly's "Pogo" characters who said, "Sometimes I don't understand you.  And I'm glad!"  My head is swimming; I'm going to lie down.

At least I'm clearer about the extent of my ignorance.  Thanks.
Dale V.
Title: Re: Naked sensor
Post by: 32BT on February 19, 2012, 04:45:07 pm
As usual I was too lazy to produce a picture, and started to write my obligatory 1000 words. Obviously, that requires even more energy so I stopped short of 20 or so. But then here is a picture.

If the green squares represent the green pixels to be captured, then I would like to actually capture the equivalent of the dotted squares.

Light capture in a camera at that scale is probably some complex, circularly symmetric Gaussian function, and I have no idea at all what is involved. It may well be, for example, that an anti-aliasing filter simply can't be scaled properly, so you may have to choose between "no anti-aliasing" and "a little too much". But purely on intuition I would consider this the optimal amount of blurring for the specific task of debayering.
Title: Re: Naked sensor (Flow)
Post by: 32BT on February 19, 2012, 04:55:19 pm
And an example of flow:

Attached: a bird's feathers using Canon's DPP. You can clearly see the grittiness of incorrect directional interpolation.
Title: Re: Naked sensor
Post by: hjulenissen on February 19, 2012, 04:57:26 pm
Light capture in a camera at that scale is probably some complex, circularly symmetric Gaussian function, and I have no idea at all what is involved. It may well be, for example, that an anti-aliasing filter simply can't be scaled properly, so you may have to choose between "no anti-aliasing" and "a little too much". But purely on intuition I would consider this the optimal amount of blurring for the specific task of debayering.
would that be your ideal green-channel blurring, or your ideal blurring for all channels?

Does the figure suggest a rotated 2x2 rectangular filter? I would think that a smooth response would be preferable. Either a Gaussian or some kind of windowed sinc.

-h
Title: Re: Naked sensor (Flow 2)
Post by: 32BT on February 19, 2012, 05:03:24 pm
And the res chart for Canon DPP faithfully shows where that grittiness originates.
Also note the edge and corners of the black square.
Title: Re: Naked sensor
Post by: 32BT on February 19, 2012, 05:13:44 pm
would that be your ideal green-channel blurring, or your ideal blurring for all channels?

Does the figure suggest a rotated 2x2 rectangular filter? I would think that a smooth response would be preferable. Either a Gaussian or some kind of windowed sinc.

-h

My ideal for just the green channel, as I think that color aliasing is far less of a problem and more easily dealt with in the RAW converter, because you can get away with a lot of blurring of color information before it becomes objectionable.

I would prefer a single black dotted square. (size = sqrt(2) x sqrt(2)). The pattern was created to show the fit.

Normally when scanning truly adjacent pixels, you would likely have some overlapping Gaussian function. (I suppose it would be the equivalent of "umfelt" in old Chromagraph scanners.) I don't know what the ideal amount is, but I would like to multiply that ideal spread by sqrt(2).
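The sqrt(2) factor, incidentally, is the same factor by which Gaussian blurs compose: applying two independent Gaussian spreads of width sigma is equivalent to a single spread of width sigma·sqrt(2), because the widths add in quadrature. A quick Monte Carlo sketch (mine, not from the post) makes this concrete:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0
n = 200_000

# Model each blur step as an independent Gaussian displacement of a photon.
# Two successive blurs of width sigma behave like one blur of width
# sigma * sqrt(2): widths add in quadrature, not linearly.
step1 = rng.normal(0.0, sigma, n)
total = step1 + rng.normal(0.0, sigma, n)
print(total.std())  # ≈ sigma * sqrt(2) ≈ 1.414
```

So scaling an existing "ideal" single-step spread by sqrt(2) is equivalent to applying that spread twice over.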
Title: Re: Naked sensor
Post by: hjulenissen on February 20, 2012, 02:45:30 am
My ideal for just the green channel, as I think that color-aliasing is far less of a problem, and more easily dealt with in the RAW converter because you can get away with a lot of blurring of colorinformation before it becomes objectionable.

I would prefer a single black dotted square. (size = sqrt(2) x sqrt(2)). The pattern was created to show the fit.

Normally when scanning truly adjacent pixels, you would likely have some overlapping Gaussian function. (I suppose it would be the equivalent of "umfelt" in old Chromagraph scanners.) I don't know what the ideal amount is, but I would like to multiply that ideal spread by sqrt(2).
Do you know the sensel fill factor with and without micro-lenses? I don't think you can automatically assume that a single sensel captures light across the full area suggested by the pixel pitch.

If you have a square sensor PSF and a very good lens, you would still have "jagged edges" and other artifacts attributable to improper pre-filtering.

-h
Title: Re: Naked sensor
Post by: 32BT on February 20, 2012, 05:48:25 am
Do you know the sensel fill factor with and without micro-lenses? I don't think you can automatically assume that a single sensel captures light across the full area suggested by the pixel pitch.

I don't think so either, and even the microlenses may not fill the entire area of the pixel pitch; and if they do, they still can't provide an optimal Gaussian spread. So you need some kind of additional filter on top of the sensor that spreads the light by some optimal amount.

Now, I am presuming that there is always some kind of filter on top of the sensor, like an infrared filter and/or antiglare layer, so that there is a minimal amount of spread. The questions under consideration are these:

1) how much spread is optimal?

2) is an anti-aliasing filter able to provide that specific spread?


Way back in my advertising prepress days, we used to have specialized scanner operators for B&W newspaper ads. The one thing that set them apart from the other scanner operators was their specific skill in controlling "umfelt". The best scanner operators in general were the ones who knew how to select the correct umfelt for both the input and the output medium, which, by the way, happens to be one of those skills that no stinking 10,000 hours of practice is going to help! Or maybe it would, but you can't expect someone to have that many hours under their belt before becoming a skilled craftsman. So only the most talented scanner operators would do the newspaper imagery.

The best operator incidentally happened to be a woman, and I guess if it was such a hard-to-find intuitive skill, then maybe that explains why it is so hard to decide what the optimal anti-aliasing filter should or should not do in an even more complex context. It may well be that the more technically inclined scientists working on these problems aren't exactly the ones to properly judge the resulting pixels in the end product. But given the discussions and opinions amongst professional photographers, I am not sure they are equipped to give the scientists the correct feedback either.
Title: Re: Naked sensor
Post by: NikoJorj on February 20, 2012, 07:07:35 am
I recall doing some resolution comparisons with those two cameras some years ago, and came to the conclusion that any increase in pixel numbers of less than 50% is probably not worth bothering with, and therefore I would never upgrade a camera based solely on an increase in pixel numbers of less than 50%, and perhaps not even 50%.
Aaaamen!
See also http://www.luminous-landscape.com/essays/sensor-design.shtml in the same vein.
Title: Re: Naked sensor
Post by: hjulenissen on February 20, 2012, 08:56:11 am
1) how much spread is optimal?
Image scaling is really similar to image sampling, except that it all takes place in software, so the choice of reconstruction filter is very flexible. It is generally held that a lanczos2/lanczos3 kernel is close to the ideal trade-off among linear kernels.
(http://www.marumo.ne.jp/auf/png/lanczos.png)

Of course, there may be other linear kernels that are slightly better, and for every source/destination pair there will probably be a specific kernel that is better still. Finally, non-linear operations are a superset of linear operations, so expect to find some (possibly small) gain by venturing there.
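For readers unfamiliar with it, the Lanczos kernel mentioned above is just a sinc windowed by a wider sinc. A minimal sketch (the function name and parameterisation are mine):

```python
import numpy as np

def lanczos(x, a=3):
    """Lanczos-a kernel: sinc(x) * sinc(x/a) for |x| < a, zero outside.

    a=2 gives lanczos2, a=3 gives lanczos3. np.sinc is the normalized
    sinc, sin(pi*x)/(pi*x), so the kernel is 1 at x=0 and (numerically)
    zero at every other integer sample position.
    """
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

# Evaluated at the integer sample points the kernel is an impulse,
# so resampling with it passes through the original samples.
samples = lanczos(np.arange(-3, 4))  # ≈ [0, 0, 0, 1, 0, 0, 0]
```

Between the integers it oscillates with small negative lobes, which is where both its sharpness and its ringing come from.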
Quote
2) is an anti-aliasing filter able to provide that specific spread?
No. To my knowledge, they are limited to linear functions, positive-only coefficients, and a limited number of PSFs. The Nikon PR material suggests that the D800 has an AA-filter that looks like:
x x
x x
In other words, the image is duplicated with a small spatial offset, once vertically and once horizontally.
(http://www.robgalbraith.com/data/1/rec_imgs/5563_d800_olpf_graphic.jpg)
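The effect of one such birefringent layer can be modelled (roughly, and this is my simplification) as convolution with a two-point kernel one pixel pitch apart. At the Nyquist frequency that filter cancels the signal completely, which is exactly the job of the AA filter; the full four-dot pattern is just this split applied once horizontally and once vertically:

```python
import numpy as np

# Toy model of one birefringent layer: each ray is split into two equal
# halves one pixel apart, i.e. convolution with [0.5, 0.5].
# A Nyquist-frequency signal (alternating +1/-1 at the pixel pitch) is
# cancelled entirely; lower frequencies are merely attenuated.
nyquist = np.array([1., -1., 1., -1., 1., -1.])
filtered = np.convolve(nyquist, [0.5, 0.5], mode='valid')
print(filtered)  # → all zeros
```

Lower-frequency content survives with reduced contrast, which is why AA-filtered images look slightly softer rather than simply band-limited.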
Quote
Way back in my advertising prepress days, we used to have specialized scanner operators for B&W newspaper ads. The one thing that set them apart from the other scanner operators was their specific skill in controlling "umfelt". The best scanner operators in general were the ones who knew how to select the correct umfelt for both the input and the output medium, which, by the way, happens to be one of those skills that no stinking 10,000 hours of practice is going to help! Or maybe it would, but you can't expect someone to have that many hours under their belt before becoming a skilled craftsman. So only the most talented scanner operators would do the newspaper imagery.

The best operator incidentally happened to be a woman, and I guess if it was such a hard-to-find intuitive skill, then maybe that explains why it is so hard to decide what the optimal anti-aliasing filter should or should not do in an even more complex context. It may well be that the more technically inclined scientists working on these problems aren't exactly the ones to properly judge the resulting pixels in the end product. But given the discussions and opinions amongst professional photographers, I am not sure they are equipped to give the scientists the correct feedback either.
I think it is very interesting to compare the knowledge of two such related but slightly different domains.

-h
Title: Re: Naked sensor
Post by: ndevlin on February 20, 2012, 09:32:36 pm

Why don't we reason this one backwards: Nikon has spent an enormous amount of R&D money, and a similar amount on production phasing/planning/implementation, to produce a product identical to the mass-market D800 except for the lack of the AA filter.

The D800E commands a paltry $300 premium.

How many people will buy a D800E who would not have bought a D800 if the D800E had never existed or been mentioned?  Reason tells me that's a fairly low number. 

So...Nikon has done something expensive, for a limited market gain, at a modest price premium. 

Now does anyone really think Nikon did this without having seen a compelling visual results case for it?

Unless the accounting department at Olympus is now running the product development group at Nikon, the only logical conclusion is that Nikon saw sufficiently compelling results to produce two variants, trusting that real-world users would see enough difference to buy enough 800Es to make the exercise worthwhile.

The testing sure will be fun :-) 

- N.
Title: Re: Naked sensor
Post by: BernardLanguillier on February 20, 2012, 09:44:37 pm
Unless the accounting department at Olympus is now running product development group at Nikon, the only logical conclusion is that Nikon saw sufficiently compelling results to produce two variants, trusting that real-world users would see enough difference to buy a lot of 800Es to make the exercise worthwhile.

Another daring - complementary - theory might be that they simply listened to their customers?  ;D

Things never happen by chance in Japan. The release by Pentax, Ricoh, Fuji and now Nikon of devices without an AA filter in a very short time frame is probably not a coincidence. Although I have no proof of this, there is a high probability that these companies decided to join forces a few years back to do research on AA-filter-less sensors and the image processing they require.

Past public conversations with Nikon people make me think that some inside the company are very strongly opposed to AA-filter-less devices because they go against Nikon's core philosophy since the beginning of digital: to make digital as transparent as possible for users coming from film, meaning a digital experience devoid of issues that were not present with film. They are really concerned about the support cost of dealing with users complaining about digital artifacts.

So the reason why they decided to deliver both options with the D800 is probably that they could not reach an internal agreement about selecting only one option... and they did have the means to propose the 2 options to their customers.

Cheers,
Bernard
Title: Re: Naked sensor
Post by: BJL on February 20, 2012, 10:31:53 pm
From a crass ROI perspective, all it would take to justify Nikon's decision to offer the D800E is evidence of sufficient demand at a sufficiently high price, without Nikon needing to know or care whether that demand is rationally based. A purist rejection of any resolution loss, regardless of any side effects on IQ, is at least a possible source of such demand.

I will again try to turn this around: Nikon and all other DSLR makers routinely go to the extra expense of using AA filters: what does that say about their judgement of a "compelling visual case" for having an AA filter?
Title: Re: Naked sensor
Post by: ErikKaffehr on February 20, 2012, 11:38:14 pm
Hi,

What surprises me is that they essentially seem to use a dual layer of birefringent material, with the second layer inverting the effect of the first, instead of just removing both layers. I presume they want to keep the thickness and composition of the optical package in front of the sensor unchanged.
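The cancellation trick relies on polarization, which a simple toy model (mine, heavily simplified) makes clear: a birefringent layer displaces only the extraordinary polarization component of a ray. In the regular D800 the second layer splits along the other axis, giving the four-dot PSF; in the D800E it is oriented to undo the first shift, so both components land on the same point again:

```python
# Toy 1-D model of the D800E's birefringent sandwich. Each ray carries an
# ordinary and an extraordinary polarization component; a birefringent
# layer displaces only the extraordinary one.
def layer(ray, shift):
    o, e = ray              # (ordinary, extraordinary) positions
    return (o, e + shift)

ray = (0.0, 0.0)
# Regular AA filter: one layer leaves the components split by one pitch.
d800 = layer(ray, +1.0)
# D800E: the second layer is oriented to reverse the first displacement,
# recombining the components into a single point (no low-pass effect).
d800e = layer(d800, -1.0)
print(d800e)  # → (0.0, 0.0)
```

Keeping both layers in place this way preserves the optical path length and filter-stack composition, which matches the presumption above about not disturbing the rest of the optical package.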

Best regards
Erik


From a crass ROI perspective, all it would take to justify Nikon's decision to offer the D800E is evidence of sufficient demand at a sufficiently high price, without Nikon needing to know or care whether that demand is rationally based. A purist rejection of any resolution loss, regardless of any side effects on IQ, is at least a possible source of such demand.

I will again try to turn this around: Nikon and all other DSLR makers routinely go to the extra expense of using AA filters: what does that say about their judgement of a "compelling visual case" for having an AA filter?
Title: Re: Naked sensor
Post by: BJL on February 21, 2012, 04:12:05 pm
What surprises me is that they essentially seem to use a dual layer of birefringent material, with the second layer inverting the effect of the first, instead of just removing both layers. I presume they want to keep the thickness and composition of the optical package in front of the sensor unchanged.
Yes, that is the only sense I can make of this unique approach to omitting low-pass filtering: it avoids the need for different versions of other components that were designed with the other optical effects of those birefringent layers taken into account.
Title: Re: Naked sensor
Post by: ErikKaffehr on February 22, 2012, 01:04:39 am
Hi,


I guess that it does not take much research effort to rotate a slice of a lithium niobate mono crystal 90 degrees and flip sides ;-)

Doing what Nikon does may be smart, but it doesn't require enormous R&D. Some things do take a lot of R&D, lenses for sure, and I have great respect for Nikon and others developing all the different technologies we have in cameras. I'm especially impressed by the camera ASICs that can process 4-10 images per second. Try to do that on a hex-core PC!

I'm a little bit tired of everything being blamed on R&D; prices are set by supply and demand, and by what customers are willing to pay.

Best regards
Erik

Why don't we reason this one backwards: Nikon has spent an enormous amount of R&D money and a similar amount of production phasing/planning/implementation money, to produce an identical product to the mass-market D800, but for the lack of the AA filter.


The testing sure will be fun :-)  

- N.

Title: Re: Naked sensor
Post by: hjulenissen on February 22, 2012, 03:48:07 am
I'm especially impressed by the camera ASICs that can process 4-10 images per second. Try to do that on a hex-core PC!
I imagine that would be possible, especially if you assume that most high-power systems include a somewhat flexible, powerful GPU.

The reason why PC software appears to be slower may be that the demand is for slightly higher quality/flexibility, or lower development time rather than extreme framerates.

Now, running a hex-core PC with a 150W Nvidia GPU off of a regular DSLR battery while processing 10 raw frames/second would be hard.

-h
Title: Re: Naked sensor
Post by: 32BT on February 22, 2012, 06:06:56 am
Yes, that is the only sense I can make of this unique approach to omitting low-pass filtering: it avoids the need for different versions of other components that were designed with the other optical effects of those birefringent layers taken into account.

Could it be that the birefringent layers also act as an anti-glare measure, i.e. they pass light coming in but block light reflected off the sensor surface?