There has been a lot of discussion regarding AA filtering on sensors, but very little about the fill factor.
My understanding is that a perfect pixel would have a fill factor of 1, meaning it would be gapless. It is also my understanding that reducing the fill factor increases aliasing but also increases apparent sharpness.
Let's assume we have 6 micron pixels and a line that is 3 microns wide, projected onto the boundary between two pixels. With a fill factor of 1 it would be seen by both pixels at 25% intensity. If we reduced the fill factor to 0.25 (a 3 micron sensitive width in a 6 micron pixel), both pixels would miss the line entirely. If we then moved the line 3 microns, one of the pixels would see 100% intensity and the other 0%.
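A quick 1-D numerical sketch of that thought experiment (my own illustration, assuming a 6 micron pitch, a square sensitive region centered in the pixel, and linear sensitive width = pitch × sqrt(fill factor), since fill factor is an area ratio):

```python
def overlap(a0, a1, b0, b1):
    """Length of the overlap between intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def response(line_center, line_width, pixel_left, pitch, fill_factor):
    """Fraction of a pixel's sensitive width covered by a line (1-D model).

    The sensitive region is centered in the pixel; its linear width is
    pitch * sqrt(fill_factor) because fill factor is an area ratio.
    """
    sens_w = pitch * fill_factor ** 0.5
    s0 = pixel_left + (pitch - sens_w) / 2.0
    s1 = s0 + sens_w
    l0 = line_center - line_width / 2.0
    l1 = line_center + line_width / 2.0
    return overlap(l0, l1, s0, s1) / sens_w

# Two adjacent 6 micron pixels at [0, 6] and [6, 12]; 3 micron line
# centered on the boundary at x = 6.
print(response(6, 3, 0, 6, 1.0))    # fill factor 1: left pixel sees 25%
print(response(6, 3, 6, 6, 1.0))    # fill factor 1: right pixel sees 25%
print(response(6, 3, 0, 6, 0.25))   # fill factor 0.25: left pixel misses it
print(response(6, 3, 6, 6, 0.25))   # fill factor 0.25: right pixel misses it
print(response(9, 3, 6, 6, 0.25))   # line shifted 3 microns: 100% in right pixel
```

The numbers match the example above: with fill factor 1 the line is always recorded somewhere, while with fill factor 0.25 the same line is either missed completely or recorded at full intensity depending on where it lands.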
So what I would expect is:
- High fill factor would produce little aliasing
- Lower fill factor would show higher microcontrast, but much of that detail would be fake
What's your take on the issue?
The enclosed images are from the "Great 2006 MFDB shootout"; the camera on the left is the P25 and on the right the P30. The P25 is upsized to the same size as the P30 with bicubic interpolation, and both use Lightroom default sharpening. No additional sharpening was applied after upscaling the P25. I believe the P30 has microlenses and the P25 does not.