
Author Topic: Do your lenses match your sensors?  (Read 6779 times)

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8914
Do your lenses match your sensors?
« Reply #20 on: May 08, 2010, 05:18:48 am »

Quote from: fredjeang
But then, if so, the pixel race is totally fake.

I mean, there are pressures for more megapixels in a given sensor area, but if I understand correctly, we rarely use our older sensors to their full potential until we get the best possible lenses.

In that sense, the first thing we should upgrade is the lens equipment and not that much the sensor.

So instead of pixel race, what should be more appropriate is a lens quality race.

Yes, and no. When the sensor has higher resolution, it also gets a better sampling of whatever the lens can provide. That helps with sharpening as well. When the lens has higher resolution, especially across the whole image including the corners, then the sensor can record better image quality.

So it works both ways. That's why I mentioned MTF performance in an earlier post: it's the combined performance that matters. Both the sensor and the lens are factors in the equation, and the combined performance cannot exceed (it doesn't even come very close to) that of the worst contributor. So improving the worst component will always help most, but even improving the better components will help some.
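As a loose illustration of that "combined performance" point, the system MTF is the product of the component MTFs. The curves below are made-up linear roll-offs, not measured data for any real lens or sensor:

```python
def lens_mtf(f_cy_mm):
    """Hypothetical lens MTF: linear roll-off to zero at 100 cycles/mm."""
    return max(0.0, 1.0 - f_cy_mm / 100.0)

def sensor_mtf(f_cy_mm):
    """Hypothetical sensor MTF: linear roll-off to zero at 80 cycles/mm."""
    return max(0.0, 1.0 - f_cy_mm / 80.0)

def system_mtf(f_cy_mm):
    # The combined MTF is the product of the component MTFs, so it can
    # never exceed (and rarely comes close to) the weaker component.
    return lens_mtf(f_cy_mm) * sensor_mtf(f_cy_mm)

for f in (10, 30, 50):
    print(f, lens_mtf(f), sensor_mtf(f), system_mtf(f))
```

At 50 cycles/mm the lens alone is down to 0.5 and the sensor to 0.375, but the system sits at 0.1875 — well below either component, which is why improving the weaker one pays off most.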

Lenses can have a very high resolving power (although usually less in the corners than in the center). The capability to accurately sample the blur with a higher-resolution sensor array also allows us to do a better job of deconvolution sharpening afterwards, effectively removing the blur (although there are limits).

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

fredjeang

  • Guest
Do your lenses match your sensors?
« Reply #21 on: May 08, 2010, 05:26:54 am »

Quote from: BartvanderWolf
Yes, and no. When the sensor has higher resolution, it also gets a better sampling of whatever the lens can provide. That helps with sharpening as well. When the lens has higher resolution, especially across the whole image including the corners, then the sensor can record better image quality.

So it works both ways. That's why I mentioned MTF performance in an earlier post: it's the combined performance that matters. Both the sensor and the lens are factors in the equation, and the combined performance cannot exceed (it doesn't even come very close to) that of the worst contributor. So improving the worst component will always help most, but even improving the better components will help some.

Lenses can have a very high resolving power (although usually less in the corners than in the center). The capability to accurately sample the blur with a higher-resolution sensor array also allows us to do a better job of deconvolution sharpening afterwards, effectively removing the blur (although there are limits).

Cheers,
Bart
Bart, your points are very clear indeed.

But let's take for example the P65, which is a true full-frame MF sensor. So its physical dimensions are a standard, exactly like we have the 35mm standard.
If Phase keeps up the pixel race within this format, the backs will have more and more pixels in the same physical area.
This is where I can't understand the limits. I thought that pixel density was relevant to IQ.

To my mind, more pixels in the same size means less IQ. So it seems to me that lens technology, not so much the sensors, could be the right response.
But I'm far from being a tech guru.

Cheers.



ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Do your lenses match your sensors?
« Reply #22 on: May 08, 2010, 11:48:01 am »

Hi,

The IQ issue is quite complex. Human vision is a limiting factor: if the pixels are much smaller than what you can see in the print, they may not matter. So if you print small enough, increasing the number of pixels gives diminishing returns.

Now, IQ is often discussed in terms of noise-related issues; we may call it noise or DR. The main source of noise in a picture is shot noise, the random variation in the number of photons. Shot noise has a Poisson distribution, so it is proportional to the square root of the number of detected photons. Let's now assume that we replace one big pixel with four smaller ones. Each small pixel will count a quarter of the photons, so its relative noise will increase by a factor of two (the square root of 4 is 2). But the four pixels together will see the same number of photons as the single big pixel, so at the same enlargement the shot noise will be the same.

If we look at read noise as the limiting factor, it will be the same whether the pixels are large or small. But the small pixels will be able to hold only a fourth of the photons, so DR (which is defined as full well capacity / read noise) will be a fourth on a per-pixel basis. To sum up:

- If you print large enough, resolution will be better with more pixels, God, lenses and diffraction permitting.
- Normally it's no big deal
- At the pixel level (like actual pixels in Photoshop) the pixels will be noisier
- If read noise dominates, you will be worse off with more pixels
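The shot-noise and DR arithmetic above can be sketched numerically. The electron counts are purely illustrative, not figures from any real sensor:

```python
import math

# One big pixel vs. four pixels of a quarter the area, same total light.
photons_big = 40_000
photons_small = photons_big // 4

# Shot noise is the square root of the photon count, so per-pixel SNR
# drops by a factor of two for the small pixels...
snr_big = photons_big / math.sqrt(photons_big)        # sqrt(40000) = 200
snr_small = photons_small / math.sqrt(photons_small)  # sqrt(10000) = 100

# ...but summing the four small pixels recovers the big pixel's photon
# count, and with it the same SNR at the same enlargement.
snr_summed = (4 * photons_small) / math.sqrt(4 * photons_small)

# Per-pixel DR = full well capacity / read noise, here expressed in stops,
# assuming read noise (in electrons) is the same for both pixel sizes.
read_noise = 10.0
dr_big = math.log2(photons_big / read_noise)
dr_small = math.log2(photons_small / read_noise)  # exactly 2 stops less
```

A fourth of the full well with unchanged read noise is a factor-of-four (two stop) loss of per-pixel DR, which is Erik's point about small pixels being noisier when viewed at 100%.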

On the other hand:

- There is no good reason to make pixels very small. If the pixels are smaller than what the lens can resolve or the diffraction limit, we will get diminishing returns.
- A bigger sensor will always see more photons
- Some sensors may be more efficient than others

Finally, obtaining maximum performance from a sensor is not a trivial task. Stopping down to f/16 may reduce a 50 MPixel sensor to a 12 MP one, because of diffraction.
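A rough sanity check of that f/16 remark, assuming green light at 550 nm and the common Airy-disk approximation (diameter ≈ 2.44 · λ · N); the numbers are ballpark, not a claim about any particular back:

```python
wavelength_mm = 550e-6  # green light, 550 nm, expressed in mm

def airy_diameter_mm(f_number):
    # Diameter of the Airy disk out to the first dark ring: 2.44 * lambda * N
    return 2.44 * wavelength_mm * f_number

# At f/16 the Airy disk is roughly 21.5 microns across, so pixels much
# smaller than about half of that cannot capture additional real detail.
d_f16 = airy_diameter_mm(16)
print(round(d_f16 * 1000, 1), "microns")
```

Compare that 21.5-micron blur spot with the roughly 6-micron pitch of a 50+ MP medium-format back and it is clear why stopping down that far throws away much of the sensor's resolution.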

Best regards
Erik




Quote from: fredjeang
Bart, your points are very clear indeed.

But let's take for example the P65, which is a true full-frame MF sensor. So its physical dimensions are a standard, exactly like we have the 35mm standard.
If Phase keeps up the pixel race within this format, the backs will have more and more pixels in the same physical area.
This is where I can't understand the limits. I thought that pixel density was relevant to IQ.

To my mind, more pixels in the same size means less IQ. So it seems to me that lens technology, not so much the sensors, could be the right response.
But I'm far from being a tech guru.

Cheers.
Erik Kaffehr
 

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Do your lenses match your sensors?
« Reply #23 on: May 08, 2010, 12:30:40 pm »

Quote from: BartvanderWolf
The important spatial frequencies for the human visual system in turn have to do with output magnification and viewing distance, with a peak contrast sensitivity around 8 cycles/mm.
Bart,

An excellent post (as always), but I think the Contrast Sensitivity should be 8 cycles/degree, not 8 cycles/mm.

Regards,

Bill

Bart_van_der_Wolf

  • Sr. Member
  • ****
  • Offline
  • Posts: 8914
Do your lenses match your sensors?
« Reply #24 on: May 08, 2010, 01:00:27 pm »

Quote from: bjanes
An excellent post (as always), but I think the Contrast Sensitivity should be 8 cycles/degree, not 8 cycles/mm.

Hi Bill,

Thanks. OOPS, you're right, 8 cycles/degree is what I meant to say. Thanks for catching that (I've corrected that post to avoid confusion).
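For what it's worth, the corrected figure converts to a print-referred value in a straightforward way. The 250 mm viewing distance below is an assumption (a common reading distance), not something from the thread:

```python
import math

def cycles_per_mm(cycles_per_degree, viewing_distance_mm):
    # One degree of visual angle subtends d * tan(1 degree) millimetres
    # on a print held at distance d, so divide to convert the units.
    mm_per_degree = viewing_distance_mm * math.tan(math.radians(1))
    return cycles_per_degree / mm_per_degree

# 8 cycles/degree viewed at 250 mm works out to roughly 1.8 cycles/mm
# on the print surface.
peak = cycles_per_mm(8, 250)
```

So the two units describe the same peak sensitivity; cycles/degree is fixed by the eye, while the equivalent cycles/mm on the print shifts with viewing distance.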

Cheers,
Bart
« Last Edit: May 08, 2010, 01:04:44 pm by BartvanderWolf »
== If you do what you did, you'll get what you got. ==