
Author Topic: CCD and CMOS  (Read 66221 times)

ondebanks

  • Sr. Member
  • ****
  • Offline
  • Posts: 858
Re: CCD and CMOS
« Reply #60 on: September 30, 2011, 08:20:32 pm »

I think these are minor differences in the big picture. 

By far the biggest factor in the output color response is the spectral transmission of the Bayer color filters.  As far as color reproduction goes (*), I've measured some CCDs with great spectral transmission curves and other CCDs with not-so-great ones.  The same with CMOS sensors. 

Eric

(*) i.e., getting a good match on the so-called Luther-Ives condition

Eric,

Apropos of nothing, you might be amused to hear that for some time, whenever I saw your username "madmanchan" I kept reading it as "Mad Manchán" - Manchán is an old and uncommon Irish forename (pronounced Man-KHAWN). Your signature alerted me to the fact that it's really "Madman Chan"!  :D

Anyway, I'm interested in your statement that you've measured several sensor transmission curves. What's your measurement setup? Have you seen any significant deviations from the corresponding curves in data sheets?

Ray

Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #61 on: October 01, 2011, 02:23:20 pm »

How much "better" would/could the color response have been if one removed the CFA and used a set of 3 purpose-made color filters that were inserted one at a time for 3 separate exposures (assuming that the scene did not run and hide in-between exposures)?

So perhaps the question could be rephrased as: "how big a limitation is it that the spectral filtering carried out in the CFA has to have really small features, and be economically/practically feasible"?

I believe that color wheels are commonly used for multi-spectral cameras. I would guess that with 7 or 10 wisely chosen bandpass filters, you would have a lot of options not available to regular cameras.

-h
Logged

PierreVandevenne

  • Sr. Member
  • ****
  • Offline
  • Posts: 512
    • http://www.datarescue.com/life
Re: CCD and CMOS
« Reply #62 on: October 01, 2011, 02:44:25 pm »

Well, most medium- to upper-level amateur astro cameras come with filter wheels. You typically take three exposures and then combine them into a color image. Here's a typical filter wheel
http://www.optcorp.com/product.aspx?pid=319-327-339-4242
Here's a typical integrated wheel
http://www.qsimaging.com/540-overview.html

The difference with an RGB filter array is, of course, that when one filter is in front of the camera, you benefit from photons captured by all the sensels, thereby doubling the practical green QE vs a Bayer matrix and quadrupling red and blue. This comes, of course, with the penalty of having to take three exposures. The problems for photography are essentially that conditions change when you take frames in succession: the camera moves a tiny bit, the lighting changes, etc., and you end up with three images that aren't identical. Even the focal plane can shift its position somewhat (it's dramatic in an achromat, tolerable in an apochromat). Assuming a star is red, you'll also get, even if the focus is perfect, a larger diameter in the red channel than in the green channel, and you'll have to handle that in some way. It's very bad for bright point sources, maybe a bit less so for full images with smaller luminosity differences, but you don't want to have to deal with all those issues when shooting pictures.
When those cameras aren't used with standard RGB filters, which have no purpose other than producing "pretty pictures" of no scientific value, they are used with filters whose bandwidth is very well defined (http://en.wikipedia.org/wiki/Photometric_system)
BTW, that tri-color filter process was used in the early 20th century. http://en.wikipedia.org/wiki/Sergei_Mikhailovich_Prokudin-Gorskii

And of course, a variation of that is the triple CCD in some video cameras
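Going back to the filter-wheel workflow: to make the combine step concrete, here is a minimal Python sketch. The file names are hypothetical, it assumes astropy is available and that the three frames are already registered and equally exposed, and it skips white balance and all the alignment headaches described above.

Code:
# Combine three filter-wheel exposures (R, G, B) into one colour image.
import numpy as np
from astropy.io import fits  # FITS is the usual format for astro cameras

def load_mono(path):
    """Load a monochrome frame as float32."""
    return fits.getdata(path).astype(np.float32)

r = load_mono("target_R.fits")  # hypothetical file names
g = load_mono("target_G.fits")
b = load_mono("target_B.fits")

rgb = np.dstack([r, g, b])                   # H x W x 3 stack
rgb /= np.percentile(rgb, 99.5)              # crude common normalisation
rgb = np.clip(rgb, 0.0, 1.0) ** (1.0 / 2.2)  # simple gamma for display

# Each channel was recorded by every sensel, so per-channel sampling is at
# full sensor resolution (the advantage over a Bayer mosaic noted above).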
« Last Edit: October 01, 2011, 02:46:03 pm by PierreVandevenne »
Logged

theguywitha645d

  • Sr. Member
  • ****
  • Offline
  • Posts: 970
Re: CCD and CMOS
« Reply #63 on: October 01, 2011, 02:46:08 pm »

I run a variety of microscope cameras with color filters (commonly the same filtration as the Bayer filters) and Bayer patterns. The color is, practically speaking, the same. The only advantage of the filtered camera is the possibility of tuning the filters--color filter wheels are a little old-fashioned and LCD tunable filters work better. However, with broad bands there is not much benefit in tuning the filters; that is usually left to very narrow bands measured in angstroms.

As for the loss of resolution due to the Bayer pattern, it is insignificant. The benefit of unfiltered monochrome sensors is really in sensitivity.
Logged

madmanchan

  • Sr. Member
  • ****
  • Offline
  • Posts: 2115
    • Web
Re: CCD and CMOS
« Reply #64 on: October 01, 2011, 04:54:30 pm »

As far as the sensor is concerned, the important factor (as noted above by others) in color reproduction is not the density of the color filters (or their spatial arrangement in a mosaic pattern such as Bayer), but rather the shapes of the transmission curves and how they relate to each other. Ideally, from a color perspective, you'd want the transmission curves to be the same as the human cone responses (in the eye), or a linear transformation thereof. But there is a tradeoff in terms of color vs noise, and of course there are other practical constraints due to materials, manufacturing, costs, etc., so in practice this technical condition is not satisfied. As I mentioned earlier, this is a separate issue from the choice of CCD vs CMOS.

But as far as photography is concerned, my experience has been that color response differences from system to system have less to do with the sensor, and more to do with the software rendering applied in post-processing (even if the user never touches any sliders or controls). Example: Canon has various Picture Styles (such as Portrait and Landscape) available in their software for their cameras, some of which have CMOS sensors, some of which have CCD sensors. The difference in visual appearance between these software-based styles is far greater than the actual differences in the color filters!

Ray, I generally measure camera optical systems with a monochromator to estimate the transmission curves over the visible and near-IR range. However, I don't have manufacturer data sheets for most of the systems I measure (and even for those for which I do, the maker's data is usually for the sensor alone, whereas I prefer to measure sensor + lens combinations, so comparisons are hard). And you never know -- maybe I was a mad Irishman in a previous life!! ;D
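For anyone who wants to play with their own measurements, here is a minimal sketch of how one might check how close a set of camera curves comes to the Luther-Ives condition. It assumes `cam` and `xyz` are the measured camera sensitivities and the CIE XYZ colour matching functions sampled on the same wavelength grid; both are hypothetical arrays here, not Eric's data.

Code:
import numpy as np

def luther_ives_fit(cam, xyz):
    """Least-squares 3x3 matrix M such that cam @ M approximates xyz.

    cam, xyz: N x 3 arrays sampled on a common wavelength grid
    (e.g. 380-730 nm in 10 nm steps).
    """
    M, _, _, _ = np.linalg.lstsq(cam, xyz, rcond=None)
    fit = cam @ M
    rel_err = np.linalg.norm(fit - xyz) / np.linalg.norm(xyz)
    return M, rel_err

# M, err = luther_ives_fit(cam, xyz)
# err == 0 would mean the camera curves are an exact linear transform of the
# colour matching functions (Luther-Ives satisfied); real cameras sit above 0.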
Logged
Eric Chan

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #65 on: October 03, 2011, 03:08:13 am »

And of course, a variation of that is the triple CCD in some video cameras
But then each photon is counted (at least in theory). For Bayer and color-wheel solutions, only roughly 1/3 of the photons hitting the sensor during the total exposure time are counted; the rest are absorbed in the spectral bandpass filters.
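Under a toy model (N photons arriving uniformly over the total exposure, the visible band divided into three equal sub-bands, ideal filters with unity in-band and zero out-of-band transmission), the counted totals work out as below. Real filters overlap and absorb in-band too, so the exact figures differ, but this is the rough accounting behind the "1/3" above:

\[
\begin{aligned}
\text{colour wheel:}\quad & 3 \times \frac{N}{3} \times \frac{1}{3} = \frac{N}{3},\\
\text{Bayer (RGGB):}\quad & N \times \left(\tfrac{2}{4}\cdot\tfrac{1}{3} + \tfrac{1}{4}\cdot\tfrac{1}{3} + \tfrac{1}{4}\cdot\tfrac{1}{3}\right) = \frac{N}{3},\\
\text{ideal 3-way split (3CCD):}\quad & N.
\end{aligned}
\]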

-h
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #66 on: October 03, 2011, 03:10:49 am »

As far as the sensor is concerned, the important factor (as noted above by others) in color reproduction is not the density of the color filters (or their spatial arrangement in a mosaic pattern such as Bayer), but rather the shapes of the transmission curves and how they relate to each other.
Sure, but my gut feeling is that whenever you have to do something really tiny, complex and economical, you lose something. If Canon & Nikon are in fact free to make whatever spectral response they see fit (keeping in mind the color-response-vs-noise issue you mentioned), then my gut feeling was wrong.

-h
Logged

PierreVandevenne

  • Sr. Member
  • ****
  • Offline
  • Posts: 512
    • http://www.datarescue.com/life
Re: CCD and CMOS
« Reply #67 on: October 03, 2011, 07:36:06 am »

But then each photon is counted (at least in theory). For Bayer and color-wheel solutions, only roughly 1/3 of the photons hitting the sensor during the total exposure time are counted; the rest are absorbed in the spectral bandpass filters.

Not sure about that

http://en.wikipedia.org/wiki/File:Dichroic-prism.svg

http://en.wikipedia.org/wiki/File:A_3CCD_imaging_block.jpg

I don't have a well-defined opinion on the efficiency of splitting vs filtering, but I am under the impression that the sensors in 3CCD or 3MOS cameras don't get all the photons. If they did, it would be a mess to colour balance, imho.
Logged

eronald

  • Sr. Member
  • ****
  • Offline
  • Posts: 6642
    • My gallery on Instagram
Re: CCD and CMOS
« Reply #68 on: October 03, 2011, 08:25:25 am »

Not sure about that

http://en.wikipedia.org/wiki/File:Dichroic-prism.svg

http://en.wikipedia.org/wiki/File:A_3CCD_imaging_block.jpg

I don't have a well-defined opinion on the efficiency of splitting vs filtering, but I am under the impression that the sensors in 3CCD or 3MOS cameras don't get all the photons. If they did, it would be a mess to colour balance, imho.

+1

Edmund
Logged
If you appreciate my blog posts, help me by following on https://instagram.com/edmundronald

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #69 on: October 03, 2011, 08:45:05 am »

Not sure about that

http://en.wikipedia.org/wiki/File:Dichroic-prism.svg

http://en.wikipedia.org/wiki/File:A_3CCD_imaging_block.jpg

I don't have a well-defined opinion on the efficiency of splitting vs filtering, but I am under the impression that the sensors in 3CCD or 3MOS cameras don't get all the photons. If they did, it would be a mess to colour balance, imho.
How should I interpret those figures in light of your statement?

I am by no means an expert on this topic. But it seems to me that if such a thing as "perfect" splitting of light based on wavelength exists (I am sure that it does not, but perhaps as an approximation), then some kind of 3-band bandpass filtering might be possible with a "3CCD" solution. It might not provide the _desirable_ shape of spectral selectivity, and it may have all kinds of practical/economical drawbacks, but I think this is an interesting aspect of it.

It all boils down to doing spectral selection using spectral absorption vs spectral reflectance - at least on the level of physics that I am able to follow :-)

-h
Logged

PierreVandevenne

  • Sr. Member
  • ****
  • Offline
  • Posts: 512
    • http://www.datarescue.com/life
Re: CCD and CMOS
« Reply #70 on: October 03, 2011, 10:27:14 am »

It seems that multi-channel dichroic prisms are, in theory at least, better than wideband RGB filters, in the sense that there are no holes, spikes, overlaps, etc. in the transmission bands. The incoming light is split, you characterize it, and that's it. I guess this could also allow different distances for the three focal planes to compensate for chromatic aberration.

http://www.optec.eu/eng/multichannel/1194.htm

http://www.firstlightoptics.com/rgb-filters-filter-sets/baader-lrgbc-ccd-filter-set.html

But in practice, I have only worked with wide and narrow band filters and therefore will try to keep my foot out of my mouth, waiting for someone more competent in those matters to eventually jump in ;-)
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: CCD and CMOS
« Reply #71 on: October 03, 2011, 10:31:05 am »

As far as the sensor is concerned, the important factor (as noted above by others) in color reproduction is not the density of the color filters (or their spatial arrangement in a mosaic pattern such as Bayer), but rather the shapes of the transmission curves and how they relate to each other. Ideally, from a color perspective, you'd want the transmission curves to be the same as the human cone responses (in the eye), or a linear transformation thereof. But there is a tradeoff in terms of color vs noise, and of course there are other practical constraints due to materials, manufacturing, costs, etc., so in practice this technical condition is not satisfied. As I mentioned earlier, this is a separate issue from the choice of CCD vs CMOS.

An example of these tradeoffs is discussed in the DXO paper comparing the Nikon D5000 with the Canon EOS 500D. The Canon has poor color depth due to the characteristics of its CFA filters. The problem lies mainly in the red CFA filter, which is actually more sensitive to green than to red, as shown below. This necessitates large coefficients in the color matrix, which add noise. In contrast, the Nikon has a better red response and greater color depth.

CCD sensors can also have unfavorable CFA characteristics, as shown by the DXO analysis of the Phase One P45+, where the red channel is also more sensitive to green than to red. The camera has a poor metamerism index of 72, compared to an index of 83 for the D5000. The P45+ is an older camera, and the situation is much improved with the newer P40+. These studies indicate that CCDs do not necessarily have better color depth than CMOS designs.
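To illustrate why large matrix coefficients add noise: if the raw channel noise is independent with equal standard deviation, each row of the colour correction matrix amplifies it by the square root of the sum of its squared coefficients. A small sketch with made-up matrices (illustrative only, not the actual 500D/D5000 data):

Code:
import numpy as np

# Hypothetical colour correction matrices, for illustration only.
M_selective = np.array([[ 1.2, -0.1, -0.1],   # well-separated CFA
                        [-0.1,  1.3, -0.2],
                        [ 0.0, -0.3,  1.3]])
M_wide      = np.array([[ 1.9, -0.8, -0.1],   # strongly overlapping CFA
                        [-0.4,  1.8, -0.4],
                        [ 0.0, -0.9,  1.9]])

def noise_gain(M):
    """Per-channel output noise relative to the (equal, independent) input noise."""
    return np.sqrt((M ** 2).sum(axis=1))

print(noise_gain(M_selective))  # modest amplification
print(noise_gain(M_wide))       # larger coefficients, more colour noise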

Regards,

Bill

Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #72 on: October 03, 2011, 11:05:57 am »

An example of these tradeoffs is discussed in the DXO paper comparing the Nikon D5000 with the Canon EOS 500D. The Canon has poor color depth due to the characteristics of its CFA filters. The problem lies mainly in the red CFA filter, which is actually more sensitive to green than to red, as shown below. This necessitates large coefficients in the color matrix, which add noise. In contrast, the Nikon has a better red response and greater color depth.
Do you think that this is a trade-off of achromatic SNR vs color noise, or sensor cost/performance vs color noise?

-h
Logged

bjanes

  • Sr. Member
  • ****
  • Offline
  • Posts: 3387
Re: CCD and CMOS
« Reply #73 on: October 03, 2011, 11:57:26 am »

Do you think that this is a trade-off of achromatic SNR vs color noise, or sensor cost/performance vs color noise?

-h

The article states, "This comparison is a bit surprising with respect to the previous SNR 18% results. Why such a difference? Color sensitivity is impacted by noise curves and spectral responses. If SNR curves are close, most of the divergence observed must be due to a difference in spectral sensitivities, which implies very different color processing for each sensor."

I conclude that the difference is largely due to color noise.

Regards,

Bill
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: CCD and CMOS
« Reply #74 on: October 03, 2011, 12:41:57 pm »

The article states, "This comparison is a bit surprising with respect to the previous SNR 18% results. Why such a difference? Color sensitivity is impacted by noise curves and spectral responses. If SNR curves are close, most of the divergence observed must be due to a difference in spectral sensitivities, which implies very different color processing for each sensor."

I conclude that the difference is largely due to color noise.

Regards,

Bill
I should have phrased my question differently. Given that Canon has a less spectrally selective CFA than Nikon, and thereby a color correction matrix that is further from the identity matrix and more prone to color noise:
-Did they do this because they think that wider filters, passing more photons, give them an advantage when shooting spectrally broad/flat scenes?
-Or does Canon have a sensor with a disadvantage in the first place, with spectrally wide filters used to hide its flaws?

Or perhaps this is a feature of the silicon process that Canon uses, perhaps linked to microlenses, etc.?

I have heard that Sony Alpha DSLRs take a radically different approach (closer to the standard CIE observer, at the cost of more noise) - is that right?

-h
Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: CCD and CMOS
« Reply #75 on: October 03, 2011, 01:17:10 pm »

Hi,

I guess that Canon seeks better high ISO performance and therefore has more overlap between the filters in the CFA. But that is just a guess.

Best regards
Erik

I should have phrased my question differently. Given that Canon has a less spectrally selective CFA than Nikon, and thereby a color correction matrix that is further from the identity matrix and more prone to color noise:
-Did they do this because they think that wider filters, passing more photons, give them an advantage when shooting spectrally broad/flat scenes?
-Or does Canon have a sensor with a disadvantage in the first place, with spectrally wide filters used to hide its flaws?

Or perhaps this is a feature of the silicon process that Canon uses, perhaps linked to microlenses, etc.?

I have heard that Sony Alpha DSLRs take a radically different approach (closer to the standard CIE observer, at the cost of more noise) - is that right?

-h
Logged
Erik Kaffehr
 

mikejyg

  • Newbie
  • *
  • Offline
  • Posts: 1
Re: CCD and CMOS
« Reply #76 on: January 11, 2013, 12:10:50 pm »

I think there is still one important area where CMOS cannot approach CCD quality: uniformity.

Its effect is subtle and not easily understood, and CMOS manufacturers do not like to talk about it either.

It's like THD (dynamic noise) vs. SNR in audio quality, SNR being the common "noise" figure people use when describing CMOS and CCD.
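For anyone who wants to put a number on "uniformity": a minimal sketch of the usual flat-field pair method for estimating pixel response non-uniformity (PRNU), assuming flat_a and flat_b are two frames of an evenly lit target at around half saturation (hypothetical arrays; offset/dark subtraction is skipped for brevity):

Code:
import numpy as np

def prnu_percent(flat1, flat2):
    """PRNU as a percentage of the mean signal, from two flat-field frames."""
    flat1 = flat1.astype(np.float64)
    flat2 = flat2.astype(np.float64)
    mean_signal = 0.5 * (flat1.mean() + flat2.mean())
    var_temporal = np.var(flat1 - flat2) / 2.0  # differencing removes the fixed pattern
    var_spatial = np.var(flat1)                 # fixed pattern + temporal noise
    var_prnu = max(var_spatial - var_temporal, 0.0)
    return 100.0 * np.sqrt(var_prnu) / mean_signal

# prnu_percent(flat_a, flat_b)  # compare the figure for a CCD and a CMOS body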
Logged

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: CCD and CMOS
« Reply #77 on: January 11, 2013, 05:07:06 pm »

Hi,

I guess that Canon seeks better high ISO performance and therefore has more overlap between the filters in the CFA. But that is just a guess.

Best regards
Erik


That was my take as well. It's for high ISO performance by letting through more photons.
Logged

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: CCD and CMOS
« Reply #78 on: January 11, 2013, 05:17:30 pm »

It seems that multi-channel dichroic prisms are, in theory at least, better than wideband RGB filters, in the sense that there are no holes, spikes, overlaps, etc. in the transmission bands. The incoming light is split, you characterize it, and that's it. I guess this could also allow different distances for the three focal planes to compensate for chromatic aberration.

http://www.optec.eu/eng/multichannel/1194.htm

http://www.firstlightoptics.com/rgb-filters-filter-sets/baader-lrgbc-ccd-filter-set.html

But in practice, I have only worked with wide and narrow band filters and therefore will try to keep my foot out of my mouth, waiting for someone more competent in those matters to eventually jump in ;-)


You know a lot more about the technology than I do.

What I will say is that I have seen 3-chip HD video vs 1-chip HD video. The 3-chip systems look way better, maybe an order of magnitude better. Go to your local electronics store and compare the cameras for yourself. Panasonic makes a nice 3CMOS camcorder; compare it to any manufacturer's camera using a single chip of similar size (not the Sony NEX, which is a much bigger chip).

Edit: by compare, I mean shoot video in the store with each and output it to an HDTV.
« Last Edit: January 11, 2013, 05:19:12 pm by Fine_Art »
Logged

jeremypayne

  • Guest
Re: CCD and CMOS
« Reply #79 on: January 11, 2013, 05:21:30 pm »

I think there is still one important area where CMOS cannot approach CCD quality: uniformity.

Hi ... can you back that up? 

Never heard that before and not sure what would account for such a difference.
Logged