Your friend is wrong - or at least, quoting out-of-date information. CMOS has overtaken CCD in noise performance and is now beginning to displace CCDs in research instrumentation. I'm an astronomer too; my collaborators have just bought some new cameras for use in a high-speed photometer on one of the world's largest telescopes - and guess what, they are CMOS (Andor S-CMOS), not CCD. Future space sensors will be CMOS too; just remember that space missions often take 10 years to develop, and usually deploy very old, reliable, tested-to-death, radiation-hardened technology, so it takes a while for new tech to work through to launch.

Hi, what about infrared? I am asking because I have this CCD DSLR and I was thinking of removing the IR filter. From your argument, the old advantages of CCD for the visible spectrum no longer hold. Could you elaborate about IR (for astrophotography)?
Excellent High ISO performance requires 3 things: (1) Efficient light capture, (2) Low internal noise, (3) Large sensitive surface area per pixel. Since we can make pixels as big as we like with both CCD and CMOS, that third factor cancels out when comparing them. So let's look at (1) and (2).
(1) Sensitivity/Quantum Efficiency: by using microlenses to direct the light into the active areas of each pixel, CMOS are now ahead of front-illuminated CCDs and gaining on back-illuminated ones. Silicon is silicon in both cases; once photons are on target for an active pixel well, photons will be absorbed in depth distributions proportional to their wavelength in both cases.
(2) Internal noise: At normal (photographic) readout speeds, modern CMOS cameras are several times lower in readout noise than CCDs. Many are in the 2-electron area. S-CMOS is at 1 electron. CCD-based photographic cameras are at best around 6 or 7 electrons, and medium format digital's current best is 12 electrons.
Then there's dark noise. It is almost negligible in good CMOS sensors at everyday temperatures, but still woeful in uncooled CCDs (cooled means taking it to several tens of degrees C below ambient temperature, as in an astronomical camera). A 300-second exposure with a modern Canon CMOS, without dark frame subtraction, looks as good as a 3-second exposure with many CCD cameras, also without dark frame subtraction. All medium format digital backs/cameras require the subtraction of long dark frames to alleviate this noise, but of course you cannot subtract its random (Poisson) fluctuations, so that adds to the internal noise.
Ray
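The 300-second-CMOS vs 3-second-CCD comparison above can be put into a toy noise model. This is only a sketch: the dark-current rates and read-noise figures below are assumed round numbers chosen to illustrate the shape of the claim, not measurements of any real camera.

```python
import math

def dark_plus_read_noise(dark_e_per_s, t_s, read_e):
    """Total non-signal noise (electrons RMS) for one pixel:
    dark-current shot noise and read noise added in quadrature."""
    dark_electrons = dark_e_per_s * t_s          # accumulated dark charge
    return math.sqrt(dark_electrons + read_e ** 2)

# Illustrative (assumed) figures: a low-dark-current CMOS at ~0.1 e-/s/pixel
# versus an uncooled CCD at ~10 e-/s/pixel.
cmos_300s = dark_plus_read_noise(0.1, 300, 3.0)   # 300 s CMOS exposure
ccd_3s = dark_plus_read_noise(10.0, 3, 6.0)       # 3 s CCD exposure
```

With these assumed rates, the 300 s CMOS frame carries slightly less dark-plus-read noise than the 3 s CCD frame, which is the point being made above.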
Could someone give me a technical explanation why CMOS can shoot at such high ISOs (Nikon D700, etc.) compared to CCDs, despite CCDs being more sensitive to light and being a more expensive technology? A friend of mine is a scientist and tells me that in astronomy and industrial applications the reverse of what is going on in the camera business is true: CCDs are more light-sensitive and have less noise compared to CMOS. I noticed with most cameras that use CCDs (like the Phase backs) anything over ISO 800 is not really considered to be great. The sensors that go to space are also CCDs.

All MFDBs are CCD (up to now). The P1 P25+ and P45+ are the world leaders in low light because of their ability to keep the sensor cool for long exposures; this is due to the great cooling system, not the type of sensor used. The MFDBs are the worst in ISO performance because they lack any kind of noise reduction filter or circuit; this is a choice of the manufacturers, due to the demand of the MF market for purity, so that low-ISO performance is maximum! The IQ of DBs is well beyond any modern DSLR even if they are 5 or 6 years old! The reason for CMOS sensors is mainly the low cost of production, but they do cost a lot more than CCDs to develop. I guess that the production volume of DSLRs compensates for the high development cost of CMOS. CCDs are considered sharper and with greater color than CMOS; I'm not aware which technology is better for high ISO. To compare that we should have a comparison of cameras with both technologies but the same NR system. Regards, Theodoros. www.fotometria.gr
Excellent High ISO performance requires 3 things: (1) Efficient light capture, (2) Low internal noise, (3) Large sensitive surface area per pixel. Since we can make pixels as big as we like with both CCD and CMOS, that third factor cancels out when comparing them. So let's look at (1) and (2).
(1) Sensitivity/Quantum Efficiency: by using microlenses to direct the light into the active areas of each pixel, CMOS are now ahead of front-illuminated CCDs and gaining on back-illuminated ones. Silicon is silicon in both cases; once photons are on target for an active pixel well, photons will be absorbed in depth distributions proportional to their wavelength in both cases.
Microlenses will collect light and focus it on the active sensing area of the sensel, but what about charge density, which limits the full well capacity of a sensel? The transistors in a CMOS will take up space and reduce the active sensing area of the sensel, leading to increased charge density in the CMOS as compared to CCD.

I don't quite agree with point 3. For a given sensor size, pixel size is inversely proportional to the megapixel count. You can make the sensor larger, but at considerable expense.
Regards,
Bill
CCDs are considered sharper and with greater color than CMOS
CCD-based photographic cameras are at best around 6 or 7 electrons, and medium format digital's current best is 12 electrons.
did you try to compare CCD-based P&S w/ CMOS based FF dSLR ;) ? so is the difference because of CCD vs CMOS or because of other factors ::)

1. No, neither will I. :) 2. Don't know, is it? You seem to ignore "is considered" in my quote! ;) Who cares anyway? 8) Regards, Theodoros. www.fotometria.gr
Microlenses will collect light and focus it on the active sensing area of the sensel, but what about charge density, which limits the full well capacity of a sensel? The transistors in a CMOS will take up space and reduce the active sensing area of the sensel, leading to increased charge density in the CMOS as compared to CCD.
I don't quite agree with point 3. For a given sensor size, pixel size is inversely proportional to the megapixel count. You can make the sensor larger, but at considerable expense.
small CCD-sensor-based P&S like the S95 are <= 4 electrons, are they not?
did you try to compare CCD-based P&S w/ CMOS based FF dSLR ;) ? so is the difference because of CCD vs CMOS or because of other factors ::)
Hi, what about infrared? I am asking because I have this CCD DSLR and I was thinking of removing the IR filter. From your argument, the old advantages of CCD for the visible spectrum no longer hold. Could you elaborate about IR (for astrophotography)?
All MFDBs are CCD (up to now). The P1 P25+ and P45+ are the world leaders in low light because of their ability to keep the sensor cool for long exposures; this is due to the great cooling system, not the type of sensor used. The MFDBs are the worst in ISO performance because they lack any kind of noise reduction filter or circuit; this is a choice of the manufacturers, due to the demand of the MF market for purity, so that low-ISO performance is maximum! The IQ of DBs is well beyond any modern DSLR even if they are 5 or 6 years old! The reason for CMOS sensors is mainly the low cost of production, but they do cost a lot more than CCDs to develop. I guess that the production volume of DSLRs compensates for the high development cost of CMOS. CCDs are considered sharper and with greater color than CMOS; I'm not aware which technology is better for high ISO. To compare that we should have a comparison of cameras with both technologies but the same NR system. Regards, Theodoros. www.fotometria.gr
P.S. I prefer to judge products by what they do for my photography; I don't care much to analyze the technical origin of the solution given by manufacturers, nor the scientific theory behind it, for that matter.
I wish that MFDB manufacturers would introduce this as a selectable feature on their backs - a slower, lower noise readout option, for those who can wait a few seconds longer between shots.
But Bill, we are talking about high-ISO performance ....
.....
Ray
small CCD-sensor-based P&S like the S95 are <= 4 electrons, are they not?
Thanks, Ray, for this detailed and interesting information.
Thierry
In Kodak MF CCDs, the green- and blue-filtered pixels tend to recover in transmission as the IR band begins beyond 700nm, and beyond about 820nm every pixel is transmitting equally, allowing shorter IR exposures and giving monochromatic B&W IR images which in principle require no de-Bayering interpolation, which would allow perfect per-pixel sharpness.
Ray, this is very interesting. I believe my Pentax 645D has a Kodak sensor and I was thinking of trying some IR (but not removing the IR cut filter). Pentax has really worked on the performance of this sensor, with ISOs of 1600 and no limit on integration time - I have done some wide-field astro images with exposures up to 5 minutes and known folks to shoot up to half an hour. You wouldn't happen to know where I could get the spectral response of a Kodak 44x33mm 40MP Bayer sensor?
Your friend is wrong - or at least, quoting out of date information. CMOS has overtaken CCD in noise performance and is now beginning to displace CCDs in research instrumentation. I'm an astronomer too; my collaborators have just bought some new cameras for use in a high speed photometer on one of the world's largest telescopes - and guess what, they are CMOS (Andor S-CMOS), not CCD.
Hello,
I believe this is what you were looking for: http://www.kodak.com/ek/uploadedFiles/Content/Small_Business/Images_Sensor_Solutions/Datasheets(pdfs)/KAF-40000LongSpec.pdf the spectral response is on page 15.
Best regards,
Radu
That is true, but a read noise of 4 electrons for a small pixel has a quite different effect on measured noise than for a large pixel because of camera gain, which is the number of electrons collected per data number (raw value). Since the P&S sensors use 12 bits per pixel for their output, I will use this bit depth for comparison.
A small-pixel camera such as the Canon S70 may collect 8200 electrons and has a read noise of 3.2 electrons. Assuming that the raw pixel value at full well is near the maximum 12-bit pixel value of 4095, the gain is 2.0 electrons/data number. A larger pixel, such as with the Nikon D3, can collect about 66,500 electrons with a read noise of 4.9 electrons and a gain of 16 electrons/data number. The data are from Roger Clark (http://www.clarkvision.com/articles/digital.sensor.performance.summary/).
When we look at measured noise in the image, we are concerned with data numbers, not electrons. A read noise of 3.2 electrons for the S70 would translate to 3.2/2.0 or 1.6 data numbers. The read noise in terms of data numbers for the D3 would be 4.9/16 or 0.31 data numbers.
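The electrons-to-data-numbers conversion described above is just a division by the camera gain. A minimal sketch using the figures quoted from Clark's data (the assumption, as stated above, is that full well maps near the 12-bit maximum raw value):

```python
def gain_e_per_dn(full_well_e, max_dn=4095):
    # Camera gain: electrons collected per raw data number,
    # assuming full well maps near the maximum 12-bit raw value
    return full_well_e / max_dn

def read_noise_dn(read_noise_e, gain):
    # Read noise expressed in raw data numbers instead of electrons
    return read_noise_e / gain

s70_gain = gain_e_per_dn(8200)           # ~2.0 e-/DN (Canon S70)
d3_gain = gain_e_per_dn(66500)           # ~16 e-/DN (Nikon D3)
s70_rn = read_noise_dn(3.2, s70_gain)    # ~1.6 DN
d3_rn = read_noise_dn(4.9, d3_gain)      # ~0.3 DN
```

So, although the large-pixel camera has slightly *more* read noise in electrons, it has markedly less in raw data numbers, which is what shows up as measured noise in the image.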
Where is the "cutting point" in exposure time when film generally gives better results than digital?

That is a lot more complicated, as film suffers from reciprocity law failure. So with a long exposure you will not get noise with film, but you will not be capturing as many photons (or rather the photosites on the grain lose energy and unexpose, so to speak). If you have one stop of compensation with a calculated 15-minute exposure, then a 15-minute digital exposure would need to be compared to a 30-minute film exposure. The digital camera will not have a change in contrast either. At some point, film will give up completely and not be able to record information, whereas a sensor can still be counting photons. I guess a film exposure can be much better in very hot environments. A digital back would need active cooling, and as far as I know, only scientific cameras do that.
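The reciprocity-failure compensation in the example above can be sketched with the Schwarzschild law, under which the effective film exposure goes as I * t**p with p < 1. The exponent 0.9 below is an assumed, film-specific value used only for illustration, not a property of any particular emulsion:

```python
def schwarzschild_time(metered_s, p=0.9):
    """Corrected film exposure time (seconds) when the emulsion
    responds as I * t**p rather than I * t.
    p is film-specific; 0.9 here is an illustrative assumption."""
    return metered_s ** (1.0 / p)

corrected = schwarzschild_time(15 * 60)   # a metered 15 min comes out near 32 min
```

With p = 0.9, a metered 15-minute exposure lands close to the 30-minute figure quoted above, i.e. roughly one stop of compensation.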
And I've done my own little version, comparing my MFDB to my Canon 5DII. Surprise, surprise, the Canon trounced the MFDB. I still prefer the MFDB at low ISO and short exposures, and the handling of a MF camera.
Leica is looking into this as a firmware update for the S2. It would be a user-selectable menu option.
Neither yours nor any other Canon turns its NR off when you instruct it to do so from the menu; it stays on to some extent by default!

Lots of hot air, lots of opinion, no references.
Lots of hot air, lots of opinion, no references.

It's true that temperature can rise if people completely distort what you state and call you "confused". As for references, if my British MechEng BEng degree doesn't do, have a look at DPReview on any modern DSLR; they state the same. Regards, Theodoros. www.fotometria.gr
-h
I feel that answer is a bit misleading. Aren't the keywords "high speed" here? You can find all kinds of instruments in professional telescopes, including some based on relatively exotic designs (we aren't getting Aladdin III InSb sensors in our cameras any time soon, I think). CMOS-based architectures are of course widely used in fields where high speed is possible or desirable (photometry of occultations, for example, or solar observation, etc.). But CCD still reigns in imaging applications. I was so surprised by the above statement that I double-checked what current major observatories use as imagers:
ESO Paranal - http://www.eso.org/sci/facilities/paranal/instruments/index.html - have a look at the detailed description of the instruments, too many to list here
Gran Telescopio Canarias - http://www.gtc.iac.es/en/pages/instrumentation/osiris.php#Detector
Subaru - http://www.naoj.org/Observing/Instruments/SCam/
If the purpose is going deep with long exposures, everyone seems to still be using CCDs.
Not that I disagree with the increased usefulness of CMOS based sensors in many fields in general - but do you have examples of CMOS sensors used for image acquisition in fairly long exposures?
Neither yours nor any other Canon turn their NR off when you instruct them to do so from menu, it stays on to some extend by default!
I've never seen any proof for that assumption; I did see proof of the opposite. A simple method to detect noise reduction is to shoot a well-focused image of a "white noise" patch, and then display the Fourier transform. The signal decline towards higher spatial frequencies (caused by MTF, finite sensel size, and the AA filter for suppressing Bayer CFA-related aliasing) is pretty gradual, unlike the Fourier transform of a noise-reduced image. The Fourier transform of the read noise alone also doesn't show signs of noise reduction.
If you have any trustworthy proof for your assertion, a lot of people would be interested, I'm sure.
Cheers,
Bart
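Bart's white-noise test can be simulated end to end. The sketch below generates a synthetic white-noise frame, then applies a 3x3 box blur as a crude stand-in for in-camera noise reduction (real NR is more selective, of course); the depletion of high spatial frequencies is exactly what the Fourier transform would reveal. All data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
frame = rng.normal(0.0, 1.0, (256, 256))   # synthetic "white noise patch"

def high_freq_power_fraction(img):
    """Fraction of spectral power beyond half the Nyquist radius;
    roughly flat-spectrum (high) for white noise, depleted after smoothing."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    return spec[r > h / 4].sum() / spec.sum()

def box_blur_3x3(img):
    # Crude NR stand-in: 3x3 box blur via shifted sums (wrap-around edges)
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / 9.0

raw_frac = high_freq_power_fraction(frame)               # spectrum stays flat
nr_frac = high_freq_power_fraction(box_blur_3x3(frame))  # clearly depleted
```

Comparing `raw_frac` and `nr_frac` shows the signature Bart describes: any smoothing step leaves an unmistakable dent in the high-frequency end of the spectrum.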
What I said was "CMOS...is now beginning to displace CCDs in research instrumentation". Not "CMOS has displaced". The instruments you linked to are all relatively old; some were completed in the 1990s, the rest commenced development before the recent surge in CMOS performance. They generate science, not income, so they won't be replaced unless absolutely necessary. It's going to take time.
Another reason is that we still await large low-noise CMOS sensors (the same reason why there are still no CMOS MF digital backs). Large research telescopes have giant focal planes to populate with mosaics of imaging sensors, so the availability of large CCDs keeps them at the forefront.
You are right insofar as CMOS usage in research at present is mainly in the high-speed or time-series niche. Any scenario where you have to take many frames is where their low readout noise per frame makes the difference.
You ask if I "have examples of CMOS sensors used for image acquisition in fairly long exposures"? I presume you mean discounting amateur astro-imagers, who use both off-the-shelf and modified CMOS DSLRs for exposures running to hours net? Well in research there are areas like wide-field auroral monitoring which use similar setups.
Ray,
1. On your first quote of me, I suggest that perhaps there was ....low oxygen in the place where you read my post! ??? :o What are you talking about? It clearly has nothing to do with my statement! :-X 8)
2. On your second quote of me there was certainly much CO2 present in your room! ;D ;) It's just an answer to the quoter, not a post on the OP! :'( :-*
3. Neither yours nor any other Canon turns its NR off when you instruct it to do so from the menu; it stays on to some extent by default! :P Cheers, Theodoros www.fotometria.gr
Link to such a patch?
Theodoros, clearly you are a man of many, and very strong, emotions. 1 short message, 9 emoticons!

1. You clearly quoted that I was confusing LONG EXPOSURE with HIGH ISO performance, while you are talking to a photographer "quite well" known for 30 years, who is at least respected for his work in more than one country, as if his knowledge were the knowledge of an ignorant kid! Clearly, to take a part of the whole statement to "analyze", as if the answer wasn't in front of you in the rest of the statement, is because: a) you intended to do so ....or b) there was not much oxygen in the room and you missed it from "eye blaring" :-[
Could you please identify the offending parts of my "first" and "second" quotes of you? I am having trouble matching up your objections to what I had said.
On your 3rd point, if what you say were true, no-one would be able to produce a photon-transfer curve which obeys Poisson statistics, from Canon RAW files. So, how do you explain the fact that several Canon users have successfully done so?
When you say that Canon's NR cannot be turned off, are you perhaps thinking of Canon's on-chip correlated double sampling? That is not noise reduction in the sense that everyone assumes; it is better described as noise prevention. In no way does it usurp the integrity, statistics, or spatial correlation of the signal, unlike true "NR". It's kosher. This (http://www.luminous-landscape.com/forum/index.php?topic=31741.0) is a good LL thread on that topic (page 2 is the important part).
Ray
Totally random noise will do, even Photoshop could produce something usable.
A program like ImageJ has plugins to create various types of noise.
I've never seen any proof for that assumption; I did see proof of the opposite. A simple method to detect noise reduction is to shoot a well-focused image of a "white noise" patch, and then display the Fourier transform. The signal decline towards higher spatial frequencies (caused by MTF, finite sensel size, and the AA filter for suppressing Bayer CFA-related aliasing) is pretty gradual, unlike the Fourier transform of a noise-reduced image. The Fourier transform of the read noise alone also doesn't show signs of noise reduction.

It has been answered to a previous quoter, Bart. I'm sorry, I don't intend, want, or am in the mood to spend pages (that is what is needed) to prove it. Although I have a British BEng in MechEng, I am a photographer and thus I find such conversations time-consuming and useless (it's also that I haven't really practiced my English for more than 20 years, and thus it's even more painful and time-consuming for me). It is so, though... all DSLRs have NR by default, and if you happen to have some friends who work in the manufacturing of MFDBs or in sensor construction (like I do), they would verify this to you! Please look into it... you'll find that I'm right! DSLRs are constructed for the "enthusiast" or "advanced" user (industry terms), which is considered by manufacturers to be a ....(dragged-by-the-nose junkie) who will never notice the difference! This is exactly the reason why photo equipment "tests" are done by "experts" who don't have even one photograph or work published, and why the "sample photos" of these tests are bridges, or castles, or beaches, or flowers..... that's what the industry thinks of its "target group"! Regards, Theodoros. www.fotometria.gr
If you have any trustworthy proof for your assertion, a lot of people would be interested, I'm sure.
Cheers,
Bart
as for references, if my British MechEng BEng degree doesn't do,

Sorry, does not impress me. If that education was worth anything, surely they must have taught you what people mean when they ask for references?
Sorry, does not impress me. If that education was worth anything, surely they must have taught you what people mean when they ask for references?

Who gives a ....whatever whether you were impressed or not? ??? DSLRs still don't turn their NR off! :P ....and what you are doing by erasing the rest of the sentence is at least .....immoral! :-* You obviously did so to stop people from reading it, although it was only a few more words... ??? >:( ;D :P Cheers (well..... not really), Theodoros. 8) www.fotometria.gr
-h
According to this study (http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/noise-p4.html) by Emil Martinec, The D3 and D300 apply noise reduction at exposures of 1/4 sec. and longer which can't be turned off by the user.
1. You clearly quoted that I was confusing LONG EXPOSURE with HIGH ISO performance, while you are talking to a photographer "quite well" known for 30 years, who is at least respected for his work in more than one country, as if his knowledge were the knowledge of an ignorant kid! Clearly, to take a part of the whole statement to "analyze", as if the answer wasn't in front of you in the rest of the statement, is because: a) you intended to do so ....or b) there was not much oxygen in the room and you missed it from "eye blaring" :-[
2. Your second INSULT has been answered already :o ???
3. It has been answered already to another quoter, and you PARTIALLY accepted it up there. ......I've no intention (although I could) of being dragged into such a conversation. There is NO DSLR that turns its NR OFF when it's instructed by the user to do so, and .....that's it!! If you have proof of the opposite (PROOF, NOT THEORY like ...if it was so... then.... so it should.... that article says....), then ....I can intercede for you to have a really good job in the industry, ......I will even help you with your photography if you want! ::) Regards, Theodoros. www.fotometria.gr
CCDs are considered sharper and with greater color than CMOS
Here is the entire, unedited first post you made in this thread. I encourage others to read it:

OK! READ IT THE WAY I WRITE IT!!!! :o I suggest you need some therapy ;D Excessive CO2 has caused you some very serious brain damage. ;) :) :D ;D It's really a pity for a PhD professor to humiliate himself by his own statements! :o ::) There is no person in the world, "buddy", who would come to the assumptions that you came to by reading what you have up there in bold, WHICH IS MY STATEMENT AND THERE IS NOTHING WRONG WITH IT!! :-* And please don't try to give me any advice on morality, SINCE IT'S YOU WHO "JUMPED" ME WITHOUT ANY REASON WHATSOEVER!! ??? :P I suggest you stop humiliating yourself and do some photography, or read some Aristotle (the greatest of philosophers and inventor of logic as a science) ;) :) AH! and open up the window..., fresh air will do you some good.... 8) By the way... there is no DSLR (certainly no Canon) that turns its NR completely off when it's instructed to do so from the menu :-* .....and there is nothing to change that, no matter what you think. :D ....and please don't bother me, answer me, talk to me, refer to me, or think about me anymore! ;D :-* I don't even want your apology for stating that what you have up there in bold is the result of a confused mind! You don't exist for me! :) 8) www.fotometria.gr
Quote from: larkis on September 25, 2011, 08:46:11 AM
Could someone give me a technical explanation why CMOS can shoot at such high ISOs (Nikon D700, etc.) compared to CCDs, despite CCDs being more sensitive to light and being a more expensive technology? A friend of mine is a scientist and tells me that in astronomy and industrial applications the reverse of what is going on in the camera business is true: CCDs are more light-sensitive and have less noise compared to CMOS. I noticed with most cameras that use CCDs (like the Phase backs) anything over ISO 800 is not really considered to be great. The sensors that go to space are also CCDs.
All MFDBs are CCD (up to now). The P1 P25+ and P45+ are the world leaders in low light because of their ability to keep the sensor cool for long exposures; this is due to the great cooling system, not the type of sensor used. The MFDBs are the worst in ISO performance because they lack any kind of noise reduction filter or circuit; this is a choice of the manufacturers, due to the demand of the MF market for purity, so that low-ISO performance is maximum! The IQ of DBs is well beyond any modern DSLR even if they are 5 or 6 years old! The reason for CMOS sensors is mainly the low cost of production, but they do cost a lot more than CCDs to develop. I guess that the production volume of DSLRs compensates for the high development cost of CMOS. CCDs are considered sharper and with greater color than CMOS; I'm not aware which technology is better for high ISO. To compare that we should have a comparison of cameras with both technologies but the same NR system. Regards, Theodoros. www.fotometria.gr
So - in answer to the OP's question, which asked only about high ISO and never once mentioned long exposures, your first explanation connects "low light" with "long exposures". What else were we to think, but that you were confused between them? We can only read it the way you write it.
I still have no idea what my insult was? If an apology is warranted, how can I apologise if I don't know what I'm apologising for?!
Thanks, but I already have a really good job, as a university lecturer in physics and astronomy. I teach a course on signal processing, another on astronomical data analysis, and another on observational astronomy, with a particular emphasis on detector issues. I've been doing digital image processing since 1992; my PhD thesis concerned calibrating astronomical imaging photon-counting detectors called MAMAs, and their use in high-resolution imaging, photometry and deconvolution of the centres of star clusters.
But I guess a British MechEng BEng degree trumps that.
Now I AM annoyed, because you're the first person ever on LuLa to force me to play the "my credentials are better than yours" card, and I really hate doing that. The letters after one's name should have nothing to do with anything in a discussion like this. Only the ability to present proper evidence matters. We're still waiting for yours...
CCDs or CMOS don't have "greater color". They don't have any color. They count photons. You can filter photons before they reach the CCD by means of individual filters or a filter matrix (for example, Bayer).

True enough, sensors are really B&W; I guess "considered" is the key word.... :) I'm surprised how you confused "considered" (which refers to the result) with "have". :o ....it's really easy to goof between the two words, I guess, ::) ....they are so near in grammar and vocabulary that it can happen to anybody :-X 8) Regards, Theodoros. www.fotometria.gr
CCDs or CMOS don't have "greater color". They don't have any color. They count photons. You can filter photons before they reach the CCD by means of individual filters or a filter matrix (for example, Bayer).
I'd think that CCDs have better color since the value of each pixel is more matched to other pixels due to the different readout of CCD vs CMOS.
Bill
OK! READ IT THE WAY I WRITE IT!!!! :o
P.S. Does anybody remember the story of the Tower of Babel from the Bible? I wonder why communication is sinking lower and lower by the day....
Yup, Nikon are bad boys, but only for long exposures. If you search my posting history, I've often chastised Nikon for this. And also for the way they subtract a bias level from the RAW, which makes it harder (but not impossible) to measure the read noise, and impossible to fully subtract bias pattern noise from images like flatfields, which is of course nasty when one then comes to divide by a flatfield... as someone has noted here before, it's mostly the astrophotographers who moan about that!
It's nicely explained by Jerry Lodriguss here (http://www.astropix.com/HTML/I_ASTROP/NIK_CAN.HTM); scroll down to points 5, 6 and 7 under the "Nikon" heading.
Canon, emphatically, do not alter their RAW files in long exposures - or short ones, for that matter. Nor do they subtract the bias level. RAW means RAW with them.
Emil and Jerry may be very knowledgeable and careful workers, but an FFT analysis of a 30-second dark frame from the D3 is shown below, and I see no evidence of filtering. Processing was as before: split_cfa in Iris, cropping of the green 1 channel to 300 x 300 px in Iris, and analysis by ImageJ. All NR and dark frame subtraction was turned off. I can supply the raw file and the .fit files to anyone who wants to do their own critique.
Here is the FFT of the same file with the median filter applied with a radius of 4 showing clear cut evidence of filtering:
And here is a PNG of the cropped green 1 channel
Regards,
Bill
I'd think that CCDs have better color since the value of each pixel is more matched to other pixels due to the different readout of CCD vs CMOS.
Bill
True enough, sensors are really B&W, I guess "considered" is the key word.... :) I'm surprised how you confused "considered" (which refers to the result) with "have"
I think the argument is based on the camera testing at the end of manufacture. It is going to be tested against known images of blue, green and red for output accuracy. The firmware is set to correct any deviation in the signal. In CMOS, the correction is per pixel. In CCD, the conversion of photons to signal passes through a common converter for whole rows, so the converter is calibrated against the output of thousands of pixels. Of course each pixel still has to output the same value, so the correction of the color is the same. The accuracy of the signal converter should be much greater in CCD.
Emil and Jerry may be very knowledgeable and careful workers, but a FFT analysis of a 30 second Darkframe from the D3 is shown below and I see no evidence of filtering.
The only purpose that Nikon intended for the so-called noise reduction is to suppress unmapped hot/dead pixels, so the effect is definitely going to be a lot more subtle than a global median filter. That's why I prefer the astrophotographer's term for it, the "Nikon star killer algorithm." :)
A highly talented engineer named Marianne Oelund completed a rather detailed characterization of the algorithm as it was implemented in older Nikon cameras, the D3, and most recently, the D7000:
http://forums.dpreview.com/forums/read.asp?forum=1021&message=34309201
Here is a link to an image demonstrating the star killer effect:
http://actionphotosbymarianne.com/spectra/HPSeffectEx1.jpg
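For readers curious what such an algorithm might look like, here is a toy sketch of an isolated-spike suppressor. To be clear, this is NOT Marianne's characterization of Nikon's actual algorithm; it is only a minimal illustration of why any hot-pixel clipping scheme necessarily eats single-pixel stars as a side effect (both the threshold and the neighbourhood rule are invented for the demo):

```python
import numpy as np

def clip_isolated_spikes(img, factor=2.0):
    """Replace any interior pixel brighter than `factor` times its
    brightest 8-neighbour with that neighbour's value. A hot pixel and
    a star concentrated on one pixel look identical to this test."""
    out = img.copy()
    h, w = img.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = img[y - 1:y + 2, x - 1:x + 2].copy()
            neigh[1, 1] = -np.inf          # exclude the pixel itself
            m = neigh.max()
            if img[y, x] > factor * m:
                out[y, x] = m              # the "star killer" moment
    return out

sky = np.full((7, 7), 10.0)
sky[3, 3] = 100.0                # a hot pixel... or a faint star
cleaned = clip_isolated_spikes(sky)
```

The spike at (3, 3) is flattened to the background level; the algorithm has no way to know whether it erased a defect or an astronomical source.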
I think these are minor differences in the big picture.
By far the biggest factor in the output color response is the spectral transmission of the Bayer color filters. As far as color reproduction goes (*), I've measured some CCDs with great spectral transmission curves and other CCDs with not-so-great ones. The same with CMOS sensors.
Eric
(*) i.e., getting a good match on the so-called Luther-Ives condition
They are small in the sense that all consumer cameras now have very good color reproduction, CCD or CMOS. It's still a consumer product, not something made for NASA or a defense department. The tolerance in the parts is a real issue when there is a very competitive market. For example, Pentax in the K5 managed to squeeze much more out of the same sensor than Sony, the manufacturer, or Nikon did with their versions of the camera. They chose some higher-quality parts. With CCD row conversion, a higher-quality signal converter has much more return per part due to its effect on entire rows.
Maybe you are right that the tolerances of the color filters have much more effect. Does anyone know how they degrade over time?
I would leave out NASA; they do a lot of false-color work and a lot of imaging in the invisible parts of the electromagnetic spectrum. Also, NASA equipment is probably radiation hardened.
I think these are minor differences in the big picture.
By far the biggest factor in the output color response is the spectral transmission of the Bayer color filters. As far as color reproduction goes (*), I've measured some CCDs with great spectral transmission curves and other CCDs with not-so-great ones. The same with CMOS sensors.
Eric
(*) i.e., getting a good match on the so-called Luther-Ives condition
And of course, a variation of that is the triple CCD in some video cameras.

But then each photon is counted (at least in theory). For Bayer and color-wheel solutions, only e.g. 1/3rd of the photons hitting the sensor during the total exposure time are counted; the rest are absorbed in spectral bandpass filters.
As far as the sensor is concerned, the important factor (as noted above by others) in color reproduction is not the density of the color filters (or their spatial arrangement in a mosaic pattern such as Bayer), but rather the shapes of the transmission curves and how they relate to each other.

Sure, but my gut feeling is that whenever you have to do something really tiny, complex and economical, you lose something. If that gut feeling is wrong, and Canon & Nikon are free to make whatever spectral response they see fit (keeping in mind the color response vs noise issue you mentioned), then my gut feeling was wrong.
But then each photon is counted (at least in theory). For Bayer and color-wheel solutions, only e.g. 1/3rd of the photons hitting the sensor during the total exposure time are counted; the rest are absorbed in spectral bandpass filters.
Not sure about that
http://en.wikipedia.org/wiki/File:Dichroic-prism.svg
http://en.wikipedia.org/wiki/File:A_3CCD_imaging_block.jpg
I don't have a well-defined opinion on the efficiency of splitting vs filtering, but I am under the impression the sensors in 3CCD or 3MOS cameras don't get all the photons. If they did, it would be a mess to colour balance, imho.
Not sure about that
http://en.wikipedia.org/wiki/File:Dichroic-prism.svg
http://en.wikipedia.org/wiki/File:A_3CCD_imaging_block.jpg
I don't have a well-defined opinion on the efficiency of splitting vs filtering, but I am under the impression the sensors in 3CCD or 3MOS cameras don't get all the photons. If they did, it would be a mess to colour balance, imho.

How should I interpret those figures in light of your statement?
As far as the sensor is concerned, the important factor (as noted above by others) in color reproduction is not the density of the color filters (or their spatial arrangement in a mosaic pattern such as Bayer), but rather the shapes of the transmission curves and how they relate to each other. Ideally, from a color perspective, you'd want the transmission curves to be the same as the human cone responses (in the eye) or a linear transformation thereof. But there is a tradeoff in terms of color vs noise, and of course there are other practical constraints due to materials, manufacturing, costs, etc., so in practice this technical condition is not satisfied. As I mentioned earlier, this is rather a separate issue from the choice of CCD vs CMOS.
An example of these tradeoffs is discussed in the DXO paper (http://www.dxomark.com/index.php/Publications/DxOMark-Insights/Canon-500D-T1i-vs.-Nikon-D5000/Color-blindness-sensor-quality) comparing the Nikon D5000 with the Canon EOS 500D. The Canon has poor color depth due to the characteristics of its CFA filters. The problem lies mainly in the red CFA filter, which is actually more sensitive to green than to red, as shown below. This necessitates a large coefficient in the color matrix, which adds noise. In contrast, the Nikon has a better red response and a greater color depth.
Do you think that this is a trade-off of achromatic SNR vs color noise, or sensor cost/performance vs color noise?
-h
The article states, "This comparison is a bit surprising with respect to the previous SNR 18% results. Why such a difference? Color sensitivity is impacted by noise curves and spectral responses. If SNR curves are close, most of the divergence observed must be due to a difference in spectral sensitivities, which implies very different color processing for each sensor."
I conclude that the difference is largely due to color noise.
Regards,
Bill
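The noise amplification being discussed here can be shown with a toy sketch. The two 3x3 color correction matrices below are made-up examples (not measured Canon or Nikon data): one is close to the identity, the other has the large off-diagonal coefficients you get when a "red" filter mostly passes green. Each output channel is a weighted sum of the raw channels, so independent per-channel noise adds in quadrature, weighted by the row coefficients:

```python
import random
import statistics

random.seed(1)

# Hypothetical color correction matrices; each row sums to 1.0 so a
# neutral gray patch is preserved. Rows give output R, G, B.
near_identity = [
    [ 1.2, -0.1, -0.1],
    [-0.1,  1.2, -0.1],
    [-0.1, -0.1,  1.2],
]
strong_mixing = [
    [ 2.0, -0.8, -0.2],  # big coefficients: weakly selective CFA
    [-0.6,  2.0, -0.4],
    [-0.2, -0.8,  2.0],
]

def output_red_noise(matrix, trials=20_000, sigma=1.0):
    """Std dev of the corrected R channel for a gray patch carrying
    independent Gaussian noise of the given sigma on each raw channel."""
    samples = []
    for _ in range(trials):
        raw = [100.0 + random.gauss(0.0, sigma) for _ in range(3)]
        samples.append(sum(m * c for m, c in zip(matrix[0], raw)))
    return statistics.stdev(samples)

# Analytically the amplification is sqrt(sum of squared row coefficients):
print(output_red_noise(near_identity))  # about 1.21x the raw noise
print(output_red_noise(strong_mixing))  # about 2.16x the raw noise
```

So with the same raw SNR, the less selective filter set ends up with nearly twice the chroma noise after correction, which matches the DxO explanation.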
I should have phrased my question differently. Given that Canon have less spectrally selective CFA than Nikon, and thereby a color correction matrix that is more different from the identity matrix and more color-noise prone:
-Did they do this because they think that having wider filters, passing more photons, gives them an advantage when shooting spectrally broad/flat scenes?
-Or does Canon have a sensor with a disadvantage in the first place, with spectrally wide filters used to hide its flaws?
Or perhaps this is a feature of the silicon process that Canon uses, linked perhaps to microlenses etc.?
I have heard that Sony alpha DSLRs have a radically different philosophy (closer to the standard CIE observer, at the cost of more noise)?
-h
Hi,
I guess that Canon seeks better high-ISO performance and therefore has more overlap between the filters in the CFA. But that is just a guess.
Best regards
Erik
It seems that multi-channel dichroic prisms are, in theory at least, better than wideband RGB filters, in the sense that there are no holes, spikes, overlaps, etc. in the transmission band. The incoming light is split, you characterize it, and that's it. I guess this could also allow for different distances to the three focal planes to compensate for chromatic aberration.
http://www.optec.eu/eng/multichannel/1194.htm
http://www.firstlightoptics.com/rgb-filters-filter-sets/baader-lrgbc-ccd-filter-set.html
But in practice, I have only worked with wide and narrow band filters and therefore will try to keep my foot out of my mouth, waiting for someone more competent in those matters to eventually jump in ;-)
I think there is still one important area where CMOS cannot approach CCD quality: uniformity.
Hi ... can you back that up?
Never heard that before and not sure what would account for such a difference.
It's old tech. It used to be that CCDs gave better color uniformity; CMOS looked artificial.
Many people thought it was down to the method of charge conversion. CMOS integrates the A/D converter into the pixel, while a CCD sends all the charge to a row converter. The theory was that the CCD method allowed calibration of the value to a standard with tighter tolerance than a single value from a pixel.
The latest generation of CMOS from all the DSLR manufacturers seems to have much better color accuracy than a few years ago. Pictures now look realistic, like looking through a window, whereas CMOS output from a few years back looked a bit artificial. Maybe it's improved Bayer algorithms, maybe it's the latest updates to raw converters; something has changed on the CMOS side.
Camera | DR | Comments |
Nikon D800 | 14.4 | Exmor based sensors |
Nikon D600 | 14.2 | |
Sony RX1 | 14.3 | |
Pentax K5 | 14.1 | Note that the K5 is APS-C, an impressive feat! |
Sony Alpha 99 (SLT) | 14 | |
Phase One IQ180 | 13.6 | Best medium format CCD based camera |
Nikon D4 | 13.1 | Best non-Exmor design CMOS |
Canon EOS 1Dx | 11.8 | |
Leica M9 | 11.7 | Best CCD based full frame |
We both use Sony, and they have advertised on-chip A/D conversion for ages.
Yes, Canon and Nikon were doing something different, which may be why they had 14 bit while we were stuck with 12. Pentax also tried high-end external 22-bit converters, then went back to something basic in the next version.
DR figures from DxO (latest sensor)
Camera | DR | Comments |
Nikon D800 | 14.4 | Exmor based sensors |
Nikon D600 | 14.2 | |
Sony RX1 | 14.3 | |
Pentax K5 | 14.1 | |
Sony Alpha 99 (SLT) | 14 | |
Nikon D4 | 13.1 | Best non-Exmor design |
Pentax 645D | 12.6 | Best CCD based camera |
Canon EOS 1Dx | 11.8 | |
Leica M9 | 11.7 | Best CCD based full frame |
Best regards
Erik
Hi, remember those figures are for the print performance (image resized to 8"x12" @ 300 dpi).
It is not possible to have more than 14 bits of DR at the sensor with 14-bit linear encoding.
The highest DR (according to DxOMark) would be the Sony RX1 with 13.47 and the Nikon D600 with 13.44.
regards,
Francisco
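The 14-bit ceiling Francisco mentions follows from plain arithmetic: with n-bit linear encoding, the largest representable signal is 2^n - 1 codes and the smallest nonzero step is 1 code, so the dynamic range in stops (EV) cannot exceed log2(2^n - 1), which is just under n. A quick check:

```python
import math

def max_dr_stops(bits):
    """Upper bound on dynamic range, in stops (EV), for linear encoding:
    the ratio of the largest code (2**bits - 1) to the smallest step (1)."""
    return math.log2(2**bits - 1)

print(round(max_dr_stops(14), 2))  # 14.0 stops (strictly just under 14)
print(round(max_dr_stops(12), 2))  # 12.0 stops (strictly just under 12)
```

This is why measured DR values like 13.47 EV sit below 14, while the "Print" figures above 14 are only possible because downsampling to 8"x12" averages noise across pixels.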
It used to be that the CCDs were making better color uniformity. CMOS looked artificial.
Previously, you stated that better "uniformity" was "still" an advantage for CCD-based sensors.
I don't think there is anything inherent in CCD technology that would account for that.
Previously, you stated that better "uniformity" was "still" an advantage for CCD-based sensors.
I don't think there is anything inherent in CCD technology that would account for that.
I guess Sony is not alone with on-chip per-column conversion. The new sensor for the Leica M is also a CMOS design with column-type on-chip converters.

Panasonic is also doing column-parallel ADC in some sensors; read about the MN34070 in this document:
http://www.cmosis.com/news/press_releases/new_leica_m_uses_cmosis_24_mp_cmos_image_sensor
http://www.chipworks.com/blog/technologyblog/2012/10/25/full-frame-dslr-cameras-part-iii-new-entrants-and-look-forward/
The DR and the color tone accuracy are improved; whether from hardware or software is unknown.
I believed it until I started seeing a lot of pictures from people using the Sony A99 and A77 (I use Sony), and the Nikon D800 and D600.
Maybe it's the down-sampling leeway of the 24MP of data. Maybe it's improved software. I don't know. All I know is these cameras are outputting images that look very realistic vs older-generation DSLRs.
The DR and the color tone accuracy are improved; whether from hardware or software is unknown.
It does look like a big improvement in LR3 over past versions.
Lightroom is on v4.x
Lightroom is beyond impressive to me ... it's the single most impressive piece of "consumer" software I have ever seen. The ambition and "game-changing" nature of the project is inspiring to me as someone who lives in the "legacy" world trying to innovate.
The story of Lightroom is an amazing story.
Lets not get carried away. We are not talking about a Saint here.
I stand by every word I said.
I manage a lot of software development as part of my professional life. One of the "catchphrases" I use often in the context of user-driven software companies and projects ... "Rather than build a little something for everyone, build everything for someone."
The Lightroom team clearly understands that principle.
Hi,
For me Lightroom was exactly the program I wanted to write, but I estimated it at 30-100 man years, longer than I plan to live.
Best regards
Erik
From my perspective, Lightroom is a very nice light-table, filing and printing program with a good-enough raw converter. If you want superlative raw conversion, and only raw conversion, you can often get it for free with the manufacturer's software, e.g. Canon DPP, or freeware like RPP, or resort to boutique products like Iridient's Raw Developer, or even Capture One, which I believe quite a few people on this forum have used.
Edmund