
Author Topic: 16 Bit Myth  (Read 58920 times)

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: 16 Bit Myth
« Reply #120 on: January 01, 2012, 05:09:47 pm »

Hi,

I wish more examples would be posted on this issue; I have seen very few actual images available as raw files. Miles Hecker had a few when he tested the Pentax 645D, Imaging Review has some P645D, I found three raw images from the P65+ available at Phase One (good demo, by the way) and two images from the Leica S2 from Diglloyd. That's about it ;-(

Tim Parkin has posted a very good review which is useful, but it was more about film and not raw images.

Best regards
Erik


I think everybody sensed a bit would do that. And by doing real-life tests it will definitely show the superiority (wherever this superiority comes from) of a 22 MPx digital MF back, still today, nearly a decade after its introduction. And my guess here is that one does not buy such a capture medium to output only 8x10" prints.

Thierry

Logged
Erik Kaffehr
 

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #121 on: January 01, 2012, 06:38:52 pm »

Steve, the only reason why this 16-bit myth has so much relevance is that none of the manufacturers and sellers has ever admitted, in the various discussions about it (e.g. in forums like this one), that their MF backs aren't 16 bit. As happens now again. Rounds and rounds are made without a clear statement, even though the Dalsa data spec couldn't be any clearer... and even though this has now been posted for years. How can you say that you can't see clearly whether the A22 (and others) are 12, 14 or 16 bit? It's very clear and it's easy to see, isn't it?
Next week the 16-bit statement will come up again and the same discussion will go on and on, creating this myth which you say is the reason that you have to say it's 16 bit, because it explains so nicely the quality difference between MF and smaller formats.  r.



Happy New Year everyone. Now that my head has cleared a bit.... I have to admit to a brain cramp in my posting about the bit depth of the sensor itself, as that is not what I meant when I brought up the question of 14 bit vs 16 bit, Kodak, Dalsa, etc. I was discussing the processing and ADC conversions themselves, which in the case of the Leaf product (as Doug has also stated) are 16 bit (this is true). I haven't spoken with Doug, so I'm not sure how the term "true" was intended. But again, in the case of the ADC, I believe there may be imaging products that have 14-bit ADCs while the data sheets themselves state 16 bits. I'm not sure of this, though, and there's a very good reason for that: frankly, when it comes to determining the right product for a user, it is far from an important statistic to me. There are indeed certain enhancements that the 16-bit processing and 16-bit ADC create for the Leaf product that positively affect the image quality. No one can definitively prove that they don't, because your only argument would be scientific or theoretical; you are not part of the development team at Leaf Imaging. The term "True 16 Bit" certainly could be treated as an accurate statement, but only if one were to understand that the reference is to the processing chain and ADC. From that standpoint, we're more than happy to omit that, given the limited 10-character space for the term.

The ramifications of what Leaf does with this architecture and how it affects their image quality (and in particular compared to a competitor's image quality) would make for a great stand alone article. It is relevant. It is not necessarily appropriate as the emblazoned #1 reason to choose a Leaf product. But then again, that is quite a subjective choice, is it not? Does large sensor size matter to everyone? No. If someone desires more depth of field, or sharper results from the edge of their frame, perhaps not. Etc, etc.

So, at the end of the day (and this far-too-long thread), with all of the concern over 16 bit vs 14 bit, etc., what matters most is how someone intends to use the product and what the final image looks like. And that is what is discussed with a client. We, Capture Integration, are not guilty of purveying the 16-bit anything. We grabbed a statistic and posted it as a differentiating feature (which it may very well be, in terms of what it allows the final image quality to become). I would like a show of hands: how many here have purchased from us and had discussions which were reduced to "Leaf has 16 bits, so it is superior, and that is why you should buy it"?

What I know is that when someone calls us for information on the potential purchase of a medium format digital back, the resulting initial discussion lasts an average of 40-80 minutes, sometimes longer. And in that time, 16 bits as a statistic is rarely discussed, much less even mentioned. Friday night I left the office at 8:30 pm, our last business day of the year, because during the course of the day, in addition to everything else that was happening, I spent several hours with a client picking up a system, going over some final quick-start tips, and, among other things, took 3 phone calls that lasted 40-60 minutes each. One call was from a client who had purchased elsewhere and was wondering about the quality of some of the lenses he had been sold that didn't seem to hold up. With nothing to gain, other than the hope he might choose to do business with us in the future, we spent nearly an hour discussing real-world performance of various lenses. No technical statistical data came up once. And that is how we do business. We're sorry that in advertising a product, the actual spec from the manufacturer was used and treated as a significant feature (regardless of whether you believe it is one and can easily measure it or not).

I just find it astonishing that our stating a statistic as a feature (in a questionable or vague context, granted) turns into an attack on the integrity of how we do business as a company. While I understand that perhaps some of you have had experiences with dealers who relied on simple technical specifications from data sheets to sell you a product, to broad-brush our company with the same stroke is insulting. We'll take part of the blame for this very rare occasion of a questionable attention-grabbing banner on a pretty damn good deal for a Leaf Aptus 22. But there is an abundance of evidence available that points to how we do business as a company; some may just not realize it (and we certainly strive to improve), while some have just taken the opportunity to randomly trash us.


For Thierry -

Yes, Thierry, my friend, as a former employee of SinarBron, I can attest to the tradition of Sinar being very literal in their documentation. If the Sinarback 54H, which uses a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? That product used the same sensor as the Leaf Aptus 22, yet the Sinar data sheet states 16 bit per channel.

http://www.sinar.ch/en/products/digital-backs/146-sinarback-emotion-54-lv



Steve Hendrix

Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: 16 Bit Myth
« Reply #122 on: January 01, 2012, 07:27:44 pm »

Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14-bit, the other 16-bit, how could we know the extra 2 bits were at all useful? Plus it seems there is no agreement on what a 14-bit or 16-bit product really is (where and how those bits get created or processed).

It is obvious we can have two different camera systems and examine the files and decide which we believe is superior. In print, on screen, using some raw analysis, whatever. There seem to be dozens, maybe hundreds of different aspects of a product's design and implementation that contribute to the quality.

It seems the marketing departments of many companies would like to suggest (a kind term) that their product produces superior quality based on the number of bits. I can’t comprehend how they can prove their case.

Quote
I just find it astonishing that because we stated a statistic as a feature (in a questionable or vague context, granted) turns into an attack on the integrity of how we do business as a company.

Attacks are not warranted. But neither is promoting a feature that may not be a feature, especially when it is vague and questionable. Now if any company can set up a truly scientific test that proves two more bits per color alone produce superior image quality that one can actually see, that would be interesting, educational and useful. I suspect it would be an expensive test to produce.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Camdavidson

  • Full Member
  • ***
  • Offline
  • Posts: 185
    • http://www.camerondavidson.com
Re: 16 Bit Myth
« Reply #123 on: January 01, 2012, 08:09:15 pm »

Capture Integration are the good guys.  I have purchased kit from them and asked Steve, Dave and Doug far more questions than the value of the business they have gotten from me would justify.

I've known them for years and have purchased a Phase back, an Eizo monitor and a few other goodies.  There have been many conversations with Doug about Capture One and how to integrate Media Pro into my workflow.  (I've used C1 for years.)

They have ALWAYS been straight shooters about software and gear.  Doug has increased the knowledge of C1 through his blog posts, and his noise reduction techniques are excellent.  What does not get reported is how Capture Integration has pleaded the case for photographers who use C1.  They have gotten Phase to listen to what shooters need and to incorporate some of these desires into the software.  I know of no other dealer who cares as much about their clients and their success as Capture Integration.

They know their stuff.

Cameron Davidson
 http://www.camerondavidson.com
Logged

uaiomex

  • Sr. Member
  • ****
  • Offline
  • Posts: 1211
    • http://www.eduardocervantes.com
Re: 16 Bit Myth
« Reply #124 on: January 01, 2012, 08:13:33 pm »

I've been given the 16-bit answer more than once before. But I was always under the impression that MFDBs had better quality mainly because of the bigger sensor. The other reasons, in decreasing importance: superior components, circuitry and wiring; 16 bit, now on trial; and the CCD. (Just my belief.)

Later on, when I finally got my first FF DSLR, a 5D, I immediately saw the superior image quality over the APS cameras of that time. More or less, everything was the same with the exception of the bigger sensor (more than twice the area) and just a few more megapixels. This to me was enough evidence that sensor size was the most important thing regarding IQ when comparing similar technologies (which is usually the case). Digital is analogous to analog in too many ways.

The 5D2 arrived with 14 bit. Colors are better and I suspect the bit depth has something to do with it. Overall the MkII is better than the MkI, but the jump in quality was not as large despite the big increase in megapixels.

What I'm trying to say is that MFDB manufacturers should keep their promotional efforts on sensor size. For IQ, approach and workflow, the bigger sensor is their biggest asset.

Just my 2 centavos
Eduardo

« Last Edit: January 01, 2012, 08:16:30 pm by uaiomex »
Logged

BJL

  • Sr. Member
  • ****
  • Offline
  • Posts: 6600
Re: 16 Bits from ADC, only analog charges or voltages from CCD sensors
« Reply #125 on: January 01, 2012, 08:23:42 pm »

For Thierry -

... If the Sinarback 54H, being a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? This product used the same sensor as the Leaf Aptus 22, yet on the Sinar data sheet it states 16 bit per channel.

http://www.sinar.ch/en/products/digital-backs/146-sinarback-emotion-54-lv
I am not Thierry, but I think I can partly answer this, having read the data sheets.
Kodak's data sheets never mention bit depth of output, for the good reason that Kodak CCD sensors output an analog signal (a charge). Kodak instead talks in terms like DR of 12 stops or 72dB or 4000:1.
(IIRC, Dalsa sensors do an extra step, charge-to-voltage conversion, and so specify output in voltage instead of charges, but still analog, and still using specs like DR of 4000:1 or 72dB.)

That Sinar page seems to describe 16-bits as the "digitalization" of the back, not of the sensor.
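BJL's data-sheet figures (12 stops, 72 dB, 4000:1) are the same dynamic-range quantity written in three notations, and the conversion is easy to verify. A quick sketch (the 4000:1 ratio is taken from the spec style he quotes; everything else is just arithmetic):

```python
import math

# A sensor dynamic-range spec expressed three ways, as on CCD data sheets:
# a plain ratio, decibels (20*log10, the voltage-style convention), and
# photographic stops (log base 2, since each stop is a doubling).
ratio = 4000
db = 20 * math.log10(ratio)
stops = math.log2(ratio)

print(round(db, 1), round(stops, 1))  # roughly 72 dB and 12 stops
```

So a "72 dB" Dalsa spec and a "12 stop" Kodak spec describe essentially the same 4000:1 analog dynamic range, which is the point BJL is making about the sensors themselves having no bit depth at all.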
Logged

Steve Hendrix

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bits from ADC, only analog charges or voltages from CCD sensors
« Reply #126 on: January 01, 2012, 09:36:02 pm »

I am not Thierry, but I think I can partly answer this, having read the data sheets.
Kodak's data sheets never mention bit depth of output, for the good reason that Kodak CCD sensors output an analog signal (a charge). Kodak instead talks in terms like DR of 12 stops or 72dB or 4000:1.
(IIRC, Dalsa sensors do an extra step, charge-to-voltage conversion, and so specify output in voltage instead of charges, but still analog, and still using specs like DR of 4000:1 or 72dB.)

That Sinar page seems to describe 16-bits as the "digitalization" of the back, not of the sensor.


Thanks BJL - yes, that was my assumption as well.


Steve Hendrix
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #127 on: January 01, 2012, 09:48:04 pm »

Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14-bit, the other 16-bit, how could we know the extra 2 bits were at all useful? Plus it seems there is no agreement on what a 14-bit or 16-bit product really is (where and how those bits get created or processed).

It is obvious we can have two different camera systems and examine the files and decide which we believe is superior. In print, on screen, using some raw analysis, whatever. There seem to be dozens, maybe hundreds of different aspects of a product's design and implementation that contribute to the quality.

It seems the marketing departments of many companies would like to suggest (a kind term) that their product produces superior quality based on the number of bits. I can’t comprehend how they can prove their case.

Attacks are not warranted. But neither is promoting a feature that may not be a feature, especially when it is vague and questionable. Now if any company can set up a truly scientific test that proves two more bits per color alone produce superior image quality that one can actually see, that would be interesting, educational and useful. I suspect it would be an expensive test to produce.


Vague from the standpoint of how bit depth is generally interpreted versus what Leaf's use of the term specifically means. The 16-bit process and ADC of the Leaf Aptus 22 do play an integral role in the resulting quality under a specific variety of situations. From that standpoint, Leaf stating this as 16 bit is warranted for the purpose of the quality they deliver. It doesn't necessarily have to be bragged about, which is where the grey area enters as to whether it is a feature or not. The key is whether the end result of the application of Leaf's 16-bit process and ADC provides an advantage over competitive products. Leaf feels that it does. If it does, there's no question it qualifies as a feature. If it does not, then it is simply the means to their end result. The fact that it is not spelled out specifically how the result benefits from this is what brings into question stating it as a feature.


Steve Hendrix
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

digitaldog

  • Sr. Member
  • ****
  • Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: 16 Bit Myth
« Reply #128 on: January 01, 2012, 10:19:53 pm »

The 16 bit process and ADC of the Leaf Aptus 22 does play an integral role in the resulting quality under a specific variety of situations.

That’s been proven how, specifically and only due to the higher bit depth? The identical process at 14 or 12-bit has been demonstrated to be inferior?
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Steve Hendrix

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #129 on: January 01, 2012, 10:27:29 pm »

That’s been proven how, specifically and only due to the higher bit depth? The identical process at 14 or 12-bit has been demonstrated to be inferior?


There's nothing to prove. Leaf uses this process toward their end result. It doesn't have to be comparative. That's the point I'm making. For Leaf to produce the quality that they do - regardless of where it stands comparatively - this is part of their process. It contributes to the end result, particularly under certain conditions, but is not solely responsible for it, the same as with any other manufacturer.


Steve Hendrix
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

digitaldog

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: 16 Bit Myth
« Reply #130 on: January 01, 2012, 10:36:04 pm »

There's nothing to prove.

Of course there is, if marketing or others are going to boast about 16-bits being somehow superior to 12-bit or 14-bit captures. From the pages of discussions here, it is still unclear what a ‘true’ 16-bit capture implies. The quality may very well be there, I don’t question that. I question that it is the higher bit depth alone that produces this quality. Which is why some (myself included) question the need to market a higher bit depth, and then use the term “true” in the same context.

Now if you had two captures that were identical except for bit depth and you could demonstrate that the higher bit depth produced superior quality, you’d have few if any naysayers.
Logged
http://www.digitaldog.net/
Author "Color Management for Photographers".

Guillermo Luijk

  • Sr. Member
  • ****
  • Offline
  • Posts: 2005
    • http://www.guillermoluijk.com
Re: 16 Bit Myth
« Reply #131 on: January 01, 2012, 10:38:28 pm »

Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14 bit, the other 16-bit, how could we know the extra 2 bits were at all useful?
This is simple: develop the same RAW file at its original 16 bits, then develop it again with bit decimation applied prior to demosaicing: 15 bits, 14 bits, 13 bits, ... until we can see some damage to image quality. If IQ is damaged from 15 bits down, then those 16 bits were useful; otherwise they were marketing bits.

Unfortunately this thread has already reached 7 pages with 0 MFDB RAW files posted. But to be honest I don't think we need to do the test; noise measurements suggest those 16 bits are unnecessary in today's MFDBs, so they would fall into the useless category.

An example, this is a particularly noisy camera (Olympus EP-1) tested at a particularly noisy ISO setting (ISO1600):

Original RAW file (12 bits): [image]

Severely decimated RAW file (8 bits): [image]
The second image even looks better! It's just an illusion due to some noisy pixels clipping to black, but the important conclusion is that the 12-bit version is by no means better than the 8-bit decimated version, so the extra 4 bits were useless in this case.
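The decimation test Guillermo describes can be sketched in a few lines. This is a hypothetical simulation on synthetic data, not his actual script; a real test would decode the raw file first (e.g. with dcraw), and the noise level of 120 counts is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 16-bit "raw" channel: a smooth ramp plus Gaussian noise.
# The noise std (~120 counts) is an assumed figure, not measured from any back.
ramp = np.linspace(2_000, 60_000, 1_000_000)
raw16 = np.clip(ramp + rng.normal(0, 120, ramp.size), 0, 65535).astype(np.uint16)

def decimate(data, bits):
    """Keep only the top `bits` of each 16-bit sample (zero the low bits)."""
    shift = 16 - bits
    return ((data >> shift) << shift).astype(np.uint16)

# Compare the decimation error against the noise floor: as long as the
# error stays well below the noise, the discarded bits carried no image data.
for bits in (15, 14, 12, 10, 8):
    err = raw16.astype(np.int32) - decimate(raw16, bits).astype(np.int32)
    print(f"{bits:2d} bits: decimation error std = {err.std():6.1f} (noise std ~ 120)")
```

On these assumed numbers the error only approaches the noise floor around 8 bits, which matches the pattern in Guillermo's Olympus example: until the quantization step nears the noise, the decimated file is statistically indistinguishable from the original.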

Regards
« Last Edit: January 01, 2012, 10:48:26 pm by Guillermo Luijk »
Logged

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #132 on: January 01, 2012, 11:10:10 pm »

Of course there is, if marketing or others are going to boast about 16-bits being somehow superior to 12-bit or 14-bit captures. From the pages of discussions here, it is still unclear what a ‘true’ 16-bit capture implies. The quality may very well be there, I don’t question that. I question that it is the higher bit depth alone that produces this quality. Which is why some (myself included) question the need to market a higher bit depth, and then use the term “true” in the same context.

Now if you had two captures that were identical except for bit depth and you could demonstrate that the higher bit depth produced superior quality, you’d have few if any naysayers.


I'll say again: other than our inadvertent prioritizing of 16 bit as a top-tier feature, I don't really see a lot of boasting going on anywhere. Perhaps there has been in the past, I don't know. Like I've said, I have never really concerned myself as much with how as with what. Check Leaf's website; there's barely a mention of 16-bit anything. Maybe in the past manufacturers made a big deal out of it, I don't know. But is this really the case today?

I think part of what is at play here is that 16 bit is being evaluated as a singular quality specification, which in Leaf's case it sort of is and sort of is not. It is more a part of the process, one which solves several problems for them in digitization and enables some enhancement, likely specific to the way they produce an image with the sensors they work with. If we all understand and accept that, then really, there is no issue here. That is, unless someone is trumpeting 16 bit as, in and of itself, a verifiable feature of advantage; and that is not the case, or at least it should not be, unless there is a qualitative analysis of how the 16-bit process provides Leaf a competitive advantage and in what situations.

Absent that, it simply becomes a situation similar to how a floor is swept. I use a specific broom and have a specific technique for when I sweep the floor, and my wife has a different technique. She may use a different broom and call it a 14-bit broom; I may use a 16-bit broom and a different technique (for instance, I turn on 2 lights in the room and she only turns on 1), and this allows for a better result for certain things in certain situations. The end result from each may be a completely clean floor, but that is where the real difficulty lies: in determining who has the cleaner floor. From that standpoint, unless you believe I produce a cleaner floor, it doesn't matter that my broom is 16 bit; it is only the means to my end. What matters is the end.

I will also say that the process of how Leaf produces a digital file, and the role that their 16-bit process plays in it, is, I believe, much more complex than most here understand, judging from the discussion.


Steve Hendrix
« Last Edit: January 01, 2012, 11:19:10 pm by Steve Hendrix »
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

TH_Alpa

  • Full Member
  • ***
  • Offline
  • Posts: 214
Re: 16 Bit Myth
« Reply #133 on: January 02, 2012, 05:12:25 am »

Hi Steve,

Please note that my intervention was solely meant to give some historical information, not at all to criticize or attack any manufacturer or dealer.

In this respect, it wasn't meant either to say that Sinar gave right and true/honest (or wrong) information, purposely or not, concerning products after the SB 54. I was speaking in this instance about the fact that Sinar published the SB 54 with 22 MPx as a 14-bit digital back, as did some other manufacturers, while others said 16 bit.
I believe, from what I remember back in 2002, that the 14 bits referred to the sensor, not the A/D converter, but I could be wrong. I simply remember that the SB 54 does produce 16-bit files (for obvious reasons), and that I was wondering why others mentioned 16 bit. My understanding at the time was that some manufacturers were speaking of the sensor while others were mentioning the A/D converter.
That is not to accuse anyone, nor to say that it was done for marketing reasons.

Interestingly enough, and again for this 22 MPx back, some manufacturers advertised the nominal ISO at 25, like Sinar, others at 50. Here also I can remember that when I was doing ISO tests with a Sinarback and another back rated at ISO 50, both set to their given nominal ISO, the results in terms of image density were practically identical when using the same f-stop. Again, this is pure information, not meant to criticize anyone. Besides this, there wasn't any ISO standard for digital backs, and there probably still isn't one.

Now, all this being said, it is certainly not the dealers' mistake or responsibility what the publications contain. They have the information from the manufacturers and rely on it. Therefore I don't understand why CI is being criticized so strongly in this thread.

Best regards
Thierry

For Thierry -

Yes, Thierry, my friend, as a former employee of SinarBron, I can attest to the tradition of Sinar being very literal in their documentation. If the Sinarback 54H, being a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? This product used the same sensor as the Leaf Aptus 22, yet on the Sinar data sheet it states 16 bit per channel.
Steve Hendrix
Logged

Radu Arama

  • Full Member
  • ***
  • Offline
  • Posts: 138
Re: 16 Bit Myth
« Reply #134 on: January 02, 2012, 06:39:35 am »

Hi Erik and Happy New Year to all!

In the Focus Numerique test here: http://www.focus-numerique.com/test-1160/moyen-format-pentax-645d-bruit-electronique-12.html in the first third of the page, just under the photos, they wrote "Vous pouvez également télécharger les fichiers Raw (100-200-400-800-1600 ISO)" ("You can also download the Raw files at ISO 100-200-400-800-1600"), and those are links to DNG files used to show the noise throughout the ISO range.

Hope it helps,
Radu

Hi,

I wish more examples would be posted on this issue; I have seen very few actual images available as raw files. Miles Hecker had a few when he tested the Pentax 645D, Imaging Review has some P645D, I found three raw images from the P65+ available at Phase One (good demo, by the way) and two images from the Leica S2 from Diglloyd. That's about it ;-(

Tim Parkin has posted a very good review which is useful, but it was more about film and not raw images.

Best regards
Erik


Logged

mediumcool

  • Sr. Member
  • ****
  • Offline
  • Posts: 770
Re: 16 Bit Myth
« Reply #135 on: January 02, 2012, 08:58:36 am »

I couldn’t be bothered reading all of the posts in this thread because, frankly, there seems to be a lot of angst being intermittently expressed; lots of barrows being shoved around in a not-always-polite way.

My perspective (and knowledge) is this: I have a “35” digital SLR that claims to use 12 bits per colour channel; I also have an Aptus back that claims to use 16 bits per channel.

I know, from experience, that I can pull enormous amounts of detail and colour from severely under-exposed 16-bit files, but the headroom of the “35” is much much less (highlights blow if I don’t expose it as if it were Kodachrome). Applies to C1 and Photoshop/Camera Raw, and any other RAW converter I have tried. My Panasonic GF1 seems worse again. Why? Dunno.

Is all the carry-on about the difference between 14-bit and 16-bit colour? Focusing on this alone does not help account for differing internal workflows, A-D converters, hardware and software. BTW, I shoot only RAW, even for happy snaps.

I suppose what I am trying to say is “get out there and take/make the photographs you are capable of creating, with the equipment that both suits you and is affordable” rather than being obsessed by bit depth. I see obsessive gear behaviour elsewhere, where folks think only one camera brand [pick one!] can do a good job for them. When I was an adolescent, I wanted a Nikon; nothing else was good enough! How shallow. I am now nearing 60, and simply want to use the time I have left to make interesting and perhaps useful photographs, whether for monetary profit or for “pleasure profit”.

Phew!

Todd Rundgren and Darryl Hall with a thought for the day, and New Year. 0.02.

Logged
FaceBook facebook.com/ian.goss.39   www.mlkshk.com/user/mediumcool

cunim

  • Full Member
  • ***
  • Offline
  • Posts: 130
Re: 16 Bit Myth
« Reply #136 on: January 02, 2012, 11:17:38 am »

Sitting on my shelf I have an old 16-bit camera.  This thing is monochrome, has 30 µm pixels (1K x 1K), reads out at 1 kHz, and is cryogenically cooled by a gas refrigeration unit.  Cost was about $100K back in the day and yes, it delivered 16-bit precision with luminescent targets, careful calibration, and adequate consideration of system MTF.  Great for star fields or gene expression studies.  Useless for photography.  The point is, high-precision imaging and photographic imaging are completely different beasts.  That's why this discussion keeps chasing its own tail.  There is nothing wrong with specifying high-precision parts for a photographic imager, but I have never seen a photographic acquisition paradigm that delivers even 14 bits.
Logged

asf

  • Sr. Member
  • ****
  • Offline
  • Posts: 510
    • http://www.adamfriedberg.com
Re: 16 Bit Myth
« Reply #137 on: January 02, 2012, 12:50:13 pm »

After reading the pages following my original posting I see the futility of my query, and I apologize for pointing my finger at CI through their ad and subsequently, by implication, painting them and other dealers with the same brush.

I had noticed this supposed 16-bit superiority claim come up quite often recently, almost always in a snobbish and/or misinformed manner, and was wondering why this one "myth" of MFDB continues. Then I saw that ad and wondered whether dealers were still promoting this oversimplified and questionable viewpoint, mostly because I was surprised to see it in writing from a dealer as highly esteemed as CI.

When I bought into MFDB and dropped that first $50k+ in one afternoon, I didn't do it because of 16-bit anything. That didn't even come into it. But I do remember it being mentioned, and being told the 16-bit MFDB back captured a lot more info than 14-bit FF 35 cameras. Even some digital techs repeated that info to me.
After a while I learned more about how these things work, and I started to notice when people would insist the extra 2 bits made a big difference in the amount of info captured and the quality of the file, usually while defending/promoting their MFDB choice. Somehow this idea that there is some "16 bit" quality in certain capture devices continues.

No, it doesn't matter in the end. My dealer didn't lie to me; most likely he repeated the sales info he knew. He's not a scientist or a technician, he's a salesman and a good guy. I'm still a customer. When I bought my first digital system I didn't know as much as I do now; my dealer knew more than me and didn't steer me wrong. Now I (think I) know more about some of this than he does, and I see that he's a salesman and knows what he needs to know to sell the stuff.

Should dealers "know better" than to use sales talk like this? Should buyers "know better" than to believe all sales talk?

"16 bit" can mean many things. The various simplifications of these complexities leads to much of the confusion. Sales talk is simplification for the most part, esp in a small banner ad.

We're all lucky to have good dealers around like CI, people who stand behind what they sell and you can call when you have problems.
Logged

Stefan.Steib

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 471
    • HCam - Hartblei Pro Photography solutions
Re: 16 Bit Myth
« Reply #138 on: January 02, 2012, 02:06:27 pm »

I think it should be stated that there are cameras available which do gain a real advantage from 16 bit processing:
these are e.g. made by Andor, are cooled down to -100 °C, have single-electron sensitivity, pixel sizes of 8-24 μm, a maximum resolution of 1024x1024, and sensors mounted in a vacuum. See here: http://www.andor.com/scientific_cameras/ixon_emccd_camera/897_single-photon/
At this level of performance there is enough real information to make use of high bit depth - true 14 bit @ 10, 5, 3 & 1 MHz readout rate (16 bit @ 1 MHz). See this excerpt from page 47 of the brochure:

"....5. It is clear that the actual sensor Dynamic Range only exceeds 14-bits at 1 MHz, through either EM or a conventional amplifier. Therefore, it is at 1 MHz that we require an option to match this higher dynamic range output with a scientific grade, noise free 16- bit A/D digitization. The iXon3 is uniquely designed to do just that, making use of a real scientific grade A/D that is optimized for 1 MHz readout.
Note: There is a direct relationship between readout noise and maximum dynamic range at a given readout speed. Lower readout noise affords higher dynamic range....."
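The relationship the brochure describes, lower readout noise at slower readout speeds yields higher dynamic range, can be sketched numerically. The figures below are purely illustrative (not Andor's actual specifications); they only show when a sensor's dynamic range genuinely exceeds what a 14-bit ADC can encode.

```python
import math

def dynamic_range_bits(full_well_e, read_noise_e):
    """Dynamic range expressed as equivalent ADC bits:
    log2(full-well capacity / read noise), both in electrons."""
    return math.log2(full_well_e / read_noise_e)

# Illustrative trade-off: faster readout -> higher read noise.
scenarios = {
    "10 MHz readout, ~12 e- read noise": (180_000, 12.0),
    "1 MHz readout,   ~3 e- read noise": (180_000, 3.0),
}
for name, (fw, rn) in scenarios.items():
    bits = dynamic_range_bits(fw, rn)
    adc = 16 if bits > 14 else 14
    print(f"{name}: DR ~ {bits:.1f} bits -> {adc}-bit A/D is warranted")
```

Only the slow, low-noise readout pushes the dynamic range past 14 bits, which is exactly the brochure's point about reserving the 16-bit digitization for the 1 MHz mode.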

Hope this helps a bit. I think this proves that there is a use for it. On the other hand, it shows that for current commercial photography chips it has little to no relevance.

Greetings from Munich
Stefan
Logged
Because Photography is more than Technology and "as we have done it before".

bcooter

  • Guest
Re: 16 Bit Myth
« Reply #139 on: January 02, 2012, 03:17:16 pm »

This type of conversation has gone on for about 8 years and honestly, it's all old school thought.

Maybe in the days when we were transitioning from film to digital, clients asked questions about file size and photographers were looking for that holy grail of a digital camera that would perform like film, but today . . . in today's world I don't think any of this is relevant.

Consequently, 8 years ago there was a whole world of digital-expert camera dealers offering the specialized service that was needed because everyone was so new to the process. Same with digital techs and consultants.

Today it's different. Sure, you need a good dealer, and Dave Gallagher's Capture Integration is a good company, though today there are a lot of good dealers that will stand behind their products: Samy's in LA, Digital Transitions in New York, and if I really looked around there are probably another half dozen or so in the U.S. that know their stuff. I know because I've bought from most of them with good results.

The difference with CI is that they have more of an online presence than most dealers, and I think some people appreciate it. I assume it must enhance CI's sales or they wouldn't expend the effort, though as we've seen, sometimes it goes a little sideways.

That's just the way the world works.  Not everyone is always going to like everyone.

To some that matters, but most professionals couldn't care less about forums, bit depth, file size, or how to tether a camera to a computer, because the digital process is now standard fare and we all know how to do it.

You can take a Nikon, Canon, Hasselblad . . . almost any professional still camera and if the content is compelling any client at virtually any level will eagerly accept the files if the work is done professionally.

If you're a professional, you know your stuff and don't need to make a dozen phone calls to figure out that the FireWire cable is broken.

My suggestion is buy what you want, enjoy what you have and don't worry about the camera, because the only really great camera is the one you like.

IMO

BC

Logged
Pages: 1 ... 5 6 [7] 8   Go Up