
Author Topic: 16 Bit Myth  (Read 59085 times)

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: 16 Bit Myth
« Reply #80 on: December 31, 2011, 04:02:49 am »

Hi,

You are mixing up things a bit:

1) 32 bit color means 3x8 (RGB) plus an 8 bit alpha channel

2) DR essentially defines what information is contained in the image. We always transform the image before viewing it.

Best regards
Erik


Another issue is that there is no current display technology to view 16 bits. If you look at the websites of LCD manufacturers you will see they are mostly 10 bit devices, with some 12 bit (there may be 14 bit since the last time I looked) for very expensive hospital devices. Everything over the 10 bits is dithering of adjacent pixels. Yes, your OS claims you have 32 bit color. Prints have even less dynamic range.


Logged
Erik Kaffehr
 

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: 16 Bit Myth
« Reply #81 on: December 31, 2011, 04:10:32 am »

An IQ180.

Kidding, a D7000. Very nice little camera I have to say. Works wonders when stitching.

Cheers,
Bernard


It's always bugged me that Nikon gets more out of a given Sony sensor than Sony. Of course the price is also much higher. The alternate 14 bit A/D converter pathway does give Nikon a real advantage.
Logged

Fine_Art

  • Sr. Member
  • ****
  • Offline
  • Posts: 1172
Re: 16 Bit Myth
« Reply #82 on: December 31, 2011, 04:28:14 am »

Hi,

You are mixing up things a bit:

1) 32 bit color means 3x8 (RGB) plus an 8 bit alpha channel

2) DR essentially defines what information is contained in the image. We always transform the image before viewing it.

Best regards
Erik


Isn't 32 bit color more of a myth than 16 bit? ;D
Logged

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: 16 Bit Myth
« Reply #83 on: December 31, 2011, 04:52:05 am »

32 bit color on computers means 8 bit RGB + 8 bit alpha: 3 x 8 = 24, and 24 + 8 = 32.
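
A minimal sketch in Python (pack_rgba is a hypothetical helper, just to illustrate the packing):

def pack_rgba(r, g, b, a):
    # four channels x 8 bits each = 32 bits per pixel
    return (a << 24) | (r << 16) | (g << 8) | b

pixel = pack_rgba(200, 100, 50, 255)   # an opaque orange-ish pixel
print(hex(pixel))                      # 0xffc86432 - still only 8 bits per channel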

Isn't 32 bit color more of a myth than 16 bit? ;D
« Last Edit: December 31, 2011, 04:57:55 am by ErikKaffehr »
Logged
Erik Kaffehr
 

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: 16 Bit Myth
« Reply #84 on: December 31, 2011, 05:36:06 am »

In older Canons ISO 3200 and above was obtained from ISO 1600 data multiplied before encoding the RAW data. In some newer ones that happens from ISO 6400. In any case ISOs above ISO 1600, no matter if they are analogue or digital, don't add anything for the RAW shooter because SNR doesn't improve any more.

By processing I meant applying noise reduction on the RAW data. The digital ISOs (and this includes +1/3 and +2/3 values) are of no interest to the RAW shooter.


If that digital ISO is e.g. a multiplication by 0.5, you would need 13 bits to store a 12 bit signal without loss. That is relevant to all raw shooters using such an ISO setting. My guess is that many do.

Only offered as a theory as to why Canon use more bits than they seem to need.
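
A minimal sketch of that bookkeeping (not Canon's actual pipeline, just the arithmetic): halving 12 bit codes and rounding back to integers merges neighbouring codes, while one extra fractional bit keeps them all distinct.

codes = range(4096)                              # every 12 bit sensor value

stored_12bit = {int(v * 0.5) for v in codes}     # integer LSB: codes collapse
print(len(stored_12bit))                         # 2048 -> one bit of data lost

# 13 bit fixed point: same 0..4095 full-scale range, but an LSB worth 0.5
stored_13bit = {int(v * 0.5 * 2) for v in codes} # stored value = v, exact
print(len(stored_13bit))                         # 4096 -> lossless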

-H
« Last Edit: December 31, 2011, 05:38:15 am by hjulenissen »
Logged

Guillermo Luijk

  • Sr. Member
  • ****
  • Offline
  • Posts: 2005
    • http://www.guillermoluijk.com
Re: 16 Bit Myth
« Reply #85 on: December 31, 2011, 06:04:50 am »

That is relevant to all raw shooters using such an ISO setting. My guess is that many do.

They shouldn't. There is a post by Gabor Schorr where he explained all ISOs in Canon cameras. Anyway I am not sure if ISOs 160, 320, 640, 1250, 2500 and 5000 are post-processed before encoding the RAW data (which would lead to information loss), or are just metatagged in order to inform the RAW developer to apply an additional -1/3EV baseline exposure correction.

Regards

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: 16 Bit Myth
« Reply #86 on: December 31, 2011, 06:18:25 am »

They shouldn't.

You probably are right. But for understanding why Canon do what they do, I think it is more relevant to understand what their customers actually do than what they should have done.

Canon did hint at this by letting the 1/3 stop ISO settings be hidden by default, and by limiting the auto-ISO to not exceed 1600 or 3200.

-h
Logged

hjulenissen

  • Sr. Member
  • ****
  • Offline
  • Posts: 2051
Re: 16 Bit Myth
« Reply #87 on: December 31, 2011, 06:20:33 am »

Another issue is that there is no current display technology to view 16 bits. If you look at the websites of LCD manufacturers you will see they are mostly 10 bit devices, with some 12 bit (there may be 14 bit since the last time I looked) for very expensive hospital devices. Everything over the 10 bits is dithering of adjacent pixels. Yes, your OS claims you have 32 bit color. Prints have even less dynamic range.

Despite that I would still prefer a camera that could record a full 16 bits for future screen technology. Some shots cannot be redone. A full 16 bit pipe from sensor to file is a good thing. My camera does multishot HDR very well, and the files look good. Unfortunately it only outputs this to 8 bit JPEG. Doh! Now why can't they combine multishot into a full 16 bit RAW?

As Erik said, displays have a non-linear "gamma" response while cameras have a linear response, so you need more "camera bits" than "display bits".
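
A minimal sketch of that mismatch in Python (assuming a pure 2.2 gamma display curve, which is only a rough model of real displays):

import math

def display_to_linear(code8):
    # 8 bit gamma-encoded display code -> linear light, 0..1 of full scale
    return (code8 / 255.0) ** 2.2

step1 = display_to_linear(1)       # first display step above black: ~5e-6
print(math.log2(1 / step1))        # ~17.6 -> ~18 linear bits needed there
# A 12 bit linear encoding (steps of 1/4096, ~2.4e-4) cannot even resolve
# that first display step near black.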

-h
Logged

Guillermo Luijk

  • Sr. Member
  • ****
  • Offline
  • Posts: 2005
    • http://www.guillermoluijk.com
Re: 16 Bit Myth
« Reply #88 on: December 31, 2011, 08:20:02 am »

You probably are right. But for understanding why Canon do what they do, I think it is more relevant to understand what their customers actually do

Canon cares about what their customers do: most of them shoot JPEG, and that is why even DSLR cameras are JPEG-oriented. And RAW shooters still have good enough tools to obtain beautiful pictures, even if they are not the best possible.

Only advanced users and geeks like us are interested in such things as RAW histograms, ISO implementation details, and so forth.

Regards

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #89 on: December 31, 2011, 10:14:14 am »

On the topic of bit depth, and in particular 16 bits, I don't have much to add technically, except that my sense is that if a product is a native 16 bit product, then at the least, that spec is required to deliver the quality of image that the product is capable of. Whether 14 bits or 16 bits, etc. - by itself - is superior, necessary, or noticeable is not something that IMO particularly matters, or is even easy to specifically identify. I will say that I have not mentioned 16 bit depth as a feature in quite some time, though I have discussed 16 bit output on many occasions, most often in the context of a minimal Capture One to Photoshop or Lightroom workflow for those who have an affinity for Adobe products (raw converting in Capture One, then outputting a robust 16 bit tiff into either Adobe program).

In any event, it is not, nor should it be, a dominant part of any discussion that we would have with a client. There are far too many elements that are of much greater importance that determine whether a Leaf Aptus 22, for example, is an appropriate fit for a client of ours.

From that standpoint, I would have to say I agree that the term "True 16 Bit Depth" is of questionable merit or value when limited to stating just 3 features on the banner for our $4,990 Leaf Aptus 22 special. But it is not an easy spot to come up with something that is designed to grab, given the limitations. And that banner is just that, a grabber. Given its intent of provoking a phone call or email inquiry, mentioning 16 bit depth may indeed provide a means to that end, but let's say one does make a call (as many have). I can say not a single caller on that product has asked about True 16 Bit Depth, nor has there been a discussion on it. However, if there were, the relevance of True 16 Bit Depth would gradate into the actual tonality of the image quality itself, rather than a simple "Well look fella, 16 bits means better tonality". The fact is that the original Leaf Aptus 22 did indeed have a very nice transition in tonal range when used appropriately, even compared to other 22 megapixel digital backs. But it is not a straight path, as Doug said, and the how and why is a combination of many elements. The 16 bit part is highly likely a necessary part of that equation, but not solely responsible for the smoothness of those transitions.

With more time and thought, I agree, we could probably find a more relevant stated feature that would be a reason to purchase that product. That said, it is difficult to find "16 Bit Depth" trumpeted anywhere on our site, or even on Leaf's site. So, I think to be fair, it's pretty clear we do not as a company make broad use of the term to sell product or mislead clients. And compared to just about any other website that sells digital backs (manufacturer or dealer), I think we have an abundance of real world information vs manufacturer-driven marketing data.

Now, one more thing, getting back to the use of the word "True". If anything, I feel like that was the one word that had the most legitimate reason for being there. Many, many products advertise themselves as "16 bit", but in almost all cases they are referring to up-sampling the output. So, True 16 Bit is relevant from the standpoint of 16 bit on the input side. Further, I will also say that I am not clear on whether the Dalsa 22MP sensor is a "true native 16 bit sensor". It has always been claimed and/or assumed, while it was clear that the Kodak sensors used in digital backs (Sinar, Imacon/Hasselblad, Phase One pre-P65+ and post-H10) have always been 14 bit native, even though data sheets for all of these products trumpet 16 bit. So, stating True 16 Bit has also been a differentiator between Kodak and Dalsa sensors. However, at some point - for me - even the assumption that Dalsa sensors were native 16 bit came into question, and to this day it remains unclear. As a result, the importance of the 16 bit spec is rarely even mentioned, much less discussed. The question of how the real world tonality and handling of shadow and highlight gradations differ between one digital back and another is a more relevant and much more frequently discussed topic that may be related to bit depth, but for the user it is much more pertinent than a discussion of 14 bits or 16 bits etc.

We spend an enormous amount of time going over what we put on our website, I have discussed this with Doug, and it is a bit late now, but on the next 3 word banner we have, one of the stated features absolutely, has to be "Fabulous".


Steve Hendrix

Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: 16 Bit Myth
« Reply #90 on: December 31, 2011, 10:27:24 am »

Hi,

That makes two of us...

I don't really think it's the A/D converter that plays that role; it could be that the Bionz processor is just 12 bits wide. On the Exmor chips the A/D converters are actually on the chip, so the physical converters are probably/possibly the same.

Best regards
Erik


It's always bugged me that Nikon gets more out of a given Sony sensor than Sony. Of course the price is also much higher. The alternate 14 bit A/D converter pathway does give Nikon a real advantage.
Logged
Erik Kaffehr
 

Ken Doo

  • Sr. Member
  • ****
  • Offline
  • Posts: 1079
    • Carmel Fine Art Printing & Reproduction
Re: 16 Bit Myth
« Reply #91 on: December 31, 2011, 10:53:05 am »

....We spend an enormous amount of time going over what we put on our website, I have discussed this with Doug, and it is a bit late now, but on the next 3 word banner we have, one of the stated features absolutely, has to be "Fabulous".

Steve Hendrix


How about: "Steve recommended.  Doug Approved."

Works for me.   ;D    lol  ken

LKaven

  • Sr. Member
  • ****
  • Offline
  • Posts: 1060
Re: 16 Bit Myth
« Reply #92 on: December 31, 2011, 10:58:46 am »

Steve, the product data sheet for the Dalsa FTF4052C sensor used in the Aptus 22 reads:

"The FTF 4052C is a full frame CCD colour image sensor designed for professional digital photography applications with very low dark current and a linear dynamic range of over 12 true bits at room temperature."

I think this, by stipulation, is the only sense of "true" in terms of "true bits" that you are looking at.

PierreVandevenne

  • Sr. Member
  • ****
  • Offline
  • Posts: 512
    • http://www.datarescue.com/life
Re: 16 Bit Myth
« Reply #93 on: December 31, 2011, 11:45:12 am »

while it was clear that the Kodak sensors used in digital backs (Sinar, Imacon/Hasselblad, Phase One pre-P65+ and post-H10) have always been 14 bit native, even though data sheets for all of these products trumpet 16 bit.

KAF-31600
FWC: 60000 e (per Kodak data sheet)
Read noise: 16 e (per Kodak data sheet)
-> 3750 levels -> under 12 bits of DR
Sanity check:
DR = 20 x log10(60000/16) = 71.5 dB

Kodak data sheet: 70.5 dB

Good match allowing for rounding errors.
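
The same sanity check in a few lines of Python, using the data-sheet numbers above:

import math

fwc = 60000                     # full-well capacity, electrons
read_noise = 16                 # read noise, electrons

levels = fwc / read_noise                # 3750 distinguishable levels
print(math.log2(levels))                 # ~11.87 -> under 12 bits of DR
print(20 * math.log10(levels))           # ~71.5 dB vs 70.5 dB on the sheet
print(20 * math.log10(2 ** 14))          # ~84.3 dB needed for 14 "true" bits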

If one wants a sensor with 14 bits of DR, one should look at a sensor with a linear DR of 84 dB (or a sensor using the partial-read, multiple-exposure summing per pixel described in the other thread).

And yes, it doesn't hurt to have an A/D converter that outperforms the sensor a bit. But if one goes from sensor to A/D converter to data path, it's quite easy to tell which number will be used by marketeers :-).
Logged

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #94 on: December 31, 2011, 12:03:08 pm »

Steve, the product data sheet for the Dalsa FTF4052C sensor used in the Aptus 22 reads:

"The FTF 4052C is a full frame CCD colour image sensor designed for professional digital photography applications with very low dark current and a linear dynamic range of over 12 true bits at room temperature."

I think this, by stipulation, is the only sense of "true" in terms of "true bits" that you are looking at.


Quite possibly. However, back to the original crux of stating 16 bits as a specification: if "true" bit depth was stated for every product, it might also lead to confusion, as the stated bit depth from marketing data sheets would likely always differ. So, stating the FTF 4052C as a true 12 bit sensor would confuse, as many other products might genuinely natively be 8 bit, 11 bit, etc. Inaccurate as it may be, 8 bit, 12 bit, 14 bit, 16 bit, etc. are an established reference point. And given the relevance, or lack of relevance if you will, of bit depth as a singular statistic for making a purchasing decision, I don't believe reinventing the wheel when it comes to establishing "true" bit depth for every imaging device on the market is worth the time. And considering I cannot recall the last time I even had a discussion on the bit depth statistic with a client, I would say interested purchasers tend to agree.
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

LKaven

  • Sr. Member
  • ****
  • Offline
  • Posts: 1060
Re: 16 Bit Myth
« Reply #95 on: December 31, 2011, 12:34:48 pm »


Quite possibly. However, back to the original crux of stating 16 bits as a specification: if "true" bit depth was stated for every product, it might also lead to confusion, as the stated bit depth from marketing data sheets would likely always differ. So, stating the FTF 4052C as a true 12 bit sensor would confuse, as many other products might genuinely natively be 8 bit, 11 bit, etc. Inaccurate as it may be, 8 bit, 12 bit, 14 bit, 16 bit, etc. are an established reference point. And given the relevance, or lack of relevance if you will, of bit depth as a singular statistic for making a purchasing decision, I don't believe reinventing the wheel when it comes to establishing "true" bit depth for every imaging device on the market is worth the time. And considering I cannot recall the last time I even had a discussion on the bit depth statistic with a client, I would say interested purchasers tend to agree.

Even as someone who studied semantics for many years, I still had to pace the floor for a solid 5 minutes to unpack this circuitous defense.

This all started due to an Aptus back that you advertised as having "16 true bits."  This left us scratching our heads about the intended meaning of the word "true."  By implicature, it appeared you were trying to put forth the impression that this back turned out 16 bits of image data, as opposed to 12 or 13 bits of image data and 3-4 bits of noise.  This of course was a false impression.  You wisely decided to correct it, while somehow still defending it.

The manufacturer's technical data sheet supplies a meaning of "true" that is hard to counter.  It claims just over 12 "true bits."  This is not a marketing sheet by the way, but a technical reference with pin-outs and voltages. 

You cite confusion on the part of the prospective customer over varying ratings of bit depth.  You also say that the customer rarely discusses it.  But in point of fact, you try to forestall such discussion by advertising "16 true bits" on your bullet list (as do some of the MF manufacturers in their own marketing literature, e.g., the Pentax 645D).  So by this time, the customer has accepted it unquestioningly as evidenced by the number of times the claim is parroted on these forums.  I would not expect the discussion to come up again unless the customer somehow gets the correct impression that this is not a truthful claim.  You've introduced the confusion you see, a confusion that is not exposed at the point of sale, but later on, in forum threads like this one.  And if the bit specifications were truly inconsequential (both to buyers and in practice), you wouldn't use them in marketing literature.  I don't know what "wheel" you are talking about reinventing here.  And I don't know what kind of "reference point" is established regarding bit depth except those that are intended to induce vague, positive impressions about final image quality in a prospective buyer.

By your lights, all DSLRs might be called 16 bit since they can be made to produce 16 bit files.  In fact, most DSLRs have more "true" bits, considered as measurable dynamic range, than your Aptus.  What your Aptus has, and it does have something special, is something else altogether. 

ErikKaffehr

  • Sr. Member
  • ****
  • Offline
  • Posts: 11311
    • Echophoto
Re: 16 Bit Myth
« Reply #96 on: December 31, 2011, 12:52:00 pm »

Hi,

I would say that calling a certain CCD-based MF digital back a 16 bit device when it's more like a 12 bit device is a bit of false marketing, especially when the fake 16 bits are claimed to be an advantage over the real 14 bits a sensor may actually use. The Pentax K5 is close to 14 bits, it seems.

The best way to demonstrate the advantage of MFDBs over other cameras is decent and technically correct raw files shot under controlled conditions. I have seen very few examples of that, at least such files involving both MFDBs and state of the art DSLRs. I have seen a couple of such tests:

- Lloyd Chambers has tested the Leica S2, Pentax 645D, Hasselblad H3D and some others.

- Miles Hecker has compared Pentax 645D to Canon 5DII and Nikon D3X.

- Imaging Resource has raw files from the Pentax 645D for download

In my view those tests demonstrated the advantages of MFD. Photography is about images, so why use fake bits for marketing instead of real raw images?

Best regards
Erik

The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why all pixels are not created equal. Assume you have about 10 characters worth of space to explain that, and the person is not among the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with, as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "Holds up well to strong styling in post processing" but it didn't have the same ring and took up much more room. I'm open to suggestions. We're not trying to be dishonest or disingenuous.

The real world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn or a significant change in WB), but the explanation could take up many pages regarding the entire imaging chain (including the quality, not just bit depth, of the A/D converter and the dark frame technology) and the emphasis throughout the chain on image quality over convenience, features, speed, or cost.

Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world, on results, and on real world testing, not specs or marketing spiels - but the reality of marketing is you have only a few seconds of attention span and a few characters to use it.

I'm serious when I say I'm open for suggestions.

Doug Peterson (e-mail Me)
__________________

Head of Technical Services, Capture Integration
Phase One Partner of the Year
Leaf, Leica, Cambo, Arca Swiss, Canon, Apple, Profoto, Broncolor, Eizo & More

National: 877.217.9870  |  Cell: 740.707.2183
Newsletter | RSS Feed
Buy Capture One 6 at 10% off

« Last Edit: December 31, 2011, 07:37:02 pm by ErikKaffehr »
Logged
Erik Kaffehr
 

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #97 on: December 31, 2011, 01:00:37 pm »

Even as someone who studied semantics for many years, I still had to pace the floor for a solid 5 minutes to unpack this circuitous defense.

This all started due to an Aptus back that you advertised as having "16 true bits."  This left us scratching our heads about the intended meaning of the word "true."  By implicature, it appeared you were trying to put forth the impression that this back turned out 16 bits of image data, as opposed to 12 or 13 bits of image data and 3-4 bits of noise.  This of course was a false impression.  You wisely decided to correct it, while somehow still defending it.

The manufacturer's technical data sheet supplies a meaning of "true" that is hard to counter.  It claims just over 12 "true bits."  This is not a marketing sheet by the way, but a technical reference with pin-outs and voltages. 

You cite confusion on the part of the prospective customer over varying ratings of bit depth.  You also say that the customer rarely discusses it.  But in point of fact, you try to forestall such discussion by advertising "16 true bits" on your bullet list (as do some of the MF manufacturers in their own marketing literature, e.g., the Pentax 645D).  So by this time, the customer has accepted it unquestioningly as evidenced by the number of times the claim is parroted on these forums.  I would not expect the discussion to come up again unless the customer somehow gets the correct impression that this is not a truthful claim.  You've introduced the confusion you see, a confusion that is not exposed at the point of sale, but later on, in forum threads like this one.  And if the bit specifications were truly inconsequential (both to buyers and in practice), you wouldn't use them in marketing literature.  I don't know what "wheel" you are talking about reinventing here.  And I don't know what kind of "reference point" is established regarding bit depth except those that are intended to induce vague, positive impressions about final image quality in a prospective buyer.

By your lights, all DSLRs might be called 16 bit since they can be made to produce 16 bit files.  In fact, most DSLRs have more "true" bits, considered as measurable dynamic range, than your Aptus.  What your Aptus has, and it does have something special, is something else altogether. 



My point was that there may be technical reference sheets for many products that state the actual bit depth as less than the stated bit depth on the manufacturer's end-product data sheet. That is all that I was saying.

A lot is being made of our use of the term "true 16 bit" (as stated by Leaf themselves) on a banner for the Leaf Aptus 22. I can barely find a mention of 14 or 16 bit depth anywhere else on our website, so I think our use of bit depth to sell a product is being a bit overblown. I agree, "true 16 bit" is vague, possibly technically inaccurate, and not one of the top 3 features I would mention as reasons to consider an Aptus 22. However, we've already noted that, and it is not indicative of Capture Integration being the industry culprit for this.


Steve Hendrix
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss

LKaven

  • Sr. Member
  • ****
  • Offline
  • Posts: 1060
Re: 16 Bit Myth
« Reply #98 on: December 31, 2011, 01:32:31 pm »

[...]
I agree, "true 16 bit" is vague, possibly technically inaccurate, and not one of the top 3 features I would mention as reasons to consider an Aptus 22. However, we've already noted that, and it is not indicative of Capture Integration being the industry culprit for this.

True enough, Steve.  If Leaf and/or Pentax had their in-house reps writing to this forum, we'd be taking them to task in a bigger way.  [In fact, they should be put on notice for these kinds of claims.  This kind of myth, about a product that costs as much as an Audi in many cases, should be squelched.]  Since you and Doug undertook a defense of the practice, you probably caught more than your fair share.

Steve Hendrix

  • Sr. Member
  • ****
  • Offline
  • Posts: 1662
    • http://www.captureintegration.com/
Re: 16 Bit Myth
« Reply #99 on: December 31, 2011, 01:44:14 pm »

True enough, Steve.  If Leaf and/or Pentax had their in-house reps writing to this forum, we'd be taking them to task in a bigger way.  [In fact, they should be put on notice for these kinds of claims.  This kind of myth, about a product that costs as much as an Audi in many cases, should be squelched.]  Since you and Doug undertook a defense of the practice, you probably caught more than your fair share.


And to be clear, we're not advocating a defense of it. I've already stated that in hindsight it wasn't worth mentioning as a top 3 feature, regardless of the actual specification (for the Aptus 22 product). And I can't find anywhere else on our website where we really even mention bit depth, other than when re-posting manufacturer-stated specifications. Because of the public vagueness of actual bit depth on a per sensor basis, we have never really spent much energy espousing it. This just happened to find its way into a little space as a buzzword, and I agree it is not why someone buys an Aptus 22. The physical sensor size, pixel dimensions, tonality, color reproduction and that Leaf Special Sauce are far more pertinent.


Steve Hendrix
Logged
Steve Hendrix • 404-543-8475 www.captureintegration.com (e-mail Me)
Phase One | Leaf | Leica | Alpa | Cambo | Sinar | Arca Swiss