Luminous Landscape Forum

Equipment & Techniques => Medium Format / Film / Digital Backs – and Large Sensor Photography => Topic started by: asf on December 28, 2011, 07:23:20 pm

Title: 16 Bit Myth
Post by: asf on December 28, 2011, 07:23:20 pm
Why do you think the 16-bit myth refuses to die?
Who is it that is still promoting this?

I just noticed a well-known dealer's recent ad for (a good deal on) a back, and the second thing they promote is "true 16 bit quality".

I remember my dealer giving me that same line years back when I laid out $20k for my first back, and I also remember how it felt when I learned the truth about that bit of sales-speak. Not that it made my back any worse, but it did make me reconsider every technical thing they told me and reassess just how knowledgeable they really were.

Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 28, 2011, 07:54:35 pm
The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equally. Assume you have about 10 characters worth of space to explain that and the person is not the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "holds up to strong styling in post processing well" but it didn't have the same ring and took up much more room. I'm open for suggestions. We're not trying to be dishonest or disingenuous.

The real world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn or a significant change in WB) but the explanation could take up many pages regarding the entire imaging chain (including the quality, not just bit depth, of the A/D converter and dark frame technology) and the emphasis throughout the chain on image quality over convenience, features, speed, or cost.

Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world and on results and real world testing not specs or marketing spiels - but the reality of marketing is you have only a few seconds of attention span and a few characters to use it.

I'm serious when I say I'm open for suggestions.

Doug Peterson (e-mail Me) (doug@captureintegration.com)
__________________

Head of Technical Services, Capture Integration
Phase One Partner of the Year
Leaf, Leica, Cambo, Arca Swiss, Canon, Apple, Profoto, Broncolor, Eizo & More

National: 877.217.9870  |  Cell: 740.707.2183
Newsletter (http://www.captureintegration.com/our-company/newsletters/) | RSS Feed (http://www.captureintegration.com/2008/08/11/rss-feeds/)
Buy Capture One 6 at 10% off (http://www.captureintegration.com/phase-one/buy-capture-one/)
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 08:22:51 pm
The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equally. Assume you have about 10 characters worth of space to explain that and the person is not the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "holds up to strong styling in post processing well" but it didn't have the same ring and took up much more room. I'm open for suggestions. We're not trying to be dishonest or disingenuous.

The true reasons you are alluding to have to do more with the area of the sensor.  So why wouldn't you just say that?  There are plenty of true reasons to want your products.

Saying "16 bit" is already false, but adding the word "true" as if to persuade the buyer "no really really" seems to cross the boundary.  There is no need to make things up.  You have a good product.

Quote
The real world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn or a significant change in WB) but the explanation could take up many pages regarding the entire imaging chain (including the quality, not just bit depth, of the A/D converter and dark frame technology) and the emphasis throughout the chain on image quality over convenience, features, speed, or cost.

But there are still true things you could say, instead of false things.  The size of the sensor comes to mind, and the importance of that.

Quote
Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world and on results and real world testing not specs or marketing spiels - but the reality of marketing is you have only a few seconds of attention span and a few characters to use it.

I'm serious when I say I'm open for suggestions.

I've seen you here enough to know that you put a lot of care into your relations.  But the 16-bit claim shouldn't be excused as an expedient.  You can say that it is a very high fidelity capture system without exaggerating its specs.  
Title: Re: 16 Bit Myth
Post by: asf on December 28, 2011, 08:32:07 pm
I've seen you here enough to know that you put a lot of care into your relations.

I don't know Mr Peterson but I can agree with the above.

Title: Re: 16 Bit Myth
Post by: digitaldog on December 28, 2011, 09:39:25 pm
Why do you think the 16-bit myth refuses to die?
I just noticed a well-known dealer's recent ad for (a good deal on) a back, and the second thing they promote is "true 16 bit quality".

The question could be: are they saying this back truly produces 16 bits, or that it is high-bit, assuming or misunderstanding that Photoshop considers all documents with more than 8 bits per color a “16-bit file”?

Or is the question, do people today really still question the usefulness of high bit data? Well one fellow continues to do so, but he’s best ignored.
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 10:11:39 pm
The question could be: are they saying this back truly produces 16 bits

I am sure they are equipped with parts producing 16-bit data. So writing "true 16 bits" is no more false advertising than Epson writing "true 4800dpi" for their scanners, or than YBA writing about the power their high-end amps can handle.

The question is whether these 16 bits contain more useful data than a 14-bit pipe. There is little evidence pointing to a yes. Am I saying that backs do not have smoother transitions? Nope. I am saying that even if they do, the true reason is not the bit depth the imaging pipe can handle, but more likely a combination of the CCD sensor and the quality of the ADC parts used.

As a side comment, this is related to DR also, since you obviously need more sampling information to cover a wider range of values. But it is also pretty obvious that 14 bits are sufficient to cover the DR backs can handle.

I understand the point Doug is making about the difficulty of marketing these backs, given the lack of easily understandable/measurable metrics for the smoothness of color transitions in a file.

Besides, other than a few posts at LL among people who mostly do not own backs, the 16-bit "myth" probably works wonders among back owners. Just like a heavier amp must sound better, a 16-bit imaging pipe must deliver better transitions. Once the buyer is convinced of something, the deal is closed. :-)

So I would personally stick to "true 16 bits" if I were in the Phase One eco-system.

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: pindman on December 28, 2011, 10:32:53 pm
Hello Doug,

The truth, Doug, is that you are telling lies when you say “true 16 bit quality”, plain and simple.

Regards

Simon

Simon,

I don't know why you need to launch a personal attack on Doug.  It only shows your ignorance, and adds nothing to the topic. 

Yes, I use an IQ180 back, and that quality is great, and I don't care where the discussion of bit-depth leads.  On the other hand, I have known Doug for several years, and have had the pleasure of getting to spend some time with him in Carmel, CA.  I have nothing but respect for his integrity. 

If you have better science, please feel free to share it, but please don't stoop to personal attacks.

Paul
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 10:43:41 pm
I am sure they are equipped with parts producing 16-bit data. So writing "true 16 bits" is no more false advertising than Epson writing "true 4800dpi" for their scanners, or than YBA writing about the power their high-end amps can handle.

Almost all DSLRs are equipped with 16-bit data buses, and maybe even some P&S cameras.  So are they all equally entitled to call themselves "true 16 bit"?  Or are none of them?  What would you call a camera that actually succeeds in recording 16 bits of DR -- a "really really real 16-bit"?

Quote
The question is whether these 16 bits contain more useful data than a 14-bit pipe. There is little evidence pointing to a yes. Am I saying that backs do not have smoother transitions? Nope. I am saying that even if they do, the true reason is not the bit depth the imaging pipe can handle, but more likely a combination of the CCD sensor and the quality of the ADC parts used.

I think it's the area of the sensor.  I don't see where the Exmor is lagging behind the CCD per unit area of the sensor.

Quote
...

Quote
Besides, other than a few posts at LL among people who mostly do not own backs, the 16-bit "myth" probably works wonders among back owners. Just like a heavier amp must sound better, a 16-bit imaging pipe must deliver better transitions. Once the buyer is convinced of something, the deal is closed. :-)

So I would personally stick to "true 16 bits" if I were in the Phase One eco-system.
This is strange logic.  Because it's customary to lie, it isn't morally wrong, and closing the deal is the only thing that matters? 
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 10:50:06 pm
Almost all DSLRs are equipped with 16-bit data buses, and maybe even some P&S cameras.  So are they all equally entitled to call themselves "true 16 bit"?  Or are none of them?  What would you call a camera that actually succeeds in recording 16 bits of DR -- a "really really real 16-bit"?

I don't believe that any non-MFDB camera uses a 16-bit analog-to-digital converter.

Do you have any facts proving me wrong?

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 10:54:22 pm
The fact is that if he has made such a comment about their backs shooting in 16 bit, then it is a lie and is false advertising.

Simon,

No, it is not a lie, since the backs use 16-bit parts.

I don't see what is so hard to understand here.

Nothing prevents your favorite DSLR brand from also using 16-bit parts; the cost would be trivial considering the volumes. The raw files would be 13% larger with little or no tangible benefit, but they could do it. They don't, mostly because their marketing department is telling them that the ROI would not be significant compared to other forms of improvement.

Also, it is well known that Japanese companies talk with each other to define the main steps of technological evolution, and they have obviously decided that it will be 14 bits for a few more generations, so you should buy a MFDB if it bothers you not to own the device with the highest bit-depth spec. :-)

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 11:00:13 pm
I don't believe that any non-MFDB camera uses a 16-bit analog-to-digital converter.

Do you have any facts proving me wrong?

Cheers,
Bernard

When you referred to "parts producing 16-bit data" I assumed that you meant firmware processes.  What I referred to were 16-bit "data buses" for internal DSP functions, used by all DSLRs to provide two extra bits of precision to avoid cumulative error.  Otherwise there are no parts producing 16 bits of data.  Surely even 16-bit converters could not produce 16 bits of "data" where bits 15-16 are uncorrelated noise at best.
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 11:03:45 pm
When you referred to "parts producing 16-bit data" I assumed that you meant firmware processes.  What I referred to were 16-bit "data buses" for internal DSP functions, used by all DSLRs to provide two extra bits of precision to avoid cumulative error.  Otherwise there are no parts producing 16 bits of data.  Surely even 16-bit converters could not produce 16 bits of "data" where bits 15-16 are uncorrelated noise at best.

So we do agree that backs have a factual spec differentiator (a 16-bit ADC) that no other camera has, which justifies the "true 16 bits" wording, right?

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 11:06:03 pm
No, it is not a lie since the backs use 16 bits parts.

I don't see what is so hard to understand here.

Is that "16-bit", "true 16-bit" or "really really true 16-bit" or "no honest this time we're not kidding true 16-bit"?
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 11:10:42 pm
What is the point of touting a non-functioning, zero-benefit "feature" unless one is somehow trying to induce the false belief in the buyer that there actually is some corresponding benefit?

What, further, is the point of adding the word "true" to characterize it, except to add perlocutionary force to a fiction?
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 11:17:22 pm
Is that "16-bit", "true 16-bit" or "really really true 16-bit" or "no honest this time we're not kidding true 16-bit"?

Hmm... I have no choice but to wave the white flag considering the undeniable strength of your arguments. :-)

Plus my train is about to arrive in a wonderful shooting locale where the 14 bits of the D7000 I carry will unleash their earth shaking power.

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: LKaven on December 28, 2011, 11:23:44 pm
:-)  Those 14 bits wouldn't be true without you.
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 28, 2011, 11:26:55 pm
:-)  Those 14 bits wouldn't be true without you.

Cheers to that, I wish you a truly excellent year end!

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: Chris_Brown on December 28, 2011, 11:32:27 pm
Why is it do you think the 16 bit myth refuses to die?

Okay, I'll bite. What is the myth?
Title: Re: 16 Bit Myth
Post by: Snapped on December 29, 2011, 12:31:45 am
A 16-bit file is infinitely more robust under heavy PS work (color grading / exposure adjustment). Many 35mm CMOS-chip cameras produce only 14-bit files; even when captured RAW and processed to "16 bit" they are simply interpolated up to 16-bit depth from their native 14-bit capability. As we all know, image interpolation is basically crap.  The difference in fidelity/integrity between a 16-bit, a 14-bit and an 8-bit file is huge; if you understand the maths of bit depth you will appreciate that the difference in descriptive capability of 16 bits of data over 14 bits is simply HUGE ..........

For those disbelievers I seriously suggest you do some homework and a simple test,

Take any RAW file, process it to, say, 3 stops underexposed and to the wrong color balance by, say, 3,000 K, in both 8 bit and 16 bit, then correct the two files to the proper exposure and color, and look at the two histograms...........  Well, if you still feel good giving your client the 8-bit file, I'm really happy, because ultimately it means there is one more lazy shooter out there selling weak files, which means my files will look comparatively better
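That test can be sketched numerically. The smooth ramp, the round-to-grid quantization, and the "divide by 8 / multiply by 8" exposure model below are illustrative assumptions, not any raw converter's actual pipeline:

```python
import numpy as np

# Push a 3-stop underexposure back up, once in an 8-bit working space and
# once in 16-bit, and count the tonal levels that survive the round trip.
signal = np.linspace(0.0, 1.0, 100_000)  # an ideal smooth ramp of tones

def round_trip(signal, bits):
    levels = 2**bits - 1
    darkened = np.round(signal / 8 * levels) / levels  # stored at -3 EV
    corrected = np.clip(darkened * 8, 0.0, 1.0)        # +3 EV correction
    return np.round(corrected * levels) / levels       # re-quantized

for bits in (8, 16):
    n = len(np.unique(round_trip(signal, bits)))
    print(f"{bits}-bit working space keeps {n} distinct levels")
```

The 8-bit round trip collapses the ramp to a few dozen levels (visible combing in a histogram), while the 16-bit round trip keeps thousands.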
Title: Re: 16 Bit Myth
Post by: Snapped on December 29, 2011, 12:33:02 am
Okay, I'll bite. What is the myth?

It refuses to "die" because it's not a myth, it's fact!
Title: Re: 16 Bit Myth
Post by: theguywitha645d on December 29, 2011, 01:07:28 am
Snapped, what camera are you using that produces a 16-bit file?
Title: Re: 16 Bit Myth
Post by: bjanes on December 29, 2011, 01:09:36 am
A 16-bit file is infinitely more robust under heavy PS work (color grading / exposure adjustment). Many 35mm CMOS-chip cameras produce only 14-bit files; even when captured RAW and processed to "16 bit" they are simply interpolated up to 16-bit depth from their native 14-bit capability. As we all know, image interpolation is basically crap.  The difference in fidelity/integrity between a 16-bit, a 14-bit and an 8-bit file is huge; if you understand the maths of bit depth you will appreciate that the difference in descriptive capability of 16 bits of data over 14 bits is simply HUGE ..........

Snapped,

I fear that you have snapped from reality into fantasy, and it is you who does not understand the math of bit depth. It makes no sense to quantize the data into finer steps than the noise (see Emil Martinec (http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/noise-p3.html#bitdepth)). Those extra two bits serve largely to quantize noise. If that is your intent, fine. The per-pixel dynamic range of the best Phase One sensors is no better than that of the Nikon D3x. When rendering a 14-bit raw file into a 16-bit space, interpolation is not performed; the least significant bits are merely padded with zeros. Look at the tonal range of the Phase One IQ180 as measured by DxO -- it is 8.52 bits (screen). That camera simply cannot make use of the full 16-bit range of the ADC.

For those disbelievers I seriously suggest you do some homework and a simple test,

Take any RAW file, process it to, say, 3 stops underexposed and to the wrong color balance by, say, 3,000 K, in both 8 bit and 16 bit, then correct the two files to the proper exposure and color, and look at the two histograms...........  Well, if you still feel good giving your client the 8-bit file, I'm really happy, because ultimately it means there is one more lazy shooter out there selling weak files, which means my files will look comparatively better

Why would anyone want to manipulate output from a 14-bit sensor in an 8-bit space? That is total nonsense. Why don't you perform tests comparing a 14-bit D3x to one of the older 16-bit MFDBs? The D3x would likely come out better. With a severely underexposed image, the MFDB CCD would not fare well because of its high read noise.  Also, the histogram is not the best way to judge image quality: if the gaps in a histogram are not perceptible and the levels are dithered by noise, the image will be fine.

Regards,

Bill
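The zero-padding point above can be shown in a couple of lines; the sample 14-bit codes below are arbitrary illustrations:

```python
# A 14-bit raw value "promoted" to a 16-bit container is left-shifted (or
# left alone) with the new low bits set to zero. Nothing is interpolated
# and no tonal information is created.
raw14 = [0, 1, 8191, 16383]               # sample 14-bit codes (0..2**14-1)
padded16 = [v << 2 for v in raw14]        # same codes on a 16-bit scale
print(padded16)                           # [0, 4, 32764, 65532]
assert all(v & 0b11 == 0 for v in padded16)  # low 2 bits are always zero
```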
Title: Re: 16 Bit Myth
Post by: Snapped on December 29, 2011, 01:13:39 am
Snapped, what camera are you using that produces a 16-bit file?

I didn't say I was using a 16 Bit camera, I'm merely supporting the opinion that a 16 Bit file is superior to an 8 Bit, or 14 Bit interpolated to 16 Bit ..........

The discussion isn't about brands or cameras, just file integrity and robustness....
Title: Re: 16 Bit Myth
Post by: bradleygibson on December 29, 2011, 01:28:49 am
I'm not sure what "myth" the OP refers to exactly, but you have cameras which package 14, 12, or some other number of bits lower than 16 into a 16-bit data structure within the file.  Using anything other than 8- or 16-bit data structures in the file complicates encoding (writing) and later decoding (reading) of the file -- so packing into 16-bit containers is not a marketing game played by manufacturers; it lowers engineering cost and improves encode/decode performance.

Some manufacturers use a limited range (0-2^n, n < 16) while others scale their values to the full range of a 16-bit data structure.  In terms of information, both procedures are legitimate, and no data is lost in either method of storing a chunk of less-than-16-bit data in  a 16-bit data structure.

But herein lies the confusion.  How to differentiate between a 16-bit data structure containing <16 bits of information from the analog-to-digital (ADC) converter, and a 16-bit data structure containing fully 16-bits of data from the ADC?

I'm with Doug--a simple phrase like "true 16-bit" is a reasonable approach for getting the point across quickly.  If the photographer is curious, s/he can ask for more information ("what do you mean by 'true', Doug?") and get a complete answer.  I see nothing wrong or dishonest with this whatsoever.  On the contrary, it informs me that the person I'm talking to just might understand how this stuff works better than the average salesperson, and that is a rare thing.

As for whether the 16-bit ADC output actually contains 16 bits of true information, that is a separate question (clearly it depends on the hardware implementation: there are many sources of noise (I won't re-open that discussion), many different designs for analog-stage processing, different technologies (CMOS and CCD being the primary ones), and many applications, some more demanding than others).  But even if "true 16-bit" data contains noise which effectively lowers the fidelity to fewer than 16 bits, the same can be said of 14- or 12-bit data; in general, they will contain less than 14 or 12 bits of signal respectively, for exactly the same reasons.

So for all intents and purposes, "true 16-bits" should contain more information than 14- or 12-bits, given comparable hardware implementations.  Whether these differences make any visible difference to your work will depend on your hardware and your application.

-Brad
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 29, 2011, 02:07:38 am
Hi,

Sorry, this is nonsense. We are discussing 16-bit vs. 14-bit. Photoshop can handle 8-, 16- and 32-bit data; 16-bit data is actually handled internally as 15-bit.

Interpolation is not used, nor necessary. You can leave the data as it is (MSBs padded with zeroes) or shift the data 1 or 3 bits to the left and zero-pad the LSBs. No interpolation whatsoever. Sixteen-bit data just contains noise in the corresponding bits.

The DxO DR figure in screen mode is a good measure of the utilization of the signal path. Enclosed are the Pentax 645D and Hasselblad H3DII-50; both cameras have a DR around 11.3, meaning that they only utilize about 12 bits of the signal path.

It's like speedometer markings that go to 120 MPH when the car only goes 90 MPH.

Adding the Pentax K5 to the mix, we can see that it actually uses 13.6 bits, so it is actually utilizing its 14-bit signal path fully.

Interestingly enough, it seems that Sony's Exmor sensors may be the first to utilize more than 14 bits. A 24 MP full-frame sensor using the current Exmor design would probably have a DR of 14.1.

Best regards
Erik


A 16-bit file is infinitely more robust under heavy PS work (color grading / exposure adjustment). Many 35mm CMOS-chip cameras produce only 14-bit files; even when captured RAW and processed to "16 bit" they are simply interpolated up to 16-bit depth from their native 14-bit capability. As we all know, image interpolation is basically crap.  The difference in fidelity/integrity between a 16-bit, a 14-bit and an 8-bit file is huge; if you understand the maths of bit depth you will appreciate that the difference in descriptive capability of 16 bits of data over 14 bits is simply HUGE ..........

For those disbelievers I seriously suggest you do some homework and a simple test,

Take any RAW file, process it to, say, 3 stops underexposed and to the wrong color balance by, say, 3,000 K, in both 8 bit and 16 bit, then correct the two files to the proper exposure and color, and look at the two histograms...........  Well, if you still feel good giving your client the 8-bit file, I'm really happy, because ultimately it means there is one more lazy shooter out there selling weak files, which means my files will look comparatively better
Title: Re: 16 Bit Myth
Post by: Stefan.Steib on December 29, 2011, 05:53:36 am
As Bill has already stated, the 16-bit story is just that -- a myth! There is some solid info available from scientific image processing, and even the input struggles to deliver that amount of data, so it absolutely does not make sense to think it will improve overall quality. You can find a good explanation here:

http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/ and the following pages. From this comes the following information (page 3) -- and we are not even speaking about 16 bit!

".....Curiously, most 14-bit cameras on the market (as of this writing) do not merit 14-bit recording. The noise is more than four levels in 14-bit units on the Nikon D3/D300, Canon 1D3/1Ds3 and 40D. The additional two bits are randomly fluctuating, since the levels are randomly fluctuating by +/- four levels or more. Twelve bits are perfectly adequate to record the image data without any loss of image quality, for any of these cameras (though the D3 comes quite close to warranting a 13th bit). A somewhat different technology is employed in Fuji cameras, whereby there are two sets of pixels of differing sensitivity. Each type of pixel has less than 12 bits of dynamic range, but the total range spanned from the top end of the less sensitive pixel to the bottom end of the more sensitive pixel is more than 13 stops, and so 14-bit recording is warranted.

A qualification is in order here -- the Nikon D3 and D300 are both capable of recording in both 12-bit and 14-bit modes. The method of recording 14-bit files on the D300 is substantively different from that for recording 12-bit files; in particular, the frame rate slows by a factor 3-4. Reading out the sensor more slowly allows it to be read more accurately, and so there may indeed be a perceptible improvement in D300 14-bit files over D300 12-bit files (specifically, less read noise, including pattern noise). That does not, however, mean that the data need be recorded at 14-bit tonal depth -- the improvement in image quality comes from the slower readout, and because the noise is still more than four 14-bit levels, the image could still be recorded in 12-bit tonal depth and be indistinguishable from the 14-bit data it was derived from...... "

Noise, Dynamic Range and Bit Depth in Digital SLRs

by Emil Martinec ©2008
last update: February 11, 2008

This whole 16 bit story is a bit like More Megahertz, More Horsepowers, and (excuse me the analogy) more centimeters..........;-)

Greetings from Munich
Stefan
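Martinec's argument can be sketched with synthetic numbers; the signal level and the noise figure below are illustrative assumptions, not measurements of any camera:

```python
import numpy as np

# If read noise spans about four levels on a 14-bit scale, quantizing at
# 12 bits (step = 4) instead of 14 bits (step = 1) changes essentially
# nothing: the noise already dithers the coarser steps.
rng = np.random.default_rng(0)
samples = 1000.0 + rng.normal(0.0, 4.0, 1_000_000)  # signal + read noise

q14 = np.round(samples)           # 14-bit quantization, step 1
q12 = np.round(samples / 4) * 4   # 12-bit quantization, step 4

print(f"mean: {q14.mean():.2f} vs {q12.mean():.2f}")
print(f"std:  {q14.std():.2f} vs {q12.std():.2f}")
```

The two quantizations produce statistically indistinguishable means and nearly identical standard deviations, which is exactly why the extra bits record nothing useful.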

Title: Re: 16 Bit Myth
Post by: LKaven on December 29, 2011, 07:48:13 am
I didn't say I was using a 16 Bit camera, I'm merely supporting the opinion that a 16 Bit file is superior to an 8 Bit, or 14 Bit interpolated to 16 Bit ..........

The discussion isn't about brands or cameras, just file integrity and robustness....
You didn't fully read the beginning of the thread before you wrote in.  The discussion is about how many bits of image data are being recorded by the sensor.  MF camera manufacturers (and dealers) are advertising falsely that their cameras produce 16 bits, when by and large, they produce between 12 and 14 and absolutely no more.  It's actually the newer Sony Exmor sensors used in smaller sensor cameras that do a bit better at just a hair over 14.
Title: Re: 16 Bit Myth
Post by: theguywitha645d on December 29, 2011, 10:45:58 am
I didn't say I was using a 16 Bit camera, I'm merely supporting the opinion that a 16 Bit file is superior to an 8 Bit, or 14 Bit interpolated to 16 Bit ..........

The discussion isn't about brands or cameras, just file integrity and robustness....

Perhaps you should go back and read the OP. This is about using the term "true 16-bit quality" in advertising when the back is not 16-bit.
Title: Re: 16 Bit Myth
Post by: Chris_Brown on December 29, 2011, 11:48:49 am
It's actually the newer Sony Exmor sensors used in smaller sensor cameras that do a bit better at just a hair over 14.

There are no fractions of bits (or "hairs" of bits) in any A/D converter. An analog input signal is either truncated (rounded down) or extended (rounded up) to an integer value.

If given the choice of a sensor & A/D converter system which converts its signal into an 8-bit data set or a 14-bit data set extrapolated to 16-bits, I'll take the 14-bit system. No myth there.
Title: Re: 16 Bit Myth
Post by: digitaldog on December 29, 2011, 11:52:14 am
I am sure they are equiped with parts producing 16 bits data. So writing "true 16 bits" is no more false advertising than Epson writing "true 4800dpi" for their scanners or than YBA writing about the power that their high end amps can handle.

Well in terms of marketing hype (or flat out lies), and using the scanner analogy, we’ve seen for years and years, specs such as 4800x9600 or so forth. Knowledgeable people understand that one value is optical resolution, the other, higher, better sounding marketing driven value is interpolated resolution. But we still see the two values shown which is a bit of marketing hype IMHO. Now if indeed the optical resolution isn’t really 4800ppi in this example, someone is flat out lying!

Quote
The question is whether these 16 bits include more useful data than a 14 bits pipe. There are little evidence pointing to a yes. Am I saying that backs do not have smoother transitions? Nope. I am saying than even if they do, the true reason is not the bit depth the inaging pipe can handle, but more likely a combination of CCD sensor and the quality of the adc parts used.

Agreed! For me, the differentiation is between a product that can capture only 8 bits per color and one that can produce more (12, 14 or 16 bits; it doesn't really matter to me). And I'd agree that suggesting a 14-bit capture alone will produce superior data to 12 bits, without looking at lots of other factors in the capture, is silly. Let's see, does anyone really think a "true" 16-bit single-capture device is going to give a "true" 12-bit scanning back a run for its money?
Title: Re: 16 Bit Myth
Post by: theguywitha645d on December 29, 2011, 12:18:43 pm
The best defense against marketing is an educated consumer. Personally, I look on a company with suspicion if they are using fuzzy facts. I am less likely to buy from them--if they are willing to stretch the truth before they have my money, how are they going to act after they have it?
Title: Re: 16 Bit Myth
Post by: madmanchan on December 29, 2011, 12:43:28 pm
The problem is the claim "true 16 bit quality" is very broad/vague.  What aspect(s) of the imaging pipeline does that refer to?

For example, many modern cameras' internal raw-to-JPEG conversion engines use 16-bit intermediate math.  This results in smoother gradations and less likelihood of artifacts during the rendering steps.  This is very different from, say, getting 16 bits of raw data from the sensor, or having the noise floor of the sensor be roughly 2^-16.
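The value of 16-bit intermediate math can be sketched with a toy adjustment chain; the gain values and the round-to-grid quantization below are illustrative, not any camera's actual conversion engine:

```python
import numpy as np

# Run the same chain of tone adjustments while quantizing each
# intermediate result to 8 or 16 bits, and compare both against a
# near-lossless 32-bit reference.
x = np.linspace(0.0, 1.0, 10_000)

def pipeline(x, bits):
    q = 2**bits - 1
    for gain in (1.3, 0.8, 1.1, 0.95):              # four tone tweaks
        x = np.round(np.clip(x * gain, 0.0, 1.0) * q) / q
    return x

err8 = np.abs(pipeline(x, 8) - pipeline(x, 32)).max()
err16 = np.abs(pipeline(x, 16) - pipeline(x, 32)).max()
print(f"max error with 8-bit intermediates:  {err8:.6f}")
print(f"max error with 16-bit intermediates: {err16:.8f}")
```

The 8-bit intermediates accumulate rounding error orders of magnitude larger than the 16-bit ones, which is why wide intermediate math yields smoother gradations even from 12- or 14-bit raw data.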

Title: Re: 16 Bit Myth
Post by: LKaven on December 29, 2011, 12:43:52 pm
There are no fractions of bits (or "hairs" of bits) in any A/D converter. An analog input signal is either truncated (rounded down) or extended (rounded up) to an integer value.

If given the choice of a sensor & A/D converter system which converts its signal into an 8-bit data set or a 14-bit data set extrapolated to 16-bits, I'll take the 14-bit system. No myth there.
The fractions of bits I was referring to were measurements of dynamic range, which gives us the quantity of information and tells us how many physical bits will be needed to encode it.  If a sensor delivers 13.7 bits of information (as in dynamic range), it can be accommodated in 14 physical bits.

Extrapolating to 16 bits would be pointless however.  
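The bits-from-dynamic-range arithmetic looks like this; the full-well and read-noise figures below are made-up illustrations, not measurements of any particular sensor:

```python
import math

# Engineering dynamic range in stops (= bits of information): full-well
# capacity over read noise, both in electrons.
def dr_bits(full_well_e, read_noise_e):
    return math.log2(full_well_e / read_noise_e)

print(round(dr_bits(60000, 3), 1))   # ~14.3: fits in 14 physical bits
print(round(dr_bits(40000, 13), 1))  # ~11.6: a 16-bit container is padding
```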
Title: Re: 16 Bit Myth
Post by: Bryan Conner on December 29, 2011, 01:06:46 pm
The best defense against marketing is an educated consumer. Personally, I look on a company with suspicion if they are using fuzzy facts. I am less likely to buy from them--if they are willing to stretch the truth before they have my money, how are they going to act after they have it?

I agree 100%.  It is a shame that today's business world is full of people and companies that spin partial truths or stretch the truth, all in the name of making an extra sale.  They are choosing that extra sale over another happily satisfied customer, betting that the customer will not be educated enough to see the missing truths in their marketing-speak.

Instead of only relying on the quality of their product and the testimonies of existing customers, they think that they must pretend to be a sleazy used car salesman in order to succeed. 

If the output file of the camera is not truly 16 bit, then do not call it true 16 bit...unless you fully explain what you mean by "true 16 bit".  Oh, and print your explanation in a font size that is readable...without a microscope.

I have no idea who the dealer or company the OP is referring to, so I am not directing my opinion at any particular party.
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 29, 2011, 03:11:05 pm
The discussion isn't about brands or cameras, but simply file integrity and robustness....

Robustness vs bit depth strongly depends on the presence of noise. As Bill pointed out, if your bit depth is higher than the noise level requires, you are wasting resources because you are encoding more bits than strictly necessary, making your files larger with no advantage. Of course, from a marketing point of view it can still be a good idea to fool uninformed users like you.

Left image is 8-bit, right image is 5-bit. Thanks to noise both have the same robustness against postprocessing:
(http://www.guillermoluijk.com/article/ettr3/poster.gif)
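The effect can be reproduced with a few lines of numpy (a rough sketch on synthetic data, not the actual poster images): a smooth ramp plus noise, quantized to 8 and to 5 bits. Because the noise amplitude exceeds the 5-bit quantization step, local averaging recovers the ramp about equally well from either version.

```python
import numpy as np

# Rough sketch of the comparison above (synthetic data, not the actual
# poster images): a smooth ramp plus noise, quantized to 8 and to 5 bits.
# Since the noise amplitude exceeds the 5-bit step (1/32), local averages
# recover the ramp about equally well from either version.
rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 256)                         # ideal gradient
noisy = ramp[None, :] + rng.normal(0, 0.05, (2000, 256))  # sigma > 5-bit step

q8 = np.round(np.clip(noisy, 0, 1) * 255) / 255           # 8-bit quantization
q5 = np.round(np.clip(noisy, 0, 1) * 31) / 31             # 5-bit quantization

err8 = np.abs(q8.mean(axis=0) - ramp).max()
err5 = np.abs(q5.mean(axis=0) - ramp).max()
print(f"max averaged error -- 8-bit: {err8:.4f}, 5-bit: {err5:.4f}")
```

The noise is acting as dither: it trades level resolution against spatial (or here, sample) resolution.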


BTW, did you know Photoshop is a 15-bit tool? Bad news: it is. It works with half the levels a genuine 16-bit file can encode. This (http://www.guillermoluijk.com/curious/hist16bits/linearhisps.gif) is what any Photoshop histogram looks like when observed at 1:1 zoom. Is this a problem? No way; 15 bits is still more than enough to produce fantastic images.

Regards


Title: Re: 16 Bit Myth
Post by: LKaven on December 29, 2011, 03:14:08 pm
I have no idea who the dealer or company the OP is referring to, so I am not directing my opinion at any particular party.
It seems to be pervasive.  Even the Pentax literature touts the 645D as being 16-bit.  I've been surprised at times at the number of accomplished photographers who write intelligently here and who also believe it.  It's evidence that there is an interest in keeping people believing it.
Title: Re: 16 Bit Myth
Post by: fredjeang on December 29, 2011, 03:43:32 pm
When you inject material into Nuke, it's automatically converted to 32-bit float. Like it or not, no choice.
The gamma is disabled, although of course you see it in the viewer with a LUT for proper viewing.
Title: Re: 16 Bit Myth
Post by: BJL on December 29, 2011, 03:54:55 pm
When you inject material into Nuke, it's automatically converted to 32-bit float. Like it or not, no choice.
The gamma is disabled, although of course you see it in the viewer with a LUT for proper viewing.
The IEEE 754 single-precision mantissa is still 23 bits (24 with the implicit leading bit), so no need to worry about it hurting the resolution of your raw data ... even if some future camera does deliver 16 significant bits of signal information.
Title: Re: 16 Bit Myth
Post by: Schewe on December 29, 2011, 04:06:45 pm
BTW did you know Photoshop is a 15-bit tool? bad news, it is.

Actually, it's 15 bits plus one level...done for algorithmic processing reasons. And since there really isn't a real life source of full 16 bit images, that's all the precision Photoshop needs.
Title: Re: 16 Bit Myth
Post by: fotometria gr on December 29, 2011, 04:31:02 pm
I don't know Mr Peterson but I can agree with the above.


+1 Theodoros. www.fotometria.gr
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 29, 2011, 04:56:34 pm
A photon is either present or not. At least according to my rusty physical knowledge.

A hypothetical "photon counting" sensor that had high enough spatial resolution that each sensel in practice either saw 1 or 0 photons (and practically never more than 1) would need no more than a 1-bit ADC to capture every nuance of tonality and DR there is to capture.

When there is noise present (either naturally or artificially), the boundary between spatial resolution and "level resolution" gets blurry. Newspapers and inkjet printers deliver high-quality gradations by varying the density of dots that are either "colored" or "non-colored". For this to work, the spatial resolution must be high enough that level information can be "encoded" into high-frequency noise that is either invisible or less annoying.

The wikipedia article on dithering is quite enlightening:
http://en.wikipedia.org/wiki/Dithering
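A toy numpy model of such a hypothetical 1-bit sensor shows the idea (invented parameters, purely for illustration): each sensel records 0 or 1, and binning blocks of sensels recovers smooth tone, exactly as in dithered printing.

```python
import numpy as np

# Toy model of the hypothetical photon-counting sensor described above:
# each sensel records 0 or 1 photons, and binning blocks of sensels
# recovers a smooth tonal estimate, exactly as in dithered printing.
rng = np.random.default_rng(1)
true_intensity = 0.37                       # fraction of sensels hit by a photon
sensels = rng.random((2048, 2048)) < true_intensity      # 1-bit sensel array

# Bin 64x64 sensel blocks into one output pixel each
blocks = sensels.reshape(32, 64, 32, 64).mean(axis=(1, 3))
print(f"recovered mean tone: {blocks.mean():.4f} (true value {true_intensity})")
```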
Title: Re: 16 Bit Myth and the DR of chemical (film) pixels
Post by: BJL on December 29, 2011, 05:10:45 pm
A hypothetical "photon counting" sensor that had high enough spatial resolution that each sensel in practice either saw 1 or 0 photons (and practically never more than 1) would need no more than a 1-bit ADC to capture every nuance of tonality and DR there is to capture.

The wikipedia article on dithering is quite enlightening:
http://en.wikipedia.org/wiki/Dithering
Indeed: as I am fond of saying, film used 1-bit pixels, since each silver-halide clump either reacted to light or not, and yet (sometimes with the help of blurring through dye-cloud diffusion) film achieves more DR than "1-bit" suggests. The moral being that, with any practically relevant meaning of the words, the dynamic range of a printed image is not at all limited by the "engineering" DR of the individual pixels.

Thanks for the reference on dithering: it gets discussed far, far less around here than resolution or dynamic range, but deserves some attention. Maybe Michael could find someone to write an essay on dithering for this site? Maybe he could invite one or two of our more technically competent forum participants to contribute?
Title: Re: 16 Bit Myth
Post by: LKaven on December 29, 2011, 05:14:37 pm
Reminds me of Fossum's proposal for a "gigapixel" sensor with sub-diffraction-limit "jots" -- one photon per sensel.  Since he discusses similar issues, I'm linking to the paper:

http://ericfossum.com/Publications/Papers/Gigapixel%20Digital%20Film%20Sensor%20Proposal.pdf
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 29, 2011, 05:16:08 pm
Hi,

You are absolutely right! The correct interpretation (in my humble opinion) would be that DR (defined as FWC/readout noise) would exceed 14 bits on a full-frame 24 MP Exmor sensor using today's technology. That may necessitate a fifteenth bit.

Best regards
Erik


There are no fractions of bits (or "hairs" of bits) in any A/D converter. An analog input signal is either truncated (rounded down) or extended (rounded up) to an integer value.

If given the choice of a sensor & A/D converter system which converts its signal into an 8-bit data set or a 14-bit data set extrapolated to 16-bits, I'll take the 14-bit system. No myth there.
Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 29, 2011, 05:21:18 pm
- The statement is true (the A/D convertor is 16 bit).
- The implied advantage is true (that the file has more tonal smoothness and holds up to post processing better).

The specifics of the entire imaging chain and the usefulness of comparing using a 16 bit vs. 14 bit convertor in any given camera are quite complex and include proprietary and very in-depth knowledge (e.g. are all 14 bit A/D convertors identical in impact on image quality when placed into an imaging chain, are there any additional effects that need to be understood when adding in proprietary dark-frame subtraction technology, do all A/D convertors produce identical amounts of heat and are all available in the same physical sizes/shapes, are competing "14" bit systems using all of their "14" bits fruitfully).

We frequently have very in-depth technical conversations (as full and frank as we have knowledge/experience to offer) with the customers who are very technically minded and interested in such things.

Usually though we simply send raw files or have the client test (in the real world) the camera they are considering so they can make their own conclusions.

Bottom line: if someone buys a digital back because they have been led by the graphic to expect better tonality and file robustness, they have not been misled. So I think for marketing purposes it's a fair distillation of the product features.

However, this thread has convinced me the use of the word "true" was not the right way to go. I've updated the graphic (http://www.captureintegration.com/) accordingly.

Doug Peterson (e-mail Me) (doug@captureintegration.com)
__________________

Head of Technical Services, Capture Integration
Phase One Partner of the Year
Leaf, Leica, Cambo, Arca Swiss, Canon, Apple, Profoto, Broncolor, Eizo & More

National: 877.217.9870  |  Cell: 740.707.2183
Newsletter (http://www.captureintegration.com/our-company/newsletters/) | RSS Feed (http://www.captureintegration.com/2008/08/11/rss-feeds/)
Buy Capture One 6 at 10% off (http://www.captureintegration.com/phase-one/buy-capture-one/)


P.S. The end of the year is extraordinarily busy for us (tax law in the US means a lot of photographers make equipment purchases at the end of the year) so I will likely not be on LL the next few days. Please don't interpret this as avoidance - I think we (Capture Integration) have shown repeatedly we want to be a transparent and honest dealer.
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 29, 2011, 05:37:39 pm
The specifics of the entire imaging chain and the usefulness of comparing using a 16 bit vs. 14 bit convertor in any given camera are quite complex and include proprietary and very in-depth knowledge
Did anyone try to set the 2 LSBs to zero, or replace them with random noise, prior to raw development? If that shows no visible difference from the untouched file when both are developed using identical raw development, that would be an indication that bits 15-16 are of little importance.

Of course, one could carry out measurements using suitable targets for those two cases, or view the difference-file (to see if there are recognizable image features in the difference).

I'd say you can make a pretty good case for or against 16-bit files playing a significant role in any given camera using such methods. Is it worth it, or will the easily fooled be fooled, while the sceptics remain ... sceptical? I don't know.

-h
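On synthetic data the proposed test looks like this (a sketch with invented numbers, not a real raw file): zero the two LSBs of "16-bit" values whose noise spans dozens of levels, and the change is buried far below the noise floor.

```python
import numpy as np

# Sketch of the proposed test on synthetic data (invented numbers, not a
# real raw file): zero the two LSBs of "16-bit" values whose noise spans
# dozens of levels, and the change disappears under the noise floor.
rng = np.random.default_rng(2)
signal = rng.normal(20000, 50, 100000)       # flat patch, read noise ~50 DN
raw16 = np.clip(np.round(signal), 0, 65535).astype(np.uint16)

truncated = raw16 & ~np.uint16(3)            # force bits 15-16 to zero

# Truncation changes each value by at most 3 DN, versus ~50 DN of noise
print("max change from truncation:", int((raw16 - truncated).max()), "DN")
print("noise std dev:", round(float(raw16.std()), 1), "DN")
```

On a real file one would do this to the raw values before demosaicing and compare the two developed results, as suggested above.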
Title: Re: 16 Bit Myth
Post by: BJL on December 29, 2011, 05:38:42 pm
- The implied advantage is true (that the file has more tonal smoothness and holds up to post processing better).
More tonal smoothness than using the same sensor (with at most 12-13 stops of DR in the case of all DMF sensors I know of) but with a 14-bit ADC, as is the case in some DMF backs?

Or more than with some 35mm format DSLR's with 14-bit ADC output but reportedly greater per pixel DR, when the images are compared at equal PPI (say with crops to equal pixel count)?

I refer to equal PPI and equal pixel count crops to factor out the legitimate advantage of having more pixels, and so getting equal print size with higher PPI: that is a legitimate IQ advantage, but is nothing to do with recording individual pixel output signals that have at most 13 stop DR using 16 rather 14 bits; it is instead related to the benefits of dithering.
Title: Re: 16 Bit Myth
Post by: LKaven on December 29, 2011, 06:33:56 pm
- The statement is true (the A/D convertor is 16 bit).
- The implied advantage is true (that the file has more tonal smoothness and holds up to post processing better).

What implied advantage?  There is no implication from a 16 bit A/D converter to more tonal smoothness.  There are many other factors that are responsible for that quality.  

Quote
The specifics of the entire imaging chain and the usefulness of comparing using a 16 bit vs. 14 bit convertor in any given camera are quite complex and include proprietary and very in-depth knowledge (e.g. are all 14 bit A/D convertors identical in impact on image quality when placed into an imaging chain, are there any additional effects that need to be understood when adding in proprietary dark-frame subtraction technology, do all A/D convertors produce identical amounts of heat and are all available in the same physical sizes/shapes, are competing "14" bit systems using all of their "14" bits fruitfully).

With a read noise of 2e- for the Exmor sensors, not to mention a higher QE, I'd say they are.  Otherwise, I'd expect to see your MF sensors perform nicely at ISO 6400, without Sensor+.  

Quote
We frequently have very in-depth technical conversations (as full and frank as we have knowledge/experience to offer) with the customers who are very technically minded and interested in such things.

Usually though we simply send raw files or have the client test (in the real world) the camera they are considering so they can make their own conclusions.

There are lots of good reasons to buy your products, true reasons.

Quote
Bottom line: if someone buys a digital back because they have been led by the graphic to expect better tonality and file robustness, they have not been misled. So I think for marketing purposes it's a fair distillation of the product features.

I've been led to expect only one thing: that there are 16 bits of beneficial image information to be gained by buying your camera.  Well, is there or isn't there?  

Reductio ad absurdum argument: Why not have 24 bit A/D converters and advertise that?  If people are led by that -- falsely -- to expect better tonality and file robustness, they have surely not been misled?  Either there is a benefit or there isn't, and if you are implying that there is one when there isn't, you are culpable.  

Quote
However, this thread has convinced me the use of the word "true" was not the right way to go. I've updated the graphic (http://www.captureintegration.com/) accordingly.

True as opposed to what?  
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 29, 2011, 07:53:26 pm
Did anyone try to set the 2 LSBs to zero, or replace them with random noise, prior to raw development? If that shows no visible difference from the untouched file when both are developed using identical raw development, that would be an indication that bits 15-16 are of little importance.

I have done that with Canon 40D and Pentax K5 RAW files, comparing 12 vs 14 bits in those cases. For the Canon, the 2 extra bits were irrelevant; they added nothing to image quality because the noise is well above one 12-bit quantisation step even at ISO 100. For the Pentax K5, the 2 extra bits were indeed useful, avoiding posterization that began to show up when performing 12-bit development.

Since no digital back has lower noise in the deep shadows (read noise) than the K5 at a pixel level, I dare to conclude those extra 2 bits from 16-bit digital backs are just marketing, as were the 14 bits in the Canon 40D. But it's still very cool to have a 16-bit machine!!!

If anyone is interested I can upload a version of DCRAW with a bits decimation option to do personal tests.

Regards
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 29, 2011, 07:58:56 pm
Since no digital back has lower noise in the deep shadows (read noise) than the K5 at a pixel level, I dare to conclude those extra 2 bits from 16-bit digital backs are just marketing, as were the 14 bits in the Canon 40D.
Did you also try different ISO values to see if in-camera processing of raw was able to benefit from more bits than the sensor appears to make use of in itself?

ISO 3200 might be a straight "bit-shift left" of ISO 1600, but is it certain that no other ISO values are obtained by multiplying the data with values < 1?

-h
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 29, 2011, 08:04:16 pm
Did you also try different ISO values to see if in-camera processing of raw was able to benefit from more bits than the sensor appears to make use of in itself?

Nope. Anyway, I don't think there is in-camera RAW processing in recent cameras; at least I never found evidence of it (only the well-known Sonys, and Sony quickly allowed the user to disable this NR via firmware).

Title: Re: 16 Bit Myth
Post by: deejjjaaaa on December 30, 2011, 01:37:17 am

Reductio ad absurdum argument: Why not have 24 bit A/D converters and advertise that?

Pentax K10D (Sony 10mp CCD) once was ad'd as having 22-bit A/D  :)
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 30, 2011, 03:42:05 am
Nope. Anyway, I don't think there is in-camera RAW processing in recent cameras; at least I never found evidence of it (only the well-known Sonys, and Sony quickly allowed the user to disable this NR via firmware).
Surely, the ISO values of >1600 tend to be obtained by digital multiplication, at least in Canon cameras?

My point was that perhaps also 1/3 stop intermediate ISO values are obtained by multiplying a limited number of analog gain values by 4/3, 2/3 etc?

-h
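The multiplication hypothesis is in principle easy to check, because digital gain by a non-integer factor leaves characteristic gaps in the raw histogram (a sketch with synthetic values, not real raw data):

```python
import numpy as np

# Sketch of how one could check this (synthetic values, not real raw
# data): digital gain by a non-integer factor like 4/3 re-rounds the
# integer levels and leaves periodic gaps in the histogram, whereas true
# analog gain populates every level.
levels = np.arange(3000)                          # every input level occupied
pushed = np.round(levels * 4 / 3).astype(int)     # digital "+1/3 stop"

hist = np.bincount(pushed, minlength=4000)
occupied = np.count_nonzero(hist)
print(f"output levels occupied: {occupied} of {hist.size}")  # every 4th is empty
```

A comb like this in a real raw histogram is the usual evidence that an intermediate ISO was produced by multiplication rather than analog gain.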
Title: Re: 16 Bit Myth
Post by: Bryan Conner on December 30, 2011, 03:45:26 am
Pentax K10D (Sony 10mp CCD) once was ad'd as having 22-bit A/D  :)

You are correct! So, does this mean that the modern MF digital backs that are advertised as 16 bit are not capable of producing files of the same (or better) quality as the Pentax K10D from 2006? What happens in the future when, and if, a fully 16-bit back is introduced? Are the advertisements going to state "now we finally really have a full, complete 16-bit image pipeline!"? That would undercut the impression put forth in the current advertisements.

I think that it would be smarter to simply state that the image quality speaks for itself.  Make sample raw files downloadable and invite potential customers to evaluate them.  Also, allow customers to use a demo back at the store and take the files home with them.  There is no use in playing word games/mind games with customers, especially with the high quality of Phase One digital backs.  It reminds me too much of snake-oil/sleazy-used-car-salesman tactics.
Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 30, 2011, 07:47:05 am
I think that it would be smarter to simply state that the image quality speaks for itself.  Make sample raw files downloadable and invite potential customers to evaluate them.  Also, allow customers to use a demo back at the store and take the files home with them.  There is no use in playing word games/mind games with customers, especially with the high quality of Phase One digital backs.  It reminds me too much of snake-oil/sleazy-used-car-salesman tactics.

Can I assume that you've never dealt with us (Capture Integration) as a customer before?

Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 30, 2011, 07:50:47 am
For clarity this is the advertisement in question:

(http://www.captureintegration.com/wp-content/uploads/CI75_01-LeafAptus223.jpg) (http://www.captureintegration.com/)
Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 30, 2011, 07:58:02 am
Again, I'm open for suggestions.

The problem to be solved is explaining that the 22mp files from this camera have better tonal smoothness and better ability to hold up to post processing than the 5DII that 90%+ of our potential clients assume is the same because it also has 22ish megapixels. That must be done in about 15 characters and in a way that 90% of people viewing the ad will understand instantly and without further (immediate) explanation (they can of course always ask for more details when they contact us).

Brainstorming here:
"Large Sensor"
"Flexible Raw File"
"Great Tones"
"Smooth Tones"
"Beautiful Files"

None of those strike me as a great substitution. But if ya'll feel strongly one of them does a better job solving the marketing problem outlined above let me know.
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 30, 2011, 08:02:56 am
None of those strike me as a great substitution. But if ya'll feel strongly one of them does a better job solving the marketing problem outlined above let me know.

"N stops of Dynamic range*)"?
*)According to DXO "display" measurements
Title: Re: 16 Bit Myth
Post by: Doug Peterson on December 30, 2011, 08:09:15 am
"N stops of Dynamic range*)"?
*)According to DXO "display" measurements

A great suggestion but:
- I don't agree that the DXO measurements fully explain the useable range captured (two raw files can have a similar measure but in one the shadows are clumpy, tonally choppy, and inaccurate in color while the other is gaussian/film-like, tonally smooth, and accurate in color - the difference between quantitative engineering measurement and qualitative photographic aesthetics)
- DXO measurements aren't available for all cameras (especially legacy systems like this one)

"High Dynamic Range" maybe, except that photographers now largely associate that phrase with ugly HDR abominations on Flickr.
Title: Re: 16 Bit Myth (P45 vs Pentax K5)
Post by: ErikKaffehr on December 30, 2011, 08:39:18 am
Hi,

Marc McCalmont, one of the frequent posters on this forum, posted raw images from his Pentax K5 and his P45. My understanding is that he felt the Pentax K5 had better image quality.

I checked those image using Lightroom and this is what I found:
http://www.luminous-landscape.com/forum/index.php?topic=50895.msg421886#msg421886

Best regards
Erik


A great suggestion but:
- I don't agree that the DXO measurements fully explain the useable range captured (two raw files can have a similar measure but in one the shadows are clumpy, tonally choppy, and inaccurate in color while the other is gaussian/film-like, tonally smooth, and accurate in color - the difference between quantitative engineering measurement and qualitative photographic aesthetics)
- DXO measurements aren't available for all cameras (especially legacy systems like this one)

"High Dynamic Range" maybe, except that photographers now largely associate that phrase with ugly HDR abominations on Flickr.
Title: Re: 16 Bit Myth (P45 vs Pentax K5)
Post by: BernardLanguillier on December 30, 2011, 08:59:29 am
Marc McCalmont, one of the frequent posters on this forum, posted raw images from his K5 and his P45. My understanding is that he felt that the K5 had better image quality.

At least it seems pretty clear that the 5DII, although it is a very successful camera widely used by pros, is now pretty far from being the best DSLR available.

So measuring the gap between backs and DSLRs by using the 5DII as a reference seems a bit unfair.

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 30, 2011, 09:06:14 am
A great suggestion but:
- I don't agree that the DXO measurements fully explain the useable range captured (two raw files can have a similar measure but in one the shadows are clumpy, tonally choppy, and inaccurate in color while the other is gaussian/film-like, tonally smooth, and accurate in color - the difference between quantitative engineering measurement and qualitative photographic aesthetics)
We cannot always quantify what we feel is true. In those cases, instead of resorting to "amazing true (tm) 16-bit quality", I would prefer statements such as "used by professionals world-wide", or "in a side-by-side comparison on luminous-landscape.com, 42 out of 51 voters preferred camera X over camera Y".

-h
Title: Re: 16 Bit Myth (P45 vs Pentax K5)
Post by: ErikKaffehr on December 30, 2011, 09:07:39 am
Hi Bernard,

I don't get your point; the K5 is the Pentax K5. I've updated the posting to be clearer on the issue.

Anyway, Marc posted comparable raw images from his P45 (Phase One) and K5 (Pentax). Highlight detail was similar, but shadow detail was much better on the Pentax.

Best regards
Erik


At least it seems pretty clear that the 5DII, although it is a very successful camera widely used by pros, is now very far from being the best DSLR available.

So measuring the gap between backs and DSLRs by using the 5DII as a reference seems a bit unfair.

Cheers,
Bernard

Title: Re: 16 Bit Myth
Post by: Bryan Conner on December 30, 2011, 09:33:57 am
Can I assume that you've never dealt with us (Capture Integration) as a customer before?



You are correct. I have never dealt with you before. I intended no slight on you or your service. My point is that, with the level of quality that both you and your products exhibit, I do not think you need to spin the truth at all. I apologize if I was not clear about my intention.
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 30, 2011, 09:53:23 am
bits of cold... after a bit of editing.

(http://farm8.staticflickr.com/7156/6604151313_2b12641584_o.jpg)
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 30, 2011, 10:05:44 am
Hi,

Nice for you to have some winter in Japan; we don't have any here in Sweden.

You have sold your D3X? What are you using now?!

Best regards
Erik

bits of cold...

(http://farm8.staticflickr.com/7003/6600520373_d5f4e39b0c_o.jpg)

Title: Re: 16 Bit Myth
Post by: BernardLanguillier on December 30, 2011, 10:07:48 am
Hi,

Nice for you to have some winter in Japan; we don't have any here in Sweden.

You have sold your D3X? What are you using now?!

An IQ180.

Kidding, a D7000. Very nice little camera I have to say. Works wonders when stitching.

Cheers,
Bernard
Title: Re: 16 Bit Myth
Post by: deejjjaaaa on December 30, 2011, 10:30:50 am
You are correct!  So, does this mean that the modern MF digital backs, that are advertised to be 16 bit, are not capable of producing files of the same (or better) quality as the Pentax K10D from 2006?
Pentax had the decency to stop.
Title: Re: 16 Bit Myth
Post by: theguywitha645d on December 30, 2011, 11:04:16 am
Again, I'm open for suggestions.

The problem to be solved is explaining that the 22mp files from this camera have better tonal smoothness and better ability to hold up to post processing than the 5DII that 90%+ of our potential clients assume is the same because it also has 22ish megapixels. That must be done in about 15 characters and in a way that 90% of people viewing the ad will understand instantly and without further (immediate) explanation (they can of course always ask for more details when they contact us).

Brainstorming here:
"Large Sensor"
"Flexible Raw File"
"Great Tones"
"Smooth Tones"
"Beautiful Files"

None of those strike me as a great substitution. But if ya'll feel strongly one of them does a better job solving the marketing problem outlined above let me know.

How about:

Professional IQ
Amazing IQ
Commercial IQ
Medium-format IQ
5D Killer
Can't believe it ain't 16-bit

I would go for "large sensor" or "2X the 35mm area" (or whatever the area difference is), or simply the sensor dimensions. One phrase in an ad is not going to sell this back. I would imagine the customer who comes to you is more interested in the sensor size anyway--or why go to MFD? Believe it or not, I will not buy this back on a simple ad. I think the word that is going to get the customer to call you is "affordable."

Title: better buzz-phrase for the DMF advantage
Post by: BJL on December 30, 2011, 11:17:57 am
Again, I'm open for suggestions.

The problem to be solved is explaining that the 22mp files from this camera have better tonal smoothness and better ability to hold up to post processing ... That must be done in about 15 characters and in a way that 90% of people viewing the ad will understand instantly and without further (immediate) explanation

Brainstorming here:
"Large Sensor"
...
That first suggestion is probably the main root cause, but how about

More pixels for higher PPI printing and smoother tonal dithering

given that I see no evidence of a per pixel advantage for DMF backs over the state-of-the-art from Sony and such.

However, this slogan assumes that people paying four-figure prices for digital cameras know a bit about dithering and such, which I think they should, but ...

You could also talk about the inherent larger format lens performance advantages, in lp/ph resolution and such, but that seems out of fashion!

P. S. I added "smoother tonal" to my slogan, since that seems to be a selling point of interest to you.
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 30, 2011, 11:31:09 am
Surely, the ISO values of >1600 tend to be obtained by digital multiplication, at least in Canon cameras?

My point was that perhaps also 1/3 stop intermediate ISO values are obtained by multiplying a limited number of analog gain values by 4/3, 2/3 etc?

In older Canons, ISO 3200 and above were obtained from ISO 1600 data multiplied before encoding the RAW file. In some newer ones that happens from ISO 6400. In any case, ISOs above 1600, no matter whether they are analogue or digital, add nothing for the RAW shooter because SNR doesn't improve any more.

By processing I meant applying noise reduction to the RAW data. The digital ISOs (and this includes the +1/3 and +2/3 values) are of no interest to the RAW shooter.

Title: Re: 16 Bit Myth
Post by: HCHeyerdahl on December 30, 2011, 11:44:25 am
Again, I'm open for suggestions.

The problem to be solved is explaining that the 22mp files from this camera have better tonal smoothness and better ability to hold up to post processing than the 5DII that 90%+ of our potential clients assume is the same because it also has 22ish megapixels. That must be done in about 15 characters and in a way that 90% of people viewing the ad will understand instantly and without further (immediate) explanation (they can of course always ask for more details when they contact us).

Brainstorming here:
"Large Sensor"
"Flexible Raw File"
"Great Tones"
"Smooth Tones"
"Beautiful Files"

None of those strike me as a great substitution. But if ya'll feel strongly one of them does a better job solving the marketing problem outlined above let me know.


As a person considering the jump to MF, my take is that your marketing ought to reflect reality. The thing is, when you tout 16 bit as something that makes MF stand out from FF DSLRs, and threads on several forums find this questionable, I as a potential customer begin to wonder if the whole thing is snake oil. I have spent quite some time trying to sort this out and I am not finished yet... ;)

Now, from what I have managed to pick up from numerous threads, it appears that it is the very size of the sensor itself, coupled with stricter tolerances in every component, that gives MF its alleged superior image quality. If this is actually the case, then it is VERY interesting for a potential customer, since it will not be available in a future FF DSLR. Of course DSLRs will continue to evolve, but they will not grow bigger sensors. I also find it unlikely that they will tighten their quality tolerances. Hence, at a given purchase date, the benefits of an MF investment will probably hold up against future FF DSLRs for more than one or two generations (btw, a great wife argument!).

I really don't have a ready 15-char slogan for you, but something along the lines of "Built to MF quality tolerances" or "Designed for MF quality" would, at least to me, suggest that the quality of MF has something to do with the quality of the entire system. Sure, lots of people will contest this or similar assertions, but at least we will not have threads from people with superior technical knowledge in image processing, etc., more or less PROVING your (16-bit) advantage wrong.

Hey, OK! Maybe I still haven't got it - just my two (amateurish) cents trying to be helpful ;D.

Christopher
Title: Re: 16 Bit Myth
Post by: Slobodan Blagojevic on December 30, 2011, 01:26:51 pm
... the inherent larger format lens performance advantages, in lp/ph resolution and such, but that seems out of fashion!...

Back in the film days (i.e., the last time I was paying attention to this), larger format lenses were, if anything, inferior to 35mm format lenses in terms of lp/mm resolution. However, given the need for less magnification, they could get away with it and still end up with superior results in print. Is the situation different nowadays with digital?
Title: Re: 16 Bit Myth
Post by: fredjeang on December 30, 2011, 01:41:10 pm
It seems to me that something important is often missing in those discussions that involve tech.

Yes, an MF lens is normally inferior in lp/mm, but with digital the need for tighter, less "vague" standards is important - for MF as for 35mm or m4/3 or even the newest Nikon format.

So theoretically, a lens that was designed for film, with much more "room" than digital allows, should perform only so-so on a digital camera.

But reality shows that it's far from being a golden truth.

The missing part, for me, is that I rarely hear anyone talking about light. Light knowledge - understanding light - is everything. It's key. Everybody is concerned about sharpness and resolution, which are the least relevant factors when it comes to imagery.

Most of the time you see people following lens and sensor tests and specs. But each lens is a world in itself, and each will excel under certain light conditions. This understanding is too often missed. I have a 30-euro Russian pancake lens called an Industar. It's a plastic lens. It can be extremely bad, or, if you learn how to use it and in which conditions it shines, it can be very good.

And yeah, it's not a glamour lens, it won't impress the gallery. But I know at least one international photographer who uses it for B&W output.

Remember when Graham did a test with a vintage Rollei lens on an IQ back? Did it lack sharpness and resolution?

I strongly recommend having a look at Cooter's and Mark Tucker's websites.

I spent all yesterday afternoon with Mark Tucker's blog, which I hadn't visited for some time, and it was a pure delight. He shoots with everything, and he bloody well knows what he is doing. More importantly, he catches you.
Mark Tucker couldn't care less about surgical precision in images; in fact it's like he escapes that trap on purpose, and he obviously knows very well about MF, LF and digital retouching. (I'm not speaking for him, but that's the impression he transmits.) I saw some movies he did and, yeah, content is king.
Then I visited some of Cooter's movies too, and I was blown away again by the car race - high ISOs, grainy, not ultra sharp, highlights blown on purpose (where is the DR so many people are concerned with?) - but what a great piece of motion! It gave me some enthusiasm and a boost to persevere with motion.
When tech parameters enter into consideration, it means the image has missed the point.


Shoot with whatever, my friends - in any bits, bytes, or bites.

Title: lp/mm vs lp/ph
Post by: BJL on December 30, 2011, 02:00:01 pm
Back in the film days (i.e., the last time I was paying attention to this), larger format lenses were, if anything, inferior to 35mm format lenses in terms of lp/mm resolution. However, given the need for less magnification, they could get away with it and still end up with superior results in print. Is the situation different nowadays with digital?
Indeed: that is why I mentioned lp/ph (line pairs per picture height) to compensate for format size differences. Also, it seems that in recent years there have been some stunningly sharp new lenses designed for DMF, at least from the large format lens makers (Rodenstock et al) getting into the digital back market. Does anyone have some up-to-date data on something like lp/ph at MTF50 for various 35mm and MF lenses (assuming 48x36mm in MF, for example, so a gain for DMF by a factor of 1.5 relative to lp/mm comparisons)?
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 30, 2011, 04:18:21 pm
Hi,

It depends. What it depends on is the MTF of the lens and the sensor at the frequency we are interested in. The frequency is given by viewing distance, the contrast sensitivity of the eye and the magnification. Once the frequency is known we multiply MTF for lens and sensor for that frequency.

This is what is called SQF, essentially. Well SQF is actually the integral over the contrast sensitivity function of the eye, but...

With digital, things have shifted a bit. We have medium and large format lenses specially built for digital. Some of those lenses (Schneider and Rodenstock HR digital lenses, and lenses for the Leica S2) may be better than the best 135 lenses.

My feeling right now is that the best lenses outperform the sensors. Once we see aliasing or moiré that is a sure sign of that.
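The multiplication described above can be sketched numerically. This is a toy model - the Gaussian lens MTF, the ideal pixel-aperture sensor MTF, the 6-micron pitch and the 40 lp/mm frequency are all assumed illustrative values, not measurements:

```python
import math

def lens_mtf(freq_lpmm, scale=80.0):
    # Toy Gaussian fall-off model for a lens MTF (an assumption, not measured data)
    return math.exp(-(freq_lpmm / scale) ** 2)

def sensor_mtf(freq_lpmm, pitch_mm=0.006):
    # Ideal pixel-aperture MTF: sinc of (frequency x pixel pitch); 6 micron pitch assumed
    x = freq_lpmm * pitch_mm
    return abs(math.sin(math.pi * x) / (math.pi * x)) if x else 1.0

# The system response at the frequency of interest is the product of the two
freq = 40.0  # lp/mm, set by viewing distance, magnification and the eye's sensitivity
system = lens_mtf(freq) * sensor_mtf(freq)
print(f"lens {lens_mtf(freq):.2f} x sensor {sensor_mtf(freq):.2f} = system {system:.2f}")
```

SQF would then weight this product by the eye's contrast sensitivity function and integrate over frequency, rather than evaluating a single point as done here.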

Best regards
Erik

Back in the film days (i.e., the last time I was paying attention to this), larger format lenses were, if anything, inferior to 35mm format lenses in terms of lp/mm resolution. However, given the need for less magnification, they could get away with it and still end up with superior results in print. Is the situation different nowadays with digital?
Title: Re: better buzz-phrase for the DMF advantage
Post by: ixania2 on December 30, 2011, 05:11:14 pm
If you were to advertise your own products this way, you would go bankrupt soon, believe me.

That first suggestion is probably the main root cause, but how about

More pixels for higher PPI printing and smoother tonal dithering

given that I see no evidence of a per pixel advantage for DMF backs over the state-of-the-art from Sony and such.

However, this slogan assumes that people paying four figure prices for digital cameras know a bit about dithering and such, which I think they should, but ...

You could also talk about the inherent larger format lens performance advantages, in lp/ph resolution and such, but that seems out of fashion!

P. S. I added "smoother tonal" to my slogan, since that seems to be a selling point of interest to you.
Title: Re: 16 Bit Myth
Post by: telyt on December 30, 2011, 09:08:57 pm
Again, I'm open for suggestions.

If you say much more than "Lots of people like 'em" someone's going to get his knickers twisted into knots.  Or maybe they will anyway.
Title: Re: 16 Bit Myth
Post by: BrendanStewart on December 31, 2011, 03:56:01 am
http://en.wikipedia.org/wiki/Quantization_(signal_processing)

Can't translate the url, paste this: en.wikipedia.org/wiki/Quantization_(signal_processing)

I suggest this as interesting reading as well. The images are particularly useful in showing why signal resolution is important.
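To make the effect concrete, here is a minimal sketch (the smooth ramp signal and the 8/16 bit depths are arbitrary illustrative choices): quantizing the same signal to fewer bits leaves a larger quantization error, which is what shows up as banding or rough tonal transitions after heavy editing.

```python
import math

def quantize(x, bits):
    # Snap x in [0, 1] to the nearest of 2**bits evenly spaced levels
    levels = (1 << bits) - 1
    return round(x * levels) / levels

# RMS quantization error over a smooth 0..1 ramp, at two bit depths
samples = [i / 999 for i in range(1000)]
rms = {}
for bits in (8, 16):
    err = [quantize(s, bits) - s for s in samples]
    rms[bits] = math.sqrt(sum(e * e for e in err) / len(err))
    print(f"{bits:2d} bits: RMS quantization error ~ {rms[bits]:.2e}")
```

Each extra bit roughly halves the error, which is why deep files survive strong curves and white-balance shifts better.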
Title: Re: 16 Bit Myth
Post by: Fine_Art on December 31, 2011, 03:56:53 am
What implied advantage?  There is no implication from a 16 bit A/D converter to more tonal smoothness.  There are many other factors that are responsible for that quality.  

With a read error of 2e for the Exmor sensors, not to mention a higher QE, I'd say they are.  Otherwise, I'd expect to see your MF sensors perform nicely at ISO 6400, without Sensor+.  

There are lots of good reasons to buy your products, true reasons.

I've been led to expect only one thing: that there are 16 bits of beneficial image information to be gained by buying your camera.  Well, is there or isn't there?  

Reductio ad absurdum: why not use 24 bit A/D converters and advertise that? If people are led by that - falsely - to expect better tonality and file robustness, have they not surely been misled? Either there is a benefit or there isn't, and if you are implying that there is one when there isn't, you are culpable.

True as opposed to what?  

The Pentax K10D had a 22 bit A/D converter. It was dumped in the newer cameras for 14 bit versions. All reviews say the newer cameras produce better pictures despite the less expensive A/D converters.

Another issue is that there is no current technology to view 16 bits. If you look at the websites of LCD manufacturers you will see they are mostly 10 bit devices, with some 12 bit (there may be 14 bit since the last time I looked) for very expensive hospital devices. Everything over 10 bits is dithering of adjacent pixels. Yes, your OS claims you have 32 bit color. Prints have even less dynamic range.

Despite that, I would still prefer a camera that could record a full 16 bits for future screen technology. Some shots cannot be redone. A full 16 bit pipe from sensor to file is a good thing. My camera does multishot HDR very well and the files look good. Unfortunately it only outputs this to 8 bit JPEG. Doh! Now why can't they combine multishot into a full 16 bit RAW?

Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 31, 2011, 04:02:49 am
Hi,

You are mixing up things a bit:

1) 32 bit color means 3x8 (RGB) plus an 8 bit alpha channel.

2) DR essentially defines what information is contained in the image. We always transform the image before viewing it.

Best regards
Erik


Another issue is that there is no current technology to view 16 bits. If you look at the websites of LCD manufacturers you will see they are mostly 10 bit devices, with some 12 bit (there may be 14 bit since the last time I looked) for very expensive hospital devices. Everything over 10 bits is dithering of adjacent pixels. Yes, your OS claims you have 32 bit color. Prints have even less dynamic range.


Title: Re: 16 Bit Myth
Post by: Fine_Art on December 31, 2011, 04:10:32 am
An IQ180.

Kidding, a D7000. Very nice little camera I have to say. Works wonders when stitching.

Cheers,
Bernard


It's always bugged me that Nikon gets more out of a given Sony sensor than Sony does. Of course the price is also much higher. The alternate 14 bit A/D converter pathway does give Nikon a real advantage.
Title: Re: 16 Bit Myth
Post by: Fine_Art on December 31, 2011, 04:28:14 am
Hi,

You are mixing up things a bit:

1) 32 bit color means 3x8 (RGB) plus an 8 bit alpha channel.

2) DR essentially defines what information is contained in the image. We always transform the image before viewing it.

Best regards
Erik


Isn't 32 bit color more of a myth than 16 bit? ;D
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 31, 2011, 04:52:05 am
32 bit color on computers means 8 bit RGB + 8 bit alpha: 3 x 8 = 24, + 8 = 32.
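That arithmetic is easy to check by packing one pixel's channels into a single word (the channel values below are arbitrary examples):

```python
# Pack 8 bit alpha plus 8 bit R, G, B into one 32 bit word, then unpack
a, r, g, b = 0xFF, 0x12, 0x34, 0x56
pixel = (a << 24) | (r << 16) | (g << 8) | b  # 4 x 8 bits = 32 bits
assert pixel.bit_length() <= 32
# Recover each channel by shifting and masking
unpacked = ((pixel >> 24) & 0xFF, (pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF)
print(unpacked)
```

So the "32 bits" never describe more than 8 bits per color channel.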

Isn't 32 bit color more of a myth than 16 bit? ;D
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 31, 2011, 05:36:06 am
In older Canons, ISO 3200 and above were obtained from ISO 1600 multiplied before encoding the RAW data. In some newer ones that happens from ISO 6400. In any case, ISOs above 1600, whether analogue or digital, don't add anything for the RAW shooter because SNR doesn't improve anymore.

By processing I meant applying noise reduction on the RAW data. The digital ISOs (and this includes +1/3 and +2/3 values) are of no interest to the RAW shooter.


If that digital ISO is e.g. a multiplication by 0.5, you would need 13 bits to store a 12 bit signal without loss. That is relevant to all RAW shooters using such an ISO setting. My guess is that many do.

Only offered as a theory as to why Canon uses more bits than they seem to need.

-H
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 31, 2011, 06:04:50 am
That is relevant to all RAW shooters using such an ISO setting. My guess is that many do.

They shouldn't. There is a post by Gabor Schorr where he explained all the ISOs in Canon cameras. Anyway, I am not sure if ISOs 160, 320, 640, 1250, 2500 and 5000 are post-processed before encoding the RAW data (which would lead to information loss), or are just metatagged to tell the RAW developer to apply an additional -1/3EV baseline exposure correction.

Regards
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 31, 2011, 06:18:25 am
They shouldn't.
You probably are right. But for understanding why Canon does what it does, I think it is more relevant to understand what their customers actually do than what they should have done.

Canon did hint at this by hiding the 1/3-stop ISO settings by default, and by limiting auto-ISO so it doesn't exceed 1600 or 3200.

-h
Title: Re: 16 Bit Myth
Post by: hjulenissen on December 31, 2011, 06:20:33 am
Another issue is that there is no current technology to view 16 bits. If you look at the websites of LCD manufacturers you will see they are mostly 10 bit devices, with some 12 bit (there may be 14 bit since the last time I looked) for very expensive hospital devices. Everything over 10 bits is dithering of adjacent pixels. Yes, your OS claims you have 32 bit color. Prints have even less dynamic range.

Despite that, I would still prefer a camera that could record a full 16 bits for future screen technology. Some shots cannot be redone. A full 16 bit pipe from sensor to file is a good thing. My camera does multishot HDR very well and the files look good. Unfortunately it only outputs this to 8 bit JPEG. Doh! Now why can't they combine multishot into a full 16 bit RAW?
As Erik said: displays have a non-linear "gamma" response, while cameras have a linear response. You need more "camera bits" than "display bits".
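A rough illustration of that point, assuming a plain power-law gamma of 2.2 and an 8 bit display (real displays use sRGB's piecewise curve, but the arithmetic is similar): to resolve the darkest step of a gamma-encoded display, a linear capture needs substantially more bits.

```python
import math

GAMMA = 2.2        # assumed simple power-law display gamma
DISPLAY_BITS = 8

# Linear-light value corresponding to the darkest non-black display code
darkest = (1 / ((1 << DISPLAY_BITS) - 1)) ** GAMMA
# Bits a linear encoding needs so one code step can still resolve that value
linear_bits = math.ceil(math.log2(1 / darkest))
print(f"{DISPLAY_BITS} display bits at gamma {GAMMA} need ~{linear_bits} linear camera bits")
```

The gamma curve spends its codes where the eye is most sensitive; a linear sensor has to brute-force the same shadow resolution with extra bits.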

-h
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on December 31, 2011, 08:20:02 am
You probably are right. But for understanding why Canon do what they do, I think it is more relevant to understand what their customers actually do

Canon cares about what their customers do: most of them shoot JPEG, and that is why even DSLR cameras are JPEG-oriented. And RAW shooters still have good enough tools to obtain beautiful pictures, even if they are not the best possible.

Only advanced users and geeks like us are interested in such things as RAW histograms, ISO implementation details, and so forth.

Regards
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on December 31, 2011, 10:14:14 am
On the topic of bit depth, and in particular 16 bits, I don't have much to add technically, except that my sense is that if a product is a native 16 bit product, then at the least that spec is required to deliver the quality of image the product is capable of. Whether 14 bits or 16 bits, etc. - by itself - is superior, necessary, or noticeable is not something that IMO particularly matters or is even easy to specifically identify. I will say that I have not mentioned 16 bit depth as a feature in quite some time, though I have discussed 16 bit output on many occasions, most often in the context of a minimal Capture One to Photoshop or Lightroom workflow for those who have an affinity for Adobe products (raw converting in Capture One, then outputting a robust 16 bit tiff into either Adobe program).

In any event, it is not, nor should it be, a dominant part of any discussion that we would have with a client. There are far too many elements that are of much greater importance that determine whether a Leaf Aptus 22, for example, is an appropriate fit for a client of ours.

From that standpoint, I would have to say I agree that the term "True 16 Bit Depth" is of questionable merit or value when limited to stating just 3 features on the banner for our $4,990 Leaf Aptus 22 special. But it is not an easy spot in which to come up with something designed to grab, given the limitations. And that banner is just that, a grabber. Given its intent of provoking a phone call or email inquiry, mentioning 16 bit depth may indeed provide a means to that end, but let's say one does make a call (as many have). I can say not a single caller about that product has asked about True 16 Bit Depth, nor has there been a discussion of it. However, if there were, the relevance of True 16 Bit Depth would gradate into the actual tonality of the image quality itself, rather than a simple "Well look fella, 16 bits means better tonality". The fact is that the original Leaf Aptus 22 did indeed have a very nice transition in tonal range when used appropriately, even compared to other 22 megapixel digital backs. But it is not a straight path, as Doug said, and the how and why is a combination of many elements. The 16 bit part of the equation is highly likely a necessary part of that equation, but not solely responsible for the smoothness of those transitions.

With more time and thought, I agree, we could probably find a more relevant stated feature that would be a reason to purchase that product. That said, it is difficult to find "16 Bit Depth" trumpeted anywhere on our site, or even on Leaf's site. So, I think to be fair, it's pretty clear we do not as a company make broad use of the term to sell product or mislead clients. And compared to just about any other website that sells digital backs (manufacturer or dealer), I think we have an abundance of real world information vs manufacturer-driven marketing data.

Now, one more thing, getting back to the use of the word "True". If anything, I feel that was the one word that had the most legitimate reason for being there. Many, many products advertise themselves as "16 bit", but in almost all cases they are referring to up-sampling the output. So "True 16 Bit" is relevant from the standpoint of 16 bit on the input side. Further, I will also say that I am not clear on whether the Dalsa 22MP sensor is a "true native 16 bit sensor". It has always been claimed and/or assumed, while it was clear that the Kodak sensors used in digital backs (Sinar, Imacon/Hasselblad, Phase One pre-P65+/post-H10) have always been 14 bit native, even though data sheets for all of these products trumpet 16 bit. So stating "True 16 Bit" has also been a differentiator between Kodak and Dalsa sensors. However, at some point - for me - even the assumption that Dalsa sensors were native 16 bit came into question, and to this day it remains unclear. As a result, the importance of the 16 bit spec is rarely even mentioned, much less discussed. The question of how the real world tonality and handling of shadow and highlight gradations differ between one digital back and another is a more relevant and much more frequently discussed topic that may be related to bit depth, but for the user is much more pertinent than a discussion of 14 bits vs 16 bits.

We spend an enormous amount of time going over what we put on our website. I have discussed this with Doug, and it is a bit late now, but on the next 3-word banner we have, one of the stated features absolutely has to be "Fabulous".


Steve Hendrix

Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 31, 2011, 10:27:24 am
Hi,

That makes us two...

I don't really think it's the A/D converter that plays that role; it could be that the Bionz processor is just 12 bits wide. On the Exmor chips the A/D converters are actually on the chip, so the physical converters are probably the same.

Best regards
Erik


It's always bugged me that Nikon gets more out of a given Sony sensor than Sony does. Of course the price is also much higher. The alternate 14 bit A/D converter pathway does give Nikon a real advantage.
Title: Re: 16 Bit Myth
Post by: Ken Doo on December 31, 2011, 10:53:05 am
....We spend an enormous amount of time going over what we put on our website, I have discussed this with Doug, and it is a bit late now, but on the next 3 word banner we have, one of the stated features absolutely, has to be "Fabulous".

Steve Hendrix


How about: "Steve recommended.  Doug Approved."

Works for me.   ;D    lol  ken
Title: Re: 16 Bit Myth
Post by: LKaven on December 31, 2011, 10:58:46 am
Steve, the product data sheet for the Dalsa FTF4052C sensor used in the Aptus 22 reads:

"The FTF 4052C is a full frame CCD colour image sensor designed for professional digital photography applications with very low dark current and a linear dynamic range of over 12 true bits at room temperature."

I think this, by stipulation, is the only sense of "true" in terms of "true bits" that you are looking at.
Title: Re: 16 Bit Myth
Post by: PierreVandevenne on December 31, 2011, 11:45:12 am
while it was clear that the Kodak Sensors used in digital backs (Sinar, Imacon/Hasselblad, Phase One (pre-P65+, post H10), have always been 14 bit native, even though data sheets for all of these products trumpet 16 bit.

KAF-31600
FWC: 60,000 e- (per Kodak data sheet)
Read noise: 16 e- (per Kodak data sheet)
-> 60000/16 = 3750 levels -> under 12 bits of DR

Sanity check:
DR = 20 x log10(60000/16) = 71.5 dB

Kodak data sheet: 70.5 dB

Good match, allowing for rounding errors.

If one wants a sensor with 14 bits of DR, one should look for a sensor with a linear DR of 84 dB (or one using the partial-read, multiple-exposure-summing-per-pixel technique described in the other thread).

And yes, it doesn't hurt to have an A/D converter that outperforms the sensor a bit. But if one goes from sensor to A/D converter to data path, it's quite easy to tell which number will be used by the marketeers :-).
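Pierre's arithmetic above, written out as a quick sketch (the electron counts are the Kodak data-sheet figures he quotes):

```python
import math

full_well_e = 60000  # full-well capacity in electrons (Kodak data sheet)
read_noise_e = 16    # read noise in electrons (Kodak data sheet)

levels = full_well_e / read_noise_e  # distinguishable levels above the noise floor
dr_bits = math.log2(levels)          # dynamic range expressed in bits (stops)
dr_db = 20 * math.log10(levels)      # the same range expressed in dB

print(f"{levels:.0f} levels -> {dr_bits:.1f} bits -> {dr_db:.1f} dB")
```

Anything an A/D converter digitizes beyond those roughly 12 bits is noise, not image information.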
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on December 31, 2011, 12:03:08 pm
Steve, the product data sheet for the Dalsa FTF4052C sensor used in the Aptus 22 reads:

"The FTF 4052C is a full frame CCD colour image sensor designed for professional digital photography applications with very low dark current and a linear dynamic range of over 12 true bits at room temperature."

I think this, by stipulation, is the only sense of "true" in terms of "true bits" that you are looking at.


Quite possibly. However, back to the original crux of stating 16 bits as a specification: if "true" bit depth were stated for every product, it might also lead to confusion, as it would likely always differ from the bit depth stated on marketing data sheets. So stating that the FTF 4052C is a true 12 bit sensor would confuse, as many other products might genuinely natively be 8 bit, 11 bit, etc. Inaccurate as it may be, 8 bit, 12 bit, 14 bit, 16 bit, etc., are an established reference point. And given the relevance - or lack of relevance, if you will - of bit depth as a singular statistic for making a purchasing decision, I don't believe reinventing the wheel when it comes to establishing "true" bit depth for every imaging device on the market is worth the time. And considering I cannot recall the last time I even had a discussion about the bit depth statistic with a client, I would say interested purchasers tend to agree.
Title: Re: 16 Bit Myth
Post by: LKaven on December 31, 2011, 12:34:48 pm

Quite possibly. However, back to the original crux of stating 16 bits as a specification: if "true" bit depth were stated for every product, it might also lead to confusion, as it would likely always differ from the bit depth stated on marketing data sheets. So stating that the FTF 4052C is a true 12 bit sensor would confuse, as many other products might genuinely natively be 8 bit, 11 bit, etc. Inaccurate as it may be, 8 bit, 12 bit, 14 bit, 16 bit, etc., are an established reference point. And given the relevance - or lack of relevance, if you will - of bit depth as a singular statistic for making a purchasing decision, I don't believe reinventing the wheel when it comes to establishing "true" bit depth for every imaging device on the market is worth the time. And considering I cannot recall the last time I even had a discussion about the bit depth statistic with a client, I would say interested purchasers tend to agree.

Even as someone who studied semantics for many years, I still had to pace the floor for a solid 5 minutes to unpack this circuitous defense.

This all started due to an Aptus back that you advertised as having "16 true bits."  This left us scratching our heads about the intended meaning of the word "true."  By implicature, it appeared you were trying to put forth the impression that this back turned out 16 bits of image data, as opposed to 12 or 13 bits of image data and 3-4 bits of noise.  This of course was a false impression.  You wisely decided to correct it, while somehow still defending it.

The manufacturer's technical data sheet supplies a meaning of "true" that is hard to counter.  It claims just over 12 "true bits."  This is not a marketing sheet by the way, but a technical reference with pin-outs and voltages. 

You cite confusion on the part of the prospective customer over varying ratings of bit depth.  You also say that the customer rarely discusses it.  But in point of fact, you try to forestall such discussion by advertising "16 true bits" on your bullet list (as do some of the MF manufacturers in their own marketing literature, e.g., the Pentax 645D).  So by this time, the customer has accepted it unquestioningly as evidenced by the number of times the claim is parroted on these forums.  I would not expect the discussion to come up again unless the customer somehow gets the correct impression that this is not a truthful claim.  You've introduced the confusion you see, a confusion that is not exposed at the point of sale, but later on, in forum threads like this one.  And if the bit specifications were truly inconsequential (both to buyers and in practice), you wouldn't use them in marketing literature.  I don't know what "wheel" you are talking about reinventing here.  And I don't know what kind of "reference point" is established regarding bit depth except those that are intended to induce vague, positive impressions about final image quality in a prospective buyer.

By your lights, all DSLRs might be called 16 bit since they can be made to produce 16 bit files.  In fact, most DSLRs have more "true" bits, considered as measurable dynamic range, than your Aptus.  What your Aptus has, and it does have something special, is something else altogether. 
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on December 31, 2011, 12:52:00 pm
Hi,

I would say that claiming a certain CCD-based MF digital back is a 16 bit device when it's more like a 12 bit device is a bit of false marketing, especially when the fake 16 bits are claimed as an advantage over the real 14 bits other sensors may actually use. The Pentax K5 is close to 14 bits, it seems.

The best way to demonstrate the advantage of MFDBs over other cameras is decent, technically correct raw files shot under controlled conditions. I have seen very few examples of that - at least files involving both MFDBs and state-of-the-art DSLRs. I have seen a couple of such tests:

- Lloyd Chambers has tested the Leica S2, Pentax 645D, Hasselblad H3D and some others.

- Miles Hecker has compared Pentax 645D to Canon 5DII and Nikon D3X.

- Imaging Resource has raw files from the Pentax 645D for download

In my view those tests demonstrated the advantages of MFD. Photography is about images, so why use fake bits for marketing instead of real raw images?

Best regards
Erik

The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equally. Assume you have about 10 characters worth of space to explain that and the person is not the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "holds up to strong styling in post processing well" but it didn't have the same ring and took up much more room. I'm open for suggestions. We're not trying to be dishonest or disingenuous.

The real world advantage is real (e.g. how the file looks after you add a lot of contrast and local dodge/burn or a significant change in WB) but the explanation could take up many pages regarding the entire imaging chain (including the quality, not just bit depth, of the A/D convertor and dark frame technology) and the emphasis throughout the chain on image quality over convenience, features, speed, or cost.

Anyone who has dealt with us (Capture Integration) knows our emphasis is on the real world and on results and real world testing not specs or marketing spiels - but the reality of marketing is you have only a few seconds of attention span and a few characters to use it.

I'm serious when I say I'm open for suggestions.

Doug Peterson (e-mail Me) (doug@captureintegration.com)
__________________

Head of Technical Services, Capture Integration
Phase One Partner of the Year
Leaf, Leica, Cambo, Arca Swiss, Canon, Apple, Profoto, Broncolor, Eizo & More

National: 877.217.9870  |  Cell: 740.707.2183
Newsletter (http://www.captureintegration.com/our-company/newsletters/) | RSS Feed (http://www.captureintegration.com/2008/08/11/rss-feeds/)
Buy Capture One 6 at 10% off (http://www.captureintegration.com/phase-one/buy-capture-one/)

Title: Re: 16 Bit Myth
Post by: Steve Hendrix on December 31, 2011, 01:00:37 pm
Even as someone who studied semantics for many years, I still had to pace the floor for a solid 5 minutes to unpack this circuitous defense.

This all started due to an Aptus back that you advertised as having "16 true bits."  This left us scratching our heads about the intended meaning of the word "true."  By implicature, it appeared you were trying to put forth the impression that this back turned out 16 bits of image data, as opposed to 12 or 13 bits of image data and 3-4 bits of noise.  This of course was a false impression.  You wisely decided to correct it, while somehow still defending it.

The manufacturer's technical data sheet supplies a meaning of "true" that is hard to counter.  It claims just over 12 "true bits."  This is not a marketing sheet by the way, but a technical reference with pin-outs and voltages. 

You cite confusion on the part of the prospective customer over varying ratings of bit depth.  You also say that the customer rarely discusses it.  But in point of fact, you try to forestall such discussion by advertising "16 true bits" on your bullet list (as do some of the MF manufacturers in their own marketing literature, e.g., the Pentax 645D).  So by this time, the customer has accepted it unquestioningly as evidenced by the number of times the claim is parroted on these forums.  I would not expect the discussion to come up again unless the customer somehow gets the correct impression that this is not a truthful claim.  You've introduced the confusion you see, a confusion that is not exposed at the point of sale, but later on, in forum threads like this one.  And if the bit specifications were truly inconsequential (both to buyers and in practice), you wouldn't use them in marketing literature.  I don't know what "wheel" you are talking about reinventing here.  And I don't know what kind of "reference point" is established regarding bit depth except those that are intended to induce vague, positive impressions about final image quality in a prospective buyer.

By your lights, all DSLRs might be called 16 bit since they can be made to produce 16 bit files.  In fact, most DSLRs have more "true" bits, considered as measurable dynamic range, than your Aptus.  What your Aptus has, and it does have something special, is something else altogether. 



My point was that there may be technical reference sheets for many products that state the actual bit depth as less than the stated bit depth on the manufacturers end product data sheet. That is all that I was saying.

A lot is being made of our use of the term "true 16 bit" (as stated by Leaf themselves) on a banner for the Leaf Aptus 22. I can barely find a mention of 14 or 16 bit depth anywhere else on our website, so I think our use of bit depth to sell a product is being somewhat overblown. I agree "true 16 bit" is vague, possibly technically inaccurate, and not one of the top 3 features I would mention as reasons to consider an Aptus 22. However, we've already noted that, and it is not indicative of Capture Integration being the industry culprit for this.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: LKaven on December 31, 2011, 01:32:31 pm
[...]
I agree, "true 16 bit" is vague, possibly technically inaccurate, and not one of the top 3 features I would mention as reasons to consider an Aptus 22. However, we've already noted that, and it is not indicative of Capture Integration being the industry culprit for this.

True enough, Steve.  If Leaf and/or Pentax had their in-house reps writing to this forum, we'd be taking them to task in a bigger way.  [In fact, they should be put on notice for these kinds of claims.  This kind of myth, about a product that costs as much as an Audi in many cases, should be squelched.]  Since you and Doug undertook a defense of the practice, you probably caught more than your fair share.
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on December 31, 2011, 01:44:14 pm
True enough, Steve.  If Leaf and/or Pentax had their in-house reps writing to this forum, we'd be taking them to task in a bigger way.  [In fact, they should be put on notice for these kinds of claims.  This kind of myth, about a product that costs as much as an Audi in many cases, should be squelched.]  Since you and Doug undertook a defense of the practice, you probably caught more than your fair share.


And to be clear, we're not advocating a defense of it. I've already stated that in hindsight, it wasn't worth mentioning as a top 3 feature, regardless of the actual specification (for the Aptus 22 product). And I can't find anywhere else on our website where we even mention bit depth, other than when re-posting manufacturer-stated specifications. Because of the public vagueness of actual bit depth on a per-sensor basis, we have never spent much energy espousing it. This just happened to find its way into a little space as a buzzword, and I agree it is not why someone buys an Aptus 22. The physical sensor size, pixel dimensions, tonality, color reproduction and that Leaf Special Sauce are far more pertinent.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: Bryan Conner on December 31, 2011, 02:08:00 pm
Why does a person purchase a camera?  To take pictures...unless the person is a collector, or it is purchased as a gift.  But, in the end, most camera purchases are because someone wants to take a picture.  If I were looking to buy a camera and it was advertised as being ANYTHING 16 bit, my first thought would be that it produces 16 bit files.  If it did not produce 16 bit files, then what is the purpose of telling me anything about 16 bit?  This is the same as Audi mentioning the phrase 250 km/h (155 mph) in an advertisement for a basic A2 that will only reach 173 km/h (107 mph).  The Audi A8 will indeed reach 250 km/h and maybe it shares a few electrical connectors or such with the A2.  These parts are capable of reaching 250km/h when they are in the A8, but not the A2.

I do not think that this whole issue is just about one dealer, or one manufacturer.  A lot of companies (not Phase One or related dealers etc) get so caught up in the race to get the customers...some are starting to sound like this: http://www.youtube.com/watch?v=pkYNBwCEeH4  

Title: Re: 16 Bit Myth
Post by: rainer_v on December 31, 2011, 02:21:02 pm

Quite possibly. However, back to the original crux of stating 16 bits as a specification: if "true" bit depth were stated for every product, it might also lead to confusion, as the stated bit depth from marketing data sheets would likely always differ. So, stating the FTF 4052C as a true 12 bit sensor would confuse, as many other products might genuinely natively be 8 bit, 11 bit, etc. Inaccurate as it may be, 8 bit, 12 bit, 14 bit, 16 bit, etc. are an established reference point. And given the relevance, or lack of relevance if you will, of bit depth as a singular statistic for making a purchasing decision, I don't believe reinventing the wheel when it comes to establishing "true" bit depth for every imaging device on the market is worth the time. And considering I cannot recall the last time I even had a discussion on the bit depth statistic with a client, I would say interested purchasers tend to agree.

steve, the only reason this 16 bit myth has so much relevance is that none of the manufacturers and sellers has ever admitted, in the various discussions about it (e.g. in forums like this one), that their MF backs aren't 16 bit. as happens now again. rounds and rounds are made without a clear statement, even though the Dalsa data sheet couldn't be clearer..... and it has been posted for years now. how can you say that you can't see clearly whether the a22 (and others) are 12, 14 or 16 bit? it's very clear and easy to see, isn't it?
next week the 16 bit statement will come up again and the same discussion will go on and on, creating this myth which, you say, is the reason you have to call it 16 bit, because it explains so nicely the quality difference between mf and smaller formats.  r.
Title: Re: 16 Bit Myth
Post by: Radu Arama on December 31, 2011, 02:43:51 pm
I am not that sure that you will ever find something official from Pentax citing "16 bit pipeline". They quote 14 bit A/D convertor and 14 bit RAW files.

http://www.pentax.jp/english/products/645d/feature_1.html

Happy New Year!
Radu

Even as someone who studied semantics for many years, I still had to pace the floor for a solid 5 minutes to unpack this circuitous defense.

This all started due to an Aptus back that you advertised as having "16 true bits."  This left us scratching our heads about the intended meaning of the word "true."  By implicature, it appeared you were trying to put forth the impression that this back turned out 16 bits of image data, as opposed to 12 or 13 bits of image data and 3-4 bits of noise.  This of course was a false impression.  You wisely decided to correct it, while somehow still defending it.

The manufacturer's technical data sheet supplies a meaning of "true" that is hard to counter.  It claims just over 12 "true bits."  This is not a marketing sheet by the way, but a technical reference with pin-outs and voltages. 

You cite confusion on the part of the prospective customer over varying ratings of bit depth.  You also say that the customer rarely discusses it.  But in point of fact, you try to forestall such discussion by advertising "16 true bits" on your bullet list (as do some of the MF manufacturers in their own marketing literature, e.g., the Pentax 645D).  So by this time, the customer has accepted it unquestioningly as evidenced by the number of times the claim is parroted on these forums.  I would not expect the discussion to come up again unless the customer somehow gets the correct impression that this is not a truthful claim.  You've introduced the confusion you see, a confusion that is not exposed at the point of sale, but later on, in forum threads like this one.  And if the bit specifications were truly inconsequential (both to buyers and in practice), you wouldn't use them in marketing literature.  I don't know what "wheel" you are talking about reinventing here.  And I don't know what kind of "reference point" is established regarding bit depth except those that are intended to induce vague, positive impressions about final image quality in a prospective buyer.

By your lights, all DSLRs might be called 16 bit since they can be made to produce 16 bit files.  In fact, most DSLRs have more "true" bits, considered as measurable dynamic range, than your Aptus.  What your Aptus has, and it does have something special, is something else altogether. 
Title: Re: 16 Bit Myth
Post by: LKaven on December 31, 2011, 02:49:33 pm
I am not that sure that you will ever find something official from Pentax citing "16 bit pipeline". They quote 14 bit A/D convertor and 14 bit RAW files.

http://www.pentax.jp/english/products/645d/feature_1.html
I stand corrected.  Thanks for the link.
Title: Re: 16 Bit Myth
Post by: digitaldog on December 31, 2011, 03:37:20 pm
The problem I have with the semantics of “True 16-bits” (besides the fact that it may be untrue, which would be inexcusable) is that it implies that others, competitors, have untrue or false 16-bit capture. Talk about two wrongs not making a right!
Title: Re: 16 Bit Myth
Post by: Chris_Brown on December 31, 2011, 03:49:47 pm
. . . back to the original crux of stating 16 bits as a specification . . .
All that needs to be shown is solid information about the A/D conversion device/algorithm in the product. In audio products for example, the publishing of sample rates and A/D conversion bit depth is the norm.
Title: Re: 16 Bit Myth
Post by: LKaven on December 31, 2011, 09:30:46 pm
All that needs to be shown is solid information about the A/D conversion device/algorithm in the product. In audio products for example, the publishing of sample rates and A/D conversion bit depth is the norm.

Oddly enough, according to Dan Lavry, one of the audio industry's eminent A-D designers, there are lots of A-D/D-A converters made with a 24-bit specification, but none of them can actually do more than 21 bits. 
Title: Re: 16 Bit Myth
Post by: hjulenissen on January 01, 2012, 07:58:55 am
Oddly enough, according to Dan Lavry, one of the audio industry's eminent A-D designers, there are lots of A-D/D-A converters made with a 24-bit specification, but none of them can actually do more than 21 bits. 
This is true.

The audio industry (at least the serious players) tends to use a term called "effective number of bits". I guess this concept could be used in photography as well, but I would think that established conventions for measuring DR etc. directly make more sense?
http://en.wikipedia.org/wiki/Effective_number_of_bits
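As a sketch of the convention mentioned above (using the standard ideal-quantizer relation; the function name and the 120 dB figure are my own illustration), ENOB is derived from a converter's measured signal-to-noise-and-distortion ratio rather than its nominal bit width:

```python
import math  # not strictly needed here, kept for extensions

def enob(sinad_db: float) -> float:
    """Effective number of bits from measured SINAD, per the
    standard ideal-ADC relation SINAD = 6.02*N + 1.76 dB."""
    return (sinad_db - 1.76) / 6.02

# An ideal 16-bit converter would measure 6.02*16 + 1.76 = 98.08 dB.
# A "24-bit" audio converter measuring ~120 dB SINAD actually resolves:
print(round(enob(120.0), 1))   # ~19.6 effective bits
```

The same idea maps onto photography: quoting measured dynamic range (or DXO's "tonal range") is the imaging analogue of quoting ENOB instead of the nominal ADC width.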
Title: Re: 16 Bit Myth
Post by: TH_Alpa on January 01, 2012, 11:03:02 am
Just for information, and for history and the record, since much wrong information has been written here:

It was Sinar who introduced the first MF back with 22 MPx, the Sinarback 54.

At that time, when advertised (launched at Photokina 2002), it was clearly stated in the SINAR brochure, as well as in any relevant published data or technical information, that it was a sensor with a bit depth of 14 bits. I can find the brochure for those who don't believe it. These 14 bits were advertised as well thereafter, when the competition got the 22 MPx sensor from Kodak in July 2003 (there was a one-year exclusivity for Sinar).

As a reference for my claim, one can look at Michael's review here, where Sinar clearly stated 14 bit, not like others:

Medium Format Digital Backs — Q3, 2003 (http://www.luminous-landscape.com/reviews/mf-backs.shtml)

Please note also the ISO discrepancy between the different manufacturers (advertised as ISO 25 for Sinar)

Just to put the record straight.

Best regards
Thierry
Title: Re: 16 Bit Myth
Post by: Doug Peterson on January 01, 2012, 12:14:31 pm
Dear forum: the graphic (http://www.captureintegration.com/) containing the five characters found by some to be misleading has been changed. For reference, here is the product sheet from Leaf for the Aptus 22 (http://www.captureintegration.com/wp-content/uploads/hi_afaptus22-17_en.pdf).

I will be taking a month or so away from the forum. The overall negative and accusatory (or at least lacking the benefit of the doubt) tone exhibited by some on this thread is absolutely exhausting. If this conversation were at a party I would have left the party a long time ago.

We work very hard to provide honest, meaningful, real-world advice on digital backs. We also strive to provide transparency and open discussion (as indicated in my first reply to this thread); we are in fact real people! I think in this case we were wrong; largely because we (Capture Integration) have always made it part of the company ethos not to rely on manufacturer specs/claims but to provide advice based on actual usage/testing/comparisons. We are not shills selling snake oil - frankly that implication was deeply insulting.

Doug Peterson (e-mail Me) (doug@captureintegration.com)
__________________

Head of Technical Services, Capture Integration
Phase One Partner of the Year
Leaf, Leica, Cambo, Arca Swiss, Canon, Apple, Profoto, Broncolor, Eizo & More

National: 877.217.9870  |  Cell: 740.707.2183
Newsletter (http://www.captureintegration.com/our-company/newsletters/) | RSS Feed (http://www.captureintegration.com/2008/08/11/rss-feeds/)
Buy Capture One 6 at 10% off (http://www.captureintegration.com/phase-one/buy-capture-one/)
Title: Re: 16 Bit Myth
Post by: michael on January 01, 2012, 12:58:35 pm
And that's what we get when some people act like children and boors. A valuable resource is lost, if only for a while.

I will not let this Forum become like DP Review. From now on anyone engaging in insults and personal attacks will be summarily banned. No warnings. I'm fed up.

Michael
Title: Re: 16 Bit Myth
Post by: Hulyss on January 01, 2012, 01:35:00 pm
I agree with you, Michael. And Happy new year ;)
Title: Re: 16 Bit Myth
Post by: HCHeyerdahl on January 01, 2012, 01:54:51 pm
...
But we are not shills selling snake oil - frankly that implication was deeply insulting.

Doug Peterson (e-mail Me) (doug@captureintegration.com)


Dear Doug,
I am assuming you are referring to my reply where I used that phrase. I am really sorry to read that you feel deeply insulted. That was certainly not my intention.

I was actually just trying to be helpful, but since English is not my native tongue, maybe I don't have sufficient understanding of all the nuances that phrase conveys. Please, let me try again:

I was trying to give an answer to this:
The point is to explain to someone who isn't a scientist why most 22mp backs are better than most 22mp dSLRs. Why pixels are not all created equally. Assume you have about 10 characters worth of space to explain that and the person is not the 5% of users who get deeply involved in the science. "True 16 bit" is about as good as I can come up with as most customers understand the difference between 8 and 16 bits elsewhere in photography. I also considered "Great tonal smoothness" or "holds up to strong styling in post processing well" but it didn't have the same ring and took up much more room. I'm open for suggestions. We're not trying to be dishonest or disingenuous.

I'm serious when I say I'm open for suggestions.


For reference, this is  what I wrote:



As a person considering the jump to MF, my take is that your marketing ought to reflect reality. Thing is, when you quote 16 bit as something that makes MF stand out from FF DSLRs, and threads on several forums find this questionable, I as a potential customer begin to wonder whether the whole thing is snake oil. I have spent quite some time trying to sort this out and I am not finished yet... ;)

Now, from what I have managed to pick up from numerous threads, it may appear that it is the very size of the sensor itself, coupled with stricter tolerances in every component, that gives MF its alleged superior image quality. If this is actually the case, then this is VERY interesting for a potential customer, since this will not be available in a future FF DSLR. Of course DSLRs will continue to evolve, but they will not grow bigger sensors. I also find it unlikely that they will increase their quality tolerances. Hence, at a given purchase date, the benefits from an MF investment will probably hold up against future FF DSLRs for more than one or two generations (btw a great wife argument!).

I really don't have a ready 15 char slogan for you, but something along the lines of "Built to MF quality tolerances" or "Designed for MF quality" would at least to me suggest that the quality of MF has something to do with the quality of the entire system. Sure, lots of people will contest this or similar assertions, but at least we will not have threads from people with superior technical knowledge in image processing, etc. more or less PROVING your (16 bit) advantage is wrong.

Hey, OK! Maybe I still haven't got it - just my two (amateurish) cents trying to be helpful ;D.

Christopher


I am a seasoned amateur, using high-end Nikons for a long time, and I am considering making the jump to some MF brand. I am currently doing my research to determine what advantages MF can give, and whether I need it for my photography.

From several dealers I get the argument that the 16 bits is what gives MF superior file quality. Well, here on LuLa and other forums that assertion is disputed. That, at least to me, is very confusing, and I have spent quite some time trying to figure out what is really the case. Now, since this turns out not to be exactly right, I naturally become even more inclined to carefully research other claims about MF too. Hence, I suggest using some other phrase which conveys accurate information. It would make life a lot easier for me as a researching customer.

I hope you can understand that my argument is NOT about you personally or any other dealer for that matter.  

From your first post I took it that you use the 16 bit argument believing it will convince customers. I am trying to tell you that, at least for me, it does not work.

Sincerely,

Christopher

PS
I see that while I was writing, both Michael and Hulyss have taken offence too. Well, I can just apologize once more. I am truly sorry.
DS
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on January 01, 2012, 02:39:28 pm
Hi,

Thanks for posting an interesting link!

Just to put things in perspective:

1) 16 bits corresponds to a DR of 16 stops
2) 16 bits corresponds to a density of 4.8

My film scanner is purported to be 16 bit and to have a maximum density of 4.8, but it peters out around 3.5 according to independent tests.
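Those correspondences are just logarithms; a quick sketch (my own illustration, not tied to any particular scanner):

```python
import math

bits = 16
dr_stops = math.log2(2 ** bits)    # 16 stops: each bit doubles the range
density = math.log10(2 ** bits)    # optical density: log10(65536) ~ 4.8

# Conversely, a scanner whose measured Dmax is only ~3.5
# resolves correspondingly fewer "true" bits:
true_bits = 3.5 / math.log10(2)    # ~11.6 bits
print(round(density, 2), round(true_bits, 1))
```

So a measured Dmax of 3.5 corresponds to roughly 11-12 meaningful bits, regardless of the 16-bit file format.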

Best regards
Erik

Just as an information, and for the history and the record, since many wrong information is written here:

It was Sinar who introduced the first MF back with 22 MPx, the Sinarback 54.

At that time, when advertised (launched at Photokina 2002) it was clearly stated in the SINAR brochure, as well as in any relevant published data or technical information, that it was a sensor with a bit depth of 14 bit. I can find the brochure for those who don't believe it. These 14 bit were advertised as well thereafter, when the competition got the 22 MPx sensor from Kodak, in July 2003 (there was a 1 year exclusivity for Sinar).

As a reference for my claim, one can look at Michael's review here, where Sinar clearly stated 14 bit, not like others:

Medium Format Digital Backs — Q3, 2003 (http://www.luminous-landscape.com/reviews/mf-backs.shtml)

Please note also the ISO discrepancy between the different manufacturers (advertised as ISO 25 for Sinar)

Just to put the record straight.

Best regards
Thierry
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on January 01, 2012, 02:50:44 pm
Michael,

Thank you, we all need to calm down.

Best regards
Erik

And that's what we get when some people act like children and boors. A valuable resource is lost, if only for a while.

I will not let this Forum become like DP Review. From now on anyone engaging in insults and personal attacks will be summarily banned. No warnings. I'm fed up.

Michael

Title: Re: 16 Bit Myth
Post by: amsp on January 01, 2012, 03:41:36 pm
Six pages of this bs, seriously? Don't you have anything better to do, you know like go out and take some pictures maybe?
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on January 01, 2012, 03:46:59 pm
Well, yes, of course! On the other hand, if you want to spend something like 30 kUSD, you may as well spend it for a good reason. If you are not interested in this BS, why do you read it, and why do you post?

Best regards
Erik


Six pages of this bs, seriously? Don't you have anything better to do, you know like go out and take some pictures maybe?
Title: Re: 16 Bit Myth
Post by: bjanes on January 01, 2012, 04:03:44 pm
From several dealers I get the argument that the 16 bits is what gives MF superior file quality. Well, here on LuLA and other forums that assertion is disputed.  That, at least to me, is very confusing and I have spent quite some time trying to figure out what is really the case. Now since this turns out not to be exactly right, I naturally become even more inclined to carefully research other claims about MF too.  Hence, I suggest using some other phrase which conveys accurate information. It would make life a lot easier for me as a researching customer.

Rather than reading advertising claims, it is more advisable to understand a bit of basic science and mathematics. A good place to start is the post by Emil Martinec: Noise, Dynamic Range and Bit Depth (http://theory.uchicago.edu/~ejm/pix/20d/tests/noise/noise-p3.html). Here is an important excerpt that just follows figure 18:

"Quantizing the signal from the sensor in steps much finer than the level of the noise is thus superfluous and wasteful; quantizing the noise in steps much coarser than the level of the noise risks posterization. As long as the noise exceeds the quantization step, the difference between the coarser and finer quantization is imperceptible. As long as noise continues to exceed the quantization step in post-processing, it doesn't matter how one edits the image after the fact, since any squeezing/stretching of the levels also does the same to the noise, which will always be larger than the level spacing no matter how it is squeezed or stretched. On the other hand, quantizing the signal in steps coarser than the noise can lead to posterization. Ideally, the noise should slightly exceed the quantization step, in order that roundoff errors introduced by quantization are negligible, and that no bits are wasted in digitizing the noise."

DXO has a good section on Tonal Range (http://www.dxomark.com/index.php/About/In-depth-measurements/Measurements/Noise):

"Tonal range is the effective number of gray levels the system can produce. This measure has to take noise into account (indeed, a very thin gray-level quantization is irrelevant if the quantization step is much smaller than noise). The standard deviation of noise can be viewed as the smallest difference between two distinguishable gray levels."


For those who do not like to do integral calculus, consider the following example. Many MFDBs have a full well capacity of 50-60 thousand electrons and the readout is done with a 16 bit ADC, which can theoretically resolve 2^16 or 65536 levels. For simplicity of calculation, assume the full well is 65536 electrons, so each ADC number represents one electron.

A mid gray tone would have a DN (data number) of 0.18*65536 or about 11,800 representing 11,800 electrons. The standard deviation of the shot noise (which places an absolute limit on noise) would be sqrt (11,800) or about 109 electrons representing 109 ADC levels but only one distinguishable gray level. The process could be extended over the range of the sensor to obtain the total number of distinguishable levels. For the Phase One IQ 180, the tonal range is 8.52 bits screen and 10.19 bits print, corresponding to 367 and 1168 levels respectively, a far cry from 65536.
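The worked example above, including the extension over the sensor's range, can be sketched directly (a photon-shot-noise-only toy model; real sensors add read noise, so actual figures differ):

```python
import math

full_well = 65536                   # simplified: 1 DN = 1 electron
mid_gray = round(0.18 * full_well)  # ~11,796 electrons at mid gray

# Shot noise std at mid gray: one distinguishable level spans ~109 ADC counts
shot_noise = math.sqrt(mid_gray)

# Total distinguishable levels: count one level per noise std across the
# whole signal range (the sum approximates the integral of dS / sqrt(S),
# which is about 2 * sqrt(full_well))
levels = sum(1.0 / math.sqrt(s) for s in range(1, full_well + 1))

print(round(shot_noise))   # ~109
print(round(levels))       # ~511 levels, i.e. about 9 bits - far from 65536
```

The roughly 9-bit result for an idealized 65,536-electron well is consistent with the 8.5-10 bit tonal-range figures quoted below for the IQ 180.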

One thing that the IQ 180 does offer that no dSLR can is 81M high-quality pixels. If you make big prints, that is a decided advantage, but if you limit yourself to 8x10 inch prints, the difference from a good dSLR such as the D3x will likely not be significant. The proponents of MFDBs do themselves no favor when they quote meaningless statistics such as a true bit depth of 16 or a dynamic range exceeding that of a top dSLR by 6 stops. The proponents also claim that the difference is noticeable at a glance even for 8x10 inch prints, but a double-blind study by Michael comparing a good P&S to a MFDB did not bear that out.

Regards,

Bill
Title: Re: 16 Bit Myth
Post by: amsp on January 01, 2012, 04:05:52 pm
On the other hand, if you want to spend like 30 kUSD you may as well spend it for an existing reason.

I agree completely, stop arguing about bits, electronic parts and whatnot, and go try out the system by taking some actual photographs with it before shelling out 30K. Cause I sure hope people are paying that kind of money because they see a real advantage in the results and not because it says 12, 14 or 16 "bits" on a piece of paper.

Title: Re: 16 Bit Myth
Post by: TH_Alpa on January 01, 2012, 04:20:35 pm
That was my experience when demonstrating backs and cameras: potential customers usually don't blindly believe brochures, but wish to see with their own eyes and/or test things themselves.

I think everybody with a bit of sense would do that. And doing real-life tests will definitely show the superiority (wherever this superiority comes from) of a 22 MPx digital MF back, still today, nearly a decade after its introduction. And my guess here is that one does not buy such a capture medium to output only 8x10" prints.

Thierry

... Cause I sure hope people are paying that kind of money because they see a real advantage in the results and not because it says 12, 14 or 16 "bits" on a piece of paper.
Title: Re: 16 Bit Myth
Post by: ErikKaffehr on January 01, 2012, 05:09:47 pm
Hi,

I wish more examples were posted on this issue; I have seen very few actual images available as raw files. Miles Hecker had a few when he tested the Pentax 645D, Imaging Review has some P645D files, I found three raw images from the P65+ available at Phase One (a good demo, by the way) and two images from the Leica S2 from Diglloyd. That's about it ;-(

Tim Parkin has posted a very good review which is useful, but it was more about film, and there are no raw images.

Best regards
Erik


I think everybody with a bit of sense would do that. And doing real-life tests will definitely show the superiority (wherever this superiority comes from) of a 22 MPx digital MF back, still today, nearly a decade after its introduction. And my guess here is that one does not buy such a capture medium to output only 8x10" prints.

Thierry

Title: Re: 16 Bit Myth
Post by: Steve Hendrix on January 01, 2012, 06:38:52 pm
steve, the only reason this 16 bit myth has so much relevance is that none of the manufacturers and sellers has ever admitted, in the various discussions about it (e.g. in forums like this one), that their MF backs aren't 16 bit. as happens now again. rounds and rounds are made without a clear statement, even though the Dalsa data sheet couldn't be clearer..... and it has been posted for years now. how can you say that you can't see clearly whether the a22 (and others) are 12, 14 or 16 bit? it's very clear and easy to see, isn't it?
next week the 16 bit statement will come up again and the same discussion will go on and on, creating this myth which, you say, is the reason you have to call it 16 bit, because it explains so nicely the quality difference between mf and smaller formats.  r.



Happy New Year everyone. Now that my head has cleared a bit.... I have to admit to a brain cramp in my posting about the bit depth of the sensor itself, as that is not what I meant when I brought up the question of 14 bit vs 16 bit, Kodak, Dalsa, etc. I was discussing the processing and the ADC conversions themselves, which in the case of the Leaf product (as Doug has also stated) are 16 bit (this is true). I haven't spoken with Doug, so I'm not sure how the term "true" was intended. But again, in the case of the ADC, I believe there may be imaging products that have 14 bit ADCs while the data sheets themselves state 16 bits. I'm not sure of this, though, and there's a very good reason for that: frankly, when it comes to determining the right product for a user, it is far from an important statistic to me. There are indeed certain enhancements that the 16 bit processing and 16 bit ADC create for the Leaf product that positively affect the outcome of the image quality. No one can definitively prove that they don't, because your only argument would be scientific or theoretical - you are not part of the development team at Leaf Imaging. The term "True 16 Bit" could certainly be treated as an accurate statement, but only if one understands that the reference is to the processing chain and the ADC. From that standpoint, we're more than happy to omit it, given the limited 10-character space for the term.

The ramifications of what Leaf does with this architecture and how it affects their image quality (and in particular compared to a competitor's image quality) would make for a great stand alone article. It is relevant. It is not necessarily appropriate as the emblazoned #1 reason to choose a Leaf product. But then again, that is quite a subjective choice, is it not? Does large sensor size matter to everyone? No. If someone desires more depth of field, or sharper results from the edge of their frame, perhaps not. Etc, etc.

So, at the end of the day (and this far too long thread), with all of the concern over 16 bit vs 14 bit, etc., what matters most is how someone intends to use the product and what the final image looks like. And that is what is discussed with a client. We, Capture Integration, are not guilty of purveying the 16 bit anything. We grabbed a statistic and posted it as a differentiating feature (which it may very well be, in terms of what it allows the final image quality to become). I would like a show of hands - how many here have purchased from us and had discussions which were reduced to "Leaf has 16 bits, so it is superior, and that is why you should buy it"?

What I know is that when someone calls us for information on the potential purchase of a medium format digital back, the resulting initial discussion lasts an average of 40 - 80 minutes, sometimes longer. And in that time, what 16 bits as a statistic does is rarely discussed, much less even mentioned. Friday night I left the office at 8:30 pm on our last business day of the year because, during the course of the day, in addition to everything else that was happening, I spent several hours with a client picking up a system, going over some final quick-start tips, and took 3 phone calls that lasted 40 - 60 minutes each. One call was from a client who had purchased elsewhere and was wondering about the quality of some of the lenses he had been sold that didn't seem to hold up. With nothing to gain, other than the hope he might choose to do business with us in the future, we spent nearly an hour discussing real-world performance of various lenses. No technical statistical data came up once. And that is how we do business. We're sorry that, in advertising a product, the actual spec from the manufacturer was used and treated as a significant feature (regardless of whether you believe it is one, or can easily measure it).

I just find it astonishing that stating a statistic as a feature (in a questionable or vague context, granted) turns into an attack on the integrity of how we do business as a company. While I understand that perhaps some of you have had experiences with dealers who relied on simple technical specifications from data sheets to sell you a product, to broad-brush our company with the same stroke is insulting. We'll take part of the blame for this very rare occasion of a questionable attention-grabbing banner on a pretty damn good deal for a Leaf Aptus 22. But there is an abundance of evidence available that points to how we do business as a company; some may just not realize it (and we certainly strive to improve), while some have simply taken the opportunity to randomly trash us.


For Thierry -

Yes, Thierry, my friend, as a former employee of SinarBron, I can attest to the tradition of Sinar being very literal in their documentation. If the Sinarback 54H, being a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? This product used the same sensor as the Leaf Aptus 22, yet on the Sinar data sheet it states 16 bit per channel.

http://www.sinar.ch/en/products/digital-backs/146-sinarback-emotion-54-lv



Steve Hendrix

Title: Re: 16 Bit Myth
Post by: digitaldog on January 01, 2012, 07:27:44 pm
Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14 bit, the other 16-bit, how could we know the extra 2 bits were at all useful? Plus it seems there is no agreement what a 14-bit or 16-bit product really is (where and how those bits get created or processed).

It is obvious we can have two different camera systems and examine the files and decide which we believe is superior. On print, screen, using some raw analysis, whatever. There seem to be dozens, maybe hundreds, of different aspects of a product's design and implementation that contribute to the quality.

It seems the marketing departments of many companies would like to suggest (a kind term) that their product produces superior quality based on the number of bits. I can’t comprehend how they can prove their case.

Quote
I just find it astonishing that because we stated a statistic as a feature (in a questionable or vague context, granted) turns into an attack on the integrity of how we do business as a company.

Attacks are not warranted. But neither is promoting a feature that may not be a feature, especially when it is vague and questionable. Now if any company can set up a truly scientific test that proves two more bits per color alone produce superior image quality that one can actually see, that would be interesting, educational and useful. I suspect it would be an expensive test to produce.
Title: Re: 16 Bit Myth
Post by: Camdavidson on January 01, 2012, 08:09:15 pm
Capture Integration are the good guys.  I have purchased kit from them and asked far more questions of Steve, Dave and Doug than the value of the business they have gotten from me would justify.

I've known them for years and have purchased a Phase back, an Eizo monitor and a few other goodies.  I've had many conversations with Doug about Capture One and how to integrate Media Pro into my workflow.  (I've used C1 for years.)

They have ALWAYS been straight shooters about software and gear.  Doug has increased the knowledge of C1 through the blog posts and his noise reduction techniques are excellent.  What does not go reported is how Capture Integration has pleaded the case for photographers who use C1.  They have gotten Phase to listen to what shooters need and to incorporate a few of these desires into the software.  I know of no other dealer who cares as much about their clients and their success as Capture Integration.

They know their stuff.

Cameron Davidson
http://www.camerondavidson.com
Title: Re: 16 Bit Myth
Post by: uaiomex on January 01, 2012, 08:13:33 pm
I've been given the 16-bit answer more than once before. But I was always under the impression that MFDBs had better quality mainly because of the bigger sensor. The other reasons, in decreasing importance: superior components, circuitry and wiring; 16 bit, now on trial; and the CCD. (Just my belief.)

Later on, when I finally got my first FF DSLR, a 5D, I immediately saw the superior image quality over the APS cameras of that time. More or less, everything was the same with the exception of the bigger sensor (more than twice the area) and just a few more megapixels. This to me was enough evidence that sensor size was the most important thing regarding IQ when comparing similar technologies (which is usually the case). Digital is analogous to analog in too many ways.

The 5D2 arrived with 14 bit. Colors are better and I suspect the bit depth has something to do with it. Overall the Mk II is better than the Mk I, but the jump in quality was not as big despite the large increase in megapixels.

What I'm trying to say is that MFDB manufacturers should focus their promotional efforts on sensor size. For IQ, approach and workflow, the bigger sensor is their biggest asset.

Just my 2 centavos
Eduardo

Title: Re: 16 Bits from ADC, only analog charges or voltages from CCD sensors
Post by: BJL on January 01, 2012, 08:23:42 pm
For Thierry -

... If the Sinarback 54H, being a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? This product used the same sensor as the Leaf Aptus 22, yet on the Sinar data sheet it states 16 bit per channel.

http://www.sinar.ch/en/products/digital-backs/146-sinarback-emotion-54-lv
I am not Thierry, but I think I can partly answer this, having read the data sheets.
Kodak's data sheets never mention bit depth of output, for the good reason that Kodak CCD sensors output an analog signal (a charge). Kodak instead talks in terms like DR of 12 stops or 72dB or 4000:1.
(IIRC, Dalsa sensors do an extra step, charge-to-voltage conversion, and so specify output in voltage instead of charges, but still analog, and still using specs like DR of 4000:1 or 72dB.)
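Those data-sheet figures are the same quantity in three units; a minimal sketch (Python, purely illustrative) of the conversions between a ratio, decibels, stops, and the linear ADC bit count needed to span it:

```python
import math

# A sensor's engineering dynamic range is the ratio of the full-scale
# signal to the noise floor; data sheets quote it in different units.
ratio = 4000  # e.g. the 4000:1 figure from a CCD data sheet

db = 20 * math.log10(ratio)          # decibels (20*log10 for a signal ratio)
stops = math.log2(ratio)             # photographic stops (powers of two)
bits = math.ceil(math.log2(ratio))   # linear ADC bits needed to cover it

print(f"{ratio}:1  =  {db:.1f} dB  =  {stops:.1f} stops  ~  {bits} bits")
```

So 4000:1, 72 dB and "about 12 stops" are one and the same spec, which is why a 12-bit-class sensor can sit behind a 14- or 16-bit converter.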

That Sinar page seems to describe 16-bits as the "digitalization" of the back, not of the sensor.
Title: Re: 16 Bits from ADC, only analog charges or voltages from CCD sensors
Post by: Steve Hendrix on January 01, 2012, 09:36:02 pm
I am not Thierry, but I think I can partly answer this, having read the data sheets.
Kodak's data sheets never mention bit depth of output, for the good reason that Kodak CCD sensors output an analog signal (a charge). Kodak instead talks in terms like DR of 12 stops or 72dB or 4000:1.
(IIRC, Dalsa sensors do an extra step, charge-to-voltage conversion, and so specify output in voltage instead of charges, but still analog, and still using specs like DR of 4000:1 or 72dB.)

That Sinar page seems to describe 16-bits as the "digitalization" of the back, not of the sensor.


Thanks BJL - yes, that was my assumption as well.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on January 01, 2012, 09:48:04 pm
Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14 bit, the other 16-bit, how could we know the extra 2 bits were at all useful? Plus it seems there is no agreement what a 14-bit or 16-bit product really is (where and how those bits get created or processed).

It is obvious we can have two different camera systems and examine the files and decide which we believe is superior. On print, screen, using some raw analysis, whatever. There seem to be dozens, maybe hundreds, of different aspects of a product's design and implementation that contribute to the quality.

It seems the marketing departments of many companies would like to suggest (a kind term) that their product produces superior quality based on the number of bits. I can’t comprehend how they can prove their case.

Attacks are not warranted. But neither is promoting a feature that may not be a feature, especially when it is vague and questionable. Now if any company can set up a truly scientific test that proves two more bits per color alone produce superior image quality that one can actually see, that would be interesting, educational and useful. I suspect it would be an expensive test to produce.


Vague from the standpoint of how bit depth is generally interpreted versus what Leaf's use of the term specifically means. The 16 bit process and ADC of the Leaf Aptus 22 do play an integral role in the resulting quality in a specific variety of situations. From that standpoint, the fact that Leaf states this as 16 bit means that it is warranted for the quality they deliver. It doesn't necessarily have to be bragged about, which is where the grey area enters as to whether it is a feature or not. The key is whether the end result of the application of Leaf's 16 bit process and ADC provides an advantage over competitive products. Leaf feels that it does. If it does, there's no question it qualifies as a feature. If it does not, then it is simply the means to their end result. The fact that it is not spelled out specifically how the result benefits from this is what brings into question stating it as a feature.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: digitaldog on January 01, 2012, 10:19:53 pm
The 16 bit process and ADC of the Leaf Aptus 22 does play an integral role in the resulting quality under a specific variety of situations.

That’s been proven how, specifically and only due to the higher bit depth? The identical process at 14 or 12-bit has been demonstrated to be inferior?
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on January 01, 2012, 10:27:29 pm
That’s been proven how, specifically and only due to the higher bit depth? The identical process at 14 or 12-bit has been demonstrated to be inferior?


There's nothing to prove. Leaf uses this process toward their end result. It doesn't have to be comparative. That's the point I'm making. For Leaf to produce the quality that they do - regardless of where it stands comparatively - this is part of their process. It contributes to the end result, particularly under certain conditions, but is not solely responsible for it, the same as with any other manufacturer.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: digitaldog on January 01, 2012, 10:36:04 pm
There's nothing to prove.

Of course there is, if marketing or others are going to boast about 16-bits being somehow superior to 12-bit or 14-bit captures. From the pages of discussions here, it is still unclear what a ‘true’ 16-bit capture implies. The quality may very well be there, I don’t question that. I question that it is the higher bit depth alone that produces this quality. Which is why some (myself included) question the need to market a higher bit depth, and then use the term “true” in the same context.

Now if you had two captures that were identical except for bit depth and you could demonstrate the higher bit depth produced superior quality, you’d have few if any naysayers.
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on January 01, 2012, 10:38:28 pm
Question. Unless we had two identical camera systems, down to the lens and setup, but one was 14 bit, the other 16-bit, how could we know the extra 2 bits were at all useful?
This is simple: develop the same RAW file in its original 16 bits, then develop it again with bit decimation prior to demosaicing: 15 bits, 14 bits, 13 bits,... until we can see some damage to image quality. If IQ is damaged at 15 bits, then those 16 bits were useful; otherwise they were marketing bits.
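The decimation step described above can be sketched in a few lines. This is a hypothetical illustration (Python, with simulated samples standing in for a real MFDB RAW file), not anyone's actual tool:

```python
import numpy as np

def decimate(raw, native_bits, target_bits):
    """Zero out the lowest (native_bits - target_bits) bits of each sample,
    emulating a lower-bit-depth capture before demosaicing."""
    shift = native_bits - target_bits
    return (raw >> shift) << shift

# Simulated 16-bit RAW samples standing in for a real file.
rng = np.random.default_rng(0)
raw16 = rng.integers(0, 2**16, size=(4, 6), dtype=np.uint16)

raw14 = decimate(raw16, 16, 14)
# The truncation error is bounded by 2**(16-14) - 1 = 3 counts; if that
# is below the sensor's noise floor, the developed images look identical
# and the extra bits carried no picture information.
print(int((raw16 - raw14).max()))
```

Repeating this at 15, 14, 13... bits and developing each result is exactly the test proposed: the first bit depth at which the developed image visibly degrades marks where the real information ends.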

Unfortunately this thread has already reached 7 pages with 0 MFDB RAW files posted. But to be honest I think we don't need to do the test; noise measurements suggest those 16 bits are unnecessary in today's MFDBs, so they would fall into the useless category.

An example, this is a particularly noisy camera (Olympus EP-1) tested at a particularly noisy ISO setting (ISO1600):

Original RAW file (12 bits):
(http://www.guillermoluijk.com/article/rawbits/ep1_1.jpg)

Severely decimated RAW file (8 bits):
(http://www.guillermoluijk.com/article/rawbits/ep1_2.jpg)

The second image even looks better! It's just an illusion due to some noisy pixels clipping to black, but the important conclusion is that the 12-bit version is by no means better than the 8-bit decimated version, so the extra 4 bits were useless in this case.

Regards
Title: Re: 16 Bit Myth
Post by: Steve Hendrix on January 01, 2012, 11:10:10 pm
Of course there is, if marketing or others are going to boast about 16-bits being somehow superior to 12-bit or 14-bit captures. From the pages of discussions here, it is still unclear what a ‘true’ 16-bit capture implies. The quality may very well be there, I don’t question that. I question that it is the higher bit depth alone that produces this quality. Which is why some (myself included) question the need to market a higher bit depth, and then use the term “true” in the same context.

Now if you had two captures that were identical except for bit depth and you could demonstrate the higher bit depth produced superior quality, you’d have few if any naysayers.


I'll say again: other than our inadvertent prioritizing of 16 bit as a top tier feature, I don't really see a lot of boasting going on anywhere. Perhaps there has been in the past, I don't know. Like I've said, I have never really concerned myself as much with how as with what. Check Leaf's website; there's barely a mention of 16 bit anything. Maybe in the past manufacturers have made a big deal out of it, I don't know. But is this really the case today?

I think part of what is at play here is that 16 bit is being evaluated as a singular quality specification, which in Leaf's case it sort of is and sort of is not. It is more a part of the process that solves several problems for them in digitization and enables some enhancement, likely specific to the way that they produce an image with the sensors that they work with. If we all understand and accept that, then really, there is no issue here. That is, unless someone is trumpeting 16 bit as, in and of itself, a verifiable feature of advantage, and that is not the case - at least it should not be - unless there is a qualitative analysis of how the 16 bit process for Leaf provides a competitive advantage and in what situations.

Absent that, it simply becomes a situation similar to how a floor is swept. I use a specific broom and have a specific technique for when I sweep the floor, and my wife has a different technique. She may use a different broom and call it a 14-bit broom; I may use a 16-bit broom and a different technique (for instance, I turn on 2 lights in a room and she only turns on 1), and this allows for a better result for certain things in certain situations. The end result from each may be a completely clean floor, but that is where the real difficulty lies: in determining who has the cleaner floor. From that standpoint, unless you believe I produce a cleaner floor, it doesn't matter that my broom is 16 bit; it is only the means to my end. What matters is the end.

I will also say that the process of how Leaf produces a digital file, and the role that their 16 bit process plays in it, is, I believe, much more complex than most understand, judging from the discussion here.


Steve Hendrix
Title: Re: 16 Bit Myth
Post by: TH_Alpa on January 02, 2012, 05:12:25 am
Hi Steve,

Please note that my intervention was solely meant to give some historical information, and not at all meant to criticize or attack any manufacturer or dealer.

In this respect, it wasn't meant either to say that Sinar gave right and true/honest (or wrong) information, purposely or not, concerning products after the SB 54. I was speaking in this instance about the fact that Sinar published the SB 54 with 22 MPx as a 14 bit digital back, as did some other manufacturers, while others said 16 bit.
I believe, from what I remember from back in 2002, that the 14 bits referred to the sensor, not the A/D converter, but I could be wrong. I simply remember that the SB 54 does produce 16 bit files (for obvious reasons), and that I was wondering why others mentioned 16 bit. My understanding at that time was that some manufacturers were speaking of the sensor while others were mentioning the A/D converter.
That is not to accuse anyone, nor to say that it was marketing reasons.

Interestingly enough, and again for this 22 MPx back, some manufacturers advertised the nominal ISO at 25, like Sinar, others at 50. Here also I can remember that when I was doing ISO tests with a Sinarback and another back rated at ISO 50, both set to their given nominal ISO, the results in terms of density of the image were practically identical when using the same f-stop. Again, this is pure information, not meant to criticize anyone. Besides this, there wasn't any ISO standard for digital backs, and there probably still isn't one.

Now all this being said, it is certainly not the dealers' mistake or responsibility what the publications contain. They have the information from the manufacturers and rely on it. Therefore I don't understand why CI is being criticized so strongly in this thread.

Best regards
Thierry

For Thierry -

Yes, Thierry, my friend, as a former employee of SinarBron, I can attest to the tradition of Sinar being very literal in their documentation. If the Sinarback 54H, which uses a Kodak sensor, is stated as 14 bit on the data sheet, is this referring to the native bit depth of the sensor, or the ADC? What about the Sinarback 54LV? This product used the same sensor as the Leaf Aptus 22, yet on the Sinar data sheet it states 16 bit per channel.
Steve Hendrix
Title: Re: 16 Bit Myth
Post by: Radu Arama on January 02, 2012, 06:39:35 am
Hi Erik and Happy New Year to all!

In the Focus Numerique test here: http://www.focus-numerique.com/test-1160/moyen-format-pentax-645d-bruit-electronique-12.html in the first third of the page, just under the photos, they wrote "Vous pouvez également télécharger les fichiers Raw (100-200-400-800-1600 ISO)" ("You can also download the Raw files (ISO 100-200-400-800-1600)"), and those are links to DNG files used to show the noise throughout the ISO range.

Hope it helps,
Radu

Hi,

I wish more examples would be posted on the issue; I have seen very few actual images available as raw files. Miles Hecker had a few when he tested the Pentax 645D, Imaging Review has some P645D, I found three raw images from the P65+ available at Phase One (good demo, by the way) and two images from the Leica S2 from Diglloyd. That's about it ;-(

Tim Parkin has posted a very good review which is useful, but it was more about film and has no raw images.

Best regards
Erik


Title: Re: 16 Bit Myth
Post by: mediumcool on January 02, 2012, 08:58:36 am
I couldn’t be bothered reading all of the posts in this thread because, frankly, there seems to be a lot of angst being intermittently expressed; lots of barrows being shoved around in a not-always-polite way.

My perspective (and knowledge) is this: I have a “35” digital SLR that claims to use 12 bits per colour channel; I also have an Aptus back that claims to use 16 bits per channel.

I know, from experience, that I can pull enormous amounts of detail and colour from severely under-exposed 16-bit files, but the headroom of the “35” is much much less (highlights blow if I don’t expose it as if it were Kodachrome). Applies to C1 and Photoshop/Camera Raw, and any other RAW converter I have tried. My Panasonic GF1 seems worse again. Why? Dunno.

Is all the carry-on about the difference between 14-bit and 16-bit colour? Focusing on this alone does not help account for differing internal workflows, A-D converters, hardware and software. BTW, I shoot only RAW, even for happy snaps.

I suppose what I am trying to say is “get out there and take/make the photographs you are capable of creating, with the equipment that both suits you and is affordable” rather than being obsessed by bit depth. I see obsessive gear behaviour elsewhere, where folks think only one camera brand [pick one!] can do a good job for them. When I was an adolescent, I wanted a Nikon; nothing else was good enough! How shallow. I am now nearing 60, and simply want to use the time I have left to make interesting and perhaps useful photographs, whether for monetary profit or for “pleasure profit”.

Phew!

Todd Rundgren and Darryl Hall (http://www.youtube.com/watch?v=1pLTD23l468) with a thought for the day, and New Year. 0.02.

Title: Re: 16 Bit Myth
Post by: cunim on January 02, 2012, 11:17:38 am
Sitting on my shelf I have an old 16-bit camera.  This thing is monochrome, has 30 um pixels (1K x 1K), reads out at 1kHz, and is cryogenically cooled by a gas refrigeration unit.  Cost was about $100K back in the day and yes, it delivered 16 bit precision with luminescent targets, careful calibration, and adequate consideration of system MTF.  Great for star fields or gene expression studies.  Useless for photography.  Point is, high precision imaging and photographic imaging are completely different beasts.  That's why this discussion keeps chasing its own tail.  There is nothing wrong with specifying high precision parts for a photographic imager, but I have never seen a photographic acquisition paradigm that delivers even 14 bits.
Title: Re: 16 Bit Myth
Post by: asf on January 02, 2012, 12:50:13 pm
After reading the pages following my original posting I see the futility of my query and apologize for pointing my finger at CI through their ad and subsequently by implication painting them and other dealers all with the same brush.

I had noticed this supposed 16 bit superiority claim come up quite often recently, almost always in a snobbish and/or misinformed manner, and was wondering why this one "myth" of MFDB continues. Then I saw that ad and wondered whether dealers were still promoting this oversimplified and questionable viewpoint, mostly because I was surprised to see it included in writing from a dealer as highly esteemed as CI.

When I bought into MFDB and dropped that first $50k+ in one afternoon I didn't do it because of 16 bit anything. That didn't even come into it. But I do remember it being mentioned, and being told the 16 bit MFDB back captured a lot more info than 14 bit FF 35 cameras. Even some digital techs repeated that info to me.
After a while I learned more about how these things work and I started to notice when people would insist the extra 2 bits made a big difference in the amount of info captured and the quality of the file, usually while defending/promoting their MFDB choice. Somehow this idea that there is some "16 bit" quality in certain capture devices continues.

No, it doesn't matter in the end. My dealer didn't lie to me; most likely he repeated the sales info he knew. He's not a scientist or a technician, he's a salesman and a good guy. I'm still a customer. When I bought my first digital system I didn't know as much as I do now; my dealer knew more than me and didn't steer me wrong. Now I (think I) know more about some of this than he does, and I see that he's a salesman who knows what he needs to know to sell the stuff.

Should dealers "know better" than to use sales talk like this? Should buyers "know better" than to believe all sales talk?

"16 bit" can mean many things. The various simplifications of these complexities leads to much of the confusion. Sales talk is simplification for the most part, esp in a small banner ad.

We're all lucky to have good dealers around like CI, people who stand behind what they sell and you can call when you have problems.
Title: Re: 16 Bit Myth
Post by: Stefan.Steib on January 02, 2012, 02:06:27 pm
I think it should be stated that there are actually cameras available which do gain an advantage from 16 bit processing:
these are e.g. made by Andor, are cooled down to -100 degrees Celsius, have single-electron sensitivity, pixel sizes of 8-24 μm, a max resolution of 1024x1024, and sensors mounted in vacuum. See here: http://www.andor.com/scientific_cameras/ixon_emccd_camera/897_single-photon/
At that stage there is enough information to use high bit depth - "True 14 bit @ 10, 5, 3 & 1 MHz readout rate (16 bit @ 1 MHz)" - see this excerpt from the brochure, page 47:

"....5. It is clear that the actual sensor Dynamic Range only exceeds 14-bits at 1 MHz, through either EM or a conventional amplifier. Therefore, it is at 1 MHz that we require an option to match this higher dynamic range output with a scientific grade, noise free 16- bit A/D digitization. The iXon3 is uniquely designed to do just that, making use of a real scientific grade A/D that is optimized for 1 MHz readout.
Note: There is a direct relationship between readout noise and maximum dynamic range at a given readout speed. Lower readout noise affords higher dynamic range....."
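The relationship in that excerpt - read noise sets the dynamic range, which sets how many ADC bits are actually useful - can be illustrated with a rough sketch. The electron counts below are illustrative assumptions, not figures from any data sheet:

```python
import math

def useful_bits(full_well_e, read_noise_e):
    """Dynamic range in bits: log2(full-well capacity / read noise),
    both expressed in electrons."""
    return math.log2(full_well_e / read_noise_e)

# Deep-cooled scientific CCD: large well, ~1 e- read noise (assumed values).
print(f"scientific camera: {useful_bits(80000, 1):.1f} bits")
# Typical photographic CCD: similar well, tens of electrons of read noise.
print(f"photographic CCD:  {useful_bits(60000, 15):.1f} bits")
```

Only when the noise floor approaches a single electron does a 16-bit converter digitize real signal rather than noise, which is why these scientific cameras can exploit 16 bits while photographic chips generally cannot.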

Hope this helps a bit. I think this proves that there is a use for it. On the other hand, it shows that for actual commercial photography chips it has close to no relevance.

Greetings from Munich
Stefan
Title: Re: 16 Bit Myth
Post by: bcooter on January 02, 2012, 03:17:16 pm
This type of conversation has gone on for about 8 years and honestly, it's all old school thought.

Maybe in the days when we were transitioning from film to digital, clients asked questions about file size, or photographers were looking for that holy grail of a digital camera that would perform like film, but today . . . in today's world I don't think any of this is relevant.

Consequently, 8 years ago there was a whole world of digital expert camera dealers that offered specialized service, which was needed because everyone was so new to the process.  Same with digital techs and consultants.

Today it's different.  Sure you need a good dealer, and Dave Gallagher's Capture Integration is a good company, though today there are a lot of good dealers that will stand behind their products.  Samy's in LA, Digital Transitions in New York, and if I really looked around there are probably another half dozen or so in the U.S. that know their stuff.  I know because I've bought from most of them with good results.

The difference with CI is they have more of an online presence than most dealers, and I think some people appreciate it. I assume it must enhance CI's sales or they wouldn't expend the effort, though as we've seen, sometimes it goes a little sideways.

That's just the way the world works.  Not everyone is always going to like everyone.

To some that matters, but most professionals couldn't care less about forums, bit depth, file size or how to tether a camera to a computer, because the digital process is now standard fare and we all know how to do it.

You can take a Nikon, Canon, Hasselblad . . . almost any professional still camera and if the content is compelling any client at virtually any level will eagerly accept the files if the work is done professionally.

If you're a professional you know your stuff and don't need to make a dozen phone calls to figure out the firewire cord is broken.

My suggestion is buy what you want, enjoy what you have and don't worry about the camera, because the only really great camera is the one you like.

IMO

BC

Title: Re: 16 Bit Myth
Post by: deejjjaaaa on January 02, 2012, 04:23:34 pm
I know, from experience, that I can pull enormous amounts of detail and colour from severely under-exposed 16-bit files,

probably the infamous "6 stops of DR advantage" (c) Mark Dubovoy

 ;D
Title: Re: 16 Bit Myth
Post by: fredjeang on January 02, 2012, 05:43:21 pm
Myths or not, 8, 10, 14, 16 or 50000 Bits at -200 on the surface of Titan at 1 Mhz or at 0.00003,

It seems that some don't enjoy photography very much. What are you looking for? I'd like to know.

As if our hobby-profession had to be single-camera.

Enjoy the diversity of owning any kind of equipment, regardless of numbers, graphics, bits, resolutions etc. ... simply enjoy photographing with the widest range of different gear you can afford.

That's how I see the thing.

Look at Michael Reichmann. He touches everything, from MF to minicams like the NEX. But more importantly, he enjoys and produces images with whatever he has.


I'm very much into micro technology now, small cams. I believe that they are the future. But I own as many cameras, systems and lenses as my budget can afford at the moment. I'd get a Phantom, a Leica S, a NEX-7... etc... whatever.


Enjoy life. It goes as fast as the speed of light. When you get older, do you want to remember the long technical discussions you had trying to prove something? Or the day you were banned from the forum after insults? Or wouldn't you rather remember when you were enjoying and having fun with your cameras and producing imagery you were proud of?

http://www.youtube.com/watch?v=OpIZ_S38A_0
Title: Re: 16 Bit Myth
Post by: Rob C on January 03, 2012, 01:32:58 pm
Fred-

Thanks for the link. I have seen the series before, in the dim and distant past, but I had also forgotten just how innovative Avedon really was. I seem to keep believing that it all changed in the late 50s, but of course, it was well before that.

I also think it's very simple for somebody in his position (then!) to make easy remarks about not letting anything go, etc. If I had as much hair I might even feel tempted to make such statements myself... and believe them. Unfortunately, not many of us manage to keep going at that pace and at that level - if we even get near to it in the first place. It's often thought that a photographer gets ten good years - maybe fifteen if lucky. Nothing to do with him, everything to do with his clients and the new, cheaper, exciting offers that come into the office every day. How do you fight off never-ending competition without, one day, falling down with exhaustion or burnout?

Watching, I feel this urge to lose fifty years and start all over again... yeah, right.

But on the other things you mention – yes, it should be about doing not dreaming.

A single camera would be nice, and for the keen amateur it’s a reality, or at least a possibility. The pro doesn’t have the luxury unless he’s very specialized; even now, retired, I can’t be happy with anything much that I have, not because it isn’t good – I think it is – but age and simplicity demand that I go light, and that’s the glitch: what’s the right light? And without breaking the bank? One thing I guarantee: the answer is not a cellphone!

It could be an M6, but it isn't going to be, for the simple reason of cost. Yes, I could get one, but why, when it won't ever get the chance to pay for itself? There was a time when bank interest was real and you could budget your life; now you watch the numbers get lower on every statement because damn all comes in - it all goes on bloody bonus packages to other people.

Ciao -

Rob C
Title: Re: 16 Bit Myth
Post by: amsp on January 03, 2012, 03:45:05 pm
A couple of years ago I went and saw a huge Avedon exhibit, and they were showing that documentary in the museum cinema. I have to say that Avedon's prints in real life are nothing short of breathtaking. I also snagged a beautiful limited-edition book made specifically for that exhibition, which I treasure dearly. All this tech-babbling is about as far from actual photography as you can get.

Title: Re: 16 Bit Myth
Post by: hjulenissen on January 03, 2012, 04:36:38 pm
All this tech-babbling is about as far from actual photography as you can get.
Forum section:
Equipment & Techniques
Thread title:
16 Bit Myth

Did you click here expecting to discuss composition? How do you suggest that the thread topic can be discussed without any "tech babble"?
It seems that some don't enjoy very much photography. What are you looking for ? I'd like to know.
I think it is quite possible to enjoy photography AND try to understand how the tools are working. For some it might even improve their photography if they are able to see through marketing and "guru babble"*) and use their tools efficiently.

-h
*)Claims that are supposed to be accepted at face value due to the experience/status/equipment/... of the source
Title: Re: 16 Bit Myth
Post by: Guillermo Luijk on January 03, 2012, 06:00:47 pm
I'm reading and reading this promised land, hoping I'd get a bonus, a gain, information that could enhance my skills and techniques, a hidden revelation about high-end equipment that we didn't all know before. I'm trying to find it, but I can't.

It seems as if the entire world's objective should be to satisfy your needs. Are you planning to give something in return rather than complaints?
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on January 03, 2012, 06:31:38 pm
So we have the triad. The top 3 periodic topics in which we all learn and progress a lot.
In order of importance for the photographer:
- MF vs DSLR
- the DR topics
- and the bits and resolution ones, in which this one belongs.

I totally agree that skills are the most important thing and also that there is no reason why we all should use the same camera. The question triggering all these discussions is "what camera should it be?".

There are a series of factors underlying this discussion that can make it very frustrating for some people and trigger some comments/discussions that don't appear to belong to the realm of adult conversation.

- We have been thoroughly educated to be consumers, which means prioritizing our current wants over our possible future needs. In other words, we are manipulated into giving up our own free will to manage our wealth, pushed to give it away to acquire something that is likely not to deliver the real-world benefits we once imagined.

- The Internet has given a huge number of people the ability to express themselves without feeling the responsibility that authors used to feel before publishing something that would be read by a large audience. Many people here also don't use their real name, for a variety of reasons. Another side effect is that it exposes to more eyes the old reality that some people are so much richer than others that whatever they buy doesn't make a ripple in their bank account, which means they may not care about the performance of their equipment as much as someone for whom it is the only camera they will buy in the next five years.

- Underlying all this, we have the digital revolution, which has perhaps impacted photography more than any other segment. Besides huge benefits, it has resulted in a huge inflation in the price of equipment, probably unheard of in any other segment of human activity I am aware of. It has also resulted in an acceleration of camera performance such that most DSLRs are now good enough in terms of image quality to handle many jobs that used to be the domain of MF film only.

In this context, what is true, what is a myth? What do we need, what do we want? Can a real man shoot landscape without a MFDB?  ;D

This can make it very confusing for new entrants to decide how much they need to invest in a camera to get the capability they need. In my view, this also explains the somewhat violent reactions of some people around here, who are reacting not just to 16 vs. 14 bits, but to the whole package of things underlying this discussion.

Cheers,
Bernard
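[Editorial aside: the "16 vs. 14 bits" point can be made concrete with a small sketch. The numbers below are purely illustrative, not a model of any particular back or converter; they only show that restating 14-bit A/D output in a 16-bit container adds no tonal information.]

```python
# Minimal sketch: a 14-bit sensor code stored in a 16-bit file container
# gains no new tonal levels; the count of distinct values is unchanged.

levels_14bit = 2 ** 14              # 16384 distinct codes from a 14-bit A/D
levels_16bit = 2 ** 16              # 65536 codes the file format can hold

# Simulate a smooth ramp digitized by a 14-bit converter...
ramp = [i / 999 for i in range(1000)]                     # normalized 0..1
codes_14 = [round(v * (levels_14bit - 1)) for v in ramp]  # 14-bit codes

# ...then left-shifted into a 16-bit container (a common storage layout).
codes_16 = [c << 2 for c in codes_14]

# The container is wider, but the set of distinct values is the same size:
# the shift is one-to-one, so no new tonal steps appear.
assert len(set(codes_14)) == len(set(codes_16))
print(len(set(codes_14)), len(set(codes_16)))             # prints: 1000 1000
```

The real-world advantages Doug describes (converter quality, dark-frame handling, and so on) are therefore about the whole imaging chain, not the bit count of the container.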
Title: Re: 16 Bit Myth
Post by: fredjeang on January 03, 2012, 06:51:13 pm
You know what has really excited me recently? The Red Epic and the Nikon V1. (Footage from the V1 is far from ridiculous; it's a high-performance camera.)

I might actually get the V1 before I get the Red One...

Your points are perfectly valid. (erased my previous post because it was a bit provocative and this thread doesn't need that, imo)

Cheers.
Title: Re: 16 Bit Myth
Post by: BernardLanguillier on January 04, 2012, 02:25:44 am
You know what has really excited me recently? The Red Epic and the Nikon V1. (Footage from the V1 is far from ridiculous; it's a high-performance camera.)

I might actually get the V1 before I get the Red One...

I have been using a J1 for some time now, but I need to explore its capabilities as a video camera further. It works very well for me as a compact camera.

As far as the Red goes... that is far above my budget. I am also not sure that 4K makes sense at this point in time for non-pro applications, but the files sure do look amazing.

Cheers,
Bernard