
Author Topic: Sony Kicks Butt  (Read 423425 times)

Ray

  • Sr. Member
  • ****
  • Offline
  • Posts: 10365
Re: Sony Kicks Butt
« Reply #60 on: June 15, 2015, 08:37:11 pm »

Quote
So, I'm reading this to say that the application of the inverse tone curve during raw generation pushes 14-bit sensor values down to 11-bit stored raw values, which are then inflated back to 14-bit values during raw conversion by applying the tone curve.

It looks like the tone curve is not a continuous function, but a set of discrete multipliers. It favors highlight compression.

I'm wondering, could this be done on the sensor? It would certainly make moving frames off the sensor faster, and it can be done in a single pass on the fly. That would also make it harder to change.

[Sometimes I think Sony loves compression.  Remember the minidisc?  They must have a compression laboratory.]



I feel I'm not qualified to comment on the theoretical significance of these computer processes. I'm not a computer scientist. I'm more concerned with any consequential and noticeable limitations on image quality that might become apparent during the processing of my RAW images in Photoshop.

I always process my images, sometimes cropping extensively to make an image more appealing. For all I know, this concern about the convoluted compression techniques used by Sony might be a storm in a teacup, except for very specialised photographers such as astrophotographers. When I compare the DR of the A7R with that of the Nikon D800 at DXOMark, I see very little difference at base ISO and most other ISOs. However, at ISO 200 the A7R is half a stop worse, which is at the threshold of significance. Then again, that result might be just an anomaly or a quality-control issue.

I see this issue as one of expectations. The Samsung NX1 generated a lot of interest because it had the first cropped-format BSI sensor. There was an expectation that DR would be stellar. I was contemplating ordering that camera for myself, but decided to wait until the DXO results were out.

Now that the DXO results are out, I'm somewhat puzzled by the lack of a stellar DR. Despite having a BSI sensor, the DR of the NX1 (comparing normalised images) is 0.72 EV lower than the much older Nikon D7000, and a whopping 1.44 EV lower than the 'only slightly more recent' D7200, which doesn't boast a BSI sensor.

Logged

TohN

  • Newbie
  • *
  • Offline
  • Posts: 1
Re: Sony Kicks Butt
« Reply #61 on: June 16, 2015, 07:42:33 am »

Two questions/thoughts about the dynamic range given the discussion:

1) This is probably me not understanding but: if the RAW file of most manufacturers is indeed 14 bits, then how are some sensors reported as having more than 14 EV of DR at DXOMark? I can't see how that much range can even be measured.

2)

Quote
"There is nothing unclear about the bit depth of the Sony RAWs. The encoding algorithm used for the Sony a7/a7R RAWs takes the original 14 bit values and maps them to 11 bit space (a bit more on this below) and then the 11 bit values are further compressed into 8 bits per pixel by delta-encoding them in fixed-length 16 pixel blocks as follows: 11-bit minimum value for the block, 11-bit maximum value for the block, 4-bit index of the minimum value in the block, 4-bit index of the maximum value, 14 7-bit deltas from the minimum value for the other 14 pixels. This uses 11+11+4+4+(14x7) = 128 bits per 16 pixels or 8 bits per pixel.

So this reads to me like there's a weird compression algo that limits precision/contrast between certain pairs of pixels but not others.  For example, if there's one 16 pixel block that's at 2^14 bright, and a neighboring one that's 2^0 dark, this would be accurately rendered.  If the DXOMark DR tests were done by taking a big sheet of black paper next to a big sheet of white paper, we'd see 14 bits of DR (or more?  I don't understand if the 11 bits of value + 4 bits of index end up giving 15 bits...).  However, *within* a 16 pixel block, it looks like certain pairs of pixels can only be 7 bits apart.  So the micro contrast/precision might be limited.  I'm having trouble understanding the particular encoding of the "7-bit deltas" as I can't tell if the most or least significant bit values are being recorded.  

On the other hand, maybe in most situations this might not matter: a 16 pixel block is presumably 4x4.  Each unit of 4 pixels is appx 20 microns (please check me) which comes out to about 50 line pairs/mm.  Within each grouping, it'd be 50+ lp/mm.  For most lenses, the MTF contrast ratio is below 50% here, so the lens might have eaten a couple of bits anyway?  Please do check my math!
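A quick numeric check of that estimate, taking a nominal A7R pixel pitch and the 4x4 reading of the block (both assumptions on my part, not anything Sony has published):

Code:
# Back-of-the-envelope check of the lp/mm figure above. Assumes the A7R's
# ~35.9 mm sensor width and 7360-pixel rows; treating a 16-pixel block as
# 4x4 is the poster's reading, not something Sony has confirmed.
sensor_width_mm = 35.9
pixels_across = 7360
pitch_um = sensor_width_mm / pixels_across * 1000   # ~4.9 microns per pixel
span_um = 4 * pitch_um                               # a 4-pixel run: ~19.5 microns
print(round(pitch_um, 2), round(span_um, 1), round(1000 / span_um))
# 4.88 19.5 51 -> about 50 cycles/mm if one 4-pixel span counts as one cycle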

Addendum:
Diglloyd's star-trails example is a perfect case of local high contrast, and it apparently shows large compression artifacts.
« Last Edit: June 16, 2015, 08:34:06 am by TohN »
Logged

jwstl

  • Full Member
  • ***
  • Offline
  • Posts: 149
Re: Sony Kicks Butt
« Reply #62 on: June 16, 2015, 11:24:59 am »

The Samsung NX1 generated a lot of interest because it had the first cropped-format BSI sensor. There was an expectation that DR would be stellar. I was contemplating ordering that camera for myself, but decided to wait until the DXO results were out.

Now that the DXO results are out, I'm somewhat puzzled by the lack of a stellar DR. Despite having a BSI sensor, the DR of the NX1 (comparing normalised images) is 0.72 EV lower than the much older Nikon D7000, and a whopping 1.44 EV lower than the 'only slightly more recent' D7200, which doesn't boast a BSI sensor.



The DXO numbers also show the NX1 outperforms all of the D7xxx series in low light performance (slightly in some cases). So, which is supposed to benefit most from BSI: DR or Noise?
Logged

michaelmph

  • Newbie
  • *
  • Offline
  • Posts: 3
Re: Sony Kicks Butt
« Reply #63 on: June 16, 2015, 11:57:14 am »

It certainly looks very tempting, though I have been slightly put off by comments on other sites about Sony's poor customer care/relations. Also, what do you make of Sony's compressed RAW files? Is this a deal breaker? Having been heavily invested in Canon L glass over recent years, the fact that the Sony can use the best of those lenses with an adapter is very impressive. Very pleased that they've sorted out the noisy/vibrating shutter.
Logged

LKaven

  • Sr. Member
  • ****
  • Offline
  • Posts: 1060
Re: Sony Kicks Butt
« Reply #64 on: June 16, 2015, 03:44:34 pm »

Two questions/thoughts about the dynamic range given the discussion:

1) This is probably me not understanding but: if the RAW file of most manufacturers is indeed 14 bits, then how are some sensors reported as having more than 14 EV of DR at DXOMark? I can't see how that much range can even be measured.

2)[...]

So this reads to me like there's a weird compression algo that limits precision/contrast between certain pairs of pixels but not others.  For example, if there's one 16 pixel block that's at 2^14 bright, and a neighboring one that's 2^0 dark, this would be accurately rendered.  If the DXOMark DR tests were done by taking a big sheet of black paper next to a big sheet of white paper, we'd see 14 bits of DR (or more?  I don't understand if the 11 bits of value + 4 bits of index end up giving 15 bits...).  However, *within* a 16 pixel block, it looks like certain pairs of pixels can only be 7 bits apart.  So the micro contrast/precision might be limited.  I'm having trouble understanding the particular encoding of the "7-bit deltas" as I can't tell if the most or least significant bit values are being recorded.  

On the other hand, maybe in most situations this might not matter: a 16 pixel block is presumably 4x4.  Each unit of 4 pixels is appx 20 microns (please check me) which comes out to about 50 line pairs/mm.  Within each grouping, it'd be 50+ lp/mm.  For most lenses, the MTF contrast ratio is below 50% here, so the lens might have eaten a couple of bits anyway?  Please do check my math!


1) In "print" mode DR, the full image is reduced to an 8x10 output size, which requires downsampling.  Downsampling consolidates information from many pixels into one pixel, increasing the effective bit precision of the resulting pixel.  In that way, one is able to exceed the DR of the native A-D converter.

2) I think that the 16-pixel blocks may be consecutive pixels along the X-axis in this case.  I suspect that this is a compression scheme intended to be done on the sensor, which would explain why this was not just a simple feature request for Sony engineers to do in firmware. 

Remember that the 14-bit samples coming out of the A-D are squished to 11 bits by a tone curve (not a simple linear division), and then re-inflated back to 14 bits at the raw-conversion end -- with some losses.  There are more losses in the highlights, because the curve squishes more in the highlight tier than in the shadow tier.  The 11-bit sample still represents 14 bits of DR; it's the values in between that have been squished out.
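A toy version of that round trip, with a square-root curve standing in for Sony's actual one (which is only known from reverse engineering and is piecewise rather than smooth):

Code:
# Toy round trip: 14-bit ADC value -> 11-bit code via a compressive curve,
# then back to 14 bits. The square-root curve is a stand-in for Sony's real
# (piecewise) curve, but the effect is the same: reconstruction steps get
# coarser toward the highlights.
def squish(v14):                   # 0..16383 -> 0..2047
    return round(2047 * (v14 / 16383) ** 0.5)

def inflate(v11):                  # 0..2047 -> 0..16383
    return round(16383 * (v11 / 2047) ** 2)

for v in (100, 1000, 16000):
    print(v, "->", inflate(squish(v)))
# Near black the round trip is essentially exact; near clipping adjacent
# 11-bit codes sit roughly 16 fourteen-bit counts apart -- those are the
# in-between values that get "squished out".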

Ray

  • Sr. Member
  • ****
  • Offline
  • Posts: 10365
Re: Sony Kicks Butt
« Reply #65 on: June 16, 2015, 11:15:26 pm »

The DXO numbers also show the NX1 outperforms all of the D7xxx series in low light performance (slightly in some cases). So, which is supposed to benefit most from BSI: DR or Noise?

That's not my interpretation of the graph. Below is the DXOMark graph comparing the DR of the NX1, the D7000 and the most recent Nikon upgrade, the D7200.

The NX1 appears to have a slight, but probably noticeable DR advantage at ISO 12,800, compared with the much older D7000. That advantage, according to the test results, is 0.45 EV, which would be worth having at base ISO, or even ISO 800. But ISO 12,800?? I would never bother with such degraded images, so for me that's no advantage.

Also, the D7200 whips the NX1, regarding DR, at all ISOs up to and including 12,800.  At ISO 6400, the highest ISO I would ever consider using, the D7200 has over a full stop better DR than the NX1. Now that's what I call significant.  ;)

By the way, the DR figure also relates to noise, but to noise in the shadows, expressed in terms of exposure value. The other DXO noise test, SNR 18%, refers to the noise levels in the mid-tones. The performance at 18% grey is approximately the same for all three cameras, with the D7200 having an insignificant edge at base ISO.

To get a worthwhile improvement in SNR at 18% it seems one has to increase sensor size to full-frame.



Logged

foxhole510

  • Newbie
  • *
  • Offline
  • Posts: 10
Re: Sony Kicks Butt
« Reply #66 on: June 17, 2015, 06:52:42 pm »

Well written and spot on. BTW, as of today (6/17) B&H is taking orders. Why do I get the feeling everyone knows already? Steve
Logged

bokehcambodia

  • Jr. Member
  • **
  • Offline
  • Posts: 61
    • bokehcambodia
Sony RAW a firmware fix
« Reply #67 on: June 18, 2015, 03:43:59 am »

RAW is coming...

KM: Sony RAW is compressed, not uncompressed. But if we're getting a lot of requests for it, we should make such a kind of no-compression raw. Of course we recognize that. But I cannot give you a guarantee when we're going to fix or not fix.

DE: Right. When you're going to address that, yeah.

KM: Sure, sure. And so we recognize the customer's requirement, and actually we are working on it.

DE: So it's something that you're aware of. I'm sure that the image processing pipeline is optimized for the way that it is now, but it seems to me that, while it might involve some trading off some performance, that it could just be a firmware change. Could it? Would you be able to provide uncompressed raw as a firmware update, or would it require new hardware?

KM: Right, yes. So... not hardware.

http://www.imaging-resource.com/news/2015/06/16/sony-qa-the-must-have-sensor-tech-of-the-future

Ethan

  • Newbie
  • *
  • Offline
  • Posts: 2
Re: Sony Kicks Butt
« Reply #68 on: June 23, 2015, 11:08:30 pm »

It might be worth noting that dynamic range comes down to two factors: the electron well capacity of the photosite, and the noise floor (set by heat and the cleanliness of the electronics) relative to that well capacity.

BSI doesn't make for a larger well, just more light gathering which expands the SNR.
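In engineering terms that is usually written as full-well capacity over read noise, in stops. With made-up but plausible numbers:

Code:
# Per-pixel engineering dynamic range = full-well capacity / read noise,
# expressed in stops. Electron counts below are illustrative only, not
# measured figures for any camera in this thread.
from math import log2
full_well_e = 50_000      # electrons a photosite can hold before clipping
read_noise_e = 3.0        # electrons of noise with no light at all
print(round(log2(full_well_e / read_noise_e), 1))   # ~14.0 stops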

I'd like to see a few things in the future, one of which is a lithography process that allows for larger well capacity. The primary issue then becomes slower heat dissipation and the accompanying noise on the sensor.

Anyway, I like the nerdum you guys are throwing into the discussion. I feel at home here.
Logged


BrianVS

  • Full Member
  • ***
  • Offline
  • Posts: 164
Re: Sony Kicks Butt
« Reply #70 on: June 26, 2015, 07:24:01 am »

http://www.rawdigger.com/howtouse/sony-craw-arw2-posterization-detection

The above link gives a good overview of the Sony compression scheme. Basically: convert 14 bits to 11 bits using a curve, probably via a look-up table, then compute row-wise differences and store them as 7-bit offsets. It compresses each pixel to 8 bits and behaves like an 11-bit curve most of the time, unless there is a lot of contrast within a 32-pixel strip. Then it can go very bad. The average noise will be lower, but the peak-to-peak noise will increase dramatically depending on image content. Pushing shadows in post-processing will make that peak-to-peak noise stand out.
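A crude way to see where the "very bad" cases land, assuming the 16-value, 7-bit-delta blocks described earlier in the thread (the 32-pixel strips interleave two colours, which this ignores):

Code:
# Flag blocks whose min-to-max range cannot be covered by 7-bit deltas at
# full precision; the wider the range, the coarser the step for everything
# inside the block. Assumes the 16-value / 7-bit-delta layout quoted earlier.
def block_step(values):            # values already on the 11-bit curve
    rng, step = max(values) - min(values), 1
    while rng // step > 127:
        step *= 2
    return step                    # 1 = harmless, 16 = visible posterisation risk

print(block_step([100] * 16))                 # flat sky: step 1
print(block_step([100] * 15 + [1900]))        # star on dark sky: step 16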

Too bad. Looks like a nice sensor. Wasted trying to get a 2:1 compression scheme. Idiots.
« Last Edit: June 26, 2015, 07:26:54 am by BrianVS »
Logged

stevesanacore

  • Sr. Member
  • ****
  • Offline
  • Posts: 267
Re: Sony Kicks Butt
« Reply #71 on: June 26, 2015, 07:39:53 pm »

Michael,
Everything but USB 3. USB 2???? So tethering with a 42 MP file really isn't an option unless one frame per 3 minutes is acceptable. Once again, so close yet so far.

Not sure if anyone else has responded to this, but I've been shooting with the A7R on jobs for months, and while tethered it downloads raw files quicker than my D800E does with USB 3. USB 2 is a non-issue. In fact the connection of the micro-USB connector is more secure than any USB 2 or USB 3 connector I've ever used. I'm always running C1 Pro on my MacBook Pro, using a Tether Tools USB extension with a standard USB cable into the camera.
Logged
We don't know what we don't know.

stevesanacore

  • Sr. Member
  • ****
  • Offline Offline
  • Posts: 267
Re: Sony Kicks Butt
« Reply #72 on: June 26, 2015, 07:47:49 pm »

http://www.rawdigger.com/howtouse/sony-craw-arw2-posterization-detection

The above link gives a good overview of the Sony compression scheme. Basically: convert 14 bits to 11 bits using a curve, probably via a look-up table, then compute row-wise differences and store them as 7-bit offsets. It compresses each pixel to 8 bits and behaves like an 11-bit curve most of the time, unless there is a lot of contrast within a 32-pixel strip. Then it can go very bad. The average noise will be lower, but the peak-to-peak noise will increase dramatically depending on image content. Pushing shadows in post-processing will make that peak-to-peak noise stand out.

Too bad. Looks like a nice sensor. Wasted trying to get a 2:1 compression scheme. Idiots.

What motive would Sony have to cripple the files in this manner? They are leaders in the video world, so they certainly know what they are doing. Wouldn't this be an easy fix with a firmware upgrade? They must realize at some point that it's an issue professionals would be concerned with.

Logged
We don't know what we don't know.

HansKoot

  • Newbie
  • *
  • Offline
  • Posts: 20
    • HKPHOTO
Re: Sony Kicks Butt
« Reply #73 on: June 27, 2015, 07:23:15 am »

As I read this thread, I see that some participants reject the camera because theoretically it's not possible to get the maximum out of the raw files. Issues like no GPS (a battery drainer in my opinion, and the battery already falls short) and USB 2.0 (a bit of a miss indeed, but how much of one?) also feel a bit blown up to me. On DPReview I only found one example of the limitation in the raw files... well... I was not seriously impressed. I did not find more, but maybe there are; I am open to seeing them.
Then again, Sony seems to be working seriously on it, and I believe they will indeed deliver a firmware update.

Personally I prefer to look at what this camera will bring me. And I feel that's a lot, in fact more than I have seen in recent years from the established brands (no need to write down the list again). I shot for six years with the 5D2 with great satisfaction, and last May I finally sold it. I bought a D800 that I knew I would only use temporarily. The Sony appeared at the right time for me; I intend to buy it and I think it will do the job for another 5-6 years, probably till the next 'game changer'. I will wait for the test results before ordering one. I am not in a rush.

 :) Having said this, I really enjoy the discussion here (and have learned from it), and I am convinced it will help push the boundaries for the manufacturers (here Sony) once more. We all profit from that.

But, to finish, please notify me when the perfect camera has arrived; until that moment I will probably just wander around shooting with this or another "Idiots" camera.  :D
Logged
"Its better to create something that others criticize than to create nothing and criticize others" (Ricky Gervais)

BrianVS

  • Full Member
  • ***
  • Offline
  • Posts: 164
Re: Sony Kicks Butt
« Reply #74 on: June 27, 2015, 10:43:00 am »

What motive would Sony have to cripple the files in this manner? They are leaders in the video world, so they certainly know what they are doing. Wouldn't this be an easy fix with a firmware upgrade? They must realize at some point that it's an issue professionals would be concerned with.



It may not be just firmware that is required to fix this. Looking at the 11-bit curve in the RawDigger link, the A/D converter might be just 11 bits with a non-linear scale. The A/D numbers might represent a coarser quantization as the values get higher. Several CMOS sensors use this type of scheme. The 7-bit delta function is most likely a firmware rev. I am not sure about the A/D, as technical specifications are not available for the sensor.

In any event, as it stands this should be considered an 8-bit compression scheme. Firmware might get you to 11 bits or 14 bits; hard to say.

Sony was good at making TV sets and analog video cameras. With this camera, they do not understand digital representation of imagery.

I'm mad at the idiot who decided to cripple a great best-in-class sensor with this really bad compression algorithm. All they needed to do was store full-value pixels followed by a delta for the next pixel, stored as a byte with a "reserved escape": use -127 to 127 for the delta, and 'FF'x to signal that a full raw value follows. You get almost 2:1 compression, with some rare cases of storing full values. You do not lose data. At least store what the A/D converter reads out, instead of playing musical chairs.
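A minimal sketch of that escape-coded delta scheme. The byte layout (deltas stored offset by 127 so that 0xFF stays free as the escape, full values in two big-endian bytes) is my own choice of details, not Brian's or Sony's:

Code:
# Lossless escape-coded delta compression as proposed above: each pixel is
# one byte (delta from the previous pixel, -127..127, stored offset by 127)
# or the escape byte 0xFF followed by the full 14-bit value in two bytes.
def compress(pixels):
    out, prev = bytearray(), None
    for p in pixels:
        d = None if prev is None else p - prev
        if d is not None and -127 <= d <= 127:
            out.append(d + 127)                    # 0..254; 255 is reserved
        else:
            out += bytes((0xFF, p >> 8, p & 0xFF)) # escape + full 14-bit value
        prev = p
    return bytes(out)

def decompress(data):
    out, i, prev = [], 0, None
    while i < len(data):
        if data[i] == 0xFF:                        # escaped full value
            prev = (data[i + 1] << 8) | data[i + 2]
            i += 3
        else:
            prev += data[i] - 127                  # undo the offset
            i += 1
        out.append(prev)
    return out

row = [512, 515, 510, 640, 5000, 5003, 4990]
assert decompress(compress(row)) == row            # nothing is lost
print(len(compress(row)), "bytes for", len(row), "pixels")   # 13 bytes for 7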

It could be that Nikon or Canon has a "mole" in Sony that is making these stupid decisions. So it might not be an idiot making these decisions, but someone getting paid by the competition.

This is the Ranting Forum, right? Are Conspiracy Theories allowed?
« Last Edit: June 27, 2015, 11:12:11 am by BrianVS »
Logged

Ray

  • Sr. Member
  • ****
  • Offline
  • Posts: 10365
Re: Sony Kicks Butt
« Reply #75 on: June 27, 2015, 02:36:05 pm »

It may not be just firmware required to fix this-


I've wondered about this. If the issue were just a matter of a firmware upgrade, surely Sony would have taken the opportunity, when announcing the marvellous new features of the A7RII, to boast about a new lossless 14-bit RAW mode which, they could have claimed, would take full advantage of the improved performance and light-gathering capability of its first full-frame BSI sensor.

Surely this could have been a big advertising opportunity for them, even if in reality the improvement in image quality would not be noticeable to most users of the camera, who probably shoot JPEG anyway.
Logged

Telecaster

  • Sr. Member
  • ****
  • Offline
  • Posts: 3686
Re: Sony Kicks Butt
« Reply #76 on: June 27, 2015, 03:17:17 pm »

Let's face it: the vast majority of A7r2 buyers won't give a frak about any of this stuff. I'm a geek and I get it from a technical standpoint…and I still don't give a frak. If I don't see issues with any of my photos—and with the A7r I don't—there's nothing of real-world significance to get worked up about. In fact there's plenty of drama in the outdoors, courtesy of Michigan's ultra-crap "summer" weather, that I'd rather focus on. So off I go!

-Dave-
Logged

Iluvmycam

  • Sr. Member
  • ****
  • Offline
  • Posts: 533
Re: Sony Kicks Butt
« Reply #77 on: June 27, 2015, 03:30:30 pm »

Yup. Sony missed a few boats...

– no uncompressed raw

– no GPS

– no USB3 tethering

But, my point is not that they made the perfect camera, but that they hit all the major bases this time round.

Michael

None of these are of use to me. If it had a shutter speed dial I might have bought one.
Logged

Manoli

  • Sr. Member
  • ****
  • Offline
  • Posts: 2296
Re: Sony Kicks Butt
« Reply #78 on: June 27, 2015, 06:09:37 pm »

It may not be ... might be just ... might represent ... is most likely ... I am not sure about the  ... should be considered as .... Firmware might get you ... hard to say.

Sounds like a great basis on which to make an informed decision.

It could be that Nikon or Canon has a "mole" in Sony that is making these stupid decisions. So it might not be an idiot making these decisions, but someone getting paid by the competition.

Classic.

Logged

BrianVS

  • Full Member
  • ***
  • Offline
  • Posts: 164
Re: Sony Kicks Butt
« Reply #79 on: June 27, 2015, 07:02:35 pm »

Since Sony does not publish details of their cameras, what is strange about the 11-bit compression function shown on RawDigger's website is that it is not a smooth curve but line segments with a few steps. If this were all firmware, the smoothest curve would be a 22-bit to 11-bit square-root function: take the original 14-bit raw value, bit-shift it left 8 places, then take the square root. Use this as a look-up table in the firmware. Fast, easy, and a nice smooth transfer function. This is very different from what RawDigger claims Sony is using. The Sony curve looks like set points in a non-linear A/D converter, which usually means the hardware is limited.
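That suggested curve takes only a couple of lines to build; whether Sony's hardware could actually apply it is the open question:

Code:
# The smooth 14->11-bit curve suggested above: shift the 14-bit value left
# 8 places (giving a 22-bit number) and take the integer square root, which
# lands exactly in 0..2047. Built once as a look-up table.
from math import isqrt
lut = [isqrt(v << 8) for v in range(16384)]
print(lut[0], lut[1], lut[8192], lut[16383])   # 0 16 1448 2047
assert max(lut) == 2047                        # tops out at exactly 11 bits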

What I liked about Leica when Kodak made their sensors: the technical data sheets for the sensors were published. There was no need to guess at what the camera was doing.

http://www.onsemi.com/pub_link/Collateral/KAF-18500-D.PDF

These days Leica will not even say which manufacturer makes their sensors.

Technical data sheets are required for real answers; most manufacturers don't want to tell you how it works and hope customers do not notice the shortcomings.

But this is the Rant section of the forum. Sony kicks butt? Mostly the butts of customers who believe the camera produces a 14-bit image. Most users would be better off sticking with JPEG; the camera probably uses the uncompressed 11-bit curve to produce the JPEG before degrading the data with the 7-bit delta function. So Sony is using an 8-bit scheme that might get a little better if they dump the delta function, perhaps as good as 11 bits and the Nikon lossy-compressed NEF algorithm.

Sony could easily tell us how they do compression in their cameras, and then we could all make an informed decision. Until then, only the idiot/mole at Sony knows why they cripple the output of the first BSI full-frame sensor.
« Last Edit: June 27, 2015, 07:20:54 pm by BrianVS »
Logged