Well, I'm not nearly as technically oriented as some of you guys, and I don't pretend to understand precisely how these compression processes work, or all the circumstances in which image flaws might become apparent during post-processing as a result of the lossy compression used.
However, what seems odd to me is that Sony should market an upgrade to the A7R that boasts the first BSI full-frame sensor, yet still use the same old lossy compression scheme that the A7R uses.
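For anyone who, like me, wants the gist without the full detail: from the reverse-engineering write-ups I've seen (RawDigger's analysis of the ARW format, for example), the scheme reportedly squeezes the 14-bit data through a tone curve to roughly 11 bits, then packs each run of 16 same-colour pixels into a fixed budget by storing the block's minimum and maximum and only 7-bit deltas for the rest. Here is a much-simplified Python sketch of that delta-packing step; the exact block layout and the tone curve are omitted, so treat it as illustrative only, not a description of Sony's actual code:

```python
# Simplified sketch of a Sony-style lossy block compression
# (the "11+7 bit" scheme described in published reverse-engineering).
# Real ARW files also apply a tone curve and store the min/max
# positions; this only illustrates where the precision goes.

def compress_block(pixels):
    """Pack 16 same-colour, tone-curved 11-bit pixel values.

    Only 7-bit deltas above the block minimum are kept, so the
    quantisation step is 2**shift, and shift grows with the
    block's contrast range.
    """
    lo, hi = min(pixels), max(pixels)
    shift = 0
    while (hi - lo) >> shift > 127:   # deltas must fit in 7 bits
        shift += 1
    deltas = [(p - lo) >> shift for p in pixels]
    return lo, shift, deltas

def decompress_block(lo, shift, deltas):
    return [lo + (d << shift) for d in deltas]

# A flat block round-trips exactly (shift stays 0)...
flat = [1000 + i for i in range(16)]
print(decompress_block(*compress_block(flat)))

# ...but a high-contrast block (a star against dark sky, say)
# is quantised in steps of 2**shift = 16 levels:
edge = [10] * 15 + [2040]
print(decompress_block(*compress_block(edge)))
```

The last example is the point: within a flat block nothing is lost, but wherever a single block straddles a big brightness jump the quantisation step balloons, and that, as I understand it, is where the reported artifacts come from.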
Isn't the entire purpose of the BSI design to improve image quality, particularly dynamic range? Since the DR of 36MP full-frame sensors such as the A7R, D800 and D810 is already close to 14 stops at the pixel level, wouldn't a BSI sensor push that limit even closer to a full 14 stops?
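(For concreteness, by "DR at the pixel level" I mean engineering dynamic range: full-well capacity over read noise, expressed in stops. A quick sanity check in Python, with electron counts that are purely illustrative guesses rather than measured figures for any of these sensors:)

```python
# Engineering DR at the pixel level, in stops:
#   DR = log2(full-well capacity / read noise)
from math import log2

full_well = 50_000   # electrons at base ISO (assumed, illustrative)
read_noise = 3.0     # electrons RMS (assumed, illustrative)

print(f"{log2(full_well / read_noise):.1f} stops")  # ~14.0
```

A 14-bit linear ADC can represent at most 14 stops between one count and full scale, which is exactly why a sensor genuinely approaching 14 stops makes the case for a full 14-bit uncompressed pipeline.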
Out of curiosity, I just checked the DxOMark DR results for the A7R, D800E and so on. Whilst the D810 has a worthwhile advantage of around two-thirds of a stop at its base ISO of 64 (a lower ISO than the A7R's base), the D800 and D800E have a fairly similar DR to the A7R at their base ISO of 100; the A7R is only an insignificant quarter of a stop worse.
I wonder if Sony has missed a marketing opportunity here. Having announced so many innovative features in the A7R II, the BSI design being one of the major ones, surely Sony could have added to their marketing spiel something along the following lines:
"In view of our breakthrough technology in being the first company to introduce a full-frame 35mm BSI sensor, we have considered it advantageous to our precious customers who have the ultimate, image-quality discernment, to also introduce the option for a full 14 bit uncompressed, RAW image pipeline, so that the full potential of our BSI design can be appreciated." (Or should I have written B ull S h I t design.)