Great, we're making progress: from "the whole scientific world is wrong" to only two people.
Look, your understanding of luminance and Bayer sensors is all wrong and is in direct contradiction to the rest of the world, scientific or otherwise. You can choose to live in this ignorance, or you can choose to expand your knowledge of this area.
So you are claiming that increasing the sampling interval for each color to every other sensel, instead of every sensel position, has no effect on resolution? So we're back to "the whole industry is wrong" now? It seems some evidence is finally due.
By all means, enlighten us.
I never said any such thing. I am claiming nothing more than that sampling interval affects resolution, and that the luminance sampling resolution of a Bayer sensor is much lower than that of a monochrome sensor with the same pixel size. You are the one claiming that a Bayer sensor has "almost the same Luminance resolution (only a few percent loss) compared to a monochrome capture". That is a direct quote from your previous post here.
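For concreteness, here is the Nyquist arithmetic behind that claim as a toy computation in plain Python (the 4 µm sensel pitch is just an assumed example, not any particular camera):

```python
def nyquist_cycles_per_mm(pitch_um):
    """Nyquist limit (cycles/mm) for a given sampling pitch in micrometres."""
    return 1000.0 / (2.0 * pitch_um)

pitch = 4.0  # assumed 4 um sensel pitch, for illustration only
full_res = nyquist_cycles_per_mm(pitch)           # every sensel sampled
per_color = nyquist_cycles_per_mm(2 * pitch)      # one color every other sensel
print(full_res, per_color)  # 125.0 62.5 cycles/mm
```

Doubling the sampling interval along a row or column halves the Nyquist frequency for that color channel; whether the *perceived luminance* resolution drops by that much after demosaicing is exactly what is in dispute here.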
And, if you want to be enlightened, try reading the original patent from Eastman Kodak (U.S. Patent 3,971,065), in the words of Bayer himself. Eastman Kodak has represented a respectable portion of the photographic industry. Don't you agree that maybe the U.S. Patent Office, Eastman Kodak, and the rest of the photographic industry got it right, and that perhaps you, Bart van der Wolf, got it wrong?
Luminance is captured at 100% of the sensel positions; bands of color are captured in line with the CFA arrangement. Of course, only the part of the luminance that penetrates the CFA is captured and contributes to image forming, which is why more exposure is needed than for a monochrome capture without filters.
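To pin down the geometry being argued about, here is a minimal sketch in plain Python (assuming the common RGGB tiling) that counts which fraction of sensel positions each filter covers:

```python
def bayer_fractions(n=8):
    """Tile an n x n mosaic with the standard 2x2 RGGB pattern and
    return the fraction of sensel positions under each filter color."""
    tile = [["R", "G"],
            ["G", "B"]]  # assumed RGGB arrangement
    mosaic = [[tile[y % 2][x % 2] for x in range(n)] for y in range(n)]
    flat = [c for row in mosaic for c in row]
    return {c: flat.count(c) / len(flat) for c in "RGB"}

print(bayer_fractions())  # G covers 50% of positions; R and B 25% each
```

Every sensel records some filtered light, but each individual color band is sampled only on its own subgrid, with green (the luminance-dominant channel) at every other position.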
All bogus claims by you. Luminance is NOT captured at 100% of the sensel positions, as you say. To prove you wrong, I cite the words of Bayer as found in the U.S. patent that I referenced above:
under SUMMARY OF INVENTION, 2nd paragraph, lines 28-34,
"By arranging the luminance elements of the color image sensing array to occur at every other array position, a dominance of luminance elements is achieved in a pattern which has …"
And, Bayer goes on to make explicit his claims about luminance and its relation to the green region in column 6, beginning with line 21,
"What is claimed is:
1. A color imaging device comprising an array of light-sensitive elements, which array includes at least (1) a first type of element sensitive to a spectral region corresponding to luminance, (2) a second type of element sensitive to one spectral region corresponding to chrominance, and (3) a third type of element sensitive to a different spectral region corresponding to chrominance, the three types of elements occurring in repeating patterns which are such that over at least a major portion of said array luminance-type elements occur at every other element position along both of two orthogonal directions of said array.
2. A device in accordance with claim 1 wherein said luminance-type elements are sensitive in the green region of the spectrum, and the two types of chrominance elements are sensitive in the red and blue regions of the spectrum, respectively ... "
The earlier remarks/claims have to do with your claims about demosaicing, as in the beginning of your reaction: "the interpolation process used to estimate the missing image pixels". I'll throw in some empirical evidence about that, namely that luminance resolution is reduced by only a few percent by the demosaicing process:
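To make "estimating the missing image pixels" concrete, here is a toy bilinear interpolation of the missing green samples on an RGGB mosaic, in plain Python. This is my own simplification for illustration only; real demosaicing algorithms are far more sophisticated:

```python
def interpolate_green(mosaic):
    """Fill in green at the R/B positions of an RGGB mosaic by averaging
    the available green neighbours; green sensel values pass through.
    `mosaic` is a 2D list of raw sensel values."""
    h, w = len(mosaic), len(mosaic[0])
    is_green = lambda y, x: (y + x) % 2 == 1  # green sites in an RGGB tile
    out = [row[:] for row in mosaic]
    for y in range(h):
        for x in range(w):
            if not is_green(y, x):
                nbrs = [mosaic[y + dy][x + dx]
                        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                        if 0 <= y + dy < h and 0 <= x + dx < w]
                out[y][x] = sum(nbrs) / len(nbrs)
    return out
```

On smooth image content such averaging recovers the green (luminance-dominant) channel almost perfectly; the losses show up mainly at fine detail near the Nyquist limit, which is where the "few percent" debate lives.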
Nothing fancy; it's just a simple page I threw together almost 7 years ago to prove some nonsense statements wrong. Who would have thought it would still be needed what seems like eons later? Oh well.
Again, nothing but bogus claims by you, dressed up as something scientific; they are not even close to being such. The scientific literature on demosaicing alone, since the Bayer sensor was introduced in 1976, amounts to hundreds of papers in many different languages. Just for concreteness, I will cite one here for you:
Xin Li and Michael T. Orchard, "New Edge-Directed Interpolation," IEEE Transactions on Image Processing, Vol. 10, No. 10, October 2001.
In this paper (and many, many others), you will find explicit details about the variation in image acuity resulting from different methods of estimating the missing image elements of a Bayer sensor. In the Concluding Remarks of the above paper, the authors support exactly what I have been saying here and state the following:
"We have studied two important applications of our new interpolation algorithm: resolution enhancement of grayscale images and demosaicking of color CCD samples. In both applications, new edge-directed interpolation demonstrates significant improvements over linear interpolation on visual quality of the interpolated images."