Interesting! Since I made that comment, in which I was prepared to trade off the lower noise capability of the panchromatic pixels for a higher pixel count (on the assumption that smaller, more densely packed pixels would generate both more read noise and more shot noise per unit area of sensor), John Sheehy has weighed in with the observation that in general this is not true, and that the reverse applies: smaller, more densely packed pixels, as in P&S cameras such as the FZ50, tend to generate less noise per unit area of sensor.
On this basis, we could have a 40mp B&W APS-C DSLR, which would not only produce much higher resolution than any current Bayer-type DSLR, and even better resolution than the latest MFDBs, but would also have the advantage of significantly less noise on same-size prints.
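The arithmetic behind that claim can be sketched quickly. This is my own illustrative example, not anything from John's posts, and the numbers (photon count, read noise figures) are made up for the sake of the comparison: over a fixed patch of sensor, shot noise depends only on the total photons collected, so it is the same whether the patch is one big pixel or many small ones; the read noise, however, is paid once per pixel and adds in quadrature. So the densely packed sensor only wins if per-pixel read noise falls fast enough as pixels shrink, which is exactly the empirical point being made.

```python
import math

photons = 100  # photons falling on the patch during the exposure (hypothetical)

def total_noise(n_pixels, read_noise_per_pixel):
    """Total noise (in electrons) over the patch when its pixels are summed."""
    shot = math.sqrt(photons)                          # shot noise for the whole patch
    read = read_noise_per_pixel * math.sqrt(n_pixels)  # read noise adds in quadrature
    return math.sqrt(shot**2 + read**2)

big   = total_noise(1, 8.0)    # one large pixel, 8 e- read noise (hypothetical)
small = total_noise(16, 1.5)   # 16 small pixels, 1.5 e- read noise each (hypothetical)

print(f"one large pixel:   {big:.1f} e-")    # -> 12.8
print(f"16 small pixels:   {small:.1f} e-")  # -> 11.7
```

With these made-up figures the finely pitched patch comes out slightly ahead; with equal read noise per unit area (e.g. 2 e- per small pixel) the two would tie, and with worse per-pixel scaling the big pixel wins, which is why the question is an empirical one.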
Perhaps you could show us small 100% (or even 200%) crops, at maximum JPEG quality, that are easily downloadable.
I've just moved house so I'm not in a position to be able to do that immediately (everything is still in boxes), but hopefully I should be able to put something online within a week or so.
As best I can tell, the Megavision's chip is basically the same as the one in the 16 megapixel Hasselblad back (the one on the anniversary edition 500 series), just without the Bayer matrix and the AA filter. Megavision do a colour version of the E4 too, which uses exactly that chip. It's exactly the same 9 micron pitch, 37mm square 4k x 4k format in both cases. I suspect the improved noise performance is for exactly the obvious reasons -- if I put a deep red filter in front of the lens, I get a loss in sensitivity, and a Bayer matrix is basically doing exactly that (for R, G and B) on chip. I quite like the Kodak idea, but I suspect that it will probably only work well for colour images or for panchromatic B&W conversions -- doing a 'red filter' conversion after-the-fact, for example, would probably look a bit weird, kind of like the usual Bayer 'not quite right' look at 1:1 but worse.
Actually, at 1:1, it's pretty tricky to get a capture that looks really sharp at the pixel level -- it can be done, but if you do *anything* wrong you can really see it. Forget hand-holding, or even using anything other than a really solid tripod. Not *at all* forgiving, more like using large format really, but worth it when it's right. One interesting difference is that the Megavision doesn't do any sharpening (obviously there's no interpolation either), so it's not really a like-for-like comparison with a typical colour RAW conversion. The images do sharpen very nicely, though, mostly I suspect because of the complete absence of interpolation artifacts.
Wearing a different hat (my PhD is in extreme environment electronics, so I've had to study the way things like CCDs behave), you get some interesting effects when you scale a semiconductor process. Generally, bigger transistors mean less noise, for the same reason that bigger transistors are more radiation-hard (hence the use of lots of 386 processors on the space station). Nevertheless, as processes have scaled down, other optimisations have been made that have improved both power consumption and noise characteristics. It's not true to say that shrinking a transistor improves noise (it does the opposite, without question); rather, improving a semiconductor process in various ways can let you scale the features without hurting performance, and if you scale them a bit less than the process would allow, you get an improvement in performance -- though not actually because of the scaling per se.

Photoreceptors on both CCD and CMOS sensors are honking great huge things in comparison with contemporary digital electronics, so making finer-pitch sensors isn't really a problem at all. But the smaller the photosite, the fewer photons (and consequently the fewer electrons), so the job of managing noise becomes a lot harder. I suspect that we're seeing some kind of Moore's law working on sensors, but with a slower growth curve due to the difficulty of managing the noise problem and also the much slower rate of advancement in lens sharpness. Actually, my gut feeling is that sensors will hit a wall based on lens sharpness rather than anything else -- arguably this has happened already, but I think there is still some hope for further improvement.
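To put a rough number on the "fewer photons" point: in the shot-noise limit the signal a pixel collects scales with its area, so per-pixel SNR scales roughly with the pitch -- halve the pitch and you quarter the photons and halve the SNR. A minimal sketch, with an entirely made-up illuminance figure and read noise (only the 9 micron pitch comes from the discussion above):

```python
import math

def pixel_snr(pitch_um, photons_per_um2=100.0, read_noise=3.0):
    """Per-pixel SNR for a square pixel, shot noise plus read noise."""
    signal = photons_per_um2 * pitch_um**2     # photons scale with pixel area
    noise = math.sqrt(signal + read_noise**2)  # shot + read, in quadrature
    return signal / noise

for pitch in (9.0, 4.5, 2.25):  # starting from the Megavision's 9 micron pitch
    print(f"{pitch:5.2f} um pitch -> per-pixel SNR {pixel_snr(pitch):6.1f}")
```

Each halving of the pitch roughly halves the per-pixel SNR, which is why noise management, not lithography, is the hard part of finer pitches.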