Hey Ray!
How does your discussion change if you take the anti-aliasing filter into account? Rhetorical, I know, because there's no information on it. So here goes a bit of speculation...
I'd assume it performs an (analog) function similar to a Gaussian blur, so the photons from each of your grains are spread across several sensor pixels. Among other things, that spreading yields color information which would not be captured if only one sensor element had been in the path of the light ray (no relation). The AA filter also does what its name suggests via this same "spreading" function.

Therefore, I'd go as far as suggesting that actual pixel size per se is not a very important factor, although we have shown that the spacing (pitch) does indeed affect captured detail, with the 10D getting a bit more per sq mm of sensor. Pixel size, sensor design, and fabrication clearly contribute to the resulting sensitivity, signal-to-noise ratio, etc., so in that sense pixel size is important, for different reasons.

If all the photons sensed at a pixel have arrived after being blurred by at least a diameter or two of the pixel pitch (my guess at what's needed for the AA function to be worthwhile), it likely won't matter much whether the pixels are full-sized or half-sized, as long as they sample the photons well and have good S/N.
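To make that last point concrete, here's a minimal one-dimensional sketch. It's entirely my own toy model, not anything from a real sensor's specs: a point source blurred by a Gaussian of roughly a pitch and a half, then integrated over each pixel's light-sensitive aperture, once at full-pitch width and once at half-pitch width. The function name `sampled_profile` and all the numbers are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1-D toy model: a point of light blurred by an AA filter
# approximated as a Gaussian, then integrated over each pixel's
# light-sensitive aperture. Units are arbitrary: "pitch" is the pixel
# spacing, "aperture" the sensitive width (full-size vs. half-size pixel).

def sampled_profile(sigma, pitch, aperture, n_pixels=9, subsamples=200):
    """Integrate the Gaussian blur spot over each pixel's aperture."""
    centers = (np.arange(n_pixels) - n_pixels // 2) * pitch
    signal = []
    for c in centers:
        # sub-sample the aperture and sum the Gaussian under it
        xs = np.linspace(c - aperture / 2, c + aperture / 2, subsamples)
        dx = xs[1] - xs[0]
        signal.append(np.exp(-xs**2 / (2 * sigma**2)).sum() * dx)
    return np.array(signal)

pitch = 1.0
sigma = 1.5 * pitch  # blur of roughly a pitch or two, as guessed above

full = sampled_profile(sigma, pitch, aperture=1.0 * pitch)
half = sampled_profile(sigma, pitch, aperture=0.5 * pitch)

# Normalise so we compare the *shape* of the sampled spot, not total signal
full /= full.sum()
half /= half.sum()
print("max difference in normalised profiles:", np.abs(full - half).max())
```

If you run it, the normalised profiles come out nearly identical: the half-size pixels collect less total light (which is where sensitivity and noise come back in), but they record essentially the same spatial detail, because the AA blur, not the aperture, sets the spot shape.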
What do you think?
Andy