I think the term "resampled" is wrong... my understanding of the Epson black box is that the printer's error-diffusion dithering algorithm is more like a fine sieve: it takes whatever image data it is given and, based on the resolution settings of the driver, drops that data against the dither to determine when and where to squirt a droplet of ink.
I've never heard an assumption like that before, but there's a first time for everything. Neither of us has solid proof (such as an official statement from one of the major printer manufacturers) to support either position, but I think the circumstantial evidence favors simple interpolation. Not only is interpolation a logical simplification to speed up the dithering process, there are also moiré effects that are completely consistent with (poor-quality) print driver interpolation/resampling.
The resolution the printer driver itself reports when interrogated is, IMHO, also a clear indicator that the fixed PPI settings are the basis for further dithering, and that interpolation is happening.
As a historical tidbit, I found an old contribution by Mike Chaney (the developer of Qimage) on another website, where he explains in simple terms what happens and why it can help to interpolate to 720/600 PPI before sending the data to the printer driver.
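To make concrete what "interpolate to 720/600 PPI prior to sending" amounts to, here is a minimal sketch. The function name and structure are my own illustration, not from Qimage or any driver; the 720 PPI (Epson) and 600 PPI (Canon/HP) figures are the commonly cited fixed input resolutions discussed in this thread.

```python
# Hypothetical helper: compute the pixel dimensions to resample to
# *before* handing the file to the printer driver, so the driver's own
# (assumed lower-quality) interpolation has nothing left to do.

def target_size(print_w_in, print_h_in, driver_ppi=720):
    """Pixel dimensions matching the driver's fixed input PPI.

    driver_ppi: 720 is the commonly cited Epson input resolution;
    600 is typical for Canon/HP drivers.
    """
    return (round(print_w_in * driver_ppi), round(print_h_in * driver_ppi))

print(target_size(6, 4))        # 6x4" print at 720 PPI -> (4320, 2880)
print(target_size(6, 4, 600))   # same print at 600 PPI -> (3600, 2400)
```

One would then resample the image to exactly these dimensions with a high-quality filter (Lanczos, bicubic) in the editing application before printing.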
I also seem to remember a post somewhere from a relatively credible source that hinted at improvements in some printer drivers using bicubic interpolation instead of bilinear, but I'd have to search and see if I can still find that post.
There is also a FAQ entry on Eric Chan's website that specifically mentions the resampling issue in connection with the Epson driver settings.
Finally, it's important to remember that to avoid accidental downsampling artifacts from lower-quality printer driver resampling, setting the printer driver to accept the highest possible input resolution (720/360 or 600/300, depending on brand and paper choice) is the safe route. Substandard downsampling is a major source of artifacts; that's why a program like Qimage adopted anti-aliasing prefiltering for its downsampling. The current Lightroom version 3 also seems to have improved in that department. It's better to control the quality of what goes into the driver than to leave it up to that black box to find a quick-and-dirty solution.
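A toy 1-D example of why an anti-aliasing prefilter matters when downsampling (this is an illustration of the general principle, not the actual algorithm any driver or Qimage uses): point-sampling a pattern finer than the output grid collapses it into a false tone or pattern (the moiré-type artifacts mentioned above), while averaging first preserves the correct average tone.

```python
# Downsample a 1-D "stripe" signal 3x, two ways.

def point_sample(signal, factor):
    # Naive decimation: keep every factor-th sample, no prefilter.
    return signal[::factor]

def box_filter_sample(signal, factor):
    # Crude anti-alias prefilter: average each block of `factor`
    # samples before decimating.
    return [sum(signal[i:i + factor]) / factor
            for i in range(0, len(signal) - factor + 1, factor)]

stripes = [1, 0, 0] * 8          # fine stripe pattern, period 3
print(point_sample(stripes, 3))      # [1, 1, 1, ...] -- solid tone, wrong
print(box_filter_sample(stripes, 3)) # [0.333..., ...] -- correct average
```

The point-sampled result renders the striped area as solid ink coverage (and with a slightly different stripe period it would produce false low-frequency banding instead), which is exactly the kind of artifact a prefilter avoids.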
Of course, not every image is so critical that one will always run into resampling trouble, and not all images have enough detail to absolutely need the highest possible PPI setting. Nevertheless, I've seen no clear evidence that using 720/600 PPI input files has a detrimental effect on image quality, while 360/300 can have that effect.
But in any event, I'm pretty sure the Epson driver does not resample up or down prior to generating the dither.
May I suggest you check with Eric Chan about the evidence he found that resampling is what is happening? I know he has exchanged views with Mike Chaney, but presumably also with others (not-to-be-named individuals at Epson?). And perhaps (if he happens to read this) Eric can react here if he feels he can add something to clarify the situation.