Actually, the diagonals are oversampled compared to the horizontal and vertical directions. If you look at the FFT, you will see that there is more room in the corners for higher frequencies.
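To make that concrete, here is a minimal numpy sketch (my own illustration, not from the thread) showing that the corner bins of an N x N FFT sit at Sqrt(2) times the horizontal/vertical Nyquist distance from DC:

```python
import numpy as np

N = 128
fy = np.fft.fftfreq(N)                     # cycles/pixel along rows
fx = np.fft.fftfreq(N)                     # cycles/pixel along columns
FY, FX = np.meshgrid(fy, fx, indexing="ij")
radius = np.hypot(FX, FY)                  # radial distance of each FFT bin from DC

print(np.abs(fx).max())   # 0.5     -> horizontal/vertical Nyquist
print(radius.max())       # ~0.7071 -> corner bins, 0.5 * Sqrt(2)
```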
In the sample I provided earlier (post 5), there is no room in the FT beyond a distance of 64 x Sqrt(2) from DC. Here are the image, its top-right corner zoomed in, and the FFT (the yellow rectangle in the next image is 64x64 pixels):
Of course there is more room in the FT, but we also know what happens to the image when we push detail to that extreme (the image turns gray at those frequencies).
The largest image I tried was your rings1 image at 1000x1000 pixels, which took about 6.5 seconds for a 2x interpolation (on my not-state-of-the-art PC). A limitation of the interp2d code is that it only accepts integer ratios. So to do the above series at 0.1 intervals, I had to over-interpolate by a factor of 10 (for example, interpolating by 14x for the 1.4x case) and then decimate by a factor of 10 at the end using nearest neighbor. This complicates things and makes the memory requirements impractical for this kind of processing, unless you break it down and process it in blocks.
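For anyone wanting to reproduce that workaround, here is a rough numpy sketch of the integer-upsample-then-decimate scheme. It is my own approximation: I substitute a simple FFT zero-padding upsampler for the thread's interp2d code, and the input array is a placeholder:

```python
import numpy as np

def fft_upsample(img, p):
    """Sinc-interpolate a 2-D array by integer factor p via FFT zero-padding.
    (Ignores the even-length Nyquist-bin split; np.real drops the tiny
    imaginary residue that causes.)"""
    h, w = img.shape
    F = np.fft.fftshift(np.fft.fft2(img))
    P = np.zeros((h * p, w * p), dtype=complex)
    y0, x0 = (h * p - h) // 2, (w * p - w) // 2
    P[y0:y0 + h, x0:x0 + w] = F            # embed spectrum in a larger zero canvas
    # p*p compensates for numpy's 1/N normalization at the new, larger size
    return np.real(np.fft.ifft2(np.fft.ifftshift(P))) * p * p

def resample(img, p, q):
    """Approximate a p/q ratio: over-interpolate by p, then take every q-th sample."""
    return fft_upsample(img, p)[::q, ::q]

img = np.random.rand(100, 100)             # stand-in for the real image
out = resample(img, 14, 10)                # the 1.4x case: 140x140 result
```

The 14x intermediate array inside `fft_upsample` is exactly where the memory problem above comes from; processing in overlapping blocks trades that for some bookkeeping.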
Yes, or just stick to 2x. Hmm, 6.5 seconds for a single 1-Mpixel image is not too bad, but it will become much slower at larger sizes and with 3 channels.
As I mentioned once before, for practical purposes a good bicubic interpolator might be good enough. In the maxmax example, PS bicubic does nearly as well as the sinc filter. But then again, it depends on what other processing you will be doing later in the workflow. Reconstruction errors look really bad when you sharpen them.
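As a quick way to try that comparison outside PS, here is a hedged Pillow sketch (the file name is hypothetical, and Pillow's LANCZOS is a windowed sinc, not the full sinc filter discussed above):

```python
from PIL import Image  # Pillow >= 9.1 for Image.Resampling

img = Image.open("rings1.png")             # hypothetical file name
w, h = img.size

up_bicubic = img.resize((2 * w, 2 * h), Image.Resampling.BICUBIC)
up_lanczos = img.resize((2 * w, 2 * h), Image.Resampling.LANCZOS)  # windowed sinc

up_bicubic.save("rings1_bicubic_2x.png")
up_lanczos.save("rings1_lanczos_2x.png")
```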
Yes, but it wouldn't do as well with your original image (as you showed in post no. 7), which kicked off this thread and the interest in good reconstruction filters.
Here is a "classic" paper by Mitchell on reconstruction: Reconstruction Filters in Computer Graphics
A classic indeed, and the Mitchell/Netravali filter is the default for upsampling of regular images in ImageMagick.
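For reference, the kernel from that paper is only a few lines. A sketch (mine, in Python) of the Mitchell-Netravali cubic, with the B = C = 1/3 setting the paper recommends and ImageMagick calls "Mitchell":

```python
import numpy as np

def mitchell_netravali(x, B=1/3, C=1/3):
    """Mitchell-Netravali cubic reconstruction kernel, support [-2, 2]."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    near = x < 1
    far = (x >= 1) & (x < 2)
    out[near] = ((12 - 9*B - 6*C) * x[near]**3
                 + (-18 + 12*B + 6*C) * x[near]**2
                 + (6 - 2*B)) / 6
    out[far] = ((-B - 6*C) * x[far]**3
                + (6*B + 30*C) * x[far]**2
                + (-12*B - 48*C) * x[far]
                + (8*B + 24*C)) / 6
    return out

print(mitchell_netravali([0, 1, 2]))       # [0.8889, 0.0556, 0.0]
```

B and C trade blur against ringing; the paper arrived at B = C = 1/3 as the best compromise from a subjective study.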