I think some MFDB users in this thread still haven't grasped the relationship between sensor size and resolution.
If you are stitching together two or three 1Ds3 images to get an image as detailed as a single shot from a P45 (for example), then you should use the same focal length of lens at the same f-stop for both shots.
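To see why the same focal length is the right comparison, here's a minimal sketch using the commonly published sensor specs for the two cameras (the figures are my assumption, not something stated in this thread): two portrait-orientation 1Ds3 frames side by side cover almost exactly the P45's sensor area, with a comparable pixel budget.

```python
# Sketch: why stitching two 1Ds3 frames at the SAME focal length roughly
# reproduces a single P45 frame. Specs below are the commonly published
# figures (assumed, not taken from this thread).

# (width_mm, height_mm, horizontal_px, vertical_px)
EOS_1DS3 = (36.0, 24.0, 5616, 3744)
P45      = (49.1, 36.8, 7216, 5412)

def megapixels(cam):
    w_mm, h_mm, w_px, h_px = cam
    return w_px * h_px / 1e6

# Two portrait-orientation 1Ds3 frames side by side cover
# (2 x 24) x 36 = 48 x 36 mm -- almost exactly the P45's 49.1 x 36.8 mm,
# so the same focal length records essentially the same scene.
stitched_w_mm = 2 * EOS_1DS3[1]        # 48.0 mm
stitched_h_mm = EOS_1DS3[0]            # 36.0 mm

# Pixel budget: 2 x 21.0 MP = 42.1 MP before overlap, which after a
# typical stitching overlap lands near the P45's 39.1 MP.
print(f"stitched coverage: {stitched_w_mm} x {stitched_h_mm} mm "
      f"vs P45 {P45[0]} x {P45[1]} mm")
print(f"pixels: 2 x {megapixels(EOS_1DS3):.1f} MP vs {megapixels(P45):.1f} MP")
```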
If you were to use the very same lens on both cameras (with an adapter on the 35mm body), there's no reason the resulting images would differ in any respect, apart from differences in RAW converters and design peculiarities such as CCD versus CMOS sensors and the presence of an AA filter.
The fundamentals with regard to resolution, DoF and even dynamic range are the same in the sense that a P45 is basically two 1Ds3 sensors joined together.
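The "two sensors joined together" claim can be checked with a quick calculation from the commonly published specs (again, the figures are my assumption): the two cameras have near-identical pixel pitch, and the P45's sensor area is roughly twice the 1Ds3's.

```python
# Sketch of the "two sensors joined together" claim, using commonly
# published sensor specs (assumed, not taken from this thread).
specs = {
    "1Ds3": (36.0, 24.0, 5616, 3744),   # width_mm, height_mm, w_px, h_px
    "P45":  (49.1, 36.8, 7216, 5412),
}
for name, (w_mm, h_mm, w_px, h_px) in specs.items():
    pitch_um = w_mm / w_px * 1000        # pixel pitch in microns
    area_mm2 = w_mm * h_mm               # sensor area
    print(f"{name}: pitch {pitch_um:.1f} um, area {area_mm2:.0f} mm^2")

# Pitches come out near-identical (~6.4 vs ~6.8 um), and the P45's area
# is ~2.1x the 1Ds3's -- hence "basically two 1Ds3 sensors". Near-equal
# pitch is also why per-pixel resolution, DoF and noise behave alike.
```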
Whether the final image comes from one large sensor or from two smaller-sensor frames stitched together, the result is basically the same, provided the same lens is used.
Whilst it's true that the P45 has 16-bit processing, given all the puzzlement over the expected improvement of 14-bit over 12-bit in recent 35mm DSLRs, it's doubtful that the difference between the P45's 16 bits and the 1Ds3's 14 bits would be noticeable.
All of the above assumes that the objective in stitching is to match the file size and aspect ratio of the single P45 shot. If you change the aspect ratio of the stitched 1Ds3 image, as in a panorama, then the resulting 1Ds3 stitch will clearly be superior to the single P45 shot in every respect, since you would have to use a wider-angle lens on the P45 and then crop to the aspect ratio of the 1Ds3 stitch.
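The panorama case can be put in numbers too. A sketch under the same assumed published specs: crop a single P45 frame to the 3:1 aspect ratio of a two-frame landscape-orientation 1Ds3 stitch and see how many pixels survive.

```python
# Sketch of the panorama case: crop one P45 frame to the aspect ratio
# of a two-frame landscape-orientation 1Ds3 stitch. Specs are the
# commonly published figures (assumed, not taken from this thread).
P45_W_MM, P45_H_MM, P45_W_PX, P45_H_PX = 49.1, 36.8, 7216, 5412
DS3_W_MM, DS3_H_MM, DS3_MP = 36.0, 24.0, 21.0

# Two landscape 1Ds3 frames side by side (overlap ignored): 72 x 24 mm, 3:1.
stitch_aspect = (2 * DS3_W_MM) / DS3_H_MM        # 3.0

# Cropping the P45 to 3:1 keeps the full width but only part of the height.
crop_h_mm = P45_W_MM / stitch_aspect             # ~16.4 of the 36.8 mm
crop_px = P45_W_PX * round(P45_H_PX * crop_h_mm / P45_H_MM)
print(f"P45 cropped to 3:1: {crop_px / 1e6:.1f} MP "
      f"vs ~{2 * DS3_MP:.0f} MP for the stitch (before overlap)")
```

Roughly 17 MP for the cropped P45 against 40-odd MP for the stitch, which is why the stitch wins this comparison outright.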