Hi,
A lens profile will not handle it, as the problem is not a problem of the lens but of perspective. It is not possible to render the surfaces correctly in a rectilinear projection.
Hi Erik,
You and Bill are correct that the issue has to do with anamorphic perspective distortion; it stems from the projection of a 3-dimensional subject onto a flat plane (our sensor) at an angle. However, given that flat-plane projection,
the real issue is the viewing position of that projection. The image will look undistorted when viewed from the correct (proportionally scaled)
center of projection(!). It's just like lettering on roads, which looks undistorted when we approach it, but very stretched when we're looking from too close or from the side.
Correcting the situation geometrically is a two-fold operation, and I've added an overview screen-capture sample from my Pano application as an attachment.
- First, there is a slight rotation and a non-level optical axis from the shooting position to correct for.
- Second, the vertical shift needs to be compensated for to restore correct heights.
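For those who like to see the geometry in code, the two corrections above can be sketched as a single 3x3 homography (a minimal, hypothetical numpy sketch with names of my own choosing, not taken from any particular application): a small roll rotation about the optical axis, followed by a vertical offset of the image center.

```python
import numpy as np

def correction_homography(roll_deg, shift_px):
    """Sketch of the two-fold correction: undo a small roll of the
    camera about its optical axis, then move the image center up or
    down to compensate a vertical (rise/fall) lens shift."""
    a = np.radians(roll_deg)
    # Rotation about the image center (optical axis).
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    # Vertical translation, in pixels.
    T = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, shift_px],
                  [0.0, 0.0, 1.0]])
    return T @ R

# Map an image point (x, y) through the combined correction:
H = correction_homography(roll_deg=1.5, shift_px=-2000.0)
p = H @ np.array([100.0, 200.0, 1.0])
```

This is only meant to show the order of operations (rotate first, then shift); a real stitcher works with its own internal camera model.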
Then the resulting image (see second attachment) should be viewed from the correct projection viewpoint, which is now at the horizon line near the bottom edge of the image, and from a distance of (and here is the real problem) 3.3 mm! For anyone myopic enough to pull that off with one eye, the perspective will look perfectly normal, just as it was in real life. When the image is viewed from further away, it will look distorted and stretched, because the wrong viewing point is used.
The only geometrically correct solution is to proportionally enlarge the image enough to allow viewing from a more comfortable distance. So for viewing it at a normal reading distance (say approx. 12 inches), it should be magnified to 10x its size on display, or some 30x when printed at 300 or 360 PPI. The original, without the subsequent pano projection correction for the shift, should be viewed from the upper edge. That is not common, because it might require a ladder at decent output sizes on a wall, while the pano-corrected version should be viewed from the level of the natural horizon in the image.
To prevent having to jump through these hoops, one can fudge a bit and apply some warp distortion.
As for the rectilinear projection math involved, a Pano stitcher can be set up quite easily to make sure that the image is squared correctly, by adding a few vertical line control points. When a shift was used, that should also be manually input as a vertical offset. A 12mm shift on a sensor that is 24mm high will require a vertical shift of the image center by 12/24 = 50% of the number of vertical pixels. That will put the apparent pano horizon near the top edge of the image, and after the pano stitcher did its corrections the horizon is at its natural image position again.
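The bookkeeping for that shift input is a simple proportion, sketched here as a hypothetical helper (the function names are mine, not any stitcher's API):

```python
def shift_fraction(shift_mm, sensor_height_mm):
    """Fraction of the image height by which the image center must be
    offset vertically when a shift lens was used."""
    return shift_mm / sensor_height_mm

def shift_pixels(shift_mm, sensor_height_mm, image_height_px):
    """The same offset expressed in vertical pixels."""
    return shift_fraction(shift_mm, sensor_height_mm) * image_height_px

# A 12 mm rise on a 24 mm tall sensor: 50% of the vertical pixels.
print(shift_fraction(12, 24))        # 0.5
print(shift_pixels(12, 24, 4000))    # 2000.0
```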
Since in this case the original sensor image was 36x24mm, and the image was projected by the lens from 24mm distance (the focal length at infinity focus), the output viewing distance scales proportionally. To view from 10x 24mm distance, the sensor image needs to be enlarged 10x (=360x240mm), to view it from say 1 metre distance, it should be magnified 1000/24mm = 41.7x (1500x1000mm output size). These numbers are approximate, one should formally use the exit pupil distance instead of the focal lenght, but it's close enough to get an idea of the implications.
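That proportional scaling fits in a couple of lines (a hypothetical sketch; as noted, it uses the focal length as a stand-in for the exit pupil distance):

```python
def magnification(viewing_distance_mm, focal_length_mm):
    """Enlargement needed so the correct center of projection lands
    at the chosen viewing distance."""
    return viewing_distance_mm / focal_length_mm

def output_size_mm(sensor_w_mm, sensor_h_mm, viewing_distance_mm, focal_length_mm):
    """Output print/display size that makes the given viewing
    distance the geometrically correct one."""
    m = magnification(viewing_distance_mm, focal_length_mm)
    return sensor_w_mm * m, sensor_h_mm * m

# 36x24 mm frame, 24 mm lens, viewed from 1 metre:
print(output_size_mm(36, 24, 1000, 24))  # (1500.0, 1000.0)
```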
Any (anticipated) deviation from the 'correct' viewing position will seemingly introduce a distortion, which can be compensated for by introducing a warp distortion.
Viewing from the 'wrong' position/distance can also be used creatively, e.g. to 'enhance' a wide-angle effect by viewing from 'too far away', or a compressed telephoto effect by viewing from 'too close by'.
Cheers,
Bart