I've been trying to work this out in my head for a few weeks; unfortunately, the search terms are so vague that asking here is all I can think to do:
There is a rock ten feet in front of you. There is a mountain one mile in front of you. You take a handheld picture with both in the frame, and you yaw the camera slightly during the exposure.
Will the rock or the mountain be more blurry (or will the detail in each be blurred by the same amount)?
I'm not talking about an apparent loss of detail, but rather the actual pixel-by-pixel (or grain-by-grain) shift in each.
What spurred this thought is that when you're shooting a rifle, the inaccuracy caused by even the slightest yaw or pitch is amplified by your distance from the target. However, when shooting a camera, the sensor or film is the target, and the light can perhaps be thought of as the bullets.
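To put a rough number on the rifle case (just my own back-of-the-envelope arithmetic, with made-up figures for the angular error and the distances), the lateral miss is distance × tan(angular error), so it scales directly with distance:

```python
import math

# Back-of-the-envelope check with made-up numbers: a fixed angular error
# in the rifle produces a lateral miss proportional to the distance.
angular_error_deg = 0.1  # hypothetical yaw/pitch error, in degrees
for distance_yd in (100, 600):
    miss_in = distance_yd * 36 * math.tan(math.radians(angular_error_deg))
    print(f"{distance_yd} yd: off target by about {miss_in:.1f} in")
```

Doubling the distance doubles the miss, which is the amplification I mean.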
The analogy is further complicated by the fact that a bullet striking a target is (for all competition purposes) an instantaneous event, whereas the duration of an exposure is highly relevant (ironically, given that light is far faster than any projectile). Perhaps camera yaw can be thought of as a rifle target moving very quickly, so that the bullet not only punches a hole but also tears sideways through it a bit?
Or, if distance is irrelevant to yaw blur, perhaps it's like a piece of paper with a rod passing through it at every point and extending out to infinity: yawing the paper makes it tear around the rods, but how far away the object is has no effect.
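In case it helps, here is the toy model I keep trying to reason with: a minimal sketch under my own assumptions (a pinhole camera rotating exactly about its pinhole, no translation at all, and made-up focal length, pixel pitch, and yaw), where a point at angle alpha off the optical axis lands at f·tan(alpha) on the sensor:

```python
import math

# Toy pinhole model, purely my own assumptions: the camera rotates about
# its pinhole (no translation, which handheld shake surely isn't), and a
# point at angle alpha off the optical axis lands at f * tan(alpha).
f_mm = 50.0             # assumed focal length
pixel_pitch_mm = 0.005  # assumed 5-micron pixel pitch
yaw_deg = 0.2           # assumed yaw accumulated during the exposure

def image_x_mm(object_x_m, object_z_m, yaw_rad):
    """Horizontal image coordinate after yawing the camera by yaw_rad."""
    alpha = math.atan2(object_x_m, object_z_m) - yaw_rad
    return f_mm * math.tan(alpha)

for name, z_m in (("rock, 10 ft", 3.05), ("mountain, 1 mile", 1609.0)):
    x_m = 0.0  # both sit roughly straight ahead of the camera
    smear_mm = image_x_mm(x_m, z_m, math.radians(yaw_deg)) - image_x_mm(x_m, z_m, 0.0)
    print(f"{name}: image smear of roughly {abs(smear_mm) / pixel_pitch_mm:.0f} px")
```

In that model the distance drops out entirely, which is what the paper-and-rod picture suggests; whether it actually describes a handheld camera (where the rotation isn't about the pinhole and some translation is mixed in) is exactly what I can't convince myself of.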
Please help.