I am scanning some 35mm Kodacolor negs using an Epson V750. Is 1200 DPI a good scanner setting to use? I am trying to balance detail extraction with scanning speed. Any other tips would be welcome - I am totally new to this.
Hi Peter,
I've been scanning film for a long time, probably close to 20 years, but less so since my digital camera files exceeded 16 MP (which roughly(!) matches 4000 PPI). There is still a lot of misinformation going around about how scanners and film interact, but it is essentially quite simple once you understand the factors that play a role. Let's concentrate on resolution for now, since that's your main question.
First we need to establish what resolution the film actually captured. If detail isn't resolved on film, you won't resolve it in a scan. This immediately becomes a question that's hard to answer, because it depends on the interaction of the film (slow/thin emulsion or not, low or high contrast), the camera optics (focus/diffraction/aberrations), and the shooting conditions (tripod/handheld).
However, we can empirically demonstrate that it is possible to extract more detail from film at scanning densities of up to 6000-8000 PPI, presuming that such a level of detail was captured on film to begin with. That also helps to reduce grain aliasing, but that's a somewhat different subject.
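Where do numbers like that come from? To sample detail of a given spatial frequency without aliasing, the scan needs at least two samples per line pair (Nyquist), and some extra margin on top of that helps against grain aliasing. A minimal sketch of that conversion in Python (the example lp/mm figures are only assumptions for illustration, not measurements of any particular film):

```python
# Sketch: scan PPI needed to Nyquist-sample a given film resolution (lp/mm).
# The example lp/mm values below are illustrative assumptions.
MM_PER_INCH = 25.4

def min_scan_ppi(film_lp_per_mm: float, oversampling: float = 1.0) -> float:
    """Minimum scan density (PPI) to sample 'film_lp_per_mm' line pairs/mm.

    Nyquist requires 2 samples per line pair; 'oversampling' > 1 adds margin
    (which also helps to reduce grain aliasing).
    """
    return film_lp_per_mm * 2.0 * MM_PER_INCH * oversampling

for lp_mm in (40, 60, 80, 100):          # assumed levels of detail on film
    print(f"{lp_mm:>3} lp/mm -> {min_scan_ppi(lp_mm):4.0f} PPI "
          f"(with 1.5x margin: {min_scan_ppi(lp_mm, 1.5):4.0f} PPI)")
```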
Now, the difference between, say, 4000 PPI and higher sampling densities is in the range of diminishing returns, but it still helps. However, this also presumes that the scanner being used is capable of delivering high MTF at those resolutions, and this is where things change quite a bit with the Epson V7xx and V8xx scanners compared to dedicated film scanners, or especially drum scanners.
One major obstacle is getting the film into the plane of best focus, which involves the height of the film in the film holder, film warp, and even movement during the scan due to heat build-up. Then there is the quality of the scanner optics (lens/prism/mirrors, whatever is included), and the relatively poor protection against veiling glare (also because a large area of the film is exposed while only a small strip is being scanned).
Another obstacle is that it's hard to predict how the MTF of the film image and the MTF of the scanner (assuming perfect focus and a clean lens) will combine into a system MTF.
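For those who like to see it in numbers: a common simplifying assumption is that the component MTFs multiply at each spatial frequency, i.e. the system MTF is the product of the film MTF and the scanner MTF. A minimal sketch with made-up Gaussian-shaped responses (the shapes and the 50/40 cycles/mm constants are purely illustrative assumptions):

```python
import numpy as np

# Sketch: cascading component MTFs into a system MTF (assumed multiplicative).
# The Gaussian shapes and the 50/40 cycles/mm constants are illustrative
# assumptions, not measured data for any particular film or scanner.
f = np.linspace(0, 120, 121)                  # spatial frequency in cycles/mm

mtf_film    = np.exp(-(f / 50.0) ** 2)        # assumed film+taking-lens response
mtf_scanner = np.exp(-(f / 40.0) ** 2)        # assumed scanner response
mtf_system  = mtf_film * mtf_scanner          # system response (product)

# Report where the system response drops to 10% (a rough "limiting" contrast).
idx = np.argmax(mtf_system < 0.10)
print(f"System MTF falls below 10% near {f[idx]:.0f} cycles/mm")
```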
One possible way of establishing the practical upper limit on scanning PPI with a particular scanner is to determine the limiting resolution of the scanner itself, and then assume a perfect film image (which it isn't). The ISO 16067-2 procedure describes the best way of doing that: it involves scanning a slanted edge, which then allows one to produce an MTF curve that should tell us where to draw the line on practical resolution limits.
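For the curious, here is a very stripped-down sketch of the slanted-edge idea (synthetic image, no noise handling or careful edge fitting, so it is only meant to show the chain of steps, not a substitute for a proper ISO 16067-2 implementation):

```python
import numpy as np

# Sketch of the slanted-edge chain: ESF -> LSF -> MTF, on a synthetic image.
h, w, angle_deg, oversample = 128, 128, 5.0, 4
yy, xx = np.mgrid[0:h, 0:w]

# Synthetic, slightly slanted edge (dark left, bright right), mildly blurred.
dist = (xx - w / 2) * np.cos(np.radians(angle_deg)) \
     - (yy - h / 2) * np.sin(np.radians(angle_deg))
img = 1.0 / (1.0 + np.exp(-dist / 1.5))        # smooth edge transition

# Project every pixel onto the edge normal and build an oversampled ESF.
bins = np.round(dist * oversample).astype(int)
bins -= bins.min()
esf = np.bincount(bins.ravel(), weights=img.ravel()) / np.bincount(bins.ravel())

lsf = np.gradient(esf)                          # line spread function
lsf *= np.hanning(lsf.size)                     # taper to limit FFT leakage
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                                   # normalize to 1 at DC
freq = np.fft.rfftfreq(lsf.size, d=1.0 / oversample)  # cycles per pixel

print("MTF50 at ~%.2f cycles/pixel" % freq[np.argmax(mtf < 0.5)])
```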
The 1951 USAF resolution test chart, which was mentioned, is designed to be used with analog recording systems (i.e. film), and is not that well suited for discrete sampling systems such as line scanners and digital camera sensors. Its main issues are that the contrast is unreasonably high (lower-contrast detail will already be unresolved while higher-contrast detail can still be resolved) and, more importantly, that the bar patterns may or may not align with the sensel pitch. That makes it sensitive to positioning, which can result in up to a factor of 2 difference in measured resolution between tests.
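A toy simulation illustrates that positioning sensitivity (box-shaped sensels, a bar pattern exactly at the Nyquist frequency; all values here are assumptions for the demonstration):

```python
import numpy as np

# Toy demo: sampling a bar pattern at the Nyquist frequency with box-shaped
# sensels. Depending on how the bars line up with the sensel grid, the
# recorded contrast varies from full to zero, so a USAF-style bar target can
# look "resolved" or "unresolved" purely due to positioning.
def sampled_contrast(phase: float, n_sensels: int = 64, sub: int = 1000) -> float:
    """Average a 1-pixel-wide bar pattern over each 1-pixel sensel aperture."""
    x = (np.arange(n_sensels * sub) + 0.5) / sub + phase   # fine sub-positions
    bars = (np.floor(x) % 2 == 0).astype(float)            # bars of width 1 px
    sensels = bars.reshape(n_sensels, sub).mean(axis=1)    # box integration
    return sensels.max() - sensels.min()

for phase in (0.0, 0.25, 0.5):
    print(f"bar/sensel offset {phase:4.2f} px -> contrast {sampled_contrast(phase):.2f}")
```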
Another, much more practical method, if one can still produce a film image with the equipment and film/processing, is to scan a (sinusoidal) grating at various rotations as captured on the actual film. My free radial grating (Star) test target is very well suited for such tests. It allows one to visually compare performance, and to quite accurately quantify the limiting system resolution of the capture+scanning chain, all the way up to the Nyquist frequency of the scanner/digitizer.
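A quick sketch of the idea behind such a radial grating (the cycle count and image size here are arbitrary illustration values, not the exact specification of the downloadable target):

```python
import numpy as np

# Sketch: render a sinusoidal radial (Siemens star) grating.
# 72 cycles and 1024x1024 pixels are arbitrary illustration values.
size, cycles = 1024, 72
y, x = np.mgrid[0:size, 0:size] - size / 2.0
theta = np.arctan2(y, x)

star = 0.5 + 0.5 * np.cos(cycles * theta)      # intensity varies with angle
# The local spatial frequency increases toward the center (cycles / (2*pi*r)),
# so the radius at which the detail blurs away reveals the limiting resolution.

# Save as an 8-bit grayscale PNG if Pillow is available.
try:
    from PIL import Image
    Image.fromarray((star * 255).astype(np.uint8)).save("star_target.png")
except ImportError:
    pass
```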
Here is an example of a test I did on a much older target design with a dedicated film scanner. Those old webpages were not maintained/expanded when a newer model scanner was introduced, since I received fewer questions that needed to be answered specifically for that first-generation 5400 scanner. However, it still shows that the effective scan resolution increased with scanning/sampling density on those dedicated film scanners for home use. BTW, an effective system resolution of 76.1 lp/mm on that SE5400 resembles a direct digital camera capture with an approx. 6.4 micron sensel pitch.
Concluding (I'm jumping ahead a bit), we can assume that there is a benefit to scanning at at least 2400 PPI on the Epson scanners, with diminishing returns for resolution all the way up to 6400 PPI. Scanning at the higher densities will help to reduce grain aliasing a bit, but the limited scan resolution (and relatively diffuse lighting) already have a 'positive' effect on the visibility of noise.
Personally, I would scan at the full resolution capability and down-sample for storage. The intended use determines how much loss, and therefore how much down-sampling, we can tolerate.
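As a practical example of that workflow (the file names and the 50% target are placeholders, and Lanczos resampling via Pillow is just one reasonable choice, not the only one):

```python
from PIL import Image

# Sketch: archive-oriented down-sampling of a full-resolution scan.
# "scan_6400ppi.tif" and the 50% target size are placeholder values.
src = Image.open("scan_6400ppi.tif")

new_size = (src.width // 2, src.height // 2)            # e.g. 6400 -> 3200 PPI
small = src.resize(new_size, resample=Image.LANCZOS)    # high-quality filter

small.save("scan_3200ppi.tif", compression="tiff_lzw")  # lossless TIFF
```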
Cheers,
Bart