Hi,
I have about 500,000 images shot originally as JPEG (high production volume originally plus expensive storage meant it was the most sensible option for a while) plus about 250,000 shot as RAW with Canon 5D, 5DII, 7D, Panasonic GF1 and Hasselblad H3D31-II. The majority of new shoots are done on the Hasselblad.
I've already had to migrate this collection across several different storage solutions, and also made the transition from PC to Mac for all my work in the last couple of years. Here's my current workflow, a lot of valuable suggestions for which came from the DAM book by Peter Krogh.
I shoot sets of photos. Each set tells a specific story, is pretty much self-contained and contains 50-150 images on average. I now treat a photoset as the basic unit of organisation for my work, having learned that alternatives like filing by model or by date are not as useful, since sets can get processed a long way out of chronological order.
One directory per photoset, prefixed with a code which tells me the originator of the set (as I buy unprocessed photosets from some other photographers), then a unique sequential set number within that originator, plus the model's name and a descriptive suffix. For example:
RE0001_Ariel_InTheHighAlps
RE0002_SammieB_WhiteSatinDress
AL0001_Delta_Cowgirl
VID0001_KatyCee_NewsgirlInNorway (for videos: one directory per video storyline)
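Because the names follow a fixed pattern, they're easy to script against. Here's a rough Python sketch of a parser; the four-digit zero-padded set number and the underscore separators are inferred from the examples above, so adjust to taste:

```python
import re

# Pattern: originator code + zero-padded set number + _Model_DescriptiveSuffix
# (the four-digit padding is an assumption based on the example names)
PATTERN = re.compile(r"^([A-Z]+)(\d{4})_([^_]+)_(.+)$")

def parse_photoset(name):
    """Split a photoset directory name into its parts, or return None."""
    m = PATTERN.match(name)
    if not m:
        return None
    originator, number, model, suffix = m.groups()
    return {"originator": originator, "number": int(number),
            "model": model, "suffix": suffix}

print(parse_photoset("RE0002_SammieB_WhiteSatinDress"))
# {'originator': 'RE', 'number': 2, 'model': 'SammieB', 'suffix': 'WhiteSatinDress'}
```

Handy if you ever want to batch-verify that every directory on disk follows the convention.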
I first of all copy photos off the cards via a FireWire 800 reader, either with the Finder (for Canon/Panasonic shots) or using Phocus to make FFF files (for the Hasselblad). I put these into a temporary area called "Virgin Imports" so I can look at them, but as soon as possible after the shoot I organise them into the photoset directory. From then on, all the contents of that directory travel together, so everything belonging to a set stays with it. I do this file organisation immediately after import.
At the earliest opportunity, I will keyword the photos as well. I use Phocus for the Hasselblad files.
All photoset directories then get copied to a working directory on my main RAID, whence they are backed up to a 16 TB DroboPro, which is in turn backed up to a rotating offsite backup of two more disk sets for a second DroboPro. The working disk is also backed up to an 8 TB Drobo via Time Machine, so anything I'm actively working on, or which has yet to be processed, gets hourly backups.
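The "copy new photosets to the backup volume" part of this can be scripted. A rough Python sketch, with made-up volume paths standing in for the working RAID and the DroboPro; it only copies sets that don't already exist on the destination, so it's safe to re-run:

```python
import shutil
from pathlib import Path

# Hypothetical mount points; substitute your own volumes.
WORKING = Path("/Volumes/WorkRAID/photosets")
BACKUP = Path("/Volumes/DroboPro/photosets")

def mirror_new_sets(src, dst):
    """Copy any photoset directory in src that is missing from dst."""
    dst.mkdir(parents=True, exist_ok=True)
    for photoset in sorted(p for p in src.iterdir() if p.is_dir()):
        target = dst / photoset.name
        if not target.exists():
            shutil.copytree(photoset, target)
            print(f"backed up {photoset.name}")

if WORKING.exists():
    mirror_new_sets(WORKING, BACKUP)
```

Note this deliberately never deletes or overwrites anything on the backup side; removals have to be done by hand, which suits an archive.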
At this point, I import into Aperture.
I have one massive library with the old JPEG-based shoots. At over half a million shots, this is actually rather unwieldy, but it is done now.
I have a second library containing more recent processed sets.
Then I have my third, working library. I keep all the more recent stuff in here, periodically exporting it and re-importing to the second (processed) library. I just have a "Processed" folder in the working library which I drag projects to when they are done, and periodically export that then clear it out. When I do this I mirror it on the physical work drive, copying everything to its final resting place on the main 16 TB DroboPro as a processed photoset.
I make one project for each photoset, named according to its unique directory name above. I do everything by referenced masters, because I've already had to go through several different cataloguing and image processing programs and I want to keep the primary physical organisation intact, since I've learned by bitter experience what the best way to manage these files is.
This does lead to a profusion of projects, but clearing processed photosets out of the working Aperture library every few months keeps it manageable.
When it comes time to process a photoset, I'll do it in Aperture most of the time. The exception is if the colours are tricky or I need DAC for the Hasselblad files: in that case I'll make ProPhoto RGB 16-bit TIFFs in Phocus, import them into Aperture alongside the FFF files, and use autostack to group them together. I'd love to be able to keep these intermediate TIFFs, but the data volume is impossible right now (180 MB per file times several thousand shots per month). Anyway, for their brief existence, they live in the TIFFS directory inside the photoset's own directory.
With this structure of lots of small projects, one per photoset, I do not need any further subdivision within Aperture. I mark all acceptable shots with one star, and exceptional shots with two, but never mark anything higher than that in the working library: any decision as to which shots get elevated to higher star status is only made much later, at my leisure. I do have to denote a "main" and four "preview" images for each photoset, which I do in the "Instructions" IPTC panel because that's what my website software is set up to read, and it deals with that automatically.
When I am done, I export the versions to full-sized JPEG in the "full" subdirectory of the photoset's directory, and production sized JPEG in the "processed" subdirectory. Then I export the project as a new library to that directory as well, to ensure that all the metadata and processing decisions are stored along with the images. Ideally I'd prefer to export per-file XMP sidecar files, automatically, the way that Lightroom does it, but at least this way the data is still around and I don't have to fiddle around exporting masters, then deleting all the duplicates in order to just get hold of the XMP files. Annoyingly, Phocus uses a DIFFERENT format XMP sidecar file. Teeth grind.
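Since every photoset directory ends up with the same subdirectories ("full" and "processed" for the JPEG exports, "TIFFS" for the short-lived intermediates), a tiny helper can lay out the skeleton when a new set is created. A hypothetical sketch:

```python
from pathlib import Path

# The standard subdirectories inside each photoset directory:
# "full" and "processed" for JPEG exports, "TIFFS" for intermediate TIFFs.
SUBDIRS = ("full", "processed", "TIFFS")

def make_photoset_skeleton(photoset_dir):
    """Create the photoset directory and its standard subdirectories."""
    root = Path(photoset_dir)
    for sub in SUBDIRS:
        (root / sub).mkdir(parents=True, exist_ok=True)
    return root

# e.g. make_photoset_skeleton("RE0002_SammieB_WhiteSatinDress")
```

Using `exist_ok=True` means it's harmless to run on a directory that's already set up.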
I then upload sets to my website, and most of the importing chores are now done automatically by the website scripts reading the metadata.
When sets are done, they get moved from the working directory on my working RAID to the processed directory, and backed up from there to my main DroboPro (and thence to the offsite backups). This means that I keep a pristine copy of the photoset AND the processed copy; this may be unnecessary and I'll revisit once the DroboPro fills up.
Phew. Sorry, you did ask. Read the DAM book for why a lot of this is a good idea!
Cheers, Hywel.