Any comments on the testing cited by Breathing Color? They mention the Fine Art Trade Guild and Blue Wool fade testing. I could probably guess your response, but would love to hear your thoughts on their validity.
To the credit of Breathing Color, the company actually puts a lot of effort into discussing image permanence issues on its website. Unfortunately, it's hard to find the "meat" in the archival quality certificate BC awards to its own products. Here are two excerpts from the BC Archival Quality Certificate:
1) "Breathing Color is committed to performing extensive independent testing by third-party accredited laboratories in order to ensure that our fine art products meet and exceed the standards set forth by Fine Art Trade Guild (FATG) for pH and lightfastness".
2) "THIS CERTIFICATE SERVES TO ENSURE THAT [this product] Meets the Archival Standards set forth by Fine Art Trade Guild for pH and lightfastness".
What I wasn't able to find on the website was which third party ran the tests, copies of the actual test reports, or what printers and inks were used to conduct the lightfastness portion of the Fine Art Trade Guild's Blue Wool-derived light fade test. The lightfastness part of the FATG specification is, by its very design, a "system test". You can't just test the paper. You must test the real artwork, which for a fine art inkjet print means the printer screening pattern, the ink, the paper or canvas, any post treatments or coatings, and most importantly the artist's actual image. The FATG light fade method is therefore a sacrificial test performed on one real print. Hence, the test is mainly applicable to a limited edition run of prints all made at the same time. At best, a manufacturer could approximate but not precisely replicate an FATG light fade test by choosing a generic test target image, and even then it would still have to be a total-system analysis, not a test of the media by itself.
Not to belabor the issue, but the FATG light fade test is really quite interesting and worth a little more discussion.
As I understand it, the Guild has specified a visual test which artists can perform on their own artwork (or it can be run in an accelerated light fade chamber). It is based on comparing the fading of the artwork to the fading of an ISO Blue Wool #6 color patch when both are exposed to the same incident light source. The ISO Blue Wool dyes (there are eight of them) have been studied to death and have a long history of use in the museum and archives community as well as in the textile industry, where they originated. They also have batch-to-batch variability issues and many other quirks, but I don't mean to disparage them. They have served their intended purpose well over the years.
The Guild's test method is roughly this: one print and one BW#6 color patch are stored in the dark while another identical print plus BW#6 patch is exposed to light (presumably a sunlit window test). The viewer must observe a just noticeable change in the BW#6 color patch before seeing any noticeable change in the artwork when comparing the light-exposed samples to the dark-stored samples. If the artwork shows signs of fading or discoloration before fade is observed in the BW#6 patch, the print fails the FATG test specification.

On its fundamental merits, this approach is really quite good because it sets a rigorous standard for allowable change at the test endpoint that I believe most fine art printmakers would find desirable, i.e., little or no noticeable light-induced fading of the artwork at a specified exposure dose. Another positive aspect of using an actual image is that the test addresses the important issue of image-specific fading, whereby different combinations of colors and tones can make signs of fading easier or harder to notice. A disadvantage, however, is the sacrificial nature of the test: the method appears to be useful only for limited edition print sets where one print can be sacrificed.

All that said, the Guild's basic testing concept actually had a great deal of influence on me as I was working out the instrumented colorimetric approach I now use to specify the AaI&A conservation display ratings (CDRs). I simply take the test standardization and the measurement of "little or no visual change" a few steps further by using spectrophotometric instrumentation to detect that change and by bolting down the spectral illuminant, temperature, and RH during the test cycle in order to create a fully standardized test protocol. Moreover, the AaI&A CDR scores give comparative figures of merit for product (total system) performance, whereas the Guild test is a pass/fail rating only.
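The Guild's pass/fail logic can be sketched in a few lines of code. This is only my illustration, not the Guild's specification: the real test relies on human judgment, so here I use a color-difference (delta-E) value as an instrumental stand-in for "just noticeable change", and the threshold value and function names are my own assumptions.

```python
# Sketch of the FATG pass/fail logic. A delta-E color difference stands in
# for the visual judgment of a "just noticeable change"; the threshold value
# is an assumption for illustration, not part of the Guild specification.

JND_DELTA_E = 1.0  # assumed instrumental proxy for a just noticeable difference

def fatg_pass(print_delta_e: float, bw6_delta_e: float) -> bool:
    """Pass if the BW#6 patch shows a just noticeable change
    before the artwork does."""
    if bw6_delta_e < JND_DELTA_E:
        # Test endpoint not yet reached; keep exposing the samples.
        raise ValueError("BW#6 patch has not yet reached its endpoint")
    return print_delta_e < JND_DELTA_E

print(fatg_pass(print_delta_e=0.4, bw6_delta_e=1.2))  # True -> print passes
print(fatg_pass(print_delta_e=1.5, bw6_delta_e=1.1))  # False -> print fails
```

The key point the sketch captures is that the BW#6 patch, not a clock, defines the test endpoint: the artwork is only judged once the reference patch has visibly faded.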
In a search through the art conservation literature, I found the critical exposure dose information we need in order to understand the lightfastness of the Blue Wool #6 patch and, consequently, the Guild's pass/fail performance expectation. The reported exposure dose is 100 Megalux-hours for a just noticeable fade of BW#6. It is also important to note that this 100 Megalux-hour exposure was achieved with a UV-excluded light source (consistent with what museum lighting specialists would do when illuminating valuable artwork on display). Additionally, the UK Guild test has generally been interpreted to mean 100 years on display before any just noticeable light-induced fading occurs. That means 100 years is being equated to a 100 Megalux-hour exposure dose. Hence, the FATG's 100-year interpretation for BW#6 lightfastness establishes a one-to-one Megalux-hours-to-years correspondence, which implies an assumed average illumination level of approximately 225 lux for 12 hours per day. This is about half the illumination level cited by WIR in its predictions (450 lux per 12-hour day) but nearly twice what Kodak cites for its predictions (120 lux per 12-hour day). You can probably understand why I finally concluded that predicting years on display for light-induced fading is a fool's game. Letting end-users extrapolate rated Megalux-hour exposure doses to their own specific lighting conditions may be a little harder on the end-user than dumbing the rating down to "years on display", but it's not that hard, and it results in a more honest assessment of lightfastness performance. Prints in real-world display conditions can encounter "normal" illumination levels that range over more than 1000:1.
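The arithmetic behind that "approximately 225 lux" figure is easy to check. If 100 Megalux-hours is equated to 100 years at 12 hours of illumination per day, the implied average level falls out directly:

```python
# If 100 Mlux-hr of exposure is equated to 100 years on display at
# 12 hours of illumination per day, what average lux level is assumed?
dose_lux_hours = 100e6           # 100 Megalux-hours
years = 100
hours_per_day = 12

implied_lux = dose_lux_hours / (years * 365 * hours_per_day)
print(round(implied_lux))        # ~228 lux, i.e. roughly the 225 lux figure
```

The same division shows why the WIR (450 lux) and Kodak (120 lux) assumptions produce such different "years on display" claims from an identical exposure dose.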
A global "average" print illumination value doesn't mean a whole lot compared to specific knowledge of a print's true display environment, and as you can see from the roughly 4:1 discrepancy between the WIR and Kodak predictions, that "average" illumination condition isn't an easy factor to quantify and agree upon, either. Megalux-hour ratings resolve that dispute.
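The end-user extrapolation I'm advocating amounts to one small formula. The helper below is my own illustration (the function name and defaults are assumptions), showing how the same Megalux-hour rating translates into very different "years on display" under the WIR and Kodak lighting assumptions:

```python
def years_on_display(rating_mlux_hours: float, lux: float,
                     hours_per_day: float = 12) -> float:
    """Convert a Megalux-hour exposure rating into predicted years
    under the user's own display conditions (illustrative helper)."""
    lux_hours_per_year = lux * hours_per_day * 365
    return rating_mlux_hours * 1e6 / lux_hours_per_year

# One 100 Mlux-hr rating, two "average illumination" assumptions:
print(round(years_on_display(100, lux=450)))  # ~51 years  (WIR assumption)
print(round(years_on_display(100, lux=120)))  # ~190 years (Kodak assumption)
```

A single dose rating thus serves every display environment; the user supplies the lux level rather than the lab guessing it for them.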
Finally, to summarize my own light fade method in use at AaI&A with respect to the FATG BW#6 comparative performance expectation: if AaI&A test samples reach 100 Megalux-hours in test before triggering the CDR limits (note that I do include some UVA energy content in my light fade testing), then 100+ Megalux-hour CDR scores should qualify print materials to also pass the UK Guild lightfastness test. I've got some Breathing Color canvas and paper samples printed with Epson K3 and K3VM ink sets in test now, so we will find out in about a year whether they reach the 100 Mlux-hr mark without triggering CDR limits. It takes a while to rack up 100 Mlux-hrs in my less aggressively accelerated testing units. If the samples do achieve 100+ Megalux-hour ratings, the results apply to these tested systems only; other printers and inks would inevitably have different CDR scores.
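As a rough sanity check on that "about a year" figure, one can back out what chamber intensity is consistent with it. The 12 klux value below is purely my illustrative assumption; the source doesn't state the actual AaI&A chamber level:

```python
# Rough test-duration arithmetic: how long a fade chamber takes to
# accumulate 100 Mlux-hr. The chamber intensity here is an assumed,
# illustrative value, not the actual AaI&A test condition.
target_lux_hours = 100e6
chamber_lux = 12_000                 # assumed continuous chamber illumination
hours = target_lux_hours / chamber_lux
print(round(hours / (24 * 365), 1))  # ~1.0 year of continuous exposure
```

In other words, a moderately accelerated chamber in the low tens of kilolux needs on the order of a year of continuous operation to deliver a 100 Mlux-hr dose, which matches the timeline mentioned above.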
I'm curious to know what you thought I might say, but I hope my remarks offer a reasoned reply to your question.