I was hoping to get some informed comments about why manufacturers are doing this and, specifically, whether it can be ascribed to Color Appearance Models, which suggest that increased saturation and a brightened tone curve will produce a printed image that, after the eye has had time to adapt, appears closest to what was on the computer monitor.
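To make the idea concrete, here is a minimal sketch of the kind of default adjustment a CAM-inspired driver might apply: a global saturation gain plus a brightening tone curve. The gain and gamma values are illustrative numbers of my own, not any manufacturer's actual pipeline.

```python
# Sketch only: a global saturation boost plus a brightening tone
# curve (gamma < 1), roughly the adjustment described above.
import colorsys

def cam_style_boost(rgb, sat_gain=1.15, gamma=0.9):
    """Apply a saturation gain and a brightening tone curve to one
    normalized RGB triple (components in 0..1). Values are illustrative."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s = min(s * sat_gain, 1.0)  # push saturation up, clip at 1
    v = v ** gamma              # gamma < 1 lifts the midtones
    return colorsys.hsv_to_rgb(h, s, v)

# Example: a muted blue comes out slightly brighter and more saturated.
print(cam_style_boost((0.3, 0.4, 0.6)))
```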
In this instance, my informed comment comes from past experience. Sometimes a product can be touted as “better, new, improved” on the trade show floor through trickery. Years ago, certain machines were sold as “sharper than the competition” only because the default sharpening was always ON at its setting of zero; to actually turn it off, you had to set it to negative 3. Another trick was to mislead a prospect by quoting “lpi” as a resolution specification when it really meant “lines per image,” not the usual “lines per inch” (for example, a 3,000-line image printed 10 inches wide resolves only 300 lines per inch). Too many tricks to count.
So, if a printer engineer or manufacturer can create some way to impress a buyer, investor, accountant, or superior, that could explain a lot, and it may not actually be better, new, or improved!
My contacts at Kodak, Fuji, and Epson have long since retired and would not necessarily have any more technical information than we already have at this point. However, I’m within walking distance of the Canon Experience Center showroom in Costa Mesa, CA, so if anyone can provide the name of a contact, I would be happy to take the time to visit and ask questions. I know Canon has more than a couple of very savvy reps at trade shows, whom I’ve met and who could have insight; I just don’t have any of their names.