If this is always true and there's no fix (which seems hard to believe), then it's game over: never use such a device. What I suspect is that you are correct: like many devices, there's drift or a need for continual calibration, and a lot of labs simply dismiss as much of this work as possible. Again, a process using something like Maxwell, with email notifiers that alert the lab the minute a target exceeds a programmed maximum deltaE, would greatly reduce issues and redos. But turning off such a machine (or shutting down a press to fix a blanket) takes time away from production. Those who want the highest-quality output in a consistent fashion will do the necessary work to ensure proper calibration. Most others will simply pass the output on to the client, hoping they don't notice.
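The alert loop described above can be sketched in a few lines. This is only an illustration, not Maxwell's actual API or defaults: the patch names, Lab values, and the 2.0 tolerance are all made up, and the CIE76 formula stands in for whatever delta-E variant a real system would use.

```python
import math

MAX_DELTA_E = 2.0  # hypothetical tolerance; a real lab would set its own

def delta_e_76(lab1, lab2):
    """CIE76 delta-E: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def check_target(reference, measured, max_de=MAX_DELTA_E):
    """Return the target patches that drifted past the tolerance."""
    failures = []
    for name, ref_lab in reference.items():
        de = delta_e_76(ref_lab, measured[name])
        if de > max_de:
            failures.append((name, round(de, 2)))
    return failures  # a real system would email these to the operator

# Illustrative readings: a fresh reference and today's measurement.
reference = {"gray": (50.0, 0.0, 0.0), "skin": (70.0, 15.0, 18.0)}
measured  = {"gray": (50.5, 0.3, -0.2), "skin": (66.0, 17.0, 20.0)}
print(check_target(reference, measured))
```

Here the gray patch is within tolerance but the skin-tone patch has drifted, so only the latter would trigger a notification.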
A lot of truth in this. Calibration requires you to stop throughput, print out a target (which takes around five minutes), then read in the results. One issue is that advanced photographers want the consistency of an inkjet, which just isn't going to happen no matter what. Even a variation of a tenth of a degree in the chemistry can produce a change that might be visible when prints are compared side by side.
One real problem is the replenishment rate and machine utilization. The rate is based on an expected average density of the images being printed. There are circumstances where that can change dramatically even in a short time ... perhaps a large job where the images are all very dark or very light. Suddenly the developer is a little off. If you keep going it will average itself out, but some prints may exhibit differences. We'll print an order of several hundred Christmas cards, all on a high-key background, and almost always the last card will be slightly darker than the first because the machine has been over-replenishing based on the print densities. Some genius somewhere could probably design a program that controls the replenishment rate based on all the factors, including the density of the files being submitted, but currently it's "start here and tweak". Underutilized machines require more replenishment than heavily utilized machines.
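The "control replenishment from the density of the submitted files" idea could look something like this toy sketch. Everything here is invented for illustration: the base rate, the assumed density, and the linear scaling rule are not how any real Noritsu controller works, and a real program would also have to account for utilization, oxidation, and the other factors mentioned above.

```python
BASE_RATE_ML = 45.0         # hypothetical replenisher volume per print
ASSUMED_MEAN_DENSITY = 0.5  # the "expected average density" the base rate is tuned for

def mean_density(pixels):
    """Average density of an 8-bit grayscale image: 0.0 = white, 1.0 = black."""
    return sum(1.0 - p / 255.0 for p in pixels) / len(pixels)

def replenishment_ml(pixels, base=BASE_RATE_ML, assumed=ASSUMED_MEAN_DENSITY):
    """Scale the base rate by this print's actual vs. assumed density."""
    return base * (mean_density(pixels) / assumed)

high_key = [230] * 100  # mostly-white Christmas-card background
low_key  = [25] * 100   # a very dark image
print(round(replenishment_ml(high_key), 1))  # well under the base rate
print(round(replenishment_ml(low_key), 1))   # well over it
```

A run of high-key cards would replenish far less than the fixed rate assumes, which is exactly why a fixed rate leaves the last card in the batch slightly darker than the first.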
These variations are normally well within the tolerance of most of the lab's customers, but they can easily become issues if prints from different machines are compared, or reprinted at different times. The differences are usually not large, but they are substantially larger than with current inkjet processes. We sent files to both WHCC and mPix for test prints; both use identical equipment, but they produced different results. Each was internally consistent ... one was all slightly cool, the other slightly warm. I spent some time at the Millers/mPix main plant in Kansas. To solve this they always send an entire job to the same printer: if you order a 30x40 and 100 8x10's, they'll send it all to the 30" Durst, rather than the more efficient model of the big print to the Durst and the smaller ones to the 11" Noritsus. We actually installed a 24" Noritsu for much the same reason, to make sure our large portraits matched the smaller ones. The problem we have (I say "we" since I still consult with my previous company) is that there are 185 Noritsus spread around the country, and we can't really get a consistent match between all of them and the central plant. But it works well enough for our customer base (which is all end consumers, not photographers) nearly all of the time. I guess maybe we fit the description in your last sentence.
Nothing new here; this is the same issue that has been around with silver halide printing forever. It's actually far better than it was ... I remember the nightmares of a bulb blowing in a big Lucht Package printer that had several 600-negative rolls tested and ready to go ... what we have now is substantially better than that.