Pages: 1 2 [3] 4 5   Go Down

Author Topic: The terms "linearization" vs "calibration"  (Read 28949 times)

Mark D Segal

  • Contributor
  • Sr. Member
  • Posts: 12512
    • http://www.markdsegal.com
Re: The terms "linearization" vs "calibration"
« Reply #40 on: May 06, 2016, 09:18:51 am »

..............

As for spectrometers, they are usually adjustable (by updating either voltages or lookup tables in firmware), so that the output stays within a given tolerance/precision range, in units that can be converted to other units.

Additional issues that affect accuracy are electronic drift (long-term due to aging of electronic components, but also short-term due to temperature changes, which hopefully are compensated for) and changes in the light source. Multiple short spot measurements with a device that uses a halogen bulb as its light source will, for example, introduce drift due to tungsten deposits from the filament on the inside of the quartz bulb, because the bulb doesn't get hot enough for the halogen cycle to run. Prolonged measurements that heat the bulb sufficiently will then return the deposit to the filament, in a way mechanically resetting the bulb's influence to a known, more stable state.

It's an interesting topic.

Cheers,
Bart

Hi Bart, yes it is. I've been wondering why my old, but previously very reliable, X-Rite DTP-20 spectrophotometer suddenly stopped passing the calibration test. As X-Rite pulled all repair support for this device roughly a couple of years ago (only about nine years after its introduction), I did what the victims of X-Rite's forced-obsolescence strategies are expected to do and bought an i1Pro 2, now hoping this one lasts a good long time. I was using the DTP-20 periodically for making new profiles as the need arose, but also doing lots of spot measurements, so based on what you report here I'm wondering whether the cause of the DTP-20's demise could have been an imbalance between prolonged and instantaneous readings. Of course, when it died it was about ten years old, so aging could have been at play, but the usage over that lifespan was not industrial-style heavy - quite periodic in fact.
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #41 on: May 06, 2016, 10:10:07 am »

The DigitalDog thinks of calibration as applied to digital photography, where the output of the device being calibrated can be adjusted electronically. However, in the scientific community there are devices whose output cannot be adjusted in the calibration process, and calibration merely establishes the relationship between the measured values and the values as realised by standards, as Bart suggests. It's a reference table between input and output in specific units.
So you're stating: when a device cannot be calibrated, calibration is not possible. I'll add: when a device cannot be calibrated, it can be profiled (establishing the relationship between the measured values and the values as realised by a standards reference). If and when you have specific standards for the reference, by all means specify that.
Oh, and what about a device that can't be calibrated whose measured values do not produce the values realized by standards?
« Last Edit: May 06, 2016, 10:14:42 am by digitaldog »
http://www.digitaldog.net/
Author "Color Management for Photographers".

Bart_van_der_Wolf

  • Sr. Member
  • Posts: 8913
Re: The terms "linearization" vs "calibration"
« Reply #42 on: May 06, 2016, 11:00:22 am »

Hi Bart, yes it is. I've been wondering why my old, but previously very reliable, X-Rite DTP-20 spectrophotometer suddenly stopped passing the calibration test. As X-Rite pulled all repair support for this device roughly a couple of years ago (only about nine years after its introduction), I did what the victims of X-Rite's forced-obsolescence strategies are expected to do and bought an i1Pro 2, now hoping this one lasts a good long time.

Yes, built-in obsolescence sucks, big time. In your case, the intermittent use could hardly have worn out the electronic components by heating them significantly multiple times. There might be components that need disconnecting and reconnecting to clean the contacts. The only culprits I can think of are filters that could bleach over time, or dichroic materials that get dirty from the atmosphere. I do not think there are that many components that age/corrode besides circuit-board solder or poorly coated copper tracks, especially when not used.

Halogen bulb deposits might also play a role in pushing the response outside the expected calibration range, but that could be improved by letting the bulb burn continuously for a while. Maybe the bulb can be replaced (if your unit uses one) to see if that changes anything.

One can only hope they didn't include a timer that has to be reset during maintenance service while such service is still available. Chipped inkjet cartridges show that it would not be unthinkable, although I think it's unlikely, because it is a more recent practice for many devices. Traditional tungsten light bulbs did have that obsolescence built in (cheaper filaments, and the bulb blackened on the inside anyway), cars became less durable because they lasted too long, and a perfectly functional phone can be tossed away when the battery can't be replaced. Progress comes at a price (for the environment and our wallets).

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Mark D Segal

  • Contributor
  • Sr. Member
  • Posts: 12512
    • http://www.markdsegal.com
Re: The terms "linearization" vs "calibration"
« Reply #43 on: May 06, 2016, 11:05:11 am »

Indeed, and one can often legitimately ask how much of it is progress, because it isn't always self-evident, apart from the sales figures of the relevant corporations.
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

bjanes

  • Sr. Member
  • Posts: 3387
Re: The terms "linearization" vs "calibration"
« Reply #44 on: May 06, 2016, 08:48:24 pm »

So you're stating: when a device cannot be calibrated, calibration is not possible. I'll add: when a device cannot be calibrated, it can be profiled (establishing the relationship between the measured values and the values as realised by a standards reference). If and when you have specific standards for the reference, by all means specify that.
Oh, and what about a device that can't be calibrated whose measured values do not produce the values realized by standards?

No, that is not what I said, but that's how you interpreted it. You distinguish between calibration and profiling, but that is not how the standards organizations define calibration. The example of the thermometer is considered calibration by NIST, since the process establishes a relationship between the temperature readings and the standard, and this relationship is expressed in a lookup table which is read out manually.

When I calibrate my NEC monitor with SpectraView, the results are recorded as a profile. The luminance is likely actually adjusted to the specified cd/m^2, but I think the tone curve and color values are encoded in a lookup table which is loaded into the electronics of the monitor at startup. This is analogous to the case of the thermometer, except that the lookup is done by the computer rather than manually.

Regards,

Bill
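[Editor's sketch] The lookup-table notion in Bill's post can be illustrated in a few lines of Python. The reading/reference pairs below are hypothetical, purely to show how a non-adjustable instrument's calibration table maps raw readings onto the values realised by the standard:

```python
# Hypothetical calibration table for an instrument that cannot be
# adjusted: each entry pairs a raw reading with the value realised
# by the reference standard (e.g. a reference-grade thermometer).
CAL_TABLE = [(0.0, 0.4), (25.0, 25.2), (50.0, 49.7), (100.0, 99.1)]

def corrected(raw):
    """Map a raw reading to the standard by linear interpolation."""
    pts = sorted(CAL_TABLE)
    if raw <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if raw <= x1:
            t = (raw - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    return pts[-1][1]

print(corrected(37.5))  # about 37.45
```

Whether the table is read out by hand (the thermometer) or loaded into the monitor's electronics at startup (SpectraView), the principle is the same.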

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #45 on: May 06, 2016, 08:56:52 pm »

No, that is not what I said, but that's how you interpreted it.
OK, sorry....
Quote
You distinguish between calibration and profiling, but that is not how the standards organizations define calibration.
OK fine, so considering this is the color management forum on a photo site, can you define exactly what standards you're referring to? I have asked more than once  ;D .
Quote
The example of the thermometer is considered calibration by NIST, since the process establishes a relationship between the temperature readings and the standard, and this relationship is expressed in a lookup table which is read out manually.
Understood. So if the thermometer doesn't follow its reference, what temp really exists, now what?
Quote
When I calibrate my NEC monitor with SpectraView, the results are recorded as a profile.
That much I understand, but the part you're not specifying is the part about calibration! Do you set the target for the calibration by flipping a coin, using the default, following what someone recommends online, OR to produce a desired behavior from the device, as I suggested earlier? I would hope and suspect the latter.
Quote
This is analogous to case of the thermometer, except that the lookup is done by the computer rather than manually.
I don't understand your analogy, sorry. The thermometer is supposed to report some metric based on facts (this water is 112 degrees when it really is 112 degrees, or thereabouts). The display you calibrate is or isn't aimed at some desired behavior? Like matching a print, for example.
« Last Edit: May 06, 2016, 08:59:57 pm by digitaldog »
http://www.digitaldog.net/
Author "Color Management for Photographers".

Bart_van_der_Wolf

  • Sr. Member
  • Posts: 8913
Re: The terms "linearization" vs "calibration"
« Reply #46 on: May 07, 2016, 09:21:18 am »

OK fine, so considering this is the color management forum on a photo site, can you define exactly what standards you're referring to?

Take your pick... , mostly from the first 8 references.

This might also come in handy as a basis for discussion, to differentiate between calibration and profiling.

And this.

Cheers,
Bart
« Last Edit: May 07, 2016, 09:28:31 am by BartvanderWolf »
== If you do what you did, you'll get what you got. ==

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #47 on: May 07, 2016, 10:32:14 am »

Take your pick... , mostly from the first 8 references.
OK, which one requires a user to calibrate towards a goal, other than specifying a reference?


Let's take ISO 3664:2009, which specifies a standard for display luminance from 75 cd/m2 to 100 cd/m2 with a monitor CCT of 6500K. If I use that for calibration because, as you suggest, it's a standard, the result is a mismatch between my display and a print viewed next to the display. That serves no benefit as a target for calibration.


Calibration to CCT 5150 at 150 cd/m2, using a specific piece of software (SpectraView), a specific colorimeter (i1Display) next to a specific illuminant (GTI SOFV-1e) with a specific paper (Epson Luster), produces a match because that's the calibration that produces a match for me. Again: calibration in this context, in this forum, for this audience is simple: putting a device into a desired condition to achieve a desired result! The standard calibration aim point of ISO 3664:2009 doesn't do that, not by a long shot. It's useless in that respect. It might be useful as a calibration aim point for other users so that all their individual displays match one another (or, more than likely, mismatch their goal). That's not a useful 'standard' for calibration in that context!
http://www.digitaldog.net/
Author "Color Management for Photographers".

Doug Gray

  • Sr. Member
  • Posts: 2197
Re: The terms "linearization" vs "calibration"
« Reply #48 on: May 07, 2016, 11:50:09 am »

Let's take ISO 3664:2009, which specifies a standard for display luminance from 75 cd/m2 to 100 cd/m2 with a monitor CCT of 6500K. If I use that for calibration because, as you suggest, it's a standard, the result is a mismatch between my display and a print viewed next to the display. That serves no benefit as a target for calibration.

To be fair, ISO 3664:2009 specifically disclaims use of this standard for comparing images displayed on monitors to hard copy light booths. I presume you are aware of this. Others may not be.

The techniques of matching monitor displays to hard copy are beyond the scope of ISO 3664:2009.

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #49 on: May 07, 2016, 12:02:25 pm »

My point continues to be that standards absolutely do not match my display to my hard copy prints in my light booth. YMMV, and that's WHY products aimed at the calibration of displays offer many such options. The more, the better (as an example, NEC SpectraView). One size doesn't fit all when it comes to display calibration. Or are people here suggesting that the process in which we select targets shouldn't be called calibration?
Quote
The techniques of matching monitor displays to hard copy are beyond the scope of ISO 3664:2009.
But it's a 'standard' the standards proponents will suggest.  :P
« Last Edit: May 07, 2016, 12:12:26 pm by digitaldog »
http://www.digitaldog.net/
Author "Color Management for Photographers".

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #50 on: May 07, 2016, 12:09:54 pm »


http://www.gtilite.com/gti-faqs/
ISO 3664 specifies that in the imaging industry – graphic arts, photography, and graphic design – 5000K lighting, a.k.a. D5000 or D50 as it is commonly referred to, is the best light source to use. It has equal amounts of Red, Green, and Blue light energy, so it will not accentuate or subdue any colors. In imaging applications, many colors are viewed at the same time (e.g. a photograph), so all colors need to be represented evenly. This is why an equal energy light source is so important to this application.

D5000 should be the primary source, but a secondary source can also be used. In packaging applications, using the source used in the store would be useful. This is typically a high efficiency fluorescent lamp.

Old color evaluation booths also offered 7500K lighting. This light source was only used to help the pressman see the yellow ink on the press sheet. This was dropped from the standard (ISO 3664) over ten years ago. D50 is the only ISO standardized source now used in the imaging industry.

http://info.gtilite.com/what-is-iso-3664/#.Vy4St2P7BT4:


A summary of the graphic arts and photography standards.

ISO 3664 is the international color viewing standard for the graphic technology and photography industries. In order to improve the quality and consistency of light sources and color viewing, ISO 3664 outlines the minimum criteria necessary for all color viewing systems to meet. This set of specifications enables lighting engineers and manufacturers to design, test, and certify their color viewing systems to the industry standards, and challenges them to enhance the performance of their products. ISO 3664:2009 is the newest version of the international color viewing standard, revised to reflect tighter quality control guidelines in order to reduce miscommunications and other errors in color reproduction.

Color temperature – Per the ISO 3664:2009 standard, the color temperature of a light source should be relative to a phase of natural daylight with a correlated color temperature of about 5000K.


And a CCT value of about 5000K doesn't produce a match for me.
http://www.digitaldog.net/
Author "Color Management for Photographers".

Doug Gray

  • Sr. Member
  • Posts: 2197
Re: The terms "linearization" vs "calibration"
« Reply #51 on: May 07, 2016, 12:15:25 pm »

My point continues to be that standards absolutely do not match my display to my hard copy prints in my light booth. YMMV, and that's WHY products aimed at the calibration of displays offer many such options. The more, the better (as an example, NEC SpectraView). One size doesn't fit all when it comes to display calibration.

Of course. I was just pointing out that people can't expect a standard to be useful outside of its specific domain, and that the ISO standard specifically points this out in regard to display matching of prints in light booths.

ISO:
Quote
However, it is important to note that adherence to these specifications does not ensure that the monitor will match the hardcopy without provision of a defined colour transformation to the displayed image or use of proper colour management. This aspect of matching is outside the scope of this International Standard.
« Last Edit: May 07, 2016, 12:18:51 pm by Doug Gray »

Mark D Segal

  • Contributor
  • Sr. Member
  • Posts: 12512
    • http://www.markdsegal.com
Re: The terms "linearization" vs "calibration"
« Reply #52 on: May 07, 2016, 12:20:31 pm »

I think what all this boils down to is that one should "calibrate" in context, to achieve specific objectives. If your objective is to replicate the ISO standard, for whatever reason, you do that. If your objective is to equilibrate what comes out of your printer with what you see on your display, you may need to change the calibration parameters that the monitor and printer software provide for doing this. By calibration we mean that we set those several parameters to fixed values that are known to serve the purpose at hand; whatever other colour management we need, we do through profiling, which is predicated in part on the calibration parameters. As for linearization of printers, I remain of the view that Harald Johnson's explanation of it is the most straightforward and plausible, while contrast curves should be handled through image editing and profiling.
Mark D Segal (formerly MarkDS)
Author: "Scanning Workflows with SilverFast 8....."

Bart_van_der_Wolf

  • Sr. Member
  • Posts: 8913
Re: The terms "linearization" vs "calibration"
« Reply #53 on: May 07, 2016, 12:21:37 pm »

It is my impression that the non-standard use of the word 'Standard' (a methodology with recommendations for statistically sound procedures) causes some confusion. A 'Standard' is not a reference value.

In a similar vein, Calibration is not the same as Profiling. And Linearization is not the same as Calibration (although it may be part of a Calibration process to improve interpolation accuracy).

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

digitaldog

  • Sr. Member
  • Posts: 20630
  • Andrew Rodney
    • http://www.digitaldog.net/
Re: The terms "linearization" vs "calibration"
« Reply #54 on: May 07, 2016, 12:25:11 pm »

It is my impression that the non-standard use of the word 'Standard' (a methodology with recommendations for statistically sound procedures) causes some confusion. A 'Standard' is not a reference value.
In a similar vein, Calibration is not the same as Profiling. And Linearization is not the same as Calibration (although it may be part of a Calibration process to improve interpolation accuracy).
A standard never defines a reference, a set of values to target? And who here said linearization is the same as calibration (because it's different)?
http://www.digitaldog.net/
Author "Color Management for Photographers".

Bart_van_der_Wolf

  • Sr. Member
  • Posts: 8913
Re: The terms "linearization" vs "calibration"
« Reply #55 on: May 07, 2016, 07:43:40 pm »

A standard never defines a reference, a set of values to target?

Hi Andrew,

I wouldn't know about 'never', but the common pattern of a Standard is a description of procedures to follow, and metrics to be calculated, in order to allow independently conducted tests to produce the same results (within a margin); IOW, to produce repeatable and consistent observations. I do recommend getting hold of some of them, despite the cost, because they usually (after a long process of becoming an actual Standard) make a lot of sense and often explain the considerations that went into them before reaching general consensus.

There may be some recommended settings for the tests to be conducted, but they are usually explained, e.g. to produce statistically more robust, or perceptually more relevant, results.

Quote
And who here said linearization is the same as calibration (because it's different)?

Just stipulating the difference, given the OP's question, trying to get back on topic instead of hiding behind smoke screens.

Cheers,
Bart
== If you do what you did, you'll get what you got. ==

Doug Gray

  • Sr. Member
  • Posts: 2197
Re: The terms "linearization" vs "calibration"
« Reply #56 on: May 07, 2016, 09:30:36 pm »

Getting back to the question of calibration v linearization. My personal use of the terms is thus:

Calibration: Things that change inking. In particular, setting the highest density points of CMYK.
Linearization: Establishing the density of each ink at various points, which are then entered into the RIP.

These are both done for a specific media.

After doing these steps, profiling is done and should benefit from fairly smooth transitions over the 3D LUT space, yielding high-quality profiles.

Many others appear to include linearization as a part of calibration. I won't argue with that. It's the process that matters.
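[Editor's sketch] Doug's linearization step can be sketched in Python. The step-wedge percentages and densities below are made up for illustration; a real RIP measures a printed wedge per channel and per media, and the ink-limiting (calibration) step would cap the top of the measured curve first:

```python
# Hypothetical measured densities for one ink channel at sampled
# input percentages (a real RIP measures a printed step wedge).
steps    = [0, 25, 50, 75, 100]            # requested ink %
measured = [0.05, 0.55, 0.95, 1.25, 1.45]  # measured density (illustrative)

def linearize(target_pct):
    """Input % that yields a density proportional to target_pct,
    found by inverting the measured response curve."""
    d_lo, d_hi = measured[0], measured[-1]
    target_d = d_lo + (target_pct / 100.0) * (d_hi - d_lo)
    pts = list(zip(steps, measured))
    for (x0, d0), (x1, d1) in zip(pts, pts[1:]):
        if d0 <= target_d <= d1:
            return x0 + (target_d - d0) / (d1 - d0) * (x1 - x0)
    return float(steps[-1])

print(linearize(50))  # about 37.5: mid density needs well under 50% ink here
```

The corrected per-channel curve produced this way is what gets entered into the RIP before profiling.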

Stephen Ray

  • Full Member
  • Posts: 217
Re: The terms "linearization" vs "calibration"
« Reply #57 on: May 07, 2016, 10:18:17 pm »

Getting back to the question of calibration v linearization. My personal use of the terms is thus:

Calibration: Things that change inking. In particular, setting the highest density points of CMYK.
Linearization: Establishing the density of each ink at various points, which are then entered into the RIP.


Doug,

Regarding Linearization and "establishing the density of each ink," are you determining the ink values to create a smooth gradation of the said ink channels or are you determining the values to arrive at a gray balance, thus ultimately creating a "linear curve" which would be a contradiction in terms? (I've found "linear curve" mentioned in reference material, that's why I even bring it up.)

I'm in the process of compiling my research and will share my findings ASAP.

Doug Gray

  • Sr. Member
  • Posts: 2197
Re: The terms "linearization" vs "calibration"
« Reply #58 on: May 07, 2016, 11:06:40 pm »

Doug,

Regarding Linearization and "establishing the density of each ink," are you determining the ink values to create a smooth gradation of the said ink channels or are you determining the values to arrive at a gray balance, thus ultimately creating a "linear curve" which would be a contradiction in terms? (I've found "linear curve" mentioned in reference material, that's why I even bring it up.)

I'm in the process of compiling my research and will share my findings ASAP.

IMO "linearization" is an inaccurate term of art. Spectral reflectance at a specific wavelength, or CIE XYZ values, are linear; but most uses of the term come after a mapping that corresponds more closely to perception. Profiles use ICCLAB, which is neither linear nor logarithmic (as in density). In fact, even taking the simple case of neutrals only (K, LK, LLK), the last thing one would want to do with linearization of printers is to create an actual "linear" mapping of the profile outputs. More important is that whatever process is used prior to profiling creates a smooth response to the L*a*b* inputs. Lumpiness is hard, or impossible, to fix in profiling, as the profile size expands rapidly the more you segment it to accommodate gradient changes.

No idea what a "linear curve" even means, but most likely the author meant some sort of smoothness, or limitations on gradient change rates (caps on the second derivatives).

Also, there are sometimes different goals at work: ink use reduction and large gamut, amongst others.
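[Editor's sketch] Doug's point that "linear" is ambiguous can be made concrete with the CIE L* formula: L* is linear in neither luminance nor density, but is roughly uniform perceptually. A minimal Python rendering of the standard formula:

```python
def Y_to_Lstar(Y):
    """CIE 1976 L* from relative luminance Y (0..1): neither linear
    nor logarithmic in Y, but roughly perceptually uniform."""
    return 116 * Y ** (1 / 3) - 16 if Y > (6 / 29) ** 3 else (29 / 3) ** 3 * Y

# Doubling the (physically linear) luminance does not double lightness:
print(Y_to_Lstar(0.09))  # about 36
print(Y_to_Lstar(0.18))  # about 49.5 (mid gray)
print(Y_to_Lstar(1.00))  # 100
```

So a curve that is "linear" in XYZ terms is strongly nonlinear in the L*a*b* space that profiles actually work in, which is why smoothness, not literal linearity, is the useful target.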

Peter_DL

  • Sr. Member
  • Posts: 544
Re: The terms "linearization" vs "calibration"
« Reply #59 on: May 08, 2016, 06:57:16 am »

IMO "linearization" is an inaccurate term of art.
...
No idea what a "linear curve" even means ...

Doug,

With a monitor's output in cd/m2, Linearization is achieved not by Calibration to whatever gamma, but by the application of exactly the inverse function in color management. The calibrated gamma is counter-balanced by the 1/gamma encoding upon conversion to the monitor profile.

The net result is a linear relationship (linear curve) between x: the RGB numbers of a grayscale in a linear gamma space, and y: the output luminance in cd/m2.

To a first order, the calibrated gamma, whether 1.8, 2.2 or e.g. the L* TRC, is simply irrelevant. Only to a second order can there be "bit precision effects", or let's call it "smoothness". For example, a regular 2.2 gamma, with its steep take-off in the deep shadows, is not a good idea once 8 bits come into play.
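[Editor's sketch] Peter's two claims, that the encoding and the calibrated display response cancel to a net linear relationship, and that the cancellation only breaks down at the bit-precision level, can be illustrated like this (values purely illustrative):

```python
GAMMA = 2.2

def encode(v):            # 1/gamma encoding applied in color management
    return v ** (1 / GAMMA)

def display(v):           # response of a display calibrated to that gamma
    return v ** GAMMA

# First order: encode and display cancel, leaving a linear relationship.
y = display(encode(0.3))  # 0.3 again, within floating-point error

# Second order ("bit precision"): quantize the encoded signal to 8 bits
# and look at the first luminance steps out of black.
step1 = display(1 / 255)  # roughly 5e-6
step2 = display(2 / 255)  # a jump of more than 4x in the deep shadows
```

The net curve stays linear regardless of the gamma chosen, but the size of the quantized steps in the shadows depends on it, which is the "smoothness" argument against a plain 2.2 gamma at 8 bits.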


Now let’s think about printer again.
IMHO, the above described "net linearity" should finally be valid as well, now with y = the Reflectance along the printed grayscale.

Again there can be second order effects which may make it desirable to calibrate the printed grayscale not only right to this numerical linearity, but to a brighter state with a more perceptual distribution of tones, however, it is finally captured in the profile and therefore should get counter-balanced and eliminated in the course of color management – in a way that the net linearity is met again, and that there is no net addition of brightness.


Peter
--