sure, clipping is when your capture/display device does not display the full range of luminosity/color present in reality. but isn't that a uniquely human judgement? after all, how does the software know that the white of a cloud is clipped and it really wasn't that white? we know because the cloud in reality has many tonal variations and when we see a cloud that looks like a white paper cutout, we say the cloud had its whites clipped. but how does software know to put blinkies in the cloud? or for that matter, how does a camera know to do that in the lcd or its electronic viewfinder? if the camera can 'show' that it's not capturing certain tones, then how is it detecting those tones?
is the algorithm simply looking for consecutive pixels of the exact same tone and assigning the clipping indicator to them? (after all there is no homogeneity in the real world, right?) i guess i am asking: how does software define clipping?
Perhaps a better understanding can be gained by treating scene capture by the sensor and conversion to a color image as separate subjects. And the use of the word 'clipping' itself is questionable, IMHO.
The sensor is sometimes said to have a linear gain characteristic (curve) with respect to incident illuminance, but we all know that it does not - it has an 'S'-shaped curve with a fairly linear portion in the middle. For example, one of my cameras has a sensor well capacity of 77,000 electrons but is stated to have acceptable linearity only up to about 40,000 electrons. Thus there is a 'headroom' of some 37,000 electrons in which the gain (electrons/lux-sec) decreases as the sensor approaches saturation. For such a camera, any level between 40,000 and 77,000 electrons could be chosen to signal 'clipping', but clipping per se does not occur. Even 77,000 is only an average value for saturation, varying as it does according to the laws of probability and the tolerances of sensor manufacture.
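To make those numbers concrete, here is a toy sketch in Python. The well capacity (77,000 e-) and linear range (40,000 e-) are taken from the post above; the exponential shoulder is purely my own stand-in for the real roll-off, chosen only to show that incremental gain falls in the headroom region rather than cutting off abruptly:

```python
import math

# Toy model of sensor response: linear up to ~40,000 e-, then a soft
# shoulder rolling off toward average saturation at ~77,000 e-.
FULL_WELL = 77_000      # average saturation, electrons (from the post)
LINEAR_MAX = 40_000     # end of the acceptably linear region (from the post)

def electrons_captured(exposure, gain=1.0):
    """An ideal linear sensor would return gain * exposure; here the
    headroom is modeled (hypothetically) as an exponential approach
    to FULL_WELL, so incremental gain decreases but never clips hard."""
    linear = gain * exposure
    if linear <= LINEAR_MAX:
        return linear
    excess = linear - LINEAR_MAX
    headroom = FULL_WELL - LINEAR_MAX
    return LINEAR_MAX + headroom * (1 - math.exp(-excess / headroom))

# Any threshold between LINEAR_MAX and FULL_WELL could be declared
# the 'clipping' level - the choice is a convention, not a hard edge.
for exposure in (30_000, 40_000, 60_000, 200_000):
    e = electrons_captured(exposure)
    note = "(in headroom)" if e > LINEAR_MAX else ""
    print(f"exposure {exposure:>7}: {e:>8.0f} e-  {note}")
```

Note that no matter how large the exposure gets, the modeled response only approaches 77,000 e-, which mirrors the point that sensor 'saturation' is a soft statistical ceiling rather than a digital clip.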
Onwards to the conversion of the sensor signals to a color image. Take flower shots, with their highly saturated colors, often made in bright conditions: most sensors will successfully capture most of the reflected color gamut. However, many of the captured colors lie outside the gamut covered by the RGB or CMYK color spaces used for image output - monitors, printers, OLEDs, etc. Unfortunately for highly saturated images, the conversion uses either color compression (perceptual intent) or plain color clipping (colorimetric intent), and the latter is real clipping of a digital nature. So an on-board JPEG histogram, or a blinkie function, will show clipping even when the sensor signals themselves are not clipped. The camera does its internal conversion to JPEG (sRGB or aRGB) and shows when the conversion is clipping, not when the sensor is saturated. Indeed, this is the basis of 'highlight recovery'.
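This also answers the original question of how software 'defines' clipping: a blinkie function does not need to judge the scene at all, it only has to look at the output-referred 8-bit values after conversion. A minimal sketch in Python with NumPy (the thresholds of 250 and 5 are typical but arbitrary choices of mine, not any particular camera's):

```python
import numpy as np

def blinkies(jpeg_rgb, high=250, low=5):
    """Flag pixels where any channel has hit (or nearly hit) the top
    or bottom of the 8-bit output range after conversion to JPEG.
    jpeg_rgb: uint8 array of shape (H, W, 3). Returns a boolean mask."""
    clipped_high = (jpeg_rgb >= high).any(axis=-1)
    clipped_low = (jpeg_rgb <= low).any(axis=-1)
    return clipped_high | clipped_low

# A 'cloud' whose converted values pile up at 255 gets flagged, even
# though the raw sensor values behind it may not be saturated at all.
patch = np.array([[[255, 255, 255], [240, 238, 235]],
                  [[255, 254, 252], [180, 180, 180]]], dtype=np.uint8)
print(blinkies(patch))
# → [[ True False]
#    [ True False]]
```

So there is no search for "consecutive pixels of the exact same tone" - a simple per-channel threshold on the converted values is enough, which is why blinkies can fire on data the sensor captured perfectly well.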
Even if a RAW image is shot and the file converted into a wide-gamut working space such as Kodak RIMM/ROMM (ProPhoto), or good old Adobe "Melissa", the time comes when the image must be converted into a smaller color gamut, and that is often when clipping occurs. For yellow flowers, I find that blue is often reduced to zero and red increased to 255 when converting from either RAW or ProPhoto to sRGB; those values are truly clipped and not easily retrievable, if at all.
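The per-channel arithmetic of that last step can be shown with made-up numbers. The starting triple below is a hypothetical saturated yellow expressed in linear sRGB coordinates after conversion from a wide working space; values outside 0..1 mean the color is outside the sRGB gamut (the specific numbers are mine, chosen to land on the red-at-255, blue-at-0 pattern described above):

```python
import numpy as np

# Hypothetical saturated yellow in linear sRGB coordinates.
# R > 1 and B < 0 indicate the color lies outside the sRGB gamut.
linear = np.array([1.12, 0.96, -0.04])

# Colorimetric-style gamut clipping: clamp each channel independently.
clipped = np.clip(linear, 0.0, 1.0)

def encode_srgb(c):
    """Apply the standard sRGB transfer curve and quantize to 8 bits."""
    c = np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)
    return np.round(255 * c).astype(int)

print(encode_srgb(clipped))   # → [255 250   0]
```

Red is pinned at 255 and blue at 0, exactly the pattern seen with yellow flowers; once two channels are flattened like this, the original tonal variation in them is gone, which is why such clipping is not easily retrievable.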