I have drawn a few graphs and worked through some maths (see attachments) to check that I have a good grasp of these characteristic curves.
At the top of page 61, Hunt talks about how light from the screen which is reflected by the room back onto the screen (viewing flare) can alter the appearance of the image. In particular, a darker image will throw less light off the walls:
hence the viewing flare will decrease somewhat… and the effective gamma will rise at high densities.
Well, I don't always believe your man Hunt, and since he hadn't previously explained why the gamma rises at high densities when there is less viewing flare, I wanted to prove it myself (attachments). Leaving the maths to one side for the moment, my intuition is this: flare adds a roughly constant luminance to every part of the image, so it matters most in the darkest parts, washing out shadow detail and flattening the curve at high densities; with less flare, that detail comes back and the slope (the gamma) there rises. On the other hand, because the surround is now darker, that should mean a slight decrease in effective gamma (the darker the surround, the more gamma has to be increased to compensate). I assume the former effect overwhelms the latter.
In the attachments, I have drawn a graph similar to Fig 6.12, then worked out the effect of adding viewing flare of 5% and 2%. If my maths is correct, the effect is just as Hunt says it would be: in going from 5% to 2% (a reduction in viewing flare), the on-screen gamma at high densities increases.
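The same check can be sketched numerically. This is my own reconstruction, not Hunt's working: I assume flare simply adds a fixed fraction of peak-white luminance everywhere, so effective density is D' = -log10(10^(-D) + f), and "gamma" is the local slope dD'/dD. The function names `flare_density` and `local_gamma` are mine.

```python
import math

def flare_density(d_screen, flare):
    """Effective density when a fraction `flare` of peak-white
    luminance is added uniformly by viewing flare (my assumption)."""
    # density -> relative luminance, add the flare term, back to density
    return -math.log10(10.0 ** (-d_screen) + flare)

def local_gamma(d_screen, flare, eps=0.01):
    """Numerical slope d(effective density)/d(screen density),
    by central difference."""
    return (flare_density(d_screen + eps, flare)
            - flare_density(d_screen - eps, flare)) / (2.0 * eps)

# At a high on-screen density of 2.5, compare 5% flare with 2% flare.
g5 = local_gamma(2.5, 0.05)
g2 = local_gamma(2.5, 0.02)
print(g5, g2)  # the 2% figure is the larger of the two
```

Running it, the slope at density 2.5 is noticeably steeper with 2% flare than with 5%, which matches the graphs in the attachments: less flare, higher effective gamma in the shadows.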
I'll never doubt your man again.