The scope of the test was indeed very narrow: just to find out whether the theory of how a digital sensor works (basically as a linear photon counter) holds true when changes in camera exposure are compensated by software exposure. But the consequences are important, because they mean camera exposure can be completely dissociated from final image exposure without any issues or side effects in colour rendition, something that was impossible with film.
And the simplest and most precise point at which software should perform exposure correction to mimic camera exposure exactly is at the very beginning, when the RAW data are still linear and colour is nothing more than a RAW {RGB} ratio.
If a given camera exposure T=0.5 s produces these RAW numbers:
R=400
G=700
B=300
Doubling the exposure to T=1 s will produce (I'm ignoring the Canon black-point offset for simplicity):
R=800
G=1400
B=600
The potential captured colour is the same because the RAW {RGB} ratios are the same. Now if the user sets the RAW developer's exposure slider to -1.0 EV, the software just needs to divide all those RAW numbers by 2, yielding virtually the same RGB data as we would have obtained with T=0.5 s and the slider at 0.0 EV:
R=800/2=400
G=1400/2=700
B=600/2=300
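To make the arithmetic concrete, here is a minimal Python sketch of this kind of linear-domain exposure correction (the array values and the apply_exposure helper are just illustrations of the idea, not any particular RAW developer's code):

```python
import numpy as np

def apply_exposure(raw, ev):
    # In the linear RAW domain, exposure correction is a plain
    # multiplication by 2**ev (black level assumed already subtracted,
    # as in the example above).
    return raw * (2.0 ** ev)

raw_1s = np.array([800.0, 1400.0, 600.0])   # {RGB} from the T=1 s shot

corrected = apply_exposure(raw_1s, -1.0)    # slider set to -1.0 EV
print(corrected)                            # [400. 700. 300.]

# Colour is a RAW {RGB} ratio, so a scalar multiply leaves it untouched:
print(raw_1s / raw_1s.max())                # [0.5714... 1. 0.4285...]
print(corrected / corrected.max())          # identical ratios
```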
In the real world, other non-colour-related effects come into play: the T=1 s shot will have improved SNR over the T=0.5 s shot, and in exchange could clip up to one extra stop of highlight information. But regarding colour (the RAW {RGB} ratios in non-clipped areas), both RAW files are potentially clones as long as the exposure correction is carried out at the right place. Why commercial software usually doesn't do it exactly this way, allowing (slight) colour shifts when the exposure slider is touched, is beyond my understanding. This test proves that a genuine Exposure slider is possible.
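For completeness, the highlight-clipping caveat could be checked like this (again a sketch; the 16383 white level is just an assumed 14-bit saturation point and is camera-specific in reality):

```python
import numpy as np

WHITE_LEVEL = 16383  # assumed 14-bit saturation point, varies per camera

def clipped_channels(raw, white_level=WHITE_LEVEL):
    # Where any channel has hit saturation, the RAW {RGB} ratio
    # (and therefore the colour) is no longer trustworthy.
    return raw >= white_level

raw_1s = np.array([800.0, 1400.0, 600.0])
print(clipped_channels(raw_1s))   # [False False False]: ratios intact
```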
Regards