Gamma exists partly due to the native response of CRT displays, and partly because it maps noise/distortion in a perceptually uniform manner.
There is absolutely no need for gamma per se
in digital images. The correct mapping of the RGB values so that they display correctly can be done in software, in the output device (monitor plus video card), or by both working together.
Gamma in digital images is only necessary to avoid posterization in the shadows when the values are encoded as integers (for instance 16-bit TIFF files and especially 8-bit JPEG files). An image editor working with floating-point numbers doesn't need gamma at all; images can stay linear, and deep shadows are represented as richly as the highlights. It's purely a matter of encoding efficiency.
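To make that concrete, here is a minimal sketch (plain Python, assuming a simple 2.2 power-law gamma and 8-bit quantization; the numbers are purely illustrative). It counts how many of the 256 integer codes land in the darkest two stops under a linear encoding versus a gamma encoding:

```python
# Hypothetical illustration: how many of 256 integer codes describe linear
# light below 0.25 (roughly the darkest two stops), with and without gamma.

def codes_below(threshold, decode, levels=256):
    """Count integer codes whose decoded linear value falls below `threshold`."""
    return sum(1 for code in range(levels) if decode(code / (levels - 1)) < threshold)

linear_decode = lambda x: x          # values stored as linear light
gamma_decode = lambda x: x ** 2.2    # values stored gamma-encoded (decode = x^2.2)

print(codes_below(0.25, linear_decode))  # 64 codes cover the darkest two stops
print(codes_below(0.25, gamma_decode))   # ~136 codes cover the same range
```

With floating-point storage there is no such shortage of codes in the shadows, so the gamma step buys nothing.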
These two images display correctly thanks to the software: Photoshop interprets the 2.2 gamma of the image on the left and the 1.0 gamma of the image on the right, and sends the output video device the appropriate values for correct rendering:
However, the stored RGB values are totally different; just look at the histograms.
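A rough sketch of what's going on (assuming idealized power-law gammas of 2.2 and 1.0, and ignoring the details of real colour management):

```python
# Illustrative only: the same scene luminance is stored with very different
# RGB values in a gamma-2.2 file and a gamma-1.0 (linear) file, yet the same
# value reaches the display once each file's gamma is honoured.

scene = 0.18                              # mid-grey, in linear light

stored_g22 = scene ** (1 / 2.2)           # ~0.46 in the gamma-2.2 file
stored_g10 = scene ** (1 / 1.0)           # 0.18 in the gamma-1.0 (linear) file

# The software decodes each file with its own gamma back to linear light...
decoded_g22 = stored_g22 ** 2.2
decoded_g10 = stored_g10 ** 1.0

# ...so both files end up sending the display the same thing.
print(round(decoded_g22, 3), round(decoded_g10, 3))   # 0.18 0.18
```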
Gamma exists because of the non-linear behaviour of the original output devices (basically CRTs), and has no relation to the way human vision works. To correctly mimic real life, any imaging system must be linear end to end. If gamma exists at some point, it is there to compensate for an inverse non-linearity somewhere else in the system, and this has no relation to the way we perceive light.
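A toy model of that "compensating non-linearity" idea (assuming an idealized CRT whose response is a pure 2.2 power law, with no flare or black-level terms):

```python
# Toy model: the encoding gamma is the inverse of the display's native
# non-linearity, so the chain as a whole reproduces scene light linearly.

def crt_response(signal):
    return signal ** 2.2               # the display's native non-linearity

def encode_for_crt(linear_light):
    return linear_light ** (1 / 2.2)   # the inverse, applied when encoding

for scene in (0.01, 0.18, 0.5, 1.0):
    reproduced = crt_response(encode_for_crt(scene))
    print(scene, round(reproduced, 4))  # reproduced == scene: linear end to end
```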
Today, in digital imaging, gamma is only needed because integer formats are still the most widely used.