Been thinking about this one for a while, and given current technology I think it's doable. The result would be a reduction in blown highlights, in-camera HDR, and it opens up the possibility of an interesting photo technique.
The problem with digital sensors is their low exposure latitude. In a contrasty situation you can choose to expose for the highlights, mids or shadows in a single exposure, or you can take multiple exposures and combine them, and there are many programs that will do this for you.
It occurred to me, though, that a digital photo is nothing but a bunch of numbers representing various things about the image. If you could add one more number per pixel, you could eliminate problems with blown highlights and effectively attain in-camera HDR. That number represents the time taken for a pixel to reach 100% saturation. In theory you could massively overexpose an image (e.g. full sunlight, f1.8, 30 second exposure) and still be able to recover detail.
Obviously it wouldn't be as simple as all that: there would need to be formulas that account for the way the sensor responds to light, so that the true brightness of each clipped pixel can be recovered.
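To make the idea concrete, here's a minimal sketch of the recovery step. It assumes a perfectly linear sensor (real sensors would need the calibration formulas mentioned above), 12-bit raw values, and a hypothetical `t_sat` array holding each pixel's time-to-saturation (infinity if the pixel never clipped). The key point is that a clipped pixel that filled up in half the exposure must be receiving light at twice the clipping rate:

```python
import numpy as np

def recover_radiance(raw, t_sat, exposure_s, white_level=4095):
    """Estimate per-pixel scene brightness from one overexposed frame.

    raw        : recorded pixel values, clipped at white_level
    t_sat      : seconds each pixel took to saturate (np.inf if it never did)
    exposure_s : total exposure time in seconds

    Assumes a linear sensor response -- a real implementation would
    apply a calibrated response curve instead.
    """
    raw = raw.astype(np.float64)
    # Unsaturated pixels: brightness is just counts accumulated over the
    # whole exposure.
    radiance = raw / exposure_s
    # Saturated pixels: they gathered white_level counts in only t_sat
    # seconds, so their true rate is white_level / t_sat -- this is the
    # detail beyond the clip point.
    clipped = np.isfinite(t_sat)
    radiance[clipped] = white_level / t_sat[clipped]
    return radiance
```

So a pixel that clips at 4095 after half a one-second exposure comes back as a brightness of 8190, twice the clip level, instead of being lost.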
Obviously this number needs to be stored somewhere, somehow. A 32-bit number can count single nanoseconds for a little over 4 seconds; a 16-bit number can store one tick every 15,200 or so nanoseconds up to 1 second. The former would triple the amount of information to be stored and the latter would double it (after the camera upscales the 12-bit CCD data to 16 bits). How fine the granularity needs to be depends on how much detail, on average, is lost to overexposure.
Given that storage is cheap these days, what's another 10 MB of data to store?
Thoughts?...been done before?...or am I just talking out my rear end..
Cheers,
Bart
edit: didn't realise this was my first ever post (thought I'd posted here before)...hmm...will have to introduce myself somewhere.