I've thought for a long time how nice it would be to have an option that based exposure on the length of time each photosite takes to fill to a set saturation point. For those occasions where shutter speeds really aren't critical this would be sweet.
Current shutters are still partially mechanical (on Canons, only the front "curtain" is electronic/simulated), so accuracy is limited.
I was thinking this myself, but then I considered the following.
If you've chosen 1/1000 for the time the shutter is open, what is the acceptable error in exposure length? 1%? 10%? I've got to believe that it is less than 1%. A 10% error margin would mean 1/60 was anywhere from 1/54 to 1/66 - unacceptable. At 1%, 1/100 runs from 1/99 to 1/101. So while I agree there is likely some error in the precision, it must also be very small, or else it would be a very big problem. Thus I put that out of my head as a concern.
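The tolerance arithmetic above can be checked directly. This is plain arithmetic, not any camera API; the function name is just for illustration:

```python
# A tolerance t on a nominal shutter time 1/s means the actual time
# falls between (1/s)*(1-t) and (1/s)*(1+t).
def shutter_bounds(denom, tolerance):
    """Effective denominators for a nominal 1/denom exposure with the given tolerance."""
    nominal = 1.0 / denom
    slow = nominal * (1 + tolerance)   # longest actual exposure
    fast = nominal * (1 - tolerance)   # shortest actual exposure
    return 1 / slow, 1 / fast          # expressed back as 1/x denominators

print(shutter_bounds(60, 0.10))   # roughly 1/54 to 1/66
print(shutter_bounds(100, 0.01))  # roughly 1/99 to 1/101
```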
Currently cameras offer shutter speeds in 1/3-stop increments, so there are two other choices between each of the speeds you mention, which seems to be enough granularity for about any shooting condition except high-contrast subjects, where an HDR mode based on this idea would really be sweet.
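For reference, the 1/3-stop spacing works out like this: each full stop halves the exposure time, so one third of a stop multiplies it by 2^(1/3) ≈ 1.26. A sketch with exact values; real cameras round these to the familiar marked speeds:

```python
# One stop = factor of 2 in exposure time, so a 1/3-stop step is 2**(1/3).
STEP = 2 ** (1 / 3)

def third_stop_series(start_denom, steps):
    """Exact 1/3-stop denominators starting from 1/start_denom, getting faster."""
    return [start_denom * STEP ** i for i in range(steps)]

for d in third_stop_series(40, 7):
    print(f"1/{d:.0f}")
# exact values land near 1/40, 1/50, 1/63, 1/80, 1/101, 1/127, 1/160;
# cameras mark these as 1/40, 1/50, 1/60, 1/80, 1/100, 1/125, 1/160
```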
To think about this differently: if 1/50 does not give me a histogram that is far enough to the right, I've got to allow in 25% more light and shoot at 1/40 (0.025 s instead of 0.020 s). 25% is relatively huge. What if 1/40 clips your red and blue channels but 1/50 is still not close to the maximum? What if the best exposure would be 1/48?
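Checking the size of that jump is one line of arithmetic:

```python
# Exposure times for the two adjacent 1/3-stop speeds.
t50, t40 = 1 / 50, 1 / 40
increase = (t40 - t50) / t50   # relative increase in light gathered
print(f"{increase:.0%}")       # 25%
```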
At the very least, every digital camera should allow both 1/3- and 1/2-stop selection as an available choice, so that I get 1/50, 1/45, and 1/40. This is currently not the case. But even when it is possible, there is a 10% drop (or 11% increase) between 1/50 and 1/45.
Or to think about it differently: the precision with which a camera restricted to 1/3 stops can act on its metering is really rather coarse - with a fixed aperture and 1/3 stops in use, the camera has an error margin of around 10%. Anything that is properly exposed between roughly 1/46 and 1/55 will be shot at 1/50. If the camera can use both 1/3- and 1/2-stop shutter speeds, the margin improves to about 5%. Is that good enough?
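That rounding error can be estimated with a small sketch (the speed grid and test values are illustrative; a real camera's metering logic may round differently):

```python
# 1/3-stop shutter times neighbouring 1/50, in seconds.
speeds = [1 / 40, 1 / 50, 1 / 60]

def rounding_error(ideal_denom):
    """Relative exposure error when an ideal 1/ideal_denom is rounded to the nearest speed."""
    ideal = 1 / ideal_denom
    chosen = min(speeds, key=lambda s: abs(s - ideal))
    return (chosen - ideal) / ideal

for d in (46, 48, 52, 54):
    print(f"ideal 1/{d}: error {rounding_error(d):+.0%}")
# errors run from about -8% to +8% for exposures that round to 1/50
```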
If your camera can provide you with a highly accurate 1/1000 of a second exposure with the shutter, why can't it provide you with an exposure of the scene with just as much accuracy?
Or to put this another way: a medium format digital back from Leaf or Phase One costs $40,000 and is still only able to meter a scene to within about 5% (assuming it can use both 1/3 and 1/2 conventional stops). Is that acceptable?
I don't think it's possible with current sensors, based on how they charge each pixel and then read the charges, but here's hoping some bright person somewhere is thinking outside the box and developing a sensor that might be able to do this.
The sensor collects charge, and there is circuitry somewhere that drains each pixel to read a value through an ADC. Somewhere a "timer" expires, triggering that to happen. That timer is just a byte or two stored in the camera's memory, loaded somewhere as a countdown waiting to expire. The only part of the circuit that is digital is the memory in which those numbers are stored. The capacitors used to hold charge are all analogue. It's not the sensor that is the problem but the circuitry around it.
Consider that the sensor can be read in 1/2000th of a second or over 2 seconds. The sensor is just a bucket. To think of it differently, imagine trying to fill a bucket with water falling over a waterfall. Is it the bucket itself that determines whether or not it can be filled, or is it the decision about how long to leave it under the water?
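The fill-to-saturation idea from the start of the thread can be sketched in the same bucket terms. All the numbers here are hypothetical, and real sensors expose every pixel for one fixed time rather than watching each bucket; this only shows the arithmetic of "stop when the fastest-filling bucket is nearly full":

```python
# Hypothetical sensor parameters - not from any real camera.
FULL_WELL = 50_000   # electrons a photosite can hold (assumed)
TARGET = 0.9         # end the exposure at 90% of saturation

def time_to_saturation(photon_rates):
    """Seconds until the fastest-filling photosite reaches the target level."""
    return TARGET * FULL_WELL / max(photon_rates)

# Brightest photosite collects 2,250,000 e-/s, so the exposure
# would end after 45,000 / 2,250,000 = 0.02 s (i.e. 1/50).
print(time_to_saturation([500_000, 1_200_000, 2_250_000]))  # 0.02
```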