I asked similar questions on photo.net recently. I am not a digicam owner but this issue strikes me as important and under-discussed. I may just have looked in the wrong place.
I have occasionally read about dead pixels on digicam sensors in the context of how to fix them via computer image editing. But generally speaking there isn't much written about the topic, nor have I seen it come up much in discussions of digital workflow.
I wondered about a few things. Some people express the opinion that having a few dead pixels is normal and acceptable, and that fixing them in an image editor is part of the game. Is this true? I'm asking in the engineering sense, i.e. is there no way to produce dead-pixel-free sensors at a cost that keeps digicams (relatively) affordable? If I buy a digicam with a dead pixel, is it reasonable to ask for an exchange, or am I chasing the holy grail?
If it is a commonplace phenomenon, then I wondered why it's not mentioned much in discussions of workflow. Do the algorithms in the sensors' firmware mask the dead pixels with interpolated data, so that the user mostly never knows about them? At what point in the processing does this happen: A/D conversion on chip, conversion of RAW data, etc.?
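To make the question concrete, here's my naive guess at what such firmware masking might do: keep a map of known bad pixels and fill each one in with the average of its live neighbours. This is a minimal sketch in Python, and purely my assumption about the technique — the bad-pixel map and the simple 4-neighbour averaging are my inventions, not anything I know a real camera does:

```python
def mask_dead_pixels(image, bad_pixels):
    """Hypothetical dead-pixel masking by neighbour averaging.

    image: list of rows of integer intensity values
    bad_pixels: set of (row, col) positions known to be dead
    """
    fixed = [row[:] for row in image]  # work on a copy
    rows, cols = len(image), len(image[0])
    for (r, c) in bad_pixels:
        neighbours = []
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            # use only in-bounds neighbours that are not themselves dead
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in bad_pixels:
                neighbours.append(image[nr][nc])
        if neighbours:
            fixed[r][c] = sum(neighbours) // len(neighbours)
    return fixed

# e.g. a 3x3 patch with one dead pixel in the middle gets filled
# in from its four neighbours:
patch = [[10, 10, 10],
         [10,  0, 10],
         [10, 10, 10]]
repaired = mask_dead_pixels(patch, {(1, 1)})
```

If the camera does something even remotely like this, it would explain why owners rarely notice the problem — which is really my question about where in the pipeline it happens.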
I find it hard to believe that people think it's normal to spend time fixing a known set of bad pixels on EVERY frame. Seems like time badly spent.
Does the problem get worse over time? This may become important as more and more people buy second-hand digicams, as I am apt to do.
How many dead pixels are deemed acceptable: 5, 25, 1,000, 10,000? It would upset me if I paid for a 4-megapixel sensor but later found out that only 3.5 megapixels of it worked and the rest consisted of interpolated data.