And it gets trickier, since those tests are almost always done with a black-and-white chart. If you switch to color, the nasty Bayer filter rears its head. Red and blue on a typical sensor have much lower resolution than the green channel (the typical distribution is 25%/25%/50% R/B/G). As a result, it takes 7 or 8 of these sub-pixels to build a single full-color point in the image.
The camera then has to use its internal software to interpolate and anti-alias what it's seeing (or, if you shoot RAW, your conversion software has to do this). Point-and-shoot cameras, of course, tend to have very poor methods of doing this and often won't shoot RAW at all. True, you can see some fine detail around edges and such, but those areas tend to show off-color patches, moirés, halos, and whatnot, because while the resolution is technically there, it's only partial color resolution.
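To make the "interpolate" part concrete, here's a rough Python/NumPy sketch of the idea - just plain bilinear demosaicing of an RGGB mosaic, not what any particular camera's firmware actually does (real demosaicers are far more sophisticated):

import numpy as np
from scipy.signal import convolve2d

def bayer_mosaic(rgb):
    # Sample a full RGB image through an RGGB Bayer pattern: each
    # photosite keeps only one channel (25% R, 50% G, 25% B).
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G sites (the 50% share)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B sites
    return mosaic

def demosaic_bilinear(mosaic):
    # Rebuild full RGB by averaging each channel's nearest known samples -
    # the guessing step that produces off-color edges, halos, and moire.
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True
    masks[0::2, 1::2, 1] = True
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True
    kernel = np.array([[0.25, 0.5, 0.25],
                       [0.5,  1.0, 0.5 ],
                       [0.25, 0.5, 0.25]])
    out = np.zeros((h, w, 3))
    for c in range(3):
        known = np.where(masks[..., c], mosaic, 0.0)
        num = convolve2d(known, kernel, mode="same")
        den = convolve2d(masks[..., c].astype(float), kernel, mode="same")
        out[..., c] = num / np.maximum(den, 1e-9)
    return out

Run a sharp red-on-blue edge through that and you'll see exactly the fringing I'm talking about.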
So shooting color, you effectively get about 0.6-0.65x the listed resolution in actual full-color locations/true pixels in each dimension. And, yes, their marketing departments lie - huge surprise there... 12MP is actually closer to 4.3-5.0MP of physical full-color locations. Which is why the Sigma at 4.6MP looks about identical to a 10-12MP CMOS sensor. Yes, I know Sigma lists theirs as 14MP, but it's a true 4.6MP.
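Back-of-the-envelope version of that math (my 0.6-0.65 factor is per dimension, so the megapixel count scales by its square):

listed_mp = 12.0
for factor in (0.60, 0.65):
    print(f"{listed_mp:.0f}MP listed -> ~{listed_mp * factor**2:.1f}MP full color "
          f"at {factor:.2f} per dimension")
# 12MP listed -> ~4.3MP full color at 0.60 per dimension
# 12MP listed -> ~5.1MP full color at 0.65 per dimension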
Try shooting a resolution chart printed in red, blue, and green. I'm amazed more of the sites that review cameras don't do this.
http://www.ddisoftware.com/sd14-5d/ - a review that points this out. The problem isn't the lens or the camera; it's that we need a new generation of sensors that are free of these problems. I've nearly taken Foveon out of consideration, though, since they seem unable to get their act together and make a deal with Nikon or some other major player.
A true 8MP camera would be an ideal replacement for 35mm film. Too bad Foveon and Sigma seem to be stuck in first gear in getting it to market.
EDIT: Why I said 8MP:
A typical D-Lab scans 35mm film at a fixed 3000*2000, which is roughly 2200 DPI and works out to a 400DPI dye-sub print. A more reasonable limit would be 2400DPI, though, which nets a 3400*2265 scanned area. Almost all dedicated personal film scanners default to this resolution as well, or very close to it. That's roughly equal to about 420 lines per inch on dye-sub, which is where most experts agree you don't gain anything - at least for color. Most D-labs kludge it at 400 lines per inch and call it a day.
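If you want to check the 3400*2265 figure yourself, it's just the standard 36x24mm frame multiplied out at 2400DPI:

MM_PER_INCH = 25.4
frame_w_mm, frame_h_mm = 36.0, 24.0  # standard full-frame 35mm exposure
dpi = 2400

w = round(frame_w_mm / MM_PER_INCH * dpi)
h = round(frame_h_mm / MM_PER_INCH * dpi)
print(f"{dpi} DPI -> {w} x {h} ({w * h / 1e6:.1f}MP)")
# 2400 DPI -> 3402 x 2268 (7.7MP)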
So a 2400DPI scan is a good compromise, and it's what we should aim for before considering 35mm film "dead"/rendered moot.
Anyways, 3400*2265 is ~7.7MP of actual true resolution. That's 7.7MP with a Foveon-type sensor, or about 5200*3500 for a Bayer pattern (roughly 18MP at the 0.65-per-dimension ratio). It's not a huge difference in linear resolution - 5200 vs 3400 (roughly like 300DPI dye-sub vs 400DPI) - but of course the multiplier, once you add in both dimensions, makes it grow to silly numbers.
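And the Bayer-equivalent figure, scaling each dimension back up by the 0.65 factor:

true_w, true_h = 3400, 2265    # the 2400DPI scan target from above
factor = 0.65                  # full-color factor per dimension for Bayer

bayer_w, bayer_h = round(true_w / factor), round(true_h / factor)
print(f"True full-color target: {true_w * true_h / 1e6:.1f}MP")
print(f"Bayer equivalent: {bayer_w} x {bayer_h} "
      f"({bayer_w * bayer_h / 1e6:.1f}MP)")
# True full-color target: 7.7MP
# Bayer equivalent: 5231 x 3485 (18.2MP)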
That said, many DSLRs do give you 16+MP now and can give you film-like results once the files have been processed and tweaked. I said 8MP because it's not common to find sensors at exactly the 35mm frame's dimensions these days, so you usually end up rounding up to something like 3500*2400.
MF (medium format), btw, by the same standards would require 30MP of actual resolution, or closer to 60MP with a Bayer digital back, to replace 120 film. The sensors are larger and the software is generally better, which puts them closer to a 0.7 ratio instead of 0.65 or so. I'd rate consumer-level point-and-shoots at 0.6 or worse - which is why they look washed out and dull; there's just too little color data/saturation. They just came out with 40MP digital backs, so 60MP or so, and replacing 120 film entirely, isn't that far off.
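Same arithmetic for the medium-format numbers, using the more generous 0.7 factor for the bigger sensors (the ~30MP target itself is my rough figure and obviously depends on which 120 frame size you're scanning):

true_mp = 30.0   # rough full-color target for 120 film, per the estimate above
factor = 0.7     # per-dimension factor for larger sensors / better processing

print(f"~{true_mp / factor**2:.0f}MP Bayer-pattern digital back needed")
# ~61MP Bayer-pattern digital back needed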
*note* - these aren't subjective measurements. Most pros consider 120 film dead at 40MP, or close enough that they don't care. But from a technical standpoint, 60MP is about where you'd have to go to actually make it moot. I give the industry 2-3 years to get there.