WTF are you smoking? Let me break the process down into simple steps, using only small words, and try to cut through some of the mumbo jumbo and apparent confusion here.
24 frames per second doesn't work when it's actually displayed at 24Hz. The quality isn't acceptable.
If footage is recorded at 24 frames per second, it's never played back at a 24Hz refresh, as far as I can tell (correct me if I'm wrong). Each frame is shown at a minimum of 48 refreshes per second, and on the most modern LCD displays at 120 refreshes per second.
A similar situation applies to 30p, except that 30fps is at least to some degree better than 24p, and doubling that 30p frame rate to 60fps is smoother (or has less motion blur) than 48fps.
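To make the arithmetic behind those multiples concrete, here's a minimal sketch (a hypothetical helper, not something from this thread) that works out how many display refreshes each source frame is held for. It shows why 24fps on a 60Hz display produces an uneven 3:2 cadence, while 30fps on 60Hz and 24fps on 120Hz divide evenly:

```python
# Sketch: map a source frame rate onto a display refresh rate as a
# repeat cadence (how many refreshes each consecutive frame is held for).
from fractions import Fraction

def pulldown_cadence(source_fps: int, display_hz: int, cycle: int = 2):
    """Return the number of refreshes each of `cycle` consecutive
    source frames occupies on the display."""
    ratio = Fraction(display_hz, source_fps)  # refreshes per source frame
    cadence, shown = [], 0
    for i in range(1, cycle + 1):
        total = int(ratio * i)   # refreshes consumed after i frames
        cadence.append(total - shown)
        shown = total
    return cadence

print(pulldown_cadence(24, 60))   # [2, 3] -> uneven hold times (judder)
print(pulldown_cadence(30, 60))   # [2, 2] -> even cadence
print(pulldown_cadence(24, 120))  # [5, 5] -> even cadence at 120Hz
```

The uneven [2, 3] pattern is exactly why 24p needs either a 48Hz (or 120Hz) display, or a pulldown trick, while 30p drops straight onto a 60Hz system.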
My initial post in this thread was an attempt to explain to someone who was asking "Why, why, why 30p?" as though 24p was in some way inherently better.
I can find no reason why 24p should be inherently better than 30p, or in the final analysis even as good.
Also, I can find nothing in your posts on this topic that even alludes to any advantage of 24p.
I'll just add a further note of clarification, because we are not entirely at cross purposes, although it might seem that way. I understand your point that flicker is largely a property of CRT and plasma displays: whilst a refresh rate of 24fps on a CRT would be horrendous, on an LCD it would not necessarily be too bad, because the transition to a new frame every 24th of a second does not involve a momentary reversion to black.
The only point I have ever been trying to get across in this discussion, a point which seems to have escaped you, is that 24fps is not enough for totally smooth and realistic motion. 30fps may not be enough either, but it's better than 24fps, and it also maps more easily onto a 60Hz display.