Can someone explain the original reason for interlacing? My guess is that historically, the choice (in the USA) was between 60 fields per second interlaced and 30 frames per second progressive within the same bandwidth, and the slower 30fps option had worse visual problems (flicker) than the faster but interlaced one.
But since 50fps (as with PAL and its 50 fields per second interlaced) seems fast enough, it would seem that once "p50" is feasible, interlacing should be abandoned. So unless the experts correct me, the path forward for HD broadcasting seems to be 720p50 [PAL regions] and 720p60 [NTSC regions], and then on to 1080p50 and so on.
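To make the bandwidth argument concrete, here's a rough back-of-the-envelope sketch of my own (luma samples only, ignoring blanking intervals and compression; the format list is just illustrative) comparing raw pixel rates:

```python
# Rough raw-pixel-rate comparison to illustrate the trade-off behind interlacing.
# Counts luma samples only; ignores blanking intervals and compression.

formats = {
    # name: (width, height, rate, interlaced?)
    # rate is fields/s if interlaced, frames/s if progressive
    "480i60 (NTSC-like)": (720, 480, 60, True),
    "480p30":             (720, 480, 30, False),
    "720p50":             (1280, 720, 50, False),
    "720p60":             (1280, 720, 60, False),
    "1080i50":            (1920, 1080, 50, True),
    "1080i60":            (1920, 1080, 60, True),
    "1080p50":            (1920, 1080, 50, False),
}

for name, (w, h, rate, interlaced) in formats.items():
    # An interlaced field carries only every other line, so halve the line count.
    lines_per_picture = h // 2 if interlaced else h
    pixel_rate = w * lines_per_picture * rate
    print(f"{name:>20}: {pixel_rate / 1e6:6.1f} Mpixel/s")
```

By this crude measure, 480i60 and 480p30 cost the same, which is the historical trade-off, and 1080i60 lands in the same ballpark as 720p60, which presumably is part of why broadcasters split between those two.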
Aren't most US HD broadcasters using 720p rather than 1080i?