* Btw I don't get how you say the frame rate is doubled by the de-interlacing method you mentioned. Shouldn't it be halved relative to the signal's field rate, since you have to wait for the next alternate field? E.g. 1080i at 60 fields per second can be de-interlaced to form 1080p at 30fps.
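A minimal sketch of why both answers can be right, depending on the method. The function names here are my own illustration, not from the article: "weave" style de-interlacing pairs up alternate fields (halving the rate), while "bob" style interpolates each field into a full frame (keeping the field rate as the frame rate):

```python
# Hypothetical sketch: how the field rate maps to output frame rate
# for two common de-interlacing approaches (names are assumptions).

def weave_output_fps(field_rate):
    # Weave combines each pair of alternate fields (odd + even lines)
    # into one full frame, so the frame rate is half the field rate.
    return field_rate / 2

def bob_output_fps(field_rate):
    # Bob interpolates the missing lines of every single field,
    # producing one full frame per field, so the frame rate equals
    # the incoming field rate.
    return field_rate

# 1080i at 60 fields per second:
print(weave_output_fps(60))  # 30.0 -> 1080p/30 (the "halved" reading)
print(bob_output_fps(60))    # 60   -> 1080p/60 (the "doubled" reading)
```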
Btw, here is a somewhat technical article that explains various algorithms used in upscaling/downscaling 1080p and 1080i signals on a 720p display: Rescaling 1080i to 720p
So some information can be lost when scaling 1080i to 720p, which means a 1080i signal may well look better on a 1080p display. But again, it depends on the input signal's frame rate as well as the de-interlacing algorithm the TV uses.
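Rough arithmetic, as an illustration of where the loss can come from (the two "paths" are my own framing, under the assumption the TV either de-interlaces per-field or weaves full frames first):

```python
# Hypothetical sketch of why 1080i -> 720p can lose vertical detail.
FIELD_LINES = 1080 // 2   # each interlaced field carries only 540 lines
TARGET_LINES = 720

# Path A (per-field de-interlace, then scale): 540 lines must be
# upscaled to 720, so the scaler invents lines that were never captured.
invented_lines = TARGET_LINES - FIELD_LINES
print(invented_lines)             # 180 lines per frame are interpolated

# Path B (weave full 1080-line frames first, then downscale):
# all 1080 captured lines are available before the resize to 720.
downscale_ratio = 1080 / TARGET_LINES
print(downscale_ratio)            # 1.5 : 1 reduction, no invented detail
```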
Here is another nice article: High Definition Blog