I'm capturing VHS with a VC500 USB capture device and an AG1980, using AmaRecTV with the GV lossless codec, as interlaced NTSC. When I play the capture back in VLC with deinterlacing on, it looks fine; with deinterlacing off, it of course looks obviously interlaced. I then drop it into DVD Architect (part of Vegas Pro) and burn a 480i DVD, 9.8 Mbps bitrate, interlaced. So from capture to burn there is no deinterlacing and no lossy step until the final encode. I thought that meant I would get a better picture because of minimal rendering, but when I play that DVD back on my Xbox, the image looks like this:
[Attachment: 1214181438_HDR.jpg]
Very bad interlacing artifacts. The photo was just taken with my phone off a 1080p flat screen, so it's not the best reproduction. The TV reports the incoming signal as 480i, which I thought was the DVD standard, and I thought DVD players handled the deinterlacing for progressive displays. If I play the disc back on my ES15 (connected to my computer), it looks fine. Is this a function of the Xbox or of the TV? I haven't tried the ES15 plugged into the TV yet; it's going to take a lot of rearranging to make that happen, but I suppose that's the next step, unless there's something obvious I'm missing. How do you verify that your DVDs will play back well on many different TVs?
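One thing I can at least check from the computer side is whether the video is actually flagged as interlaced, with a consistent field order, all the way through the chain. Below is a minimal sketch, assuming ffprobe/ffmpeg are installed and on the PATH; the file paths are just placeholders for the capture AVI and a VOB from the burned disc's VIDEO_TS folder.

Code:
import json
import subprocess

def field_order(path):
    """Ask ffprobe for the stream's field_order flag
    ('tt' = top field first, 'bb' = bottom field first, 'progressive')."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=field_order",
         "-of", "json", path],
        capture_output=True, text=True, check=True).stdout
    return json.loads(out)["streams"][0].get("field_order", "unknown")

def idet_summary(path, frames=500):
    """Run ffmpeg's idet filter over the first few hundred frames;
    it prints counts of frames detected as TFF / BFF / progressive
    on stderr, independent of how the stream is flagged."""
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-i", path,
         "-vf", "idet", "-frames:v", str(frames),
         "-an", "-f", "null", "-"],
        capture_output=True, text=True)
    return [ln for ln in result.stderr.splitlines() if "idet" in ln]

# Placeholder paths: the lossless capture and a VOB off the authored disc.
for f in ("capture.avi", "VIDEO_TS/VTS_01_1.VOB"):
    print(f, "->", field_order(f))
    for line in idet_summary(f):
        print("   ", line)

If the capture reports top field first and the VOB reports the same, the interlace flags survived authoring, and the combing would point more toward the Xbox or TV's deinterlacing than toward the disc itself.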