Some of this depends on the quality of your LCD, and its age. Cheaper and older LCD units, especially ones with slow response times, had visual issues (and those aren't necessarily related to interlacing, either). In some cases, a brand new 24" computer LCD may look truer than a low-grade 40" television LCD.
Always keep your output the same as the source in terms of interlacing (except in a few rare circumstances). The television's hardware deinterlacer will give you a better progressive image than anything you could do on your computer (again, with a few rare exceptions).
"Rare circumstances" is when you're converting video to a streaming format for computer-only use. In those situations, you'd crop the image, resize it to a certain spec, properly deinterlace, etc.
"Properly deinterlace" is not using the tick-box that deinterlaces in software, but more advanced means (a discussion for another topic, if needed).
If you were to deinterlace in software, you usually wouldn't be able to fix motion blur -- that's a byproduct of slow 24/25/30fps frame rates. The new HD standard still sucks to many of us, because an increase in resolution isn't nearly as helpful as an increase in frame rate would be.
In many cases, deinterlacing in software trades one error for another. You may minimally improve the motion, but then you'll have a lot of line-based artifacts to deal with, especially since this appears to be tape-originated footage (and not film-originated or telecined footage). In many cases, motion actually gets worse, because you'll end up removing full or partial frame data from pans, so they become jerky instead of blurred -- or worse, both jerky AND blurry!
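To make that trade-off concrete, here's a minimal sketch (the function name, array shapes, and toy "clip" are all invented for illustration) of the crudest tick-box style approach: keep one field per frame and line-double it. Half of the temporal samples are simply thrown away, which is exactly why pans that used to blur start to stutter instead.

```python
import numpy as np

def discard_field_deinterlace(frame: np.ndarray, keep_top: bool = True) -> np.ndarray:
    """Crude 'tick-box' deinterlace: throw away one field and
    line-double the other. frame is (height, width) grayscale for
    simplicity; real video would be (height, width, 3) per frame."""
    field = frame[0::2] if keep_top else frame[1::2]   # keep every other scan line
    return np.repeat(field, 2, axis=0)                 # line-double back to full height

# Toy interlaced "clip": 4 frames of 480x720 noise standing in for video.
clip = [np.random.randint(0, 256, (480, 720), dtype=np.uint8) for _ in range(4)]

progressive = [discard_field_deinterlace(f) for f in clip]

# Each interlaced frame carried two fields, i.e. two moments in time.
# After discarding one field per frame, half the motion samples are gone:
print(f"temporal samples before: {2 * len(clip)}, after: {len(progressive)}")
```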
It's best to leave it interlaced when a television is the destination for viewing.