digitalFAQ.com Forum


via Email or PM 04-15-2009 03:23 PM

Interlaced vs. non-interlaced artifacts?
 
I previewed the DVDs when I was done and was surprised by how many interlacing artifacts I was seeing on my LCD TV (once the picture got big enough). I guess the 24" monitor I'm using wasn't large enough for me to have paid any attention to them, because now when I look closely I can easily see them on the computer screen as well.
Do you always keep the final product the same as the source (interlaced material -> interlaced product)? Since CRTs are hard to find these days and progressive-scan TV sets are everywhere, perhaps one shouldn't deliver the final product interlaced any more? What's your view on this?

Note that the artifacts are only visible in scenes with fast movement. But since the material is mostly folk-dance shows with a lot of fast-moving people, it appears rather often. I would keep it interlaced, since the TV set should deinterlace on the fly, right? But...


This question was asked via email. Site Staff no longer answer tech questions via email, so that others may read and benefit from our expertise. Please continue the conversation here. Either login or join as a Free Member, and we can continue troubleshooting your video, photo or web related issue. Thanks for understanding our tech Q&A policies.


lordsmurf 04-16-2009 05:37 PM

Some of this depends on the quality of your LCD, and its age. Cheaper and older LCD units, especially ones with slow response times, had visual issues (and they're not necessarily related to interlacing, either). In some cases, a brand-new 24" computer LCD may look truer than a low-grade 40" LCD television.

Always keep your output the same as the source, in terms of interlacing (except in a few rare circumstances). The television hardware will always provide a better progressive image than anything you could do on your computer (again, except in a few rare circumstances).

"Rare circumstances" is when you're converting video to a streaming format for computer-only use. In those situations, you'd crop the image, resize it to a certain spec, properly deinterlace, etc.

"Properly deinterlace" is not using the tick-box that deinterlaces in software, but more advanced means (a discussion for another topic, if needed).

If you were to deinterlace in software, you usually would be unable to fix motion blur -- that's a byproduct of slow 24/25/30fps frame rates. The new HD standard still sucks, to many of us, because an increase in resolution isn't nearly as helpful as an increase in frame rate would be.

In many cases, deinterlacing in software trades one error for another. You may minimally improve the motion, but then you'll have a lot of linear errors to deal with, especially since this appears to be tape-originated footage (and not film-originated or telecined footage). In many cases, motion actually gets worse, because you'll end up removing full or partial frames of data from pans, so they become jerky instead of blurry -- or worse, both jerky AND blurry!

It's best to leave it as interlaced, when television is the destination for viewing.



Site design, images and content © 2002-2022 The Digital FAQ, www.digitalFAQ.com
Forum Software by vBulletin · Copyright © 2022 Jelsoft Enterprises Ltd.