Quote:
Originally Posted by mtetreault
Thank you for your prompt answer. You're right, I indeed meant the Diamond Multimedia VC500. I must admit that I don't understand why people prefer the VC500 over the Insanity Pro 4K. I did a quick capture with both cards using a Hi8 camera and the results seem to be almost identical. Note that I am not that experienced, so my eyes might be missing a few things here and there.
|
Devices like the VC500 are optimized for analog sources. Products like the Insanity Pro 4K are designed for pristine digital sources and are unlikely to handle the wild voltage swings found in analog signals very well, which results in signal level problems (crushed blacks and blown-out highlights are typical of digital-optimized devices and can't be corrected after capture).
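If it helps to see why those clipped levels are unrecoverable, here's a minimal Python sketch. The sample values are made up for illustration, not measurements from either card:
Code:
import numpy as np

# Hypothetical 8-bit luma samples from an analog source (illustrative only):
# tape playback can swing below and above the nominal 16-235 range.
source_luma = np.array([5, 12, 16, 120, 235, 245, 250])

# A capture path with no headroom hard-clips to the range it expects.
captured = np.clip(source_luma, 16, 235)

print(captured)                      # [ 16  16  16 120 235 235 235]
print(np.unique(captured).size, "distinct values left of", source_luma.size)

# A later levels filter can only stretch what survived; the samples crushed
# to 16 or blown out to 235 are now identical, so that detail is gone.
Once the values collapse together there is nothing left for a filter to separate, which is why levels have to be handled correctly at capture time.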
Quote:
Originally Posted by mtetreault
You are also right, my VHS is not a JVC HR-S912U but a JVC HR-S5912U.
I looked at the spec sheet and it doesn't seem to have a built-in TBC. To be honest, I somewhat thought the line TBC could be done on the software side with a VirtualDub filter. I'll consider buying a new VHS deck.
|
There is no such thing as a line TBC software filter. Programmers have tried designing them, but none have succeeded. Scanline errors are a runtime phenomenon and can be fixed only during capture. Many people use a legacy DVD recorder, such as a Panasonic DMR-ES10 or ES15, as a line TBC pass-through device.
Below is an image from a video made without a line TBC. The scanline errors display as warped and notched verticals that change and "wiggle" during playback. Notice the distorted gray pipe near the left edge, and the warped window frame at the right:
A copy of that video was encoded with the frame rate reduced to 8fps to show how the scanline errors "wiggle" on verticals and in the house siding panels during play:
http://www.digitalfaq.com/forum/atta...x_crop_8fpsmp4.
Quote:
Originally Posted by mtetreault
I don't understand how upscaling is bad. I don't mean to argue, I'm simply trying to understand. If I capture the VHS stream as it is without upscaling, roughly I'll have a video with a 333×480 pixel resolution. Now, when I play back the video on my 4K TV, I assume the TV will play the video in full screen and not in a small window. So I assume the TV will upscale the video source anyway? So either way, it'll get upscaled?
|
Upscaling isn't bad in itself; it depends on how it's done. Your graphics card, media players, and TV can do a better job of upscaling than you can with software. Besides, if you upscale your videos to 1080p they'll be upscaled again on a 4K display, and if you upscale to 4K yourself, a 1080p display will end up downscaling; all that re-scaling takes a toll on quality. On top of that, if you intend to post-process and clean up analog captures at HD frame sizes, your working files will be huge, for no particular advantage.
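To put rough numbers on how huge those working files get, here's a back-of-the-envelope Python sketch. The frame sizes and the 2 bytes per pixel for YUY2 are standard; the uncompressed-per-hour comparison is just illustrative, since HuffYUV or Lagarith will shrink each figure somewhat:
Code:
# YUY2 (packed 4:2:2) stores 2 bytes per pixel; NTSC runs at ~29.97 fps.
BYTES_PER_PIXEL = 2
FPS = 30000 / 1001
SECONDS_PER_HOUR = 3600

for label, w, h in [("SD capture", 720, 480),
                    ("upscaled 1080p", 1920, 1080),
                    ("upscaled 4K", 3840, 2160)]:
    gb_per_hour = w * h * BYTES_PER_PIXEL * FPS * SECONDS_PER_HOUR / 1e9
    print(f"{label:>14}: ~{gb_per_hour:5.0f} GB per hour, uncompressed")

# Lossless compression reduces each of these, but the ratio between the
# SD original and the upscaled versions stays roughly the same.
The upscaled 4K working file is about 24 times the size of the SD capture, with no extra picture information in it.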
A 4K display will not play 4:3 video full screen unless you direct your TV to distort it; otherwise it will play at the correct aspect ratio with black side pillars, unless the original source was 16:9 to begin with. So far, you've mentioned some ways of distorting and degrading your videos. There are reasons why standards and proven methods still exist, even in professional tape restoration labs. Capturing analog tape to lossless media using lossless compressors such as HuffYUV or Lagarith, with YUY2 color at 720x480 for NTSC or 720x576 for PAL, is still the way it's done these days, even by pro labs. The only way around that is to spend well into seven figures on the same industrial gear, and the years of pro training, that Hollywood uses. Yet I've seen plenty of evidence where their results don't look as good as many projects posted by advanced hobbyists in tech forums! I own some retail DVDs "mastered" from old tapes by "pro" labs that look pretty awful. Makes you wonder why they're paid for such low-level work.
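For what it's worth, here is a quick hypothetical geometry check in Python showing what "correct aspect ratio with black side pillars" works out to on a UHD panel:
Code:
# UHD "4K" panel and the display aspect ratio of 4:3 SD material.
screen_w, screen_h = 3840, 2160
src_aspect = 4 / 3

displayed_h = screen_h                         # picture scaled to full height
displayed_w = round(displayed_h * src_aspect)  # width at the correct aspect
pillar_each_side = (screen_w - displayed_w) // 2

print(f"picture area : {displayed_w} x {displayed_h}")      # 2880 x 2160
print(f"black pillars: {pillar_each_side} px on each side")  # 480 px
The picture occupies 2880x2160 with a 480-pixel black pillar on each side, which is how a correctly flagged 4:3 capture should display on that screen.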