I have a cheap USB 2.0 Sabrent Video Capture Device lying around and wondered if I could use it with my JVC M-DH40000U to capture some old '80s home movies recorded on a high-end JVC camcorder with the over-the-shoulder recording unit.
Better at what, exactly? Is the actual video quality improved with the ATI 600?
I did notice the case is almost identical to the easycrap lol
I don't mind the price (the ATI seems to be going for around $25 on eBay), but after this week job stuff will leave me little time to work on transferring, so I'd like to get started soon.
This always comes up, and I've researched it thoroughly. I tend to view things from a technical standpoint, and from that angle I've seen differences in capture window, decombing, reaction to non-standard signals, even chroma placement. The default color settings vary, and the hardware filtering varies too. But basically, comparing similar settings, you'd have to look hard at test patterns to see the difference, with only the capture window being obvious.
People will argue about other aspects like being able to capture uncompressed, compatibility with virtualdub, the software that came with it, and I'm not sure what else. From the user point of view, it's mostly the default hardware settings that can be bad. If you look in the registry you can usually optimize a few things.
I really have nothing against cheap electronics; I think they get a bad name. A device is cheap because labour costs are low in the country it's exported from, but most of the quality lives in the design and the chips themselves. I looked at a reference design just last night, and it's dead simple. There are two resistors at the input. The rest is about giving the chip clean power and keeping the digital signals from interfering with the analog side. It seems unlikely to screw that up, though I'm not very experienced at electronics. If someone can argue why anything cheap has noticeably worse quality, at calibrated settings, let me know.
On the contrary, why did ATI get such a good reputation? It mostly seems to revolve around its 3D comb filter, which was one of the best. I've never had combing issues from standard VHS on any of my 4 (old) capture devices. Combing is usually a problem with sharp sources, from TV to laserdisc, some S-VHS and DVD (both through a composite connector).
All the modern chips I browsed in an electronics catalogue had features like a 10-bit ADC and 4x oversampling, which are important to quality. Close to perfection, really.
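For anyone curious why those two specs matter, here's a back-of-envelope sketch using the standard ideal-ADC formulas (nothing specific to any chip in that catalogue):

```python
import math

# Ideal quantization SNR for an N-bit ADC driven by a full-scale sine:
# SNR = 6.02*N + 1.76 dB (standard textbook approximation).
def quant_snr_db(bits):
    return 6.02 * bits + 1.76

# Oversampling by a factor M spreads the quantization noise over a wider
# band; after filtering/decimation the in-band SNR improves by ~10*log10(M).
def oversampling_gain_db(factor):
    return 10 * math.log10(factor)

snr_8bit = quant_snr_db(8)         # ~49.9 dB
snr_10bit = quant_snr_db(10)       # ~62.0 dB
gain_4x = oversampling_gain_db(4)  # ~6.0 dB, worth roughly one extra bit
```

So a 10-bit ADC with 4x oversampling buys you very roughly 18 dB over a plain 8-bit converter, which is why even cheap modern chips comfortably outresolve a VHS source.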
Thanks jmac, that makes a lot of sense. When I'm not using any special built-in features, included software tools, or hardware conversion, the end result of an uncompressed AVI should (in theory) be identical.
Yep, and by all means get a second opinion. Another practical answer is that quality is in the eye of the beholder: if another product looks better to you, go for it. But I think the processing done afterwards makes the most visible difference, along with capturing properly; that means including the blacks and whites equally (by adjusting brightness and contrast), using a TBC if possible, and avoiding over-sharpening.
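If you want a quick sanity check that your brightness/contrast settings aren't throwing away blacks or whites, something like this works on any captured frame. This is just a sketch assuming an 8-bit Rec.601-style capture where legal black sits at code 16 and white at 235; the function names are mine, not from any capture software:

```python
import numpy as np

def clipping_report(luma, lo=16, hi=235):
    """Return the fraction of pixels crushed at/below black and
    clipped at/above white. Big piles at either extreme suggest the
    brightness/contrast should be adjusted before capturing."""
    luma = np.asarray(luma)
    crushed = float(np.mean(luma <= lo))
    clipped = float(np.mean(luma >= hi))
    return crushed, clipped

# Synthetic example: a mid-grey frame with the top 10% blown to full white.
frame = np.full((480, 640), 128, dtype=np.uint8)
frame[:48, :] = 255
crushed, clipped = clipping_report(frame)  # 0% crushed, 10% clipped
```

Capture a few seconds of your brightest and darkest scenes, run the check, and nudge the proc-amp settings until neither number is piling up.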
The video compression used by included software makes a difference to people's impression too.
the end result of an uncompressed AVI should (in theory) be identical.
While true, theory and practice rarely coincide in the realm of video. The hardware in use is a major determining factor on quality. For example, it's easy to cook the colors, cause audio sync issues, or give video a "digitized" look that can be avoided. Software is an issue, too, but not as much as the hardware.
The Sabrent is pretty much guaranteed to be a rebadge of the cheap/crappy EzCap (or knockoff EasyCap) device.
We wasted money in either 2010 or 2011 on these, for testing. The results were pitiful, assuming the device would even install. You can use it and get results, yes. But you can do better. The ATI 600 is undeniably better. Even some other various no-name (and obscure) cheap hardware is better.
When it comes to video hardware, it's still true that you get what you pay for.
Probably. But it's hidden in a monster pile of unsorted research that's being sorted this year. I remember the white level, black level (IRE), and overall contrast/brightness values were off. The video was cooked, and lost a lot of shadow and highlight detail, instead replaced by muddy goo or paleness. I don't think it tinted the video -- that's usually an issue found in DVD recorders, and not capture cards.
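To see why bad IRE handling loses shadow detail, here's a rough sketch of the standard mapping between NTSC IRE levels and 8-bit Rec.601 luma codes (black at 0 IRE maps to code 16, white at 100 IRE to code 235). The function is illustrative, not from any capture driver:

```python
# Map an NTSC IRE level (0 = black, 100 = white) onto the 8-bit
# Rec.601 luma range, which spans 219 codes from 16 to 235.
def ire_to_luma(ire):
    return round(16 + 219 * ire / 100)

assert ire_to_luma(0) == 16     # black
assert ire_to_luma(100) == 235  # white

# NTSC setup sits at 7.5 IRE; if the capture chain doesn't account for
# it, black lands around code 32 and the shadows look lifted/pale.
# Conversely, a chain that subtracts setup twice crushes everything
# below ~7.5 IRE into code 16: that's the "muddy goo" effect.
lifted_black = ire_to_luma(7.5)  # ~32
```

A device whose white and black points are off by even a handful of IRE either clips detail into the extremes or wastes part of the code range, and no amount of post-processing gets that detail back.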
Another common issue with some video capture devices (like the TV Wonder 650 and 700 USB) is aggressive automatic gain control that can't be bypassed. Driver issues certainly top the list, though. Video capture drivers (and the APIs they use) have ALWAYS been buggy on Windows. How many times has this happened while in VirtualDub's capture module?
Avery Lee (author of VirtualDub) has plenty of video capture driver rants in his blog as proof.