Why choose vintage ATI card over USB dongles?
I see here that the old ATI All-in-Wonder cards are really popular. I'm still reading a lot of posts, but I'm wondering why so many people are choosing legacy hardware over some of the more common USB adapters. I realize there are a lot of crappy Chinese ones, but I'd imagine there are plenty of higher-quality ones too. Or even some of the newer PCIe cards instead of the old AGP. Is it just preference, or do the older ones actually produce better results?
|
Quality.
On a scale of good, better, best, the AIW is best. None of the USB options are best, just good and better. And, as mentioned, worse. But AGP/PCI also had worse; nobody talks about those anymore, mostly the BT-based cards. The PCIe AIW was crippled compared to the AGP/PCI versions. Drivers removed abilities: the PCIe-era ATI MMC 9.1.x dropped important features (example: the dropped-frames counter). |
Quote:
Additionally, I recently got rid of a bunch of hardware. I still may have an old motherboard with AGP, but has anyone tried using one of the AGP cards with an AGP to PCIe adapter? |
It's not just about compression. It's about correct video values: IRE, color, luma, etc.
No, there are no modified ATI drivers for video. For graphics, yes, but those have nothing to do with the Theatre chip video. The PCIe cards had problems due to bad PCIe drivers. For MPEG, ATI MMC is required, and that's one reason for the AIW: it does 15/20 Mbps max, which can be archival, sub-broadcast, BD spec. The MPEG is hybrid software/hardware encoding.

The other reason is correct values. Rarely is an AIW out of spec; the others require you to fiddle with proc amp controls internally, per system. Calibration is an art, and it also requires a calibrated monitor. As an example, the VC500 tends to run dark, and the ATI 600 clones tend to run a bit bright. You can guess at the values; both are good cards. But "best" is the AIW, which is rarely incorrect.

OBS is streaming software, not capture. Premiere is an NLE, not capture. Both have many issues, specifically in regards to dropped frames. Because, again, they are not capture software.

There's nothing "old" or "vintage" about ATI. It's legacy hardware, still viable for modern use. Yes, software that comes with cards is usually junk; ATI MMC was an exception to the rule.

There are no AGP>PCIe adapters. The tech is not compatible whatsoever. |
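To put those MPEG bitrates in perspective, here is a quick back-of-the-envelope sketch (the 15 and 20 Mbps figures are from the post above; the conversion itself is just arithmetic, not anything from the thread):

```python
# Rough storage math for MPEG-2 capture at the AIW's upper bitrates.
# 15 and 20 Mbps come from the post above; everything else is arithmetic.

def gb_per_hour(mbps: float) -> float:
    """Convert a video bitrate in megabits/second to gigabytes/hour."""
    bits_per_hour = mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> GB

for rate in (15, 20):
    print(f"{rate} Mbps is about {gb_per_hour(rate):.2f} GB per hour")
```

So an hour of capture at the AIW's maximum lands in the 7-9 GB range, which is why those rates were considered archival-grade for SD.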
I started with ATI USB 600, then switched to PCIe AIW (x600 pro to be exact).
The biggest difference I see involves adjusting your video levels during capture to avoid clipping shadows or highlights. Much has been said in this forum before. Basically, the VirtualDub preview-mode histogram is used to ensure your video is not below 16 or above 235, to avoid crushing shadows or highlights. See the VirtualDub Capture Guide thread in this forum for more on that.

The difference is that the USB has no proc amp controls (brightness, contrast, sharpness, color). So if the output of your video into the USB capture device is too high or low, you have to get a proc amp to insert in the video path. Otherwise, you are going to capture things with a lot of blown-out shadows or highlights. It is quite likely, in my experience, that you will need some kind of proc amp control somewhere in the video chain. It is much cheaper to have an AIW capture card (AGP or PCIe), which DOES have internal proc amp controls to allow adjustment of brightness and contrast, so you don't clip.

This is more subjective, but I also personally thought the quality of the captures was better with the PCIe x600 I got, compared to the ATI USB 600. I think there was less noise, and therefore a bit more efficient capture bitrate (using HuffYUV). Just my personal experience, anyway.

As has also been noted on this forum, if you get a recommended PCIe AIW, you have to get two additional cables to connect to the card. Non-PCIe just has the usual purple or domino breakout box/cable. Those are cheap and easy to find, in my experience. For PCIe, you also need an additional connector that plugs in between the card's input connector and the aforementioned purple or domino breakout box/cable. To complicate things even more with PCIe, it depends which model you get, because there are two variants of this cable. There are gory details about this in this thread, with pictures of what they look like. http://www.digitalfaq.com/forum/vide...creen-aiw.html |
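The 16-235 check described above is normally done live in VirtualDub's histogram, but the same idea can be sketched offline. This is a minimal illustration, not anyone's actual tool: the frame here is synthetic, and in practice you would pull the Y (luma) plane from a decoded capture file.

```python
# A rough sketch of the 16-235 luma check done on a decoded frame instead
# of in VirtualDub's live histogram. The frame data below is synthetic.
import numpy as np

def clipping_report(luma: np.ndarray, lo: int = 16, hi: int = 235) -> dict:
    """Count pixels outside the broadcast-safe 16-235 luma range."""
    total = luma.size
    crushed = int(np.count_nonzero(luma < lo))  # shadows below legal black
    blown = int(np.count_nonzero(luma > hi))    # highlights above legal white
    return {
        "crushed_pct": 100.0 * crushed / total,
        "blown_pct": 100.0 * blown / total,
    }

# Synthetic example: a frame captured with the proc amp set too bright.
frame = np.clip(np.random.default_rng(0).normal(180, 40, (480, 720)), 0, 255)
print(clipping_report(frame.astype(np.uint8)))
```

If either percentage is more than a sliver, that is the point where you reach for the proc amp controls (internal on the AIW, external hardware otherwise) before capturing, since clipped detail cannot be recovered afterward.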
Quote:
While I've never used one, there are in fact AGP to PCIe adapters. Or at least there were, once upon a time. I'm highly dubious of their effectiveness though, which is why I ask. Assuming no barriers to entry, which of the AIW cards would you recommend? Is this the list I should be looking at?: http://www.digitalfaq.com/forum/vide...html#post13441 |
NLE capturing working well is arguable. It drops frames more often than not. And it doesn't preserve the captured file without further re-encode processing. NLEs don't have "stream copy"/lossless-type export features. Any NLE capture is double-encoded on output.
I don't understand the love of OBS. It's a fad as far as I'm concerned, in terms of analog video capture. It has a much larger resource overhead than VirtualDub, and it shows; it causes problems. Yes, not much different from NLE capturing, in resource terms. OBS is a streaming tool; use it as designed. Premiere/NLEs are for editing; use them as designed. Problems happen with video when afterthought features are used, such as SD capturing with BM cards.

An AGP to PCIe adapter would have to be some sort of scammy Chinese device. There may be some weird situation where a PCIe card was actually designed as AGP, adapted to PCIe, and could be adapted back. I remember stuff like that going back to the early 90s, weird backwards-compatible (backwards, period!) type hardware. But those are oddballs, not general-use adapters. It's like Firewire>USB, which doesn't exist as such, but a few unique devices can adapt specific data transfers by connecting to both.

The AGP 7500, 8500, 9000, 9200 are generally easiest to install and acquire. No special adapters, no unique quirks. I have the 9600 cards: special adapter, quirks. Also the PCI 7500: rare, quirks, but I use it with far more recent hardware than AGP, and it still supports XP. Also the AIW USB. XP is required for AIW. |
Quote:
I looked at the AGP PCIe adapter, and I think there was only one company that made it and, as expected, it had its limitations. I only remembered that one actually existed, so I wasn't sure. Plus I doubt there are any still around, even if it did work.

Looks like I'm going to have to hit the electronics recycler, because only a few months ago I cleaned out a bunch of my old computer parts I was certain I'd never need, including anything with AGP. I could have sworn I had an AIW, but if I did, I got rid of it then too. Looking at the images of cards, none of them look familiar, so maybe I didn't. I know I had a Hauppauge at one point, though. I think I traded that a while back.

One last question. Do these cards usually connect to equipment via coax? I imagined they would be S-video, but it looks like they are all coax. Pictures on eBay suck, so maybe there are S-video ports I'm not seeing. Thanks all for the advice. |
Once upon a time, NTSC AIW had analog coax for analog cable/air. Worthless now.
PAL had whatever that aerial connection is. AIW are all s-video and composite via the purple/domino break-out box (aka dongle). Only use s-video. |
Pretty much, it's the hugely popular 9600 model that requires the Weison custom breakout cable.
The earlier 9000, 9200 and later 9800 didn't have that requirement.. but there is a reason the 9600 still sells even without the breakout cable. |
Any quirks with the 9800? I can probably get my hands on one of those.
|
There are a few adapters up on ebay, a bit hard to find though.
Then there are also the ATI Radeon VIVO cards (I think they're basically like a tunerless AIW); the 9200 and 9600 (which I have) use a 10-pin DIN plug breakout cable, different from the purple dongle. The 9000 VIVO has S-Video and composite ports on the card itself. |
The cards are still valuable to people who have the "missing cable" or have cards that have developed problems over the years due to capacitors going bad.
Just like motherboards, they have big electrolytic caps that eventually age out and die.. usually in their death throes they start causing interference. A few "professional" users will "re-cap" the AIW video card.. but until recently they were plentiful enough that you could just find another going for a low price on the auction sites.

What cost the most money was getting your "first" breakout cable connector. Then hanging on to it. As you noticed, the cards can be had for a lot less money than the cable connector. The cable connector can be bid up pretty quickly when properly marketed online.. think 10x the cost of a card.

There are also a couple of different "versions" of the 9600, the Euro version and the North America version. You can tell by the RF connectors: an F-style or the smooth push-on coax jack. These have different tuners and different RF demodulators, so their potential interference patterns (if any) can be different.

The 9000 through the 9800 don't have a lot of downsides; they vary mostly on the GPU side.. which doesn't matter for the video capture. But the larger the number, usually the more power-hungry the GPU just sitting idle, the louder the fan, and the larger the power supply. By the time the X series came along (think the 10,000 series; like OS X, it was just marketing), the power requirements for the PCIe cards got so big you had to attach a secondary power cable direct to the power supply, and they all had huge hair-dryer, jet-engine-sounding fans on them. They threw out a lot of heat.

The X series (PCIe) cards run hot and require a well-ventilated PC case.. they were built to be gamer cards first, video capture cards second.. so you have a lot of extra baggage to service just to do video capture. Motherboards and XP also inherited PCIe as an "add-on" bus, so it was still experimental and buggy.. bus mastering conflicted.. etc. That's generally a lot of reasons why it's best to stay in the 9000 series.
The 9800 is not "bad" by any stretch of the imagination, and a lot easier cable-wise.. it probably marked the peak of ATI's design experience.. but it was already diverging from video capture "first" and becoming more a gamer's card than a video capture card. The 9000 is also not bad, but it was early days; it still supported DirectX7 and DirectX8, even though it was designed with DirectX9 in mind.

It wasn't until the 9600 that everything came together all at once: DirectX9 was generally available, the card was optimized for video capture, and only the perceived need for [dual video ports] for dual monitors caused them to use that special cable. In spite of that funky breakout cable.. people still sought it out, and still do. Buying a 9600 at a cheap price and hanging on to it until you find a cable.. or thinking down the line you may sell it to a collector or professional for more than you paid for it.. (might) be a good strategy. It's not certain.. but it's a good card.

But don't worry about it if you can only get a 9000 or a 9800.. personally I think I'd pick the 9800.. it was released pretty close to the 9600 in years anyway.. and they (all) have the all-important Theater 200 capture chip. |
Quote:
I couldn't find any! I kept trying to find the right search term but couldn't. Thanks. I'll consider getting one, now that I know I can get a cable. |
Pros generally mean "GPU needs more power".. aka they run "hot" for no reason useful for video capture.
Running "hot" is not good for the longevity of the electrolytic capacitors on the video card; the card heats up, they heat up, and they live shorter lives. When they start failing, you get more noise in your capture, until they stop working entirely. The low-power GPUs, or the passively cooled ones (heatsink, but no fans), are better for video capture.. exactly counterintuitive to what a gamer would want the cards for. And since they run cooler, they live longer. Heat can kill silicon too.. but anything with big capacitors is more sensitive to running hot.

I've found an old-fashioned Irwin case for an ATX motherboard simply isn't cool enough.. the thermals of the CPU and GPU quickly get out of hand if left to the puny (one) case fan in those old boxes. You really need something more modern that will handle the spillover from the video capture card.. it's not just a video capture card, it's a gamer's GPU, tuner, RF demodulator, and RAM chips.. a whole slew of things you don't need for video capture. But you have to take care to keep everything cool.. including your motherboard's CPU and RAM.

I've also tried just augmenting with a larger fan, or more fans.. no dice.. the problem is the throughput of air through the vents.. the old cases were just sealed up too tight.. you can't realistically get enough air through the holes in the front and the back, flowing in the pattern you need, to keep everything cool. I broke down and bought a Corsair Graphite case with huge fans and lots of cooling vents.. though I've thought about buying a cheap test stand and running everything "naked" on the bench with desk fans aimed at it.

I will say uncompressed capture generally runs "cooler" than using hardware compression while recording.. it only stands to reason, and it's true. Even if offloaded partially to the GPU, the CPU still puts out more heat because it's working harder.. and the RAM chips too, since they are moving all those bits multiple times.
It's another reason not to try compressing while capturing.. get the video digitized "first", then edit and repair, before taking the final plunge and compressing the original into something for long-term storage or distribution. It's a long workflow.. but perfection requires a lot of effort. |
I'll probably get the 9600 then. The proprietary cable is a bit of a pain, but as long as I have one it's not a big deal I guess. |
Site design, images and content © 2002-2024 The Digital FAQ, www.digitalFAQ.com