The VCR puts you a step ahead of most consumer players and will give better results. An external frame-level TBC is often used as well; while not absolutely necessary with a good player, it does eliminate frame-level signal problems, and it's essential for damaged, poorly stored, or otherwise deteriorating tapes.
The ATI capture device most often recommended here is the ATI 600 USB or PCI. Lossless capture is achieved using VirtualDub as the capture interface, while the capture drivers themselves come with the capture device. The ATI can also capture directly to MPEG2 via its MMC software package. Another capture device that can capture to lossless media is the Diamond VCD500, also using VirtualDub to capture. (I may have that Diamond model number slightly wrong, but it's definitely a '500'.)
The phrase "highest quality possible" keeps showing up, but most users who've never captured to lossless media don't know what it entails. VHS, VHS-C, SVHS, and Hi8 are captured in lossless form to AVI using the Lagarith or huffyuv lossless compressors in the YUY2 colorspace (because YUY2 is a close equivalent to the way luma and color information are stored on analog tapes). Typically such a capture consumes drive space at about 25GB per hour. So frankly I don't think that method is exactly what you imagine it to be. DV runs at a slightly lower file size but is not lossless; DV discards some 20% of the overall original data, and for NTSC it loses 50% of the chroma resolution. DV also has some compression artifacts that don't usually show up in VHS-to-lossless capture.
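If you want to see where that 25GB-per-hour figure comes from, the arithmetic is simple. This sketch assumes a standard NTSC capture frame of 720x480 at 29.97 fps, YUY2 at 2 bytes per pixel, and a rough 3:1 compression ratio for huffyuv/Lagarith (the ratio varies with tape noise; that's my assumption, not a guaranteed number):

```python
# Back-of-envelope storage math for lossless YUY2 capture.
# Assumptions: NTSC 720x480 @ 29.97 fps, YUY2 = 2 bytes/pixel,
# and a typical ~3:1 huffyuv/Lagarith compression ratio.

def uncompressed_gb_per_hour(width=720, height=480, fps=29.97,
                             bytes_per_pixel=2):
    """Raw YUY2 data rate in decimal gigabytes per hour."""
    bytes_per_second = width * height * bytes_per_pixel * fps
    return bytes_per_second * 3600 / 1e9

raw = uncompressed_gb_per_hour()   # ~74.6 GB/hr uncompressed
lossless = raw / 3                 # ~24.9 GB/hr at an assumed 3:1
print(f"raw: {raw:.1f} GB/hr, lossless: {lossless:.1f} GB/hr")
```

Noisy tape compresses worse than clean studio footage, so real captures can land above or below that 25GB/hr mark.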
Practically speaking, you would likely prefer a much smaller archive encoded directly to MPEG2 at very high bitrates, which would max out at about 3GB per hour, more or less depending on bitrate. It is possible later on to clean up the usual analog tape defects found in those captures. But realize that more compression loss is inevitable with re-encoding, you'd still have to do any fancy intermediate processing in a lossless format anyway, and you have to know what you're doing.
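The bitrate-to-disk conversion behind that 3GB-per-hour figure is plain arithmetic, not tied to any particular encoder. Roughly 6.7 Mbit/s works out to 3GB per hour:

```python
# Convert a video bitrate to decimal GB of disk per hour of footage.

def gb_per_hour(mbit_per_s):
    """Disk use in decimal gigabytes for one hour at a given bitrate."""
    return mbit_per_s * 1e6 * 3600 / 8 / 1e9

print(gb_per_hour(6.7))   # ~3.0 GB/hr
print(gb_per_hour(9.0))   # ~4.1 GB/hr, approaching DVD's ceiling
```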
The use of an HTPC for television viewing is one of the oddest fads going nowadays IMO, as PC graphics cards aren't all that suitable for TV display. People assume that PC monitors and TV sets display the same way. But of course they don't, and the difference is far more than just progressive vs. interlaced-to-progressive. PC monitors and TV sets use a different colorspace and have different luma and gamma curves, among other factors. The same mismatch occurs the other way around, i.e., watching DVD and BluRay movies on a PC when in fact those media are designed for TV display. Of course most people don't know the difference anyway, for several reasons, and usually neither their PC monitors nor their TV sets are properly calibrated for any medium, regardless of source.
The advantage to lossless is, first, that it's lossless. Second, any future modifications won't damage the new results through lossy re-compression. Third, lossless media can be encoded to any format or codec imaginable -- which means that lossy encoding from the lossless master to the usual delivery codecs is only one lossy stage, not multiple stages of loss and degradation, and the master archive remains as-is. The disadvantage is that, as with DV-AVI, set-top and external media players don't recognize those lossless codecs. Most PC media players, however, can interpret and play the lossless file if the codec is installed on the machine.
As I said earlier, most people aren't so picky anyway. But it's often the case that the more you know about video and graphics, the more closely you watch and the more involved you get, and the more involved you get the more the careless processing gets in your way. It can be dangerous territory.