Hey all.
My normal workflow is a Samsung SV-5000W VCR -> StarTech SVID2USB232 -> VirtualDub x64 (on a Windows 11 laptop). Not ideal, but it's what I have for now until I can afford a better setup. I'm only archiving family footage.
Recently I found a Sony CCD-TRV11 camcorder and just got a replacement power supply for it. When I viewed some 8mm tapes on its display, I noticed a few differences compared to the captures I was getting.
1. The colors were a lot stronger on the display. Everything looked more vibrant.
2. The picture was a lot clearer than what I was getting off my VHS tapes.
This made me wonder if I was having issues with my VCR or my capture card.
I tried the following two setups.
1. Camera -> VCR -> Capture Card -> VirtualDub
2. Camera -> Capture Card -> VirtualDub.
There's an extremely noticeable brightness flicker(?) where the color isn't stable during some videos, and it happens in both setups above. Because of this, I think the issue is either the cheap capture card or my VirtualDub settings being off. The only other thought I have is that maybe my cables need better shielding?
I've attached a small clip from one of the recent camera captures showing the issue.
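If it helps to pin down whether the flicker is actually baked into the file or just something in the preview, I'm thinking of plotting per-frame brightness and comparing captures. Here's a rough sketch of what I mean (assumes Python 3 with opencv-python and matplotlib installed; "capture.avi" is just a placeholder for whichever clip you point it at):

# Rough sketch: plot mean luma per frame so the flicker is measurable
# instead of eyeballed. Assumes opencv-python and matplotlib are
# installed; "capture.avi" is just a placeholder filename.
import cv2
import matplotlib.pyplot as plt

def mean_luma_per_frame(path):
    cap = cv2.VideoCapture(path)
    lumas = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Convert BGR -> YCrCb and average the Y (brightness) channel
        y = cv2.cvtColor(frame, cv2.COLOR_BGR2YCrCb)[:, :, 0]
        lumas.append(float(y.mean()))
    cap.release()
    return lumas

luma = mean_luma_per_frame("capture.avi")  # placeholder: clip to check
plt.plot(luma)
plt.xlabel("Frame")
plt.ylabel("Mean luma (0-255)")
plt.title("Per-frame brightness")
plt.show()

If the flicker is really in the file, the mean luma should swing up and down in a regular pattern instead of staying roughly flat; running it on captures of the same tape from both programs should show whether it's the software.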
EDIT: I just tried using OBS for comparison and the issue isn't present there, which makes me think it's related to VirtualDub.