Made an interesting observation while playing around with an ADVC-110 (via FireWire-to-USB-C adapters), capturing with stock QuickTime on an M1 MacBook Air.
If you choose "maximum" quality as the record setting, the stream gets captured as progressive Apple ProRes 422 at 704x480, and a 5-minute clip takes up about 1.5 GB, which is right around what typical lossless captures take with AIW cards. It's not lossless here, since the source is interlaced (VHS), so it's certainly doing at least some processing to the original.
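As a sanity check on that figure, the observed bitrate is easy to back out (quick back-of-envelope sketch; the 1.5 GB / 5 minutes numbers are just my rough observations from above):

```python
# Back-of-envelope: observed bitrate of the ~5-minute ProRes 422 capture.
size_bytes = 1.5e9            # roughly 1.5 GB clip
duration_s = 5 * 60           # 5 minutes
mbps = size_bytes * 8 / duration_s / 1e6
print(round(mbps))            # → 40
```

~40 Mbps is in the ballpark of ProRes 422's published SD target rate, and notably higher than DV's fixed 25 Mbps, so it's clearly re-encoding rather than passing the DV stream through.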
I am fairly certain it is doing a (for lack of a better term) "virtual dub" of the DV stream's interlaced 4:1:1 signal to progressive 4:2:2, using its own deinterlacing algorithm at capture time. I will say the deinterlacing it does seems superior to what QTGMC (via Hybrid) produced on at least one of my test clips (diagonal lines and repeating patterns have far fewer artifacts).
I highly doubt that QuickTime can get hold of the signal before the usual DV 4:1:1 sampling and compression happens inside the ADVC-110, but I'd like to be able to prove that, since it's already doing things that suggest it could have access to an uncompressed stream (it captures at a different resolution than the ADVC-110 usually outputs).
I saw this, which describes how to test subsampling using a pattern and an HD source (computer HDMI output to an LCD TV in that case):
https://www.rtings.com/tv/learn/chroma-subsampling
Any methods out there for testing standard-definition subsampling losses over S-Video or composite? I think I just need to be pointed to a specific pattern and a way to push it from a computer into a digital-to-analog device known to have good color representation. It could also be possible to encode a progressive DVD with a fixed, pixel-accurate test pattern and play it through a DVD player, which would get it back to analog.
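In case no ready-made SD pattern turns up, here's a rough sketch of how one could be generated, writing a plain binary PPM with no dependencies. It draws bands of vertical stripes at 8/4/2/1-pixel widths in two strongly chroma-separated colors, so 4:1:1 (which averages 4 horizontally adjacent chroma samples) should wash out wider stripes than 4:2:2 (which averages 2). The 704x480 frame size, the magenta/green pair, and the band layout are my own choices here, not from any standard:

```python
# Sketch: chroma-resolution test pattern as a binary PPM (P6).
# Four horizontal bands, with vertical stripes 8, 4, 2, then 1 px wide.
# Magenta vs. green gives a large chroma difference, so chroma
# subsampling visibly grays out the narrower stripes.
WIDTH, HEIGHT = 704, 480
BAND = HEIGHT // 4
A, B = (255, 0, 255), (0, 255, 0)   # magenta / green

def make_pattern(path="subsampling_pattern.ppm"):
    rows = []
    for y in range(HEIGHT):
        stripe = 8 >> (y // BAND)        # 8, 4, 2, 1 px per band
        row = bytearray()
        for x in range(WIDTH):
            row += bytes(A if (x // stripe) % 2 == 0 else B)
        rows.append(bytes(row))
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))
        f.writelines(rows)
    return path

make_pattern()
```

The resulting PPM could then be converted to whatever a DVD-authoring tool or analog output device expects (PNG, BMP, etc.) with any image converter; the point is just that the stripes are pixel-exact before they go out to analog.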
Not trying to reinvent the wheel if there's already a good method for visualizing subsampling losses in a chain.
Thanks!