My VCR is not an S-VHS model, so why am I getting a clean picture over S-Video, with no noise or interference, on the TV but not on the PC? That would rule out a bad cable (the TV image is fine).
I'm using a SCART-to-S-Video cable (like this) to connect the VCR to the PC, but when I test the VCR with the TV I have to add an S-Video-to-SCART converter (like this), since my TV has no S-Video input. So here's my theory:
According to Wikipedia, on SCART the S-Video Y (luminance) signal shares a pin with Composite, while the S-Video C (chrominance) signal uses a separate pin that Composite doesn't use. So my guess is that the TV is unaware the input is S-Video and is simply running in Composite mode, which is why its picture is OK. The PC, on the other hand, knows the input is (or should be) S-Video and is trying to use both signals, but the S-Video C pin is carrying garbage (it's a regular VHS deck, not an S-VHS one).
Does that make sense?