If “digital” cable comes into my house on coaxial cable, why isn’t coax used throughout the entire home theater? I mean really, why do I have to have s-video, RCA, optical, DVI, component, etc.? And is there one best cord? If we leave the numbers and science out of it… does it REALLY matter?
Thanks.
BTW, I just bought a 50" plasma… which made me get digital cable for the HD channels. About 75% of the channels look like butt! About 20% of the channels are fair, and the rest look awesome. My neighbor says Comcast in our area is the reason. This is just too much. NOW I must get satellite!
Because your TV can’t decipher the digital signal. Your cable box deciphers the signal and converts it into something your TV can understand, which will most likely be component, s-video, or composite (what you called RCA), in decreasing order of signal quality. I haven’t seen any cable or satellite boxes with DVI outputs yet (actually, I think when that interface is used to carry TV signals it gets called HDMI), but I’m sure they’re coming. Optical is never used for video signals so far as I know, only for carrying digitally encoded surround audio.
Incidentally, all of the above video signals are carried on one or more coaxial cables: s-video is two (really skinny) coaxes encased in a single round outer jacket, component is three, and composite is one.
Does it really matter? Well, it depends. Depending on a variety of factors, component inputs may or may not be noticeably better than s-video, and s-video may or may not be noticeably better than composite. The only way to know for sure in any given setup is to try it out and see. The better standards won’t ever hurt the picture, though, so you’ve nothing to lose aside from the cost of the cords, which won’t be much unless you pay for that ridiculously overpriced Monster stuff the a/v shop thinks you need because their markup on it is obscene.
The principal reason 75% of your channels look like crap on your big screen is that they’re not HD signals, so they have to be interpolated to fill your high-res display. It’s much the same as setting a 1280x1024 LCD computer monitor to 1024x768: looks like crap. Now, if you have a really good low-def signal, it’s possible to get a reasonable picture (that’d be your 20%), but a lot of TV signals are processed in such a way that blowing them up onto a high-def screen just magnifies a bunch of compression artifacts and the like. The awesome 5% are the channels that are actually in hi-def. It really does look great, but there’s not much of it out there yet. I have no idea whether you’ll actually get better pictures out of low-def satellite channels. I wouldn’t expect that you would, to be honest.
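To make the interpolation point concrete, here’s a minimal sketch (plain Python with made-up frame sizes; not what any actual cable box or scaler runs) of the simplest possible upscale. The scaler can only repeat pixels it already has, so noise and compression blockiness in the source get blown up right along with the picture:

import numpy as np

# Stand-in for a 640x480 standard-def frame (random values, purely illustrative).
sd_frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)

def upscale_nearest(frame, new_h, new_w):
    # Nearest-neighbour resize: each output pixel just copies the closest input
    # pixel, so no new detail appears -- existing flaws only get bigger.
    rows = np.arange(new_h) * frame.shape[0] // new_h
    cols = np.arange(new_w) * frame.shape[1] // new_w
    return frame[rows[:, None], cols]

hd_frame = upscale_nearest(sd_frame, 1080, 1920)
print(sd_frame.shape, "->", hd_frame.shape)   # (480, 640) -> (1080, 1920)

Real scalers use fancier filtering (bilinear, bicubic, and so on), which smooths the blockiness a bit but still can’t invent detail that was never broadcast.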
My work sometimes includes installing home theatre stuff. From what I’ve seen, I wouldn’t spend the money on those big screens because there’s so little HD programming available, and the low-def stuff usually looks worse than it would on a cheap tube. Not that I could afford one anyway.
All of the s-video cables I’ve dissected over the years have comprised two unshielded twisted pairs, one pair for the luminance signal and its ground and the other for the chrominance signal plus ground. I don’t recall seeing one that was two bundled coax conductors, though I can’t immediately think of a reason why there couldn’t be, other than potential impedance-matching issues (coax is normally between 50 and 75 ohms, while UTP is around 100 ohms).
My mistake. I’ve never cut into s-video cords, as I lack the means of terminating them, and I just assumed that the two signals were on a pair of coax lines. If I remember things aright, the impedance of coax depends on the ratio of the diameter of the center conductor to the diameter of the shield (and on the dielectric between them), so there’d be no particular reason you couldn’t have 100-ohm coax. Which I suppose is neither here nor there if they’re using twisted pairs.
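For what it’s worth, the usual textbook approximation is Z0 ≈ (138 / sqrt(er)) * log10(D/d), where d is the center-conductor diameter, D the inner diameter of the shield, and er the dielectric constant of the insulation. Here’s a quick sketch (illustrative numbers only, not measurements of any real cable) showing that 100-ohm coax just needs a larger shield-to-conductor ratio:

import math

def diameter_ratio_for(z0_ohms, er):
    # Invert Z0 = (138 / sqrt(er)) * log10(D/d) to get the D/d ratio
    # needed for a target characteristic impedance.
    return 10 ** (z0_ohms * math.sqrt(er) / 138.0)

# Assuming a solid polyethylene dielectric (er ~ 2.25):
print(round(diameter_ratio_for(75, 2.25), 1))    # ~6.5:1 for 75 ohms
print(round(diameter_ratio_for(100, 2.25), 1))   # ~12.2:1 for 100 ohms -- just a fatter cable

So nothing stops you from building 100-ohm coax; it mostly just isn’t what the video world standardized on.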