I’m a firm believer in the “all cables will do the same job” theory and share many people’s opinion that Monster Cable and its like are snake oil.
However, I do know enough to realize that 16 gauge speaker wire is an improvement over 24 gauge speaker wire. And I believe that when it comes to coaxial cable there is a difference between some cables in their ohm measurements.
How about HDMI cables running 2 meters? Can I buy the one from cablesforless.com for $10 and expect it to be the same as the $30-$60 upgrades?
Or are there various significant gauges/measurements for HDMI cables?
The data being transferred is digital. You either get it or you don’t. Unless durability and length are concerns (and two meters shouldn’t concern you), then no, there will be no difference in signal quality.
Since this is IMHO not GQ, I hope you’ll accept an anecdotal answer. We’ve just upgraded our, well, everything, as it happens: TV, satellite box and service, and home theatre system. We had real problems getting the satellite box to talk to the A/V receiver/amp - essentially the signal was intermittent and dropped out very frequently. We tried connecting the box directly to the TV with the HDMI cable we had and didn’t have the same problems, so by process of elimination decided there was a fault with the A/V receiver. We returned it to the vendor (as an aside, this was Richer Sounds and for UK dopers I cannot recommend them highly enough - they were competitive, helpful and not patronising).
Problem was, we got the second A/V receiver home, hooked it all up (and my, that’s fun - took about an hour each time to hook up and test all the speakers) and had the exact same problem. Cue hours of research on every audio-visual forum I could find, and a lot of them recommended looking at the cables we were using. We upgraded the HDMI cable in question from a £30 to a £70 version. Result: no more drop out, perfect picture, no further problems with the signal.
The short version: I don’t know for sure that the quality of the cable fixed our problem, but I’m more convinced than I was that cable quality is important. I’m still not convinced, though, that it’s as important as the audiophiles would have you believe.
This sounds like a simple case of a faulty wire. I’ve had expensive faulty wires too, though in theory, at least, you would expect the more expensive ones to be more rigorously tested.
Holy crap, you paid 30 POUNDS (that’s $60) and then paid 70 pounds ($140!!) for another HDMI wire? Was it covered in diamonds cut out of rock by starving African children?
I guarantee that your problem would have also gone away if you had returned the damaged $60 wire and gotten a new $10 one.
That’s just not true. Digital signals (the actual waveform that reaches the end of the wire) can degrade just like analog signals. In fact, high-speed digital signals can look pretty ugly on an oscilloscope screen.
Digital signals are less susceptible to degradation because: (1) if the degradation is not large enough to cause a bit to flip (1 becomes 0 or vice versa), then the actual digital data is intact, and (2) even if the actual data is corrupted, digital data streams usually have error-detection or error-correction mechanisms.
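To illustrate point (2), here’s a minimal toy sketch of error detection on a digital link. This is not how HDMI actually does it (HDMI uses its own TMDS encoding); it just shows the principle that a flipped bit can be caught and the packet discarded, rather than showing up as noise in the picture:

```python
def checksum(data: bytes) -> int:
    """Simple additive checksum the sender appends to each packet."""
    return sum(data) % 256

def send(data: bytes) -> bytes:
    # Transmit the payload followed by its checksum byte.
    return data + bytes([checksum(data)])

def receive(packet: bytes):
    data, received_sum = packet[:-1], packet[-1]
    if checksum(data) != received_sum:
        return None   # corruption detected: drop the packet / ask for a resend
    return data       # data arrived bit-for-bit intact

packet = send(b"video frame chunk")
corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]  # flip one bit "in the cable"
print(receive(packet))     # b'video frame chunk'
print(receive(corrupted))  # None -- the error was detected, not shown as noise
```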
I’m not sure what you just said contradicted me at all.
Yes, the signal is subject to degradation, but just as you mention, because of the nature of the data and the systems involved, for most practical purposes either you get the intended data or you don’t.
Either the video will look as it is meant to, or there will be no video (the video will cut off or stutter). It’s not like an analog connection, where noise/distortion/loss of quality will be introduced into the final output.
Unless you are dealing with long distances, or a wire that needs to operate under some possible stress, or one in a location that makes it hard to service (a wall), I stand by my recommendation (and even in those special cases, you can probably find an alternative that’s just as good, but not as pricey as some of the brand names in retail stores).
I thought it was misleading when taken as a general statement about digital signals. Digital data is susceptible to interference and signal degradation, and cable quality does make a difference. Anyone with digital satellite TV knows there can be varying degrees of data degradation - it’s not an on/off thing.
Although I do agree that for a commercially manufactured cable used for its intended purpose, even the cheapest example is likely to work fine without any data errors, and there will be no benefit to buying a high-end example.
Not covered in diamonds, sadly, although I like the image.
Unfortunately the first one wasn’t damaged in any way, and works perfectly well in its current situation. We just kind of ran out of patience and fight with the whole setup, and although spending yet more money on the whole thing did make me cross, for whatever reason the recommendations we got did fix the problem. Or, I suppose, to be accurate: acting on the recommendations we got coincided with the problem going away!
Well, let’s talk briefly about coaxial cable. The “ohm measurements” Hampshire is talking about refer to the nominal impedance of the cable. The impedance of a coaxial cable is determined by the ratio between the diameter of the centre conductor and the diameter of the outer conductor (which doubles as the shield), along with the dielectric between them.
There are two major flavours of coax. 75 ohm is the most common - this is your RG-6, RG-59, or RG-11 distributing analog video signals at a location near you. Then there is 50 ohm - RG-8, RG-58 - which is usually found connecting some types of antennae to receiving units; this is commonly used, just for example, in wireless microphone systems. Differences in impedance usually matter, and substituting RG-58 for RG-59 will generally cause you problems.
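If anyone wants to see where the 75 ohm and 50 ohm figures come from, the usual approximation for an ideal coaxial line is Z0 = (138 / sqrt(er)) x log10(D/d), where D is the inner diameter of the shield, d is the diameter of the centre conductor, and er is the dielectric constant of the insulation between them. A quick sketch with illustrative numbers (the dimensions below are made up, not taken from any particular cable’s datasheet):

```python
import math

def coax_impedance_ohms(shield_inner_diameter, centre_diameter, dielectric_constant):
    """Characteristic impedance of an ideal coaxial line.

    Z0 = (138 / sqrt(er)) * log10(D / d)
    Only the D/d ratio and the dielectric matter, not the absolute size,
    which is why RG-6 and RG-59 can both be nominally 75 ohm despite
    having different gauge centre conductors.
    """
    return (138.0 / math.sqrt(dielectric_constant)) * math.log10(
        shield_inner_diameter / centre_diameter
    )

# Made-up example: foam PE dielectric (er ~ 1.5), D/d ratio of about 4.7
print(round(coax_impedance_ohms(4.7, 1.0, 1.5), 1))   # ~75.7 ohm
```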
Within each family of coax, the different types vary by the gauge of the centre conductor. Just for example, RG-6 has an 18 gauge centre conductor, while RG-59 has a 20 gauge centre conductor with a correspondingly smaller diameter coaxial shield. They each have a nominal 75 ohm impedance, but the RG-59 will of course exhibit a higher resistance (which is also measured in ohms) due to its thinner conductors. Because of this, while signals on RG-6 and RG-59 will degrade at roughly the same rate assuming equivalent shielding, the signal on RG-59 will attenuate at not quite twice the rate of a signal on RG-6. Essentially, using RG-59 is like using a bit less than double the length of RG-6. I’m guessing this was the difference you thought Hampshire was talking about, and I’d agree that it’s less important than shielding. The impedance difference, however, is not.
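If you want to sanity-check that “not quite twice” figure: copper resistance scales with the inverse of the conductor’s cross-sectional area, and each step of two AWG sizes shrinks the area by a factor of about 1.6. A rough back-of-the-envelope sketch (DC resistance only - at the RF frequencies coax carries, skin effect changes the absolute numbers, but the size comparison comes out similarly):

```python
import math

COPPER_RESISTIVITY = 1.68e-8  # ohm * metre, at room temperature

def awg_diameter_m(gauge):
    """Standard AWG formula: conductor diameter in metres for a gauge number."""
    diameter_inches = 0.005 * 92 ** ((36 - gauge) / 39)
    return diameter_inches * 0.0254

def resistance_per_metre(gauge):
    """DC resistance of a solid copper conductor, in ohms per metre."""
    area = math.pi * (awg_diameter_m(gauge) / 2) ** 2
    return COPPER_RESISTIVITY / area

r18 = resistance_per_metre(18)   # RG-6 centre conductor
r20 = resistance_per_metre(20)   # RG-59 centre conductor
print(round(r20 / r18, 2))       # ~1.59 -- "not quite twice"

# Same arithmetic for the speaker wire question in the OP:
print(round(resistance_per_metre(24) / resistance_per_metre(16), 1))  # ~6.4
```

Which incidentally puts a number on the OP’s speaker wire hunch: 24 gauge has roughly six times the resistance of 16 gauge for the same length.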
What complicates matters is that much commonly available RG-59 comes with a 95% braid shield, while RG-6 is usually foil + 60% braid, so people associate the RG-59 with inferior shielding.
I’m not sure about the technical nature of the argument about whether digital will/can degrade, but we have digital cable. And from time to time, there will be “noise” in the sense of random colored stripes or pixelation - things like that. They come and go, they’re rare, but they happen. I figure it’s a case where a block of signal gets corrupted for a moment and then passes.
There’s one simple fact that matters. The signal that you are getting from your cable provider or satellite dish is going to be pretty well degraded as far as digital signals go. Any small variations between a cheap cable and an insanely expensive one are going to be negligible compared to the signal issues that are likely to be introduced before the data gets to your doorstep. The system is only as strong as its weakest link, and that HDMI cable and power supply are not going to be the weakest part.