HDMI cable certification and signal quality

I can go to Amazon and buy a one-meter Monster HDMI 1.3a-compliant cable for $57.53. I can go to MyCableMart and buy a 1.8-meter HDMI 1.3a-compliant cable for $6.82.

I assume there’s a difference, but what is it? I don’t want to waste even the $13.90 (with shipping) the cheap cable would cost if it won’t be good enough, but I also don’t want to waste the $60.00 the name-brand cable will cost if it won’t make any difference in picture or sound quality.

If the certification means anything, shouldn’t all 1.3a cables be the same? If not, what good is the certification? Are there dopers out there with A/V experience who can weigh in here?

“If not, what good is the certification?”
It’s good for extracting more money from you.

Unless you are going to push the limits of the spec on distance, there is no reason to go with the more expensive cable (IMHO).

Digital signals are either strong enough for the receiver to pick up, or not. If not, you’ll get lots of pixelation and audio pops. At a 1.8 m distance there will be absolutely no difference.

As for Monster - I’ve compared Monster products against cheap brands for all sorts of A/V cables, and apart from some long (10 m) analog audio cables, I’ve never noticed any differences.

Cables have become an easy way of extracting money from people with more money than common sense, IMO.

The difference is that the Monster cable is packaged with pretty colors and nice high-tech looking connectors on each end.

In terms of performance, there’s zero difference as long as both cables have solid connections and are built of decent quality. And they probably are.

Monster Cable skirts a fine line between being overpriced and being an outright scam. Buy the cheap stuff and pat yourself on the back for being a more intelligent consumer than the average Best Buy denizen.

Because if it costs more, it has to be good. It’s certainly done well for Alienware!

Buy the cheaper one. The biggest difference between Monster and any other cable isn’t in the build, it’s in the marketing department.

Well, that includes redefining common sense. According to M-W, “common sense” is

“sound and prudent judgment based on a simple perception of the situation or facts”

and in most cases where a person buys a $60 HDMI cable, that sound and prudent judgment isn’t being exercised based on a simple perception of the situation or facts. Just because it’s digital doesn’t mean it’s any more fault tolerant, and a crappy cable can certainly make it work poorly or not work at all. However, there’s absolutely no reason to believe that a $60 cable is not crappy or that a $6 one is. Current consumer HDMI is a 165 MHz signal with 3 data lines (+/-) and a clock line (+/-); it’s PLLed to 10 bits per clock for an effective signal rate of 1650 MHz per data line. My common-sense guideline would be that an HDMI cable shouldn’t be significantly more expensive than a CAT5e patch cable.
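
To make that arithmetic concrete, here’s a rough back-of-the-envelope sketch in Python (the figures are the ones above; the variable names are just mine):

```python
# A rough check of the numbers above: 165 MHz pixel clock, 10 TMDS bits per
# clock, 3 data pairs, just to show where the rates come from.

pixel_clock_hz = 165e6   # single-link pixel clock quoted above
bits_per_clock = 10      # TMDS 8b/10b encoding: 10 bits on the wire per clock
data_pairs = 3           # one differential pair for each of the three channels

per_pair_rate = pixel_clock_hz * bits_per_clock   # 1.65 Gbit/s per data pair
total_rate = per_pair_rate * data_pairs           # 4.95 Gbit/s across the link

print(f"per-pair bit rate:  {per_pair_rate / 1e9:.2f} Gbit/s")
print(f"aggregate bit rate: {total_rate / 1e9:.2f} Gbit/s")
```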

Actually it does. Digital signals are immune to minor RF interference leaking into the cable. Moderately poor shielding of an HDMI cable won’t make a lick of difference, because the TV or whatever is receiving the signal is looking for a bunch of 1s and 0s, and when it gets a bunch of 1.0572s and 0.1316s it’s going to ignore the fractional values, so to speak. If the cable is sufficiently crappy you can start to get problems, but you’re going to have to have pretty substantial interference before that happens. With analog, on the other hand, any interference whatsoever gets summed into the signal, resulting in static or hum.
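
If it helps, here’s a minimal sketch of what “ignoring the fractional values” means, assuming idealized 0/1 levels and a simple midpoint threshold (the noise figures are made up for illustration):

```python
# Minimal sketch, assuming idealized 0/1 levels and a midpoint threshold:
# small interference on a digital link vanishes when the receiver slices each
# sample back to a bit, while the same interference just becomes part of an
# analog signal.

import random

bits = [1, 0, 1, 1, 0, 0, 1, 0]

# What arrives at the far end: nominal levels plus a little interference.
received = [b + random.uniform(-0.15, 0.15) for b in bits]

# Digital receiver: slice against the midpoint and recover the original bits.
recovered = [1 if level > 0.5 else 0 for level in received]
assert recovered == bits          # minor noise has no effect at all

# "Analog receiver": no slicing, so the interference stays in the output.
print([round(level, 3) for level in received])
```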

The biggest difference you’ll see between low- and high-quality HDMI cables is how long they can be and still work.

Monster is a scam of insane proportions. There was a thread here a while ago I posted to, where I found that they were selling a bunch of different 1/4" mic cables for super high $$$. Some were labelled as specially for “Rock”, even though you could get the exact same cord and ends marked for “Jazz”, etc.

The extra $$$ goes to keeping their marketing department high.

All right, the cheapest certified cable it is. Thanks for the advice.

I never stated that digital signals are not immune to minor RF interference leaking into the cable. What I’m talking about is major RF interference, crosstalk, reflections, attenuation, and other factors affecting high-frequency digital links. A cable can be crappy in many more ways than being poorly shielded – it can have incorrectly matched impedance, unacceptably high capacitance, unbalanced electrical properties between conductors within the cable, etc. All of these things are affected by conductor spacing, choice of insulation and insulation thickness, conductor material and stranding, twisting, shielding and manufacturing defects. We’re talking about 1’s turning into 0’s and 0’s turning into 1’s, not minor variations in the signal.
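
To show the difference between “minor variations” and actual bit errors, here’s a toy simulation along the same lines as the sketch above. Uniform noise and a midpoint threshold are simplifying assumptions, not a model of a real TMDS link, but the shape of the result is the point: nothing at all goes wrong until the interference gets big enough, and then bits start flipping.

```python
# Below a certain interference level nothing goes wrong at all; beyond it,
# samples get pushed across the decision threshold and 1's start turning
# into 0's (and vice versa).

import random

random.seed(1)
bits = [random.randint(0, 1) for _ in range(100_000)]

def bit_errors(noise_amplitude):
    errors = 0
    for b in bits:
        level = b + random.uniform(-noise_amplitude, noise_amplitude)
        if (1 if level > 0.5 else 0) != b:
            errors += 1
    return errors

for amp in (0.2, 0.4, 0.6, 0.8):
    print(f"noise +/-{amp}: {bit_errors(amp)} errors out of {len(bits)} bits")
```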

Absolutely. But it is simply false to say there isn’t any more fault tolerance in digital signals than in analog. Minor faults that fall short of turning zeros into ones are tolerated perfectly well.

That’s also arguable, depending on your definition of fault and tolerance. To me, fault tolerance is a property of the system. Analog is more fault tolerant for some things, digital is for others. Error correction and detection can also be added to digital but are much more difficult to add to analog.

The fault tolerance of the two is certainly different, but I wouldn’t say one has any inherent advantage over the other. For example, interference that would introduce a single bit error into a digital signal might not even be perceivable in an analog signal, but could entirely destroy the message in a poorly designed digital system. In some cases where there is error detection but no error correction, the message has to be resent. If the system has no ability to resend (which is fairly often the case) but error detection is still done, the message can only be discarded.

For example, a streaming MP3 audio application might calculate the frame CRC and discard the frame if the CRC does not match (it would be dumb, but I’ve seen this). That represents a loss of 1152 samples, or almost 40 ms of silence, introduced by a single bit error. An equivalent spike might not even be audible in an analog stream.
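
Here’s a hypothetical sketch of that discard-on-CRC-failure behaviour. The frame layout and helper names are made up for illustration (they’re not any real MP3 library’s API); the point is just that one flipped bit fails the check and mutes an entire frame.

```python
# Hypothetical sketch of discarding a frame on CRC failure. The frame layout
# and helpers are illustrative only, not a real decoder's API.

import zlib

SAMPLES_PER_FRAME = 1152          # samples in an MPEG-1 Layer III frame

def decode_frame(payload):
    """Stand-in for real frame decoding; returns SAMPLES_PER_FRAME samples."""
    return [0] * SAMPLES_PER_FRAME

def play_stream(frames):
    """frames: list of (payload_bytes, expected_crc) received over the link."""
    output = []
    for index, (payload, expected_crc) in enumerate(frames):
        if zlib.crc32(payload) != expected_crc:
            # One-way stream, no way to ask for a resend: drop the whole frame
            # and emit silence instead.
            print(f"frame {index}: CRC mismatch, muting {SAMPLES_PER_FRAME} samples")
            output.extend([0] * SAMPLES_PER_FRAME)
        else:
            output.extend(decode_frame(payload))
    return output

# A single flipped bit in one frame's payload mutes all 1152 of its samples.
good = b"\x12\x34" * 200
bad = bytes([good[0] ^ 0x01]) + good[1:]   # same payload with one bit flipped
play_stream([(good, zlib.crc32(good)), (bad, zlib.crc32(good))])
```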

Digital cables either work or they don’t. Period. The Nyquist–Shannon sampling theorem (see the Wikipedia article on it) says something, I believe, to the effect that a digital signal can be reproduced perfectly as long as it still gets through.

Cables is cables; get the cheap ones unless you’re routing them over long distances (say, greater than 10 feet).

Link to more info on HDMI than you’ll ever want to know.

That is just plainly false. Digital cables either work or they do not for any particular bit in the transmission, that’s true, but you don’t even get to know whether that’s the case unless you take special measures. The integrity of the data can certainly be compromised; the entire field of error detection and correction wouldn’t exist if this weren’t true. There’s nothing magic about digital signals that makes them immune to signal degradation due to a bad cable. Anybody who’s ever had a bad USB or CAT5 cable can attest to that – it might still work, but intermittently, poorly, or slowly due to constant resending. HDMI has no error detection or correction. Every incorrect bit in the HDMI stream (which is mostly uncompressed video data) is going to show up on the screen. Whether it’s going to visibly degrade video quality depends on how many errors there are and how often they occur.
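
As a quick illustration of why every flipped bit lands on screen, assume an 8-bit-per-channel pixel (the values are arbitrary):

```python
# A flipped bit in uncompressed video simply changes the value of the pixel
# it lands in, by a little or a lot depending on which bit it hits.

red = 0b10000000              # a red-channel value of 128

for bit in (0, 3, 7):
    corrupted = red ^ (1 << bit)      # flip one bit of the byte
    print(f"bit {bit} flipped: 128 -> {corrupted}")

# bit 0 flipped: 128 -> 129   (imperceptible)
# bit 3 flipped: 128 -> 136   (barely visible)
# bit 7 flipped: 128 -> 0     (an obvious sparkle on that pixel for one frame)
```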

Again, I’m not recommending that people pay $60 for cables, but that’s not because there’s no such thing as a bad digital cable. It’s because
a) $60 is a ridiculously high amount, and
b) any cable that is HDMI compliant should work, because to get the HDMI compliance logo the cable (or at least one cable of the same design/length) has to be tested.

As for the Nyquist-Shannon sampling theorem, it is generally applied to analog-to-digital conversion rather than digital signals. In fact, in most digital systems the sampling frequency of a digital signal is equal to the bandwidth because it is driven by the same clock, so you get very little use out of the theorem.
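
For what it’s worth, here’s the theorem in its usual home, analog-to-digital conversion, with arbitrary example values:

```python
# The sampling theorem applies to A/D conversion: a band-limited signal can be
# reconstructed exactly as long as it's sampled at more than twice its highest
# frequency. Whether the resulting bits then survive a cable is a separate
# question entirely.

import math

signal_freq = 1_000       # Hz, highest frequency present in the analog signal
sample_rate = 8_000       # Hz, comfortably above the 2 kHz Nyquist rate

assert sample_rate > 2 * signal_freq, "sampling this slowly would alias"

samples = [math.sin(2 * math.pi * signal_freq * n / sample_rate)
           for n in range(16)]
print([round(s, 3) for s in samples])
```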