Differences in types of HDMI cables

No, I’m not asking about Monster vs. Monoprice.

When buying HDMI cables, I noticed there were different types: HDMI, HDMI + Ethernet, and 3D. What’s the difference between these, and which would I use for:
Xbox 360 to HDTV
Laptop to HDTV?

From the horse’s mouth.

For what you want, a standard HDMI cable will probably be fine, although the spec does call for High Speed at these resolutions. You don’t need Ethernet. “HDMI 3D” is not an HDMI-anointed name, although it might indicate an HDMI High Speed cable. Indeed, you might note that the HDMI consortium is quite clear about what it considers to be proper labelling; an HDMI version number is not allowed as a cable type designator either.
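To make those official names concrete, here’s a little sketch (the four type names and the 1080i/1080p split are straight from hdmi.org; the picker function is just my own shorthand for illustration, nothing official):

```python
# Illustrative only: the four consumer HDMI cable types, plus a little
# helper that picks one for a given job. The type names come from hdmi.org;
# the helper itself is just my own shorthand.

CABLE_TYPES = {
    "Standard":                 {"rated_for": "720p/1080i", "ethernet": False},
    "Standard with Ethernet":   {"rated_for": "720p/1080i", "ethernet": True},
    "High Speed":               {"rated_for": "1080p and up", "ethernet": False},
    "High Speed with Ethernet": {"rated_for": "1080p and up", "ethernet": True},
}

def pick_cable(max_resolution: str, need_ethernet_channel: bool) -> str:
    """Pick the cheapest official type that covers the job (my shorthand, not hdmi.org's)."""
    speed = "Standard" if max_resolution in ("720p", "1080i") else "High Speed"
    name = speed + (" with Ethernet" if need_ethernet_channel else "")
    assert name in CABLE_TYPES
    return name

# Xbox 360 -> HDTV (assuming you run it at 1080p) and laptop -> HDTV,
# neither of which uses the HDMI Ethernet channel:
print(pick_cable("1080p", need_ethernet_channel=False))  # High Speed
print(pick_cable("720p",  need_ethernet_channel=False))  # Standard
```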

The specs are minimums. For instance, I have a $20, 30-foot cable that works perfectly at 1080p. But you can have problems with losses. In general, HDMI cables sold at most retailers are marked up to an insane level and are where many of them make their money (along with overpriced extended warranties).

From what I hear, there are two kinds of HDMI cables. Ones that work, and ones that don’t. The rest of all that is claimed is complete bollocks.

That’s not to say it was always true, or always will be, but for now they are all the same, and the only thing you need to worry about is enough physical length to reach between your components.

Well, if your cable is going to be sitting next to a power source, you might need a shielded cable.

Other than that, you’re spot on.

I believe I paid .97¢ for mine on amazon.com. Works great.

Wow! Less than a penny! I’ll bet the shipping was a killer.

Watch out - there are several types of HDMI cables. I bought a bunch of cheap ones from Amazon, and some work, some don’t. Usually it appears to be a problem with the HDCP handshake; cheap cables don’t seem to pass it right, or something.

My situation was (many devices) -> HDMI switch -> projector

When any of the devices was connected directly to the projector, no problem, with even the cheapest cables.

When connecting through the splitter, only the “good” cables (or is it HDMI 2?) worked. With cheapo cables, some devices would cause the dreaded HDMI blue rectangle covering the image, with the “You aren’t allowed to watch this, you evil pirate!” message.

From what I can tell there are two types of HDMI cables. The first is the standard HDMI cable. These cost about $10 online and work with almost anything.

The second is the kind that Best Buy and high-end tech places will try to sell you. This is the “Monster” type HDMI cable. It is exactly like the standard HDMI cable except that it comes in a really nice package. It doesn’t do anything other than cost $100 or more. Its best attribute is its ability to separate you from your money.

Anecdotal of course, but see my post above. All HDMI cables are not created equal. Of course, bits are bits, and the video and audio aren’t going to be “better quality” with a $100 cable than they would with a 97¢ cable. But, having tried many different cables from Amazon and other sources, I can confirm that there is a “crappiness” threshold: go below it and you start having issues with dropped signals and failed HDCP handshakes.

Better cables also have better shielding, so if you run them next to power cables (as often happens in home theater setups) there’s less chance of them receiving or transmitting EM interference.

Since your test direct from device to projector was successful with every cable, and only failed when going through the splitter or switch, I doubt it had anything to do with the cables; it was more likely the switch itself.

From the point of view of the HDMI consortium there are two domestic grades of cable: Standard and High Speed. There can be a real physical difference in the construction of these, and this could in principle matter. The post above with the interposed HDMI switch illustrates the problem exactly.

The difference comes down almost entirely to loss in the cable. (Other aspects that matter are crosstalk and noise immunity.) HDMI signalling rates are very, very high, so the signal loss down even a short length of cable can matter. However, how this manifests itself is mostly a matter of the HDMI receiver design and the signal drive level from the transmitter. If the sender has a higher drive level, the system will be more tolerant of losses in the cable; similarly, the receiver can be more tolerant, both of signal loss and of common-mode noise and signal jitter. A High Speed HDMI cable is guaranteed to work at the higher data rates (for 1080p and up) over its length. A Standard cable is only guaranteed to work up to the data rates seen with 1080i and lower. These are guarantees, not absolutes; if you have better-engineered components you may find that Standard-rate cables work fine at any rate.
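If you want to see where that Standard/High Speed line actually falls, the arithmetic is straightforward: each of the three TMDS data channels carries 10 bits per pixel clock (the 8-bit video data is encoded to 10 bits on the wire), and the two cable categories are tested at 74.25 MHz and 340 MHz pixel clocks respectively. A quick back-of-the-envelope in Python, using the usual timings (treat the numbers as ballpark):

```python
# Back-of-the-envelope TMDS data rates for 8-bit colour at 60 Hz, using the
# usual pixel clocks. Each of the 3 TMDS data channels moves 10 bits per
# pixel clock (8 bits of video encoded to 10 on the wire).

PIXEL_CLOCK_MHZ = {      # ballpark figures for the common video modes
    "720p60":  74.25,
    "1080i60": 74.25,
    "1080p60": 148.5,
}

CATEGORY_TEST_MHZ = {    # pixel clocks the two cable categories are tested at
    "Standard (Category 1)":   74.25,
    "High Speed (Category 2)": 340.0,
}

for mode, clk in PIXEL_CLOCK_MHZ.items():
    per_channel_gbps = clk * 10 / 1000     # 10 bits per clock, per channel
    total_gbps = per_channel_gbps * 3      # 3 TMDS data channels
    print(f"{mode}: {per_channel_gbps:.2f} Gb/s per channel, {total_gbps:.2f} Gb/s total")

for cat, clk in CATEGORY_TEST_MHZ.items():
    print(f"{cat}: tested at {clk} MHz, i.e. {clk * 10 * 3 / 1000:.2f} Gb/s total")

# 720p/1080i sit right at the Standard test point; 1080p is double that,
# which is why the spec points you at a High Speed cable for 1080p.
```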

Cables you buy can be branded in all sorts of ways. The longer the cable, the more the loss, so a long run needs a higher-spec cable than a short one. However, there are not actually many manufacturers of the actual cable, and the price you pay for it is 95% profit down the channel. So you may well discover that many Standard-speed cables are actually made from exactly the same cable as High Speed - it is easier. Then it is just a matter of getting some money out of you.

The silly prices asked for HDMI cables (Monster and the like) are just a way of getting money out of people. The standard provides for two grades of cable, and this becomes four types when Ethernet is added as an option. That is it. If you can get away with Standard grade, great (I do), but it is possible that you will discover you are close to the edge, at which point it dies and you need a High Speed cable.

The requirements for HDMI cables are not a great deal different to Ethernet, and the difference between CAT5 and CAT6 is not that far from the difference between Standard and High Speed HDMI. Given how cheap CAT6 cable is, you can see how much fat there is in many HDMI cable prices.

But if that were so, why would the same switch work fine with the good-quality cables? The whole thing works flawlessly when I use the better cables, and the switch even correctly detects changes of source.

What I was trying to say was that the switch introduced noise, or some other variable, that caused the issue - and that the ‘better’ cables masked it.

The point being that the switch, just by virtue of being there, helped to introduce noise along the signal path that would not have been there otherwise; the better cables (did they have ferrite cores on them?) helped to overcome that issue.

HDMI switches are not simple passive devices. They actually receive and retransmit the signal: internally they convert the signal from the analogue domain of signalling down a wire back to internal logic levels, and then retransmit it. The receivers are designed to cope with the typical signal loss seen in a length of cable and to compensate, so any HDMI switch also acts as a repeater. However, they don’t necessarily reclock the data, which means that timing jitter can accumulate. A poorly implemented switch could easily add more problems of its own. (I had a cheap switch with a fairly major fault out of the box. It destroyed the HDMI output of my Blu-ray player - which was unrepairable - and thus cost me not a small amount of money.)
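To put a rough picture on the “jitter accumulates” point, here’s a toy model - the numbers are completely made up, and real figures depend entirely on the silicon and cables involved, but independent jitter sources roughly add root-sum-square, so every extra non-reclocking box in the chain eats into the timing margin at the final receiver:

```python
import math

# Toy model only: made-up jitter figures, purely to illustrate how a
# non-reclocking switch eats timing margin. Real numbers depend entirely
# on the silicon and the cables.

def total_jitter_ps(contributions_ps):
    """Independent random jitter sources roughly add root-sum-square."""
    return math.sqrt(sum(j * j for j in contributions_ps))

BIT_TIME_PS = 1e12 / 1.485e9   # ~673 ps per bit at 1.485 Gb/s (1080p60, per channel)

chains = {
    "direct":     [150, 100],             # source jitter + one cable (invented, in ps)
    "via switch": [150, 100, 120, 100],   # + a switch that re-drives but doesn't reclock
}

for name, chain in chains.items():
    jit = total_jitter_ps(chain)
    print(f"{name}: ~{jit:.0f} ps of accumulated jitter out of a ~{BIT_TIME_PS:.0f} ps bit")
```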

You have an end-to-end signalling problem. Signal losses in HDMI cables are huge, because the data rates are so high. Better-quality cable will reduce the loss, and can also improve other figures of merit. Better-quality transmitters and receivers will also help. Where the lies come in is in suggesting that, once you have a stable end-to-end connection, there is anything more a higher-quality cable can do. There isn’t.
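Thinking of it as an end-to-end budget makes that concrete. A crude sketch with invented numbers (real cable loss depends on construction and frequency, and real receivers have equalisation this ignores):

```python
# Crude link-budget sketch with invented numbers, not from any datasheet.
# The idea: transmitter swing minus cable loss has to stay above what the
# receiver can still resolve, or the link falls over.

TX_SWING_MV     = 500    # assumed differential swing at the transmitter
RX_MIN_SWING_MV = 150    # assumed minimum the receiver can still slice reliably
LOSS_DB_PER_M   = 0.8    # assumed cable loss per metre at the TMDS signalling rate

def received_mv(length_m: float, loss_db_per_m: float = LOSS_DB_PER_M) -> float:
    """Amplitude left at the far end after the cable's attenuation."""
    return TX_SWING_MV * 10 ** (-(loss_db_per_m * length_m) / 20)

for length in (2, 5, 10, 15):
    mv = received_mv(length)
    verdict = "fine" if mv >= RX_MIN_SWING_MV else "marginal or dead"
    print(f"{length:>2} m: ~{mv:.0f} mV at the receiver -> {verdict}")

# A better cable lowers LOSS_DB_PER_M, a hotter transmitter raises TX_SWING_MV,
# and a better receiver lowers RX_MIN_SWING_MV; any of the three buys margin.
```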

Is this the reason my Xbox won’t play DVDs when I’ve got it connected to the HDTV? I get the HDCP compliance error message…someone told me it was the TV. Is it the cable?

Ah, thanks for the replies. Belkin’s website (the brand I bought) says they have four types of cables: normal and high speed, each with and without Ethernet - just like the HDMI website says. Nice to know they didn’t invent new types of cable. So I guess high speed without Ethernet is good enough for everything, and normal is good enough for my TV (720p)?

I would pick up high speed for everything.

I have no idea what Belkin’s price is, but I buy all mine through Monoprice and I have yet to hit a bad cable - $3 apiece for 6-footers.

By elimination, I narrowed it down to the Comcast/Moto cable box. The other sources were fine; it was just the cable box that would refuse to switch source properly until I used the “good” cables - then it worked as well.

Googling around, it looks like many people have had issues with cable boxes not doing the HDCP Masonic handshake correctly, and the model I have was specifically mentioned in quite a few posts.
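For the curious, the “Masonic handshake” is real enough: HDCP 1.x does a key exchange over the low-speed DDC wires, checks that both ends computed the same value, and then keeps re-verifying link integrity every couple of seconds. The toy below is nothing like the real cipher - it’s just to show why a marginal link can pass the initial handshake and still drop out with the “evil pirate” screen later:

```python
import random

# Heavily simplified toy of an HDCP 1.x-style handshake. Illustration only:
# the real thing involves device key sets, KSVs and a stream cipher. The
# point is just that both ends must keep computing matching check values,
# and a marginal link that corrupts them gets the output muted.

class ToyLink:
    def __init__(self, error_rate: float):
        self.error_rate = error_rate   # chance any given check gets corrupted

    def check_values_match(self) -> bool:
        # Real HDCP compares R0/R0' right after the key exchange, then
        # re-checks link integrity roughly every two seconds; here a noisy
        # link just occasionally makes the comparison fail.
        return random.random() > self.error_rate

def watch_a_movie(link: ToyLink, rechecks: int = 60) -> str:
    if not link.check_values_match():
        return "handshake failed at the start"
    for n in range(rechecks):          # periodic link-integrity re-checks
        if not link.check_values_match():
            return f"dropped out at re-check #{n} (hello, blue 'evil pirate' screen)"
    return "played to the end"

print(watch_a_movie(ToyLink(error_rate=0.0)))    # solid cable and box
print(watch_a_movie(ToyLink(error_rate=0.05)))   # marginal cable, box or switch
```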

Since we seem to have lots of HDMI-knowledgeable people here, can someone tell me how to hook up my laptop to my TV?

Do I have to buy some kind of adapter, or is there a cable with HDMI on one end and USB on the other? I tried to Google it and got confused.

I bought this and this yesterday and hope it works.

What brand & model laptop, and what outputs does it offer?

Dell 1440.

Um…how do I find out?