Wait, HDMI is how fast?

So a friend of mine asked me what the maximum bandwidth of an HDMI link is, and since I didn't know, I decided to look it up. My mistake, I guess, since this is close to making me question everything that is holy. You see, all the places I've checked, like this official-looking document [Warning: PDF], make claims like this:

and

and then in a table below this tidbit

Ok, fine, so nothing about that is too strange yet, but then we get to the technical details:

[Note: I’m only including this as proof that it’s actually binary digital and not some sort of a hybrid system]

and

Is this another industry-specific distortion of terms like signal frequency and data rate? I mean, this is a marketing document. Unfortunately, Google and the Wikipedia entry for HDMI confirm all of this data.

I will probably wind up looking really stupid once somebody explains to me what this means, but our HDMI guy is out of the country and this is a good place to ask.

TMDS uses 2 pins per channel (a differential pair) and 10 serial bits to represent each 8 bits of data. According to everything I've read, HDMI has 3 TMDS data channels sharing the same clock.

For simplicity, let's consider 1080p 8-bit 60Hz. 1080p is 1920 x 1080 pixels, and assuming this is YUV 4:2:2 coded data, you need 4,147,200 bytes per frame. You need 60 frames a second since this is progressive. That's about 1.85Gbps just for the YUV data. Audio, etc. adds something on top of that too, but probably not too much. According to the chart in the PDF, however, 1080p 8-bit 60Hz requires 4.46Gbps. That's one heck of an audio overhead. All right, let's not panic; maybe they're using YUV 4:4:4, but that's still only 2.78Gbps. It's puzzling to me how they got their 4.46Gbps figure, but that's OK, I can live with that.
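For reference, here's my arithmetic as a quick Python sketch (the numbers only come out to 1.85 and 2.78 if "Gbps" is counted in binary gigabits, i.e. 2^30 bits, which is how I've been counting):

```python
# Quick sanity check of the raw video payload above (no audio, blanking,
# or 10-bit TMDS overhead). "Gbps" here means binary gigabits (2**30 bits).
WIDTH, HEIGHT, FPS = 1920, 1080, 60

def raw_payload_gbps(bytes_per_pixel):
    bits_per_second = WIDTH * HEIGHT * FPS * bytes_per_pixel * 8
    return bits_per_second / 2**30

print(f"YUV 4:2:2 (2 bytes/pixel): {raw_payload_gbps(2):.2f} Gbps")  # ~1.85
print(f"YUV 4:4:4 (3 bytes/pixel): {raw_payload_gbps(3):.2f} Gbps")  # ~2.78
```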

What I can't live with comfortably is that it's written as "4.46Gbps (148.5MHz)". These mentions of MHz are throughout both the Wikipedia article and the PDF, and all around Usenet and the Web. I sure hope they don't mean megahertz, because using a 3-bit-wide TMDS serial interface like HDMI, you need a signal frequency of about 2GHz to achieve 4.46Gbps throughput. However, 4.46Gbps can be achieved using a 148.5MHz protocol if you just happen to be using 32-bit transmission, either by using 32 channels or some voodoo (like several discrete voltages instead of just two, or some sort of modulation). Now, I'd rather believe that HDMI runs at 2GHz and the internet is just full of idiots than that it runs at 148.5MHz and they're using voodoo to squeeze 32 bits per clock onto 3 data lines. However, more than either of those, I'd rather believe I'm missing something simple and this is all just my misunderstanding.

What’s going on? What is the actual maximum signal clock frequency of HDMI and what’s the maximum bandwidth?

Bump.

I hope you don't mind, but I took the liberty of posting your question at avsforum.com. Here is a link to the thread. I thought your question was interesting, and there might be people there who would be able to answer you. So far, there don't seem to be a lot of takers at SDMB. If you object to my posting your message over there, I will withdraw it.

If I've understood your question correctly, the crucial thing you're missing is that the TMDS clock runs at one-tenth of the data rate of each of the three TMDS data channels. The product page for the E4887A High-Resolution HDMI TMDS Signal Generator puts it this way:

As you point out in your OP, each byte of video data (however it’s encoded in color space) is sent as 10 serial bits, so the one-tenth clock means that it’s really a byte clock rather than a bit clock; the HDMI receiver will use a phase-locked loop (PLL) to regenerate the full-speed clock on-chip.

Thus, since the per-channel bit rate is ten times the transmitted HDMI clock rate, and there are three differential data pairs, the total transmitted bit rate (in Mbps) will be thirty times the clock rate (in MHz). The Wikipedia section on HDMI versions gives the current maximum (version 1.3) bit rate as 10.2Gbps, which means that each data channel runs at 3.4Gbps and is phase-locked to a 340MHz transmitted clock. (Note that since each byte is represented by 10 serial bits, the maximum real data throughput is 1.02Gbytes/sec.)
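To make that arithmetic explicit, here's a minimal sketch of those numbers (the variable names are just mine, for illustration):

```python
# HDMI 1.3 maximums as described above: three TMDS data pairs, 10 serial
# bits per byte, transmitted clock at one-tenth of the per-channel bit rate.
clock_mhz = 340                      # maximum HDMI 1.3 TMDS clock
bits_per_clock_per_lane = 10
data_lanes = 3

per_lane_gbps = clock_mhz * bits_per_clock_per_lane / 1000   # 3.4 Gbps
total_gbps = per_lane_gbps * data_lanes                      # 10.2 Gbps
payload_gbytes_per_sec = total_gbps * (8 / 10) / 8           # 1.02 GB/s of real data

print(f"{per_lane_gbps:.1f} Gbps per lane, {total_gbps:.1f} Gbps total, "
      f"{payload_gbytes_per_sec:.2f} GB/s of real data")
```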

The above holds true no matter what signal is being transmitted. Video and audio are multiplexed.

[On preview, I see Drum God’s post. I haven’t looked at his linked thread before submitting this.]

[Bolding mine.]

Just to clarify, they are “using voodoo” (if by “voodoo” you mean “transmitting a divided-by-ten clock” :wink: ), but it’s 30 bits per clock on 3 data lines (i.e. 10 × 3), not 32.

High-speed signal transmission often uses transmitted clocks that are submultiples of the per-channel data bit rate. Not only does the lower clock speed result in reduced EMI emissions, but in a protocol like HDMI, it provides a more useful sync for the data channels than a full-speed clock would. [Example: if the data lines were running flat-out at 3.4Gbps each, with a 3.4GHz clock sent down the cable on a differential pair, the receiver would have a hard time knowing where each 10-bit serial chunk started. However, by sending a one-tenth-frequency clock synchronized to the first bit of each ten-bit chunk, the receiver has more information than it would have with a full-speed bit clock. Throw a ×10 PLL into the receiver, and it’s simple to regenerate the full-speed bit clock.]
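And just to close the loop on the figure that started all this, here's the same thirty-bits-per-clock rule applied to the "4.46Gbps (148.5MHz)" pairing from your PDF (a rough sketch; this is the raw serial link rate at that clock, not the useful payload rate):

```python
# 30 bits per transmitted clock (10 bits on each of the 3 data lines),
# applied to the PDF's 1080p 8-bit 60Hz entry.
clock_mhz = 148.5
link_gbps = clock_mhz * 30 / 1000
print(f"{link_gbps:.3f} Gbps")   # 4.455 Gbps -- i.e. the PDF's ~4.46 Gbps figure
```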

Interesting discussion. I’ll only add that to get set up for certifying an HDMI source device to the 1.3a spec, you will need to pony up over $100K for equipment, assuming you have none. The guys from Tek will come in and show you how to run everything, though! :slight_smile:

Excellent! Thank you everybody, this makes perfect sense.

Just FYI, but the post at AVS Forum has gotten 42 views, but no replies. Maybe it’s because of the weekend.