So a friend of mine asked me what the maximum bandwidth of an HDMI link is, and since I didn’t know, I decided to look it up. My mistake, I guess, since this is close to making me question everything that is holy. You see, every place I’ve checked, like this official-looking document [Warning: PDF], makes claims like this:
and
and then, in a table below, this tidbit:
Ok, fine, so nothing about that is too strange yet, but then we get to the technical details:
[Note: I’m only including this as proof that it’s actually binary digital and not some sort of hybrid system]
and
Is this another industry-specific redefinition of “signal frequency” or “data rate”? I mean, this is a marketing document. Unfortunately, Google and the Wikipedia entry for HDMI confirm all of this data.
I will probably wind up looking really stupid once somebody explains to me what this means, but our HDMI guy is out of the country and this is a good place to ask.
TMDS uses 2 pins per channel and 10 serial bits to represent each 8 bits of data. According to everything I’ve read, HDMI has 3 TMDS data channels sharing the same clock.
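Just to keep that overhead straight in my own head, here’s how I’m picturing it, as a quick Python sketch (the constant names are mine, not from any spec):

# TMDS as I understand it: each 8-bit byte goes out as 10 serial bits,
# and HDMI has 3 such data channels sharing one clock.
BITS_PER_SYMBOL = 10   # bits on the wire per encoded byte
BITS_PER_BYTE = 8      # payload bits per encoded byte
DATA_CHANNELS = 3      # TMDS data channels in one HDMI link

def link_payload_gbps(serial_bit_rate_hz):
    """Payload throughput of the whole link for a given per-channel serial bit rate."""
    return DATA_CHANNELS * serial_bit_rate_hz * BITS_PER_BYTE / BITS_PER_SYMBOL / 1e9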
For simplicity, let’s consider 1080p 8-bit 60Hz. 1080p is 1920 x 1080 pixels, and assuming YUV422 coded data at 2 bytes per pixel, that’s 4,147,200 bytes per frame. You need 60 frames a second since this is progressive. That works out to about 1.99Gbps (roughly 1.85Gbps if you use binary prefixes) just for the YUV data. Audio, etc. adds something on top of that too, but probably not much. According to the chart in the PDF, however, 1080p 8-bit 60Hz requires 4.46Gbps. That’s one heck of an audio overhead. All right, let’s not panic; maybe they’re using YUV444 at 3 bytes per pixel, but that’s still only about 2.99Gbps (2.78 with binary prefixes). It’s puzzling to me how they got their 4.46Gbps figure, but that’s ok, I can live with that.
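For the record, here’s that arithmetic as a little Python sketch (the bytes-per-pixel figures are my own assumptions for YUV422 and YUV444):

WIDTH, HEIGHT, FPS = 1920, 1080, 60

def video_payload_gbps(bytes_per_pixel):
    # raw pixel data only: no audio, no blanking, no encoding overhead
    bits_per_second = WIDTH * HEIGHT * bytes_per_pixel * 8 * FPS
    return bits_per_second / 1e9

print(video_payload_gbps(2))   # YUV422: ~1.99 (about 1.85 if you divide by 2**30 instead)
print(video_payload_gbps(3))   # YUV444: ~2.99 (about 2.78 with binary prefixes)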
What I can’t live with comfortably is that it’s written as “4.46Gbps (148.5MHz)”. These mentions of MHz are throughout both the Wikipedia article and the PDF, and all around Usenet and the Web. I sure hope they don’t mean megahertz, because with a 3-channel TMDS serial interface like HDMI’s you’d need a bit clock of roughly 1.5GHz per channel to hit 4.46Gbps of throughput. However, 4.46Gbps can be achieved with a 148.5MHz clock if you just happen to be moving about 30 bits per clock cycle, by either using 30 parallel data lines or some voodoo (like several discrete voltage levels instead of just two, or some sort of modulation). Now, I’d rather believe that HDMI runs its data lines at around 1.5GHz and the internet is just full of idiots than that it runs at 148.5MHz and they’re using voodoo to squeeze 30 bits per clock onto 3 data lines. However, more than either of those, I’d rather believe I’m missing something simple and this is all just my misunderstanding.
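And here’s the division that leads me to the 30-bits-per-clock puzzle (the 4.46Gbps and 148.5MHz figures are taken straight from that PDF’s table):

LINK_RATE_BPS = 4.46e9   # bandwidth the PDF quotes for 1080p 8-bit 60Hz
CLOCK_HZ = 148.5e6       # the "(148.5MHz)" printed right next to it

print(LINK_RATE_BPS / CLOCK_HZ)   # ~30 bits per clock cycle across the whole link

# Reading it the other way: if each of the 3 data lines really moved only
# 1 bit per 148.5MHz clock, the link would top out at:
print(3 * CLOCK_HZ / 1e9)         # ~0.45 Gbps, nowhere near 4.46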
What’s going on? What is the actual maximum signal clock frequency of an HDMI link, and what is its maximum bandwidth?