HDTV Question: HDMI/DVI cable vs. Component

I have an HDTV (Mitsubishi rear projection CRT) and an HD cable box (Scientific Atlanta 8300HD). At present they are connected with a component cable set, and the picture quality is excellent.

But the cable box has an HDMI output and the TV has a DVI input. HDMI>DVI adapter cables are available for about $25-30.

The GQ question is: does the HDMI>DVI route offer a picture that is theoretically superior to the component video cable? If so, superior in what ways, and how would the theoretical difference be visible?

The IMHO question is, is the difference (if any) visible enough to make it worth the $30 cable? Now, obviously, I’m not a pauper, so it’s not the end of the world if I end up paying $30 in vain. But my suspicion is that either there’s no difference at all, or that if there is, it’s very subtle.

So if you’ve done a side-by-side test of HDMI and component, please let me know what you found. Thanks.

This one has been done to death on the AvsForum site. Component vs. HDMI is one of those “holy wars” like Linux vs. Windows that never seems to get resolved.

To read more opinions than you ever wanted, search www.avsforum.com.

Here’s an example:

http://www.avsforum.com/avs-vb/showthread.php?t=650231

I have a Mitsubishi 52" DLP HDTV. I have a Motorola HD cable box (Comcast). I used component out forever. I work for a low voltage wiring company (we do a lot of home theater, among other things), and got a DVI-HDMI cable from a supplier to try it out. There was no difference whatsoever. I think HDMI might be better for very high quality output devices (upcoming HD DVD, Blu-ray, etc.). I can also see a reason for it if you need to send digital audio to two devices (use HDMI to carry one signal, and the regular digital audio output to the other), but DVI doesn’t output audio, so that’s no benefit here.

My understanding is that HDMI is a superset of DVI: DVI carries only video, while HDMI carries audio, video, and control signals.

DVI can carry both analogue and digital video. The analogue video is RGB component video; this is why DVI to VGA adapter cables can work. If the DVI cable is carrying analogue video, the VGA adapter simply presents the analogue video to a VGA connector and does not pass the digital signals.

Wikipedia on DVI signals and connectors:

Wikipedia on HDMI signals and connectors:

As mentioned, the HD video signal may be encrypted by HDCP. If the destination cannot decrypt the HD signal, it may be able to display a lesser-quality unencrypted version. Wikipedia on HDCP:

Almost, except that the analog RGB VGA format is not the same thing as the component video inputs/outputs on recent TVs and DVD players. Component video is most commonly “YPbPr”, which maps onto a different colour space than RGB does. I’m not sure what happens if you feed a VGA signal into a component input, or vice versa, but they’re not the same thing.
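
For the curious, here’s a rough sketch of what that mapping looks like. The exact coefficients depend on the standard (Rec. 601 for SD, Rec. 709 for HD); the numbers below are the Rec. 709 ones and are meant as an illustration, not as a description of what any particular cable box does:

```python
# Illustrative R'G'B' -> Y'PbPr conversion (Rec. 709 luma coefficients).
# Values are normalized to 0.0-1.0; real hardware adds offsets and clamping.

def rgb_to_ypbpr(r, g, b):
    """Convert normalized R'G'B' to Y'PbPr using Rec. 709 coefficients."""
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # luma (brightness)
    pb = (b - y) / 1.8556                        # blue-difference chroma
    pr = (r - y) / 1.5748                        # red-difference chroma
    return y, pb, pr

# Pure red in RGB becomes mostly Pr with a modest luma component:
print(rgb_to_ypbpr(1.0, 0.0, 0.0))   # -> (0.2126, -0.1146..., 0.5)
```

So feeding an RGB (VGA-style) signal into a YPbPr input would scramble the colours rather than just dim or shift them.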

Drat. I guess that goes to show me that you can’t assume anything these days. I’ll shut up now. :slight_smile:

Do what I did - buy a cable (making sure you can return it), hook up both the component and DVI from the cable box to the TV. You should be able to switch between the 2 inputs on the TV, or even put them into split screen, which is how I did it. I didn’t notice a difference in quality, though the colors were rendered slightly differently between the TV and cable box decoded signal. I wound up keeping the cable box on HDMI since I needed the 2 component inputs on the TV for the DVD player and Xbox.

I think it’s a holy war because there is no clear-cut answer. Somewhere the digital signal has to go through a DAC. If the DAC in the TV is significantly better than the one in the cable box, then the HDMI signal to the TV will look better. If the cable box DAC is better, the component signal will look better.
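
To make the reasoning concrete, here’s a toy sketch (not a model of real video hardware, and the voltages/noise figures are made up for illustration): the HDMI path hands the pixel value to the TV unchanged, while the component path runs it through a DAC, picks up a little analog noise in the cable and switcher, and gets re-digitized by the TV’s ADC.

```python
import random

def hdmi_path(pixel):
    # Digital end to end: the TV receives exactly what the box sent.
    return pixel

def component_path(pixel, noise_mv=2.0):
    volts = pixel / 255 * 0.7                          # box DAC: 0-0.7 V swing
    volts += random.gauss(0, noise_mv / 1000)          # analog noise / losses
    return max(0, min(255, round(volts / 0.7 * 255)))  # TV ADC re-quantizes

sample = [0, 64, 128, 200, 255]
print([hdmi_path(p) for p in sample])
print([component_path(p) for p in sample])   # usually off by a code value or so
```

Whether that last-digit wobble is ever visible is exactly what the holy war is about.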

Thanks for all the replies. After posting this, I realized that because my A-V receiver only switches component video (not HDMI or DVI), I would have to cut it out of the loop to use the HDMI output of the cable box. Not very practical.

Nevertheless, I found a cable for $20 on the Internets (my local Radio Shack wanted $50 for a 6-foot cable, and didn’t have the 10-foot length I needed), and I’ll try it out just to see if it makes any discernible difference.

Thanks again.

Actually, with DLP TVs, the signal never enters the analog domain (until it reaches your retina, or perhaps your brain), if the source is digital.

You’re switching analog HD signals through your receiver? I wouldn’t worry about which cabling is going to give you a better signal then, 'cause you’re probably already not getting as good a picture as you could from component.

What makes you say that, and what kinds of problems do you think putting the component signal through the switcher could cause?

And what alternative would you suggest, when I have two component sources and one standard video source (cable and DVD, plus VCR)? I could switch the video inputs at the TV and the audio inputs at the receiver, but that would be a pretty big pain in the butt. How do you handle it?

Well, I’d say that once light is being projected with varying intensities, it’s now analog data, but I get your point. But yes, since the DLP set has to convert the analog signal of the component cables back to digital before displaying it, keeping the signal digital as long as possible is probably the better option.

Any extra breaks/connections between the source and the display will introduce some degradation in signal quality. A poorly designed switch can introduce a lot of signal loss.

I understand that that’s true in theory, but we’re talking about a mid-range A-V receiver designed to switch component video. I haven’t done an A-B comparison of signals straight from the cable box compared to running through the receiver (which would require buying a second very long component cable), but I doubt there’s serious visible degradation. But I’ll look for indications if someone will describe them precisely.

Analog signals are always subject to degradation, and even the highest end components cannot completely eliminate that. I have a Pioneer Elite receiver, and I still bypass it for the video signals. The newer Pioneer receivers, however, will switch HDMI, which I would definitely use if I had it. The whole rationale behind high end cables is related to signal losses, although of course marketers have gone far beyond what is rational or reasonable with their claims and costs. Years ago, I discovered that if I ran my computer monitor at 1600x1280 through my KVM switch, I got ghosting of the desktop. This despite the manufacturer’s bandwidth claims that the box should have been sufficiently powerful.

Followup report: I got the HDMI-DVI cable, hooked it up, and compared it to the signal through the component cable. The HDMI-DVI signal went straight from the cable box to the TV, and the component signal passed through my JVC AV receiver.

As I expected, there was no perceptible difference. Resolution did not seem to be better, and dynamic range and color appeared to be pretty much identical, at least to my eyes. I didn’t have test signals or instruments to measure any differences. But then, I don’t watch test signals or enjoy TV shows via instruments.

Last word: with my signal source and display, HDMI/DVI is no better than component.

Thanks again to everyone for your advice.

FWIW, I had a similar problem when I bought my Sony LCD. I was pretty disappointed with S-video and a regular coax connection. Months later I gave in and got a good HDMI cable (I was sure it wouldn’t make a difference) and now the TV is awesome. When I go to the HD channels, I’m very happy–the difference is almost startling and everything I had hoped for when I bought the TV last year.

I think that perhaps when the new 1080p stuff starts getting out there, if you have a 1080p TV, there may be a discernible difference then (Blu-ray, HD DVD, PS3, etc.).

Light is NOT being projected with varying intensities from a DLP television. Grayscale is achieved by discrete pulse-width modulation and is entirely digital.
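
Rough sketch of the idea, if it helps (real DLP chips use binary-weighted time slices and dithering; the equal-slot version below is just to show the principle):

```python
# Toy illustration of pulse-width-modulated grayscale: within one frame each
# mirror is only ever fully "on" or fully "off", and the perceived brightness
# is the fraction of the frame it spends on.

def mirror_states(level, slots=8):
    """Return on/off states for one frame, given an 8-bit brightness level."""
    on_slots = round(level / 255 * slots)
    return [1] * on_slots + [0] * (slots - on_slots)

for level in (0, 64, 128, 255):
    states = mirror_states(level)
    print(level, states, "perceived ~", sum(states) / len(states))
```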

Not surprising when you consider that S-video and (RF modulated) coax cannot carry an HD signal. HD must use component, DVI, or HDMI.