My HD-TiVo has one HDMI output, and one component-video output. Is the image quality on component video lower than that of HDMI? Or are they both equal quality?
Theoretically they’re the same, I believe. HDMI has other benefits, though, like being able to carry audio signal on the same cable as video, and being able to communicate with your TV to tell it what aspect ratio is best for the program being played.
My experience with my Sanyo Z3 projector (HD-ready 720p) is that the picture quality from my upscaling DVD player (Philips DVP-5990) is better through the HDMI than the component. So it was worth the cost of the HDMI cable for me. I assume that someone with more knowledge will come along and be able to explain generally if that’s the case, or if it’s just my equipment combination.
Ok, so that probably didn’t really help you…
The other convenience of HDMI is that it’s a smaller cable than component (even without factoring in the RCA audio cables you need as well). Spend the $4 on HDMI.
The limiting factor for a lot of people is the number of HDMI inputs you have on your TV.
It’s probable that measured, objective video quality is lower on component cables than on HDMI cables, because component cables carry analogue signals instead of digital signals. I never see a subjective difference, though, and am dubious of reports from people who claim that they can, except in certain circumstances (really bad cables, really long runs, interference). Heck, there are people who claim they can see quality differences amongst brands of HDMI cables. :rolleyes:
HDMI also allows HDCP, which may or may not be something that you’re for or against.
The factor for us is this wireless HDTV transmitter, which has a single HDMI input (and output on the receiver). Transmitter is to send signal to bedroom TV. TiVo in living room currently connects to living-room TV via TiVo’s single HDMI output. TiVo’s single HDMI output would need to go to the transmitter, which leaves the TiVo’s component-video output for the living room TV.
If component video’s signal quality is significantly degraded (as compared to what we’ve been seeing on HDMI), I suppose I can buy an HDMI splitter.
A splitter will work fine, but I don’t think you’ll notice a difference between HDMI and component. If you do, it definitely won’t be significant.
And that transmitter is friggin’ sweet.
I’ve got a splitter with an Xbox and PS3 going into it. Playing games, watching Blu-rays, etc. all work and look perfect. Never had any problems whatsoever.
They switch when you turn something on. Nice piece of simple kit that has never let me down.
That’s a switch (2 in, 1 out), not a splitter (1 in, 2 out).
When I got Directv they told me that component can only handle 720p. For 1080p you need HDMI - but I’d read this many other times as well.
You can notice the difference between component and HDMI, but only if you have a very sharp eye or they are on different screens right next to each other. 1080p is definitely sharper, but you don’t really notice the “degradation” with 720p unless you are looking for it.
Oh, the reason that TiVo has only one HDMI port is probably because satellite and cable, I think, max out at 720p. Over-the-air ATSC signals I think can broadcast at 1080p, but I’m not sure how many actually do. The only common source of 1080p is Blu-ray and Blu-ray rips.
Actually, this is what I just installed at home (but haven’t had a chance to play with it yet): I’ll see your splitter and raise you a switch.
HDMI will also allow something called Audio Return Channel (ARC), but only on the newest equipment.
You can use it, for example, to send audio from the TV back to a connected A/V receiver. So no more optical cable.
Our Cable provides most of the HD channels at 1080i (but not 1080p). The TiVo passes these on to the TV unaltered through its HDMI output.
As it happens, the bedroom TV is only capable of 720 (not sure whether i or p).
HDMI splitters? Most of the new TVs I’ve seen have an inconceivably high number of inputs. Helped a buddy set up an entertainment center with a TV that has 2 component, 2 RCA and 6 HDMI inputs, divided between the side and the back. I can’t possibly imagine needing more than that…
I have one. I bought my set a year ago.
If you go with HDMI, please, don’t be taken in by the marketing nonsense accompanying those ridiculously-overpriced fancy cables - gold connectors, special insulation, whatever. It’s all bullshit.
It is logically impossible for them to offer better picture quality than a cheap HDMI cable fished out of a bargain bin.
Because HDMI transmits images and audio digitally, slight degradation in signal strength has no effect on the picture or sound quality, so long as the data gets to your TV intact. If it doesn’t, the signal simply won’t work at all, or will abruptly cut in and out.
As with any digital transmission, it either works or it doesn’t. There’s no middle ground.
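A toy sketch of that all-or-nothing behaviour (my own illustration, not how HDMI’s actual TMDS encoding works): a digital receiver thresholds each bit, so any noise smaller than the decision margin is simply discarded, while an analogue link carries whatever noise it picked up straight into the picture.

```python
import random

random.seed(0)

def transmit_analog(samples, noise):
    """Analogue link: whatever noise is picked up stays in the signal."""
    return [s + random.uniform(-noise, noise) for s in samples]

def transmit_digital(bits, noise):
    """Digital link: each bit is sent as 0 V or 1 V; the receiver
    thresholds at 0.5 V, so noise below the margin is discarded."""
    received = []
    for b in bits:
        voltage = float(b) + random.uniform(-noise, noise)
        received.append(1 if voltage >= 0.5 else 0)
    return received

bits = [1, 0, 1, 1, 0, 0, 1, 0] * 4
noise = 0.3  # well inside the 0.5 V decision margin

# Digital: recovered bit-for-bit, so a cheap cable looks identical to
# an expensive one as long as the noise stays under the margin.
assert transmit_digital(bits, noise) == bits

# Analogue: the same noise remains in the delivered signal.
analog = transmit_analog([float(b) for b in bits], noise)
print(max(abs(a - b) for a, b in zip(analog, bits)))  # up to 0.3 of error
```

Once the noise exceeds the margin, bits start flipping outright, which is why a failing HDMI link cuts out rather than getting gradually snowier.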
How big a TV is it, if you don’t mind me asking? As a video game junkie, the number of ports (and their location) would be the first thing I would look at if I was considering investing in a new TV.
That is unclear. Component can handle 720P and 1080i. 1080P requires HDMI.
The static resolution of 1080i and 1080p is exactly the same - 1080 lines. But the temporal resolution of 1080p is twice that of 1080i. What I mean is that with 1080i, only 540 lines are drawn in one sixtieth of a second - first the even lines, then the odd lines - so a complete 1080-line frame takes a thirtieth of a second. With 1080p, all 1080 lines are drawn in one sixtieth of a second.

But the difference is mostly academic, as virtually all film is shot at twenty-four frames a second. A TV set with good deinterlacing (a circuit that takes the two 540-line “fields” and converts them into one 1080-line “frame”) will produce an equally good picture with film-sourced material.

Virtually all HD television is either 1080i or 720p. 720p does a better job on sports because it displays sixty full 720-line frames per second. The temporal resolution of 720p is greater than 1080i’s - there is more motion information in one second of 720p video than in one second of 1080i video. That is why ESPN went with 720p.
I have seen true 1080P television, and it looked amazing. But you would need a really huge set and a very discerning eye to tell the difference between that and well-deinterlaced 1080i.
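The field/frame arithmetic above can be sketched in a few lines of Python (illustrative numbers only; real broadcast bandwidth also depends on horizontal resolution, chroma subsampling, and compression):

```python
# Lines delivered per 1/60 s and complete frames per second for each format.
formats = {
    # name: (lines per pass, passes per second, passes per full frame)
    "720p":  (720, 60, 1),   # 60 full 720-line frames every second
    "1080i": (540, 60, 2),   # 60 fields/s; two fields make one 1080-line frame
    "1080p": (1080, 60, 1),  # 60 full 1080-line frames every second
}

for name, (lines, passes, per_frame) in formats.items():
    full_frames = passes // per_frame
    print(f"{name}: {lines} lines per 1/60 s, {full_frames} full frames/s")
# 720p:  720 lines per 1/60 s, 60 full frames/s
# 1080i: 540 lines per 1/60 s, 30 full frames/s
# 1080p: 1080 lines per 1/60 s, 60 full frames/s
```

So 1080i wins on spatial detail per frame, while 720p delivers twice as many complete frames per second, which is the trade-off the post above describes.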
It’s been a year since I shopped for HDMI cables for our big TV, but ISTR that there were two data rate standards (don’t remember the names of the standards). If your cable was rated for the higher rate, then it was basically guaranteed to handle 1080P, regardless of price, so yes, I went with the cheapest stuff I could find.
Interestingly, if an HDMI cable was rated for the lower rate, it would definitely handle that lower rate, and would probably handle the higher rate; the difference in most cases was that the manufacturer didn’t want to pay the cost for certifying the cables at the higher data rate, even though they usually would have passed.