Really? I was informed that the only source of 1080 right now is blu-ray, but that it was headed that way. I know Comcast is only 720.
No, that is incorrect. Every cable and satellite system I have ever encountered passes the resolution of the original broadcaster. Some channels are 480i, some are 720P (ESPN and ABC) and the rest of the HD channels are 1080i. Some pay-per-view programs on DirecTV are supposedly 1080P.
See my post above. There is nothing “only” about 720P. For live broadcast it is superior to 1080i.
Update: BluRay is the most readily available source of 1080P.
How big a TV is it, if you don’t mind me asking? As a video game junkie, the number of ports (and their location) would be the first thing I would look at if I was considering investing in a new TV.
47", 1080p LCD. One is all I need. I have an Onkyo 605 receiver that has 2 HDMIs in (PS3 only at the moment), and one out. My cable goes via component, with an optical audio out to the receiver when I want to turn on surround sound.
If you think about it, 1080i is really 540p, since each field only gives you half of the display lines. If you don’t have a lot of sub-one-second action scenes and you have good deinterlacing, you probably can’t see much of a difference between 1080i and 1080p, but for action scenes, if you have the eye for it, there should be a significant difference.
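To make the field/frame point concrete, here’s a minimal Python sketch (numpy-based, with made-up field data - not anyone’s actual deinterlacer) of the two simplest approaches: “bob”, which line-doubles a single 540-line field, and “weave”, which interleaves two fields captured a sixtieth of a second apart. Weave is perfect for static scenes and combs on motion, which is exactly where 1080i gives up detail.

    import numpy as np

    def bob(field):
        # Line-double one 1920x540 field to 1080 lines; each field alone
        # really does carry only half the vertical detail.
        return np.repeat(field, 2, axis=0)

    def weave(top_field, bottom_field):
        # Interleave two fields into one 1080-line frame. Static scenes come
        # out perfect; moving objects comb because the fields are 1/60 s apart.
        frame = np.empty((1080, 1920), dtype=top_field.dtype)
        frame[0::2] = top_field
        frame[1::2] = bottom_field
        return frame

    top = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    bottom = np.random.randint(0, 256, (540, 1920), dtype=np.uint8)
    print(bob(top).shape, weave(top, bottom).shape)  # (1080, 1920) for both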
See my link from Dr. Alvy Ray Smith.
The reason we have both 720P and 1080i (as well as the newer 1080P) is that the computer guys were pushing 720P and the TV manufacturers were pushing 1080i. The computer side favored progressive because progressive signals compress better than interlaced ones, and because all flat panel displays are natively progressive.
But the TV makers, who were all making direct-view CRT sets or CRT-based rear-projection sets at the time, could push their existing CRT technology to (barely) display 540 interlaced lines in one sixtieth of a second. That, and the compressors of the time couldn’t cram a 1080P signal into the 6 MHz bandwidth of a broadcast channel.
The FCC decided to support both.
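Some rough arithmetic behind that bandwidth point - the 19.39 Mbps payload of an ATSC 6 MHz channel is a real figure, but the 8-bit 4:2:0 assumption for the raw signal is just mine to keep the numbers simple:

    CHANNEL_BPS = 19_390_000  # ATSC payload of a 6 MHz broadcast channel

    def raw_bps(width, height, frames_per_sec, bits_per_pixel=12):
        # 12 bits/pixel ~ 8-bit 4:2:0 video (an assumption for illustration)
        return width * height * frames_per_sec * bits_per_pixel

    formats = {"720p60": (1280, 720, 60),
               "1080i60": (1920, 1080, 30),   # 30 full frames sent as 60 fields
               "1080p60": (1920, 1080, 60)}
    for name, (w, h, fps) in formats.items():
        ratio = raw_bps(w, h, fps) / CHANNEL_BPS
        print(f"{name}: needs roughly {ratio:.0f}:1 compression to fit")
    # 1080p60 needs about twice the squeeze of the other two, which the
    # MPEG-2 encoders of the late 1990s could not deliver cleanly.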
If you are not familiar with the technology, 1080i sounds like it is better than 720P, but once you account for interlacing - each field carries only half the lines, captured a sixtieth of a second apart - the number of different pixels one actually sees in a second of 720P HD video is greater than the number one sees in a second of 1080i video.
Really though, unless you know exactly what to look for, you won’t see a difference. If you like, I can visit your home and show you exactly what to look for and ruin TV for you forever. And I’ll explain about kerning as well.
Comments on a bunch of other replies…
If you are connecting a digital source (for example, a Blu-ray player) to a digital display, use a digital (HDMI or DVI) cable. You may or may not notice any difference, but the digital cable has only a single connector compared to component’s 3. And if you’re using the display for sound, too - as opposed to a separate stereo - the audio travels on the HDMI cable as well, saving another 2 connectors.
I have a 65" commercial Panasonic plasma in my bedroom. It has a bazillion inputs (which are actually on plug-in cards, so the panel can be configured to a particular customer’s needs). One of the plasma’s HDMI inputs goes to a PC which outputs 1920x1080 (in other words, 1080P). There is a very noticeable difference between analog and HDMI, mostly because of 1-pixel-wide fonts. The display is a Panasonic TH-65PF10UK (I think they’re up to PF12UK these days).
If you have an analog source (for example, VCR or Laserdisc), connect it using the best analog method both devices support. In the case of Laserdisc, that’s probably S-Video - depending on which device has the better comb filter. Of course, Laserdisc players are a PITA to hook up - particularly the combo LD/CD/DVD units. Not all Laserdiscs have digital audio, so you need both the digital and analog audio connected. DVDs will play via the component connectors, but Laserdiscs will only give you black-and-white on the component connectors, so you need S-Video or composite video.
DirecTV’s On Demand (or Cinema, whatever they’re calling it this week) offers some 1080P programming. There’s a 1080P symbol in the guide listing for these.
Due to content provider restrictions (which could be the subject of a GD topic), you may find some combinations of player and content that won’t let you have HD (> 480i) video on analog outputs, and/or require HDCP for digital outputs. Switches, splitters, and the like add a lot of extra complexity to getting HDCP working properly.
There are devices out there that can’t/won’t output both analog and digital at the same time, either for technical or other reasons.
gaffa, a few questions:
- Are you more likely to see compression artifacts on a 720p set? (Regardless of source - let’s say you watch an equal amount of 720p/1080i/1080p sources from perfect signals).
- Given the same sources as above, how do they fare in terms of picture quality on either a 720p or 1080p set?
If you like, I can visit your home and show you exactly what to look for and ruin TV for you forever. And I’ll explain about kerning as well.
You mean ruin TV for me more than LOST already has? Hmmm. Methinks this is not possible.
gaffa, a few questions:
- Are you more likely to see compression artifacts on a 720p set? (Regardless of source - let’s say you watch an equal amount of 720p/1080i/1080p sources from perfect signals).
All things being equal, with the same amount of bandwidth being devoted to both signals, you will see fewer compression artifacts on a 720P source. And of course the 1080i source will also have interlacing artifacts.
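A quick back-of-the-envelope sketch of why, assuming a 19.39 Mbps broadcast channel (the specific bitrate is just an illustration):

    CHANNEL_BPS = 19_390_000  # assumed broadcast payload, bits per second

    luma_720p60 = 1280 * 720 * 60    # 60 full progressive frames per second
    luma_1080i60 = 1920 * 540 * 60   # 60 fields of 540 lines per second

    print("720p60 :", round(CHANNEL_BPS / luma_720p60, 3), "bits per luma sample")
    print("1080i60:", round(CHANNEL_BPS / luma_1080i60, 3), "bits per luma sample")
    # 720P gets a slightly bigger bit budget per sample, and progressive
    # frames also compress more efficiently than interlaced fields.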
- Given the same sources as above, how do they fare in terms of picture quality on either a 720p or 1080p set?
It depends. Are we talking about film-originated material or video? A 720P signal should scale very well to a 1080P display. But film material, due to the much lower frame rate of 24 fps, may well deinterlace perfectly to a 1080P display.
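Here’s a toy Python sketch of the 3:2 pulldown cadence and its inverse. The frame labels are made up, and real pulldown works on top/bottom fields rather than whole-frame copies, but the arithmetic is the point: 24 film frames map onto 60 fields with nothing thrown away, so a cadence-aware deinterlacer can hand a 1080P display the original progressive frames.

    def pulldown_32(film_frames):
        # 3:2 cadence: alternate film frames are held for 3 fields, then 2.
        fields = []
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields

    def inverse_telecine(fields):
        # Collapse runs of fields that came from the same film frame.
        frames = []
        for f in fields:
            if not frames or frames[-1] != f:
                frames.append(f)
        return frames

    film = [f"frame{i}" for i in range(4)]   # 4 film frames
    fields = pulldown_32(film)
    assert len(fields) == 10                 # 24 frames/s -> 60 fields/s
    assert inverse_telecine(fields) == film  # recovered exactly, hence "perfect"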
If you go with HDMI, please, don’t be taken in by the marketing nonsense accompanying those ridiculously overpriced fancy cables - gold connectors, special insulation, whatever. It’s all bullshit.
It is logically impossible for them to offer better picture quality than a cheap HDMI cable fished out of a bargain bin.
Because HDMI transmits images and audio digitally, slight degradation in signal strength will have no effect on the picture or sound quality as long as the data gets to your TV intact; if it doesn’t, the link simply won’t work at all, or will abruptly cut in and out.
As with any digital transmission, it either works or it doesn’t. There’s no middle ground.
Not necessarily true. BIG DISCLAIMER: this isn’t justification to run out and buy expensive Monster cables!
Our modern concept of digital communications, as it applies to computers, also involves two-way communication. This permits handshaking, data checksumming, and the ability to retransmit bad packets. None of that applies over HDMI. Other than things like the audio return channel and some limited control signals, the audio/video data is a one-way street. If something gets corrupted, you just have to accept it, and it will show up as snow or artifacts on your screen. The severity of the signal loss determines how much of this you see.
It’s more correct to say this: “The cheapest acceptable cable is good enough. Once you achieve a perfect signal, no increase in cable quality can improve on it.”
Certainly most people connect their Blu-Ray player to the television that’s right next to it, with, say, a six-foot cable. A Monoprice cable is not likely to let you down, and I can practically guarantee that a Monster cable can’t improve anything in those circumstances. But then you have people with long, long runs of cable, where the physics of high-frequency signals really, truly can cause the problems I describe above (warning: still not vindicating Monster Cable here!). In those cases the maxim still holds - the cheapest acceptable cable is good enough - but the cheapest acceptable cable may legitimately be a higher-quality one.
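For a feel of the scale, here’s a back-of-the-envelope Python sketch. The bit-error-rate figures are made-up illustrations, not measurements of any real cable; the point is how quickly a marginal long run goes from effectively perfect to visibly sparkly.

    WIDTH, HEIGHT, BITS_PER_PIXEL = 1920, 1080, 24   # uncompressed RGB frame

    def expected_bad_pixels(bit_error_rate):
        # Pessimistically assume every flipped bit lands in a different pixel.
        return WIDTH * HEIGHT * BITS_PER_PIXEL * bit_error_rate

    for ber in (1e-12, 1e-9, 1e-6):
        print(f"BER {ber:.0e}: ~{expected_bad_pixels(ber):.3g} corrupted pixels per frame")
    # A short in-spec cable lives at the first line (no visible errors at all);
    # only a genuinely marginal run creeps toward the last one, and past
    # "error-free" no cable, however expensive, can add anything.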
I use a combination of Monoprice and Blue Jeans cables. I absolutely love the Blue Jeans cables because I have a lot of experience with the Belden bonded pairs in industrial high-speed data applications. It’s not so much a “quality” benefit per se as an installation benefit (again, long runs!).
As an HDTV owner since 2000, all I can say is that HDTV sure is fun for people to figure out - and VERY confusing for most.
IMHO, the only advantage to HDMI is with Blu-Ray or upconverting DVD players. Generally they will not provide an HD signal over component.
For cable, satellite, or broadcast tuners (a rare beast these days), component is as good as HDMI and a lot less hassle in general (HDMI can have compatibility problems between the cable/satellite box and the TV).