Correct me if I am wrong, but as I understand it, high definition television sets are able to display a high definition signal. The high definition signal carries more data than the normal signal used in standard definition broadcasting. Consequently, the high definition signal carries a higher resolution than the standard definition signal.
My question is:
So what is the resolution of the data contained in a high definition signal from a cable TV source? I have noticed that some HD TVs are capable of displaying 720p resolution and others 1080i. Does that mean there is more than one type of HD signal, each corresponding to a different resolution?
What about DVD movies: what is the usual resolution of movies recorded in DVD format? The reason I ask is that if the resolution of DVD movies is lower than 720p or 1080i, then there is no advantage to playing a DVD on an HD TV, since the resolution of the source is too low. Am I right about this?
I am also planning to use an HD TV as a monitor for my computer, using the S-Video output from my graphics card. Will I notice any difference in this situation between a normal TV and an HD TV?
Out of the common cables available in the market (S-Video, VGA, component, HDMI, and the standard yellow/red/white cables), which can carry a high definition signal?
It depends on the channel. Discovery HD Theatre is one of the very few that actually uses a 1080i signal. And even though the shows are boring as shit, sometimes I'll watch it for hours because the picture has me dazzled. It's like looking out my window and watching gorillas sit in the grass and eat, or whatever else happens to be on. The rest mostly use 720p, I believe.
DVD movies look great on the HDTV. But there are HD DVDs, so I don't think normal DVDs are 720p. They are definitely NOT 1080i.
This has to be set up properly. I've noticed that just because the computer has an S-Video out doesn't mean it will display correctly. You have to set the refresh rates and frequencies properly to get it to work right. A normal TV set up properly will look way better than an HDTV you just plug into and hope for the best. There are aftermarket devices that you can plug your S-Video into and then connect to the TV, and everything looks awesome.
4) I think all the ones you listed will carry the signal. HDMI is the best; component is the next one down.
A standard DVD is not HD quality. Two forms of high def DVDs (HD-DVD and BluRay IIRC) are hitting the market now.
Component and the yellow/red/white cables are the same thing. You may be thinking of composite. Composite is your standard coax cable. That’s the lowest quality. In order they go composite, s-video, component, and HDMI. Component will support HD, s-video and composite will not. HDMI is basically DVI (on a PC) plus sound.
The yellow/red/white cables are composite video (the yellow cable) and audio (the red and white cables). The standard coax cable is used for an RF (radio frequency) signal, which must be demodulated into separate audio and video components before it is usable.
Component video also uses 3 cables, but they are usually red, green, and blue, and all three are used for the video signal (no audio). This is also commonly referred to as YPbPr, and it produces a very high quality video signal.
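Since a few posts here rank the connections, here's a rough summary in Python form. The "max" figures are my ballpark for typical consumer gear of this era, not hard spec limits:

# A rough summary of the connection hierarchy described in the last few
# posts. The "max" figures are ballpark, not spec limits.
CONNECTIONS = {
    "coax (RF)": {"video": "modulated RF", "audio": True, "max": "480i"},
    "composite (yellow)": {"video": "single combined analog signal", "audio": False, "max": "480i"},
    "S-Video": {"video": "separated luma/chroma", "audio": False, "max": "480i"},
    "component (YPbPr)": {"video": "three analog signals", "audio": False, "max": "1080i"},
    "HDMI": {"video": "digital (DVI video plus audio)", "audio": True, "max": "1080p"},
}

def carries_hd(name):
    """HD here means 720p or better."""
    return CONNECTIONS[name]["max"] in ("720p", "1080i", "1080p")

for name in CONNECTIONS:
    print(f"{name}: HD-capable = {carries_hd(name)}")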
You will notice a large quality difference between using a TV and an HDTV as a computer monitor. I thought I could save some money by using a TV for a monitor, but it turns out that unless all you want to do is watch movies, and maaaybe play games, it'll be more of a headache (literally) than it's worth, because the low resolution will cause text to be very blurry. As for an HDTV, one capable of 1080i will effectively give you about the same desktop resolution as a 1280x720 monitor, which, spread across anything bigger than a 16 or 17 inch screen, is probably going to look terrible.
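To put numbers on the blurry-text problem, here's a quick pixel-density calculation. The 17-inch monitor and 30-inch TV are just example sizes I picked; plug in your own set's numbers:

import math

def pixels_per_inch(diag_inches, w_px, h_px):
    # Pixel density from the diagonal size and the native resolution.
    return math.hypot(w_px, h_px) / diag_inches

print(f"17in 1280x1024 monitor: {pixels_per_inch(17, 1280, 1024):.0f} ppi")
print(f"30in 1366x768 HDTV:     {pixels_per_inch(30, 1366, 768):.0f} ppi")
# About 96 ppi versus 52 ppi: the TV's pixels are nearly twice as big,
# which is why desktop text looks coarse from monitor distances.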
1080p is not broadcast by any means, cable or over the air. It's too much data to compress within the allotted bandwidth using the codecs and modulation currently in ATSC.
IIRC, 1080p (at least at 60 frames per second) isn't among the 18 formats authorized by the FCC for HDTV (ATSC) in the US.
Some hardware may be able to handle it, and there’s nothing that prevents someone from encoding a DVD in 1080p, but it’s not a standard.
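A back-of-the-envelope calculation shows why the bandwidth doesn't work out. This assumes 4:2:0 chroma subsampling (about 12 bits per pixel) and ATSC's 19.39 Mbit/s channel capacity; treat the numbers as rough:

# ATSC gives a broadcaster about 19.39 Mbit/s of total channel capacity.
def raw_mbps(width, height, frames_per_sec, bits_per_pixel=12):
    # 12 bits/pixel assumes 4:2:0 chroma subsampling.
    return width * height * frames_per_sec * bits_per_pixel / 1e6

for name, (w, h, fps) in {
    "720p60": (1280, 720, 60),
    "1080i (30 full frames/s)": (1920, 1080, 30),
    "1080p60": (1920, 1080, 60),
}.items():
    print(f"{name}: {raw_mbps(w, h, fps):,.0f} Mbit/s uncompressed")
# 1080p60 comes out around 1,493 Mbit/s raw, double 1080i, so at the
# MPEG-2 compression ratios ATSC was designed around it will not squeeze
# into a 19.39 Mbit/s channel.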
Isn’t any set that can project 1080i also capable of 1080p? I thought the limit was in the broadcast or data storage system, not the TV hardware itself.
I have a 720p system and I love it. And I second the comment about Discovery HD. Amazing!
As I understand it, no. At least that’s what I get from reading the magazines in preparation for buying an HDTV set.
Said magazines also had an article saying that at normal viewing distances, the human eye was not capable of distinguishing 1080i or 720p from 1080p. Since there are no commercial sources for 1080p programming and only a few sets even capable of a 1080p input, there’s no point at all in waiting for it to come along or spending the money on a set that has the capability if you can get something you want and can use now.
And all good sets (except maybe the cheapest), again from what I read, will automatically downscale a 1080p signal into a resolution that the TV is capable of. So there's no worry about not being able to use future signals off of HD discs.
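For anyone who wants to check that viewing-distance claim, here's the standard back-of-the-envelope version. It assumes 20/20 vision resolves about one arcminute, and uses a 42-inch 16:9 screen as an example:

import math

def max_full_detail_distance_ft(diag_inches, vertical_px, aspect=16/9):
    # Farthest distance at which one pixel still spans a full arcminute.
    screen_height = diag_inches / math.sqrt(1 + aspect ** 2)
    pixel_height = screen_height / vertical_px
    one_arcminute = math.radians(1 / 60)
    return pixel_height / one_arcminute / 12  # inches -> feet

for lines in (720, 1080):
    d = max_full_detail_distance_ft(42, lines)
    print(f"42in {lines}-line set: full detail only inside ~{d:.1f} ft")
# Roughly 8 ft for 720 lines and 5.5 ft for 1,080; sit farther back and
# the extra pixels fall below the eye's resolving power, which is what
# the magazines were getting at.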
Standard DVD is 480i, though a progressive-scan player can output 480p with little fuss. If you buy anamorphic widescreen DVDs you will find they are far superior when viewed on an HDTV. This is primarily due to the aspect ratio and the fact that your DVD player will not waste a chunk of the vertical resolution on letterbox bars when outputting to a widescreen TV (the arithmetic is sketched below).
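Here's that arithmetic, assuming NTSC's 480 lines and a 16:9 picture stored in a 4:3 frame:

FRAME_LINES = 480      # vertical resolution of an NTSC DVD frame
FRAME_ASPECT = 4 / 3   # the shape of the frame the disc actually stores
FILM_ASPECT = 16 / 9   # target widescreen shape

# Letterboxed disc: the picture sits inside the 4:3 frame and the rest
# of the lines are spent on black bars.
letterbox_lines = round(FRAME_LINES * FRAME_ASPECT / FILM_ASPECT)  # 360
# Anamorphic disc: the picture fills all 480 lines (stored horizontally
# squeezed); the widescreen TV stretches it back out.
anamorphic_lines = FRAME_LINES

gain = anamorphic_lines / letterbox_lines - 1
print(f"letterbox: {letterbox_lines} active lines")
print(f"anamorphic: {anamorphic_lines} active lines ({gain:.0%} more)")
# 480 vs 360: anamorphic carries a third more vertical detail, which is
# the resolution a letterboxed disc throws away on the bars.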
Yes, HDTVs make great computer monitors. Regular TVs don’t.
Only component (the red/green/blue cables, not the red/white/yellow set) and HDMI/DVI can carry HD signals.
Not asked, but came up in a subsequent post: I would definitely buy 1080p today (my TV is 720p) for two reasons:
HD-DVD (or Blu-Ray) will output 1080p and will be widely available soon.
1080p is an easy step up from 1080i for the TV electronics (see the sketch below). You will get a better picture from 1080i sources on a 1080p TV than on a 720p TV. Most over-the-air and cable sources are 1080i; only ABC and ESPN are 720p.
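To illustrate what "easy step up" means: a 1080i signal arrives as two 540-line fields that together already contain all 1,080 lines, so for static content a 1080p set can simply weave them together, while a 720p set has to deinterlace AND rescale 1,080 lines down to 720. This is a toy sketch; real sets use motion-adaptive deinterlacing rather than a bare weave:

def weave(odd_field, even_field):
    # Interleave two 540-line fields into one 1,080-line frame.
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

# Toy fields: each "line" is just a label here.
odd = [f"line {2 * i + 1}" for i in range(540)]
even = [f"line {2 * i + 2}" for i in range(540)]
frame = weave(odd, even)
assert len(frame) == 1080 and frame[:3] == ["line 1", "line 2", "line 3"]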
Are you sure? My cable brochure says that all of the network HD channels are 720p or 480p. The benefit to them is that they can still broadcast a 720p and a 480i signal together on the same channel, or something; I don't have the brochure with me. It listed very few stations as using 1080i. Discovery HD was one of them, and I thought ESPN was one too. And WorldNet HD or something.
One thing about the cable channels like HBO and such… they mostly play movies that were never filmed in HD, right? So basically all we get from those channels is a widescreen, clear (but SD) movie. Right? Not even really HD. But other times they show some stuff filmed in HD. It's usually channel-specific stuff, like "HBO Presents…" or "Behind the Scenes" or "The Making of…" stuff.
Even if you buy a 2160p++ Ultra Plasma screen and you use all Monster Cables and Powerstrips, and you have a Direct fiberoptic link to all the networks…
Comedy Central will still be fuzzy and the picture will look like ass.
So if you watch a shit load of comedy central, you will spend much time debating if your HD package was a total waste of money!!!
K. Yeah I was misremembering. This is what my brochure has to say:
480i- (Square Screen Only) Digital version of current television signals
480p- (Square or Widescreen) Also known as "enhanced definition" - has the same detail as today's television signal but looks sharper. This is similar to progressive-scan DVD players.
720p- (Widescreen Only) The HDTV format used by ABC, ESPN and the WB. This format provides an image just about as good as 1080i, while allowing other 480p signals to be broadcast at the same time from the same station.
1080i- (Widescreen Only) The most detailed image available from broadcast TV. The HDTV format used by NBC, CBS, Discovery, HBO, Starz, and most cable networks.
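For comparison, here are the pixel budgets behind those brochure descriptions. I'm using 704x480 for the SD formats, which I believe is the usual ATSC frame size, though some gear uses 640x480:

FORMATS = {
    "480i": (704, 480, "interlaced"),
    "480p": (704, 480, "progressive"),
    "720p": (1280, 720, "progressive"),
    "1080i": (1920, 1080, "interlaced"),
}

for name, (w, h, scan) in FORMATS.items():
    print(f"{name}: {w}x{h} = {w * h:,} pixels per frame ({scan})")
# 1080i has the most pixels per frame, hence "most detailed image", but
# only refreshes half its lines per pass; 720p trades pixel count for
# progressive scan, which is why the brochure calls them roughly equal.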
Hmmm… so I guess the bigger question is “what resolution was it actually FILMED in?”
Thanks for the answers. I have a few more followup questions:
With regard to using HD TV as a computer monitor, flex727 said that only component and HDMI/DVI (and I presume not S-Video) cables can carry HD signals. Does this mean that when using S-Video cables, I will not notice any difference between SD and HD TVs?
My graphics card also has a DVI output (it's the white-colored one, right?). Does that mean I am better off using the DVI output instead of S-Video to connect to the HD TV? The problem is, my HD TV does not have a DVI jack (it only has component and HDMI inputs). Will a simple adapter convert the DVI output to HDMI?
If your graphics card is made by ATI and supports the conversion feature, there is a specific DVI-to-component dongle you can get for about $25 (this is a non-ATI manufactured version that will save you five bucks). Other than that, converting to component is a pain.
Converting from DVI to HDMI is a lot easier; almost everyone makes adapters that can do what you want done. However, HDMI is the format of choice for a lot of the content providers because it supports an automatic resolution downgrade if your player doesn't know the secret handshake. Today's DVD players, HD-DVD and Blu-Ray players, consoles, and PC DVD drives do not have the "content protection" flag enabled, but once discs start setting it, older equipment that can't do the handshake will be limited to inferior resolutions (to prevent "perfect" digital theft of the source data). I'm avoiding the format altogether wherever possible.
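For what it's worth, the "secret handshake" is the HDCP copy protection, and the per-disc flag is the Image Constraint Token. The logic, as I understand it, works roughly like this; the function and names are just illustrative, not any real API:

# Sketch of the downgrade logic described above. Illustrative only.
def output_resolution(source_res, disc_sets_ict, chain_supports_hdcp):
    # If the disc doesn't set the Image Constraint Token, everything
    # gets full resolution (this is where today's discs are).
    if not disc_sets_ict:
        return source_res
    # Flag set, and the HDCP handshake succeeds: still full resolution.
    if chain_supports_hdcp:
        return source_res
    # Flag set, but the gear can't do the handshake: constrained output
    # (reportedly 960x540 under AACS's ICT).
    return "960x540"

print(output_resolution("1920x1080", disc_sets_ict=False, chain_supports_hdcp=False))  # full res today
print(output_resolution("1920x1080", disc_sets_ict=True,  chain_supports_hdcp=False))  # downgraded later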