High Definition TV

Wrong. As stated, film has a higher resolution than HD. Some of HBO HD’s content is mastered in HD from film. Looks FANTASTIC. Other content is upconverted to HD from a standard-definition master of the film. While technically it is an HD signal (they are sending you all the lines of resolution required for your HDTV), it doesn’t look anywhere near as good as true HD. It does, however, look better than watching SD signals on your HDTV. It’s like comparing regular (interlaced) DVD pictures with progressive-scan DVD pictures. There is still a noticeable difference.

And I agree that there is no reason to wait on getting a 1080p TV. There is a noticeable difference in the smoothness and absolute realism when compared to 1080i or 720p sets. No… cable does not broadcast 1080p, but the sets upconvert cable’s 1080i to 1080p. And just as there is a distinct difference between progressive and non-progressive DVD signals, there is also a noticeable difference between progressive and non-progressive 1080 HD signals. And with Blu-ray and HD DVD due for mainstream U.S. release in the next few months, why wait? Your new TV is likely to last 15 years. You really gonna replace it with a 1080p set later?
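To make “the set upconverts 1080i to 1080p” a little more concrete, here’s a minimal sketch of the simplest kind of deinterlacer (a “bob”-style line interpolator) in Python. It’s purely illustrative: real sets use much fancier motion-adaptive processing, and the shapes and values below are just stand-in assumptions for one 540-line field of a 1080i frame.

```python
import numpy as np

def bob_deinterlace(field: np.ndarray) -> np.ndarray:
    """Expand one 540-line top field of a 1080i signal into a 1080-line
    progressive frame by interpolating the lines the field doesn't carry.

    field: (540, width) array of samples from a single field.
    Returns a (1080, width) progressive frame.
    """
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=np.float64)
    frame[0::2] = field                            # lines the field actually carries
    frame[1:-1:2] = (field[:-1] + field[1:]) / 2   # synthesize missing lines by averaging neighbours
    frame[-1] = field[-1]                          # nothing below the last line, so repeat it
    return frame

if __name__ == "__main__":
    one_field = np.random.rand(540, 1920)          # stand-in for one field of a 1920x1080i picture
    print(bob_deinterlace(one_field).shape)        # (1080, 1920)
```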

Outputting from a computer to a TV using S-video looks like crap. Nowhere near HD for video sources, and your everyday computer use will drive you up the wall. Text will be fuzzy, nothing will look as sharp and clear as it should, and you’ll need the economy-size bottle of ibuprofen to use your computer for more than 20 minutes. Use the DVI output if you have it.

Also wanted to point out that ESPN HD blows. They created the channel saying that it would be 100 percent HD content, and 90 percent of it is upconverted to HD. Doesn’t look quite as good. They noticed that, so now they broadcast the grey bars on the sides of their signal to shrink the frame so it doesn’t look as soft. You know the bars you get on the sides when you’re playing with widescreen modes? The bastards broadcast them, so you have them even in full-screen mode. I’d much rather watch Discovery HD Theatre. It’s like real life (only better!) :stuck_out_tongue:

Re 35mm movie film resolution vs HDTV, the answer is complicated. The perceived resolution is often not better for 35mm film. It’s true a 35mm still slide (say Kodachrome 25) can be extremely high resolution, possibly 10 megapixels. However, 35mm movie film stock is much lower resolution, and what counts is the actual perceived resolution when you watch it. This has been studied, and the final resolution can be about 750 lines, roughly equal to 720p or 1080i.

Re an HD display that natively displays 1080p vs 720p: if you can afford it, a 1080-native display could in theory produce better images from 1080i source material. However, several popular sources are in 720p: ABC, ESPN, and Fox. A 1080-native display probably won’t look any better than a 720-native display for that material.

I don’t think any 1080i providers are actually transmitting full-bandwidth 1920 x 1080 at 30 fps; instead they send a cut-rate version using 1440 x 1080. This is due to limitations in current HD camera equipment and the distribution chain’s inability to handle compression at full bandwidth. By contrast, providers of 720p (1280 x 720 at 60 fps) can handle the full bandwidth of that format, since the data rate is lower.

The bottom line is that, with current equipment, over-the-air, satellite, and cable HD providers are only transmitting 1080i frames at 1440 x 1080 resolution, or about 1.55 megapixels per image. At 30 images per second, that’s a data rate of 46.6 megapixels/sec.

By contrast, 720p images are 1280 x 720 for 921k pixels per image. At 60 images per second, that’s a data rate of 55.3 megapixels per second.

So current 1080i still images have more pixels, but the overall data rate is lower. Thus when considering movement (panning, zooming, object movement) the effective delivered resolution may be less for 1080i as currently implemented.

The full 1080i implementation is 1920 x 1080, for 2.07 megapixels per image, at a data rate of 62.2 megapixels per second. However, it’s unclear when content providers will be able to transmit this, possibly not for a long time.
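If you want to check the arithmetic behind those figures, it’s just pixel count times frame rate. Here’s a quick Python snippet that multiplies out the three cases discussed above (this is only a sanity check of the numbers, not a claim about what any particular provider actually transmits):

```python
# width, height, frames per second for each case discussed above
formats = {
    "1080i as currently transmitted (1440 x 1080 @ 30)": (1440, 1080, 30),
    "720p (1280 x 720 @ 60)": (1280, 720, 60),
    "full 1080i (1920 x 1080 @ 30)": (1920, 1080, 30),
}

for name, (width, height, fps) in formats.items():
    pixels_per_frame = width * height
    pixels_per_second = pixels_per_frame * fps
    print(f"{name}: {pixels_per_frame / 1e6:.3f} Mpx/frame, "
          f"{pixels_per_second / 1e6:.2f} Mpx/s")
```

Running it gives roughly 1.555 Mpx/frame and 46.66 Mpx/s for today’s 1080i, 0.922 Mpx/frame and 55.30 Mpx/s for 720p, and 2.074 Mpx/frame and 62.21 Mpx/s for full 1080i, which is where the figures above come from.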

If you’re buying an HD display and can afford the price difference of a native 1080 display vs. a native 720 display, why not get it? However, the overall image quality may not be much different, especially for ABC, ESPN, and Fox material.

Whenever true HD quality discs and players are widely available, and if those are in 1080 format, you might see a more significant improvement with a native 1080 display. That will probably happen long before the various HD broadcasters are using the full 1080i bandwidth and resolution.

Unless you meant bars at the top & bottom, instead of the sides, you need to either change the output mode of the DVD player to widescreen TV or change the picture mode of your TV to Wide.

Absolutely right. Doing this with S-Video sucks a ton. I did get my computer hooked up to my Sony HD via an HD-capable video card outputting DVI to a DVI input on my TV. Sony does not recommend this on my model, and I was probably taking a big risk.
I had to do a lot of tweaking of display resolutions and such using Power Strip, and it took a long time to get it close to right, but I don’t believe that it will ever be what I would like it to be.
Consider how far away from the display you would sit when watching normal television, and how far away you sit when looking at a computer monitor. Quite a difference.

Okay, I totally misread Shai’tan’s post. Sorry about that - ignore my response.

Not completely wrong.

The issue of taking a filmed episode of, say, Sex & The City and transferring it to HD as opposed to SD has a lot to do with how it was originally shot. If you frame for common headroom and protect 16x9, then you can indeed transfer straight to a scanned HD image and broadcast a nice show.

If, OTOH, you shot in classic 4x3 and did not frame for 16x9, then you have delivered framing that is completely unusable for the HD aspect ratio. Therefore, the image could be scanned in as HD but would have to be reframed and blown up some (potentially degrading the quality of the frame).

This would depend upon what the series was and when it was originally shot and how it was framed when it was shot. If the camera originals exist, and the negative can be freshly transferred straight up to HD then yeah baby, lovely stuff prevails.

Cartooniverse, professional camera operator on both film and HD and SD video cameras.

Why is the coax cable such a shitty HDTV choice for sticking into the tv, when that’s the cable coming out of the wall?

The coax cable coming from the wall contains all of the channels you subscribe to, either cable or satellite, which are decoded (or demodulated) into video and audio on their respective channels.

In order to get the signal on coax to your TV, the video and audio need to be re-combined and remodulated (usually on channel 3 or 4), then your TV set again demodulates the signal and separates it out into video and audio.

Each step in this process adds noise and signal degradation. If you can demodulate the signal once and keep the highest quality of video and audio you can obtain all the way through the signal path, your picture and sound will be much better.
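Just to illustrate the point, here’s a toy Python sketch of how the losses stack up when you remodulate and demodulate again versus demodulating once. The per-stage numbers are completely made up for illustration; they are not measurements of any real box or tuner.

```python
def snr_after_chain(starting_snr_db, stage_losses_db):
    """Subtract each stage's (hypothetical) SNR penalty from the starting SNR."""
    return starting_snr_db - sum(stage_losses_db)

start_snr = 45.0  # hypothetical SNR of the signal on the coax, in dB

# Path A: the box demodulates once and sends component/DVI straight to the set.
direct = [2.0]                  # one demodulation step (illustrative figure)

# Path B: the box demodulates, remodulates onto channel 3/4,
# and the TV's tuner demodulates again.
remodulated = [2.0, 3.0, 2.0]   # demodulate + remodulate + demodulate (illustrative figures)

print(f"direct connection:  {snr_after_chain(start_snr, direct):.1f} dB")
print(f"channel 3/4 route:  {snr_after_chain(start_snr, remodulated):.1f} dB")
```

Whatever the real per-stage numbers are, the point is the same: every extra conversion takes something away, so the chain with fewer hops ends up cleaner.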

Bear_Nenno, upon re-reading your question, let me revise my answer just a little. If you are running the coax cable straight from the wall into your cable-ready HDTV set, then this is the best way to go.

If your coax cable goes to a cable box or dish receiver, then you are better off running the highest-quality video and audio connection available from the box to your TV.

I got two woids for ya there, Bear_Nenno.

Fiber

Optic.

:smiley:

Aside from the inevitable (for now) analog switchers, the dream, the misty ghosty beautiful gauzy dream is to go from the Head End at the cable company to your monitor, 100% glass.

One dares to dream.