HDTV definition question

Why does the picture quality vary so much among various “digital” channels? I find the best picture quality is on the NYC CBS affiliate, which is fully HD, but the so-called digital versions of some other channels (both networks and cable) deliver a picture quality somewhere between that of non-digital local broadcast and the great CBS station. These channels (generally located at #76-110) are mostly noticeably better than their lower-number equivalents, but it’s not as much of a difference as I’d like.

BTW, my set has 1366 x 768 resolution, for whatever that’s worth.

Or you can save even more by using any old audio L+R + composite video cable, ~$6 at Radio Shack, and just label the plugs Red, Green, Blue. Same cable. Yes, I’ve done this when I came up short a cable in my cavernous video lair / secret fortress; works great.

Are you referring to cable/satellite, or OTA/ATSC? I’m not sure what specific channels you’re referring to, but I find that many of the “second tier” SD (and probably HD) stations (TBS, Lifetime, yada yada) have crap pictures. I think it’s because they choke the MPEG stream bitrate down to starvation levels, and precondition the signal to compress well by muddying the edges and saturation. They (and I’m not sure whether “they” are the channel originators or the head end at the cable/satellite provider) are basically saving bucks by stuffing more channels into a given pipe on the data stream. OTOH, if you’re referring to OTA channels, it may just be that the broadcaster is inept in analog, and the channels still look crappy when converted to digital. GIGO.
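To put rough numbers on the “stuffing more channels into a given pipe” idea, here’s a quick sketch. The figures are illustrative ballpark values, not measured from any particular cable system:

```python
# Rough illustration of why cramming streams into one pipe hurts quality.
# A 256-QAM cable channel carries roughly 38.8 Mbps of usable payload
# (ballpark figure; actual overhead varies by system).
PIPE_MBPS = 38.8

def channels_in_pipe(per_channel_mbps):
    """How many MPEG streams fit if each one gets the given bitrate."""
    return int(PIPE_MBPS // per_channel_mbps)

# A decent-looking SD MPEG-2 stream at ~4 Mbps vs. one starved to ~2.5 Mbps:
print(channels_in_pipe(4.0))   # fewer channels per pipe, better pictures
print(channels_in_pipe(2.5))   # more channels per pipe, mushier pictures
```

Squeezing each stream from 4 down to 2.5 Mbps buys the operator half a dozen extra channels per pipe, which is exactly the incentive described above.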

Comcast cable, carried through actual copper wires (as opposed to fiber optic or other). I should’ve specified; thanks for your response.

I suppose we can only hope that, as the changeover occurs in six months, more networks and cable providers will get serious about digital picture quality, if only for competitive reasons.

For whatever it’s worth:

Monitors (and projectors) have what is called a “native resolution”. This is the resolution that can be displayed with no manipulation of the source picture – assuming the source matches that resolution, the screen will display a one-to-one accurate image.

Virtually all monitors are also designed to display “non-native” resolution sources. My 1080i TV can also display 480i, 480p, and 720i (maybe 720p as well; I don’t remember). And since I bought it several years ago when 1080p didn’t exist, I have no idea what it would do if fed that signal.

So, all this being said, these TVs use different schemes to upscale or downscale images whose resolution differs from the screen’s native resolution. There is no universally accepted standard for doing this. Two TVs with a native resolution of 1080i might look virtually identical with a 1080i source, but considerably different with a 720p source.
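To make the scaling point concrete, here’s a small sketch. The panel size matches the 1366 x 768 set mentioned earlier in the thread; the function name and rounding are just mine:

```python
# Any source that doesn't match the panel's native resolution must be
# resampled, and each TV does that resampling its own way.
NATIVE = (1366, 768)  # the 1366 x 768 panel mentioned above

def scale_factors(src_w, src_h, native=NATIVE):
    """Horizontal and vertical scale ratios needed to fill the panel."""
    return (round(native[0] / src_w, 3), round(native[1] / src_h, 3))

print(scale_factors(1280, 720))   # 720p source: a modest, messy upscale
print(scale_factors(1920, 1080))  # 1080-line source: must be scaled *down*
print(scale_factors(720, 480))    # 480p source: big, unequal upscale
```

None of those ratios are clean integers, so every source gets interpolated somehow, and how well that’s done is exactly where two sets with the same panel can diverge.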

The only way you can truly make the best choice is to work with a vendor who is willing to feed various resolutions into the TVs you’re considering, and make your own subjective judgment about whether the image quality is better or worse in each situation.

Hee, hee. Fat chance with Costco. Price has its price, after all.

I should correct my post on one point. The ‘i’ and ‘p’ are not specifically related to resolution per se, though they may have an impact on apparent picture quality. An ‘i’ (interlaced) picture shows only half the total image per frame: every other line. The next frame shows the alternate lines. The refresh rate is so high that your brain puts the frames together and sees whole images. A ‘p’ (progressive) picture shows all the lines in every frame.

Honestly, I can’t tell the difference on my TV. OTOH, my TV has something called a “scan doubler” for lower resolution sources that substantially improves the picture in other ways, and perhaps that’s why I can’t see a difference.

As to Costco, well… if you’re willing to screw a high-end dealer out of some time, you could go to one and do better comparison shopping there, and buy at Costco.

One other thing that was mentioned above but should be stressed again: buy a TV with as many HD video inputs as you can. I have a cable box, a broadcast receiver box, a video camera, and a DVD player. I also have a Dolby AV receiver with 5 video inputs, but only one of those is HD. My TV has two HD inputs, of which one is taken up by the AV receiver. So I’m trying to decide now whether to upgrade my AV receiver to one with more HD inputs, or to bypass the receiver’s video altogether and use some kind of HD video switcher like this.

These “half” pictures are called “fields”. If you want to think of these fields as frames, be aware that the field rate is 2x the nominal frame rate – e.g., 1080i/30 is 60 fields/sec, but the nominal frame rate is 30 fps.

If you have surround audio speakers, the ideal is to switch everything through an AV receiver, so the surround will be decoded, or less-than-surround audio will get routed properly (e.g. stereo -> 2.1 or 3.1, as you prefer). At the very least, you get rid of one more remote. If you’re just using the television’s audio, then using an HDMI switcher and/or a TV with many HDMI inputs is fine, and of course more HDMI inputs are always better than fewer. Having said all that, my LR TV setup is in the same shape as Boyo Jim’s. When I start getting more HD devices, I’ll need to toss my HTR and get something with a lot of HD switching muscle, which may be fairly expensive in the medium term. I’ll need to do some research and shopping.

Yes, I do have a surround system, but I also have a programmable remote, so it’s not a big deal to route the sound differently from the video. I can just reprogram it to send more commands to any or all of my components with one button press.

So I’m back in Costco, and they’ve moved the Vizio plasma onto the floor. It does run warm; there’s a fair amount of warm air rising out of the top.

However, this week they had a Westinghouse 42" LCD for $700, twenty bucks cheaper: the VK-42F240S. I don’t know how accurate the site is; it doesn’t mention any HDMI ports, and the set has two; I saw ’em. It is an unabashed 1080p set.