I’m having a conversation with my BIL about HD broadcasts and certain plasma TVs. He’s getting me to check out models of plasma TVs, and I’m noticing something among the smaller flat panel screens: Under 50", they seem to have a base resolution of 1024x768 – a 4:3 aspect ratio, despite the unit being widescreen. An argument ensues on whether or not this constitutes real HD resolution.
I say it doesn’t. True HD broadcasts are at least 1280x720 (720p). A set that displays only 1024x768 simply cannot show a true HD signal without scaling and interpolation – it has to drop vertical lines to squeeze 1280 pixels’ worth of information into a 1024-pixel-wide screen. Television sets of 50" or above seem to be capable of true HD (usually with a resolution of 1366x768), so those would be true HD sets.
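To put rough numbers on that squeeze (a quick Python sketch, purely illustrative):

```python
# Purely illustrative arithmetic: how much horizontal detail a 1024-wide
# panel has to throw away when fed a 1280-wide (720p) frame.
src_w = 1280     # 720p horizontal resolution
panel_w = 1024   # native panel width

h_scale = panel_w / src_w   # 0.8: only 4 panel columns for every 5 source columns
dropped = src_w - panel_w   # 256 source columns with no 1:1 home

print(f"horizontal scale factor: {h_scale:.2f}")
print(f"source columns that must be blended or dropped: {dropped}")
```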
He says “it depends on the broadcast” and other such nonsense. Says it’s still true HD and true widescreen.
I wouldn’t say it’s true HD if it is incapable of displaying a 1280x720 image, no.
However, in the European digital TV standard (and I would be surprised if the American standard didn’t offer the same flexibility), broadcasters can include aspect ratio information in order to stretch the picture horizontally. That’s how they do SD widescreen over here – it’s transmitted at the same resolution as 4:3, but with a flag telling the set-top box to stretch it horizontally to 16:9. Or they’ll use a horizontal resolution of only 512 for 4:3, similarly requiring the set-top box to scale it up to the required 720. It allows the broadcaster to reduce the bitrate needed to send the picture, meaning they can get away with less bandwidth, or squeeze more channels into the same bandwidth.
So, assuming that the broadcast standard does allow this, you might find broadcasters cutting costs and transmitting “HD” TV using fewer than 1280 pixels horizontally. If they use, say, 1024x720 to be scaled to 16:9, it will look no sharper on a true HD set than on one of the smaller sets you describe.
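To put a rough figure on the saving (hypothetical numbers, quick Python):

```python
# Hypothetical figures: raw pixel counts if a broadcaster sent "HD" at
# 1024x720 instead of the full 1280x720, relying on the receiver to stretch.
full = 1280 * 720
reduced = 1024 * 720

print(f"pixels per frame: {full} vs {reduced}")
print(f"raw sample reduction: {1 - reduced / full:.0%}")  # 20% fewer samples to encode
```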
This is something that has puzzled me since we bought ours. We were looking at plasmas and LCDs in the 40-inch range, and all the plasmas were 1024x768, compared with 1366x768 for the LCDs. Side by side, the LCDs had noticeably sharper pictures. The oddest thing was, several months later, some company came out with what they said was the highest-resolution plasma screen (at the time). “About time!” I thought, but no: it was 1024x1024. Anyone know why this is?
I really came here to rant/hijack, but now I feel obligated to attempt an answer. Ummm, from Home Theatre Magazine:
Odd. My 50" plasma TV is 1366x768. Claims to be “9th generation” (whatever that is) and I got it last year, so it’s not exactly latest and greatest. The 58" TV in this particular line is also 1366x768, but with larger pixels.
Their latest and greatest panels are 1920 x 1080, so they will display 1080i or 1080p natively, with no resampling or interpolation.
But that’s what throws me. 50" and larger plasmas have the correct resolution and aspect ratio to properly support a full widescreen 720p or 1080i image (1080p as well, on even larger, higher-resolution sets). When you drop below 50", though, you start getting this 1024x768 nonsense, which is inherently a 4:3 aspect ratio, even though the screen itself is 16:9. No matter how you slice it, an HD signal is going to have to be downsampled and interpolated in order to look normal on a television like that. I just can’t quite work out how this can be as effective as a true HDTV display with a proper HD resolution.
I have a 42" plasma with 1024x768 native resolution. I agree with you: this isn’t true HD resolution, since you don’t get 1:1 pixel mapping from an HD source, and the picture quality suffers slightly as a result. However, if you get a plasma with a good, reputable internal scaler chipset, it may be hardly noticeable.
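If you want to see what a difference the scaler makes, here’s a toy illustration (Python with Pillow 9.1 or later; the filename is hypothetical) of the same 720p frame squeezed to a 1024x768 panel with a crude filter versus a good one. On the TV itself this is the internal scaler chip’s job:

```python
# Downscale a hypothetical 1280x720 test frame to a 1024x768 panel with a
# cheap resampling filter and a good one, to compare the artifacts.
from PIL import Image

frame = Image.open("720p_frame.png")  # hypothetical 1280x720 source frame

crude = frame.resize((1024, 768), Image.Resampling.NEAREST)   # cheap scaler
decent = frame.resize((1024, 768), Image.Resampling.LANCZOS)  # good scaler

crude.save("scaled_nearest.png")
decent.save("scaled_lanczos.png")
```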
One thing about plasmas with 1024x768 resolution: the widescreen stretch elongates the pixels horizontally, which is bad news for computer images and HTPC use.
I wouldn’t get too fixated on counting pixels. The modern trend is for the different parts of the video chain to be resolution independent. The signal may have been captured by a camera at 1080p/24, edited at 1080p/60, broadcast at 720p, and displayed at 1080i.
Well, yeah, that’s to be expected – but the actual, native resolution of the television does factor in there.
Consider the humble computer monitor. An old one if you like, it doesn’t really matter, but one that can do 1024x768. It is, in every conceivable way, a 4:3 aspect ratio. You can view widescreen images or video on it, but since the content is 16:9, it’s going to be letterboxed, so: black bars of empty content for you. Sure, you could scale it up to fill the whole monitor, but then the picture will be stretched vertically and everyone’s going to look like Gumby.
Now stretch that monitor. For the sake of argument, it’s made of rubber, tube and all. Now you’ve got a widescreen, 16:9 aspect ratio. Problem is, you’ve still only got 1024 pixels across and 768 pixels down. Except now the video is stretched horizontally more than it was stretched vertically, and everyone looks like Oliver Hardy.
Now the question is, how do you take a resolution that is inherently 4:3 and stretch it out to 16:9 without distorting the image?
The pixels are wider on a widescreen 1024x768 plasma.
Again, a standard 4:3 image that is not scaled by the internal scaler (i.e. a native RGB/DVI signal at 1024x768) will be distorted, because the pixels are not square. This makes a widescreen 1024x768 plasma not such a good choice as a computer monitor or for a home theater PC.
480, 720, and 1080 signals are scaled to the native resolution of the panel. Thus, no distortion of the image (unless you’re watching non-widescreen content in ‘stretch’ mode, of course).
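If it helps, here’s the arithmetic behind that (a quick Python sketch using the panel numbers from upthread):

```python
# A 1024x768 grid on 16:9 glass implies non-square pixels, and the scaler
# is built around that, so a 16:9 source keeps its correct geometry.
panel_w, panel_h = 1024, 768
screen_ar = 16 / 9               # physical shape of the glass
grid_ar = panel_w / panel_h      # 4:3 shape of the pixel grid

pixel_ar = screen_ar / grid_ar   # each pixel is 4/3 as wide as it is tall
print(f"pixel aspect ratio: {pixel_ar:.4f}")          # 1.3333

# A 720p frame scaled to fill the whole 1024x768 grid comes out 16:9 again:
displayed_ar = (panel_w * pixel_ar) / panel_h
print(f"displayed aspect ratio: {displayed_ar:.4f}")  # 1.7778, i.e. 16:9
```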
Frankly, I’d avoid the mess altogether and go with a plasma with a proper 16:9 resolution.
Exactly: Stretched, which is the problem whose solution I can’t work out.
So essentially what’s happening is sort of a Panavision effect: the image is squeezed into a 4:3 grid, but when displayed on a 16:9 television with a 4:3 resolution, it gets expanded back out by the widened pixels, so it looks normal. Correct? In essence, though it’s higher resolution than a normal television signal, it really isn’t true widescreen.
The resolution isn’t inherently 4:3, because pixels are not inherently square. It’s easier to think of it in terms of analog video: you have N scan lines, sampled at a specified rate, displayed at a specified aspect ratio, progressive or interlaced.
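For instance, the same standard-def grid can legitimately be either shape (illustrative Python with PAL-style numbers; these are generic ratios, not the exact broadcast pixel aspect ratio values):

```python
# A 720x576 PAL-style grid is not inherently 4:3 or 16:9; the signalled
# display aspect ratio (DAR) determines the pixel shape.
samples, lines = 720, 576

for name, dar in [("4:3", 4 / 3), ("16:9", 16 / 9)]:
    par = dar / (samples / lines)   # pixel aspect ratio implied by this DAR
    print(f"{name} display: pixel aspect ratio = {par:.3f}")
# 4:3  -> 1.067 (slightly wide pixels)
# 16:9 -> 1.422 (much wider pixels); same 720x576 grid either way
```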
Well, you can only buy what they make. In my case, where we have our TV, we don’t have room for a 50 incher, only about a 42 inch set. As the OP said, below about 50 inches, they don’t make plasma TVs with the proper resolution.
The HD picture on my Samsung 42" 1024x768 plasma TV is great, although I became well aware of the non-16:9 problem when I tried displaying my computer output on it. It took a lot of fooling around with a utility called Powerstrip to design a custom resolution that looked right, but it still wasn’t perfect.