I recently was given a CRT Sony TV (KV36HS20) that has a component input for 1080i. It doesn’t handle 720 anything, although it will do both 480i & p. I’m considering upgrading my DirecTV to hi-def, but about 90% of the viewing we do is on the Big Four networks and I read that two of them only offer 720p programming. Would we be able to view those channels in Hi Def? Would they be upconverted to 1080i or downconverted to 480p? Would it be worth it?
Most likely, whatever signal input device you use (cable STB, satellite STB, or off-the-shelf ATSC tuner) would have a setting allowing you to pick what resolution it outputs to the TV over the component connectors, and the box would do the conversion for you. And yes, it would convert 720p to 1080i. Both my Comcast HD DVR and Verizon HD DVR have done so - in fact, both of them forced you to pick 720p or 1080i, and would convert the HD signals to that resolution.
The ‘Big 4’ signals that come out of my cable box are 1080i. I don’t actually know what the originating signal is, so it is possible the cable company upconverts them before sending the feed out through their system.
Ummm… I take that back. My local broadcast channels come in at 480i on one cable channel, and on a second they are offered in 1080i. So I would check with DirecTV – you might have to upgrade your channel package to get the four on-air networks in HD.
Actually, looking through the manual, your TV is not an HDTV. It is a standard TV with component inputs and a line doubler. It will accept HD signals, but you will not see actual HD resolution at all.
Ya, CRT=480i. I’m fairly certain all CRTs are incapable of anything else. So, your answer is anything above 480 will be downconverted by your TV and everything will be in standard definition whether the original signal is 720 or 1080. Not worth upgrading your signals just yet!
This is completely incorrect. There are many CRT HDTVs. I own one. More to the point, every computer monitor made between 1994 and 2003 or so was effectively an HD CRT.
Edit: The OP’s, however, is not an HD CRT. But that has nothing to do with the fact that it’s a CRT. It’s just a CRT that doesn’t do HD.
Nope - false as Flymaster has already pointed out. My Sony 40XBR800 (CRT) displays 1080i.
As old as it is now, it’s still a magnificent display.
I had read the manual, and the only mention of HD it makes is when it describes the component inputs as “HD/DVD IN”. Unsure whether that meant it’s an HD TV or not, I borrowed a PS3, set it to 1080i, and watched POTC: At World’s End (1080i native on BR). I also set it to 480p and 480i for comparison, and the 1080i was better overall, but not amazingly better like what I’d expect from sample displays I see in stores.
So I guess I don’t understand. In this case, the PS3 is sending out a HD 1080i signal, the TV is stripping out part of the signal, and then doubling the remaining parts, leaving a better picture, but not an amazing one? Seems like an awful lot of work to go to.
Thanks, good to know! While of course I’ve seen computer monitors that display high resolution, I don’t think I’ve seen a CRT TV that does the same. How big can you get them, and are these the flat-screen or still curved? Maybe I can get my parents to upgrade after all, if it looks like something closer to what they’re used to…
I don’t think they make many/any anymore, but when I bought mine about 5 years ago, I got a 32-incher. It’s a Sony Wega, so it’s a flat-screen Trinitron. Sony no longer makes Trinitron tubes, so I’m guessing they don’t make CRT TVs anymore either. I’m trying to sell mine. I’d check craigslist if you’re seriously looking. And don’t get a 4:3 HDTV. Those suck. Get a widescreen one.
Just as a comment: high-def TV is really only useful on screens over 30 inches. Tests have shown that on anything smaller than 30", humans can’t tell the difference between standard and high def.
So while you would certainly be getting high def on smaller screens (if the set is built to receive the signal), it wouldn’t be worth the extra money for smaller screens.
That all depends on how close you’re sitting. I assure you, you can tell the difference at some reasonable viewing distance. At whatever distance the test was conducted at, possibly not.
It took me a while to figure out it wasn’t HDTV either, and there was a time period of a couple years where there were a lot of consumer complaints about TVs like this being passed off as HDTVs by fast talking sales people. But under the specification on page 57, note that the “Television System” is “NTSC: American TV Standard”. And so it was – at the time. NTSC is the old analog standard.
The reason it exists is pretty simple, I’m guessing – manufacturers could continue to use their old screen production lines. It’s probably much cheaper to throw in a little circuitry to convert different inputs to the one standard than to replace the whole CRT.
FYI, line doubling is a way to increase the apparent resolution of a TV image without actually doing so. It takes each line and displays it twice, once in the original location and once in the space between that line and the next line. It’s almost a necessity for very large screens, as without it the picture can look pretty grainy. The thing is, there is no new actual detail added.
An actual HD signal has more than double the number of lines, and so there is actual new information and detail to insert – it’s much more than a “thickening” of the old picture.
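To make the “no new detail” point concrete, here’s a toy sketch in Python of the crudest form of line doubling (a real line doubler works on scan lines in hardware, but the idea is the same: every output line is a copy of an input line):

```python
# Toy line doubler: repeat each scan line so a 240-line field
# fills a 480-line frame. No new detail is created -- every
# output line already existed in the input.
def line_double(field):
    """field: a list of scan lines (each line could be a list of pixels)."""
    frame = []
    for line in field:
        frame.append(line)   # line in its original position
        frame.append(line)   # duplicate fills the gap below it
    return frame

field = ["A", "B", "C"]      # three scan lines
print(line_double(field))    # ['A', 'A', 'B', 'B', 'C', 'C']
```

The output frame has twice as many lines, but carries exactly the same information as the input.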
For a screen size of <30", you just have to start sitting closer than ~8 feet. After all, your (most likely <30") computer monitor can probably display resolutions greater than 640x480, and I’m betting you can tell the difference, no?
There’s a nice graph of viewing distance vs. resolution vs. screen size here:
Line doubling is actually deinterlacing: converting an interlaced video signal into progressive scan.
SD video is interlaced, so each field only draws half of the lines (either the odd or even lines) each time. While crude line doublers would simply repeat lines, better ones had more sophisticated digital interpolation to handle motion.
Line doubling has now been replaced by scaling, which generally does a better job, but you’re better off starting with an HD signal.
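As a rough illustration of the difference (toy Python, treating a field as a list of line brightness values): a crude doubler repeats each field line, while a slightly better one interpolates the missing lines by averaging the neighbors above and below:

```python
# Toy deinterlacers: reconstruct a full frame from one field
# (the field holds only the odd OR even scan lines).

def double_repeat(field):
    """Crude: fill each missing line by repeating the line above it."""
    frame = []
    for line in field:
        frame.extend([line, line])
    return frame

def double_interpolate(field):
    """Better: fill each missing line by averaging its neighbors."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)
        if i + 1 < len(field):
            frame.append((line + field[i + 1]) / 2)  # interpolated line
        else:
            frame.append(line)  # bottom line: nothing below, so repeat
    return frame

field = [10, 20, 30]  # brightness of three scan lines in one field
print(double_repeat(field))       # [10, 10, 20, 20, 30, 30]
print(double_interpolate(field))  # [10, 15.0, 20, 25.0, 30, 30]
```

Real deinterlacers are far more sophisticated (they look at the other field and at motion between fields), but this shows why interpolation looks smoother than simple repetition.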
1080i is interlaced as well, and it’s HD.
I would say that even those distances are generous. I have a 37" 1080p set and at 12’ I would be hard pressed to tell the difference between a good 480p DVD and a 720p image. My viewing distance has to be about 8’ to appreciate the better image. To discern a 720p from a 1080p picture, I would have to be about 2-3’ away from the set.
Of course those distances are for movies; computer images and programming that is specifically intended for HDTV (such as Planet Earth) are apparent at greater distances.