Why is it that we sit so far from television sets and yet sit so close to computer monitors? Is it because of the refresh rate or resolution? Or is it because of what we look at on the screens (more detail on computers, more video on tv)?
This is a serious question?
OK, go look at your TV. Unless it is very tiny, you will notice that if you get close up the picture looks grainy and unattractive. If you move back a few (say 3-9) feet the picture looks reasonably acceptable. It is designed to be viewed optimally from a distance of several feet.
OK, go look at your monitor. You need to read text on the monitor in most cases. For most people this means being within 2-3 feet of the screen, or closer. It is designed to be viewed within those parameters.
Refresh rates and resolution are higher on monitors because it is both economically and technologically possible (no NTSC compliance to worry about) and because it is necessary for accurately displaying fine detail. HDTV will improve television pictures in the future, but the resolution of an HDTV screen will still not be anywhere near that of a good monitor on a pixels (or TV equivalent) per square inch basis (and it doesn’t need to be).
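To put rough numbers on that pixels-per-square-inch point, here is a back-of-the-envelope sketch in Python. The screen sizes and resolutions (a 27-inch set carrying roughly 640x480 worth of picture, a 17-inch monitor at 1024x768) are my own illustrative assumptions, not figures from anyone above:

# Rough comparison of pixel density for a TV vs. a computer monitor.
# Screen sizes and resolutions are assumed for illustration only.

def pixels_per_sq_inch(diagonal_in, h_pixels, v_pixels, aspect=(4, 3)):
    """Approximate pixel density for a flat 4:3 screen of a given diagonal."""
    aw, ah = aspect
    scale = diagonal_in / (aw ** 2 + ah ** 2) ** 0.5
    width, height = aw * scale, ah * scale          # inches
    return (h_pixels * v_pixels) / (width * height)

# Assume a 27" NTSC set carries roughly 640x480 worth of picture info,
# and a 17" monitor runs at 1024x768.
tv = pixels_per_sq_inch(27, 640, 480)
monitor = pixels_per_sq_inch(17, 1024, 768)

print(f"TV:      ~{tv:.0f} pixels per square inch")
print(f"Monitor: ~{monitor:.0f} pixels per square inch")
# The monitor packs roughly 6-7 times more detail into each square inch,
# which is why you can sit two feet from it but not from the TV.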
I couldn’t read my monitor if I was very far from it.
That, combined with the high color and resolution of my monitor, has me thinking of just getting DVD and TV cards for the computer.
If you could get tv reception using your computer’s monitor (I would suppose via one of these “tv cards”), would the picture be super sharp and crisp?
There’s another difference in clarity between a TV and a computer monitor: TVs are interlaced and monitors are non-interlaced.
Interlacing means only half the screen gets drawn on every pass of the electron beam. So, in a TV, the odd-numbered lines get drawn, then the even-numbered lines get drawn on the next pass. A monitor draws all lines on every pass.
The result is that movement on a TV can be fuzzy. The ball just thrown by the quarterback gets half-drawn, and by the next pass the ball has moved a bit, so the new lines are slightly out of sync with the previously drawn lines.
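Here is a toy Python sketch of why that half-drawn ball looks fuzzy: an interlaced frame mixes two fields captured at slightly different moments, so a moving object ends up combed between alternating lines, while a progressive frame draws every line from the same instant. The grid size and ball speed are arbitrary assumptions, purely for illustration:

# Build one frame of a moving "ball" two ways: progressive (all lines
# from the same instant) and interlaced (odd lines from time t, even
# lines from time t+1). Toy model only; real NTSC timing is not modeled.

WIDTH, HEIGHT = 20, 8

def scene(t):
    """The scene at time t: a vertical one-pixel-wide ball at column 2*t."""
    rows = []
    for _ in range(HEIGHT):
        row = ['.'] * WIDTH
        row[2 * t] = '#'
        rows.append(row)
    return rows

def progressive_frame(t):
    return scene(t)                       # every line drawn from the same moment

def interlaced_frame(t):
    odd_field = scene(t)                  # first pass: odd-numbered lines
    even_field = scene(t + 1)             # second pass: the ball has already moved
    return [odd_field[y] if y % 2 else even_field[y] for y in range(HEIGHT)]

def show(label, frame):
    print(label)
    for row in frame:
        print(''.join(row))
    print()

show("Progressive (crisp ball):", progressive_frame(3))
show("Interlaced (ball combs between fields):", interlaced_frame(3))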
Digital TV actually comes in some 18 different formats, so depending on the signal you are receiving you may get an interlaced or a non-interlaced picture.
HDTV, IIRC, is equivalent to a 1280x1024 resolution on a monitor which is quite good. A TV picture on your computer monitor is no more crisp than on a regular TV because the signal only contains the info to draw a standard resolution TV screen. It may look sharper because the picture is small but that’s it.
So, why don’t broadcasters use non-interlaced signals? Simply because you’d have to double the bandwidth to carry the signal. Channel 7 would now actually have to be channel 7 and 8 combined to hold the extra information.
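The arithmetic behind "double the bandwidth" is straightforward: an interlaced signal sends half the lines on each pass, while a progressive signal would have to send all of them just as often. Using standard NTSC figures (525 scan lines, 60 passes per second) purely as a worked example:

# Lines that have to be transmitted per second, interlaced vs. progressive.
lines_per_frame = 525        # total NTSC scan lines
passes_per_second = 60       # field rate (interlaced) or frame rate (progressive)

interlaced_lps = (lines_per_frame / 2) * passes_per_second   # half the lines per pass
progressive_lps = lines_per_frame * passes_per_second        # all the lines, every pass

print(f"Interlaced:  {interlaced_lps:,.0f} lines per second")
print(f"Progressive: {progressive_lps:,.0f} lines per second")
print(f"Ratio: {progressive_lps / interlaced_lps:.0f}x the bandwidth")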
IIRC, a fully loaded, best-quality (Dolby stereo, non-interlaced, etc.) HDTV picture would require 8 channels of conventional bandwidth to carry the signal. This is why the FCC handed over all that extra frequency range to broadcasters for free a year or so ago. What did the broadcasters decide to do? “Gee, I could broadcast one top-notch HDTV signal but I can’t really charge much more for advertising, OR I could broadcast eight standard crappy signals and raise my advertising revenue by eight! Hmmmm…decisions, decisions.”
I will say that while many people were mad at the Feds for giving away all that bandwidth, it did allow the FCC to go back to those broadcasters and say, “Nothing doing. We GAVE you that extra bandwidth for HDTV, and that is what you WILL use it for.” However, expect the broadcasters to wiggle inside these constraints as much as possible.
Not quite correct. The highest resolution HDTV, 1080i, is equivalent to 1920x1080 (remember, you’re working with a 16:9 aspect ratio for HDTV vs. a 4:3 ratio for NTSC or computer monitors), but is unfortunately interlaced (thus, the “i”). Presumably someone will come out with a line doubler some year that will allow us to view it in 1080p (progressive, or 1920x1080 non-interlaced), assuming you have a display device that can handle it. There currently exist line doublers to display standard NTSC TV on a non-interlaced display by interpolating the missing lines, or line quadruplers that actually double the resolution. Not nearly as good as HDTV, since the doubler/quadrupler is just making an educated guess as to what’s missing.
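For the curious, here is a minimal sketch of the educated guess a simple line doubler makes: take the scan lines of one field and fill in each missing line by averaging the lines above and below it. Real doublers are far more sophisticated (motion adaptation and so on), so treat this as illustration only:

def line_double(field):
    """field: the scan lines of one interlaced field (every other line of the
    full frame), each line a list of pixel values. Returns a full frame with
    the missing lines interpolated."""
    frame = []
    for i, line in enumerate(field):
        frame.append(line)                              # a line we actually have
        if i + 1 < len(field):
            below = field[i + 1]
            # Guess the missing line by averaging its neighbors.
            frame.append([(a + b) // 2 for a, b in zip(line, below)])
        else:
            frame.append(list(line))                    # bottom edge: just repeat
    return frame

# Example: a 3-line field becomes a 6-line frame.
field = [[0, 0, 0, 0],
         [100, 100, 100, 100],
         [200, 200, 200, 200]]
for row in line_double(field):
    print(row)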
BTW, DVD has the potential to be as good as baseline HDTV (480p) if you get a DVD player with progressive scan outputs rather than interlaced (and you have a compatible display). They’re expensive now, but prices will come down, since there’s no technological reason for their prices to be high. Interlacing for DVD is done by the player; it’s not stored that way on the media.