What resolution is a normal, non-HDTV, TV? (Or, how does this gadget work?)

I have a small computer, affectionately named Pooter, that I have hooked up to my television. It outputs my 640 x 480 desktop tolerably well on my 32" TV, and outputs my movie files at “full screen” mode with excellent quality. Of course the movies are at white book VCD resolution of 352 x 240 (IIRC), so this is to be expected.

OTOH, while its display of my desktop and such is navigable, it’s too fuzzy (I believe the technical term is crappy) to use it as a WebTV type deal or for any “real” usage.

Which is why I was looking at this doohickey. Will this make things any clearer, or am I being limited by the output resolution of my television?

For that matter, what is the max resolution of TV? Is it the 720 x 480 (or so) of DVD, or what?

IIRC, TVs are mostly concerned with vertical resolution. Standard NTSC signals come in at 480i, which means 480 lines interlaced. Interlaced means that every other line is drawn. TVs show only half a picture at any given moment, but thanks to a little biological trick called “persistence of vision” we see what appears to be a whole picture.
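A quick sketch of what that interlacing means in practice, treating a frame as a simple list of scan lines (the names and the 480-line figure are just illustrative):

```python
# Sketch: an interlaced display draws a frame as two fields --
# one with the even-numbered scan lines, one with the odd.
frame = [f"line {n}" for n in range(480)]  # 480 visible lines

even_field = frame[0::2]  # drawn on the first pass
odd_field = frame[1::2]   # drawn roughly 1/60 s later

# Each field holds only half the lines; persistence of vision
# fuses the two half-pictures into what looks like a whole one.
print(len(even_field), len(odd_field))  # 240 240
```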

HDTV can produce up to 1080i resolution or 480p. In this case, the p means progressive, and there is a full picture, not half, being shown at any given moment.

AFAIK, NTSC has 525, HDTV 1080 and PAL 625 lines.

The “safe area” of most NTSC TVs is about 560x420 (interlaced), even though the actual signal sent to the TV has more information than that. If you took the casing off your TV so you could see all the way to the edge of the picture tube, you’d see that there’s more picture you’re missing, and there’s usually even more beyond that that doesn’t fit on the tube. This is called “overscan” and it makes sure you don’t have a black border around your picture. So content producers have a concept of a “safe area” which tells them the usable space on the tube (so they can position their stock tickers and other graphics). Things like WebTV take this into account too and generally use 560x420 for graphics.
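Rough arithmetic for where a 560x420 safe area could come from: trim a fixed fraction of overscan from a nominal full picture. The 12.5% figure here is illustrative, not a spec number:

```python
# Sketch: deriving a "safe area" by trimming overscan from a
# nominal full picture. The 12.5% loss is an assumed figure.
full_width, full_height = 640, 480
overscan_loss = 0.125  # fraction hidden past the edge of the tube

safe_width = int(full_width * (1 - overscan_loss))
safe_height = int(full_height * (1 - overscan_loss))
print(safe_width, safe_height)  # 560 420
```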

If you divide the time between starting two successive frames by the time between starting two successive lines, you get 525 in NTSC. This does not mean the picture itself has 525 lines at all. The circuit takes time to move from the end of the last line back to the beginning of the first, so there are “nonexistent lines”. The 525 number is just the ratio of the horizontal and vertical frequencies.
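That ratio works out as simple division, using the standard NTSC scan frequencies:

```python
# The "525 lines" figure is the ratio of the NTSC horizontal line
# frequency to the frame rate -- it counts line *periods*, not
# lines actually visible on screen.
line_freq = 15734.26   # Hz, NTSC horizontal scan frequency
frame_rate = 29.97     # Hz, one frame = two 59.94 Hz fields

lines_per_frame = line_freq / frame_rate
print(round(lines_per_frame))  # 525
```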

If you want, you can get a card that lets you see HDTV on your computer. It’s about $500.00

I don’t see how that doohickey you listed would make your screen any clearer.

handy’s probably right. The doohickey ain’t gonna help. The reason your TV looks crappy is because it IS crappy. The NTSC video signal was created by what, the Romans, and is a relic of a bygone era. Your computer monitor is way better, which is why it looks so much better.

There are two main problems causing this:

  1. While a computer monitor uses a nice, simple VGA signal made up of red, green and blue (RGB), the NTSC signal is a complicated system I don’t even pretend to understand (YUV), designed to allow a single signal to represent both color and black and white. As a result, the NTSC “colorspace” can’t represent all the RGB colors. That, for example, is why bright red looks so bad on TV.

  2. Interlacing. Much of your computer monitor, especially text, is filled with single-pixel horizontal lines. Since a computer monitor isn’t interlaced, these thin lines look fine. Single-pixel horizontal lines buzz on a TV, so your computer monitor looks crappy there. And it always will.
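For the curious, the RGB-to-YUV transform that NTSC is built around looks like this. Y carries the black-and-white picture, and U/V carry the color as difference signals; the coefficients are the standard NTSC/BT.601 luma weights (the function name is just for illustration):

```python
# Sketch of the RGB -> YUV transform underlying NTSC color.
# Y is luma (the B&W picture); U and V are color-difference signals.
def rgb_to_yuv(r, g, b):  # components in the range 0.0-1.0
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

# Pure bright red: very little luma but lots of chroma -- part of
# why heavily saturated red is hard for the signal to carry cleanly.
y, u, v = rgb_to_yuv(1.0, 0.0, 0.0)
print(round(y, 3), round(u, 3), round(v, 3))  # 0.299 -0.147 0.615
```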

The bad news: HDTV is interlaced, too.

My TV looks pretty sharp. Old TVs suck (people give them to my shop); maybe it’s time for a new one?

A computer monitor is digital. It is composed of individual pixels. A TV screen is analog.

While the number of horizontal lines on a TV screen is defined at 525 (for NTSC, the US system), horizontal resolution is variable. As the electron gun ‘paints’ each horizontal line it isn’t turning discrete points on and off like a digital system. It just gets brighter and darker like a variable waveform. How precisely it does this depends on the quality of the set. A Sony will do this much, much better than say an Admiral, Samsung or Daewoo. You get what you pay for.

Check out the sites here or here.

Bzzt. Nope, HD can be interlaced or progressive. HD is actually many different formats & sizes. Example: you can have 1080p (1080 line progressive, i.e. non-interlaced) or 1080i (same, but interlaced), 720p/720i etc etc etc.
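A rough map of the common HD formats mentioned above (the widths are the usual ones paired with those line counts; I’m only listing the formats I’m sure of):

```python
# The HD label covers several distinct formats. Rough map of the
# common ones: (width, lines, and whether each frame is sent whole).
hd_formats = {
    "1080p": (1920, 1080, "progressive"),
    "1080i": (1920, 1080, "interlaced"),
    "720p":  (1280, 720,  "progressive"),
}
for name, (w, h, scan) in hd_formats.items():
    print(f"{name}: {w}x{h}, {scan}")
```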

The actual resolution of the image on your TV is nowhere NEAR 525 lines of resolution. DVD, for example, is only 480 lines (either progressive or interlaced, depending on your player and TV), but it looks much better than TV.

That’s because the resolution of TV is controlled more by the input stages to the TV. Standard NTSC probably winds up being around 350 lines of resolution or so. VHS tapes around 240. I can’t remember the numbers exactly. If you are getting an S-video signal into the TV, you’ll get a better theoretical resolution than if you’re just taking a composite video signal, for example.

If you don’t believe me, get a graphics card with ‘TV Out’, and plug it into your TV. Set your resolution to 640 x 480, and then look and see if the image on the TV is as clear as it is on your computer monitor.


An S-video connector doesn’t confer greater resolution. It prevents the Y/C signals from being encoded/decoded and thus adding noise to the signal. So you don’t get greater resolution using s-video connectors (we’re not talking about SVHS tape, but the s-video signal), but you do get a clearer picture.

As far as resolution: ALL (U.S.) TVs have 525 lines of resolution. It’s not just a good idea, it’s the law – the NTSC standard is very specific. However, there are 20 lines of blanking where some signal bits (including closed captioning) are kept; monkey with your vertical hold and look for the thick black line, and that’s the blanking area. Some of the rest is lost on your TV through overscanning (the picture is slightly larger than the screen), so only about 440 or so lines are usually visible. They’re all there however.

>> As far as resolution: ALL (U.S.) TVs have 525 lines of resolution

That’s a confusing and meaningless statement if I ever saw one. As Hail Ants (and I) explained, 525 lines/frame is the ratio of the horizontal and vertical scan frequencies, but you will never have 525 lines on the screen, as the electron beam is shut off for the duration of several lines while it returns from the end to the start of the screen.

In spite of the vagueness of your statement I think I can say it is wrong. TV sets do not “have” any number of lines. Their circuits are adjusted to display a number of lines on the screen (< 525) at certain vertical and horizontal frequencies whose ratio is 525.

At any rate and to address the OP: as others have said, a monitor will give you much higher quality because that’s what they’re built for. TV does not need it.

Sailor, the NTSC signal has 525 lines, period. You’re correct, not all of them are displayed. Which is what I’d said (OK, I agree saying the TV itself has 525 lines was misleading). But there are always 525 lines in the signal, and all TVs, by law, are built to display a signal with 525 lines.

And rereading your and Hail Ants’ statements, you seem to call this a ratio; I don’t take his/her statement that way. The other post agrees with mine that there are 525 lines but the horizontal resolution is variable (which I don’t dispute and didn’t address in my post).

One more for the good news/bad news category. The good news being that I was whooshed by only about half the posts on here, the bad half being, as I expected, that I’m stuck with crappy TV resolution.

Well, thanks anyway, guys. :slight_smile:

>> Sailor, the NTSC signal has 525 lines, period

Nope. That’s like saying I can speak French but I just can’t pronounce it. The time period between two frames equals the time period for 525 lines. That does not mean there are 525 lines. Besides, even supposing there were lines which were not displayed, what use would they be? Saying there are 525 lines in a frame is using a “line” as a unit of time, but it does not mean there are actually those lines in the signal.

IIRC there are something like 21 lines used for the vertical retrace and 241 actual lines (multiply by 2 for a frame). If you have better information I’d like to see it.

>> all TVs, by law, are built to display a signal with 525 lines

If it does exist (which I doubt) I am sure the interpretation is what I just said. Where can I see it? You got a cite?

Lines which do not carry data are just time intervals, not actual lines. Lines which carry data and are not displayed are actual lines as they could be displayed. But a line which carries no data and cannot be displayed cannot be considered a line of information.

The CCIR-601 spec is what determines all of this. See here, here, here, here, here, here, and even a PowerPoint presentation here.

There are 525 lines in the signal, 483 of which are ‘active’. You can probably see about 440 on your TV due to overscan. If you have a production broadcast monitor available (I do), you can hit the ‘underscan’ button on the front (if it has one, most do), and see the entire picture, all 483 lines, framed in black. You can even see the 40+ blanking lines (I noted this as 20 lines earlier, apologies) should you so wish, and even watch the timecode and closed captioning bits dancing on in the overscan area. It all exists.
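The line budget described above, as simple arithmetic (figures as given in the post: 525 total, ~21 blanking lines per field, ~440 surviving overscan):

```python
# NTSC line budget: 525 total line periods per frame, minus the
# vertical blanking interval in each of the two fields.
total_lines = 525
blanking_per_field = 21
active_lines = total_lines - 2 * blanking_per_field

visible_estimate = 440  # roughly what overscan leaves on a consumer set
print(active_lines, visible_estimate)  # 483 440
```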

Your point probably boils down to ‘how much of it is useful’ which is about 440 or so lines. What you said was ‘how much of it is real?’ The answer is ‘all of it’. Every line is in the signal, the television chooses how much to display, by using the (crucial) information in the blanking area, and overscanning just because that’s what TVs have customarily done (and it makes the set cheaper to build, I would guess).

Nonsense. Those 7 lines carry the information ‘here is where the frame begins’ just by existing. Your TV wouldn’t work without them, so this is important information for the device.

Ah, here’s another example of different displays of the same image caused by different display formats and various use of overscan. The web page topic is odd (a Thunderbirds cartoon??) but the image examples of what’s seen onscreen in different formats, plus the text on why this is done and what’s seen on a broadcast monitor, are (mostly) spot-on.

Aha! here’s an even better paper explaining screen size and overscan. See page 2 for the overscan part. It’s a PDF, you’ll need Acrobat.