Quick question about using an SDTV as a monitor

Say I buy this or something similar.

Now, using an SDTV I know I’ll have to set the resolution to 640x480. My question is: is there a limit to the color depth? Can I use 24-bit, or would it have to be lower?

No, there is no color-depth limit. But even at 640x480, your image quality is going to suffer. SDTVs in the U.S. are actually 704x480, as they use rectangular pixels. The device should convert for you, but things are going to look somewhat blurry and may flicker.
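To see why 704x480 with rectangular pixels still fills a 4:3 screen, here's a quick sketch. The 10:11 pixel aspect ratio is the commonly cited figure for 4:3 NTSC video; it isn't stated in the thread, so treat it as an assumption for illustration:

```python
# NTSC SDTV stores 704x480, but each pixel is narrower than it is
# tall (pixel aspect ratio ~10:11 for 4:3 video), so the picture
# still displays at a 4:3 shape.
STORAGE_W, STORAGE_H = 704, 480
PAR = 10 / 11  # assumed: commonly cited NTSC 4:3 pixel aspect ratio

display_w = STORAGE_W * PAR          # 704 * 10/11 = 640 square-pixel widths
aspect = display_w / STORAGE_H       # 640 / 480
print(f"effective width: {display_w:.0f}, aspect ratio: {aspect:.3f}")
```

That's also why a computer's square-pixel 640x480 output maps onto an SDTV reasonably cleanly, even though the set's native sampling is wider.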

But a CRT TV has no color limitations. In fact, it’s the high color that allowed it to hide its rather low resolution.

I don’t think so. If you use either the RCA video connector or the S-video (separate luminance and chroma channels), you’ll be sending an analog signal so color depth should be irrelevant as the digital color (chroma) encoding will be converted to analog in any event.

Resolution matters mainly because the vertical and horizontal sweep signals have to correspond to what the TV set can display - IOW, the signals that tell the electron guns to start a new scan line or a new frame.

OK, thank you both. And yeah, I know the graphics will be crummy. That’s OK though.

The older TVs had a problem with chromatic blur. If you ever see an example of a broadcast signal, you have the B&W intensity signal for each line, followed by a spike which is the horizontal sync (ah, the good old days, when that sync was used by the electronics to ensure a sawtooth wave generator triggered on cue to create the horizontal sweep). Buried in the shoulder of the horizontal sync was a burst of colour information (chroma). Since the colour burst signal was much shorter, the horizontal resolution of colour was horrible. You tended to get bleed between colours; picture quality was not good, but then broadcast TV rarely was.

As a result, with really old tube TVs, since the signal was going to be crap anyway, the cheaper sets put little effort into electronics for cleaning up the signal. There’s a reason why VIC-20s had 22 characters across the screen, and why the Commodore 64, Atari, etc. from the 1980s that expected you to use a TV would be unreadable if they exceeded their designed limit of 40 characters per line. Running the signal through a cheap channel 3 or 4 modulator did not help matters.

S-video at least separates this into distinct chroma and luminance signals, so you get equal information from each. The R-G-B signals from a computer VGA connector, or those red-green-blue video inputs, provide significantly more precise information; odds are a more modern TV with those inputs also has circuitry capable of displaying the same resolution. 740 pixels across means you have 9.25 pixels per character in a standard 80-character line, including white space. Not bad, but you’d better hope every pixel is sharp.
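The pixels-per-character arithmetic above can be checked quickly; widths other than the 740 quoted above are added here purely for comparison, since published SDTV active-line widths vary (640, 704, and 720 all appear in different standards):

```python
# Pixels available per character column on an 80-column text screen,
# at a few commonly quoted SDTV/VGA horizontal widths.
COLUMNS = 80

for width in (640, 704, 720, 740):
    per_char = width / COLUMNS
    print(f"{width} px wide -> {per_char:.2f} px per character")
```

At 740 px that works out to the 9.25 figure in the post; at VGA's 640 px you only get 8 px per character, glyph plus inter-character gap included, which is why 80-column text on a TV is so marginal.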

FWIW, I had my computer hooked up to an SDTV for years via S-Video, functioning as a TV. The desktop display left a lot to be desired - it was hard to read, and Web browsing was bad at that resolution - but video files were sharp as a tack.

Since the computer I’m going to use is older it uses an AGP video card, so I found one for $9.50 (plus shipping) that has an S-Video and composite video output.

I am mainly going to hook it up for things like YouTube or other videos. Even then, I’ll probably care more about the audio.

That’s exactly what I did, and how I could answer your question.

And, just in case it isn’t obvious, audio’s pretty easy: plug an RCA-to-stereo adapter into your sound card.

And if you’re just using it for video, you don’t really have to use the lower resolutions. The card will convert for you. I personally use 800x600 (the lowest my card will support) with 125% large fonts.

Thank you. And yeah, I already have the stereo to RCA adapter.