Why not just use coax for TV?

Part of this is IMHO, but there may be a technical answer. It seems like all cable signals for all channels, both digital and analog, SD and HD, come over coax cable, which also carries audio. It seems like a coax cable can carry everything needed for display, and it's cheap. So why do we go from the coax cable to the various other cables - HDMI, component, AV - between the cable ‘converter box’ and the TV?

Because coax carries a highly compressed signal, and the converter box uncompresses it.

This is only partially true. In the days before digital TV, the converter box was just a tuner, and it extracted the TV signal out of hundreds of other signals being transmitted simultaneously on different frequency bands. Now, converter boxes are also decoders/decrypters, re-constituting the digital bitstream.
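
To make concrete what "extracting one signal out of hundreds" means, here is a minimal sketch of frequency-division multiplexing, the scheme analog cable uses: every channel rides the same wire on its own carrier frequency, and tuning is just mixing the carrier you want down to baseband and filtering the rest away. The carrier frequencies and plain AM modulation below are toy values of my own, not real cable channel assignments:

```python
import numpy as np

fs = 100_000                      # sample rate of this toy simulation, Hz
t = np.arange(0, 0.05, 1 / fs)    # 50 ms of signal

# Three "channels": each a baseband program on its own carrier. Plain AM,
# purely for simplicity (real analog TV uses vestigial-sideband AM video
# plus an FM sound carrier).
programs = [np.sin(2 * np.pi * 300 * t),
            np.sin(2 * np.pi * 500 * t),
            np.sin(2 * np.pi * 700 * t)]
carriers_hz = [10_000, 20_000, 30_000]

# Everything is on the cable at once, on different bands.
cable = sum(p * np.cos(2 * np.pi * fc * t)
            for p, fc in zip(programs, carriers_hz))

def tune(signal, fc, t, fs, cutoff=2_000):
    """Mix the chosen carrier down to 0 Hz, then low-pass away the rest."""
    mixed = signal * np.cos(2 * np.pi * fc * t)    # heterodyne to baseband
    spectrum = np.fft.rfft(mixed)
    freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
    spectrum[freqs > cutoff] = 0                   # crude brick-wall low-pass
    return 2 * np.fft.irfft(spectrum, len(mixed))  # x2 undoes the mixing loss

recovered = tune(cable, carriers_hz[1], t, fs)     # "watch channel 2"
print(np.corrcoef(recovered, programs[1])[0, 1])   # ~1.0: channel 2 comes back
```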

However, none of this is the reason not to use coax.

The real reason is partly historical and partly quality-related. Before digital TV, TV was one of the lowest-quality signal sources. LaserDiscs, DVDs, Beta, and Blu-ray all had much better quality, so it made sense to use a high-fidelity interconnection between components. Taking those high-quality signals and squeezing them into the SDTV bandwidth would lose much of the video quality available. Also, it would mean that every device would need its own tuner (say you had a projector and a surround-sound system - both would need a tuner to know what channel you were watching).

These days, cable is capable of pretty high quality, but the tuner issue still exists, and there is no real advantage to dropping all the existing interconnection systems, so they stay.

At my parents’ house they have 3 televisions with cable. One has a converter box for all their digital and HD content and the other 2 just have coax from the wall to the TV. This is how it worked before they had the digital box, too. At some point we had a cable box but eventually did not need it anymore.

Comcast just changed this where I live. We used to get about 30-40 analog channels without a cable box; now it is about 10, mostly the local network stations. They sent us a digital adaptor (not a full-on cable box) to allow access to the other basic cable channels, which all went digital.

IANATVG (I am not a TV guy) but…
IIRC, the broadcast TV signal is modulated so that it is basically a black and white (luminance) signal, which is why it's compatible with old black and white TVs. For colour information, there is a short colour burst on each scan line (just after the horizontal sync pulse) that gives the receiver a reference for the colour subcarrier.

This is the trade-off - the colour information gets much less bandwidth than the black and white information. It takes advantage of a typical characteristic of colour images: while you can construct rainbow-burst test patterns, a typical image consists of large swaths of a single colour. You can get a realistic colour picture by overlaying very imprecise colour info onto a sharp black and white picture.
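
To see that this actually works, here is a minimal sketch (my own illustration, not real NTSC encoding - there is no subcarrier or burst here, just the bandwidth asymmetry): keep the black and white channel at full resolution, throw away most of the horizontal detail in the colour channels, and rebuild the picture. The black and white part survives exactly; the error is confined to the block containing the colour edge.

```python
import numpy as np

def rgb_to_yuv(rgb):
    """BT.601 luma plus simple (unscaled) colour-difference channels."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b   # the black and white signal
    return y, b - y, r - y                  # luma, B-Y, R-Y

def yuv_to_rgb(y, by, ry):
    """Invert the transform above; luma is preserved exactly."""
    r = y + ry
    b = y + by
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.stack([r, g, b], axis=-1)

def smear(chan, factor=8):
    """Crudely discard horizontal colour detail: average over blocks."""
    coarse = chan.reshape(*chan.shape[:-1],
                          chan.shape[-1] // factor, factor).mean(-1)
    return np.repeat(coarse, factor, axis=-1)

# One synthetic scan line with a hard colour edge at pixel 30.
line = np.zeros((1, 64, 3))
line[:, :30] = [0.9, 0.2, 0.2]   # red
line[:, 30:] = [0.2, 0.2, 0.9]   # blue

y, by, ry = rgb_to_yuv(line)
rebuilt = yuv_to_rgb(y, smear(by), smear(ry))  # sharp luma, coarse colour
err = np.abs(rebuilt - line).max(-1)
print(err.max(), err[:, :24].max())  # big only near the edge; ~0 elsewhere
```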

This fell down as a good system when computers came along. They needed sharp edges and huge colour shifts between adjacent pixels. What was acceptable as a television signal back then (640x480 or so; even worse for VHS) was not good enough for computer use. So instead of cramming the colour onto a subcarrier within the luminance signal, as the yellow composite “video” cable does, the next logical step was separate luminance and chrominance signals - IIRC, this is what S-Video does. Of course, since this information is ultimately decoded to drive the RGB (red, green, and blue) signals, the next logical step after that was a separate output for each gun directly from the source.
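
One wrinkle: the consumer “component” cables actually carry YPbPr - luma plus two colour-difference signals - rather than raw RGB, and the display does the final matrix out to the guns. That decode is just a fixed linear mapping; here is a sketch with the standard BT.601 coefficients, assuming Y normalised to [0, 1] and Pb/Pr to [-0.5, 0.5]:

```python
def ypbpr_to_rgb(y, pb, pr):
    """Component (YPbPr) to RGB, standard BT.601 coefficients."""
    r = y + 1.402 * pr
    g = y - 0.344136 * pb - 0.714136 * pr
    b = y + 1.772 * pb
    return r, g, b

print(ypbpr_to_rgb(1.0, 0.0, 0.0))   # pure white: all three guns full on
print(ypbpr_to_rgb(0.5, 0.0, 0.3))   # positive Pr pushes the red gun up
```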

The final stage is to send digital versions of these signals from the source straight through to the display (HDMI), ensuring no quality loss or interference along the way.

Finally - 3 signals plus sync is a LOT of information to send, especially at hi-def. 1920x1080 is about 6 times as many pixels as 640x480, and you are sending that much info for 3 signals (RGB), so it's roughly 18 times the information. Broadcast the same way, it would take 18 normal channels to carry 1 hi-def channel. So new HDTV is compressed using MPEG compression into a substantially smaller signal.
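
A quick check of that arithmetic (the pixel counts are exact; the ~19.4 Mbps figure is the standard payload of one ATSC digital broadcast channel, and 8-bit RGB at 30 frames per second is my stand-in for "uncompressed"):

```python
sd_pixels = 640 * 480                # ~0.3 Mpixel per frame
hd_pixels = 1920 * 1080              # ~2.1 Mpixel per frame
print(hd_pixels / sd_pixels)         # 6.75: the "about 6 times" above

raw_bps = hd_pixels * 3 * 8 * 30     # uncompressed 8-bit RGB HD: ~1.5 Gbit/s
atsc_bps = 19.39e6                   # one digital channel's payload
print(raw_bps / atsc_bps)            # ~77:1 - the job MPEG compression does
```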

Of course, within a few feet, it's cheaper and simpler to have one device decode the signal and pass it on to everything else than to pass it around fully encoded and put a decoder in each device. There's a whole “hi-fi audiophile” community dedicated to determining which decoders work best, so it's simpler to leave that choice up to the buyer.

Coax is just a medium. It's a single strand of copper. The long and short of it is that cables like HDMI or CAT5 have multiple strands of copper, so you multiply your available bandwidth and can dedicate pairs to full-duplex communication. Signal engineers and the people who write data-transmission standards prefer a multiple-pair approach for many applications.

Component/composite are more of a special case. They're super-dumb interfaces which split analog color and audio across separate cables. That means very little processing for the TV and the source device to do. Back in the 1980s this was really the only affordable choice. Now we can do digital-to-analog conversion quickly and cheaply, hence standards like HDMI. Not to mention, HDMI has DRM, which is a big part of why it's so popular - the movie studios dictate standards for their own needs.

Related question, how does the phone company do it? We get our internet, phone service, and HD television plus service for two other TVs through the phone line. Sure, they have fiber optic to the street, but not fiber optic to the house. How does that phone line handle all this?

If it's U-verse, then they use VDSL, which is a data connection over a copper pair.

http://en.wikipedia.org/wiki/Very_high_bitrate_digital_subscriber_line#United_States

So it's a 20-50 Mbps pipe to your home, with x percent for internet and x percent for digital TV.
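
As a rough sketch of how that one pipe could be divided for the setup described above (one HD set plus two others) - the sync rate and per-stream bitrates here are my assumptions, not U-verse's actual provisioning:

```python
pipe_mbps = 25.0     # assumed VDSL sync rate
hd_stream = 6.0      # one H.264 HD stream, roughly
sd_stream = 2.0      # one SD stream, roughly

tv_mbps = 1 * hd_stream + 2 * sd_stream   # the HD set plus the two other TVs
print(f"TV: {tv_mbps} Mbps, internet: {pipe_mbps - tv_mbps} Mbps")
# TV: 10.0 Mbps, internet: 15.0 Mbps
```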

With all due respect, my signal, with 500 channels (I haven't actually counted them; there may well be more), up to 100 of which might be HD, comes in on that single coax cable. Out of the cable box comes just one channel. You cannot have more bandwidth coming out than going in, so that coax cable must be carrying at least 500 times the bandwidth a single channel needs.
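
The arithmetic backs this up. A rough sizing, assuming a 54-750 MHz cable plant (a common older build; newer ones reach 1 GHz), US-style 6 MHz RF slots carrying 256-QAM at about 38.8 Mbps each, and ~3.75 Mbps per SD MPEG-2 stream (that last figure is my assumption):

```python
plant_mhz = 750 - 54                       # usable downstream spectrum, MHz
rf_slots = plant_mhz // 6                  # 116 six-MHz RF channels
per_slot_mbps = 38.8                       # 256-QAM payload per 6 MHz slot
sd_per_slot = int(per_slot_mbps // 3.75)   # ~10 SD streams per slot
print(rf_slots, rf_slots * sd_per_slot)    # 116 slots -> ~1160 SD channels
```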