Heh, I was thinking he meant windows!

Cable was needed for reasons of limited bandwidth in the public airwaves and the ability to charge you for “premium” services.
I believe Cable was developed so that communities in canyons, valleys, etc. could receive television signals that were extremely weak or non-existent when transmitted “line-of-sight” from traditional tower/transmitters.
When CATV (Cable TV) first appeared on the scene I don’t think there was a problem with limited bandwidth, because there just weren’t that many broadcast stations back then.
So, CATV providers had a lot of empty bandwidth in the early days. They even chose to fill some of it with “filler” like WTBS-Atlanta and WGN-Chicago. We think of it as normal today, but really -- why would someone in San Diego want to watch local Chicago or Atlanta news? Another example of Invention being the mother of Necessity.
It certainly expanded the Cubs and Braves fan-base though! And made mucho bucks for Ted Turner and others.

The power density will eventually fall off as the square of the distance, but that doesn’t mean that every method of beaming down power will be as inefficient as broadcasting in all directions. A narrow beam that broadens as the square of the distance can still be a relatively narrow beam when it hits. Even laser beams, for instance, lose intensity with the square of the distance once you get well beyond the Rayleigh range, yet no one would deny that lasers are still tightly focused beams with little spread even that far from the source.
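For anyone who wants to see it with numbers, here’s a rough sketch of the standard Gaussian-beam formulas. The wavelength and waist size are just illustrative picks, not anything from this thread:

[code]
import math

# Illustrative numbers only: a red laser with a 1 mm waist radius.
wavelength = 633e-9   # m
w0 = 1e-3             # beam waist radius, m

# Rayleigh range: distance over which the beam stays roughly collimated.
z_R = math.pi * w0**2 / wavelength   # about 5 m here

def spot_radius(z):
    """Gaussian beam radius at distance z from the waist."""
    return w0 * math.sqrt(1 + (z / z_R) ** 2)

for z in (0, z_R, 10 * z_R, 100 * z_R):
    w = spot_radius(z)
    # On-axis intensity scales as 1/w^2, i.e. roughly 1/z^2 once z >> z_R,
    # yet the spot is still only ~10 cm wide half a kilometer from the source.
    print(f"z = {z:7.1f} m   spot radius = {w * 1e3:7.2f} mm   relative intensity = {(w0 / w) ** 2:.4f}")
[/code]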
The original plan for the orbiting power satellites did indeed envision beaming the power down to earth as microwaves.
This Wikipedia article confirms that you are quite correct about the microwave transmission proposal. Of course, in orbit a huge reflector could be used, and quite large antennas, say 40 m in diameter, could be used on earth. Taking all of this into account, for a 1 megawatt station (quite small by present standards) the power density at the receiving antenna would be 80 mW/cm[sup]2[/sup]. That’s way too high for continuous exposure, but it doesn’t look like it would immediately fry you if you flew through it. This computation assumed 100% efficiency at the receiving antenna.
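The 80 mW/cm[sup]2[/sup] figure is easy to reproduce. A quick back-of-the-envelope check, assuming (as above) that the whole 1 MW lands uniformly on the 40 m dish with no losses:

[code]
import math

power_w = 1e6             # 1 MW station
dish_diameter_m = 40.0    # receiving antenna on the ground

# Dish area in cm^2: pi * r^2, with r = 20 m = 2000 cm.
area_cm2 = math.pi * (dish_diameter_m * 100 / 2) ** 2   # ~1.26e7 cm^2

# Assume all of the beamed power falls uniformly on the dish (100% efficiency).
density_mw_per_cm2 = power_w * 1000 / area_cm2
print(f"~{density_mw_per_cm2:.0f} mW/cm^2")   # ~80 mW/cm^2
[/code]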
This seems to make microwave power transmission from satellites at least worth looking into, and that’s what happened.
Oh yes. The antenna beam width for a geosynchronous satellite transmitter would be about 1.1 microradians. My shirttail guesstimate of the transmitter antenna size is 11 kilometers.
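For what it’s worth, the 1.1 microradian figure falls straight out of the geometry, and the transmitter size then depends entirely on what frequency you assume. The frequencies below are purely illustrative (2.45 GHz is the value usually quoted in microwave power transmission studies), so this is only a rough check, not a claim about what was assumed above:

[code]
import math

geo_altitude_m = 35_786_000   # geosynchronous orbit altitude, m
rx_dish_m = 40.0              # ground antenna diameter from the post above

# Beam width needed to just cover the ground antenna.
beam_width_rad = rx_dish_m / geo_altitude_m
print(f"beam width ~ {beam_width_rad * 1e6:.1f} microradians")   # ~1.1 urad

# Diffraction-limited transmitter aperture: D ~ 1.22 * wavelength / theta.
for freq_ghz in (2.45, 24.5):
    wavelength_m = 3e8 / (freq_ghz * 1e9)
    d_tx_km = 1.22 * wavelength_m / beam_width_rad / 1000
    print(f"{freq_ghz:5.2f} GHz -> transmitter aperture ~ {d_tx_km:.0f} km")
[/code]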
Similar to what scr4 and Shalmanese are talking about, there is a device called the Splashpower Splashpad, basically a plastic pad that wirelessly charges gadgets placed on it. Unfortunately, it’s already been a few years in the making and I don’t think it’s on the market yet.
I believe Cable was developed so that communities in canyons, valleys, etc. could receive television signals that were extremely weak or non-existent when transmitted “line-of-sight” from traditional tower/transmitters.
That is correct. CATV, when first introduced, meant “Community Antenna TeleVision.”
When CATV (Cable TV) first appeared on the scene I don’t think there was a problem with limited bandwidth, because there just weren’t that many broadcast stations back then.
Also correct.
So, CATV providers had a lot of empty bandwidth in the early days. They even chose to fill some of it with “filler” like WTBS-Atlanta and WGN-Chicago. We think of it as normal today, but really -- why would someone in San Diego want to watch local Chicago or Atlanta news? Another example of Invention being the mother of Necessity.
Not so correct. Early CATV systems, because there was a dearth of channels, weren’t built with any significant excess capacity. The limited content meant there was no need to develop the technology necessary to carry signals at the vastly higher frequencies used today. So technology, or rather the immaturity of the technology, played the key role. Each analog channel on a CATV system requires 6 MHz of bandwidth. Since there were comparatively few channels to carry, the first CATV systems carried signals with an upper limit of less than 100 MHz. This has an interesting effect on system architecture: signal levels at higher frequencies drop off much faster on coax than signal levels at lower frequencies, so the low upper limit permitted very long runs between amplifiers, and fewer amplifiers means less noise at the end of the line.
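As a rough illustration of what that upper limit means in channel terms (the 54 MHz lower band edge and the assumption of a contiguous band are simplifications; real band plans have gaps):

[code]
CHANNEL_BW_MHZ = 6      # one analog NTSC channel
LOWER_EDGE_MHZ = 54     # assumed bottom of the forward band (illustrative)

# Roughly how many analog channels fit under various system upper limits,
# ignoring the gaps (FM band, etc.) that real band plans carve out.
for upper_mhz in (100, 450, 550, 750, 870):
    channels = (upper_mhz - LOWER_EDGE_MHZ) // CHANNEL_BW_MHZ
    print(f"{upper_mhz:4d} MHz system: roughly {channels} analog channels")
[/code]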
For the first 20 or so years of CATV, the limited bandwidth didn’t matter a whole lot since there just wasn’t much content to carry. As more channels became available, CATV systems had to adapt by carrying signal at ever-increasing frequencies - up to 450 MHz in many cases and up to 550 MHz in some. And as upper frequencies increased, the reach between amplifiers had to decrease. This led to some very long strings of amps - often as many as 40 in a cascade. With the longer cascades, re-amplified noise started to become a problem; the S/N ratio at the end of the lines became unacceptable.
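The cascade penalty is easy to put numbers on. A rough sketch, assuming identical amplifier spans and an illustrative 60 dB carrier-to-noise after a single span (the usual 10*log10(N) rule of thumb for identical cascaded amps):

[code]
import math

# With N identical amplifier spans, each contributing the same noise, the
# carrier-to-noise ratio at the end of the cascade is roughly
#   C/N_cascade = C/N_single - 10 * log10(N)   [dB]
CN_SINGLE_DB = 60.0   # assumed C/N after one span (illustrative only)

for n_amps in (1, 7, 20, 40):
    cn_db = CN_SINGLE_DB - 10 * math.log10(n_amps)
    print(f"{n_amps:2d}-amp cascade: C/N ~ {cn_db:.1f} dB")
[/code]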
To solve this noise problem, it became imperative that amplifier cascades be reduced. To do this, we needed to enter the age of fiber optics. The typical CATV system architecture is now HFC - Hybrid Fiber-Coax. Fiber optic conductors are run to small pockets of subscribers - usually between 200 and 1000. There, an opto-electric converter, most often called a fiber node, is placed. From this node, signal is placed on the coaxial conductors at frequencies all the way up to 1 GHz (750 or 870 MHz is more common, tho’). And since we’re only serving a small pocket of homes, amp cascades are much reduced - to fewer than Node+7 in most systems, and even down to Node+1 in some.