I had this idea years ago when I first realized that radios are little more than glorified transformers. What I want to know is how practical would it be to try to collect energy from radio waves and other EM radiation near those wavelengths and lower? You hear radio stations talk about their 50,000-watt transmitters, can I reclaim a few hundred of those watts? What about natural EM radiation from space? How usable would it be?
I think that the actual device would be pretty simple to construct, just a large antenna connected to a series of diodes and capacitors to rectify the current and make it usable. Is this feasible?
PS, I know that this would be as much a “free energy” device as solar panels.
Muad’Dib, send me $15 and I’ll send you the plans.
If you feel ripped off, send $20 to sailor, and he’ll explain how to get your $15 back.
Seriously, though, it doesn’t work in practice. Even with a big antenna you can barely light an LED, and you’d need to stand under a 500 kV transmission line just to make a fluorescent tube glow.
The best “EM radiation from space” that you can get is the 1 kW/m² you get from sunshine, and you’re back to the solar panels.
You need to realise that the power you receive falls off with the square of your distance from the transmitter. And think about it: if the waves carried that much usable energy, why would radios need their own power supply just to amplify them?
This is possible, in the right circumstances, but also illegal. Some years ago someone who owned grazing land crossed by some of the main Washington-to-California transmission lines built a metal fence directly under the cables. The fence picked up a useful amount of power by induction, but the extra drain was easily detected by the power company…
Think of it this way - if a radio station emits 50,000 watts of power omnidirectionally, then if you’re a mile away, picture a big sphere with a radius of a mile centered on the station. All that energy is spread across the surface of that sphere. Now you set up your little 20 ft antenna and try to capture some of it. Even if you could capture all the power passing through your antenna and convert it back into electricity with 100% efficiency, just how much energy do you think you’d get?
Doing some rough math, with some simplifying assumptions (the Earth is flat and absorbs nothing, so the 50,000 W radiates out through a half-sphere), at 1 mile (5,280 ft) the station’s energy is spread over 2πr² ≈ 175 million square feet. Say your antenna covers 44 square feet and you have magical 100% conversion: you’d intercept about 44/175,000,000 of the output, which works out to roughly 13 mW.
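The arithmetic above can be sketched as a quick script (the 50 kW transmitter, 1-mile distance, and 44 ft² antenna are the figures from this thread; the lossless conversion and half-sphere spreading are the stated simplifying assumptions):

```python
import math

# Figures from the thread: 50 kW transmitter, receiver 1 mile away,
# 44 ft^2 antenna, lossless conversion, and all power radiating into
# a half-sphere above a flat, non-absorbing Earth.
P_tx_watts = 50_000.0
r_feet = 5280.0              # 1 mile
antenna_area_ft2 = 44.0

half_sphere_area_ft2 = 2 * math.pi * r_feet**2     # ~175 million ft^2
captured_watts = P_tx_watts * antenna_area_ft2 / half_sphere_area_ft2

print(f"Area at 1 mile: {half_sphere_area_ft2:.3g} ft^2")
print(f"Captured power: {captured_watts * 1000:.1f} mW")    # ~12.6 mW

# For comparison, sunlight at ~1 kW/m^2 falling on the same 44 ft^2
# (about 4.1 m^2) of collector area:
sun_watts = 1000.0 * antenna_area_ft2 * 0.3048**2
print(f"Sunlight on the same area: {sun_watts:.0f} W")      # ~4088 W
```

So the same patch of ground collects about 300,000 times more power as sunlight than as broadcast radio, which is why everyone keeps pointing back at solar panels.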
I did not mean the radio station to be a specific example. It’s just that if you go into the wilderness and tune a radio to any frequency you will hear static, which is energy that could be rectified into some amount of current. What about the whole spectrum? Roughly how much power is hitting me at wavelengths of 1 cm and longer? I don’t mean to insist this is a great idea; I just have no idea how much usable power those frequencies carry.
BTW, it has been a few years since I studied electronics, so please forgive my mangled use of the words energy and current.
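Nobody in the thread puts a number on the “static” question, but as a hedged back-of-the-envelope, the thermal (Johnson–Nyquist) noise power kTB is a reasonable stand-in for what a matched receiver hears on an empty channel, and it is astonishingly small. (The 290 K reference temperature and the bandwidth choices below are my own assumptions, not figures from the thread.)

```python
# Thermal noise power available to a matched receiver: P = k * T * B.
k_boltzmann = 1.380649e-23   # Boltzmann constant, J/K
T_kelvin = 290.0             # standard reference temperature (assumption)

# AM broadcast channel, FM broadcast channel, and a full 1 GHz span:
for bandwidth_hz in (10e3, 200e3, 1e9):
    p_watts = k_boltzmann * T_kelvin * bandwidth_hz
    print(f"B = {bandwidth_hz:.0e} Hz -> {p_watts:.2e} W")

# Even integrating across a full gigahertz of spectrum, thermal noise
# delivers only about 4e-12 W -- a millionth of a millionth of a watt.
```

Man-made signals and atmospheric noise add to this floor, but it shows why rectifying static gets you picowatts, not watts.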