Batteries suck. AC Adapters suck. When are we getting wireless power?

IIRC, a lot of the power-line studies didn’t control for confounding factors. In many areas, houses near power lines tend to be less desirable, lower-income properties, and Lord knows what else is creeping into your data.

Apart from having DC in your house, which I agree is not practical, I DO think the power bricks could be one hell of a lot more standardized than they are. There is no reason why they could not be a commodity item one bought at a hardware store rather than requiring a specific one for each gadget. A few standard voltages and power capacities should suffice WITH A STANDARDIZED !!*$&## CONNECTOR. You would think the manufacturers of gadgets would LIKE this idea, as it could get them away from having to package a power adapter with their product - just as with “batteries not included”, the package could say “power adapter not included; requires a 4.8 V DC, 500 mA or greater adapter”.

I was at a demonstration once where they had a Tesla coil and used it to power fluorescent tubes wirelessly, and someone asked why this system wasn’t in wider use. So the presenters turned on a radio and showed how the resulting static drowned out any signal.
I haven’t looked into it, but I suspect that wireless power for your laptop would ruin WiFi for your laptop.

Besides, broadcast power’ll give you something like myasthenia gravis, at least in Heinlein’s universe – didn’t you read Waldo?

Well, it actually does work for Google, and would work for a lot of datacenters. What they’ve done is eliminate the single large battery backup for the datacenter in favor of a battery on each server. What this means in the context of this discussion is that this allows them to convert from AC to DC at a single point, and distribute DC directly to each server (charging/powering the batteries and the server components without additional conversions). Typically, AC comes into a datacenter, is converted to DC to go through a huge battery backup power supply (UPS), converted back to AC to send to each server’s power supply, wherein it is once again converted to DC for use by the motherboard and components. Google has found that, despite the supposed transmission inefficiencies of DC, it is far better than the repeated conversion from AC to DC and back. Their datacenters house thousands of servers in a relatively small space, though, so while it may be more efficient there, I don’t know if it would work for a typical house. I don’t know if Larry Page was advocating it in the home or not, but it certainly works for Google and other places with heavy IT infrastructure.
Cite: ZDNet, Google
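The arithmetic behind that gain is easy to sketch. The per-stage efficiencies below are round assumed numbers for illustration, not Google’s actual figures:

```python
# Why eliminating the double conversion helps: losses compound
# multiplicatively across every AC/DC conversion stage.
# (Per-stage efficiencies are assumed round numbers, not measured values.)

def chain_efficiency(stages):
    """Multiply per-stage efficiencies to get end-to-end efficiency."""
    eff = 1.0
    for s in stages:
        eff *= s
    return eff

# Traditional path: AC -> DC (UPS rectifier) -> AC (inverter) -> DC (server PSU)
traditional = chain_efficiency([0.94, 0.94, 0.90])

# Google-style path: one AC -> DC conversion, DC distributed to each server
single = chain_efficiency([0.94])

print(f"traditional: {traditional:.0%}, single conversion: {single:.0%}")
```

Even with each stage in the 90-95% range, three conversions compound into roughly a 20% loss before the power ever reaches a motherboard.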

You might be surprised. The Mac Mini uses 37 W and can drop as low as 8-10 W for tasks like web browsing. My office has a server with a power supply rated for 108 W. Laptops generally have converters rated in the 60-90 W range.

Obviously, none of these are optimal for video rendering, gaming, heavy-duty server applications, etc. but they do serve a large segment of the computer population quite well.

If we did convert to a 180W maximum, I think people would also find clever ways around it. For example, there’s no reason you couldn’t do your hard-core gaming or video rendering using a thin client, with the heavy processing done remotely in the cloud. Some companies are already selling server/computing services in a cloud-based model with fees based on computer utilization.

Not that this makes wireless power itself any more feasible…

What’s the deal with solar here? Its inefficiency?

Not to get too offtopic, but that’d require quite a bit of bandwidth and for gaming, very low latency. So far processing and local electrical power are both cheaper than fast Internet connections.

Yep, it works great in datacenters where the wire length isn’t all that terribly long. Low voltage DC is often used in manufacturing plants as well, for several cabinets of equipment all in a local vicinity to each other. I deal with a lot of 24 and 48 VDC equipment at work.

There is a big difference between a datacenter, where you have racks and racks of equipment all in close proximity to each other, and an entire building, where the wire runs are much longer.
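The reason wire length matters so much is I²R loss: for a fixed load, a tenth the voltage means ten times the current and a hundred times the resistive loss in the same wire. A toy comparison, with an assumed round number for the wire resistance:

```python
# Toy I^2*R comparison of delivering 100 W over a long run at
# 12 V DC vs. 120 V AC. The wire resistance is an assumed value.

def line_loss_w(power_w, volts, wire_ohms):
    """Resistive loss in the wire for a given delivered power and voltage."""
    current = power_w / volts  # lower voltage -> higher current for same power
    return current ** 2 * wire_ohms

wire_ohms = 0.5  # assumed round-trip resistance of a long in-wall run
for v in (120, 12):
    print(f"{v:3d} V: {line_loss_w(100, v, wire_ohms):.1f} W lost in the wire")
```

That 100x penalty is why low-voltage DC works fine within a rack or a cabinet but not across a whole building.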

It bears repeating here that all of the methods of wireless power transmission that have been demonstrated take one of two forms: On the one hand, you’ve got radiative methods, which, in order to get enough power to run something like a computer, would cook everything in the room. There are proposals to get power from orbit to the ground using a system like this, but you’d have to be extremely careful how you aimed your beam to hit nothing but the receiver, and I doubt any country would be comfortable with other countries having technology so easily used for violence.

On the other hand, there are some near-field induction methods that have been demonstrated. These can be made safe enough, and relatively efficient, but the problem with them is that you need huge antennas to make them work (at least comparable in size to the distance between source and receiver). Now, there are some practical applications for this (for instance, a pad you can set your cell phone down on top of to charge it, without needing to bother with any plugs), but if you wanted to transmit power to your laptop from, say, 20 feet away, you’d need an antenna 20 feet or more across attached to your laptop.
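To put a rough number on that falloff: for two small coaxial coils, mutual inductance drops roughly as 1/d³ once the separation exceeds the coil radius, and transferred power scales with M², i.e. roughly 1/d⁶. A toy model (idealized; the coil radius and the near-field cutoff are assumptions):

```python
# Toy illustration of why near-field induction needs coils comparable
# in size to the transfer distance: coupled power falls off roughly as
# 1/d^6 beyond about one coil radius. Idealized model, assumed values.

def relative_power(distance, coil_radius):
    """Coupled power relative to the close-range value (toy 1/d^6 model)."""
    if distance <= coil_radius:
        return 1.0  # within roughly a coil radius, coupling stays strong
    return (coil_radius / distance) ** 6

for d in (0.1, 0.5, 1.0, 2.0, 6.0):  # metres, with a 0.1 m coil
    print(f"{d:4.1f} m: {relative_power(d, 0.1):.2e}")
```

With a phone-sized coil, moving from contact range to a few metres costs many orders of magnitude of coupling; the only fix is making the coils themselves huge, which is exactly the antenna-size problem described above.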

Yes.

With *current* technology, PV-based solar panels are only useful for supplying energy where grid power is not available (e.g. a remote mountaintop), or for low-power devices where plugging into an AC outlet is inconvenient (e.g. a calculator). PV arrays should ***never*** be used for large-scale power distribution; they destroy good coal and good oil that could otherwise be put to good use. Some countries have found this out the hard way.

In the future it is possible someone will invent a solar panel that generates net energy. That day may come. Or it may not.

As I said a few posts earlier, they certainly could do this. But the companies that sell the power bricks would make a lot less money. So why would they want to do that?

They produce net energy now, in good conditions, just not very much, and they do it extremely slowly. Basically, you need to keep the panel running for decades to break even.

I think decades ago it took a couple of decades for energy payback. More recently I’ve read that energy payback is a matter of years.
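For what it’s worth, a back-of-envelope estimate with round numbers lands in the “years” range. All three inputs below are assumptions; real figures vary widely by panel type and climate:

```python
# Back-of-envelope energy payback estimate. Every input is an assumed
# round number; actual values depend on panel type and location.

embedded_energy_kwh_per_m2 = 1000   # assumed energy to manufacture 1 m^2 of panel
panel_output_w_per_m2 = 150         # assumed panel output under full sun
full_sun_hours_per_day = 4          # assumed average for a mid-latitude site

annual_kwh_per_m2 = panel_output_w_per_m2 / 1000 * full_sun_hours_per_day * 365
payback_years = embedded_energy_kwh_per_m2 / annual_kwh_per_m2
print(f"{payback_years:.1f} years to repay the manufacturing energy")
```

These inputs give a payback of a few years; more pessimistic assumptions stretch it toward a decade or more, but nowhere near the panel’s lifetime.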

What I learned was in line with what Chronos and johnpost said. Are you saying that’s wrong?

I don’t understand… how does solar destroy oil and coal? You mean the energy it takes to make them? I thought the modern stuff does pay itself back – both in terms of money and embedded energy – within a decade or two.

Well, statistically, they do tend to be poorer, do they not?

Are you including the batteries?

Actually, I am using OnLive’s service right now. I was part of their beta and I am part of the founding members program.

I was skeptical before I joined, and like most people I thought there would be too much latency and that it would require too much bandwidth.

I was really surprised by how responsive the controls were and how clear the image was. That’s not to say there aren’t any faults. BTW, I live in Ohio and the data center I connect to is in D.C. I think that’s pretty impressive, though most lag comes in the last mile anyway.

Currently it does not support WiFi, has a max resolution of 720p@30fps (5 Mbps), doesn’t allow customization of game files or most video/audio settings, does not allow any software to be installed locally, does not allow you to transfer games you already own to their service, and does not have a wide selection of games.

Though all of these problems should be addressed in the coming months.

OnLive does work with WiFi if you set it up with a bridge. Not exactly ideal. Most who use it claim that it works fine, but I have not personally tested it. I suspect it has something to do with how standard routers transmit wireless data; the omnidirectional, low-power antennas are probably the main problem. I’ve read that they’ve given good demonstrations with the iPhone and iPad using 4G, and that’s AT&T! Though it was at a much lower resolution, and I imagine they had a personal cell site nearby. Wireless does seem to be the main chink in OnLive’s armor. I still like the idea of never upgrading my PC again to play a PC game, and of having all my games available when I visit my mom, even though she has a crappy computer and I have a crappy laptop (well, one not capable of gaming).

OnLive should be able to handle 1080p@60fps with 15 Mbps (my current downstream bandwidth), though kinks are still being worked out (there were some problems with artifacting in the beta). If we manage to get some decent bandwidth at some point we’ll probably be capable of 1080p@120fps and 3D. I plan on attempting to move to wherever Google sets up its 1 Gbps ISP.
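Naively scaling the 720p@30fps / 5 Mbps figure linearly with pixel rate gives a rough upper bound on the bandwidth needed (real codecs scale sub-linearly, so treat these as pessimistic estimates):

```python
# Naive bitrate scaling from the 720p@30fps ~= 5 Mbps figure, assuming
# bitrate grows linearly with pixels per second. Real codecs do better
# than linear, so these are upper-bound estimates.

base_mbps = 5
base_pixel_rate = 1280 * 720 * 30

def estimated_mbps(width, height, fps):
    """Bitrate estimate assuming bitrate scales with pixels/second."""
    return base_mbps * (width * height * fps) / base_pixel_rate

print(f"1080p60:  ~{estimated_mbps(1920, 1080, 60):.1f} Mbps")
print(f"1080p120: ~{estimated_mbps(1920, 1080, 120):.1f} Mbps")
```

The linear estimate for 1080p60 comes out above 15 Mbps, so hitting that target at 15 Mbps would rely on the codec spending fewer bits per pixel at higher resolutions, which is typical in practice.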

Batteries are only necessary if you’re going completely off-grid. You can still get usable power out of solar panels without them.

What about things like this?

http://www.amazon.com/Powermat-Portable-Black-Doors-Separately/dp/B002JCSAWM

Will proximity always be an issue? Could there be other uses for the technology utilized in the product I linked?

Yes, it will.

There is a lot of stuff being developed with wireless power but the distance is measured in inches.

http://www.sciencedaily.com/releases/2007/06/070607171130.htm

http://en.wikipedia.org/wiki/Wireless_energy_transfer

http://en.wikipedia.org/wiki/Resonant_energy_transfer

Most of the near term applications are for electric vehicles for recharging or providing power from a buried antenna. There is no risk of electrocution, since the coupling is magnetic rather than electrical.

http://www.bombardier.com/en/transportation/sustainability/technology/primove-catenary-free-operation

http://www.popsci.com/cars/article/2010-03/koreas-online-electric-vehicle-gathers-power-road-wirelessly

http://www.greenoptimistic.com/2010/03/10/olev-wireless-electric-vehicle/