You set up an account with a website and download a program; then, when your computer is idle, the program uses the spare computing power to crunch numbers for various research problems.
My office permits us to do this and even links to a specific website; right now my laptop is alternating among projects on clean energy, AIDS, cancer, and clean water. I think in the past it’s worked on the Human Genome Project.
Does this sort of thing really help those efforts? And if so, how?
I don’t have exact figures for any of those causes, but yes, donated CPU cycles can make a difference. It’s nothing more than a simple division of labor: if you’re trying to do a certain number of computations, and their results don’t depend on each other, then dividing them between two processors makes the overall task go twice as fast as doing it all on one.
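A toy illustration of that in Python (the work function and inputs here are made up, just to show the shape of the idea):

```python
from multiprocessing import Pool

def crunch(n):
    # Stand-in for one independent unit of work: nothing here
    # depends on any other unit's result.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work_units = [2_000_000] * 8
    # Two workers finish in roughly half the time one would,
    # because the units never have to wait on each other.
    with Pool(processes=2) as pool:
        results = pool.map(crunch, work_units)
    print(sum(results))
```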
Yes, it’s huge. The Folding@Home Project is reported to be the fastest computer in the world, operating at about 12.5 petaflops over sustained periods of time. The fastest traditional supercomputer runs at approximately one-fifth of that speed.
This works, as ultrafilter says, effectively by volume. One computer donating its spare cycles isn’t a huge deal, but half a million working in concert is tremendous.
GIMPS (the Great Internet Mersenne Prime Search) has been running for about 15 years. When it started, the largest known prime number had about 250 thousand digits. Now, thanks to GIMPS, the largest known prime number has almost 13 million digits. Every record in this category set in the last 15 years (or very nearly every one) belongs to GIMPS.
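For the curious: GIMPS tests Mersenne candidates (numbers of the form 2^p - 1) with the Lucas-Lehmer test, which is simple enough to sketch in a few lines of Python. The real project uses vastly faster arithmetic, but the logic is the same:

```python
def lucas_lehmer(p):
    """Return True if the Mersenne number 2**p - 1 is prime (p an odd prime)."""
    m = (1 << p) - 1           # the candidate, 2^p - 1
    s = 4
    for _ in range(p - 2):     # iterate s -> s^2 - 2 (mod m), p-2 times
        s = (s * s - 2) % m
    return s == 0

# Known small Mersenne prime exponents: 3, 5, 7, 13, 17, 19, 31, ...
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31) if lucas_lehmer(p)])
```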
Be aware of the power costs associated with this.
If I had a large number of machines in my business, I’d be annoyed if their CPUs were drawing power full-tilt in off hours rather than dropping to idle or being turned off.
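Some back-of-the-envelope arithmetic, with assumed (not measured) numbers for the extra draw and the electricity rate:

```python
# Rough cost of one machine crunching instead of idling overnight.
extra_watts = 60          # assumed extra draw at full CPU load vs. idle
hours_per_night = 16      # assumed off-hours window
rate_per_kwh = 0.12       # assumed electricity price, $/kWh

nightly_kwh = extra_watts * hours_per_night / 1000
yearly_cost = nightly_kwh * rate_per_kwh * 365
print(f"~${yearly_cost:.2f} per machine per year")   # about $42 at these numbers
```

Multiply by a fleet of office machines and it’s real money, which is the point being made here.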
A real concern… though in my case, people are supposed to lock up their computers at night (physically, as in not left out on a desk) or take them home; if we’re working from home or from the client site, it’s Not Their Problem, cost-wise.
I guess what I’m wondering is: the cost and time to develop the infrastructure to split the calculations into manageable bits, download them, upload the results, and develop the applets for each individual purpose seem like more trouble than they’re worth for a lot of problems. I can see something like prime numbers: the algorithm is pretty simple, and the download would be “John, check from 1,000,000 to 2,000,000; Jane, check from 2,000,000 to 3,000,000” and so on.
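For a prime search, the bookkeeping really can be about that simple. A toy sketch (all names hypothetical, with trial division standing in for whatever test a real project would use):

```python
def make_work_units(start, stop, chunk):
    """Split [start, stop) into independent ranges to hand out to volunteers."""
    return [(lo, min(lo + chunk, stop)) for lo in range(start, stop, chunk)]

def is_prime(n):
    # Naive trial division; a real project would use a much faster test.
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def process_unit(unit):
    """What each volunteer's machine does with its assigned range."""
    lo, hi = unit
    return [n for n in range(lo, hi) if is_prime(n)]

# "John, check 1,000,000 to 2,000,000; Jane, check 2,000,000 to 3,000,000..."
units = make_work_units(1_000_000, 3_000_000, 1_000_000)
# The server's whole job is handing out units and merging what comes back.
```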
The original reason the code in question was developed was that these problems HAD to be distributed among different systems, since no single system had the compute power needed to get the job done in a timely fashion.
Now that the code exists, the actual job of organizing the worker units is relatively low-demand.
It’s not that big of a deal. Lots of things in computing are 100 times harder than that.
Check out Hadoop, which is the big open source framework to automatically parallelize number crunching. You may have to do a bit of thinking to frame your problem in their lingo, but once you do, almost everything else is free.
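Their lingo is map/reduce: you express the job as a map step that emits key/value pairs and a reduce step that folds each key’s values together. A toy local version of that shape (Hadoop’s contribution is running the same thing across a cluster):

```python
from collections import defaultdict

def mapper(record):
    # Map step: emit (key, value) pairs from one input record.
    for word in record.split():
        yield word, 1

def reducer(key, values):
    # Reduce step: fold all values seen for one key.
    return key, sum(values)

records = ["the quick brown fox", "the lazy dog", "the fox"]

grouped = defaultdict(list)
for record in records:
    for key, value in mapper(record):
        grouped[key].append(value)      # the shuffle/sort phase, in miniature

print(dict(reducer(k, v) for k, v in grouped.items()))
```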
Some problems are very well-suited to being broken up in this way, and those are generally the ones that get @home projects. Some problems aren’t well-suited for it, and you don’t see those.
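A concrete contrast, sketched in Python: Monte Carlo sampling parcels out perfectly, while anything where step k needs step k-1’s answer does not:

```python
import random

# Well-suited: estimating pi by random sampling. Every sample is
# independent, so a million volunteers can each take samples and
# just mail back a hit count.
def pi_samples(n, seed):
    rng = random.Random(seed)
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1 for _ in range(n))
    return hits   # server estimates pi as 4 * total_hits / total_samples

# Badly suited: an iteration where each step needs the previous result.
# There is no way to hand the middle of this loop to a second machine.
x = 0.5
for _ in range(1_000_000):
    x = 3.9 * x * (1 - x)   # logistic map: inherently sequential
```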
And also the hardware costs. Running a CPU at full bore all the time isn’t just going to use more electricity; it’s going to subject all the components to higher heat and electrical loads, which will make them fail faster than they otherwise would. Components with moving parts, like fans and hard drives, are especially likely to fail from the increased usage.
I’ve never been convinced that running the @Home distributed processing stuff is actually a good use of resources as a society. It’s just hard to observe because the costs are so distributed.
How many people have computers that die before they get around to upgrading them? My 15-year-old computer still runs. And I’m probably one of the slowest adopters, since my current computer is 5 years old and uses a 10-year-old processor (it was a bargain PC).