If the PCs were set to go into standby or sleep mode after a certain period of inactivity, they could potentially use only a few watts overnight, instead of running full-blast with SETI@home.
?!? I assume that’s just the reporter misunderstanding something (it happens. Part of the whole ‘being on a deadline’ thing.)
Because, while I’m no IT professional, I assume that if it takes a million bucks’ worth of time to remove this from these computers, then this guy should have been fired just for being an incredibly incompetent network manager.
Although it can eat up some processor time, it’s basically just a screensaver. Obviously either the guy was unfairly demonized, or the article was poorly written. Turns out it’s the latter, since the real reason he was fired was more complicated, and SETI@home is a legitimate research tool and not some kooky UFO thing.
While penny pinching is often wise, when calculating the supposed costs you really have to give them context. If you multiply any particular thing by 5000 (or worse, by 5000 computers times however many years), it’s going to seem like a huge figure, but when you compare it to the budget (or CPU, or network traffic, or maintenance hours, etc.) required to maintain a system of 5000 computers (over that many years!), it’s not going to seem quite as outrageous. It’s not like they were expecting to spend $10,000 and it turned into $1 million. And it’s not like uninstalling it means hiring a super crew of expensive uninstallers; the regular network administrator does that sort of thing as part of their job.
The source of the $1 million figure was a press conference given by Superintendent Denise Birdwell. It has not been substantiated. Presumably the utility cost of the program involves comparing the electricity the computers would otherwise have been using with the electricity actually used under a given set of BOINC settings. The latest incarnation of BOINC is highly configurable.
According to David Anderson, the director of Seti@Home, “If you configure the software to compute in the background while you’re using the computer, and configure your computer to go into a low-power mode when you’re not using it, the cost is something like $1/month. Many people (currently around 500,000) believe that this is a good way to support research in areas like drug discovery, epidemiology, climate change research, helping design the LHC accelerator at CERN, and, yes, searching for signs of extraterrestrial life.”
http://www.pcworld.com/article/183744/uproar_over_et_hunter_misplaced_setihome_founder_says.html
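Just to sanity-check that $1/month figure with a back-of-the-envelope sketch (every number below is my own illustrative assumption, not anything from the article or from Anderson):

```python
# Back-of-the-envelope estimate of the per-machine electricity cost of
# background BOINC crunching. Every input here is an assumption for
# illustration only, not a figure from the article.

extra_watts = 25        # assumed extra draw while crunching in the background
hours_per_day = 8       # assumed hours the machine is awake and crunching
rate_per_kwh = 0.12     # assumed electricity price in $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month per machine")
# Output: 6.0 kWh/month -> $0.72/month per machine, roughly the quoted $1/month
```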
$1/month × 12 months × 5000 computers = $60,000 per year. Over 10 years that would total $600,000. I guess these things can add up.
As a comparison, a computer can cost anywhere from $5.50 to $405 per year to run (or about 45 cents to $34 per month). Then again, the lowball example assumes only 10 hours of use per week, which is highly unrealistic in this context.
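Scaling those per-machine figures up to the district’s 5000 machines over 10 years shows how quickly even a modest per-unit cost turns into a headline-sized number (this is just arithmetic on the figures above):

```python
# Scale per-machine annual running costs up to 5000 machines over 10 years.
# The per-machine figures are the ones mentioned above (plus the $1/month
# figure expressed as $12/year); fleet size and time span match the
# district's 5000 computers and a 10-year window.

fleet_size = 5000
years = 10

for label, annual_cost in [("low estimate", 5.50),
                           ("$1/month figure", 12.00),
                           ("high estimate", 405.00)]:
    total = annual_cost * fleet_size * years
    print(f"{label}: ${annual_cost:.2f}/yr per machine -> ${total:,.0f} total")
# low estimate:     $275,000
# $1/month figure:  $600,000
# high estimate: $20,250,000
```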
SETI@home is in hibernation.
We are no longer distributing tasks. The SETI@home message boards will continue to operate, and we’ll continue working on the back-end data analysis. Maybe we’ll even find ET!
Thanks to everyone for your support over the years. We encourage you to keep crunching for science.
SETI@home is a scientific experiment, based at UC Berkeley, that uses Internet-connected computers in the Search for Extraterrestrial Intelligence (SETI). You can participate by running a free program that downloads and analyzes radio telescope data.
https://setiathome.berkeley.edu/
https://news.ycombinator.com/item?id=35204860
SETI@Home was something that one heard about now and then, but I haven’t heard anything about it for a long time.
I would recommend trying Folding@Home. It is the same idea as SETI@Home in that it crowdsources many individual PCs to effectively become a supercomputer, but this one is for folding proteins, which has many applications in medicine.
And it is still operational.
I ran Folding@Home for years, until the CPU it was running on died. I suspected electromigration, and stopped running F@H because I didn’t want to risk wearing out my replacement.
If you ran it for years, it may just be that your PC was old and went kaput. Happens.
No, it was very specifically the CPU, which I pulled from the motherboard before putting in a new CPU. Don’t tell me how to suck eggs.
So it may just be that your CPU was old and went kaput. The fact that it was specifically the CPU, out of all of the many components, that died means very little.
Things don’t stop working because they “get old”. There has to be an underlying physical reason for a CPU to go kaput. It didn’t have hot/cold cycling. It didn’t have power spikes. What it had was years of running at full capacity 24/7, which can lead to more wear via electromigration than if it had been allowed to idle when I wasn’t using it. I swear people here will try to argue over any damn thing.
I ran Folding@Home for months until the laptop it was running on let the magic smoke out.
Early in the pandemic Folding@Home added Sars-Cov-2 folding experiments, so I, like many, many others, joined in. The GPU ran for hundreds of hours without a problem. Fortunately it died when I started Minecraft, rather than in the middle of the night running Folding@Home.
I don’t think there was anything special about Folding@Home that caused the failure, but just the hundreds of hours of running the GPU. Prior to Folding@Home it occasionally ran nvenc_hevc for video encoding, but wasn’t used much. The laptop still worked as long as I deactivated the GPU as early as possible in the boot process, and only used the integrated graphics. Still never let the computer be powered on unattended.
I believe the part that burned up is a MOSFET involved in feeding power to the GPU.
From your link:
The bottom line is that while electromigration is a real thing that can break a CPU, it’s not something you have to worry about unless you’re running a CPU above its rated limit, 24/7, without ever taking your foot off the gas.
Were you overclocking when SETI@home was running?
I don’t remember. But it also wasn’t a “modern” CPU with advanced thermal controls; it was an Athlon X2. But I’m also tired of this bullshit arguing with me over my merely saying that I suspected electromigration, and I won’t respond to any more of it.
Moderator Note
You have been around here more than long enough to know that we don’t just take things for granted here. There is nothing wrong with questioning something, especially in FQ where folks want to get to the true facts of the matter.
If you don’t wish to engage in any speculation that doesn’t match your opinions that’s fine, but your hostility is very much out of place for this forum. Dial back the hostility, please.
Sorry, I may have overreacted. I saw the first instance as “computersplaining”. What I didn’t say in my original post was that after the computer quietly switched off, attempts at rebooting generated a specific pattern of error beeps from the BIOS that indicated a CPU failure, and that (not having internet access to order a new CPU, since that was my only computer and this was years before smartphones, nor the budget to buy a whole new motherboard, CPU, and memory) I had to go to local computer repair shops looking for a replacement for an older-model CPU and ended up replacing a dual-core Athlon with a single-core Duron. I didn’t say all that because I wasn’t expecting someone to ask me “are you sure your computer wasn’t just broken?”
By this reckoning everything should work today because it worked yesterday.
“Get old” isn’t a reason for failure; “get old” is a vague description. Things fail because component x fails in manner y because of physical process z.
Yes, but that still doesn’t establish that the distributed-computing process specifically was the root cause.
And in fact, electronics does ‘get old’ mainly due to electrolytic capacitors’ electrolyte changing properties or just plain drying out.
But on a solid-state chip that, per se, is not a factor, which is why I read *DG’s article clear through to its concluding sentence, the one I quoted. Thus my question, which was genuine and not a ‘gotcha.’ When that blew up in my face I stopped, as apparently no further information or discussion would be forthcoming.