How does SETI@Home work?

No, they’re just for bragging rights and to create a sense of accomplishment. They also helped to create a competitive aspect for users and teams.

Now the programs can also run on the GPU. I’ve run the Folding program before. They were going to make a game to see if gamers could find solutions: foldit Beta

I work in the Rosetta@home lab, and we are very thankful for the computer time and money that our volunteers donate. Most users get between 6 and 10 “BOINC” credits per hour of computation with our software.

A quick search indicated that a work unit is about 350K in size. That’s about equal to a medium-to-large sized JPG. If each of those 5000 PCs completes one unit a day, that’s about 1.75 gigs of data transferred daily. That’s roughly two and a half CD-Rs, or maybe 350-400ish MP3s worth. Any campus large enough to have 5000 available workstations all connected to the internet is not likely to ever care about that traffic. Heck, a modern home internet connection might not even notice it.
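For what it’s worth, the arithmetic sanity-checks in a few lines. The 350K-per-workunit figure is just the estimate quoted above, not an official number, and this uses decimal (1000-based) megs and gigs:

```python
# Back-of-the-envelope bandwidth check, assuming ~350 KB per
# SETI@home work unit (the figure quoted above) and one unit
# per machine per day across 5000 machines.
WORKUNIT_KB = 350
MACHINES = 5000

daily_kb = WORKUNIT_KB * MACHINES   # 1,750,000 KB
daily_mb = daily_kb / 1000          # 1750 MB (decimal units)
daily_gb = daily_mb / 1000          # 1.75 GB

print(f"{daily_mb:.0f} MB/day (~{daily_gb:.2f} GB)")
```

About two and a half 700 MB CD-Rs a day, spread over an entire campus network.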

>The program is no more harmful than a screensaver program. How many computer cycles has this school district wasted over the years drawing Moire diagrams, bouncing a ball, or drawing pipes on people’s idle screens?

A lot less than crunching numbers 100% of the time. A well-run network doesn’t have this shit anyway; it has things like standby mode, monitor sleep, etc. He purposely didn’t use those things for his pet project and wasted $1 million in tax dollars. Honestly, he got off too easy. He should be forced to pay half his wages back to the state for theft.

>SETI@home is a respected program hosted by the University of California at Berkeley that analyzes radio signals from the Arecibo radio telescope.

Perhaps he should have asked for permission instead of wasting power. I don’t care if he’s curing cancer; he doesn’t have the right to make that decision with public property.

Well, that is really the basis of the problem. He installed software without asking permission. It wasn’t under his authority to decide, and he got caught.

Good intentions don’t count. Most jobs have a policy of not allowing outside programs to be installed on their systems without approval. He violated policy.

Add in the fact that it was a publicly funded school situation, where any taxpayer can raise a fit about wasted money, and he really, really needed to get prior approval.

I ran seti@home for almost 10 years before deciding that the approach they were taking was not likely to lead to a result. So I left it. But that would be another thread…

Well, the article linked in the OP refers to this guy as the technology supervisor. He might have been the guy who approves software. With 5000 computers under him he had to have help I’d think. That’s what’s fishy to me. This can’t have been a big secret.

Even if he was the head IT guy, which the article does not say, he still probably doesn’t set the district-wide policy on software usage.

And for the sake of discussion, let’s say he was the top guy, set the policy, and violated no part of it.

As I mentioned above, in a publicly funded area, particularly schools, all it takes is a concerned parent or other taxpayer to raise a fit about squandering of public money. Sacrifices must be made and somebody goes. Happens all the time.

Ok, sorry, I missed a step…

FYI, our “typical” workstations have either 300 or 350 watt power supplies, and their OSes are imaged identically, including screensaver/power-saving settings, so your assumption that they are left idling is false.

Thanks for the clarification; you’re right, it’s not much.
You’re wrong about the board not caring about network traffic, though; they monitor it quite closely for security, internet content, and bandwidth.
They may not notice it during the school day, but it would definitely show up during the off-peak periods when SETI@home would be running.

Ok, but even if a 200 watt power supply is running only when schools are unoccupied from say 4pm to 8am Mon. to Fri. that’s 16 kW plus all weekend for 12.8 kW that’s 28.8 kW weekly and 1497.6 kW annually.

The kicker is … he had 5000 computers running… so now it’s 7,488,000 kW/year and he’s been doing this for years?!
I see how they might see it as a drain on their utility bill. :smiley:

In the interest of fighting ignorance, it’s important to note that along with running SETI and not installing firewalls AND stealing equipment, he was also downloading porn and was formally warned about his poor performance before.

http://www.azcentral.com/news/articles/2009/11/30/20091130searchforaliens1202.html

So he was probably going to get fired anyway. The alien search just makes for a better news story.

What the power supply is rated for is different than what the computer is actually drawing. For instance, here’s a table of actual total system power draw for a gaming computer with a high-powered videocard:
Power Consumption - AMD's Athlon II X3 435 & New Energy Efficient CPUs: Killing Intel Below $90. Note that even the biggest CPU system has less than 220 watts of draw under full load.
Granted, you’re going to have to add some more power for the monitor, but still, I doubt your workstations are pushing 300W even if you throw in a CRT monitor.

I was just presuming that the guy’s been doing this so long that some fraction of the hardware he’s put SETI on might have predated powersaver settings.

I’m not sure I understand the above math you’re using. You might want to ensure that you’re adding units of power (kW) and units of energy (kW-hr) appropriately.
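Redoing the figures above with the units kept straight: watts times hours gives watt-hours, an amount of energy, not power. The 200 W draw and the 4pm-to-8am unoccupied schedule are the earlier poster’s assumptions; the hours below just reproduce their totals with the correct units attached:

```python
# Energy, not power: watts x hours = watt-hours.
# Assumed per the thread: 200 W per machine, schools unoccupied
# 4pm-8am on weeknights plus 4pm Friday to 8am Monday.
WATTS = 200
WEEKNIGHT_HOURS = 16 * 5   # five 16-hour overnight stretches = 80 h
WEEKEND_HOURS = 64         # 4pm Fri to 8am Mon = 64 h

weekly_kwh = WATTS * (WEEKNIGHT_HOURS + WEEKEND_HOURS) / 1000
annual_kwh = weekly_kwh * 52
fleet_kwh = annual_kwh * 5000   # all 5000 machines

print(f"{weekly_kwh:.1f} kWh/week, {annual_kwh:.1f} kWh/yr, "
      f"{fleet_kwh:,.0f} kWh/yr fleet-wide")
```

So the poster’s numbers (28.8 weekly, 1497.6 annually, ~7.5 million fleet-wide) were arithmetically right; they’re just kilowatt-hours, not kilowatts.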

>I was just presuming that the guy’s been doing this so long that some fraction of the hardware he’s put SETI on might have predated powersaver settings.

NT has had support for the HLT instruction since the 1990s: when nothing is runnable, the OS idle loop halts the CPU until the next interrupt, so the chip draws very little power. No need for fancy power-saving features. The difference between that and running the CPU at 100% 24/7 is a lot of energy.
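The size of that gap can be ballparked. The wattage figures below are rough illustrative assumptions (a halted desktop CPU of that era might draw on the order of 15 W, versus 100+ W pegged at full load), not measurements:

```python
# Rough idle-vs-pegged comparison for one machine left on 24/7.
# Assumed draws (illustrative only): ~15 W for a CPU idling in
# HLT, ~120 W for the same CPU crunching at 100%.
IDLE_W, LOADED_W = 15, 120
HOURS_PER_YEAR = 24 * 365

idle_kwh = IDLE_W * HOURS_PER_YEAR / 1000      # ~131 kWh/yr
loaded_kwh = LOADED_W * HOURS_PER_YEAR / 1000  # ~1051 kWh/yr
extra_kwh = loaded_kwh - idle_kwh

print(f"extra energy from 100% load: {extra_kwh:.0f} kWh/yr per machine")
```

Multiply a per-machine figure like that by 5000 machines and several years, and the electricity cost stops being a rounding error.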

Still, the superintendent said they wouldn’t have had a problem with cancer research, so a million dollars in CPU usage is a red herring. It sounds like the guy was stealing equipment and downloading porn and generally just abusing his IT position, and needed to be gone.

Yeah, probably, but the fact remains that running 5000 computers at 100% CPU capacity around the clock and pulling down a couple of gigs of work units daily is a huge misuse of board equipment and resources and could easily have cost millions in utilities, maintenance, and replacement parts.

That being said… the other issues, computer parts found at his home and the inappropriate use of the internet, are both much easier ways to get yourself fired.

The news article I read said nothing about utility costs, merely that it would cost “$1 million” to remove it from the systems.

I know I don’t. I leave my desktop unit on 24/7 with BOINC running. I have a SETI project and a World Community Grid project running at any given time. When I get up from my desk, I physically power off my monitor rather than leaving it on with a screen saver. That, I think, more than makes up for any power consumption caused by BOINC.

Not even close. A Core 2 Duo based computer at 100% on both cores uses around 150 watts of power.

An LCD monitor in standby that is Energy Star rated? 2 watts or less.

Citation needed. I run BOINC on my desktop at home and my 2.2 GHz CPU still throttles down to 1.0 GHz when I’m not doing anything, even though a Rosetta and a WCG workunit are running in the background.

Here you go…