Can I rent time on my computer to the Cloud?

A recent thread discussed the cloud, and one definition that came up involves offloading complex tasks to an array of computers out there in the cloud, where presumably you pay for what you use rather than paying upfront for hardware that’s going to sit in a server room, get used or not get used, and gather dust and malware.

Can I rent out time on my computer to The Cloud in any meaningful and practical way and get regular rent checks? I have an older computer that is gathering dust but can still do floating-point division and matrix transforms like it used to. If this is possible, can you do it casually, with your computer sitting at home and connected over the Internet, or would you have to physically take it to a Cloud facility where staff slap a label on it saying “Property of robert_columbia. Lease expires October 31, 2014,” hook it up to, well, their Cloud as one of many, and you pick it up next year when the lease is up?

My guess is that the answer is mostly no: the cost of the physical computers that make up a Cloud pales in comparison to the cost of administering, patching, supporting, and monitoring them, and “one million computational cycles” is no longer the tradeable commodity it was back in the 1960s or ’70s.

I’m not talking about the old “Own a computer? Put it to work!” signs that people would put up around town. Those implied you could earn passive income by hiring out your computer as a “worker” (shades of renting out actual human slaves in days of old and collecting all their pay because you owned them), but in reality they were fronts for pyramid schemes or multi-level marketing plans.

There are a ton of issues that would need to be solved first, one of the most important being: what company is going to send its private data to a hundred computers in people’s living rooms to be processed? Sure, the data can be encrypted coming and going, but it can’t be encrypted while your computer is actually operating on it in memory, and there’s no way to stop you from attaching a debugger and reading out all those juicy credit card numbers and other bits of personal information.
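
To make that concrete, here’s a minimal Python sketch (it assumes the cryptography package, and the process() step is just a stand-in for whatever job the machine was rented to run). No matter how well the payload is protected in transit, it has to sit decrypted in the owner’s RAM while the CPU works on it:

```python
# Sketch: encryption protects the data in transit, not while it's being used.
# Requires the 'cryptography' package; process() stands in for the rented job.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in reality shared with the job submitter
cipher = Fernet(key)

payload = cipher.encrypt(b"4111-1111-1111-1111")   # arrives encrypted

def process(data: bytes) -> bytes:
    return data[::-1]                # stand-in for the real computation

plaintext = cipher.decrypt(payload)  # now sitting decrypted in the owner's RAM,
result = process(plaintext)          # visible to any attached debugger
response = cipher.encrypt(result)    # leaves the machine encrypted again
```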

There are other trust issues too. What if you overclocked your computer (to make more money) but as a result it produces a bit error every 10,000 calculations or so? How do you verify that the results coming back are correct? Well, you could hire a second computer to check them, but… now the cost savings just evaporated. I’m reasonably certain that Rackspace and Amazon run their computers in spec, with proper cooling, proper cleaning, and outage-resistant power supplies. The same can’t be said for the computer in Joe’s basement.
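
For what it’s worth, redundant computation is how the volunteer projects cope with flaky hardware: send the same work unit to several machines and only accept an answer when enough of them agree. A rough sketch (run_on_worker() is a placeholder for however tasks actually get dispatched), which also shows exactly where the cost savings go, since every task gets paid for two or three times:

```python
# Sketch: verifying results from untrusted machines by redundant computation.
from collections import Counter

def run_on_worker(worker, task):
    """Placeholder: ship the task to a rented machine and return its answer."""
    raise NotImplementedError

def verified_result(task, workers, quorum=2):
    """Run the same task on several machines and accept the majority answer."""
    answers = Counter(run_on_worker(w, task) for w in workers)
    result, votes = answers.most_common(1)[0]
    if votes < quorum:
        raise RuntimeError("workers disagree; rerun the task elsewhere")
    return result
```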

I’m sure there are a hundred other problems too. In your specific case, the small amount of capacity your old computer would add to the network probably isn’t worth the cost of the electricity used to run it, so you’d be better off just turning the thing off.
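
To put rough numbers on that (every figure here is an assumption for illustration, not a quote from any provider):

```python
# Back-of-envelope: electricity cost vs. plausible rental income.
watts = 150                  # an old desktop under load
price_per_kwh = 0.12         # typical US residential rate
hours = 24 * 30              # one month, running around the clock

cost = watts / 1000 * hours * price_per_kwh
income = 0.01 * hours        # optimistically, a penny an hour at full utilization

print(f"electricity: ${cost:.2f}/month vs. income: ${income:.2f}/month")
# electricity: $12.96/month vs. income: $7.20/month, before any broker's cut
```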

I did a quick search and found this, which looks interesting. Let us know what you find out. http://www.cpusage.com/

edit: here’s an article on them

A bitcoin mining pool is a form of this, though probably not what you had in mind. You contribute work to the “cloud” by helping to verify transactions, and in exchange earn a share of the profits.
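
Roughly how pool mining works: the pool hands you block data, you grind through nonces hashing it, and any hash that falls under an easy, pool-set “share” target counts as proof of work done, which determines your cut of any reward. A toy version (real mining hashes actual block headers, not this stand-in string):

```python
# Toy sketch of pool mining: find nonces whose double-SHA256 hash falls
# below a pool-assigned share target.
import hashlib

def double_sha256(data: bytes) -> int:
    digest = hashlib.sha256(hashlib.sha256(data).digest()).digest()
    return int.from_bytes(digest, "big")

share_target = 2 ** 240                # easy pool target; the real network target is far lower
block_data = b"example block header"   # stand-in for a real block header

shares = []
for nonce in range(200_000):
    h = double_sha256(block_data + nonce.to_bytes(8, "big"))
    if h < share_target:
        shares.append(nonce)           # each share is proof of work done for the pool

print(f"found {len(shares)} shares; pool payout is proportional to this count")
```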

You can volunteer your computer time to various cloud-based problem solving tasks. For example, analyzing SETI data. It basically involves running a screen saver that does computations.
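
Under the hood those screen savers are just a fetch/compute/report loop against the project’s servers. A hypothetical skeleton (the URL and task format here are invented; real projects like BOINC ship their own client and protocol):

```python
# Hypothetical skeleton of a volunteer-computing client; the endpoint and
# task format are invented. Real projects (BOINC, SETI@home) ship their own.
import json
import time
import urllib.request

SERVER = "https://example.org/api"       # placeholder, not a real project

def analyze(task):
    return sum(task["samples"])          # stand-in for the real number crunching

while True:
    with urllib.request.urlopen(f"{SERVER}/next-task") as resp:
        task = json.load(resp)
    report = json.dumps({"id": task["id"], "answer": analyze(task)}).encode()
    req = urllib.request.Request(
        f"{SERVER}/report",
        data=report,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
    time.sleep(1)                        # be polite between work units
```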

You are going to have a hard time getting paid for it. Companies rent Amazon cloud servers, which are designed for this.

My guess is that your guess is right, save for volunteer projects where you are paid in good, cheap “satisfaction with having contributed to research”.

Computational power is cheap, and Amazon Web Services is already set up to provide any VM you want on its servers. The compute power your older computer could add couldn’t be sold for more than the cost of the labour required to administer it.

For-profit cloud computing is not as egalitarian as seti@home or the like.

I’m not sure what your basis for that is. If you’re going to include the cost of hardware, then yes, that’s certainly true, but I have several machines that produce 2-3 gigaflop benchmarks just on their hyperthreaded CPUs. If you add in a few high-end graphics cards with over a thousand pipelines each, which I also happen to have, those numbers go up by quite a bit.
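
If anyone wants to sanity-check numbers like that on their own machine, a dense matrix multiply is the usual quick-and-dirty flops benchmark. This measures whatever BLAS your NumPy is linked against, so treat the result as a ballpark:

```python
# Quick-and-dirty flops estimate via a dense matrix multiply.
import time
import numpy as np

n = 2048
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.time()
np.dot(a, b)
elapsed = time.time() - start

flops = 2 * n ** 3               # multiply-adds in an n-by-n matmul
print(f"~{flops / elapsed / 1e9:.1f} GFLOPS")
```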

Granted, I’m not your average user, since half of my rigs are dual-processor Xeon servers, each with between 8 and 16 hyperthreaded cores, but I spend virtually zero time on “administration.” I have a program that runs in the background, monitors all of the machines, and lets me know if anything is amiss. Presumably any decent distributed service would provide a similar utility.

I’ve just launched my new startup business doing just this.

I take a slightly different approach from CPUsage, though: you download an app from my site that creates a Linux virtual machine on your local box on the fly, and then Slicify gives customers direct access to that VM. You also decide what rate to charge (renting a VM from Amazon EC2 can cost anything from $0.05 to $2 per hour depending on the spec of the machine, so that gives you some idea). I went with the VM approach because I think it’ll be much easier for users than trying to force them to use a custom API.
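
If you want to benchmark your rate against Amazon’s, you can pull their current spot prices; here’s a sketch using boto3 (the instance types are just examples, and you’d need AWS credentials configured):

```python
# Sketch: check what Amazon charges for spare capacity before setting a rate.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
history = ec2.describe_spot_price_history(
    InstanceTypes=["m3.medium", "c3.large"],   # example types, not a recommendation
    ProductDescriptions=["Linux/UNIX"],
    MaxResults=10,
)
for entry in history["SpotPriceHistory"]:
    print(entry["InstanceType"], entry["SpotPrice"], entry["Timestamp"])
```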

You raise some good points about security. On the other hand, this is already an issue for any one of the millions of users of Amazon EC2/AWS. If Snowden could spill the beans on the whole NSA, imagine what one disgruntled sysadmin at Amazon could do with all the data from all the customers who host there.

If anything, I’d argue that splitting up your data and processing it across 1000 different individuals’ computers is more secure than just dumping the whole thing with Amazon or Google and hoping for the best.

Reported.