Would leading AI companies see any benefit from using the excess processing power of individuals’ computers and gaming systems?

This does sound like it would be a good plot for The Blacklist though. Some sneaky criminal using distributed computing power to achieve some nefarious goal.

Say, can we make sure that some sneaky criminal isn’t using distributed computing power to achieve some nefarious goal?

It’s already happening. North Korean cyber warfare agents hijack computers overseas and use them to mine cryptocurrency. They’ve made billions doing this.

https://www.wsj.com/articles/in-north-korea-hackers-mine-cryptocurrency-abroad-1515420004?gaa_at=eafs&gaa_n=ASWzDAiF0-wUyBbzTogxRQHoKLzHCO3ioHiw0uIxheMBCBsf3c09Vm8HaYMq5L10SuE%3D&gaa_ts=683d090d&gaa_sig=WOu3H0bqSih84KKpAMd9V7FV05YcMnVcDaMfRohU8EJe2lEAam4GRCvA-rmMPE9DkMK-tVARUio4LDyghFzPTA%3D%3D

It was a plot on Silicon Valley. They used hacked smart refrigerators to run their distributed software because they couldn’t afford (or were banned from, I can’t remember) legitimate cloud services. And as @Wesley_Clark says, it is a real thing that happens with cybercriminals using hacked systems to run anything from coin miners to spam relays.

For all of the reasons stated upthread, I don’t think any of the current AIs can be trained on a distributed network of computers. However, I don’t think that means it is impossible. DeepSeek was “impossible” until it was revealed. It is (almost?) as good as the best US AIs and was much cheaper to train.

Distributed training will require lots of changes to how traditional AIs are created, but it only requires one group to figure it out.
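To make the obstacle concrete, here’s a toy sketch of synchronous data-parallel training, where each “volunteer” node computes a gradient on its local data shard and a coordinator averages them. This is a simplified illustration, not any lab’s actual method, and the linear model and shard setup are invented for the example. The pain point it demonstrates: every single step requires shipping a full gradient from every node, which for a frontier-scale model would be gigabytes per node per step over home internet connections.

```python
import numpy as np

def local_gradient(weights, x, y):
    """Gradient of mean squared error for a linear model on one node's shard."""
    pred = x @ weights
    return 2 * x.T @ (pred - y) / len(y)

def distributed_step(weights, shards, lr=0.01):
    """One synchronous step: collect and average gradients from all nodes.

    In a real system, this averaging is the network-heavy "all-reduce" --
    the part that's cheap inside a data center and brutal over the internet.
    """
    grads = [local_gradient(weights, x, y) for x, y in shards]
    avg_grad = np.mean(grads, axis=0)
    return weights - lr * avg_grad

# Fabricated setup: three "home computers", each holding a private shard
# of data generated from a known linear relationship.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
shards = []
for _ in range(3):
    x = rng.normal(size=(50, 2))
    shards.append((x, x @ true_w))

w = np.zeros(2)
for _ in range(500):
    w = distributed_step(w, shards)
# After enough synchronized steps, w converges toward true_w.
```

The research directions people talk about for volunteer-style training (gradient compression, asynchronous updates, only syncing occasionally) are all essentially attacks on that averaging step.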

Been happening for years.

So apparently this is old news, but I just learned that ChatGPT recently hit 800 million active users. I didn’t know they were growing so fast. There are now more ChatGPT users than there are people in all of North America or Europe (not combined).

The company also just raised another $40 billion, despite their confusing semi-non-profit status. It’s hard to imagine any sort of crowdsourced effort catching up to that, even if some of the latency and performance issues could be partially mitigated. For every volunteer BOINC has, OpenAI instead has a million dollars… from just their most recent round. Their main rivals are similarly spending tens of billions this year. That kind of capital can scale much quicker than home users’ idle compute.

OpenAI is also in the process of building a $500 billion data center in Texas, which will be lined with new supercomputers if all goes according to plan.

If any gains are going to come from the smaller players or the open-source community, they’ll probably have to come not from competing on raw compute but from more efficient training, better productization, truly revolutionary breakthroughs, etc.

I guess there could always be a secondary or tertiary market for spare compute, just as there’s always been, but it can’t match the sheer scale of investment we’re seeing at the bleeding edge.

This stuff just tends to scale faster than decentralized networks can easily keep up with, and having to deal with arbitrage on an open market just adds further overhead. Some of the cloud providers already offer similar schemes in their existing, non-AI cloud services, where you can pay a cheaper rate for non-guaranteed compute (that can be preempted by higher-paying users). I think some of the AI compute marketplaces offer similar things too. But collectively, these are the leftover scraps compared to the big boys.

But not all the big companies are sure this investment is sound. Microsoft has already pulled out of many of its data center deals, and is retreating further, especially after DeepSeek (Microsoft pulls back from more data center leases in US and Europe, analysts say | Reuters). DeepSeek’s latest update further closes the gap… it’s entirely possible to run a slimmed-down version of it for free on your own computer, BTW, though it has built-in Chinese censorship and pro-CCP propaganda in its training. Anyhow, I suppose sooner or later someone will figure out even more efficient methods, and training costs may plummet.

So, who knows, maybe in a few years, all this new GPU capacity will be rented back out to gamers instead. That’d be nice… (selfishly, I’d love to see more of Nvidia’s compute go towards GeForce Now rather than better chatbots…)