So apparently this is old news, but I just learned that ChatGPT recently hit 800 million active users. I didn’t know they were growing so fast. There are now more ChatGPT users than there are people in all of North America or all of Europe (either one on its own, not combined).
The company also just raised another $40 billion, despite its confusing semi-non-profit status. It’s hard to imagine any sort of crowdsourced effort catching up to that, even if some of the latency and performance issues could be partially mitigated. For every volunteer BOINC has, OpenAI instead has a million dollars… from just their most recent round. Their main rivals are similarly spending tens of billions this year. That kind of capital can scale much faster than home users’ idle compute.
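A quick back-of-envelope check on that ratio (the BOINC volunteer count here is my own rough assumption for illustration, not an official figure):

```python
# Rough dollars-per-volunteer comparison. Both inputs are assumptions:
# the round size is the reported $40B; the active-volunteer count is a
# ballpark guess, used only to make the ratio concrete.
round_size_usd = 40_000_000_000
boinc_active_volunteers = 40_000  # assumed order of magnitude

usd_per_volunteer = round_size_usd // boinc_active_volunteers
print(f"${usd_per_volunteer:,} per volunteer")  # $1,000,000 per volunteer
```

If the real volunteer count is higher, the per-volunteer figure shrinks proportionally, but the point about scale stands either way.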
OpenAI is also in the process of building out a massive data center campus in Texas as part of its $500 billion Stargate project, which will be lined with new supercomputers if all goes according to plan.
If any gains are going to come from the smaller players or the open-source community, they’ll probably come not from competing on raw compute but from more efficient training, better productization, truly revolutionary breakthroughs, etc.
I guess there could always be a secondary or tertiary market for spare compute, just as there’s always been, but it can’t match the sheer scale of investment we’re seeing at the bleeding edge.
Centralized capacity just tends to scale faster than decentralized networks can keep up with, and having to deal with arbitrage on an open market only adds overhead. Some of the cloud providers already offer similar schemes in their existing, non-AI cloud services, where you pay a cheaper rate for non-guaranteed compute that can be preempted by higher-paying users (spot or preemptible instances). I think some of the AI compute marketplaces offer similar things too. But collectively, these are the leftover scraps compared to the big boys.
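As a rough sketch of how that discounted tier trades off, here’s an illustrative expected-cost calculation — every rate and the preemption-overhead factor below are made-up assumptions, not real provider prices:

```python
# Illustrative comparison of on-demand vs. preemptible ("spot") compute.
# All numbers are assumptions chosen for the sake of the example.
on_demand_rate = 3.00    # $/GPU-hour, assumed on-demand price
spot_discount = 0.70     # assumed 70% discount for preemptible capacity
restart_overhead = 1.25  # assumed 25% extra work lost to preemptions

spot_rate = on_demand_rate * (1 - spot_discount)
effective_spot_rate = spot_rate * restart_overhead

print(f"on-demand: ${on_demand_rate:.2f}/GPU-h")
print(f"effective preemptible: ${effective_spot_rate:.2f}/GPU-h")
```

Even after accounting for checkpoint/restart waste, the preemptible tier can come out well ahead for fault-tolerant workloads — which is exactly why providers can sell the scraps at all.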
But not all the big companies are sure this investment is sound. Microsoft has already pulled out of many of its data center deals, and is retreating further, especially after DeepSeek (Microsoft pulls back from more data center leases in US and Europe, analysts say | Reuters). DeepSeek’s latest update further closes the gap… it’s entirely possible to run a slimmed-down version of it for free on your own computer, BTW, though it has built-in Chinese censorship and pro-CCP propaganda in its training. Anyhow, I suppose sooner or later someone will figure out even more efficient training methods, and costs may plummet.
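For the curious, one common way to try a distilled DeepSeek variant locally is via Ollama (assuming you have it installed and enough RAM/VRAM; the model tag below is illustrative — check Ollama’s model library for current names and sizes):

```shell
# Pull and chat with a small distilled DeepSeek-R1 model locally.
# Requires Ollama to be installed; downloads several GB on first run.
ollama run deepseek-r1:7b
```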
So, who knows, maybe in a few years, all this new GPU capacity will be rented back out to gamers instead. That’d be nice… (selfishly, I’d love to see more of Nvidia’s compute go towards GeForce Now rather than better chatbots…)