Greenhouse gas contributions due to AI adoption

Are there any estimates of the impact of AI adoption, at current and near-term projected adoption levels, on the achievement of greenhouse gas emissions targets?

I understand that the big tech companies are also funding development of alternative energy options to power their processing centers, so the longer term is hard to deduce, but none of those investments are going to be supplying significant power in the medium term.

I have previously read that bitcoin mining alone was a major contributor, but I'm not hearing anything about what AI adoption does to the risk forecasts.

An informed guess would have to be based on where the centers are located or planned to be located, and on the regional energy production mixes, I imagine. Beyond my knowledge base!

Here’s something related

There is some doom-and-gloom information here:

From that article, which is itself citing a DoE report:

In analyzing both public and proprietary data about data centers as a whole, as well as the specific needs of AI, the researchers came to a clear conclusion. Data centers in the US used somewhere around 200 terawatt-hours of electricity in 2024, roughly what it takes to power Thailand for a year. AI-specific servers in these data centers are estimated to have used between 53 and 76 terawatt-hours of electricity. On the high end, this is enough to power more than 7.2 million US homes for a year.
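A quick sanity check on that homes figure, using the high-end AI estimate from the quote. The per-household average is my assumption (the EIA puts the US residential average at roughly 10,500 kWh/yr):

```python
# Back-of-envelope check on the "more than 7.2 million homes" claim.
ai_high_twh = 76               # high-end AI server estimate from the quote, TWh/yr
kwh_per_home = 10_500          # assumed average US household use, kWh/yr (EIA ballpark)
homes = ai_high_twh * 1e9 / kwh_per_home
print(f"{homes / 1e6:.1f} million homes")  # ~7.2 million
```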

That DoE report says:

As noted earlier, U.S. data centers consumed approximately 176 TWh in 2023. […] Concurrently, total GHG emissions for that same electricity grid mix would be 61 billion kilograms of CO2 equivalent.

Since 61 billion kg is 61 Mt, that would put ALL U.S. data center emissions (including but not just AI) somewhere between the emissions of North Korea and Libya: List of countries by carbon dioxide emissions - Wikipedia

They are #53 and #54 in the list of top-emitting countries. By contrast, the US puts out 4682 Mt/year of CO2e. This makes data center emissions about 1.3% of our annual emissions.

China puts out almost 3x what we do, about 13,000 Mt/year.

As a share of global emissions (39,000 Mt/yr), US data centers contribute about 0.15%.
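Showing the arithmetic behind those shares, using only the numbers already quoted in this thread:

```python
# Reproducing the shares above from the thread's own numbers.
dc_emissions_mt = 61         # all US data centers, Mt CO2e (DoE, 2023)
us_emissions_mt = 4682       # total US, Mt CO2e/yr
global_emissions_mt = 39000  # total global, Mt CO2e/yr
print(f"share of US emissions:     {dc_emissions_mt / us_emissions_mt:.1%}")      # ~1.3%
print(f"share of global emissions: {dc_emissions_mt / global_emissions_mt:.2%}")  # ~0.16%, rounded to 0.15% above
```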

AI would have to scale up a lot before it becomes a major CO2 contributor.


IMHO: This whole thing about AI energy usage is overblown media hype that carried over from COVID-era crypto reporting. Back then, “crypto is wasting energy” was in every other headline, but that made sense because crypto was wasting money doing pointless computation — artificially difficult hashing designed as a means of preserving the value of crypto coins. But even then it was never a huge amount of energy in absolute terms, just a lot of energy spent doing unnecessary work. When the crypto bubble burst, all those same reporters applied the same analyses to the AI bubble. But AI training, in its current iteration, actually requires that amount of energy… it’s not the same as the artificially inflated math-puzzle number hunts of the crypto world.

In absolute terms, neither was using all that much energy. In relative terms, AI training may still be inefficient, but it is not deliberately so… and they are trying to make it more and more efficient every year, if not out of any environmental-mindedness, simply because investor money won’t last forever.

It’s also interesting to ponder the potential second-order effects. Once AI has sufficiently affected the economy and displaced or altered some number of human professions, how would that affect global energy use? People in an industrialized society, especially those dependent on fossil fuel cars and petroleum-based farming, are also incredibly inefficient. If AI decreases the need to drive to work (or the inability to, for lack of work and lack of money to pay for gas…), that would likely more than offset its own energy use. On the other hand, if it helps concentrate power in the hands of ever fewer people who can then use it to concentrate more capital and focus it on space exploration and opening up more mining and making super-sexy chatbots that the unemployed billions depend on for their emotional needs, that could dramatically spike power demands.

If the AI bubble has any lasting power at all, and if AI has the potential to be even 10% as transformative as its proponents claim it could be, then today’s power demands are probably not going to be very informative of where we’ll be in 10-30 years…

I am past my free-article limit, apparently, so thanks for the quotes.

And the share of AI data center use worldwide is what fraction of global emissions?

A 0.15% share from the US alone is not trivial when all of the agreed plans “for 195 Parties to the Paris Agreement taken together – would lead to a 2.6 per cent decrease in global greenhouse gas emissions by 2030, compared to 2019 levels.”
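Putting those two figures side by side (a crude comparison, since one is a share of current emissions and the other a planned reduction, but it gives a sense of scale):

```python
# Rough scale comparison of the two percentages above.
us_dc_share_pct = 0.15  # US data centers' share of global emissions (from upthread)
pledged_cut_pct = 2.6   # combined Paris Agreement plans: decrease by 2030 vs 2019
print(f"equivalent to ~{us_dc_share_pct / pledged_cut_pct:.0%} of the pledged cut")  # ~6%
```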

The actual number may be less if the centers are located in favorable locations, or more, since AI data center growth has been rapid and 2024 numbers are already ancient.

Longer term, I think it will get less problematic, not more: the tech companies need reliability, so they are investing heavily in power generation that happens to be low carbon but, more importantly, is amply available to them. But I'm just asking about current best estimates, so thank you.

Here’s an estimate (a few years old) on the impact of the internet as a whole (3.7% of global emissions)

Another source I could find.

Explained: Generative AI’s environmental impact | MIT News | Massachusetts Institute of Technology

Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. This would have made data centers the 11th largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.

By 2026, the electricity consumption of data centers is expected to approach 1,050 terawatt-hours (which would bump data centers up to fifth place on the global list, between Japan and Russia).
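For what it's worth, those two data points imply a steep growth rate:

```python
# Implied annual growth rate between the two figures quoted above.
twh_2022, twh_2026 = 460, 1050
cagr = (twh_2026 / twh_2022) ** (1 / (2026 - 2022)) - 1
print(f"implied growth: {cagr:.0%}/yr")  # ~23% per year
```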

While not all data center computation involves generative AI, the technology has been a major driver of increasing energy demands.

“The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants,” says Bashir.

Of course he is talking about the pace of the buildup leading to that, but the tech companies are also looking at nuclear, solar, and wind, because fossil fuels are getting more expensive.

https://energyinnovation.org/report/coal-power-28-percent-more-expensive-in-2024-than-in-2021/

Coal Power 28 Percent More Expensive In 2024 Than In 2021

Coal’s already poor economics have only gotten worse over the last four years.

This analysis builds on our Coal Cost Crossover 3.0 report, which found 99 percent of America’s coal fleet is more expensive to keep running than replacement by new local wind, solar, and battery storage – and coal’s only gotten more expensive since we released that analysis.

I don’t have a cite for this handy, but I think it’s reasonably safe to assume that the US is the top user / emitter. So at a rough estimate, we can probably assume that the EU and China and India each add less than we use (0.15% each at most, for a total of 0.6%). Or at an upper limit, let’s say global internet usage is 4% of emissions, per @Darren_Garrison’s cite.

But ok, where does that leave us? We know that AI uses electricity; much of the training happens in the US, but the models are subsequently used by people all over the world. Per capita, it doesn’t amount to anything at all. Would you be willing to stop using the internet if it meant 4% less emissions in your life? Few people would.

And the 2.6% is just how much we’re expected to reach by 2030, but far short of the 45% we’d actually NEED to reach in order to stabilize the climate.

Sure, every little bit counts, but then again… does it? The Paris Agreement is toothless and doomed without US involvement. In the next few years, the US will probably single-handedly undo decades of climate progress as the federal government rolls back various environmental protections and subsidies and fast-tracks fossil fuel development again. A single Republican bill can eclipse all the data center usage. And even if you stopped all AI usage and development overnight, that is still going to happen.

I don’t think there’s any escaping that reality, AI or not. At least with AI by blue states and companies, there’s a chance they’ll go nuclear or renewable. Not necessarily so with Musk’s data center (which seems to have stalled) or the smaller companies that colocate their data centers next to coal & gas.

But all in all… AI is the global effort of many people in companies and governments across the world, while emissions are overwhelmingly the result of US and Chinese geopolitical and economic policy. Those are the elephants in the room, and everything else is basically a rounding error.

I think focusing on data center power usage just goes to show how far the goal posts have been moved (i.e., we no longer believe it’s possible to actually address climate at scale, so now we’re just left with nitpicking and bikeshedding).

Also note that most of the energy usage is in training. Once a model is trained, you can make as many copies of it as you like for almost no energy cost, and then use those models for very little energy cost.

FWIW this was a straightforward factual question. It was inspired by listening to a podcast this morning on my way to work, which stated that “after 20 years of really no electricity growth at all, a stagnation, just like in Europe as well, what we’re finding is a huge amount of electricity demand growth in America. A big part of it is those AI data centers …” and I was questioning how big a part of it that was, and how much short- to medium-term GHG impact it would have. Not taking a position, just trying to get the facts. In point of fact, the podcast was using this as a reason to argue why the future of green tech is extremely bright:

“… so the big tech companies behind those AI driven demand needs are funding a variety of climate technologies with great enthusiasm because they know they’re really helping solve a problem for themselves. Google has funded advanced geothermal company called Fervo, which is a unicorn and one of the most promising new technology companies that’s applying the techniques of fracking from oil and gas to actually make geothermal much more of a viable option. And that was coming to market because of Google.

And nobody would have suspected that 10 years ago. And like that, there are numerous examples where Microsoft, Meta, AWS, the big boys are all funding interesting financial models to help these new technologies come to market in a way that venture capital didn’t in the past because venture capital is a very narrow way of defining risk and return. These companies have both deep pockets, longer time horizons, carbon commitments, meaning they’re committed to being zero carbon or net zero in the long term, but also they have a selfish need for that power.”

They expound other reasons as well but that is the portion relevant to the question.

Longer term the increased demand, and specifically who is demanding it, may move the ball more than any concern for future generations could. And of course that technology will be used by more than their data centers.

Very good analysis; I think you’re spot on.

I always find it bizarre when people bring this up (not you specifically; I’m talking about the enviro-nuts). Why should the motives for driving efficiency matter in the slightest? Efficiency is efficiency, and capitalism is a vastly more effective engine for driving efficiency than political projects, decelerationism, moral outrage, and so on. The only time it goes wrong is when externalities aren’t accounted for, but that’s true of everything.

Gee, with an image like this, we can be very confident that the article is the epitome of objectivity:
Imgur

One of many things it fails to account for is that people may use AI as an alternative to other things. If each chat response uses 3 kJ, that’s like a TV on for 30 seconds. If the user is chatting instead of watching TV, they’re probably using less energy. And even more vs. playing a 3D video game.
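The TV comparison checks out if you assume a roughly 100 W set (my assumption; actual draw varies by model):

```python
# Checking the TV comparison: time = energy / power.
chat_energy_j = 3_000  # 3 kJ per chat response, as assumed above
tv_power_w = 100       # assumed TV draw; real sets vary, roughly 50-150 W
print(f"TV-equivalent runtime: {chat_energy_j / tv_power_w:.0f} s")  # 30 s at 100 W
```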

Yes, sorry, I just started blathering because I thought the factual part was relatively answered already (somewhere between 0.15% and 4%, depending…). I should’ve saved it for IMHO, sorry. Wasn’t annoyed at you personally, but at the pundits/media who keep trying to make this a big deal — IMO because it was easy, low-hanging fruit to report on, rather than all the other way bigger things that cause CO2e. Anyway.

Well, just that efficiency is often an afterthought when used this way. These companies don’t pursue it upfront, instead burning investor money (sometimes literally, in the form of fossil fuels) first in the pursuit of results, and worrying about efficiency much later, if at all. It’s notable that it’s China, which is more resource-starved (in terms of computing FLOPS) than we are, that came up with the more efficient models.

It’s also debatable whether capitalism (on its own) is a good engine for driving efficiency… fuel economy standards being an example of regulation and the market working together to pursue a goal that neither could’ve accomplished on its own. The solar market and EV subsidies are other examples of such partnerships. Sometimes political projects drive the market to where it wouldn’t naturally go otherwise. I suppose, though, that ultimately just muddies the externalities even more…

If any treat it as an afterthought, they’ll lose in the long run.

Companies building data centers treat energy very seriously. They have to, because they are often limited by generation capacity, so if they are wasteful it means they won't serve as many users or won't be able to train the most advanced models.

The chips themselves have been power-limited for many years. That is, the performance is almost entirely dictated by their efficiency. “Perf per watt” has been the driving force behind chip design for a long time, with “perf per square millimeter” being secondary (although the two are highly related).

It’s what you expect when there’s healthy competition. They came up with some interesting techniques, but they aren’t the only ones working on it. To some extent, this was them making do with what they had available–which is another very effective forcing function.

Google released an AI energy report today:

I’ve reached my complimentary limit. Care to quote the bottom lines please?

Complimentary limit of what, sorry?

You mean the MIT article? You can see it here: https://archive.ph/jRHgD

Or the original Google report here: https://services.google.com/fh/files/misc/measuring_the_environmental_impact_of_delivering_ai_at_google_scale.pdf

Or Google AI (NotebookLM)'s analysis of it: https://notebooklm.google.com/notebook/4d177d85-9e37-4223-92cd-ea6bb3f99727 (Check the FAQ on the right for basic stats, or use the chat to ask it questions of your own. NotebookLM is relatively more grounded in the actual sources, the report itself and the MIT article about it, than other AIs would normally be. It probably also uses a bit more energy than lighter prompts…)

Or some quotes:

“We wanted to be quite comprehensive in all the things we included,” said Jeff Dean, Google’s chief scientist, in an exclusive interview with MIT Technology Review about the new report.

That’s significant, because in this measurement, the AI chips—in this case, Google’s custom TPUs, the company’s proprietary equivalent of GPUs—account for just 58% of the total electricity demand of 0.24 watt-hours.

Another large portion of the energy is used by equipment needed to support AI-specific hardware: The host machine’s CPU and memory account for another 25% of the total energy used. There’s also backup equipment needed in case something fails—these idle machines account for 10% of the total. The final 8% is from overhead associated with running a data center, including cooling and power conversion.
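Side note: in absolute terms, that breakdown of the 0.24 Wh median prompt works out to:

```python
# Splitting the reported 0.24 Wh median prompt across the categories above.
# (The percentages sum to 101 due to rounding in the report's figures.)
total_wh = 0.24
shares = {"TPUs": 0.58, "host CPU/memory": 0.25, "idle backup": 0.10, "overhead": 0.08}
for name, frac in shares.items():
    print(f"{name:>16}: {total_wh * frac:.3f} Wh")
```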

This sort of report shows the value of industry input to energy and AI research, says Mosharaf Chowdhury, a professor at the University of Michigan and one of the heads of the ML.Energy leaderboard, which tracks energy consumption of AI models.
[…]
The report also finds that the total energy used to field a Gemini query has fallen dramatically over time. The median Gemini prompt used 33 times more energy in May 2024 than it did in May 2025, according to Google. The company points to advancements in its models and other software optimizations for the improvements.

Google also estimates the greenhouse gas emissions associated with the median prompt, which they put at 0.03 grams of carbon dioxide. To get to this number, the company multiplied the total energy used to respond to a prompt by the average emissions per unit of electricity.

Rather than using an emissions estimate based on the US grid average, or the average of the grids where Google operates, the company instead uses a market-based estimate, which takes into account electricity purchases that the company makes from clean energy projects. The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010. Because of those purchases, Google’s emissions per unit of electricity on paper are roughly one-third of those on the average grid where it operates.
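Dividing those two reported medians gives the emissions factor Google is effectively using (the grid-average comparison figure is my own ballpark, but it is consistent with the “roughly one-third” claim above):

```python
# Implied market-based emissions factor behind the 0.03 g figure.
g_per_prompt = 0.03    # reported median, g CO2e per prompt
wh_per_prompt = 0.24   # reported median energy per prompt
factor_g_per_kwh = g_per_prompt / wh_per_prompt * 1000
print(f"~{factor_g_per_kwh:.0f} g CO2e/kWh")  # ~125 g/kWh, vs a typical US grid
# mix of roughly 370-400 g/kWh (my assumption), i.e. about one-third
```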

Yes to the MIT Review site. Thanks for the links. Lots of information in there for me to process. (And sure to produce some carbon dioxide as I do!) :slightly_smiling_face: