I don't understand the relationship between data centers and water use

Here in drought-stricken Texas, water use and availability are becoming (or have become) hot political issues. At odds with that is our burgeoning high-tech industry. Enter data centers.

Apparently data centers use up a lot of water. I mean, they use a truly mind-numbing amount of water. My question is, simply, why? I get that computers and such can generate a lot of heat and that some medium needs to be used to absorb and dissipate that heat. Water is usually pretty good at that. But why do data centers consume water? It seems to me that the water should be circulated through the facility to absorb excess heat, then sent someplace where it can expel that heat into the environment (or use the heat for some other productive work), then returned to the facility to do it all again. Once the water is introduced into the system, why would it be consumed? Why can’t it be recycled pretty much endlessly?

In a similar vein, is there any truth to the claim that an AI search or whatever consumes X gallons of water? If I do one of those stupid AI pictures on Facebook, am I really destroying a measurable amount of water? Why?

They use evaporative cooling.

Basically it all comes down to money.

Closed systems cost more and require more maintenance. It is significantly less expensive to use evaporative cooling.
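To give a rough sense of why evaporation moves so much heat per gallon, here’s a back-of-envelope sketch. Every figure in it is an assumption for illustration (the latent heat of vaporization, the idealization that all the heat goes into evaporation), not data from any actual facility.

```python
# Back-of-envelope: water evaporated to reject a given heat load.
# Assumed, idealized figures for illustration only; real facilities vary widely.

LATENT_HEAT_J_PER_KG = 2.26e6   # latent heat of vaporization of water, ~2.26 MJ/kg
JOULES_PER_MWH = 3.6e9          # 1 MWh = 3.6e9 J
GALLONS_PER_LITER = 0.264

def water_evaporated_liters(heat_mwh: float) -> float:
    """Liters of water evaporated to carry away heat_mwh of waste heat,
    assuming all of the heat goes into evaporation (an idealization)."""
    kg = (heat_mwh * JOULES_PER_MWH) / LATENT_HEAT_J_PER_KG
    return kg  # 1 kg of water is about 1 liter

liters = water_evaporated_liters(1.0)  # one megawatt-hour of waste heat
print(f"1 MWh of heat ≈ {liters:.0f} L (~{liters * GALLONS_PER_LITER:.0f} gal) evaporated")
```

Under that idealization you get on the order of a few hundred gallons evaporated per megawatt-hour of heat, which is why a facility rejecting tens of megawatts around the clock adds up to a mind-numbing total.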

It’s not a direct relationship where one AI search consumes X amount of water. A data center runs a whole bunch of computers, those computers all generate heat, and that heat requires a whole bunch of cooling. If you divide the overall amount of water used during some time period by the number of AI searches or whatever served in that same period, you get X amount of water per search. But it’s not like doing one less search uses X less water. It’s the overall heat load that you need to be concerned with.

In other words, it’s averages, not per unit.
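Just to make the averaging concrete, here’s a minimal sketch of how a per-search figure gets produced. Both totals are invented for the example, not real measurements.

```python
# Illustration of how a "gallons per AI search" number is derived.
# Both totals below are made up for the example.

total_water_gallons = 500_000    # hypothetical water use over some period
total_queries = 50_000_000       # hypothetical queries served in the same period

gallons_per_query = total_water_gallons / total_queries
print(f"{gallons_per_query:.4f} gallons per query")
# This is an average, not a marginal cost: skipping one query does not
# save 0.01 gallons, because cooling tracks the facility's overall heat load.
```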

The water is never really “gone,” but it is removed from the area so others cannot use it.

Evaporative cooling means the water goes into the air and blows away, so it’s no longer of use to the people who live there. The water is still on the planet, just moving around. Sure, some comes back as rain and refills reservoirs and such, but the data centers tend to extract a lot more than comes back in the same time period.

About the evaporative cooling: that Reddit thread points out that some of the evaporated water is also used to humidify the air, since dry air encourages static electricity, which isn’t a good thing to have around electronics.

Thanks. That Reddit thread was pretty informative. I get it now. It does seem that a possible solution is to make water more expensive for this purpose. Surely there are other ways to accomplish the same goals.

Maybe longer term, there’ll be more sustainable solutions. But right now it’s a frantic bubble and everyone is just trying to catch a ride up before it all bursts.

If the “just throw more compute at it” paradigm plateaus, so will investment, and probably then cooling and energy can be considered more. They’re always afterthoughts compared to getting on the gravy train while you still can, the environment and towns be damned.

The firms controlling the money don’t care about neighborhoods or community, and neither does the federal government or most smaller governments. It’s usually up to local, neighbor-led community groups to chase out AI data centers at the grassroots level.

That’s especially the case in places like Texas, which companies often move to because California’s regulations (environmental and otherwise) are too strict.

(Edit: Toned down language after realizing we’re in FQ)

Microsoft has committed to closed loop cooling at all new data centers. The 2 newest data centers use no potable water once the system is filled.

Good for them. I hope it works out. Texas’ lax environmental regulations will be the end of us. People keep moving here (especially the Austin/San Antonio Corridor) yet there are no prospects for increased water resources. We don’t need data centers drying up what little water we have left.

That sounds to me like they’re still using water, just making sure that it’s non-potable.

You would think they would put the data centres in Canada where cooling would not be a problem.

It still would be. Even ordinary office buildings, where the heat sources are just lights, humans, and a low density of computers, usually need air conditioning even in the winter.

Plus, of course, you also need power, which means building somewhere where infrastructure already exists. And if you’re free to choose your location, the cheapest power is probably going to be hydroelectric, which already means you’re going to be somewhere with a lot of water.

Microsoft had a test data center in the ocean:

https://www.microsoft.com/insidetrack/blog/how-microsoft-kept-its-underwater-datacenter-connected-while-retrieving-it-from-the-ocean/

Well my office building in Montreal certainly didn’t need any air conditioning in the winter. Or spring and fall, for that matter. Of course, data centres are different. Heat pumps? That is an engineering question I can’t answer.

Thermal power plants run into a similar issue. Those big cooling towers that people often associate with nuclear plants (but they’re used elsewhere too) are for evaporative cooling, because the efficiency of the heat cycle is a function of the difference between the hot and cold temperatures. With evaporative cooling, the cold side is (close to) the wet-bulb temperature as opposed to the ambient temperature.
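For anyone who wants the textbook relation behind that: the ideal (Carnot) efficiency depends on the ratio of cold-side to hot-side temperature, so a lower cold side helps. The temperatures in this sketch are made-up illustrative values, not from any real plant.

```python
# Ideal (Carnot) efficiency: 1 - T_cold / T_hot, temperatures in kelvin.
# The numbers below are illustrative only.

def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    return 1.0 - t_cold_k / t_hot_k

T_HOT = 800.0                 # steam-side temperature (illustrative)
T_DRY_BULB = 273.15 + 35.0    # ambient air on a hot day
T_WET_BULB = 273.15 + 24.0    # wet-bulb temperature, lower thanks to evaporation

print(f"Dry cooling limit:         {carnot_efficiency(T_HOT, T_DRY_BULB):.1%}")
print(f"Evaporative cooling limit: {carnot_efficiency(T_HOT, T_WET_BULB):.1%}")
# A point or two of extra efficiency is a big deal at power-plant scale.
```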

There has been work on “dry cooling”, e.g., ARID | ARPA-E

Do datacenters actually use water in onsite cooling units? Or are we talking about the evaporative cooling from generating electricity?

I honestly haven’t set foot in a datacenter in 20 years, but back then the components were all air-cooled. With the fans and the aggressive cooling it was like being in an unbearably noisy meatlocker. Some people worked in there without hearing protection but I couldn’t tolerate it myself.

Liquid cooling of data center racks is now the thing because of increased rack and room densities and the greater heat generated per server.

However, you can air-cool the closed liquid loop, or evaporatively cool it using swamp coolers. Generally, it takes more energy (e.g., electricity for fans) to reject the same amount of heat with a dry forced-air radiator than with a swamp cooler; hence the incentive to choose evaporative cooling for the heat-rejection step.

(From https://blog.equinix.com/blog/2024/09/19/how-data-centers-use-water-and-how-were-working-to-use-water-responsibly/)
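To put very rough numbers on the fan-energy point, here’s a sketch comparing how much air you’d have to move to reject the same heat each way. Everything in it is an assumption for illustration (the temperature rise across the radiator, the moisture pickup in the swamp cooler, the idea that moving less air means less fan electricity); real designs are more involved.

```python
# Rough comparison: air mass flow to reject 1 MW of heat, dry vs. evaporative.
# All figures are assumptions for illustration; real designs are more involved.

HEAT_W = 1.0e6            # 1 MW of heat to reject
CP_AIR = 1005.0           # J/(kg*K), specific heat of air
DELTA_T_DRY = 10.0        # K, assumed air temperature rise across a dry radiator
LATENT_HEAT = 2.26e6      # J/kg, latent heat of vaporization of water
MOISTURE_PICKUP = 0.008   # kg water per kg air, assumed pickup in a swamp cooler

dry_air_flow = HEAT_W / (CP_AIR * DELTA_T_DRY)           # kg/s, sensible-only cooling
wet_air_flow = HEAT_W / (LATENT_HEAT * MOISTURE_PICKUP)  # kg/s, latent-dominated cooling

print(f"Dry cooling:         ~{dry_air_flow:.0f} kg/s of air")
print(f"Evaporative cooling: ~{wet_air_flow:.0f} kg/s of air")
# Moving less air generally means smaller fans and less electricity,
# which is the cost incentive described above.
```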

An adjacent discussion to this is water use in general. In parts of California and Arizona it’s an ongoing issue, usually framed around “running out of water!”, which is hocus-pocus for “my needs, my water”. What’s happening in the West is that expanding urban water needs are usurping those of traditional water users - agriculture is getting pushed to innovate irrigation methods or go out of business, and since ag uses the lion’s share of local water supplies, that is the primary area for trade-offs.

If a community in Texas accepts a data center and all that comes with it, then for now they accept the risks, too. And one risk is a finite supply of water and the need to make trade-offs with other local uses. At least until better cooling solutions come online.

Is it “the community” who’s making the decision, or is it a single landowner?

I assume they have zoning and planning commissions in Texas, no?