Broadly speaking, is the shift to cloud computing net carbon positive or negative?

That’ll be relevant when fusion power actually exists. Which it doesn’t, and probably won’t for decades.

They just haven’t given ChatGPT the right prompt yet.

The OP has a chart showing the growth in cloud computing centers compared with traditional data centers. This implies (at least to me) that there’s a physical difference between the two. Is that actually the case? I thought any difference between cloud and not-cloud was purely smoke-and-mirrors software.

Anyway, as for the siting of data/cloud centers, no doubt the cost of electricity is important, but tax considerations, especially tax breaks, seem to be even more important. Otherwise, they wouldn’t be siting so many in Loudoun County, Virginia.

If you’re the Chief Technology Officer for Widgets, Inc., and you decide you need a certain amount of compute, and install servers in an air-conditioned room in the Widgets, Inc. HQ building to meet that need, that’s a traditional data center.

If you decide that you’d rather not get involved with managing that, and besides, you only need all that compute for the three weeks before Father’s Day (widgets are a traditional gift for Dad) and the rest of the time your servers are only running at 10% capacity, then you instead contract with Amazon or Google or someone to provide your compute for you. And they’ll gladly give you a bunch of extra compute in early June, and shuffle actual hardware between all of their different clients (in a way that’s hopefully completely irrelevant to you). That’s a cloud data center.
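To put some (entirely made-up) numbers on that: say Widgets, Inc. needs 100 servers for the three weeks before Father’s Day but only 10 servers’ worth of load the rest of the year. A quick sketch of the provisioning arithmetic:

```python
# Back-of-the-envelope comparison of fixed vs. elastic provisioning.
# All numbers are hypothetical, just to illustrate the shape of the math.

HOURS_PER_YEAR = 365 * 24
peak_servers = 100          # Father's Day rush
baseline_servers = 10       # rest of the year
peak_hours = 3 * 7 * 24     # three weeks

# On-premises: you must own enough hardware for the peak, year-round.
on_prem_hours = peak_servers * HOURS_PER_YEAR

# Cloud: you rent only for the load you actually have.
cloud_hours = (peak_servers * peak_hours
               + baseline_servers * (HOURS_PER_YEAR - peak_hours))

print(f"on-prem: {on_prem_hours:,} server-hours provisioned")
print(f"cloud:   {cloud_hours:,} server-hours provisioned")
print(f"cloud needs only {cloud_hours / on_prem_hours:.0%} as much")
```

Under those made-up assumptions the cloud arrangement provisions about 15% of the server-hours the in-house data center would, which is where the efficiency-of-scale claims come from.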

Right. And the part of that picture that’s relevant to the carbon footprint question is the (apparent) fact that it’s not a one-to-one changeover. The cloud hosting business exists at the scale it does because hundreds of Widgets Incs have shut down (or greatly shrunk) their individual data centers and consolidated into the cloud. The chart I included (whose weaknesses I noted, but let’s roll with it for now) should be interpreted, I think, with lots of implied granularity in the falling bar of many many individual corporate facilities and much less granularity in the rising bar of large centralized cloud facilities. Hence, my curiosity about the potential efficiency of scale, and other effects, balanced against the vastly increased demand.

I’m satisfied by the discussion in the thread that the net-carbon picture is most safely summarized as “yes the cloud might be more efficient on a square-meter basis but the massive rise in computational demand overall means more power consumption on a net basis, and thus, likely, greater carbon output.” AKA, things are worse, but not as much worse as they could have been.

Yes, the traditional data center is manpower-intensive too.

One advantage is that servers have gone virtual. It used to be that when a server arrived, you had to format and prep it, install the OS, join the domain, install the application, etc. With virtual server tech, once a server is set up as a virtual machine (a VM), or moved to virtual, you only need to prep “hosts” and add them to the server farm, and you can shuffle virtual servers from host to host. If a host needs maintenance, move the VMs to another host and reboot, replace disks, add memory, whatever. Then reboot and rejoin the farm. However, this has the obvious issue that you need at least one more host than strictly necessary to handle the load, for safety. If your VMs can run OK on 3 hosts, you need 4 anyway, plus whatever expansion. You also need someone who knows how to set up hosts, watch and manage the VMs, etc. Not to mention backing up data, being able to administer server issues, etc.
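The N+1 sizing arithmetic is simple enough; a minimal sketch (in Python, with made-up capacity numbers):

```python
import math

def hosts_needed(total_vm_load, host_capacity, spares=1):
    """N+1 sizing: enough hosts to carry the VM load, plus a spare so
    one host can be drained for maintenance without overloading the rest."""
    return math.ceil(total_vm_load / host_capacity) + spares

# If the VMs fit on 3 hosts, you still buy 4.
print(hosts_needed(total_vm_load=24, host_capacity=8))  # -> 4
```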

The cost may be acceptable to a company that has a farm of one or two dozen host servers, but for a smaller business that has say, only 3 or 4, it makes more sense to contract out the hardware side of their computing to a cloud service, now that communication tech is cheap, fast and reliable. Someone else handles hardware issues, backups, power and AC, host setup, etc. For a small business it reduces the number of IT staff needed.

I’ve seen it even go the next step. One health club I worked with had a server that ran a membership application: everything from tracking billing to checking in members and unlocking the gate. A few years ago, they went cloud. They don’t have a server, they don’t have an application of their own; they are an account on the application vendor’s cloud, sort of like a very active website. From the local PCs, it looks the same: they can generate reports, input new member data, and it can even scan the card when a member shows up and open the gate automatically. The app vendor benefits from having only the one application. Previously, they supported hundreds of sites across North America where people ran 2 or 3 versions back, since upgrading was a major effort. (One site up until a few years ago was running the version that used Access 97 as the database.) Now they have only one version to support, the cloud one, and they can implement fixes or customization as needed, they have real-life data to test improvements against, etc. Everyone wins.

The other step is virtual desktops. Again, VMs on a cloud. However, each time you log in, you get a freshly booted machine, and simply load a profile: your desktop, personal folders, etc. Everyone has the same MS Office version, Windows version, and all the latest updates applied. Upgrading Windows or an app simply means upgrading a copy of the master VM, making sure it works, then setting it so whenever someone logs in after that, they get the new version. And like server VMs, these can bounce from host to host, even while live, on the same host farm. No sending a tech out to install the same program on 100 different physical PCs. The only thing the local PC needs is the remote desktop login terminal program.
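The golden-image workflow is easy to picture as code; here’s a toy sketch (hypothetical names throughout; real VDI products like Citrix or VMware Horizon handle this for you):

```python
import copy

# The one master image everyone's desktop is cloned from.
master_image = {"windows": "10 22H2", "office": "2019"}

def login(user, master):
    """Every login gets a fresh clone of the master, plus the user's profile."""
    desktop = copy.deepcopy(master)
    desktop["profile"] = f"\\\\fileserver\\profiles\\{user}"
    return desktop

alice = login("alice", master_image)

# Upgrading everyone = upgrading the one master; no tech visits 100 PCs.
master_image["office"] = "2021"
bob = login("bob", master_image)        # bob boots with the new Office
print(alice["office"], bob["office"])   # 2019 2021
```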

Side question here - why does computing have to use so much power? Is there some physical law that requires it, or is it simply because we haven’t figured out a more efficient way yet?

As above, Jevons paradox.

Our computing systems are incredibly efficient compared to the past and efficiency continues to improve. But our use of computing has also increased, not just in scale but also in scope (we use computers for more things than we used to).

It doesn’t have to, and it doesn’t. What’s happening is that there is so much computing going on that all that computing needs power. Every time you use the internet, pay by credit card, make a phone call, send an email, get a paycheque (or check your bank balance), pretty much every activity triggers some computer activity. Your phone is telling the nearest cell tower every few seconds where it is, and a phone company computer tracks it. You drive through a toll gate, and the E-ZPass or whatever is processed by a central computer for the highways department. My local transit has every bus or train reporting its location so a central computer can tell you (or some electronic sign at the stop) when the next one arrives. Trucks and trains don’t move without a computer telling the warehouse or port what to load and where.

Basically, it’s computers all the way down.

You could probably write whole books on that, but there are many different layers to that question. There are absolute fundamental physical limits, and then there are limits on anything made out of matter, and then limits for matter that computes by moving electrons, and then limits when the matter in question is specifically silicon, and so on.

My favorite recent example of this:

There’s a public garage in a nearby big city which is our preferred parking option when we visit there. You pull in, push the button, take your ticket, gate opens, you go in and park. Just like every other garage. You take the ticket with you, and pay your parking fee at machines by the door.

Except on the way out, as you pull up to the exit gate, you don’t need to stop. At the typical garage, you insert the paid ticket into the machine to raise the gate and leave. But here, it just lifts automatically as you pull forward.

For the longest time, we just assumed this was one of the garages that’s actually free after X o’clock, so we shouldn’t have paid, we should have just come straight to the car and gone out. But then I looked closely at the ticket, and I realized our license plate was printed on it.

So what’s happening is, when you pull into the entrance, a computer is reading your plate and relaying it to the ticket-printing machine, associating your vehicle with the ticket code. At most places, when you insert your ticket in the payment machine, it’s the ticket that’s paid — but here, because the system has read your plate, it’s your vehicle that’s paid. So as you drive toward the exit, the system reads your plate again, and raises the gate for you.
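In other words (a guess at the logic, with hypothetical names; the real system’s internals are anyone’s guess):

```python
tickets = {}         # ticket code -> plate, recorded at the entrance camera
paid_plates = set()  # plates whose ticket has been paid

def entry_gate(plate, ticket_code):
    """Camera reads the plate; the printed ticket is tied to it."""
    tickets[ticket_code] = plate

def pay_at_machine(ticket_code):
    """Paying the ticket effectively marks the *vehicle* as paid."""
    paid_plates.add(tickets[ticket_code])

def exit_gate(plate):
    """Camera reads the plate again; the gate opens if the vehicle is paid."""
    return plate in paid_plates

entry_gate("ABC-1234", ticket_code="T0042")
pay_at_machine("T0042")
print(exit_gate("ABC-1234"))  # True: gate lifts, no stopping
```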

Pretty slick. And it’s a fair bit of computing overhead to achieve the speed and convenience of moving lots of cars out the exit with most of them not needing to stop.

Interestingly, there is a field called “reversible computing” which does computations in such a way that they could (if idealized) be reversed, i.e. there is no increase in entropy from doing the computation.

In practice, of course, there is always some increase in entropy, but there is no lower limit on how small that increase can be made. Because reversible computing never erases bits, it sidesteps the von Neumann-Landauer limit rather than violating it.
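For reference, the Landauer bound says that each irreversible bit erasure costs at least

$$E_{\min} = k_B T \ln 2 \approx (1.38 \times 10^{-23}\ \mathrm{J/K}) \times (300\ \mathrm{K}) \times 0.693 \approx 2.9 \times 10^{-21}\ \mathrm{J}$$

at room temperature. That’s tiny per bit, and real chips dissipate many orders of magnitude more than this per operation; reversible logic escapes the bound entirely because it never erases anything.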

I saw this at an airport hotel too, except there it took a while for the data from the hotel front desk to get transferred to the parking company contracted to run the system. We couldn’t check in, drop our luggage, and head out again without a long interaction with the front desk and the parking people.

I recall reading about a fancy new parking garage in Hoboken, where you catch the PATH train to Manhattan. It was essentially a jukebox: cars went onto an elevator and got put into slots in a tall tower. Apparently they had a series of computer failures that stranded all those cars in the tower on several occasions, before they gave up on the process.

Speaking of which, transit in NYC can now take the credit card app on your phone at the turnstiles. Several other cities are working toward this concept.

I guess the question is: what real-life situations don’t use computers? Any restaurant fancier than a food cart has a computerized order system. Elevators? Heating and air conditioning? Even your car’s ignition is computerized, not mechanical points.

The important thing is that in “classical” data centers, a “server” is a machine. In cloud computing a “server” is a binary, usually running in a container (which is a little more isolation than a raw binary, but not as much as a VM), that can be deployed to multiple physical machines.

A lot of this can be automated, too. Puppet for setting up machines, and Kubernetes can handle automatically spinning up new containers as machines are turned down to maintain the required SLO.
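The flavor of the Kubernetes side, reduced to a toy reconcile loop (this is a sketch of the idea, not the real API):

```python
# Keep N replicas of a containerized server running across healthy hosts,
# placing each one on the least-loaded host first.

def reconcile(desired_replicas, hosts):
    running = sum(len(h["containers"]) for h in hosts if h["up"])
    for _ in range(desired_replicas - running):
        target = min((h for h in hosts if h["up"]),
                     key=lambda h: len(h["containers"]))
        target["containers"].append("widget-server")

hosts = [{"up": True, "containers": []},
         {"up": True, "containers": []},
         {"up": False, "containers": []}]  # down for maintenance

reconcile(desired_replicas=4, hosts=hosts)
print([len(h["containers"]) for h in hosts])  # [2, 2, 0]
```

The real thing adds health checks, rolling updates, resource limits, and much more, but the core is that same loop: compare desired state to actual state and fix the difference.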

Sure, but it still takes some level of expertise to set it up. Before you can automate the steps, you have to know what the steps even are, and what tools are best for the automation, and so on. With a cloud service covering n different companies, you only need one person who knows what they’re doing, as opposed to n different people if each of those companies did it all in-house. OK, the cloud guy’s job is somewhat more complicated than the job of the guy running one company’s server, but it’s still big savings. And even bigger savings, if it turns out that one of the individual IT guys didn’t actually know what they were doing, and had backups that didn’t work right, or something.

So, just like in the 1970’s, '80s, 90’s, …

Yes, full circle.

Even smaller businesses were going fully virtual in the last decade. I’ve seen businesses with maybe 5 (VM) servers and a set of 30 virtual desktops on a farm of 3 hosts. When those hosts reach end of life, it would make sense to move to the cloud and let a big cloud data center manage host loads, VM loads, desktop farms, etc.

As Chronis says, the IT staff then benefit from economies of scale, and even more, they benefit from a large staff with standardized procedures.

Indeed, this is an area where AI-type applications would excel: ask the bot to set up users, grant permissions, set up shares with those permissions, set things like limited login hours, maximum resources, detect procedures taking excessive resources, maximize performance, etc. Save the complicated stuff for the human staff.

But to get back to the OP: that small business with 3 hosts is still using more electricity, running at 50% server capacity in business hours and maybe 10% at night. A data center can ensure 80% capacity, with a robot process automatically turning off hosts and bouncing VM servers to other hosts if demand drops below, say, 75%. Turning off one of 3 servers every night and rebooting it in the morning is too much automation for a one-man IT staff, and too risky.
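That policy is easy to state but risky to hand-build, which is the point; a toy version (thresholds made up):

```python
def consolidate(hosts, host_capacity, threshold=0.75):
    """hosts: list of per-host VM loads. Power off the least-loaded host
    (bouncing its VMs elsewhere) while the remaining hosts would still
    sit below the utilization threshold."""
    total_load = sum(hosts)
    while len(hosts) > 1 and total_load < threshold * host_capacity * (len(hosts) - 1):
        hosts.sort()
        evacuee = hosts.pop(0)   # drain the least-loaded host...
        hosts[0] += evacuee      # ...its VMs bounce to another host
    return hosts

# Daytime, 3 hosts at 50% of a 100-unit capacity: nothing to do.
print(consolidate([50, 50, 50], host_capacity=100))  # [50, 50, 50]

# Night, load falls to 10%: two hosts power off, one carries it all.
print(consolidate([10, 10, 10], host_capacity=100))  # [30]
```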

Systems like that are ubiquitous where I live, so much so that I feel surprised - and mildly insulted - when a gate doesn’t open automatically for me.

What does it say about me that I’ve started to expect the various genii locorum of the modern world to cater to my needs?

You’re asking the bot to do a given list of tasks. You need to back up a step: First you need to know enough to even construct the list of tasks. The top boss of your company probably isn’t a computer guy: The order he’s going to give isn’t going to be “set up users, grant permissions, set up shares”, and so on; it’s going to be “Set up a full computer system that works”, and then he’s going to trust that his computer guy knows what needs to be done to make the full system work.