Anyone actually been inside a massive data center?

I’ve worked in IT and have seen some pretty impressive corporate setups, but nothing at, say, AWS or Azure scale. My old company built a new data center in the last decade, but the majority of it went unused as cloud was more widely adopted (right or wrong probably ends up in Great Debates).

I wanted to ask - has anyone ever been inside one of these large data centers? How organized are they? What do they really look like? Power and heat?

This article got me wondering: Power Grid Worries Force Amazon To Run Oregon Datacenters Using Fuel Cells - Slashdot

Many years ago I stood beside our Cray computer which looked like a gazebo about 6 feet across with benches around it. But the power room had a generator about the size of a Volkswagen bus plus a backup.

Not a data center per se but my office was just around the corner from one of our (Sun Microsystems) compute ranches, which we used to do simulation and such. I think we were close to 10,000 processors. Helps when you make them! I did get to go inside a few times, and I even gave a talk about it at a user group meeting once.
We called it a server ranch, not a server farm, because it was too big for a farm.

I’ve been in a number of them. Modern data hotels are surprisingly modest: multiple suites, each owned by a different company, with racks inside.

I’ve also been in the data center for one of the largest telcos in Canada. My office was in the building, which required you to go through glassed-in turnstiles to access the office area. To enter the DC proper, you badged into the Network Operations Center (NOC) inside, signed in, and swapped your access card for a new one. You then went through another set of turnstiles to enter the DC.

Racks are set up back to back, with aisles at the front and back. The front side is cooled heavily; the back side is the heat exhaust side. IT’S ALSO VERY LOUD.
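To put the cooling (and the noise) in perspective, here’s a rough back-of-the-envelope sketch. The numbers are my own assumptions, not from any particular facility: a 10 kW rack and a 12 °C front-to-back temperature rise.

```python
# Rough airflow estimate for one rack: essentially all electrical load
# becomes heat, and the fans have to carry it from the cold aisle to
# the hot aisle. Assumed numbers, not from any real facility.

rack_power_w = 10_000   # rack electrical load ~ heat output, watts (assumed)
delta_t_c = 12.0        # cold-aisle to hot-aisle temperature rise, deg C (assumed)
air_cp = 1005.0         # specific heat of air, J/(kg*K)
air_density = 1.2       # kg/m^3, roughly sea level

mass_flow = rack_power_w / (air_cp * delta_t_c)  # kg/s of air required
volume_flow = mass_flow / air_density            # m^3/s
cfm = volume_flow * 2118.88                      # convert to cubic feet per minute

print(f"{mass_flow:.2f} kg/s of air, about {cfm:.0f} CFM for one rack")
```

That works out to roughly 1,500 CFM for a single rack; multiply by hundreds of racks and it’s no mystery why the fans never stop screaming.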

Back in like 2006-09 my job was to fly around and install Dell/EMC gear. I’ve been to dozens, if not hundreds, of data centers of all sizes all across the country. Some are super organized and well planned out. Others might be just a rack server or two leaning up against some guy’s desk, with cables everywhere and a desk fan.

I have been in many, back around 2000-2010. One of the most impressive was one of the IBM sites where they had the backups of many client companies. It was set up so that if a company had a failure, including things like a fire that destroyed their building, they could come to this facility the next day and continue to operate from the offices there.

The data centers were some of the neatest ones I’ve seen: climate controlled, and basically racks and racks of well-labeled cabinets.

I’ve been in, and worked in, a few medium-sized data centers that run universities. They are not fun places to work. As @FinsToTheLeft says, they are very loud. They are also hot or cold, windy, and racking servers and UPSs is a lot of work (glad this work is long in my past… no more DC ops for me). I sleep better not dreaming about blinking amber lights.

Central WA is home to some large DCs… ample water and electricity. Here is one video of an MS DC near Quincy:

Been in many, many data centers. The smaller ones are usually more disorganized than the big commercial ones. As many have said, they are loud and usually have significant access controls in place (like entry rooms that can only have one door open at a time, to isolate the occupants if need be). Very heavily climate controlled, with racks of equipment and cabling EVERYwhere. Pretty cool, but it’ll drive you nutty if you have to work in them for long periods of time.

Also, they tend to have very large generators; real monsters.

The company I worked for bid on the fire alarm service and inspection for what we thought was a huge data center. I am not sure my NDA is still valid, but I don’t want to jeopardize the company’s contracts, so I will not use the name.

This foot in the door allowed us to bid on the construction of the second location and the ensuing service and inspections. By the time I retired, the second location had about four million square feet of server space. The third location is even larger and is still under construction. Recently the fourth location has been getting underway, and our contract is still going strong.

I spent about two years at location two doing the required semiannual recalibration of the hydrogen detectors, forty hours a week in the server rooms.

Okay, tell us more about the need for hydrogen detectors. Out-gassing? Electrolysis?

Thanks for sharing. That is truly impressive. I guess since I started this thread I can add my own story:

I began my career in public accounting. As part of a financial audit, IT operations (including their related IT General Controls, ITGCs) are evaluated. The scope could include just about anything, including reviewing aspects of how data centers are managed. These visits were the highlight of this job in public accounting :slight_smile:

The worst one I saw was a small regional hospital. The building was four square walls with a courtyard in the middle. All of the roof drainage ran into the courtyard, and the data center was in an inside corner at the lowest point running water could go. They literally had several water lines/marks on the interior walls where the place had flooded (the equipment was on racks, but the power, network, etc. was in the subfloor). Running through this room were both the main standpipe and the sewer pipe, as well as a pressurized pipe about the width of my thigh carrying pure oxygen right up the center of the room. Oh, and they had a normal water suppression system triggered by heat and smoke. There were a bunch of other ridiculous things, but these were the most egregious I can remember.

This was one of my first jobs, and my manager was so pleased with my thorough observations. I only told him later that the data center manager was practically begging me to document all this stuff and was more than happy to show me how crazy it was. He’d been trying to get the place moved out of there for years (and modernize it in the meantime), but the organization did not want to pay for it.

I ordered the hardware, planned the cabling, and helped install the software on-site on “bare metal” for some research projects for a large computer corporation whose data center also hosted various projects on behalf of client companies. I thought it was pretty cool at the time, about 10 years ago. There was lots of security around it.

I went on a grade school tour of a large IBM building somewhere in the Los Angeles area. I was only 10 at the time and have never been good with directions.

It was a huge room in the biggest building we had ever seen, and it was very loud. The jerky way the reels of punched paper ran through the machines made me worry that they were going to jam their needle, like a sewing machine would. They gave each of us a little piece of the paper tape as a souvenir.

I’ve stuck my head into server rooms long enough to ask someone to come out so I could talk to them; I can’t imagine being expected to think inside one of them. That must have been horrible.

@Pleonast

Sealed lead-acid batteries can out-gas hydrogen during the charge cycle. The entire ceiling of these data centers is webbed with PVC sampling tubes that constantly pull air through. These tubes end up at a VESDA unit. Just before these terminations, each sample tube has a hydrogen detector. Each of these required calibration testing.
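For anyone curious what that calibration is actually checking, here’s a minimal sketch in Python. This is purely illustrative — my own made-up names and tolerances, not the real gear: hydrogen’s lower explosive limit is about 4% by volume, detectors report a fraction of that, and a span-gas test verifies the reading is within tolerance.

```python
# Illustrative sketch of a hydrogen-detector span check; the function
# names, tolerance, and test values are assumptions, not the actual
# equipment's behavior.

H2_LEL_PCT_VOL = 4.0  # hydrogen's lower explosive limit, ~4% by volume

def pct_lel(reading_pct_vol: float) -> float:
    """Convert a raw %-by-volume reading to percent of LEL."""
    return 100.0 * reading_pct_vol / H2_LEL_PCT_VOL

def span_check(reading_pct_vol: float, span_gas_pct_vol: float,
               tolerance: float = 0.10) -> bool:
    """Pass if the detector reads the known span gas within +/-10%."""
    return abs(reading_pct_vol - span_gas_pct_vol) <= tolerance * span_gas_pct_vol

# Example: flow a 1.0% H2 span gas (25% of LEL) at the sample tube
# and see the detector report 0.96%.
reading = 0.96
print(f"reads {pct_lel(reading):.0f}% LEL; span check passed: {span_check(reading, 1.0)}")
```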

It was a great assignment. I was left alone with my laptop and two cylinders of test gases on a cart, with a folding chair. The test results were automatically uploaded, so I had no need to communicate with the shop. It was a perfect job for an introvert.

The data center owners found one of their employees who had done this job in the Navy on a nuclear sub, so they took this off our service contract. I ended up back in the field and away from the data centers. My company still has ten or twelve techs permanently assigned to that customer.

I did hear that there was a shift from SLA batteries to lithium-ion, which would make the hydrogen detection redundant.

Not quite the same as the rest of you folks.

Back in the 1970s I spent a bunch of time in what were then large mainframe centers. Multiple dozens of disk drives the size of modern dishwashers, each holding a whopping 100 or 200 MB. Dozens and dozens of yards of processor cabinets 3 feet wide & 6 feet tall. Raised floors with hundreds of yards of cable the diameter of your forearm. Insane amounts of electricity and air conditioning.

Back in the late 90s I went back to one such center I’d not seen in 25 years. It (along with a backup site elsewhere) used to run a worldwide airline and their reservations system.

Now in a corner of the giant room were 2 multi-racks of servers and storage devices. Maybe 20 feet long and 6 feet high across their faces. And a workstation for 3 IT dudes who monitored their world from there. The rest of the giant space was empty. Simply empty.

Meantime the company had tripled in size.

It was weird, really weird.

“100 or 200 MB”? Listen, youngster, back in my day (when I used to walk 50 miles to school in the snow, uphill both ways!) those disc drives were 20 MB, not 200, and they were referred to as “washing machines”, not dishwashers, because that was what they looked like … :wink:

Seriously, the first mainframe disc drives I had any direct experience with were 20 MB. I’m thinking specifically of the DEC RP02. I can’t find any handy pictures of it, but this is the RP04 (two generations later, and clocking in at a whopping 86 MB!), which looks substantially the same. For convenience of comparison, it’s pictured in a laundromat. The similarity with washing machines is enhanced by the fact that the lid lifts up so you can put in ~~laundry~~ a removable disc pack. (You can see, through the transparent top, the retracted head assembly that pops out when the drive is activated. The seek motion of the heads caused the whole drive to shake and tremble ferociously, so the drive/head assembly was on an isolated spring mechanism.)

The News International site in Wapping originally had dozens of mainframes, but modernization reduced their needs to a handful of racks. They had spare cooling, power, and space, so they set up a side business doing server hosting. One of my clients hosted their websites there, so I visited a few times. Had to sign in at the main reception under the 20-foot-high portrait of Rupert Murdoch, next to the larger-than-life-size bust of Rupert Murdoch. Then to reach the server room you had to cross the printing floor, avoiding the robotic forklifts moving huge rolls of newsprint around. Not your typical data centre!

I spent a year or so working at one of our Big 5 banks in Canada over Y2K. The complex I worked in consisted of three buildings; one floor of one building was the DC, and the rest was offices. When the complex was originally constructed in the 70s, one floor of one building was offices and the rest was the mainframe and DASD.

It’s like white noise to me. It kept people from talking to me so I could get my job done, which was a blessing as far as I was concerned.

The first DC I worked in was nearly filled with one large blue IBM mainframe. Big Blue indeed!