Are big companies safe from power cuts and other disasters?

I started this thread in MPSIMS. The thread was about the fact that my internet bank had been unavailable for two days due to a power cut. Not being able to access your bank accounts could have serious consequences for many people.

But what I really want to know is what precautions banks (and other big companies) take against power cuts, data corruption, or terrorist attacks. Not much, in the case of Santander, it would seem. Is their lack of preparation for disasters normal?

I worked in a small bank in high school, about 10 years ago. Every night, we would back up all our files onto a deal that looked like a thick CD that came out of the computer in a clear plastic case. Don’t know what it was called, but the point is it wasn’t just a CD-ROM. Then, we’d take a copy to the post office at night. The post office wasn’t too far away, so it’s not like we could stand up to an alien invasion or anything, but our vault wasn’t going anywhere anyway; a tornado or fire would have just pissed it off. I assume this is done over the interweb these days. We also had a battery thing that would keep the mainframe going for an hour if the power went down. I guess this was to have plenty of time to back everything up and not be bothered with short power interruptions. That is, not to deal with a 2-day outage.
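For what it’s worth, the modern equivalent of hauling that cartridge to the post office is usually just a scripted off-site copy over the interweb. A minimal sketch, assuming rsync is installed and SSH keys to the off-site host are already set up; the hostname and paths are placeholders:

```python
#!/usr/bin/env python3
"""Minimal nightly off-site backup sketch (illustrative only; host and paths are made up)."""
import datetime
import subprocess

SOURCE_DIR = "/var/bankdata/"                     # hypothetical data directory
OFFSITE = "backup@offsite.example.com:/backups/"  # hypothetical off-site host

def nightly_backup() -> None:
    stamp = datetime.date.today().isoformat()
    # --archive preserves ownership/permissions/timestamps; rsync creates the
    # dated destination directory on the remote end.
    subprocess.run(
        ["rsync", "--archive", SOURCE_DIR, f"{OFFSITE}{stamp}/"],
        check=True,
    )

if __name__ == "__main__":
    nightly_backup()
```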

When I worked on an energy trading floor in Oklahoma, we had some small building way out in the 'burbs (might have been a storage unit for all I know) so we could get going in short order if a big ass tornado hit. I presume the building had phone lines and a few computers.

I think factories have a different contract with the power people; I’m sure they get patched up pretty fast. Anything real important probably has diesel generators.

Not big companies, but a starting point.

There are businesses that allow a company to mirror its installation, so it can cut over if a disaster happens. It’s best if these places are far enough away not to get hit by the same problem - I remember reading that during Katrina the backup sites got clobbered also.
There are two problems. One is that not everything can be duplicated. A bank might think it is more important to ensure the safety of depositors’ records than to keep the whole ATM network up. Second, people screw up. There are many sites covering IT issues (the Shark Tank in Computerworld, for one), and a popular theme is the discovery that some clown did all the backups onto the same bad tape, or on the same bad tape drive.
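That “same bad tape” failure is exactly why any sane procedure includes a restore test, or at minimum a read-back and checksum comparison after the backup is written. A minimal sketch, assuming the backup ends up as a file you can read back; the paths are hypothetical:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large backups don't need to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(source: Path, backup: Path) -> bool:
    """Return True only if the written backup hashes identically to the source."""
    return sha256_of(source) == sha256_of(backup)

# Hypothetical usage: fail loudly now instead of discovering a bad tape months later.
if not verify_backup(Path("/data/ledger.db"), Path("/mnt/tape0/ledger.db")):
    raise RuntimeError("backup does not match source -- do not rotate the media")
```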

As voyager said, some businesses have multiple inputs from different grids, but most are just willing to deal with the interruption of service, since the cost of rare outages is simply not as high as the cost of reliable backup power systems that would only get used a few days a year at most. Any bad weather or storm that knocked out power for more than a couple of days is in all likelihood going to knock out anything other than on-site diesels anyway.

Exceptions are places like hospitals (for obvious reasons) or something like a server cluster, where the equipment costs and cost of downtime far exceed the cost of a diesel generator and tanks of diesel.
Power plants probably fare rather well during power outages as well… :smiley:

I worked at Honeywell in 1978 as a summer intern and they had these humongous motor generator flywheels that maintained a glitchless power supply for their mainframes even during a blackout. I’m pretty sure that certain companies have negotiated contracts with the power company to have a very high priority for electricity supply. During a brownout, they will be on a grid that will be the last to be cut.

The factory where I work has its own electrical substation with three power feeds from the city. I’ve seen the lights flash twice during heavy thunderstorms, when two of the feeds went down, but the third line stayed in operation.

We did have an outage years ago when the feed that we were using shut off; we had another shut off for repairs, and the city was doing maintenance on the third without letting us know. I am aware of two things that happened as a result of the outage: we sued the city for the lost productivity, and we added the substation mentioned above when we did an expansion.

I’m not sure about now, but our IT department used to do nightly backups, and I’m sure that the servers are on filtered and backed-up power. I think that this is standard for any large server setup.

Most financial companies are required to have a BCP (Business Continuity Plan). I actually work at the BCP site in Michigan for a company HQed in Florida. We test our BCP methodologies several times a year to make sure that our tier 1 systems can be up and running within 24 hours of something major (i.e., a hurricane swamping the HQ). This includes our back-office and trading systems, etc.

At our site, we have a huge diesel generator that looks like it could run half of the city, and it has thousands of gallons of fuel in a storage tank that should keep us running for days if there were an outage. We also have several Uninterruptible Power Supplies with redundancy in case there is a hiccup in the power coming into the building.
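Back-of-the-envelope, that “days” figure is just tank size divided by burn rate. A rough sketch with illustrative numbers, since actual consumption depends on the generator’s rating and load:

```python
# Rough generator-runtime arithmetic (all figures are illustrative assumptions).
tank_gallons = 5000          # assumed on-site diesel storage
burn_gal_per_hour = 60       # assumed consumption for a large genset under load
runtime_hours = tank_gallons / burn_gal_per_hour
print(f"~{runtime_hours:.0f} hours (~{runtime_hours / 24:.1f} days) before refueling")
# -> roughly 83 hours, about 3.5 days, which lines up with the "days" claim above.
```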

And even with all this, we still have systems going down occasionally. The number of variables to be taken into consideration seems to preclude 100% uptime.

This is mind-boggling.

We don’t tolerate **any** power loss, and our data centers can run indefinitely on their generators.

A few years ago, the generator and its associated electrical switching equipment were being replaced at one of our West Coast network nodes. Not even a data center - just the spot where the city-wide network plugs into an interstate long-haul link to the datacenter. We had not one but two truck-mounted generators brought in: one to run the building full-time while it was off normal utility power, and the other as a backup in case the main generator failed during the weekend-long project.

Now, scale that up to how we protect the electrical power at the actual datacenters…

**crazyjoe** - 24 hours to get “tier 1” systems back up? Must be nice to be in such a relaxed environment. :stuck_out_tongue: Our core business-critical systems have to be back up in under 30 minutes, if they’ve been allowed to fail in the first place.

I have heard that one of the after-effects of 9/11 is that many businesses have reconsidered their disaster-preparedness plans and increased the amount of planning for events on a scale smaller than nuclear war but larger than something affecting just one building or business.

Holy crap, Eureka! Y’all have everything covered, dontcha?

I guess it depends on the size of your company…our entire IT budget is in the 100 mil range. I imagine yours is a good deal higher.

Interesting, though, about the 30 minute thing…I used to work for a pretty large bank, and they had an 8 hour failover window in the case of a disaster. I guess it just depends on how much money you want to spend.

I currently work for a company which supplies mission-critical power switching equipment to large datacenters. The heart of these systems is the static transfer switch, which detects outages and transfers the load to an alternate source, usually UPS (which for a datacenter is a ginormous block of batteries supplying 480-volt, three-phase power via inverters, up to 300 kVA or more) in less than 8 ms. Additionally, some of our customers also have on-site diesel generators which they power up after an outage to have yet another source to fall back on should the UPS units fail, too. In terms of normal power outages, nearly all financial institutions are so equipped and network failures in the face of a simple outage are extremely rare.
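The reason 8 ms is fast enough is that the IT load can ride through a dropout that short on its own: a typical server power supply holds up on its internal capacitors for somewhere around 10-20 ms (that hold-up figure is an assumption on my part, not something from the switch spec). A toy comparison:

```python
# Toy check: will an 8 ms static-transfer-switch transfer beat the load's ride-through?
# The hold-up figure is an assumption (typical server PSU behaviour), not from the post.
transfer_ms = 8           # detection + transfer time quoted for the static switch
psu_holdup_ms = 16        # assumed hold-up time of a typical server power supply

if transfer_ms < psu_holdup_ms:
    print("Load never notices: the switch finishes before the PSU capacitors drain.")
else:
    print("Load may drop: transfer is slower than the equipment can ride through.")
```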

We have awesome power backup systems at the site I work at.

Didn’t stop someone from putting a backhoe scoop through our data lines.

I work in IT for a mega-corp with many distributed data centers. At least twice a month, I get an automated phone message from our disaster management team. I just have to take a couple of notes about how to resume work in the event of a hurricane/nuclear war, then press #1 to confirm, and I get permission to wear jeans to work the next day. They like seeing most people wearing jeans in the building the day after.

I think we are fairly well protected against total failure. We have absurd levels of redundancy built in with backup generators at all sites plus full fail-over capability between data centers.

What does “uninterruptible” mean in this context? Generators need something to run on.

I work for a Fortune 500 sales company, and all of our North American business data is handled through our headquarters. I toured our new datacenter a couple of years ago. It’s hooked into 2 separate parts of the grid, in case one goes down. On top of that, it’s got a bank of generators, in case that doesn’t work. I assume there is a UPS system to catch any hiccups, but I didn’t actually see it.

If that building loses power, none of the North American offices can work. About all we could do is answer the phone and take notes for later.

UPSes are batteries which kick in automagically while the backup generator powers up.

The backup generator can keep going as long as you have fuel (and it doesn’t break.) Usually the power company fixes things before you run out.
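Which is also why UPS battery banks only need to be sized for minutes, not hours: they just have to bridge the gap until the generator takes over. A rough sketch, where both figures are typical assumed values rather than anything quoted in this thread:

```python
# Why the UPS only needs minutes of battery: it just bridges the generator start.
# Both figures are assumptions for illustration.
ups_runtime_min = 10        # assumed battery runtime at full load
generator_start_s = 15      # assumed time for the standby generator to pick up the load

margin = (ups_runtime_min * 60) / generator_start_s
print(f"Battery reserve is ~{margin:.0f}x the time the generator needs to start.")
```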

By the way, when I worked for AT&T at least, all telephone switches actually ran on batteries which were continually being charged when the power was on - just like a laptop, but decades before laptops. It appears that Western Electric had excellent battery technology because a friend of mine, who is a chemist, used them for some purpose and said Western batteries were the best.
Not Energizer bunny batteries, of course.

That’s why your phone works even during a major power outage. They’ve got standard generators also, I expect.

Telephone central offices have a number of ways to combat a commercial power loss. The first line of defense is the set of large batteries that are part of the power plant. If power is lost, these batteries can continue to power the telephone switch for a number of hours. I once installed a -48V DC power plant that maxed out at 4,000 amps. There were 8 strings of batteries (24 batteries per string) connected to the plant. They could have supplied power for 3-4 hours.
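That 3-4 hour figure is consistent with the arithmetic if you assume string capacities somewhere in the 1,500-2,000 amp-hour range; lead-acid cells sit around 2 volts each, which is why 24 of them make a -48V string. A rough sketch, with the per-string capacity being an assumed value:

```python
# Rough reserve-time arithmetic for the -48V plant described above.
# The 4,000 A load and 8 strings come from the post; string capacity is assumed.
strings = 8                    # battery strings wired in parallel on the plant
amp_hours_per_string = 1800    # assumed capacity of one 24-cell string (illustrative)
load_amps = 4000               # the plant's maximum rated load

reserve_hours = (strings * amp_hours_per_string) / load_amps
print(f"~{reserve_hours:.1f} hours of reserve at full load")  # ~3.6 hours
# Real-world reserve is somewhat lower at high discharge rates (capacity derating).
```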

Usually the batteries won’t be utilized for that long. Within 10-20 seconds of detecting a commercial power failure, the generator will kick on. The generator could usually provide power for 2 days before it needed to be refueled. In fact, during the summer it is common for the power company to ask the phone company to switch off the grid and run on generators to reduce the load on the power grid.

You mean one of those yellow tractor-looking things with the big shovel on the front? We call those “cable locators”. Also, “job security”. :smiley: