GPS - Just in case the datacenter decides to move, eh?
There are two main kinds of UPS. (Well, three, if you count the guys in brown trucks…)
An offline, or standby, UPS is the kind you’d buy at Fry’s or an office supply shop. The device charges its internal battery, and when it senses a loss of incoming AC power, it turns on an inverter and starts making AC from the battery. Generally, these kick in within a few milliseconds, but there is a moment when there’s no power at all coming out of it.
An online UPS is a bit more exotic. The device charges its internal batteries with the incoming AC power, but the load is powered at all times by the batteries and inverter, so there is no interruption at all if the AC power goes out.
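To make the difference concrete, here’s a toy Python sketch of the two behaviors; the 8 ms transfer time is an assumed figure for illustration, not a spec for any particular unit.

```python
# Toy illustration of standby vs. online UPS behavior during an outage.
# The timing numbers are invented for the example, not real specs.

def standby_ups_output(t_ms, outage_start_ms, transfer_time_ms=8):
    """Standby UPS: dead air for a few milliseconds while the inverter kicks in."""
    if t_ms < outage_start_ms:
        return "utility AC"                     # mains passed straight through
    elif t_ms < outage_start_ms + transfer_time_ms:
        return "NO OUTPUT"                      # the brief transfer gap
    else:
        return "inverter AC (from battery)"

def online_ups_output(t_ms, outage_start_ms):
    """Online UPS: load is always fed from the battery/inverter, so no gap."""
    return "inverter AC (from battery)"

for t in (0, 100, 104, 115):                    # outage begins at t = 100 ms
    print(t, standby_ups_output(t, 100), "|", online_ups_output(t, 100))
```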
As mentioned above, datacenter UPS systems are sized to run things for a few minutes while the diesel generators start up and stabilize. Once you’re on generator, your primary limitation is fuel supply. In the case of a disaster such as a tornado or hurricane, your runtime may be only what’s in your onsite fuel tanks. If the roads are passable, you may have unlimited runtime, so long as your fuel provider can drive tanker trucks to your location. Of course, the runtime is not truly unlimited - at some point the engine will need maintenance.
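As a back-of-the-envelope example (the tank size and burn rate here are assumptions, not figures from any real site), runtime is just the fuel on hand divided by the burn rate at load:

```python
# Rough generator runtime estimate. All figures are assumptions for illustration.
tank_gallons = 2000          # assumed on-site diesel storage
burn_gal_per_hr = 70         # assumed burn rate for a ~1 MW genset at typical load

runtime_hours = tank_gallons / burn_gal_per_hr
print(f"Roughly {runtime_hours:.0f} hours before a tanker truck has to show up")
# -> Roughly 29 hours
```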
The last job I had had quite a power backup system.
The UPS was on line all the time for the data center. Three-phase AC was converted to DC and connected to the DC bus. Inverters connected to the DC bus changed it back to three-phase 480 VAC. The batteries were also connected to the DC bus. With a power outage, the output of the UPS does not blink. The generators come on line, and when they snatch the load, the load switches to the generators, again without a blink.
One building has three 1.5-megawatt generators, and their startup is wild.
Power goes out, they all start. The first one to reach 60 cycles and 480 volts closes its breaker and connects to the emergency bus. Then the second one syncs with the emergency bus and its breaker closes. Then the third one syncs and its breaker closes. The wild thing is that this is all done with no load on any of the generators.
When all the generators are on line, the building’s main breaker opens, disconnecting from the utility. Then all the building’s distribution breakers open. Then the cross-tie between the emergency bus and the main bus closes. Finally, the distribution breakers close one at a time until they are all closed. If there is more load than the emergency bus can provide, breakers will open until the load matches the supply.
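If it helps to picture the order of operations, here’s a rough Python sketch of that sequence; the feeder names and kW figures are invented for illustration, not the real building’s numbers.

```python
# Hypothetical sketch of the generator paralleling / load-pickup sequence
# described above. Feeder names and numbers are invented for illustration.

EMERGENCY_BUS_CAPACITY_KW = 3 * 1500   # three 1.5 MW generators

# Step 1: all generators start; each closes to the emergency bus once it
# reaches 60 cycles / 480 V and is synchronized (no load on any of them yet).
generators_online = ["gen1", "gen2", "gen3"]

# Step 2: the main (utility) breaker opens, then all distribution breakers open.
distribution_kw = {"chillers": 1800, "data_hall": 1600, "lighting": 400, "offices": 900}
closed_feeders = []

# Step 3: the cross-tie closes, then feeders are picked back up one at a time,
# shedding (skipping) any feeder that would overload the emergency bus.
load_kw = 0
for feeder, kw in distribution_kw.items():
    if load_kw + kw <= EMERGENCY_BUS_CAPACITY_KW:
        closed_feeders.append(feeder)
        load_kw += kw
    else:
        print(f"shedding {feeder}: would exceed emergency bus capacity")

print(f"carrying {load_kw} kW on {len(generators_online)} generators: {closed_feeders}")
```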
Up until I retired, I worked at a federal maximum security penitentiary as a correctional officer.
We had our own substation, which provided backup power to two institutions located on the same property.
We relied on these a few times a year for obvious reasons.
All the cell doors are electric, as well as the barriers and security doors.
Sure, we could always use a key and physically override the doors, but we’re gov’t workers, who wants to do physical work?
Whenever the substation power kicked in, the main lights always dimmed for a few seconds and then we were on ‘minimum’ power (essentials only) until the main power was restored.
I appreciate that none of the previous posters appear to work for banks, but, with reference to my OP, don’t banks have backup systems? If not, why not? Do they not consider their customers and data important enough? I’m still stunned that Santander (or, at least, their on-line operations and call centre) ceased trading for two days.
>One building has three 1.5-megawatt generators, and their startup is wild.
Reminds me of a story from a fellow I used to work with, who would have been 100 years old this year. He was trying to make a big diesel generator in the basement of the hospital deliver power, and it would spin but would not produce current. This type of generator counted on residual magnetism to start the excitation and occasionally that scheme fails, so you have to flash the exciter coil with a lantern battery while the thing runs.
Which he did. The resulting mechanical shock was so severe that the floor overhead shook and rained grime down through the entire basement.
I wonder what kind of surgeries they would have been doing around 1928, and how bad it would have been if the surgeon jumped.
Our critical operations have battery backup, a diesel generator and 24 hours of onsite fuel.
As another poster alluded to, you can keep the lights on, but if a fool with a backhoe hits the right telco lines, we cannot keep the phones and internet up.
Banking types are conservative when it comes to naming their employers on forums, especially non-banking ones.
I’m not going to run around saying,
“Hi, I’m Bob, I live in east Akron, and I work in the big First Merit building downtown.” and then spend 6 hours alternately posting about having had lots of gay sex in college, having a fascination with the Peugeot 5-series, and thinking McCain is a big dummy-hat.
It would be rather conspicuous that you’d been goofing off…
Incidentally, bank ops centers do, as a rule, have battery back-ups, off-site backup, generators and disaster recovery planning.
It appears that our friends at Santander forgot to purchase competent management for their disaster recovery planning…
Right.
Good business continuity plans include a second site:
If power/telco fails and can’t be restored in a timely fashion, my employer has a building with hundreds of empty cubicles on the other side of town, complete with a working network and phones.
I do not, however, think our IBM mainframe is terribly portable.
We do frequent backups, but… I have no idea if the mission-critical CICS environment can be made hot on a “spare” mainframe.
Anyone know if one can restore a customized CICS environment on a different processor within a couple of hours?
I don’t know how “virtual” virtualization environments are in a big-iron shop…