Charles Perrow wrote an interesting book: Normal Accidents: Living with High-Risk Technologies. Its thesis applies directly to the space shuttle; in fact, the shuttle is one of the examples he uses.
The gist of it is that as long as we insist on doing extremely dangerous things, accidents are going to happen.
The space shuttle is dangerous because it is inherently extreme, and there is no way of getting around this fact. It is a highly complex device made of relatively fragile material, where “relatively” means relative to the forces it controls and endures. We load this eggshell with high explosives, which we then ignite. It then travels at ridiculous speeds under ridiculous conditions of stress.
In order to do this, it is necessarily highly complex. The more complex a machine is, the more prone it is to failure.
This should all be fairly obvious so far. But here Perrow makes an interesting point.
Attempting to engineer out the danger is futile and will inherently have the opposite effect, making the machine more dangerous.
If you create another safety system to ameliorate the danger, that safety system adds complexity to the device, increasing its chances of failure.
Both shuttle accidents were the results of very minor failures in extremely complex machines. The first was simply an O-ring failure due to cold temperature. The second was due to insulating foam that was itself specifically designed to prevent accidents.
A device like the shuttle is a series of engineering compromises. There are literally thousands of things like O-rings and insulation that people are screaming their heads off about as unacceptable before every launch. There are always dire warnings.
Perrow gives example after example from aviation of safety devices that have caused accidents, because they add complexity to an inherently dangerous undertaking.
Safety systems lend an illusion of safety to inherently dangerous operations, which actually makes them more dangerous.
Complexity adds danger to inherently risky circumstances. Malfunctioning coffee makers have brought down airliners on more than one occasion.
A good example is the prop plane. Almost nobody ever gets killed by walking into a spinning prop. Why not? It’s inherently dangerous. It’s out in the open. People work all around it.
The danger is very obvious. People take it seriously. People respect a spinning prop.
However, if you turn that prop into a turbine and encase it in a hunk of metal like a jet engine, then the danger is not so obvious. People don’t respect it, and they do stupid things, like sticking their hands or heads into it.
The inherent danger in safety systems, due to their complexity and the illusion they create (along with the behavior that illusion instills), can be seen quite clearly in the record of accidents on Mount Hood in Oregon.
Mount Hood is a long, slow grade. Anybody can climb it. The danger, of course, is that if you fall and start sliding you just keep going, picking up speed.
Every year climbers climb Mount Hood, and typically they rope themselves together for safety. Park rangers call this a “suicide pact.” If the man at the bottom of the rope slips, the climbers above him have a reasonable chance of arresting the relatively small amount of energy he generates. But the same rope means that if the man at the top falls, he slides twice as far, and so builds up twice the energy, before anyone below has a chance to arrest the fall. There is basically no chance at all of stopping it. If the next man in line is pulled off, you have roughly four times the energy at the next chance for an arrest. Impossible.
If the guy at the top falls, everybody falls. You have the snowball effect. The next thing that happens is that this mass of roped-together people comes sliding down Mount Hood, taking out everybody else beneath them as well.
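The arithmetic behind the “suicide pact” can be sketched with a toy model. This is a rough illustration only: it assumes a frictionless slope and an evenly spaced rope team, and the climber mass, slope angle, and rope spacing are made-up illustrative numbers, not measurements from Mount Hood.

```python
import math

def slide_energy(mass_kg, slope_deg, slide_m, g=9.81):
    """Kinetic energy (J) a mass gains sliding `slide_m` metres down a
    frictionless slope of `slope_deg` degrees: E = m * g * sin(theta) * s."""
    return mass_kg * g * math.sin(math.radians(slope_deg)) * slide_m

# Illustrative numbers: 80 kg climbers, 35-degree snowfield, 10 m rope spacing.
bottom = slide_energy(80, 35, 10)  # bottom climber: arrested after one spacing
top = slide_energy(80, 35, 20)     # top climber: slides two spacings first

# Energy scales linearly with slide distance, so each failed arrest
# at least doubles the job facing the next climber on the rope.
assert abs(top - 2 * bottom) < 1e-6
```

The point of the model is only that the energy to be absorbed grows with every failed arrest, which is why a fall from the top of the rope is so much worse than one from the bottom.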
This happens depressingly frequently on Mount Hood.
In this particular example we do have a good analogy for the space shuttle. People, like the shuttle, are relatively fragile. Climbing several thousand feet up a mountain is a complex act, as is launching a vehicle into space.
In both cases we are working with very large forces. The space shuttle’s energy and the forces it is working with are obvious, but the potential energy involved in climbing Mount Hood is also very large. It’s a long way down, and you build up a lot of energy along the way.
The danger in launching the space shuttle and in climbing Mount Hood is the same. In both cases you are trying to safely manage a potentially catastrophic amount of kinetic energy. Sliding down Mount Hood and sliding into Earth’s atmosphere are, from a physics standpoint, the same problem.
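Rough numbers make the shared problem concrete. The ~7.8 km/s orbital velocity is a well-known figure; the 500 m slide on Mount Hood is an assumed, illustrative value. The scales differ enormously, but the task is the same in kind: both the orbiter and the climber must shed kinetic energy gradually or be destroyed by it.

```python
def kinetic_energy_per_kg(velocity_ms):
    """Specific kinetic energy (J/kg) at a given speed: v**2 / 2."""
    return velocity_ms ** 2 / 2

def fall_energy_per_kg(drop_m, g=9.81):
    """Specific energy (J/kg) gained falling through `drop_m` metres
    of vertical drop: g * h."""
    return g * drop_m

shuttle = kinetic_energy_per_kg(7800)  # ~7.8 km/s orbital velocity
climber = fall_energy_per_kg(500)      # assumed 500 m of vertical drop

print(f"shuttle: {shuttle / 1e6:.1f} MJ/kg, climber: {climber / 1e3:.1f} kJ/kg")
# prints: shuttle: 30.4 MJ/kg, climber: 4.9 kJ/kg
```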
The more complex the system used to complete the task, the more prone it is to failure, and thus the higher the risk. Safety systems add complexity, and therefore risk.
Perrow’s theme is that when you are working with complexity and high levels of force, accidents are a normal occurrence. To minimize them, one needs to strip the action to its essentials. Adding complexity in the form of fallible safety systems only adds to the risk (and thus the eventual certainty) of failure.