If a window on a spacecraft suddenly failed in space, is the force inside equal to the force of an equally sized commercial aircraft and window breaking in the air on Earth?
Meh, can’t edit again, but I imagine the suction it would create would depend on the speed it’s travelling? If they’re going the same speed and everything else is equal, would the atmosphere have any effect on it?
The speed has very little to do with it. The important factor is the difference between air pressure inside the spacecraft or aircraft, and the pressure outside.
The International Space Station is pressurised at 1 atmosphere, 101 kPa. Space is at essentially zero. So that’s a 101 kPa difference.
A Boeing 737 at 36,000 feet is pressurised at 82 kPa, while the outside is at about 22 kPa: a difference of only 60 kPa.
So a Boeing 737 only has about 60% as much pressure differential to cope with per unit area of window as the ISS.
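To make the comparison concrete, here’s a minimal sketch (Python) using the figures above. The ~0.1 m² window area is my own illustrative assumption, not an actual spec for either vehicle:

```python
# Rough sanity check of the pressure-differential numbers above.
# Window area is an illustrative assumption, not a real spec.

def window_force_newtons(delta_p_kpa, area_m2):
    """Net outward force on a window: kPa * 1000 = Pa, times area in m^2."""
    return delta_p_kpa * 1000.0 * area_m2

AREA = 0.1  # assumed window area of ~0.1 m^2

iss = window_force_newtons(101 - 0, AREA)    # ISS: 101 kPa inside, vacuum outside
b737 = window_force_newtons(82 - 22, AREA)   # 737 at 36,000 ft: 82 vs 22 kPa

print(f"ISS window:  {iss:.0f} N")           # 10100 N
print(f"737 window:  {b737:.0f} N")          # 6000 N
print(f"737/ISS ratio: {b737 / iss:.2f}")    # 0.59
```

Even for a small window, that’s on the order of a tonne-force pushing outward on the ISS pane, and roughly 60% of that on the airliner pane.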
The so-called “suction” effect is really just the air rushing out of the pressurized vehicle. So for a small vehicle (like an Apollo capsule), the period of excitement would be much shorter than for a large volume (like a 737).
Along with the small size, the Apollo capsule used a 5 psi (34 kPa) atmosphere.
This had me scratching my head, but apparently it is correct, and it was attainable without asphyxiating the astronauts because, even after the Apollo 1 fire, they kept a 100% O2 atmosphere on Apollo spacecraft due to the weight penalty of a mixed-gas system.
The Apollo 1 fire happened at about 16 psi (110 kPa), which is why it was so fierce.
Wasn’t the thing with Apollo 1 that they had 100% O2 at 1 atm, which is bad, and after that they had 100% O2 at ~1/5th standard atmospheric pressure, giving the same effect as normal air?
The Apollo spacecraft was always intended to operate at 5 pounds of differential pressure (interior pressure five pounds above exterior pressure). In space, that means an absolute pressure of 5 psia. On the ground, if you tried to keep the interior at 5 psia, you’d have to draw a vacuum inside the capsule relative to the exterior, which would be totally bass-ackwards and reverse the direction of the pressure force exerted on the hull. So what they did instead, during the “plugs-out test” of Apollo 1, was pressurize the interior of the capsule to above atmospheric pressure (they didn’t do the full five pounds for the crewed test, but they did do 2 pounds, giving an absolute pressure inside of 16.7 pounds), and they kept the 100% O2 concentration because that’s what the capsule was designed to use in space and they wanted the test to be as close to actual flight conditions as possible.
After the fire, they still couldn’t very well reduce the pressure inside the capsule to 5 psia (because that would put the pressure stress in the wrong direction), but they did start using an O2/N2 mix on the ground. They could do this without gutting the on-board O2 system and adding weight to the spacecraft in flight; it could just be external ground-based equipment used to pressurize and feed air into the capsule for ground testing only.
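A quick sketch of the pressure bookkeeping described above, using the figures from the thread (14.7 psia is standard sea-level pressure; the rest are from the post):

```python
# Pressure bookkeeping for the Apollo design, per the thread above.
SEA_LEVEL_PSIA = 14.7        # standard sea-level pressure
DESIGN_DIFFERENTIAL = 5.0    # psid: interior pressure minus exterior

# In space the exterior is vacuum, so the cabin runs at 5 psia absolute.
in_space_cabin = 0.0 + DESIGN_DIFFERENTIAL

# On the pad, the plugs-out test used ~2 psi above ambient instead,
# keeping the hull stress pointed outward as in flight.
plugs_out_cabin = SEA_LEVEL_PSIA + 2.0

print(f"{in_space_cabin:.1f} psia")    # 5.0 psia
print(f"{plugs_out_cabin:.1f} psia")   # 16.7 psia
```

The same 5 psid design figure produces two very different absolute pressures depending on what the exterior is, which is the source of the confusion upthread.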
No, it was a fire in a pure oxygen atmosphere, that’s what made it so fierce.
It being pure oxygen doesn’t matter; what matters is how much oxygen there is. The amount of nitrogen isn’t really relevant to a fire.
I suspect the amount of N2 does matter, in that it’s a diluent which acts to reduce temperatures. Given two different atmospheres:
A) 15 psi of pure O2
B) 30 psi of a 50-50 N2/O2 blend
I’d expect fires to burn more vigorously in the former than the latter.
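The disagreement here is easy to pin down with Dalton’s law (partial pressure = total pressure × mole fraction). A quick Python sketch using the two hypothetical atmospheres above:

```python
# Partial pressures for the two hypothetical atmospheres above
# (Dalton's law: partial pressure = total pressure * mole fraction).

def partial_pressure(total_psi, mole_fraction):
    return total_psi * mole_fraction

# A) 15 psi of pure O2
a_o2 = partial_pressure(15.0, 1.0)   # 15 psi of O2, no diluent

# B) 30 psi of a 50-50 N2/O2 blend
b_o2 = partial_pressure(30.0, 0.5)   # also 15 psi of O2...
b_n2 = partial_pressure(30.0, 0.5)   # ...plus 15 psi of N2 diluent

# The O2 partial pressure is identical in both cases, so the whole
# question is whether the extra N2 soaks up heat and slows the fire.
print(a_o2, b_o2, b_n2)  # 15.0 15.0 15.0
```

So both sides agree the oxygen content is the same; the open question is purely the thermal effect of the diluent.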
Right. The interior of the Apollo spacecraft was very close to the EVA suit pressure. This meant the suits weren’t pressurized very much, and the astronauts didn’t need to depressurize their bodies before going on EVA (a spacewalk or walking on the Moon). It also put less stress on the spacecraft. EVA suits inflate like balloons, and the more you pressurize them, the harder it is for the astronauts to move their arms, hands, and legs. Apparently 4 or 5 psi is about all you can tolerate and still move adequately in a “soft” space suit. Anyway, the 5 psi worked fine for the Apollo missions.
The Space Shuttle and ISS, OTOH, are pressurized all the way up to Earth sea level, that is, 14.7 psi or 101 kPa, intended to be more comfortable for long-term habitation. But before an astronaut does an EVA, they have to slowly decompress their bodies so they don’t suddenly get the bends (like a scuba diver might). This adds a lot of time to EVA prep.
Others have said it already, but I’d still like to point one thing out because there is a widespread misconception about it: what matters is the difference in pressure between inside and outside, not the absolute pressure values. Many people have the idea that space, being a vacuum, is “infinitely sucking”, or that the pressure acting against the hull of a spacecraft from inside is enormous because of the vacuum outside. It isn’t; what you need to look at is the difference in pressure, which for the ISS - it being pressurised at sea level, with vacuum outside - amounts to the 101 kilopascals mentioned by others. There are other devices outside spaceflight which are built to withstand far, far greater pressure differentials.
With regards to the Apollo 1 fire, I’m surprised nobody has mentioned how much all the installed velcro contributed to the severity of that fire. That scene from the mini-series “From the Earth to the Moon” where they ignite a strip of velcro in a container of 100% oxygen was staggering. Or was this embellished for the show?
A lot of things burn in a “staggering” manner in 100% oxygen, so probably not embellished.
That’s probably true. But is the thermal mass of the N2 significant compared to the solid/liquid being burned, other solids, the remaining O2 and the combustion products?