Just to be clear, I was speaking in orders of magnitude; all of Dyson’s numbers, whether talking debris velocities or final mission velocities, are of order 10[sup]3[/sup] to 10[sup]4[/sup] km/s.
Isn’t that exactly what Dyson is accounting for with the 1/4 to 1/2 factors? The optimistic 1/2 factor assumes an ideal shaped charge where all the momentum is axial, but half goes in the wrong direction. The 1/4 factor assumes a symmetric, non-shaped charge.
True, this ignores leakage losses, but they won’t be too high as long as the diameter of the hemisphere is large compared to the shock absorber length.
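For what it’s worth, those 1/4 and 1/2 factors are easy to check numerically; here’s a quick Monte Carlo sketch in Python (my own, not from Dyson’s paper):
[code]
import numpy as np

# An isotropic burst sends debris of speed U in random directions; only the
# axial component toward the plate does useful work, and only half the
# debris heads that way at all.
rng = np.random.default_rng(0)
n = 1_000_000
U = 1.0                                # debris speed (arbitrary units)
cos_theta = rng.uniform(-1.0, 1.0, n)  # isotropic: cos(theta) uniform on [-1, 1]
axial = U * cos_theta                  # axial velocity component per fragment
forward = axial[axial > 0.0]           # fragments heading toward the plate

# Effective velocity = useful axial momentum divided by *total* debris mass.
print(forward.sum() / n)  # ~0.25, i.e. U/4 for a symmetric charge

# An ideal shaped charge sends everything axially at U, but half of it goes
# the wrong way, hence the optimistic U/2.
[/code]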
I’m just wondering where that estimate comes from. Your 5 ks number is only 7% of Dyson’s bottom range estimate, which already takes into account the hemisphere losses. If that were a realistic efficiency value, it would mean the whole idea is doomed from the start (even for in-system travel) since there aren’t too many places for the waste energy to go besides heating the plate.
This is a plan for a mission in 100 years’ time. It is to a star about 4 light years away and would take 90 years. I think the ship is 700 meters long with a crew of about 200… http://100yss.org/
This talks about possible methods of travel, including theoretical methods like a warp drive and a wormhole (see the bottom): http://100yss.org/mission/challenges
So, what if we assume the taming of anti-matter as the power source? The energy density makes a nuclear reactor look puny, but I wonder if it would make that big a difference overall.
The most likely method I’ve heard of is the sailbeam method: a stream of tiny laser-accelerated light sails is used to transfer momentum to a magsail on the ship. This (to a certain extent) gets round Charlie Stross’s problem with rockets needing to accelerate their own fuel.
As for deceleration this can also be achieved (again, to a certain extent) by magsail, but working against the interstellar medium this time.
Interstellar travel would be very energy-intensive but not really impossible.
It’s interesting reading through all this that Orion, despite having been cancelled over 50 years ago, still seems like the most realistic concept for interstellar travel. In other words, it doesn’t seem like a lot of progress has been made, even just conceptually.
Stross’s essay + comments were interesting too.
It’s funny, though, his assumption that economics is the basis, or motivation, for people’s decisions or actions. I think that’s been disproven any number of times. Mt. Everest is littered with corpses, and people keep going. Even though there’s nothing there. It’s just the highest point on this particular planet.
Part of this is that interstellar travel is only at the absolute extreme fringe of Orion’s capability; it’s not really practical at that level, just remotely possible. It is also currently unneeded for interplanetary (intra-solar system) travel, at least unmanned interplanetary travel, since an Orion ship requires vast resources and there are more resource-efficient, though slower, options available. If we had a pressing need to send a man/woman to Europa, Titan, or Pluto (anyone remember that one, so unfairly expelled from our planetary system?), that would seem a reason for an Orion drive system; so far no pressing need has been found. But interstellar is another thing altogether, and Orion along with some form of suspended animation is just the bare-bones minimum possible for such a journey.
How about unmanned probes, then? You still have to keep the thing running for 50-500 years, and the mass/thrust issue is still there, as is decelerating the probe into orbit around the target star without zipping on by.
You don’t necessarily need to decelerate the probe. There’s a lot of science that can be done through a flyby, as the Voyager probes have shown. And the probe could travel on to visit other star systems.
The idea that sounds most plausible to me is the Starwisp and similar ideas, where very lightweight unmanned probes are propelled by an external laser or microwave beam. We need a large beaming station in space, equipped with either a lens/mirror system many miles in diameter, or a microwave antenna hundreds of miles in diameter. Still, these are extensions of currently available technology.
One of the things about Orion is that it envisions accelerating for a short while and then coasting most of the way. What is the reason for that? Is it a resource constraint? (Not enough uranium, plutonium, deuterium or whatever.) A technological/economic constraint? (It’s too expensive to make that many bombs.) A physical constraint? (Carrying so many bombs would make it too heavy to push.) Or something else?
I mean, constant acceleration would be nice. For one, it means you get to take gravity with you. For another, you get there a lot faster.
The mass of fuel (bombs) you would need to carry to provide constant acceleration would quickly become too heavy, and the sheer number of bombs required would get very large.
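To put a rough number on it (illustrative figures of my own, assuming an optimistic 10,000 s specific impulse): sustaining 1 g for even a single year requires a delta-v of roughly the speed of light, and the Tsiolkovsky mass ratio becomes absurd:
[code]
g = 9.81            # m/s^2, one gravity of constant acceleration
year = 3.156e7      # seconds in a year
dv = g * year       # ~3.1e8 m/s -- nonrelativistically, already about c
isp = 10_000        # s, an optimistic Orion-class specific impulse
ve = isp * g        # effective exhaust velocity, ~98 km/s

exponent = dv / ve  # Tsiolkovsky: m0/mf = exp(dv / ve)
print(f"delta-v = {dv:.2e} m/s, mass ratio = e^{exponent:.0f}")
# e^3156 -- far more bomb mass than there is matter in the solar system
[/code]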
That said, Orion is one of the few designs that gets much better/more efficient as you upscale it. So then you run into the trouble of how big you can make it.
That would seem remotely possible, but it may need something like self-repairing circuits: maybe electronics kept frozen and unpowered inside a very large shield, then warmed up and activated upon arrival.
The assumptions that all energy will be transferred, and indeed that the entire energetic output of the bomblet will be issued as kinetic energy, are highly idealized conditions that cannot be realized; Dyson is just estimating upper bounds, not realistic conditions. There are ways to design a nuclear device that will create an asymmetric field, but it isn’t even remotely feasible to design a device such that the output is along a single axis, so the uniform spherical (U=U’/4) condition for the distribution of the debris field is more representative of reality. However, unless you are a fly on the casing of the bomblet, this isn’t your effective exhaust velocity; it is the initial speed of the field. Even if we assume perfectly elastic transfer of momentum, there will still be large losses due to heating and ablation of the pusher plate.

And the assumption of a pure fusion device “burning completely to helium, with all the energy going to the kinetic energy of the debris” is wholly unwarranted; nuclear weapons are good at heating and/or irradiating other substances, but the part of the yield that goes to kinetic energy of the helium products is small (~20%) for the D-T reaction. (The impressive pyrotechnics you observe when a nuclear fusion weapon is detonated in the atmosphere are largely due to the use of the resultant neutrons to accelerate the fission reaction, creating gamma and x-rays which heat the air to incandescence. A nuclear explosion in vacuum is much less impressive.) Of course, you can add a neutron absorber such as boron or xenon, or use a neutron reflector to redirect the other 80% of the yield to a fissile substance, and then use the radiation developed by that to heat a propellant (like polystyrene or polyethylene); but while this can be made directional via the use of a ‘gun tube’, the thermal efficiency and resulting conversion to kinetic energy is not going to be impressive unless there is a large mass of this propellant.

The efficiencies of the system scale as the vehicle (and the output of the bomblets) gets larger, but getting above 20,000 seconds of impulse requires an extraordinarily large vehicle, on the order of a million tons with a pusher plate several kilometers in diameter. Even if we had the resources to build enough bomblets to power such a vehicle (which would require vastly more nuclear weapons than have ever been constructed), actually constructing the vessel would be a herculean task. It would require a massive infrastructure for in-situ processing and fabrication in space which is itself well beyond the existing state of the art, regardless of cost.
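To put a number on that ~20% figure, the standard D-T energy split (textbook values) can be checked in a couple of lines of Python:
[code]
# Energy split of the D-T reaction (standard textbook values): the charged
# alpha carries 3.5 MeV of the 17.6 MeV total; the neutron takes the rest.
E_alpha = 3.5     # MeV, kinetic energy of the He-4 product
E_neutron = 14.1  # MeV, carried off by the neutron
frac = E_alpha / (E_alpha + E_neutron)
print(f"{frac:.1%} of the D-T yield goes to the charged helium debris")  # ~19.9%
[/code]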
A hemispheric pusher plate is not needed for a spherical explosion, and indeed, it would be difficult to build such a structure that would survive the loads. Most concepts for nuclear pulse propulsion use a flat or shallow concave plate, and as long as the standoff distance is less than 1/12 of the diameter of the plate, more than 99% of the initial axial radiative field will be eclipsed by the plate, which is part of why the vehicles are more efficient as the size goes up. (You can verify this for yourself by looking at the geometry of the “cap” formed where the pusher intersects the expanding sphere, knowing that the forward field is defined by U’/4 · cos[SUP]2[/SUP]θ, where θ is the half-angle of the intersect to the vehicle axis.) However, any real propellant is going to stagnate and expand as the material forward of it interacts with the pusher plate, so regardless of the size of the plate there will also be some losses due to edge effects.
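Here is that cap geometry worked out as a quick Python sketch (my own, assuming the cos[SUP]2[/SUP]θ forward field quoted above; the normalized integral of cos[SUP]2[/SUP]θ·sinθ out to half-angle θ works out to 1 - cos[SUP]3[/SUP]θ):
[code]
import math

# Fraction of the forward cos^2(theta) field intercepted by a flat plate:
# the normalized integral of cos^2 * sin from 0 to t is 1 - cos^3(t).
def intercepted_fraction(standoff, diameter):
    t = math.atan((diameter / 2) / standoff)  # half-angle subtended by the plate
    return 1 - math.cos(t) ** 3

# Standoff of 1/12 the plate diameter, as above:
print(intercepted_fraction(standoff=1.0, diameter=12.0))  # ~0.996, i.e. >99%
[/code]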
Most of the energy goes into heating the propellant and free radiation (x-rays, gammas, neutrons), and relatively little into kinetic energy. (Note, this is a problem with all thermodynamic processes; except under very tightly controlled conditions you can almost never extract more than a modest fraction of the thermal input into the system.) Unlike a normal rocket engine which is contained within the vehicle, all of the initial heating goes on outside the vehicle, and as long as the vehicle can be insulated from the thermal radiation, the requirements for thermal heat rejection do not apply.
My numbers come from some research and analysis I did to assess the feasibility of using nuclear pulse propulsion for asteroid deflection. I don’t have a single source or simple calculation (though I did assume the use of a gun tube and polystyrene as above to attain some directionality, with some coarse estimates on the divergence angle). However, this paper provides essentially the same conclusions. The authors state that “The realistic maximum Isp obtainable with fission-based EPPP is ~100,000 seconds…however, this type of performance would only be possible with very large spacecraft.” Unless you are prepared to build a complex spacecraft the size of a middling asteroid–and vastly larger than any single existing man-made structure–attaining a specific impulse above 10,000 seconds probably isn’t practical, and certainly not at the current state of the art.
Unless you happen to come across a clump of antimatter wandering through space–and if you do, please don’t bring it home with you–antimatter is not a power source; it is a storage medium. Given the energies required to create the conditions where antimatter can be generated, the efficiency is ridiculously low. The output of antimatter reactions will give the highest effective exhaust velocity physically possible, but only if you can figure out some way to direct it; otherwise, you are left with the same issue of thermalizing it into a propellant, and then controlling that propellant to extract a kinetic impulse from it.
For a rocket to generate thrust it has to expend propellant. In order to carry more propellant, it has to expend more propellant. This means that the longer you want to thrust, or the higher the thrust rate, the amount of propellant you have to carry grows exponentially. You can evaluate this for yourself using the Tsiolkovsky rocket equation. The only way to overcome this is to improve the propulsive performance (specific impulse), although this comes at the cost of higher power requirements. At delta velocities exceeding the exhaust velocity of the rocket, the efficiencies become very low.
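A quick sketch of that exponential penalty (my illustrative numbers, assuming an exhaust velocity of 50 km/s, i.e. roughly a 5,000 s specific impulse):
[code]
import math

def mass_ratio(delta_v, exhaust_velocity):
    """Tsiolkovsky: initial/final mass needed for a given delta-v."""
    return math.exp(delta_v / exhaust_velocity)

ve = 50_000.0  # m/s
for dv in (25_000, 50_000, 100_000, 200_000):  # m/s
    print(f"delta-v {dv / 1000:>4.0f} km/s -> mass ratio {mass_ratio(dv, ve):6.1f}")
# 1.6, 2.7, 7.4, 54.6 -- each doubling of delta-v squares the mass ratio
[/code]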
At any rate, even if it were possible to achieve speeds of an appreciable fraction of c, manned interstellar spaceflight would still be impractical. Even with the propulsion power source being external, you would still have to deal with the energy necessary to maintain a habitat, which would require large radiative surfaces. Even at 10% of the speed of light, the nearest likely candidate for a habitable world is over two centuries away. Maintaining a closed environment without additional resources for that duration is well beyond any extant technology.
All true, but remember that Dyson’s bottom-range estimate was based on known nuclear devices. In fact, it appears that one can do at least double Dyson’s estimate of 1 kt/kg; the W87 warhead is allegedly in the realm of 2 kt/kg. How much boron-doped polystyrene do we need to capture a reasonable fraction of the radiation (I think we can let the neutrinos go)?
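Back-of-envelope (my own assumption on the kinetic fraction, not Dyson’s figure): at 2 kt/kg, with the ~20% of yield mentioned upthread ending up as debris kinetic energy, the debris speed lands comfortably inside the 10[sup]3[/sup] to 10[sup]4[/sup] km/s range:
[code]
import math

KT_PER_KG = 4.184e12  # J/kg of yield per (kt/kg) of yield-to-mass ratio

def debris_speed(yield_to_mass, kinetic_fraction):
    """Debris speed if a fraction of the yield becomes kinetic energy: v = sqrt(2*f*E)."""
    return math.sqrt(2 * kinetic_fraction * yield_to_mass * KT_PER_KG)

# W87-class 2 kt/kg, 20% of yield to debris kinetic energy:
print(f"{debris_speed(2.0, 0.2) / 1000:.0f} km/s")  # ~1800 km/s
[/code]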
It certainly depends on the process. And of course Carnot tells us that efficiency goes up with increased “hot” temperature and decreased “cold” temperature. A nuclear bomb detonating in space certainly qualifies, so it’s surprising to me that one couldn’t extract kinetic energy with fairly high efficiency.
How does the physics of a hot, expanding debris cloud work in space, anyway? You start off with a hot ball of vaporized material (the radiation that didn’t go to heat is clearly lost forever) that should cool as it expands. How much is lost to further radiation, remaining heat, etc. vs. kinetic energy?
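On the Carnot point, the formal limit is effectively 100% for fireball-like temperatures (illustrative numbers of my own), so whatever is lost is lost to practicalities, not to thermodynamics:
[code]
# Carnot limit with an early-fireball plasma as the hot reservoir and deep
# space as the cold one (order-of-magnitude temperatures, my assumption):
T_hot = 1e7   # K
T_cold = 3.0  # K
print(f"Carnot limit: {1 - T_cold / T_hot:.7f}")  # ~0.9999997
[/code]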
Thanks; I’ll read that later (the site isn’t responding right now). I had certainly been assuming a large craft on the order of a million tons–NPP doesn’t seem worthwhile otherwise.
I’m not sure I agree on the practicality of such a large object, though. A million tons is only a modestly sized dam. And while launching that much material into space is impractical at current rates, there are vast economies of scale achievable if we needed to launch 10,000 100-ton rockets. $1000/kg seems quite achievable and would mean launch costs of only $1T. Still gotta assemble it, of course…
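Checking that arithmetic:
[code]
mass_kg = 1_000_000 * 1000         # one million metric tons
cost_per_kg = 1000                 # USD, the optimistic rate quoted above
launches = mass_kg / (100 * 1000)  # 100-ton payload per launch
print(f"{launches:,.0f} launches, ${mass_kg * cost_per_kg / 1e12:.0f}T total")
# 10,000 launches, $1T
[/code]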