How far in the future does morality stop mattering?

“Take Action”?

Quarantine, sure. Kill? I don’t have the stones.

If someone came to me and told me that they were a time traveler, and I was destined to kill millions with some “super flu”, I’d think they were nuts…

Godwinism time: feel free to skip the next two paragraphs, if such activities annoy you.

The Nazis claimed (without real evidence) that Jews “tainted” their (German) society, both socially and through “poor genes”. Therefore, if they removed the Jews, future Germans would be “stronger” both socially and biologically. They used this to justify the roundups and mass killings.

The only difference between your scenario (ebola that will kill millions sometime in the future) and the Nazi viewpoint is that the Nazis had a subjective value system of “preferential” genetic and sociological qualities that they felt needed “protecting/preserving”, while a disease is a naturally evolved disaster in the making.

You’re assuming that the ends justify the means.

Should we waterboard to find the ticking bomb? Other examples abound…

Thanks, Der Trihs. :slight_smile:

I always thought that the Nazis were using that as an excuse, but that’s a different thread. I know that you noted the Godwinism, but unless you believe that the Nazis truly thought they were saving more innocents by killing people now, the comparison isn’t very valid.

Also, this is Great Debates, where we debate things. I am in many ways asking the waterboarding question, but I’m adding a time element as well.

Some didn’t buy into it but used it as a convenient excuse, I am sure. I also believe that most hardcore Nazis truly felt that they were being “harsh/hard now” to preserve their way of life.

How is that in any way different from your example of killing some (a thousand, I think you said) to preserve others in the future?

You added a twist. The “time traveler certainty”.

The problem with “time traveler certainty” is that it doesn’t really include knowledge of both options - the time traveler only knows what will happen if you don’t kill the people. He’s speculating that if you kill the original 1000 carriers before they spread it, things will be hunky-dory in the future, but he doesn’t actually know. Maybe one of the dudes you’ll kill has a descendant who saved the grandfather of a guy who prevented the world from becoming a permanent living hell. He doesn’t know that, and neither do you - so I wouldn’t gamble.

I agree.

Also, with a time machine, what’s the rush? You have all the time in the world to come up with a cure, go back in time, and cure the disease at the moment it strikes the first few victims.

You shouldn’t have to conclude that mass homicide is the only answer.

“Never compromise. Not even in the face of Armageddon.”

All this presupposes that the traveller hasn’t already altered the future just by coming here, doesn’t it?

I agree completely, except to add that perhaps, in the abstract, you either kill them, or stand on your firmly-held principles, no matter what, and refuse to kill them even if it results in the destruction of the known universe.

But either way, the abstract is, as you say, nothing like the real world, which is why these sorts of hypothetical dilemmas are useless. Actually, a training/practice exercise or thought experiment that specifically bears no relation to reality is worse than useless.

What if the ebola is acid glue and the birds are Hitler? Huh? What then?

Bad example. Tuvix was murdered by Janeway, who, as we all know, was both evil and insane. Any moral action she takes is thus suspect.

Also, is the time traveler able to guarantee that in killing the 1000 you are not killing someone whose offspring would save the human race at some time in the traveler’s future?

No, they’re not equal.

If we kill 1000 people right now they’re really dead.

But we have no way to accurately predict the far future. Maybe those 5 billion will die and maybe they won’t. Maybe the space squids will show up and save us, in which case the 1000 we killed will have died in vain.

Why do you specify Catholics? Not trying to start an argument, just wondering.

I’m gonna guess it’s a reference to their “sanctity of life” beliefs.

I may be wrong, of course.

Last time I checked (I was raised Catholic but have been lapsed since childhood), the Catholic Church put a huge emphasis on not killing anyone, even if it preserved one’s own life (but self-defence was OK, if I recall).

For those of you questioning how good a scenario it is - yeah, it sucks. It was a daydream. I still think it’s a good question.

I would have to know what was in it for me. Yes, I’m just selfish that way.

I wouldn’t kill them in any of the cases. The future is a possibility but the present is reality. Real, actual, lives are worth more than vague possibilities. Besides, someone showing up, claiming to be a time traveller and saying you have to kill 1000 people has an immediate credibility problem.

By the way, in case c) don’t you have time to track down the people in question and show them the overwhelming proof that every single person descended from them will die horribly of a disease linked to their genetics? I think under those circumstances most people would agree to be sterilised.

(Thought prompted by the abortion counselling ads google is displaying at the bottom.)

Then I picture the remake of the Ducktators as directed by Quentin Tarantino. :slight_smile:

I cannot recommend this thread highly enough to anyone who likes goofy hypotheticals. Plus it’s notable for having been closed, reopened and closed again.

Here’s a counter-hypothetical.

Say there’s an emerging threat which will cause a certain amount of lost lives and output in the future. We have estimates of this damage, but they are wide - and asymmetric. That is, there is a small chance of extraordinarily large damages. (Yes, I have global warming in mind.)

Well, the rational response is to invest something to avert possible catastrophe. But how much? Or rather, in this context, how much should we raise or lower that investment given that the future damages will be absorbed by future generations, people we don’t even know?

Now businessmen make investment decisions all the time: they invest if the rate of return of their projects exceeds a hurdle rate, which in turn is related to the cost of financing for the firm.

In the public policy context the analogue would be the social discount rate: it is positive if we weight future generations’ welfare less than our own, and negative if we weight it more.

IMHO, the social discount rate should be positive: future generations will incur the damages from today’s pollution, but they will also benefit from higher technology. Indeed, I’d set the social discount rate equal to something like per capita economic growth. That way, any damages done to the future are adjusted for the gifts of new technology that they will possess.
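To put numbers on that (mine, not drawn from the thread - the $1 trillion damage figure, the 2% growth rate, and the 7% market rate below are all illustrative assumptions), here’s a quick Python sketch of how the chosen discount rate deflates a future loss into a present value:

```python
# Illustrative sketch only: all figures are assumptions, not estimates.

def present_value(damage: float, rate: float, years: int) -> float:
    """Discount a damage incurred `years` from now back to today."""
    return damage / (1 + rate) ** years

damage = 1_000_000_000_000  # assumed $1 trillion of climate damages
years = 100                 # assumed to fall due a century from now

# Social discount rate set to assumed per capita growth (~2%),
# versus an assumed market rate of return (~7%).
for label, rate in [("per capita growth (2%)", 0.02),
                    ("market return (7%)",     0.07)]:
    pv = present_value(damage, rate, years)
    print(f"{label:>22}: present value = ${pv:,.0f}")
```

At 2%, the trillion dollars discounts to roughly $138 billion today; at 7%, to about $1.2 billion. That gap is the whole argument in miniature: pick a high enough rate and almost any far-future catastrophe becomes too cheap to bother preventing.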

I imagine that others might deflate the future by a greater amount. As it happens, those who substitute in market rates of return for the social discount rate can make all manner of social problems go away: if you care little about future generations (or your own future in certain contexts), there’s little sense in caring about global warming and the like.

(I’ll leave it to someone else to uncover the sleight of hand in the preceding paragraph.)