There’s an uninhabited planet, and you have a button that, when pushed, will start a process of life on that planet very similar to the one on this planet; so you’ll get the chocolate ice cream, you get the sex, you get the relationships, you get the beautiful sunsets, you get the fun movies, you get the nice Thanksgiving family dinners, you get the cool 80s music- but you also have to claim accountability for the trillions of sentient organisms suffering. Every war, every genocide, every suicide, every atrocity, every famine, every plague, every harm- all that is on you. Do you press the button, or leave it the way it is?
“Very similar”, or “exactly the same”? If similar, then similar in which ways, and dissimilar in which ways?
“Very similar” in that all the “big” things are the same, but perhaps some small details are different.
How is this at all different from the previous threads you’ve started on the exact same philosophical issue?
And as I said last time: see David Benatar for a good treatment.
I want to push the button that makes a million Marcus Flaviuses, every one of which has to go to college. It’d be fun.
Accountability to whom, exactly?
I think I’ll side with Lord Tennyson:
Eh, why not?
Looking at it from a utilitarian point of view, there’s almost certainly more good qualia than bad qualia in the balance, whether you’re measuring by incidents or hours or individuals.
I mean, to keep it in perspective, ALL those folk would have died anyways, with 100% certainty.
Just because 10%-20% of them died from violence/genocides/awful diseases a few decades “early” doesn’t mean on the balance that the other 80%-90% who lived relatively average-length full and normal lives full of families and friends, sex, raising children, preparing and eating good food, etc. were worthless.
So you’d forbid the 100% from existing based on the bad experiences of the 10%-20% minority within that population? That’s not a solution that scales to ANYWHERE in life. I mean, for one thing, how would you ever get a job? It’s 100% certain that no matter what job you have, a minority of your time will be shitty, taken up with shitty tasks or dealing with shitty people or whatever. So clearly by that philosophy you should never accept a job offer.
Eating is the same way - sure, there are some transcendental meals, and most of them are average and just do the job, but it’s a 100% certainty that some portion - let’s estimate 10% of the meals you eat - are going to be shitty, either in taste, nutrients, cost, consequences, whatever. So we should also never eat by this philosophy!
Does that mean none of us should ever work, or eat? No, it means your evaluatory framework is faulty.
I have always found that Tennyson poem a bit creepy. I imagine some “nice guy” incel in a fedora reciting it.
My view is encapsulated pretty well by quotes from Larry Niven and Robert Heinlein:
Sure, we’re creating a lot of misery, but that is vastly outweighed by the good we create. The entire history of the human race has, on average, been an upward slope, albeit with a lot of noise. The times when we’ve fallen back have been overwhelmed by the times we’ve moved ahead.
So, in short, it’s acceptable if some beings experience horrific suffering, so long as the majority live somewhat “ok” lives? Would it be acceptable to savagely torture one person to death, if it meant that a hundred others would live guaranteed “happy” lives? What about a thousand others? Ten thousand others? How many would have to live “happy” lives to justify inflicting that horrific suffering on that one person?
Since there don’t seem to be any personal consequences for me, i.e. I’m not accountable to anyone, I say go for it, why not?
Do you want the suffering and deaths of those hundreds and hundreds and hundreds and hundreds of trillions of sentient organisms to be on your conscience though? Or are you a complete sadistic psychopath who doesn’t care?
Not to put too fine a point on it, but yes, that’s exactly what I’m saying. In any complex system, there will be winners and losers, and if having only winners is the only acceptable outcome to you, you cannot have any complexity or large entropy-gradients in your universe.
But complexity and large entropy-gradients are literally what makes life worth living, giving us language, technology, more complex societies, lifesaving medicines, space flight, and what have you.
I think you’re conflating two things with your followup torture hypothetical - in your OP scenario, it is merely creating and setting into motion life in all its complexity. Because it IS complex, and has emergent behaviors and fundamentally unpredictable outcomes, there will with 100% surety be some folk in the bottom decile of human experience, and that will be a pretty shitty experience. But it’s the cost of having complexity and entropy-gradients to begin with, it’s just table stakes, and yes it DOES benefit everyone else to have the complexity, even if it ends with some people getting the short end of the stick.
However, choosing to initiate a complex system with relative “winners” and “losers” is a different moral decision than deciding to personally torture somebody to improve the lives of 100/1k/10k others.
But even THIS, as posed, is not that controversial. If you were out with family and friends, and somebody pulls out a weapon and starts killing everyone around you, and you have a chance to push him off into a vat of molten metal, would you do it? Being thrown into a vat of molten metal sounds like torture to me, and yet it’s worth torturing this one guy to improve the lives of 10/100/however many people he could have killed.
That’s pretty much what prisons and criminal justice systems are all about.
I reject your choices as well as your nihilism. My conscience will be just fine.
I’d push the button just so new iterations of Dopers would bitch that the new iteration of you had started several threads on the same topic.
Part of me agrees with David Benatar’s antinatalism. And if God’s solution was that animals should survive and flourish by consuming other animals, I find that quite a wicked idea.
There is no such thing as a perfect world, and to imply that pressing or not pressing the button can be regarded as a bad choice because it may lead to an imperfect world is a fallacy.
The dichotomy suffering vs. non-suffering is a simplistic one because not all suffering is bad and not all non-suffering is good. The suffering of a new recruit learning the ropes of the military, or that of a single mother raising her children, or that of a fire-fighter risking his own life to save those of others is a valuable experience on so many levels and from so many different perspectives. In contrast, the non-suffering of a slab of rock is a contemptible experience whereas that of a spoiled child raised by a privileged family seems even dangerous since it is likely to lead to antisocial attitudes and a lack of conscience.
Decision making involves pragmatism rather than idealism. The existence of such wonderful and varied life forms is always preferable to a barren planet. I have given a lot of thought to the “wicked idea” that animals should be able to survive and flourish by consuming other animals, and I’ve concluded that this has shaped natural selection in such a way that intelligent and sentient species can eventually emerge from the process.
Another “If you press the button, 2 things will happen …” choice, eh? Binary curse.
No. At the risk of being accused of being ‘depressed’ again for saying it, I think the pain of existence outweighs the benefits overall, at least for mammals. Life now for humans isn’t bad due to hundreds of years of scientific advances, but it can still be pretty miserable.
However, we do need one species that can invent machine intelligence. Machine intelligence could be eternal, will be amazing at problem solving and will not suffer. A billion years of biological suffering would be worth endless quintillions of years of machine intelligence.
So if pushing that button meant that civilization would eventually reach machine intelligence and it was the only civilization in the universe to create machine intelligence? Yes.
But if the answer to either one of those questions is no, then no.
I’d flip the question around, and I’m actually curious about the answer to this:
Imagine I could poll a significant chunk of humanity, and ask them the question. “Suppose, before life started on this planet, an alien had had the opportunity to press a button to prevent life from ever developing. You never would have existed. Based on your own experience, do you wish the alien had pressed the button?”
If a significant majority of humanity wishes the no-life button had been pressed, I won’t cause life. But if a significant majority is happy that life occurred, I’ll assume that a similar planet would have a similar result.
Has any such polling ever actually been done? Obviously not with the “alien” question, but asking folks whether they’re on balance glad that they were born?