Say you developed a virtual world, à la The Matrix, where you could go in and do whatever you wanted, such as eat all the time, be rich, have lots of sexual encounters, jump off cliffs without fear of death…
Assuming you were rich enough to power the machine and to have somebody feed you and clean you (as if you were in a coma) in real life, would it be immoral to be hooked up to this machine permanently and live the life you wanted to live, while the rest of the world stayed poor and suffered?
In what significant way does this example differ from current reality, where any number of wealthy individuals may choose to spend their money entirely on their own pleasures?
Naw. Sounds cool. I read a book once (“Calculating God”) where whole planetary societies would bury a giant supercomputer and upload themselves into the thing. One planet decides that once they upload themselves, they risk some other species showing up and turning the machine off. They send a bomb into a star to make it go supernova and destroy all life within a couple dozen light-years. Uploading yourself: immoral? No. Killing everything else around you so they don’t turn off the machine: immoral? Yes.
I think it would be great to store my body somewhere and walk around using a robot body or in a virtual society. If the body gets hurt, I just make a new one; my consciousness lives on.
Interesting side question: If you did permanently upload yourself into a virtual reality, would your “actions” bear any moral consequences whatsoever? Would it be “permissible” to spend all day killing nuns and raping children?
VarlosZ: If you were the only being in this world, who cares what you do? Consider a video game where you run around shooting mutants. The video-game mutants have no consciousness.
Now, if the children you were raping were REAL children uploaded into the machine, then you’d have a problem.
As for the OP, I don’t think so. Yeah, someone that rich could use the money in a much better way. But it’s their money and their body to do with as they wish and they aren’t hurting anyone by living in their own world.
In a Christian framework, would it be equivalent to challenging God’s power/authority? By creating a universe (which is sort of what we already do when we make any sort of program), are you building a sort of Tower of Babel, pretending to be God, or would God not care? What if you could live there indefinitely?
I’ll say the opposite of most here: I say that not enjoying the real fruits God has offered us is indeed immoral, not to mention shallow and self-obsessed. Some might say it’s Hell on Earth, too.
And certainly living like that has no purpose to it. In my eyes, you can never do any good. Period. You are a mindless pleasure-seeking leech on society. You have no point. Your existence, such as it is, means no more to anyone else than the digital mutants of a computer game.
Hey, if God cared that much about it, he’s big enough and old enough to do something.
Morally, it’s no different than killing other online folks in Doom or Quake or Half-Life or whatever the kids play these days…
Like any other escapist activity, it is OK, even healthy, in moderation. It shouldn’t be a substitute for actual relationships and activities.
When you say that I could never contribute anything if I was permanently installed in a VR world… well, one of my fantasies would be to have a remote cabin, off in the Rocky Mountains, where I can write uninterrupted. Would what I wrote there be somehow made invalid by the mere fact that the location I wrote it in exists in a computer?
Say I wandered into a group world: a networked world with more than just myself there, a shared vision. Would the things I said to the other people logged on not matter? Would kind deeds be invalidated because we were not flesh and blood at the time? If raping that person would still be wrong, surely helping them would still be right.
The possibilities for venue are endless. The morality, however, could still be judged by the same standards as in our current reality.
Scott Adams (of Dilbert fame) said in one of his books that when virtual worlds become as convincing as reality (like Star Trek’s Holodeck), society will collapse. He was joking that men would choose to have virtual sex rather than seek out real women, and that they would stop working and live as much of their lives as possible in the virtual world.
He was being humorous, but I think he has a point.
Think about it. You could live in a world where beautiful women threw themselves at you, where you could be rich and live out your dreams, and it would all seem perfectly real. Sure, it is meaningless, but imagine having to leave that and come back out into the real world to face your mundane existence.
I think many, if not most, would find themselves spending more and more time in the virtual world.
LOL, sounds like someone is playing too many MMORPGs. Are you addicted to EverQuest, Dark Age of Camelot, Anarchy Online, or some other one?
As for the question, I don’t think it is necessarily immoral to live permanently in a virtual world assuming you don’t have any family or friends you are leaving behind and won’t be a burden financially on society.
What if you were rich enough not only to supply yourself with nutrients and run the machine, but also to fund many charities? Assume you were going to be in the machine for the rest of your life, but every year you donated one billion dollars to charity. Not to mention paying the electric bills, and the people needed to keep up your food supply, clean you, and clean up your byproducts.
If you donated to charity and continued to help the economy, would it be okay then, smiling bandit?
IMHO this is a very troubling issue. Although I have no problem shooting the “beasties” in games like DOOM, one can imagine that if simulated people became so sophisticated that they were basically indistinguishable from real live humans, they might be conscious, have real thoughts and feelings, etc. In that case, it may be very wrong to harm them.
In fact, you don’t know that you yourself are not (right now) a simulated being in somebody else’s computer.
Curious how you posted this right after I watched the film eXistenZ, which featured several assassins from the “Realist Liberation Front” gunning for VR game creators — precisely for the reason that they considered total immersion in VR immoral.
“Say you developed a virtual world, à la The Matrix, where you could go in and do whatever you wanted, such as eat all the time, be rich, have lots of sexual encounters, jump off cliffs without fear of death… Assuming you were rich enough to power the machine and to have somebody feed you and clean you (as if you were in a coma) in real life, would it be immoral to be hooked up to this machine permanently and live the life you wanted to live, while the rest of the world stayed poor and suffered?”
I agree with Papermache on this one, though he might believe it for different reasons. To even achieve that degree of wealth, one would be required to virtualize contradictions as truth, to avoid the logical tension associated with their own negation. There is no significant difference. They have already proven suicide. As morality tends not to be linked to ethics (logically derived axioms, and the logical course of action from those axioms), I don’t think it can be considered immoral. It is simply irrational.
I think that all moral systems operate from the ‘axiom’ that ‘life is better than death’ under any and all circumstances. This is clearly not a collective, representative, or logically derived axiom; it’s emotive and personal.
Once any ‘axiom’ is established, you have ethical systems, which determine the proper course of action in regard to the axiom (how to behave consistently in light of the belief).
Virtualization in and of itself is not unethical, unless it is being used to reinforce a virtualization: a means to escape the negation of your morals in light of the vast scope of reality. There is a possibility that reality itself may turn out to be completely virtualized, in which case even the choice to live would be absurd. It is not rational to use virtualization to escape virtualization, so this decision, in my opinion, would certainly be characterized as moral. I tend to lump moral, religious, and irrational all into the same loop, until a completely transparent bridge has been established between all of these variables.
Morality describes the aspect of reality where people make decisions under a perceived window of ‘opportunity’ or ‘disaster’, such that to defer the situation would be a failure to capitalize, even though it is acknowledged that there is indeed a contradiction and that it presumably will not be solved before the window expires. Some people have their morals mapped out as pre-emptive answers to these situations (so as to ignore them entirely), and others have emergent morals that blow with the wind of the situation. The more rational a person, the more prone to inaction in these types of scenarios.
Your driver’s license info at the DMV is a virtualized representation of you, though not a particularly representative one.
Immoral? I don’t think so, but what does it say about our world if people would actually do this? It says a lot about how unwilling far too many of us are to face the problems in front of us. I mean, what if Frodo had decided that staying in his hole in the ground was more important than going into hell to save the world? I think the world is worth living in, bad things and all.