Morality of Simcity type games

Hm. Makes me wanna go back and reread the Old Testament, don’cha think?

And while it’s true that a Goblin is simply a buncha “If situation X, then reaction Y,” subroutines… how is a cockroach any different? I have yet to see a cockroach or earthworm which would fling itself on a bug bomb to save its comrades.

Seems to me that as far as programming and subroutines go, there IS no difference between a Goblin or Sim and a cockroach… which would seem to bring up a morality issue, if you feel that killing roaches is wrong.
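For what it's worth, that "if situation X, then reaction Y" description really can be written down in a few lines. This is just a toy sketch (the situations, reactions, and the function name are all invented here, not from any actual game); the point is that a cockroach's behavior could be modeled with the exact same structure:

```python
# A hypothetical stimulus -> response table: the entire "mind" of a game Goblin.
# Nothing here requires an inner life; a cockroach model would look the same.

def goblin_react(situation):
    """Return the Goblin's canned reaction to a situation (rules invented)."""
    rules = {
        "sees_player": "attack",
        "low_health": "flee",
        "hears_noise": "investigate",
    }
    # Anything not covered by a rule gets a default behavior.
    return rules.get(situation, "wander")

print(goblin_react("low_health"))   # -> flee
print(goblin_react("sees_player"))  # -> attack
```

If there's a moral line between squashing this and squashing a roach, it isn't going to be found in the subroutines.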

For fun, at least.

Are these simulations moral agents?

If they are not moral agents, then we could probably justify subjecting them to the same treatment we subject animals to, including suffering, poor living conditions, and death. Even so, in most Western societies it is not considered right to pit animals against each other for human pleasure.

If they are moral agents then they deserve the rights of a moral agent.

I believe that in the final analysis, in order to judge the morality of a game, one should first take a serious and sober look at the underlying purposes and objectives of the game.

My first experience in this genre, or, for American readers, gener, was a simplified version of “Simcity”, possibly even a forerunner, called “Simbrick”.

In this game, the information a player could call up for guidance provided a diagram giving the precise dimensions of the brick in both imperial and metric units.

The action part was the most exciting and depicted a brick falling on a large inoffensive ant from a great height, with predictable and bloody results.

The pointless killing of the ant troubled me greatly so I refused to play this version of the game after a few sessions at the expert level.

Well, all of this is countered by the game Lemmings in which you desperately try to save the little buggers from themselves. But for your involvement, they would gladly march off cliffs and into meat grinders and whatnot.

Of course, after a while, increasing frustration with the game often leads to a feeling of “Let the little bastards die!”

Hey, I get enough of that lovey-dovey stuff from my girlfriend. I just want a hot naked chick to paint my house and serve drinks when I have guests over!:smiley:

Alan Owes Bess - That game does sound pointless and cruel (unlike Age of Empires II where my valiant soldiers sacrifice for the greater glory of the msmith537 empire!)
It does seem strange that we take so much pleasure from seeing scenes of destruction and gore in movies and VGs that would horrify us in real life. People were not cheering the collapse of the World Trade Center (unless you were a guy with a ratty beard named ‘Ahmed’). But the same structure being destroyed onscreen by Jerry Bruckheimer with a 1920 Style Death Ray is guaranteed to make the audience cheer.

“On the far end of the spectrum, we have sapient artificial beings, like, say, Lt. Commander Data from Star Trek, or Andrew Martin, the Bicentennial Man (from the short story and film of that name). Data and Martin were both artificial beings, but capable of feeling and desiring and even understanding and articulating.”

I have to wonder, though (and maybe this was actually covered in a ST episode): if push came to shove, would Data’s “life” be as valuable as another crewman’s (not counting “red shirts”)?

Whenever my wife plays one of those simulation games, it always ends the same way. After the novelty wears off, she gets bored and starts devising creative ways to kill off her sims.

In one respect, this is simply human nature: growing jaded, and looking for a new source of entertainment.

In this sense, my wife is acting like an AI, controlling other AIs - as I see it.

Ah well, when simulations and virtuals achieve full self-awareness, it will be necessary to afford them some sort of sentient rights.

Simulations have the advantage that they can be backed up more easily than us fleshies, and so can be killed without too much inconvenience. As responsible virtual citizens, they will have to make sure their backups are up to date themselves.

(1) Another computer game that provides hours of simulated death and carnage, in gory squishy detail, for hundreds of hapless simulacra under your irresponsible control, is Bungie’s Myth series — which I highly recommend. (Myth I and II are long out of production, but you can find them cheap in many computer stores’ discount bins.) However, as in many games, the combatants don’t really express any suffering or aversion to death. They move and fight and charge into battle just the same as always right up to the moment when their health falls below zero, and they quickly expire.

I’m sure that’s not an accident. As other posters pointed out, a game in which the characters did suffer and scream, or beg pathetically for their lives, would cast the human player as a psychopath. Most of us don’t want to be made to feel that way, especially in a game we bought with our hard-earned cash and are using for recreation. And I’ll bet most parents don’t want to overhear such sounds coming from their teenaged boys’ bedrooms, where these games are more commonly played. (At least I hope “most” people are like this. I don’t have a supporting study to back it up I’m afraid.)

However, the characters in Sims or Myth are more like chess pieces with some special effects. I know I’m fond of Myth’s impish grenade-tossing dwarves and their deranged cackling, but I also know that they’re all identical, they have no history or future, and that the game will generate another batch of them whenever I restart the level.

(2) On the other hand, there’s the game Rune, a first-person 3D action game I’m also fond of, but which includes a feature that made me cringe with remorse the first time I encountered it.

You are a Viking in a medieval world of magic and demigods and ghoulish creatures. In several of the levels, you’ll find yourself running down castle corridors holding a flaming torch for a light source (and occasional weapon). The evil masters of the castle have hung some human prisoners upside down from the corridor ceiling. There’s nothing you can do for them; you just have to run by and leave them hanging. They are part of the scenery.

This by itself isn’t so bad. It adds to the game’s creepy atmosphere. But if you bump up against a prisoner while holding your torch, no matter how briefly, you will set him on fire and he then howls in agony while aflame. Thankfully he doesn’t convulse; I don’t think I could take that. As it is, I always apologize before moving on. That’s just the way Momma raised me.

(3) A good book that explores “artificial suffering” a bit is The Mind’s I by Hofstadter and Dennett, which (wouldn’t you know it) includes an excerpt from Lem’s Cyberiad as a starting point for one chapter’s discussion. The book is full of philosophy essays and short stories on artificial intelligence in general, how it might be built, how you’d recognize it if you found it, and the ethics of dealing with it. The book is a thought-provoking read, even if it never answers anything very concretely. Consciousness is a pretty slippery topic after all.

Isn’t this whole ethical dilemma played out on Star Trek with Data?

We are constantly left guessing as to what level of humanity he possesses and are constantly struggling with the question of whether he deserves the same rights as the rest of the crew. In fact I believe there was an episode directed at this very point when a scientist of some sort wanted to ‘open him up’ and investigate his android anatomy.

I think the consensus with Data is that it is wrong to harm him - but lurking beneath is the question of why. Imho it’s because we perceive him as human… so to harm him is, in effect, to say it is ok to harm another human.

Sorry if this is a hijack…

something i wonder when considering the nature of consciousness is, how is this different from human beings?

cloning is no longer a thing of the distant future, and if we came up with a way to “save” ourselves, would that change our morality with regards to human life?

i have to believe it wouldn’t. at least not significantly. the reason i feel this is that though for all appearances to the outside world, this “saved” copy would be me, i firmly believe that it would not be me, that this consciousness would end the moment anyone tried a “revert to saved”. i feel like if i was threatened with death, knowing that there was a perfect replica of me that would have exactly the same thoughts and memories, minus the death experience, this would not make me any more comfortable with death. bearing that in mind, my human empathy makes me feel that most others would feel the same way and i would not wish them to go through the discomfort of death, even if i wouldn’t feel the loss associated with it.

so i’m not sure having a backup copy of a simulated human would grant us any greater freedom with regards to the morality of its termination.

that is debatable. i assume your thinking means you will object to the use of teleporters as well. placed in that context, where using a teleporter would be no different than loading a ‘saved game’, there seems to be a divide between those who share your thinking and those who would sacrifice themselves for the sake of convenience.
can a cockroach feel pain?

Ramujan, that’s not the way clones work in the real world. If you were cloned, the clone would be your younger twin, not another you. There is no way to “transfer” your consciousness to the clone. And even if there were, the clone wouldn’t be some kind of tabula rasa that you need only scribe your memories onto to activate. It would be its own person, growing up quite differently from the way you did (unless you have a Boys From Brazil program in place, anyway) and therefore turning out a different person in the end.

Only in bad science fiction are clones possible full-body replacement units.

you bring up an interesting point about teleporters. i’m not sure if it works out the same though, since with teleporters, it is presumably the same molecules and such that make you up on either end, so it could still be said that the teleported version is still “you”. perhaps.

my point is that we need a better understanding of what consciousness is before we can tackle these sorts of questions.

as something more to think about, suppose you were having your brain replaced by computer bits, piece by piece. suppose, to further complicate matters, that you were awake and aware the whole way through. one would expect the first neuron replaced by an implant wouldn’t affect you terribly. but after the procedure was completed, there would be nothing left of your brain. it would be a different brain. the question, then, is at what point would you cease to be you?

that would be why i spoke of “saving” rather than cloning, giving cloning as an example that this sort of idea might be something we have to think about.

for the moment.

That’s right - the process is:

- upload your mind into electronic media;
- copy that upload as many times as you can afford;
- earn your virtual keep by working as a bit-part player in a shoot-em-up sim;
- get killed whenever the sim scenario calls for a realistic death;
- download your surviving self (or selves) into (a) realistic robot body(ies) whenever you have gathered enough virtual credits.

Such a body could be very realistic in the far future; but why limit yourself to a realistic body?

(oh, and this is probably a far too conservative view of the real future, which will be much stranger).


SF worldbuilding at
http://www.orionsarm.com/main.html

I still say it’s a silly way to do it, to embody an AI in a game character; nearly any game would be better with the AI operating at a control level. That way it’s like playing against a person, who reacts using the rules of the game, not the rules of the world of the game.

no game would be fun if you inflicted real pain on the characters - not for some moral reason that it’s ghastly… just because once you shoot the guy, he’s not going to play right anymore. if the other player is playing for their life, they will play in a way that is not so much fun as it is trying to stay alive.

think of it this way: if you invented AI and wanted to make a chess game, the worst way to do it would be to give the pieces AI. the better thing to do would be to give the AI to a second player who could move the chess pieces.

to embody the AI in a virtual character in the game world makes for a stupid game in almost any case. in most games the problem is not that the enemies aren’t smart enough: Quake could make enemies that aim perfectly, calculating the travel time of rockets and everything, and beat nearly everyone every time. that’s not fun, though; what all the programming goes toward is making bots that play like a person would play, not bots that are better at the game.
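The chess point can be made concrete with a toy sketch. Everything below is invented for illustration (the board representation and the scoring are stand-ins, not real chess logic); the contrast is just where the intelligence lives - in each piece, or in one player surveying the whole board:

```python
# Per-piece "AI": each piece only knows its own legal moves, with no view
# of the board - the "stupid" design the post argues against.
def piece_ai(piece, legal_moves):
    """A piece blindly picks from its own moves."""
    return legal_moves[0] if legal_moves else None

# Controller-level AI: one player scores every (piece, move) pair globally,
# like a human opponent would.
def player_ai(board):
    """Survey all pieces' moves and pick the best one overall (toy scoring)."""
    best, best_score = None, float("-inf")
    for piece, moves in board.items():
        for move in moves:
            score = len(move)  # stand-in for a real evaluation function
            if score > best_score:
                best, best_score = (piece, move), score
    return best

board = {"knight": ["Nf3", "Nc3"], "pawn": ["e4"]}
print(player_ai(board))  # -> ('knight', 'Nf3')
```

The per-piece version can never coordinate anything; the controller version, even with a trivial evaluation, at least reacts to the game as a whole - which is the "plays like a person" quality the post is after.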