you are kidding right?
Sadly, he’s not. The Creatures fandom is, without exaggeration, stuffed with people who can’t tell the difference between a program sprite and a real animal…
I’ll bite. WRT suffering, is a program sprite different than a real animal? Are scripted responses less valid expressions of suffering than nervous ones?
I’d be more worried about what it says about the person, rather than the sim.
For instance, what about someone who abuses stuffed animals? There is no question of a teddy bear feeling pain…but the sort of person who would knife one in a vicious manner is exhibiting signs of mental distress.
If the person in question subjectively identifies with the subject, to the degree of reacting to it as if it were independent and conscious, and if the person abuses that subject, then he is behaving in an immoral fashion.
Um… I need to add that some minor wickedness is “okay” in our culture… We can cheer for the defensive linebacker who puts the hurt on the other side’s quarterback without descending into psychopathy… But if, say, one played a variant of a first-person football sim, and that was all one did – qb sack after qb sack, specializing in the nastiest possible hits – we’d have to look at that player with moral suspicion.
(Second the praise for Lem: “Tales of Pirx the Pilot” was full of scary exploration of the nature of consciousness.)
Trinopus
Consider the following script:
for (;;) {
    print "Ow, you're hurting me!\n";
    sleep 5;
}
Is this program in pain? If not, what is the difference between this “scripted response” and the one you’re hypothesizing about, other than (perhaps) more conditional statements?
If you DO believe this program is in pain, I’d like to understand why you think so.
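If it helps make the comparison concrete, here’s a minimal sketch (in Python, just for convenience) of what the “more conditional statements” version might look like; the thresholds and messages are invented for illustration:

```python
# A hypothetical "more conditional" version of the same script: the
# response now depends on a stimulus value, but it is still just a
# mapping from input to canned output.
def scripted_response(stimulus_intensity):
    """Map a numeric 'pain' stimulus to a canned complaint."""
    if stimulus_intensity <= 0:
        return "I'm fine."
    elif stimulus_intensity < 5:
        return "Ow, that stings."
    else:
        return "Ow, you're hurting me!"

print(scripted_response(7))
```

Whether branching on a stimulus variable is different in kind from the unconditional loop is exactly the question being argued.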
I remember, a few years ago, a story on a guy who created “tortured norns”:
This is actually good science: what this guy has done is test the roots of empathy, and whether true altruism really exists. The people who were angry with him were obviously pissed off because of how he made them feel, not because he caused any actual harm to any actual thing. Nice work, fella.
Well, but at base, we’re just a bunch of “subroutines” too, or at least the organic version. Certain nerves fire, and we call that “pain”. Certain chemicals, adrenaline and such, get produced in response to a stimulus, and we call that “fear”. Why are our “subroutines” real feelings, and the goblin’s subroutines not?
Not that I agree with the “Creatures” people, of course…slap around your goblin all you want…I’m not going to get outraged. (Just don’t come crying to me when you go blind)
Yes, and yes.
The goblin’s coding tells it what to do not what to feel.
The goblin runs away and looks afraid. It does not get scared, and flee in terror.
Importantly, IMO anyway, the goblin has no spontaneous emotional response.
He doesn’t flee in terror when the odds are in his favor.
He doesn’t suffer post traumatic stress disorder after battles.
There are no goblin conscientious objectors.
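The distinction being drawn here — behaviour scripted straight from the situation, versus behaviour driven by an internal state that persists and can misfire — can be sketched roughly. Everything below (the `fear` variable, the thresholds) is invented for illustration:

```python
# Two toy goblins. The first maps situation directly to action; the
# second keeps an internal "fear" level that accumulates across
# encounters and can override the tactically sensible choice.
def scripted_goblin(enemy_strength, own_strength):
    # Behaviour only: flee exactly when outmatched, no inner state.
    return "flee" if enemy_strength > own_strength else "fight"

class StatefulGoblin:
    def __init__(self):
        self.fear = 0.0  # persists between battles, like trauma

    def act(self, enemy_strength, own_strength):
        # Each encounter with a stronger enemy leaves residual fear.
        if enemy_strength > own_strength:
            self.fear += 1.0
        # Accumulated fear can trigger flight even with good odds --
        # "fleeing in terror when the odds are in his favor."
        if self.fear > 2.0:
            return "flee"
        return "flee" if enemy_strength > own_strength else "fight"
```

After three bad battles the stateful goblin runs from a weaker enemy, while the scripted one never would; whether that internal variable counts as “feeling” anything is, of course, the whole argument.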
Doc, to a certain extent, all those things apply to the creatures referenced above. Does it make a difference?
…and there’s my question.
First of all, I am not interested in the motives of the guy torturing his Norns, nor am I interested in the sapience of the creatures, although that may become a later consideration.
The question here is this: is it wrong to torture and abuse simulated lifeforms or not?
We have laws against cruelty to animals. This means that we, as a society, feel that it is wrong to inflict needless suffering. If one MUST have a hamburger, one should raise the cow in a humane fashion, kill it as fast and painlessly as possible, and make use of it in a nonwasteful manner.
Bear-baiting, on the other hand, is a terrible, disgraceful thing, under this particular moral code, and should not be countenanced by civilized people – since it basically involves torturing helpless animals to death for fun.
So, from here, we go into whether or not it’s okay to torture SIMULATED animals to death for fun.
On one level, this is easy enough. Remember Teddy Ruxpin, the talking teddy bear? Simply remove his story cassette, and insert a different cassette, one which has a person screaming in agony. Activate Teddy Ruxpin, and suspend him over a slow fire. There we go, torturing simulated animals to death for fun.
Is this wrong? No. Teddy has less awareness than a flatworm. Teddy is a stuffed animal with a tape recorder stuck up his ass. There is no moral quandary here.
On the far end of the spectrum, we have sapient artificial beings, like, say, Lt. Commander Data from Star Trek, or Andrew Martin, the Bicentennial Man (from the short story and film of that name). Data and Martin were both artificial beings, but capable of feeling and desiring and even understanding and articulating.
At one point in the book, a gang of street kids intercepts Martin on his way to the library, and orders him to disassemble himself. Martin is trapped by the Laws Of Robotics; he MUST obey a direct order from a human, except where such order would involve HARMING a human… and he must take himself apart, despite the fact that he has self-preservation programming… and is sapient enough to realize that he has been ordered to kill himself, a situation very much against his will!
This ultimately leads to a court decision: “It is immoral to deny freedom to anything capable of understanding the concept… and desiring it for itself.”
Well, this would seem to imply that I am a bad person for keeping my idiot cat in the house so she won’t run out and fling herself in front of a car while trying to get it to pet her, but this would seem to also be applicable to Norns… or DK2 Goblins.
It is true that Norns and Goblins are nothing but pixels and subroutines, sure.
But I, Wang-Ka, am nothing more than a heap of meat with a variety of subroutines geared to ensure that I will live a while and make more little Wang-Kas, yes?
Subroutines like “fear,” “pain,” “pleasure,” “joy,” “horniness,” and so on, motivate me and control my reactions. The fact that I am sapient does not alter these subroutines or their effects on my behavior; it simply alters how I rationalize them, and to some extent, how I act on them. The fact remains that if you offer me gold, food, or a big wad of money, I will motivate towards you. If you pull a .45 or sic a gang of Elves or Grendels on me, I will react yet another way.
SO: we have established that Norns and Goblins and Sims are not sapient… but I have seen them act autonomously to meet their own needs and desires. They are, perhaps, less sapient than a cockroach, but they have needs and desires and motivations. They like some stuff, and don’t like other stuff, and their behavior reflects that.
So… um… is it wrong to step on cockroaches? Is it wrong to torture one to death for fun? And is a Sim or Norn more or less than a cockroach?
perhaps your point would be made more clear if you could explain the difference between what someone feels and how he behaves.
why is the conclusion that there is actual pain when a person says “ouch” but not when a computer says “ouch”? is it because we don’t know what causes the person to say “ouch”?
if one takes the position that a pin prick can cause a person to say “ouch”, and that that person feels pain, perhaps the holder of that position could explain why a machine that says “ouch” when it is pricked with a pin does not feel pain.
How cruel, then, to create creatures that can feel pain, and love, and hate, then demand that they love us.
And punish them if they don’t.
If we can’t do it now, then one day it will be possible.
Secondary creation; dontcha just love it.
SF worldbuilding at
http://www.orionsarm.com/main.html
The program is simply a facsimile of a living creature.
Will we eventually be able to create an android that is our perfect mate and that we can fall in love with (i.e., program her to be exactly what we want: intelligent, loves sports, slutty in bed, etc.), and that looks exactly how we want? It becomes scary if anything, and I want no part of it… not to marry, at least… date, maybe… hrmm
Anyways to the OP, I think the only way it is detrimental to us as humans is in how real we perceive these creatures. To be cruel to them without the absolute clarity in your mind that these are not real must have some detrimental effect on your humanity.
For those of you who picked up on it, it’s the same opinion Kant has on the human treatment of animals… I don’t know if, growing up in the society I have, I can agree with him re: animals, but certainly re: sim games. Am I conceding cultural relativism? Yes.
I’m taking a class in AI this semester. Take a class on AI and see it’s all A* searches and hill-climbing algorithms and monkeys climbing ladders, and you will see this is not a topic that will really be relevant for a long time.
Beyond that, there is no reason to believe that an artificial world would be a 1-to-1 translation of our world.
Say you wanted to make a Quake 22 player that had human-quality AI. Making the AI the embodied enemy would produce an unfun game: it would fear pain and run and hide; it would not fight in a way that’s amusing; it would not make crazy 50-foot leaps in the air, attempting to shoot you and land on the tower that has the BFG. It would be much better to make the AI another player, a controller of a game character, not a creature embodied in the game.
Nearly any game would work better with this abstraction. The actions of a sentient creature embodied in the game would not be fun to play against for 30 minutes or an hour; it would have all sorts of daily-life stuff to attend to, or would run away too much, or whatever. Having them as an abstraction of a player, or a controller, would produce a far better game in almost all situations.
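For the curious, the kind of thing that AI coursework actually covers — the hill climbing mentioned above, for instance — is pretty mundane. A minimal sketch (the objective function and step size are invented for illustration):

```python
# Minimal hill climbing: repeatedly move to the best neighbouring
# value of x until no neighbour improves the score. No feelings
# involved -- just a loop and a comparison.
def hill_climb(score, x, step=0.1, max_iters=1000):
    for _ in range(max_iters):
        neighbours = [x - step, x + step]
        best = max(neighbours, key=score)
        if score(best) <= score(x):
            return x  # local maximum reached
        x = best
    return x

# Climbing the parabola -(x - 3)^2 from x = 0 lands near x = 3.
peak = hill_climb(lambda x: -(x - 3) ** 2, 0.0)
```

Nothing in there is any closer to suffering than the Teddy Ruxpin tape deck.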
You’ve forgotten that thing called will. We don’t just react to stimuli in an IF xxxx THEN xxxx fashion; that’s part of being human, and I suppose of being sentient. A really sentient goblin may not run away in front of an overwhelming enemy; perhaps he would charge if a sense of duty struck him. What I’m trying to say is that, IMO, one of our most important characteristics is our chaotic response to stimuli.
Then again a programmer could add that behaviour to a virtual creature…
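And indeed it’s easy enough to bolt on. A hedged sketch — the probability, the “duty” notion, and the function name are all invented for illustration, with randomness as a cheap stand-in for will (which is exactly the philosophical rub):

```python
import random

# A goblin whose response to an overwhelming enemy is not a fixed
# IF/THEN: some fraction of the time a "sense of duty" makes him
# charge anyway.
def goblin_decision(outmatched, duty=0.2, rng=random):
    if not outmatched:
        return "fight"
    # Even when outmatched, duty sometimes overrides self-preservation.
    return "fight" if rng.random() < duty else "flee"
```

Over many encounters an outmatched goblin mostly flees but occasionally charges, which looks “chaotic” from outside; whether a pseudo-random number generator deserves to be called will is another matter.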
Okay, I’ll bite. I never played Creatures myself (I was a Civ II woman in those days), but my baby sister was a big fan. She found the “tortured Norns” site and downloaded some to nurse back to health and happiness. Why? Mostly because she wanted to see if she was good enough at the game to be able to do it. She had already raised plenty of well-behaved Norns herself, and I guess she wanted a challenge. I don’t see why that’s any weirder than wanting to see if you can cause the game characters to manifest deviant behavior in the first place.
“But I just wanted to turn off Skynet”.
“Suspect confessed intent to murder.”
Seriously, though, if we are discussing the morality of Norns and Sims, shouldn’t we extend (logically speaking) the same reasoning to less cuddly types? I don’t hear people complaining about ending Skynet, or turning off HAL (who did seem very human). Nor is there much complaint for the sake of the game characters in shooters like Doom, Quake, Half-Life etc. The designers actually talk about the ‘pain’ or ‘hurt’ these creatures are able to withstand.
I do not find it problematic per se that the game character hurts. What I do find troublesome is the attitude of the player: if you take pleasure in the suffering of others (which seemed to be the case with the Norns), you have a serious moral shortcoming. Notice that in most shooters the set-up is one of self-defence. I do remember a game (Rise of the Triad?) that had the ‘enemy characters’ actually beg for their lives when you tried to finish them off. That made me dislike the game: I found it repulsive having to shoot a character in such a situation.
Thanks Lamia, that makes sense. The context given by hansel implies doing it out of spite.
Unless we are able to program the software equivalent* of pain and suffering (as opposed to merely behaving as such), the spotlight will not be on Teddy Ruxpin, but on said torturer.
For example, Data from Star Trek: First Contact: you can mutilate him all you want, and he might react in pain if he wants, but it is not suffering unless he turns on his pain receptors.
I can see the problems a real thinking and feeling AI would present to the fun of FPS games, but there are plenty of other types of games where having self-aware beings that thought they were really alive would be a great enhancement - like The Sims, or action games with lots of civilian bystanders.