I think the word is ‘consciousness’. (Sorry, I don’t normally nitpick but you really threw me for a minute there.)
Why is it wrong to kill a man but not an AI, if there is no shortage, or if you can grow another man to take his place?
Suppose you could erase a human’s memory. Is it morally okay to torture them, and then erase their memory of the torture? Is it hunky-dory to revel in their screams, to wallow in their terror…as long as you clean up after yourself?
And can we infer from the argument that if nobody else knows about your misdeeds when the dust settles, then those deeds are not immoral? That it’s “exactly as if those deeds never happened”? So, if I kill everyone, it’s more moral than if I leave one witness behind?
As **begbert2** said: if that’s so for humans, that’s so for AIs. While it existed, for whatever length of time, it was aware of the harm done to it. It did happen to it, and it was wronged by it.
The important thing is not that they don’t remember it. What matters is that they are replicable. That you can go back to the state they were before the event happened. In effect, that for all practical purposes, the event didn’t happen except in some bizarre parallel universe to which they have no access.
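If it helps to make “replicable” concrete, here’s a minimal Python sketch (toy names and toy state, obviously; a real AI’s state would be vastly larger, but the snapshot-and-restore principle is the same):

```python
import copy

class SimulatedAgent:
    """Toy stand-in for an AI whose entire state is one object."""
    def __init__(self):
        self.memories = []

    def experience(self, event):
        self.memories.append(event)

agent = SimulatedAgent()
agent.experience("an ordinary Tuesday")

# Snapshot the complete state *before* the event.
snapshot = copy.deepcopy(agent)

# The event happens; the running instance's state now reflects it.
agent.experience("the torture")

# The "clean reset": discard the altered instance, resume from the snapshot.
agent = snapshot
print(agent.memories)  # ['an ordinary Tuesday'] -- no trace remains
```

From the restored state’s point of view, the event simply never occurred. That is all I mean by the reset.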
If you erase a person’s memory of an event, there still remains a memory of something lost. I saw it done to my wife when her first childbirth went horribly wrong. She doesn’t remember what happened, but she knows something happened, and she knows there is something she can’t remember. On top of that, of course, she had all the evidence of something having happened: the bruises, the pain, the drug’s hangover. I do believe she is better off not having the memories of what actually happened (I kinda wish I didn’t), but it is not as if, to her, this never happened.
As for our imperfections in running the machine, well. It is our thought exercise; we can assume whatever we all agree to assume. I was thinking the machine ran flawlessly. If it doesn’t, then we don’t have the tools to do a clean reset, and we cannot discuss the ethics of doing it.
My apologies for quoting either of you haphazardly. I hope neither of you takes this as me valuing one’s opinion over the other. It is just that the issues are the same. Let me know if I am leaving anything unaddressed. (and my apologies for using the wrong word before)
Yes, but in the case of the human, his life is being ended. It is not a reversible process. For the AI, he is being returned to exactly where he was before the torture (or whatever) happened. His torture and death are being effectively undone.
How about this: pain changes us. Torturing the old man would, if he survived, make him a different person. I suggest that that is what is happening to the AI; in essence, the tortured AI is being killed and replaced by an AI matching the way it used to be but no longer is. It is not a 100% accurate process, not because the saved copy is different, but because the current one is.
Think of it like me killing you and replacing you with your one-year younger self. Have I just “undone” the memory of torture? Or have I essentially killed the person known as “you” and brought back someone who used to be “you”, but likely will no longer be “you” exactly?
Some of the forces involved are quantum mechanical and random, some aren’t. As for the simulation itself, whether true randomness is introduced would depend on the hardware.
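To make the hardware point concrete, a quick sketch in Python (just an illustration of the principle, not any particular simulator): if the simulation draws its “randomness” from a seeded pseudo-random generator, every run is replayable bit for bit; if it reads hardware entropy, it isn’t.

```python
import os
import random

# Seeded PRNG: fully deterministic, so the simulation can be replayed exactly.
rng = random.Random(42)
run1 = [rng.random() for _ in range(3)]

rng = random.Random(42)       # same seed, same "random" history
run2 = [rng.random() for _ in range(3)]
assert run1 == run2           # identical runs: a perfect reset is possible

# Hardware entropy: each read is fresh and unrepeatable.
sample1 = os.urandom(8)
sample2 = os.urandom(8)
# sample1 != sample2 (with overwhelming probability) -- no replay possible
```

So whether a “perfect reset” is even possible depends on which of these the hardware gives you.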
No. First, order is imposed by whatever rules govern how the building blocks hook up; it’s not random. Second, a universe/simulation with randomness in it should be more stable, not less; the structures that exist there have to be more stable to survive.
8 months, at our present level of knowledge, should probably be given at least some benefit of the doubt. It doesn’t matter much, since 8-month fetuses are only aborted under extreme circumstances, and are typically non-viable or brainless anyway. As for chimps, I’d say that they do deserve protection; more than, say, a squirrel.
So if someone tortures you and gives you a drug that induces amnesia about the torture, that’s moral? If you undergo an operation without painkillers and the doctor wipes your memory afterwards, that makes the agony OK?
No, it happened, it still suffered. You are just making retaliation harder, not erasing the evil.
I’ve heard that unethical surgeons have tried to cover up bad anesthesia by giving their victims a drug that blots out recent memories (sometimes). Is that ethical? Is it only ethical when the drug works? How does that erase the pain?
The big difference is that my younger self is being introduced in my present time. It is not a perfect restoration of my life as I would have lost that year.
The AI is getting reset with the whole universe around it. It is returning unchanged to an unchanged environment. Even the restoration itself is completely invisible to him. For all practical purposes, it just never happened. Not for him, not for the world around him.
Imagine then that instead of a year younger, we just go with a clone “saved” at the point just before the torture. Instead of the torture, you get a nice cold drink of your choice.
I still suggest that this current you (non-tortured) is a considerably different being from tortured-you, and thus what has occurred is the murder of tortured-you. Tortured-you has as much right to exist as cold-drink-you does.
Most of your concerns I have addressed in posts later than the one you quoted. Let me know if there is anything I missed in my replies.
The short of it is that you are not just erasing the memory. You are returning him back to where he was before the event happened. In effect, it never happened. It would be more akin to you watching an alternate universe where it happened, while he lives happily, never having been wronged.
As for memory-wiping drugs, that would make for a great GQ thread. I can only offer the very limited experience that I recounted earlier. In short, my wife had to undergo some emergency procedures without the benefit of anesthesia and had her memory of the event wiped. Nothing unethical from the doctor’s side. She was just saving my wife’s life as best she could.
No. First, the fact is the torture happened, whether it’s remembered or not. Second, what you are doing is splitting the AI into two copies, and freezing one while you let the other experience agony. Then you kill the second version and clean up the evidence before activating the frozen version.
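In code terms (a toy sketch, not anyone’s actual architecture), the split looks like this; note that from the instant of the fork there are two distinct objects, not one:

```python
import copy

class Mind:
    def __init__(self):
        self.history = ["ordinary life"]

running = Mind()
frozen = copy.deepcopy(running)   # the "saved" copy, kept on ice

running.history.append("agony")   # only the running copy suffers

print(running is frozen)   # False: two separate minds from the split onward
print(running.history)     # ['ordinary life', 'agony']
print(frozen.history)      # ['ordinary life']

# Deleting `running` and thawing `frozen` hides the evidence;
# it does not make the first object's history un-happen.
```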
Consider this similar scenario; I somehow create a perfect copy of you and torture it, then murder it. Then I clean up the evidence and you never know the copy even existed. Does that mean it didn’t happen ? Does that make what I did moral ?
I am currently confused, with a general tendency to agree with you; I know there is something that doesn’t fit, but I can’t put my finger on it.
I am trying not to dump a stream of thought on you, so I will have to let this be for a little while so I can gather my thoughts. I reserve the right to retract the following:
For now, let me just say that we are running into the issue of digital copies being originals in their own right but without inherent value.
Also that with an AI, you can manipulate the whole environment of the AI, which I am not seeing as possible with a person.
Whether or not fetuses are routinely aborted at 8 months gestation is beside the point for this question…the entire discussion is hypothetical, after all. I am in no way arguing for or against the legalization of abortion. My question to you is whether or not it is morally acceptable to abort an 8-month gestation fetus (for whatever reason), based on the same logic as your defense of the AI? And as far as a chimp deserving more protection than a squirrel, how much more? As much as a human infant? A very strong case could be made that your average chimp is as cognitively capable as a young infant, and certainly that it is more self-aware (to use your criteria for why AI should be protected from harm).
Again, I feel that my keystrokes are coming faster than my thoughts, so I will leave this until tomorrow when I have a fresher mind.
Until then, my thought is that a perfectly replicable AI in a perfectly replicable universe is somehow disposable. I do understand what you are saying, that you are still torturing an actual consciousness, no matter how many identical copies are out there. This does not affect the other copies, though, and they are potentially infinite.
I do think (and agree with everyone, I believe) that this torture does make the torturer a bad person, even if it is somehow proved to be insignificant to the “victim”. I find it very similar to torturing ants. I don’t even think of ants as individuals but as part of a superorganism, but I am not comfortable in the presence of a person who makes a habit of torturing them.
What I am not sure of is, in a world where infinite identical copies of a consciousness are possible, what the value of those copies is.
Just a thought, but unless you consider consciousness to be infinite (and I should admit that I do, but I know that most people in here don’t), how is there a distinction between the suffering of a conscious entity that will be immediately forgotten and the suffering of a conscious entity that will be… ultimately forgotten?
Well, no, of course not. There’s a Perl code of morality, a Java code of morality, and a C++ code of morality, to name but three.
Ok. Back to the topic at hand. I think that any attempt to draw a parallel to this in our “real” world is bound to fail, because to make it work you need to twist reality so much that you push it into being more like a simulation. That is the opposite of what this exercise is about.
The point that is growing on me, though, is that of the AI being in actual suffering while the torture is happening. Although he can be recreated and returned to the point before the torture, he is not supposed to know that that is the case. This instance of the AI is actually being tortured and dying. That we can later create another instance that is not affected at all by this torture makes things OK for that other instance, but not for the tortured one.
The issue I have is whether these two instances are really separate individuals or are somehow the same.
Does an AI “feel” an unpleasant sensation, or does it just have sensors indicating damaged components that need repair? If I program a robot to recognize that fire will cause damage and thus avoid it, is it cruel to chase it around the room with a blowtorch? At what level of AI does it become cruel?
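For what the “just sensors” end of that spectrum looks like, here’s a toy sketch (all names invented for illustration): a pure reflex agent that detects heat and retreats, with no state anywhere that could plausibly count as “feeling”:

```python
def heat_sensor(distance_to_fire: float) -> bool:
    """Trips when the blowtorch is dangerously close."""
    return distance_to_fire < 1.0

def step(position: float, fire_position: float) -> float:
    """Retreat one unit from the fire if the sensor trips; otherwise stay put."""
    if heat_sensor(abs(position - fire_position)):
        away = 1.0 if position >= fire_position else -1.0
        return position + away
    return position

# "Chasing it around the room": the robot retreats, and that is all it does.
pos = 0.0
for fire in [0.5, 1.2, 1.9]:
    pos = step(pos, fire)
print(pos)  # -1.0: it has backed away; nothing in this program suffers
```

Somewhere along the line from this to a system that models and dreads its own damage is where the cruelty question starts to bite.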