If you can't remember it, then it didn't matter?

In a thread discussed here some time ago on AI rights and the ability to ‘back up’ a person’s mind (which I unfortunately can’t locate), an interesting point was raised concerning the morality of wiping a sentient being’s memory.

The question was: is it morally wrong to commit harmful acts against a sentient being if that being’s memory of those events can be completely wiped afterwards? The scenarios discussed included torturing and raping a sentient holographic construct in a Star Trek-style holodeck and then wiping that entity’s memory of events, or using a sentient AI construct in horrific experiments and then wiping its memory of them.

For the purposes of this discussion, let’s assume that such an AI construct is every bit as self-aware and capable of feeling and understanding as a human, or that a technique has been developed to completely and effectively wipe specific parts of a person’s memory.

The general consensus was that such acts were morally wrong, but it seemed to be agreed that the wrong lay in the ‘negative karma’ (for want of a better word) incurred by the person doing the act; if the subject’s memory was wiped, there was no real problem on the subject’s side, since they couldn’t remember what had occurred, so it may as well not have happened to them.

I’ve been mulling this discussion over, and that conclusion has never really seemed right to me. In my opinion, although the memory may be wiped later, a sentient being is still being abused and experiencing hurt at the time the abuse occurs, and it does them no good at all that their memory of the incident will be erased at a later date.

Thoughts?

A harmful act is a harmful act regardless of who remembers it. It remains immoral even if no one remembers it.

It’s quirky, innit? For instance, why are we polite to prisoners who are being executed? We give them a last meal, a cigarette, a hood, maybe a glass of brandy. Then we kill them. What’s the point? We could kick them in the nads, spit on them, and cuss them out. The end result is exactly the same.

If there were a perfect memory-erasing drug – a Nepenthe – that let you erase all recollection of the last few days, would “torture” mean anything? (The kind that doesn’t leave lasting physical damage: suffocation torture, say.) You not only can’t prove it happened; you don’t even know it happened at all.

It begins to approach that power that Aristotle said was denied even to God: the power to change the past.

This is one of the reasons I argue that it is our actions and memories that define morality. Not so much that “torture is immoral,” but that “it would be immoral for me to torture someone.” The suffering of the victim is the face of the immorality; his pain is the basis of it. But it is my wrong to become a torturer.

Jack Vance was speaking of history, not morality, when he said, “It is a composite, a mosaic of a trillion pieces, the account of each man’s accommodation with his conscience.” But I think it is an appropriate definition.

The wiping of the memory is a separate event from the harmful act. It does not negate the act. In fact I would submit (and I guess this ties in with what Trinopus said) that the memory wipe is itself a horribly harmful act against a sentient being.

With, of course, the obvious exception of voluntarily having one’s memory wiped of horrible events one might well prefer never to remember. Nepenthe sometimes comes in a bottle…

But, yes, most certainly, if I do horrible things to someone, and then wipe the memory from them so they don’t remember it at all, that’s one more sin accruing to me. “The cover-up is worse than the crime,” at least in some cases.

Larry Niven once said that time-travel is the answer to the childish prayer, “Make it didn’t happen.” If one went back in time and prevented WWII from starting at all – are the Nazis still guilty of genocide? In an abstract way, yes, because it’s what they are. In the immediate practical case, no, they’re just a goofy bunch of guys in a boyish marching society. But deep down, we know better.

We can only have this discussion at the meta-level, of knowing the things that no one else knows. The victim may have forgotten that he was tortured, but we know it happened, and that’s the stance from which we make our judgements. It’s the only stance that is privileged enough to make such judgements.

What if you woke up tomorrow and suddenly remembered a bunch of horrible things that, until now, had been blocked from your memory?

Okay, what if you woke up tomorrow and didn’t?

In the long run, we are all dead and the Earth will be swallowed by the Sun.

The retrospective “result” has nothing to do with it. Things that happen are real, and either it all matters, or none of it does, even when it is happening.

Doing so without consent is a serious violation of a person; mind rape.

As an aside, I recall reading some years ago that this is actually attempted on occasion in the real world, by surgeons who think their patient may have been conscious during surgery; they inject the patient with some drug (I don’t recall the name) that has a roughly 50/50 chance of inducing short-term memory loss, so the patient won’t recall the agony, and of course won’t know to sue.

The claim “if you don’t remember it, it doesn’t matter” assumes too much about the human mind. First of all, you have to define “memory”: are we talking about the ability to access the sensory and introspective experience of the event, or the holistic effect of the experience on the brain? Either way you have a problem. If it’s the former, there’s a good chance the experience, even if you erased the memory, has already had effects on that person’s behavior or world view. They may have a neurosis you now can’t diagnose because they don’t remember the trigger event.

If you delete all the side effects? Well, good luck with that. My wager is that any side effect from an experience is going to butterfly effect out of control so fast you’d end up doing considerable, completely unethical brain damage just to erase that memory – in which case now you’ve probably eclipsed in horror whatever it is they experienced.

@Der Trihs – They’re called “Twilight Drugs” though I’ve never heard of that particular nefarious use of them. I’ve heard of them being used (with patient consent) for some operations that are painful, but not worthy of the full anesthesia treatment. I’ve also heard unconfirmed conspiracy theories about them being used INSTEAD of anesthesia (one wonders why), but I’ve never heard the scenario you presented.

The AI case is not the same as the case with a human. We don’t know how to totally erase the memories of a human being in such a way that no remnants are left. Assuming an AI based on something like current computer technology, we do know how to wipe the board clean. Consider the case of an AI that commits murder. Should we destroy the physical mechanism, or could we just reprogram it? What if we were able to reprogram humans? Would we then destroy a murderer, or reprogram her so she was no longer a murderer?

I disagree, TriPolar, that the AI case is different. To me, editing (without consent) the memories of a sentient being, whether human, AI, or some entity as yet undiscovered or undefined, would be a harmful act.

As for murder, I’m against the death penalty no matter what sort of (sentient)* being the murderer is. In the case of an AI I would view reprogramming as equivalent to the death penalty. I would also, obviously, view reprogramming humans as equivalent to the death penalty.

*Murder is different from killing; any being can kill, but to be a murderer requires sentience.

So you would put the murdering machine in prison for life? And what is life for a machine? Forever?

That may be a whole other thread, as the OP was discussing the victim rather than the perpetrator of a harmful act.

I will say, though, that I could probably bring myself to agree that if a being gave its consent to having its memories altered, to be reprogrammed, that would be more acceptable than altering them without consent. Otherwise, yes, prison for life (or until it reprogrammed itself away from the predilection for harm) for the murdering machine. What would that prison be? Perhaps access only to a very limited network, and with no access to any sort of manufacturing capabilities that would allow it to circumvent the limitations?

Why are you coddling murdering machines?

I don’t consider machines to be the equivalent of humans. Turn them on and off or reprogram them at will; it doesn’t matter to me. They are just machines. Or define ‘sentience’ much better. The first such machines may pass a Turing test, but they will certainly be different from humans, and I doubt many people will give machines equal treatment when it comes down to a question of the survival of either a human or a machine. If they do, then the sentient computers had better have that same strong moral sense, or the weaker human species will soon be extinct.

OP, is this the thread?

I agree the definition of sentience is a problem. For the purposes of this thread I’ve been assuming that sentience is equivalent to what we think of as human consciousness: some combination of awareness, intelligence, emotion, and desire which, as one result of their interaction, evinces a moral sense.

I don’t think an AI would necessarily be sentient in the above sense. And if it were intelligent but not sentient, I would have no more problem in reprogramming it than I would have in putting down a rabid dog.

Thanks for linking to that other thread; looks like I’ll have some interesting reading this afternoon.

I’d agree with you on this. A computer very strongly highlights the “mind/body” dichotomy. For humans, the mind is much more integral to the body, with the brain being the body-part most involved. Memories are physical, and much harder to erase. Also, memories are largely holographic – spread out over wide chunks of brain – whereas modern data storage is much more easily reducible.

I can trivially remove one person from a database, but it would be all but impossible to remove only one person from your memories of all the people you know. Little tentacles of memory spread out widely into many other memories.
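
To make that contrast concrete, here’s a minimal sketch (Python, with entirely made-up names and records; it isn’t meant to model real memory, just the bookkeeping difference): deleting a keyed record is one local operation, but scrubbing every trace of that person means walking everything that refers to them.

```python
# Toy illustration only; the names and structure are invented for the example.
people = {
    "alice": {"met_at": "work"},
    "bob":   {"friends": ["alice", "carol"], "memories": ["lunch with alice"]},
    "carol": {"friends": ["bob"], "memories": ["alice's wedding"]},
}

# Trivial: remove the record itself -- one local operation.
del people["alice"]

# Much harder: remove every *reference* to alice, which means visiting every
# other record and knowing every way she might be encoded in it.
for record in people.values():
    record["friends"] = [f for f in record.get("friends", []) if f != "alice"]
    record["memories"] = [m for m in record.get("memories", []) if "alice" not in m]
```

And that’s the easy, well-structured case; the “tentacles” in a human memory aren’t even labeled.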

As you imply, the notion of putting the physical machine in prison, in the case of an AI murderer, is inane. With computers, simply ceasing to run the CPU is equivalent to putting the personality into a dreamless sleep.

The old Doc Savage stories explored the idea of “criminal surgery.” The parts of the brain that promoted criminality were said to have been found, and could be excised. (Marvel Comics “Squadron Supreme” also went there, in fascinating ways.) SF writer Damon Knight also looked at the idea, in his “Analogues” series of stories.

Knight saw the potential for civil rights violations: “Hey, as long as we’re making this guy law-abiding…let’s also make him liberal.” (Or conservative.) The temptation for corruption would be great.

I have a block-time view of the universe, so even if the event was washed from the memory it still occurred, and was an evil act. I think in the long term it is hard to hold any view of morality that would accept that erasing memory of an event erases guilt. After all, in 6 billion years the sun will expand, erasing any evidence of any babies I may or may not have eaten, and yet people get all upset about it. :frowning:

ETA: Peremensoe beat me to it

You’re assuming too much about AI. You might be able to pull this off with a specific type of GOFAI system (ignoring quibbles about whether the problems GOFAI faces in making an intelligent machine have tractable solutions). But if you have a connectionist or Bayesian approach – ESPECIALLY if it has self-modifying code? If it’s an emergent intelligence? It could well be impossible to do anything other than destroy it and make a new one. You’re simply not going to be able to isolate trigger memories or anticipate all possible side effects.
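
To illustrate what I mean, here’s a minimal sketch (Python with numpy, a toy one-layer model invented for the example; it’s not drawn from any real AI system): in a connectionist model, a single “experience” nudges every weight, so there is no one address you could later excise to undo it.

```python
# Toy illustration only: one training example shifts every weight of a tiny
# linear model, so the "memory" of that example has no single location.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))            # weights of a toy one-layer network

def update(W, x, target, lr=0.1):
    """One gradient step on squared error for a single example."""
    y = W @ x
    grad = np.outer(y - target, x)      # dL/dW for L = 0.5 * ||W @ x - target||^2
    return W - lr * grad

x = np.array([1.0, 0.5, -0.2])          # one "experience"
target = np.array([0.0, 1.0, 0.0, 0.0])

W_after = update(W, x, target)
print(np.count_nonzero(W_after - W))    # typically all 12 weights have changed
```

Undoing that one experience means reversing a change smeared across everything the system has learned, which is exactly the side-effect problem.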

Someone has to do the erasing. Won’t those people know what happened? But even if not, I can still think of a reason it would be wrong. By erasing a memory, even a bad one, you are robbing a being of the ability to learn from a particular portion of their past. You are also robbing them of time. If you hadn’t been torturing them, they could have been reading a book or studying, etc. And taking away someone’s time is the worst form of torture (well, it was in *The Princess Bride*, anyway).