In a story I’m writing, a young woman wants to be more powerful, so she has her mind transferred into a robot body. However, things go haywire and her mind is only duplicated into the new body, not removed. The human body is badly injured, and, unwilling to let her suffer, robot-her kills the ailing human body. Is this suicide? Homicide?
Why not make your story about authorities trying to make that decision?
Does the human body still have brain function? Can the human and the robot (each of whom effectively has the same personality) have a conversation of any kind?
How is the mind transfer accomplished? Not a brain transplant, right? Is her brain destructively scanned to record her mind, which is then shuttled over? If her brain remains in her fleshy body, does she have two minds working at once, one here, one there? Destroying that body+brain would be homicide, right?
I recall an old SciFi quiz. If one has sex with one’s clone, is it homosexuality, incest, masturbation, or narcissism? Here, is disposing of an old body+brain homicide, suicide, or housekeeping? I’ll go with the last till a space lawyer intervenes. Some future societies (cf. John Varley) may outlaw cloning and multiple consciousnesses, so disposing of the original is only self-preservation. G’bye, old self, and thanks for all the fish.
Is there a freaky three-way before the murder? Or would that just be a two-way? Wait… Uh…
“In the event of accidental duplication, Hyperion recommends killing your double and never speaking of it again.” Good advice(?)
Death by accident. The robot is a dangerous machine.
Hmm… so it’s basically killing your previous biological form after you move on to a different state of being. Maybe Biologicide? Ascension Euthanasia?
(Partial) Self Euthanasia …?
Or First Degree Mercy Killing, with the first degree part referring to the original human as opposed to the “replica”.
The robot and the human are distinct individuals, even if the one is a copy of the other.
Therefore, it’s murder.
Only if anyone finds out. (One individual walked into the robot body shop, and one individual walked out…)
ETA I am positive I have read multiple short stories with this basic premise, as well as others where duplication is (semi)routine
Consider if her human body had protested, had asked to be taken to a hospital, had begged not to be killed. Would that change anything?
If it would, that points to it not being suicide: her human body has agency and some right of self-determination.
As such, the fact that there are identical brain-patterns both in the robot and the human isn’t relevant. It should be considered like any other case where person A finds person B badly injured and kills them.
Edit: The identical brain patterns do add one wrinkle: the robot might claim something like a living will. She knows the human would have wanted death, and can presumably make that case in her defense. What would courts do if I documented my desire to be put out of my suffering under similar circumstances, and then you put me out of my suffering?
Bear in mind that the person in question had *wanted* her human body to die - that was the whole purpose of the procedure. Having her metal body finish the job is essentially assisted suicide.
That’s a fair point: if there’s documentation ahead of time that she (human) wanted to die during the procedure, then “assisted suicide” fits.
But the bio-person could say “I didn’t want to die; I wanted to become a robot! Here I am, still a fleshy, not experiencing a metal body at all! From my point of view, the procedure was a complete failure.”
I think that my position would be that there’s no crime if the transfer process inherently cannot leave the mind in the original host (and by “inherently cannot”, I mean something stronger than just “the super-detailed CAT scan is so high-energy that it cooks the brain”, something more like quantum information). There’s probably no crime if the minds do not have time to diverge before the destruction of one of them. But if there is time for divergence (which would take only a fraction of a second), then destruction of one of the minds is murder.
Or suicide, if the flesh-person wants the flesh-body to be dead at the end of the process.
Playing around with the logical extremes of this, if the human had a prosthetic arm and used it to shoot herself in the head, it would be suicide. Her will controlled her own arm to kill her - its mechanised state is immaterial.
If the human used a remote device to kill herself, it would be suicide. Her will controlled her own creation to kill herself - its mechanised and detached natures are both immaterial.
If the human used a remote device she had previously programmed with a death command to kill her after certain other tasks had been performed, it would still be suicide, even if the mechanism took a while longer than she’d envisaged and she thus took longer to die.
In the OP’s premise, we know the robot’s mind is the same as the human’s, and we know the human intended to move into the robot body and leave her biological body lifeless…so it’s in the ballpark of setting up her own suicide machine.
Still a judgement call, obviously, but it feels like suicide to me.
Maybe “Self-Assisted Suicide”?
As an aside, Love Rhombus, I trust that you follow the webcomic Schlock Mercenary? It’s explored a lot of questions along these lines, and not all of the characters have come to the same metaphysical conclusions about them. Like, for instance, Captain Tagon has multiple times sacrificed himself to save others, and then been brought back by processes related to mind-uploading… and each time, the new Tagon questions whether he would make that same decision, because those other people who made that decision weren’t him. Or an assassin working for a shadowy government agency, whose MO is to upload his personality into assorted deep-cover agents (destroying the agent’s existing personality in the process), who are considered expendable once the mission is done: Every time he starts an upload, he’s terrified, because the way he sees it, each time he has a 50% chance of ending up on a suicide mission, and the fact that there’s someone ELSE who won’t be going into danger is no comfort to him.