I feel the distinction here might be a matter of degree. If an individual was duplicated and one duplicate was killed ten seconds later, I would feel that the remaining individual was essentially the same as the dead individual. But if an individual was duplicated and one duplicate was killed ten years later, I would feel that the two duplicates would have had enough different experiences in those ten years to have effectively become two distinct individuals.
Which I guess demonstrates my inability to think like a dinosaur.
Forget the science-fiction aspect of the scenario. Suppose someone goes to a doctor for euthanasia (somewhere it is legal). The person signs all the forms stating clearly that he wants to die and is doing this of his own free will. But the procedure goes wrong, and at the end the person is still alive but suffering. The doctor then does something unorthodox (in the context of the normal euthanasia procedure) to kill the patient. Did the doctor commit a crime? I say no: he’s just completing the desired procedure in the best way he can. Similarly, the robot is just finishing up the botched procedure to get the outcome that everyone desired. I don’t see a crime.
But take assisted suicide. Say the doctor gives you drugs that make you fall unconscious and then, half an hour later, die. Should the doctor assume that during that half hour there’s a chance that you might have changed your mind and halt the process, or, barring any clear indication otherwise, should they just presume you want to continue dying? I’d say that unless you actually wake up and say “stop!”, the doctor should continue with the procedure.
I think it’s assisted suicide if the meatbag is still willing to die, and homicide if they change their mind and want to continue living instead. They are two individual people from that point on.
It’s an interesting question what society should do if both of the copies want to keep on living. One option is to say that copies aren’t people, and deserve no rights. I tend to think that a sentient being does deserve individual rights even if they were copied from someone. That still leaves a lot of open questions. Does the copy have any right to property owned by the original? Is the copy bound by any contracts that the original had made? What about relationships with friends and loved ones? Does this change if someone makes a copy of you against your wishes?
First: does the robot-her have legal rights—or, more exactly, legal “personhood”? Or was she at least intended to have rights, if the procedure had gone perfectly according to plan?
Second, and most importantly: was meatbag-her conscious (and presumably requesting death) when she died?
Not a lawyer, but I’m thinking that’d be the most critical point. From the most generous legal perspective, the robot is merely an “identical twin”, albeit to an unusual degree. You’re not legally permitted to kill your own twin, or to presume they’d consent to being euthanized, even if you’re 99% sure that’s exactly what they’d want. And even with their explicit consent, a strict anti-euthanasia/assisted-suicide law would still forbid it.
It depends on the laws of the future society you create. The situation you describe is not homicide, suicide, or murder according to contemporary law.
From a hard sci-fi perspective, getting sufficient information about a human brain to meaningfully emulate it would require freezing the person’s head, shaving/slicing it into molecule-thin layers, and imaging those layers with a scanning electron microscope. The mind isn’t software and the brain isn’t a disk drive.
Whether we call it murder, suicide or something else depends on the answer to a philosophical question that is still very much debated.
So it’s basically up to you.
In terms of imagining the social change for a novel, I think we’d probably have a whole set of new concepts related to moving / copying / splitting / networking minds, etc. One body, one mind rather simplifies things in our reality; a world where brains can be copied or moved opens up countless possibilities. But I think if one physical brain is unambiguously responsible for the unlawful death of another physical brain…we’d still think of that as some flavor of murder.
Freezing and slicing would definitely kill the patient, and probably wouldn’t reproduce the entire mind.
I’d do it a different way: expose the brain and tease all the neurons apart while keeping them alive and active, replacing them gradually with artificial replicas which could then be relocated into an artificial body. I call this mind-expansion; synaptic connections are so slow that you could replace them with much faster connections and expand the brain to several metres in radius, giving your medical technology room to operate comfortably.
If you are really careful you could keep the original neurons and reassemble them into a working brain at the end, and pop the skull back on. The result: two people, one biological, one artificial. I won’t speculate on how the artificial replica functions, or how accurate it is; but the level of medical technology required to do all this is several centuries away, at least, and if you could do all that, there should be no real difficulty restoring the original to health (even if they are somewhat disappointed that they are not the robot).
Of course, extracellular signalling would be replicated too. Here’s a picture I made of the process; the blood supply and cellular environments are carefully moderated, and synaptic connections replicated in great detail.
Don’t think for a minute that I’m recommending this process or claiming it would be desirable; but I do think it is possible in the long run, and may even become a normal part of life in the far future.
But how realistic would the end product be? Would it even be conscious?