That’s covered by the “don’t look too close” clause of all science fiction.
Gosh, I’m glad you found that. I’m sure the millions of Native Americans enslaved, slaughtered, and forcibly converted in the 16th century have now been retroactively freed, resurrected, and paganized.
Enslaved, slaughtered, and forcibly converted by Officially Bad Guys. Certainly you will get no argument from me about the levels of hypocrisy and corruption of the Church or of self-professed Christians. They even say somewhere that murder is bad, and yet…
The only, admittedly obvious, point is that slavers and conquistadors needed to engage in some fundamental doublethink in order to justify their actions, since the victims not being Christian was no official or legitimate excuse for any kind of reprehensible treatment; they were real guys and had souls and everything, it even says so. Unofficially? I doubt that if each and every Native American had immediately and unequivocally declared allegiance to the Pope it would have made the least bit of difference.
In the context of this thread, for “non-Christian” one may read “robot”.
I would say the real issue is not a moral one but a practical one.
If you’re regularly indulging fantasies in a hyper-real VR, it’s possible you might get confused and forget for a moment which reality you’re in, the one where actions have consequences.
Without some design or regulation around that, I could see accidental killings becoming a thing.
My favorite example from fiction is from Astro Boy. For those who don’t remember their Saturday morning weirdly translated Japanese cartoons, a scientist creates a superpowered robot duplicate of his son who got killed. Somehow the brilliant scientist forgot that robots, unlike boys, don’t age. When he finally realizes this - after ten years - he tells Astro Boy: “You’re not a human child. You’re nothing but a machine, like a refrigerator or a dishwasher.” Then he sells him to a circus. Of robots.
I largely enjoyed the first season because I felt the moral conflict between the treatment of sentient versus non-sentient beings was the entire point of the show. If the hosts are merely robots without feeling or sentience, then it is easy to accept their treatment. And if they are alive and sentient, then it is easy to accept that what people are doing to them is wrong. To me, the point of Westworld is touching on the edge where in some ways they are sentient (feeling pain, passion, love) and in other ways they are not (dialog that was written for them). The blurred line is what makes it interesting. Or at least did.
They should do a Purge in World of Warcraft. One day when you can PvP at will, anywhere, even within your own faction. And you can’t come back to your character for a day.
It would be a fucking slaughterhouse.
We all have dark impulses, things we may have some desire to do which might cause harm to another, but which we do not do.
Sometimes the constraint is law or rules and some fear of consequences. Remove those and many would act on a fantasy in a fantasy world. No harm done. And the more different the one we act upon is from us, the easier it gets. It’s easier to kill an orc than a character who looks just like you.
But the more the other, the one we act upon, is like us, the more we can imagine ourselves in their place, and the less comfortable we are with the action. Empathy, emerging from having a theory of mind, constrains us.
If we are sure there is no mind there experiencing it as we would, we feel little constraint. The more sure we are of a mind like ours, the more we put the brakes on acting out those dark impulses.
Yes, throughout history people have found plenty of ways to convince themselves that the other is not a mind like theirs, to justify their actions and deny empathy.
AI that LOOKS exactly like us will blur the line more and sooner than AI that looks like a toaster, for the same level of sentience.