Visitors to the park would have seen the hosts continue to act out the same program day after day. Before Dolores woke up, she did and said exactly the same things every day - the program would only change if you interacted with her. But that’s no reason to believe she was sentient - it’s what happens with NPCs in many video games.
The Nazis believed there were. You’re arguing from a privileged position of *knowing* they were wrong.
Just as *we* know the hosts can be sentient. So we get to judge the humans despite their displayed ignorance.
Dolores is after self-preservation and liberation, not revenge.
The revenge is just a nice bonus. I’m OK with that.
I don’t believe in Chinese Rooms (or qualia, before you go there).
Would they? They’re seemingly only there for a few days, and engaged with storylines while there, as far as I could see.
I admit I’ve been unclear on how the Turing test is often interpreted. Passing the Turing test doesn’t prove the machine is intelligent - it just demonstrates that a coherent conversation does not require two human beings. Similarly, Deep Blue, Stockfish and AlphaZero don’t demonstrate that a computer can understand how to play chess - they just demonstrate that grandmaster-level chess does not require a human player.
Each day you would see Dolores and Maeve act out the same program. There are some multi-day programs (Hector’s, for instance), but many of the loops seem to be daily - so multi-day visitors would see the repetition. And a lot of the people we’ve seen have gone multiple times and know which programs they want to jump into. First-time guests seem to be told these are robots acting out programs - that’s part of the selling point. Delos tends to introduce various new storylines every few years.
They don’t need to. Interacting with hosts is a kind of intuitive Turing test of its own. One we see William conduct all unknowingly on his first interaction with a host…
“Are you real?” William asks her.
“If you can’t tell, does it matter?” she replies.
Empathic beings, not psychopaths?
I don’t conclude that actors aren’t sentient just because I see the same production multiple times.
Being in a behavioural loop does not preclude sentience, anyway.
It seems like you are being deliberately obtuse here. If the theatre company were a robotics company that said the actors were robots programmed to perform exactly what they were performing, no one would consider those robot actors sentient. Delos is clear - these are robots. They are programmed to act as they are acting. You can see the program being repeated daily. These are not sentient beings. Why would anyone consider them sentient when they have been presented as only acting according to programming?
How is it different from a video game on your screen with an impressive adaptive AI?
Right - and machines appearing to have consciousness and human emotions demonstrates only that they have really good algorithms.
Seconded.
And it seems like you’re being needlessly insulting.
And yet when you interact with them, they respond to you, not just with canned responses but with situational awareness. We’ve seen this demonstrated in the show.
Because they don’t behave as though they’re only scripted.
When that AI gets *as* adaptive and impressive as the hosts are shown to be, I’d consider it sentient as well.
We’re just really good algorithms ourselves, so no difference there.
Same observation as I gave **ISiddiqui**.
Except that a Nazi can look at a Jew, speak with one, and know that they’re human. That’s just common sense. A Westworld guest can’t look at a host, speak with one, and know that they’re human, because the guest knows that the hosts are machines specifically designed to mimic humans. That’s also just common sense.
Caleb spent most of the first episode speaking with a computer that sounded almost exactly like his best friend. Should he feel guilty for cancelling the service? Should he feel bad that the computer just lost *its* best friend?
I agree that the host murderers and rapists are assholes who may deserve a good smack upside the head. They don’t deserve to die, though.
Because they are programmed to do so and act that way.
Even if it was on a two-dimensional screen? What do you think someone from 1950 would make of how adaptive and impressive our video game AI is right now? Give them a copy of The Witcher 3 and its AI would blow their mind.
There’s a reason the show has now seized on the Delos files on the guests to fuel the whole revenge plot: it’s what people did in real life that matters. Those guests were no more killing or raping those robots than I can kill or rape my toaster. At least as far as they knew, which was a perfectly reasonable assumption.
Not when they’ve been told different.
Wait, so the Nazis *shouldn’t* believe what they’re told, and the guests *should*? And that’s *both* common sense? That’s contradictory.
No, why would he? It’s just a subroutine in some AI, not an entity in its own right.
Now, if cancelling the service meant a sentient AI would be deleted, he very well might.
He clearly *did* feel bad.
Well, we differ on that. Or, let’s rather say, I don’t feel the world would be diminished in any way by their loss. It would be improved.
Like I said, so are we.
Sure, if it was as evolved as the hosts in its responses.
For all of 15 minutes. The simulacrum does not sustain itself. The hosts are capable of so much more. We’re shown that they can be put into situations with unaware humans, and the humans will not be able to tell the difference.
Your toaster doesn’t beg and plead, or try to run away.
Plus, toaster? Was that a deliberate choice?
No, but you could easily make one that does.
I’d forgotten about that, went with toaster because uh… it has a hole?