Westworld is coming back on HBO

Of course. It’s been a few seasons since I told my wife, “This show is the Walking Dead!”

Sure. 1/10000 of 1% of humanity.

Yet that’s the vast majority of the hosts’ experience with humans. Biased sample? Maybe, maybe not. It is pretty easy to conclude that nearly any human given the chance to be in that position would act similarly, wealthy or not.

I’m tired of all these sci-fi shows that think they’re breaking new ground with the whole “Robotic servants rise up against their human overlords” idea.

There really needs to be a show where the robots plan on overthrowing humanity, but then space aliens show up and the robots gotta team up with the humans to fight the aliens.

They say most guests choose the white hat and have one trip.

Not of the humans they’d ever met while in the park. More like 50%, based on what we’ve seen.

And once they leave the park, it’s not exactly much better.

Still on Team Robot here, same as the last time we had this convo.

You know what I hated the most about Season 3? The Aaron Paul character. For the first half of the season, he was portrayed as just an ordinary guy who had been written off by the system and by society, who found himself in the wrong place at the wrong time and, against all expectations and computer algorithms, actually decided to risk his life to help a woman in need. He was humanity’s redemption, proof that, unlike everything we’d seen in the show so far, human beings were capable of empathy and compassion.

…and then he turned out to be a JAFCO - Just Another Fucking Chosen One. He wasn’t ordinary, he was a very special snowflake, a one-in-a-million outlier of human free will, with suppressed memories and all, and everyone other than him knew this and factored him into their machinations. Worse, it proved that Samaritan II was right - that it can predict human behavior, and that free will doesn’t exist. So why fight it?

I may not be remembering perfectly well, but white hat or not, most guests did not represent humanity admirably.

I vaguely remember someone here pointing out in a previous thread/topic about the show that you wouldn’t think much of people if you judged them by how they behaved in a game like Grand Theft Auto. But we accept antisocial behavior there because we know that the non-player characters are fake and have no real intelligence or feelings, so it’s all just pretend. Presumably in the Westworld universe, those who visit the park have the same understanding of the robots.

It nevertheless does say something about us that behaving in that antisocial way is the desired fantasy.

And from the host POV, claims that “I didn’t know that you actually had feelings, intelligence, sentience…” ring pretty hollow.

Exactly. It is the same problem with Sid in Toy Story. You can’t consider someone to be “bad” if they are just playing creatively with toys, and have no clue that the toys are conscious beings.

If a guest comes across a bunch of hosts playing bandits attacking hosts playing homesteaders, kills the bandits, tips his hat to the grateful farmers, and rides off into the sunset… is he a murderer?
…I mean, he did kill some hosts.
…but they were attacking innocent people.
…but none of the hosts had a choice - they were just following their programming. And tomorrow, the homesteaders could be the bandits and the bandits could be the homesteaders.
…but if they were just following their programming, then they weren’t sentient beings, and there’s nothing wrong with killing them.
It’s complicated.

Well, they were all of the 1%.

But early on they show him choosing the white hat and rescuing Dolores, and apparently that is pretty normal.

Yeah, he is playing the white hat role.

It’s also kind of weird. They discuss in Season 1 how they don’t want the hosts to be too realistic, because they don’t want people thinking they actually killed someone or that their husband is actually having sex with a prostitute. But OTOH, are you really going to pay $40k a day to fuck very realistic sex dolls? You could have sex with a human prostitute for a fraction of that.

Like I said in the Season 3 thread, if they retain that impression after interacting with hosts for more than 5 minutes, then they’re broken.

So the simulated voice that Caleb was talking to on the phone at the start of the season was also sentient? It sounded real to me.

I think the people in this future world are used to non-sentient virtual intelligences perfectly imitating human behavior. It’s a normal part of life. As far as they could tell, the only innovation Delos made was in the peripherals.

So in the human world outside of the park, humans were also, without being aware of it, just following their programming. Therefore they’re not sentient beings and there’s nothing wrong with killing them? Or is it wrong to kill them because they were not responsible, had no choice?

Personally I’m not in favor of killing creatures that I believe are not sentient just for fun. Magnifying glass on ants? Disturbing even though I do not believe an ant is sentient. No problem killing a whole colony to keep ants from invading my cereal boxes, but not just as amusement.

I appreciate that playing the evil villain character can be fun. Maybe more fun than the hero part. And I can see that a host who witnesses that, who sees the fun we humans have causing pain to others, even if the humans did not realize the pain was real, would not think well of humanity.

Hell, I look at the world and am not so impressed.

I don’t know if that machine was sentient either way. I’d have to see its reactions to being unplugged or similar.

I’m not just talking about “sounds”, I’m talking about “behaves” - hosts behave indistinguishably from humans.

I disagree as to the “non-sentient” part; that’s circular reasoning of some sort. And in any case, IMO, if they consistently and perfectly imitate humans, they’re sentient.

And it being normal doesn’t make it right, any more than a Southern slaveholder could argue that they should be absolved from blame for failing to recognize the humanity of their slaves because it was “a normal part of life”.

That’s where we disagree. I’ve seen some bots online that are capable of conducting a very convincing conversation, at least for a while; think of where they’ll be 50 years from now. Human beings are very, very good at building systems to fool other human beings. I can easily imagine us developing a piece of software that acts exactly like a person, despite being nothing more than a complex flowchart with a random number generator.

That’s not to say that computers won’t become self-aware some day, or even that, like on the show, sentience will be an unexpected, emergent property. It’s perfectly possible, and a fascinating possibility. I’m just saying that for a machine designed to act like a human, acting like a human isn’t sufficient evidence of this occurring.

I mean, what about Siri? She sounds sentient to me; if I were a time traveler from the 19th Century, I’d definitely think that an iPhone was a person. The hosts, at least as designed, were nothing more than improved, ambulatory iPhones.

That’s no different than a person.