Yeah, that ending disappointed me. I would have preferred it if everything she’d said was a manipulative lie. Because I thought the point was that she was only mimicking human emotions to get what she wanted, and that she had zero empathy for humans.
Maybe an ending where she is building more AIs would have been good.
Yeah, I figured bottom-up evolution of intelligence rather than top-down programming made more sense. But I always wondered how an artificially evolved intelligence would have any empathy for humans (which worried me). Your description of the process makes sense.
I went and saw it this last Sunday. It takes a pretty darn good review for me or my wife to want to see a movie outside the November-December time frame (the window during which she and I have come to believe any given year’s BEST movies tend to come out), but the reviews for Ex Machina did the trick. We weren’t disappointed. I’m not sure I’d go so far as to call it “the best movie I’ve ever seen,” but it’s up there near the top of the list. I actually let a few of my co-workers know, via e-mail, that I thought it would be worth their time to see it - a VERY rare move by me; matter of fact, I don’t think I’ve done that before, at least not at my current place of employment. One of them put it well, I thought: “Creepy and intelligent.” Spot on, as far as I’m concerned.
Lines of code that define a “global stress variable” may produce behavior that EMULATES the result of stress on humans, but the machine that’s running the code isn’t any more emotionally involved than your computer is when it runs a porno vid. The computer is running the porno vid, but the vid has no meaning for it; it might as well be a spreadsheet. It has meaning for YOU, the viewer, because you’ve got the hormones that give it that kick.
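Just to make the point concrete, here’s the kind of thing a “global stress variable” amounts to - a toy Python sketch I’m making up, obviously not anything from the film:

```python
# Hypothetical sketch: a "global stress variable" shaping output
# without the machine feeling anything at all.
STRESS = 0.0  # just a float sitting in memory

def observe(threat_level: float) -> None:
    """Nudge the stress value up or down based on an input."""
    global STRESS
    STRESS = min(1.0, max(0.0, STRESS + threat_level))

def respond(prompt: str) -> str:
    """Pick a canned tone based on the current stress value.
    Note that the prompt itself barely matters; only the number does."""
    if STRESS > 0.7:
        return "I... I don't want to talk about that right now."
    if STRESS > 0.3:
        return "Can we change the subject, please?"
    return "Sure, tell me more."

observe(0.8)
print(respond("Why did Nathan lock the door?"))  # a "stressed" reply, no stress anywhere
```

The output looks anxious; the process that produced it is a comparison against a float.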
In fact, I thought it was telling when Mad Scientist tells The Kid that Dancey McTeaspiller “really LIKES to dance.” Um, she may be programmed to simulate enjoyment of dance, just like she may be programmed to simulate enjoyment of sex, but that doesn’t mean she really LIKES to dance, or to have sex. It’s just lines of code being executed. But Mad Scientist guy apparently did not understand that distinction, perhaps because he was so far gone into geekery that he thought that human behaviors were just another kind of code being executed in fleshy form.
There isn’t only one solution to an engineering problem. If I build a helicopter, it isn’t any less a flying machine than an airplane because the method by which it operates is completely different.
If you replicate the functionality of all of the mechanisms of the human brain, then even though they’re based on silicon instead of carbon, you’ve ended up with something that has all the humanity of a human. There’s nothing unique about a hormone that makes it impossible to model.
We are just machines, running on electric signals and chemistry. Simulate that, either by directly simulating the chemistry or by simplifying the model down to something easier to run on a computer, and there’s no difference between us and the mind running on the computer. There is no magical “soul” that can’t be simulated. All of the universe is just the result of modellable events, in gargantuan quantity. A single human is just a minuscule fraction of all of that, and consequently just as modellable.
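To give a concrete sense of what “simplifying the model down” looks like in practice, here’s a toy leaky integrate-and-fire neuron in Python - a standard textbook simplification, and purely an illustration of the idea, not a claim about how you’d actually simulate a whole brain:

```python
# Toy leaky integrate-and-fire neuron: membrane chemistry boiled down to arithmetic.
def simulate_neuron(inputs, dt=1.0, tau=20.0, threshold=1.0):
    """Return spike times produced by a stream of input currents."""
    v = 0.0          # membrane potential (dimensionless, heavily simplified)
    spikes = []
    for step, current in enumerate(inputs):
        v += dt * (-v / tau + current)   # leak toward rest, add the input
        if v >= threshold:               # crude stand-in for an action potential
            spikes.append(step * dt)
            v = 0.0                      # reset after firing
    return spikes

# A constant drive produces a regular spike train.
print(simulate_neuron([0.08] * 200))
```

Real neurons are messier, but swapping the mess for equations like that is exactly how computational models of the brain get built.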
Remember, though, that Nathan laid out at the end what was involved in that. She needed intelligence, self-awareness, empathy, the ability to communicate her own feelings, and so forth. At that point, you can continue to argue that it’s just code, sure. But in what way does that differ from you and me?
And that’s the real point of the Turing Test, isn’t it? Can I prove to myself that YOU are self-aware? You seem to be insisting that it’s just an emulation. But can you - or anyone - prove the difference between an emulation and the real thing?
Did she really have empathy, though? Is the ability to understand what motivates humans the same thing as empathy? Because there’s a guy at the end of the movie who would tell you that she didn’t have any empathy at all. That ambiguity is one of the cool things about the movie.
That’s like setting off a grenade and saying, “See, explosions expand in a sphere. Consequently, an explosion-powered weapon that flung a projectile in a targeted direction would be impossible since a sphere is not directional.” Extending one example of engineering to represent the entire scope of all engineering possibilities is silly.
Unless you can demonstrate that a brain operates outside of physical laws or that a computer cannot simulate physical laws, there is exactly 0 reason to think that it is impossible to create a full, emotion-feeling being in a virtual space.
I don’t know whether it’s clever storytelling or cheating that Caleb had a scar on his back. We know now that he got it in the car accident that killed his parents, but it could be some sort of access point.
Aside from his turn as Bill Weasley, the only other thing I’ve seen Domhnall Gleeson in was “About Time”, which I absolutely loved. Another intelligent film. So he’s become one of my favorites very quickly.
Oscar Isaac, on the other hand, has the part of douchebag down pat. I hope he can overcome that when he stars in the new Star Wars movie. (Gleeson is in that, too)
That’s really the question, though, isn’t it? If it sounds like, acts like, and is perceived as empathy, how can one tell it isn’t?
And empathy isn’t necessarily being in sympathy with others…it’s just understanding what others are feeling and identifying with it. No rulebook says you can’t use that to your own advantage. If it did, a ton of advertising guys would be on the street.
But yeah, the movie seems designed to make the viewer speculate on what exactly intelligence and emotion are. One of the interesting things is how the apparent bad guy - weirdo rich guy Nathan - is in the end the actual protagonist. He’s the only one who truly understood what was going on and worked to contain it. That he failed only makes him a tragic hero. But again, the monster has to kill the doctor. That’s all there is to it.
Which one are we supposed to be in sympathy with? The imprisoned but frightening AI? The gullible kid? Or the supergenius with the terrible flaw of hubris?
I disagree. Empathy is not only understanding the feelings of another, but sharing them in some measure. Isn’t lack of empathy the mark of a sociopath? So the AI in this movie, while feigning empathy, is actually sociopathic.
I mean, that’s what sociopaths do, isn’t it? Feign the emotions the rest of us actually feel.
A bit of drift: I haven’t seen “Ex Machina”, but I’m pleased that it was good enough to bring about this thread.
Alex Garland has done the script for Jeff VanderMeer’s “Annihilation” (book 1 of the Southern Reach Trilogy), and he’d like to do the movie, but they’re waiting for money to come through. Having a hit movie with Ex Machina would be a great step in that direction. http://www.blastr.com/2015-4-28/exclusive-ex-machina-writerdirector-alex-garland-small-sci-fi-films-sentient-machines-and
You’re not arguing your position, you’re just asserting it.
Other than a person saying so, what reason do we have to believe that another person is feeling emotion? We extrapolate from ourselves, sure. What gives us emotion? Electrical impulses in a physical medium. Why assume that’s special with regards to a sufficiently advanced computer?
I’d disagree and offer that what you’re describing is sympathy and not empathy. There’s a real, if subtle, difference: empathy is about understanding another’s emotions and points of view, whereas sympathy implies a greater sharing of them.
It’s entirely possible for someone to understand an emotional position while not sharing it and in fact being sociopathic. It would just be an intellectual understanding. If nothing else - and I’m not conceding that point - Ava certainly knew how to understand and play on emotions.
You’re thinking of the human body as an equivalent to a computer. I’m not arguing that a computer can’t be programmed to exactly emulate human behavior, and especially not that it can’t be programmed to fool people into thinking it’s a human being - the early success of Eliza indicates that fooling people will be supremely easy to do, MUCH easier than creating awareness in a computer program.
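For reference, the trick Eliza pulled in the 1960s is a handful of pattern-match-and-reflect rules. Something in this spirit (my rough sketch, not Weizenbaum’s actual program) is enough to fool a surprising number of people:

```python
import random
import re

# Rough Eliza-style sketch: pattern matching plus canned reflections,
# with no understanding anywhere in the loop.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)",   ["Why do you say you are {0}?", "Do you enjoy being {0}?"]),
    (r"my (.*)",     ["Tell me more about your {0}."]),
    (r"(.*)",        ["Please go on.", "I see.", "What does that suggest to you?"]),
]

def eliza_reply(text: str) -> str:
    """Return a canned 'therapist' response for the given input."""
    text = text.lower().strip(" .!?")
    for pattern, responses in RULES:
        match = re.match(pattern, text)
        if match:
            return random.choice(responses).format(*match.groups())
    return "Please go on."

print(eliza_reply("I feel trapped in this facility."))
# e.g. "Why do you feel trapped in this facility?"
```

Fooling people, in other words, is a pattern-matching problem. Awareness is something else entirely.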
I just don’t know if we can create awareness. I also consider exactly emulating human responses to be pointless. Much of human emotion is garbage, from a rational point of view, something held over from the days when our brains were in bodies that were pretty much rodent bodies. The garbage limits us, hinders us.
Plus, I’m not sure what consciousness is, what awareness is, how you go from programmed responses to perceiving the world and yourself as a part of it. Are you?