Do Androids Dream of Electric Sheep? by Philip K. Dick - What does it mean?

I just finished this book. I had, of course, watched the movie, and I came away from it with the feeling that the androids weren’t all that different from humans. I felt a great deal of pity and empathy for them, grieving over their short life spans, etc.

However, the book seemed to come to the opposite conclusion - that in the end, they were the robots and we were the only real things. Am I reading this wrong? It was a very entertaining book, but what does it all mean? All opinions are welcome, please!

SPOILERS SPOILERS SPOILERS

First of all, it is important to know that Rick Deckard is an android. This might not be apparent on a first reading, but there are clues scattered throughout that point to this being the case, and Dick has pretty much come out and said that he is one. The book makes much more sense when you take this into consideration.

I haven’t read the book in two years (I’ve been working my way through all of PKD’s books since then; this was the first one I read, so it’s a little fuzzy, gotta reread it soon), but my take on it is this: there are no clear lines between humans and androids; the only determining factor is the V-K empathy test, which is not always accurate. Dick posits that empathy for other humans and living things is the most vital component of being a human. However, is empathy only the domain of humans? Perhaps an empathy-possessing android is more human than a non-empathic human. And if so, then our definitions of artificial vs. authentic are entirely man-made, and we must come up with a new definition of humanity, one based on empathy and love instead of DNA and parentage. This theme (empathic artificiality is more “human” than non-empathic authenticity) is even clearer in DADoES’ semi-prequel, We Can Build You, in which an empathic machine throws into sharp relief the uncaring emotionlessness of the “schizoid” main female character, who is a biological human.

The androids were given four-year lifespans in order to prevent them from developing emotions and empathy, and thus always passing the V-K test (i.e. making even an inaccurate test entirely worthless, and thus making completely obsolete our current biological definition of humanity, something that the establishment would never abide). This seems to say that what makes something or someone feel is not heritage, but rather experience, and that empathy is not the accidental side effect of the freak evolution of the species homo sapiens, but instead a quality which is separate from our species and can arise in anything. This is a radical concept since it basically says that the quality we admire most in humans has nothing to do with humans at all, which is why the humans in the book went to such lengths to ferret out feeling machines and make sure they never arose at all.

Who is more “human” using the empathic definition of humans: a machine that’s been taught to care about other beings and act tenderly toward them, or a natally born sociopath entirely devoid of feeling, whose only reason for not hurting and killing other beings is that he might get caught at it, not that it will make other people hurt and feel bad? The Mercerism in the book (though it was cut out of the movie) was a religion based on empathy; you literally put yourself into someone else’s mind and felt what they felt. Our “humanity” is not something within ourselves but a communal experience, important only in our relation to other beings. It is not what you think that makes you human; it is what you feel.

The artificial vs. authentic theme is also apparent in the humans’ devotion to their animals. Even though all humans preferred to keep real animals (owing to our aforementioned prejudice toward the natal “real” instead of the manufactured “fake”), they were tender toward the fakes, because it is in the nature of most human beings to show empathy toward other creatures. At the end, when Deckard discovers the toad which turns out to be a fake, he doesn’t cast it aside, but instead uses it as a channel for his own empathic drive (which might not be “authentic” itself). Which brings up another interesting tangent: if Deckard is an android, with a “false” empathy, and the object of his empathy is false, does it really count for anything on that great tote board in the sky? And if you say it doesn’t, isn’t that a position based on prejudice and reactionary thought? Dick is saying that in the end, our love toward one another and toward “lesser” beings (be they animal or machine) is the only thing that counts; biological authenticity doesn’t matter, it’s empathy that counts. Or at least, that’s what I gleaned. It’s been a long time since I read it.

Which version of the film are you familiar with? The director’s cut is far superior and throws light on the authentic vs. artificial schism, whereas the original cut is rather shallow wrt these themes. It is also clear in the DC that Deckard is an android, whereas you were less likely to infer that from the original. (I watched the original when I was a kid, and didn’t grasp it at all, but when I grew up and read the book and saw the DC I got so much more out of it.)

What it boils down to, and this seems to be a recurring theme for Dick, is the question of what it is that makes us human. Dick’s thesis seems to be that the ability to feel empathy is what makes us human. So Deckard has crossed the line into a sort of humanity, even if he is a machine (which is heavily implied but really up to you to decide).

Of course the movie doesn’t really convey this, but by all accounts Dick really liked the movie (although I don’t remember if he actually saw it in its final form or if he died before then).

Now go read his divine trilogy and come back with more questions.

:slight_smile:

SPOILERS SPOILERS SPOILERS

First of all, thanks for doing this! This is exactly what I needed.

First of all, it is important to know that Rick Deckard is an android. This might not be apparent on a first reading, but there are clues scattered throughout that point to this being the case, and Dick has pretty much come out and said that he is one. The book makes much more sense when you take this into consideration.

No, this wasn’t apparent at all, or I totally missed it. I think I’ll have to re-read it. But thinking about it, I can’t even recall any clues. Perhaps I read the book and didn’t see any of the deeper themes. The other bounty hunter was also an android; were all bounty hunters androids? And even though Rick was one, he still felt a great deal of empathy for both Pris and Rachel toward the end, even though he still managed to kill Pris.

I haven’t read the book in two years (I’ve been working my way through all of PKD’s books since then; this was the first one I read, so it’s a little fuzzy, gotta reread it soon), but my take on it is this: there are no clear lines between humans and androids; the only determining factor is the V-K empathy test, which is not always accurate. Dick posits that empathy for other humans and living things is the most vital component of being a human. However, is empathy only the domain of humans? Perhaps an empathy-possessing android is more human than a non-empathic human. And if so, then our definitions of artificial vs. authentic are entirely man-made, and we must come up with a new definition of humanity, one based on empathy and love instead of DNA and parentage. This theme (empathic artificiality is more “human” than non-empathic authenticity) is even clearer in DADoES’ semi-prequel, We Can Build You, in which an empathic machine throws into sharp relief the uncaring emotionlessness of the “schizoid” main female character, who is a biological human.

Ok, yes, that was one of the things I did pick up out of the book. No clear lines, even the V-K test had failed on some of the questions. So Dick was saying that even the androids could be human? Or an android, built well enough, could be human? Should I try to read We Can Build You? And is there significance in saying that our definitions of humanity are man-made, even when applying them to a man-made object such as an android?

The androids were given four-year lifespans in order to prevent them from developing emotions and empathy, and thus always passing the V-K test (i.e. making even an inaccurate test entirely worthless, and thus making completely obsolete our current biological definition of humanity, something that the establishment would never abide). This seems to say that what makes something or someone feel is not heritage, but rather experience, and that empathy is not the accidental side effect of the freak evolution of the species homo sapiens, but instead a quality which is separate from our species and can arise in anything. This is a radical concept since it basically says that the quality we admire most in humans has nothing to do with humans at all, which is why the humans in the book went to such lengths to ferret out feeling machines and make sure they never arose at all.

So the androids could develop empathy. And we were killing them because, and when, they became too human. I remember a section of the book where they said that animals could not feel empathy, predators most especially not, and it did strike me as a little odd that humans had developed this trait and yet were still predators.

Who is more “human” using the empathic definition of humans: a machine that’s been taught to care about other beings and act tenderly toward them, or a natally born sociopath entirely devoid of feeling, whose only reason for not hurting and killing other beings is that he might get caught at it, not that it will make other people hurt and feel bad? The Mercerism in the book (though it was cut out of the movie) was a religion based on empathy; you literally put yourself into someone else’s mind and felt what they felt. Our “humanity” is not something within ourselves but a communal experience, important only in our relation to other beings. It is not what you think that makes you human; it is what you feel.

The Mercerism, yes, confused me quite a bit. This explanation helps. But again, I will have to reread and try to understand.

The artificial vs. authentic theme is also apparent in the humans’ devotion to their animals. Even though all humans preferred to keep real animals (owing to our aforementioned prejudice toward the natal “real” instead of the manufactured “fake”), they were tender toward the fakes, because it is in the nature of most human beings to show empathy toward other creatures. At the end, when Deckard discovers the toad which turns out to be a fake, he doesn’t cast it aside, but instead uses it as a channel for his own empathic drive (which might not be “authentic” itself). Which brings up another interesting tangent: if Deckard is an android, with a “false” empathy, and the object of his empathy is false, does it really count for anything on that great tote board in the sky? And if you say it doesn’t, isn’t that a position based on prejudice and reactionary thought? Dick is saying that in the end, our love toward one another and toward “lesser” beings (be they animal or machine) is the only thing that counts; biological authenticity doesn’t matter, it’s empathy that counts. Or at least, that’s what I gleaned. It’s been a long time since I read it.

Or is there even a great big tote board in the sky? It seems to me that if Deckard’s empathy is real, even if the object and the giver are unreal, then Dick is saying that in the end that’s what matters. Is this right?

Which version of the film are you familiar with? The director’s cut is far superior and throws light on the authentic vs. artificial schism, whereas the original cut is rather shallow wrt these themes. It is also clear in the DC that Deckard is an android, whereas you were less likely to infer that from the original. (I watched the original when I was a kid, and didn’t grasp it at all, but when I grew up and read the book and saw the DC I got so much more out of it.)

The question you asked - which version am I familiar with - is a good one. I am familiar with the original cut. However, when we bought it, we bought the director’s cut. I’ve only seen the director’s cut once, though… I think once I reread the book with the idea in my head that Deckard is an android, I will watch the director’s cut again and see if I can understand it.

Thanks for the description. Exactly what I needed.

I thought Dick said Deckard wasn’t a replicant. :confused:

It’s been years since I read the book; but IIRC, I thought the point was that Deckard, a human, was so jaded as to have lost much of his humanity and had become more like his quarry even as they became more like humans.

  1. The book is a completely different animal. Although one could make the case for Deckard being a replicant (and unaware of it) in the case of the film, I never got a sense of that from the book at all. The clues people draw on for the case of the film don’t even exist in the book.

  2. In any case, I’m in the camp that thinks the belief that Deckard is a replicant in the film doesn’t hold too much water, either. I never even heard anyone suggest it until decades after the film came out. Certainly no one suggested anything of it at the time of the film’s release. The first time I heard that Dick thought so was on these Boards.

  3. Damned if I know exactly what it means, aside from Dick’s perennial preoccupation with what’s real (or human) and what isn’t, and how easy it is to confuse the two.

Oddly enough, I finished reading this last week.

I’m not sure that Deckard isn’t human. After all, he passed the V-K Altered Scale, and may have passed the Boneli Reflex-Arc Test, too. And he’s able to use the Mercer box. My view was that the book was about racism, or prejudice. Is Deckard more human than the andies he’s hired to retire? He’s cold enough to listen to Luba’s aria before preparing to kill her. Resch, the other bounty hunter (or Blade Runner, if you prefer), told Deckard to sleep with an andy, then kill it. Meanwhile, the andies are interested in music and literature.

I think it is implicit that these bounty hunters must be less than empathetic to do their job. It’s Deckard’s newfound empathy for androids (specifically Rachel) that makes him a bad bounty hunter.

Or, worded differently, Deckard and the other humans loved their electric pets like real pets. Yet they could not love androids as they love real humans. Or could they? I think it’s pretty clear that many of the important personages are not human. How else could a human be on TV all that time?

To me, it rings true of a genocidal regime: love for minor things, and complete hatred of the important things.

Okay, a lot of that didn’t make much sense, it was just kinda pouring right out of my brain in real time. And I suppose it’s irrelevant if you think Deckard’s an andy.

I thought the point wasn’t that Deckard was an android, but that he felt empathy for the androids and not for humans. This could be seen in his feelings for the androids he was pursuing, as contrasted with his feelings for his wife.

It sort of turns the V-K test on its head if the subject has more empathy for androids than humans. Especially when so much of the human emotional state is artificially induced, anyway.

thwartme

I’m confused here. The PK Dick short story that I read doesn’t seem to match everyone’s description of the book. Has it been too long since I read it, or is there a novelization of the movie out there?

The story I remember has very unempathetic androids. Also, corporations seem to allow humans to behave, in a group, as unempathetically as the androids, so that there is little difference between an android and a corporate human. And the main character is brought to a fake police station by fake police who try to convince him that he is an android with fake memories of being human. Deciding that he is not an android was a crucial part of the story.

I’m another one who doesn’t remember anyone ever saying that Deckard was a replicant until after the movie came out. I don’t believe it even in the movie, although the original version is the only one I will ever see. (I hated the movie and would never subject myself to a longer version of it. :smack: )

I think that making Deckard a replicant destroys whatever tiny value the movie has - not that it’s one of Dick’s better books, either. If humanity doesn’t include empathy, then what’s the point of humanity? Deckard has to gain empathy and emotion as the events progress, at least enough so that he is co-equal to the replicants. If he’s a replicant to begin with, then this revelation becomes meaningless.

I’d also like to know what the supposed clues are in the book for Deckard’s being a replicant.

Are you talking about “The Little Black Box”?

I totally forgot about the Penfield mood organ until now; that’s another example of how artificiality had started to seep into “authentic” human beings (just as the androids had started developing genuine emotions), further blurring the lines. I don’t think it’s necessarily a strike against Deckard’s alleged androidness to say he can use the empathy box; the way I figured it is that any empathic being, real or fake, can use the box, as long as they can feel.

Even though it seems pretty clear to me that Deckard is an android (in movie and book), I can definitely see that it’s supposed to be ambiguous. dasgupta’s analysis makes a lot of sense, and I think prejudice was supposed to be one of the themes of the book as well (that’s the thing with Dick’s books, they are so intensely layered that you can catch four or five themes in every one, and no two people have the exact same interpretation). To me, the theme of the book just doesn’t seem to work quite as well if he’s human, though I accept that as an alternate reading. Some of the clues were Resch telling him that “the best place for an android would be with a big police organization” (hm, like the SFPD?) because the opportunity to test him would never arise (end of chapter ten), as well as a few comments about Deckard dying soon (again, fuzzy, but I remember them being there). Then again, Deckard did pass the V-K, though if he had developed his empathy enough, and had implanted memories which convinced him he was a human, that might have been enough for him to pass the test. There’s certainly evidence on both sides.

I recommend reading We Can Build You; as well as featuring two of Dick’s best characters (the Abraham Lincoln sim and Pris Frauenzimmer) it gives some background information about the Rosen corporation and the reasons why the androids were created. It’s not as well-written as DADoES but it’s still a pretty good story (one of Dick’s very few first-person narrators).

If you have an LJ, there is a LiveJournal community called phillipkdick where I’m sure a lot of people would like to answer your questions. Also, here is a pretty thorough study guide that goes through the book chapter by chapter and picks out the important questions from each. Not many answers, though.

ShibbOleth: Dick only saw some of the opening shots of the cityscape before he died, but he said it looked exactly like he had pictured it. He also read the script (though the draft he read may have differed from the final script), because there was talk of him writing a novelized version of the movie (basically, a dumbed-down version for the yokels), which thankfully never happened.

There has to be a point?

There doesn’t have to be a point, or this particular point. It just seems to me that the endings of both the book and the original movie do have this point and moreover require this point. The redemption of individual humans in an empty, deracinated and dispirited world is one of Dick’s themes in his novels, regardless of how bleak his overall world view was.

OK, I have another question. What was the meaning of Rachel throwing the goat over the edge of the building at the end of the book? And what was the meaning of Deckard’s reaction to it?

No matter what, this was a good book if it made me think so much.

Also, FTR I really liked the movie Blade Runner. There was much to be desired - for example, the girl could have been a better actress - but the mood it struck was very immersive.

I think Sean Young’s wooden acting fit perfectly. She was a replicant who thought she was human. As a replicant, she had no capacity for empathy. She found out that she wasn’t human, which would leave her feeling numb.

A talk that PKD gave at UBC four years after the publication of Do Androids Dream of Electric Sheep is useful if you’re looking for elaboration of his thoughts on the AI/human dichotomy, without having to make inferences from a narrative.

The Android and The Human.

I think this sums up the core concept in DADoES perfectly:

Like almost everything Dick writes, it comes down to “What is real?” Is a purpose-built machine more real than a man who is defined and guided entirely by his job? Is a mystical experience that’s mediated by a machine a real mystical experience? Why? Or why not?

Is the quality of authenticity a concrete thing, or something ephemeral and arbitrarily assigned?

SPOILERS BELOW…

The major one is only in the director’s cut-- an origami unicorn ends up outside Deckard’s door (I forget who exactly put it there-- he’d been dreaming of unicorns-- and the people who make androids know all of their “memories,” as Deckard tells Sean Young’s character). Scott came out and said he was a replicant. Harrison Ford said he had no idea and played him like a human. (This, of course, raises all sorts of questions about who has final authority over reality, etc. The movie is so much more than just real vs. replicant.)

Another popular argument (from the imdb)…

It’s been a while since I read the book, though, so I can’t help you there.

And here I thought it was a robot counting robotic sheep.

Silly me.

:slight_smile: