The concept of a philosophical zombie makes no sense to me

Imagining a person who is isomorphic with someone who is conscious but lacks consciousness, even though that lack of consciousness is undetectable, is the same as saying that a person can have the quality ‘wibble’, but that this quality is undetectable, and so it is possible to imagine people without this quality who are otherwise identical to those who do have it.

I imagine a ‘consciousness meter’ might come in handy here; in fact there was someone who was selling a device recently which would do the job as well as anything else.
ADE 651 Fake Consciousness Detector

No, not at all. There is a clear difference between conscious entities and unconscious entities—the fact that it is not a third-person accessible difference does not impinge on this. Imagine the difference between being in extreme pain versus being in perfect bliss: if subjective experiences really don’t matter, then you should have no cause for preferring one to the other. In the zombie case, there is something it is like to be a conscious entity, while there is nothing it’s like to be a zombie: the set of facts that are true about the world is different in the two cases, even though the set of physical facts is the same. With some arbitrary property like ‘wibble’, there’s simply no fact of the matter as regards whether a being has it.

In fact, this sort of thing seems, to me, like a pretty good litmus test for the idea that consciousness is an illusion: once you know something is an illusion, you’re typically able to “see through” it—that is, while the lines in this picture on some level still seem to be of different length, once you know that’s an illusion, you stop believing they are. The illusion ceases to be compelling.

But that’s not the case with the “illusion” of consciousness. Even to the most hard-headed eliminative materialist, who takes themselves to have thoroughly deconstructed the notion of qualia, pain will still hurt; the illusion has lost nothing of its force. It is not possible for something to seem to hurt without the person experiencing the pain also believing that it hurts, because what things seem to be is just what they are in the mind.

So if consciousness were to be an illusion, it would have to be one of a very peculiar kind, namely one which can’t be “seen through”. But in what sense, then, is that still an illusion? It seems that the very notion of illusion implies the possibility of seeing beyond it, understanding and coming to believe what is really going on.

As I understand it, there are lots of behaviors we’ve evolved that aren’t relevant to survival. The enjoyment of music is one of the big ones. Consciousness is suggested by some to be an emergent property of minds that are large enough to function as successful hunters, gatherers, and especially coursers.

Consciousness may not be necessary to our behaviors, but might give a short-cut to rewarding survival-benefiting behaviors. This is, in essence, why “pleasure” exists. Not because it’s good in itself, but because it can be used to reinforce behavior. There are people who lack a sense of pleasure; their lives are simply empty of that kind of joy. Yet, nevertheless, they can get through life, basing their behavior on other rewards.

But, again and again, what if there is a minor physical difference – something missing from that person’s brain – and he is in most respects identical to me, but not absolutely identical? When you insist that he has to be exactly the same as I am, you break away from the premise I’m putting forward.

The sociopath is not exactly the same as the people around him. Yet, he gets along in society so well, most of his closest friends don’t even know that he lacks a conscience.

The p-zombie lacks consciousness…but doesn’t know this, and, while he might wonder, every now and then, what people are talking about, he doesn’t have anything to compare his own experience to, and so he just shrugs and acts the way I do when people start talking about basketball. It doesn’t mean anything to me, but it clearly means something to them.

In essence, I am resisting being painted into a corner, by accepting the premises that you seem so very eager to urge upon me.

If there is no physical difference, then the guy is me! (And we walk straight back into the Star Trek Transporter!)

In fact, some people are able to see through pain in just that way. I, myself, possess this capability, to a very limited degree. I can hold on to the fact that pain is just a signal of nerve impulses, and not a “real” thing of its own. I can “turn it off” to a degree (again, very limited) by recognizing it for what it is. Pain is only a memo, an email, a written depiction of a sensation.

The more we study optical illusions, the more aware we are that our visual sense is, in some degree, an illusion. The blind-spot illusion, especially, is very amenable to demonstration.

But there is no clear difference between a p-zombie and a conscious person; they are made of exactly the same matter, in the same configuration, and will answer the question ‘are you conscious’ in exactly the same way. If you were a p-zombie you would be having the same conversation with me now as we are having, and vice versa. It is a difference without any detectable difference.

This depends on which kind of p-zombie we’re talking about.

In my case, I’ve mainly been talking about behavioural p-zombies, and the corresponding argument against Behaviourism. A behavioural zombie’s brain does not need to be like a human’s in any way. All that is necessary is that the entity behaves the same as a conscious entity.

This is the concession I’m not willing to make, as it plays too much into the hands of the mystics. If the physical bodies are exactly the same, then the operating processes, even including the ones we can’t measure, should be the same.

If one guy is conscious, and the other guy isn’t, I’d hold there is some physical difference…or, possibly, an informational difference. One guy might never have been taught some important life-lesson. Maybe if you beat a child brutally at an early age, he won’t develop consciousness.

So: is the programming of a computer a physical difference? I think it is, so that an informational difference between two people is still physical.

(Take a magic matter-duplicating machine and make two exact duplicates of a person. Tell one of them that the password to a web-site is XYZ, and tell the other that the password is ABC. Now, there is a physical difference between the two, in the way that this information is stored in their brains.)

A thought on the value of consciousness (and why it would arise):
Maybe consciousness is required to have emotions
Emotions are a mechanism for guiding our actions (social interactions, fear of danger, love of children/mates, etc.)

When I think about emotions, they seem intertwined with my sense of who I am (no surprise) to a degree that they seem like they might not make sense if I didn’t have a centralized feeling about who I am.

It’s tough to picture emotions in a being that doesn’t have consciousness.

Maybe I’m stating the obvious, just thinking out loud.

Well, the enjoyment of music is really a by-product of how our auditory cortex is wired up, which, in turn, is dictated by evolutionary pressure. But the main point is that if our internal states have no causative influence on behaviour, then yes, we could nevertheless have evolved them, by random genetic drift for instance, but then, it’d be an unfathomable coincidence for our internal experiences to be appropriate to the situations in which we have them—nothing could ‘steer’ our phenomenology towards this appropriateness.

But again, why would such reinforcement need any conscious experience? Lots of feedback systems function along those lines, but without any consciousness.

But the stipulation is that he is identical to you in every respect—to respond to this by saying that maybe he’s not identical after all is just to miss the point of the hypothetical.

If that premise is physicalism, then yes, indeed, because the point of the argument we’re discussing is to question physicalism; it’s to question the premise that physicalism holds. Sticking to that premise requires refuting the argument, but the argument isn’t refuted by sticking to physicalism, as you seem to be doing.

Only if you pre-suppose physicalism, which you can’t do without incurring a circularity in response to an argument designed to question that very premise.

Well, maybe (though I rather suspect that that’s a matter of how much pain one inflicts; nevertheless, I don’t want to quibble), but there is still a feeling of pain—pain is still painful, even though you can come to terms with the pain (I think everybody can to some degree). There’s also the possibility that mental control might be instrumental in engaging the body’s own mechanisms to combat pain, such as releasing endorphins, or whatever.

Anyway, maybe pain was a bad example. Let’s think about visual awareness: what it feels like to see a rose, say. I don’t believe you can convince yourself that there is, actually, nothing it is like to see that rose—that seeing it is the same as ‘seeing’ something behind your back; because that’s what it’s like for there to be nothing it’s like to see something: you don’t see anything there. But the rose in front of your eyes, and the rose behind your back, produce wholly different phenomenology, and I don’t believe that you can convince yourself by any means that they’re really the same thing.

Yes, but only for one of the two will that answer be true.

Say that two worlds are the same if all the same propositions are true of both. Then, a world populated by zombies, and our world, are not the same: in our world, propositions like ‘there is something it is like to be a human being’, ‘pain is painful’, ‘there is subjective experience’, or ‘eburacum45 is experiencing reading these words right now’ are true, while in the zombie world, they are false. Nevertheless, all the physical propositions that are true about our world are true about the zombie world. But then, physics doesn’t exhaust the world.
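
To make the shape of that argument explicit, here is a rough formalization; it is only a sketch in notation I’m introducing for illustration (none of these symbols come from the thread), with W1 as our world, W2 as the zombie world, P the physical propositions and S the experiential ones:

```latex
% Sketch of the argument's structure (notation assumed for illustration):
% W_1 = our world, W_2 = the zombie world,
% P = physical propositions, S = experiential propositions.
\begin{align*}
  &\forall p \in P:\ (W_1 \models p) \leftrightarrow (W_2 \models p)
    && \text{(physically indistinguishable)}\\
  &\exists s \in S:\ (W_1 \models s) \wedge \neg(W_2 \models s)
    && \text{(e.g. `there is something it is like to be a human')}\\
  &\Rightarrow\ \{q : W_1 \models q\} \neq \{q : W_2 \models q\}
    && \text{(the worlds differ, though their physics agree)}
\end{align*}
```

If both premises can hold together, then the set of true propositions isn’t fixed by the physical propositions alone, which is just the ‘physics doesn’t exhaust the world’ conclusion above.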

This is not the case with made-up properties like ‘wibble’: there’s nothing that makes the proposition ‘there is wibble’ true or false; it’s just a sentence that doesn’t refer—there’s no fact of the matter. But ‘there is consciousness’ does refer to a particular state of affairs: it can be either true or false. The difference between the two is the difference between what’s in front of your eyes and what’s behind your head, if you only consider visual awareness. For a zombie, everything would be like what’s behind your head is for you—i.e., like nothing. You have no (visual) awareness, no subjective experience, associated with that part of space; the zombie has no awareness of any part of space, yet nevertheless acts as if he did.

Without a third-person detectable difference; but my experience, and presumably, yours, would be different.

Again, that’s only the case if physicalism is true, which you can’t presuppose to answer an argument whose conclusion (if it is sound) is that physicalism doesn’t hold. You have to attack the argument itself.

Well, the feeling you have when you have some emotion is certainly connected to consciousness—in fact, it is an element of it. But all the neurological, physiological, and physical processes occurring when an organism has some emotion, if the zombie argument is sound, could just as well occur without any attendant experience. So I think it’s question-begging to suppose that emotions, i.e. the attendant feels, have any instrumentality in guiding our behaviour.

In that case, no, I don’t accept the stipulation. If two entities are exactly alike, then they could not have a significant difference between them. Saying otherwise seems contradictory.

What is being said is that they are physically identical but differ in whether or not they have consciousness.

Thing is, I don’t buy that. If there is a difference, then it’s a physical difference.

If you say, “That’s circular,” well, okay. Guilty as charged. Until the matter is settled scientifically – please show me these two persons – I see no reason to accept the premise.

All I’m getting out of this is that we’re making up excuses to call each other mystics. And since everyone here (I believe?) has rejected mysticism, then what’s the point of creating premises that invite mystical explanations? I see nothing but “gotcha” behind such premises.

Agreed.
And I don’t think your objection is circular; the argument itself is, if it boils down to: “Imagine something physically identical to you, but not identical to you. See? That refutes Physicalism!”

Behavioural p-zombies, OTOH, and the discussion surrounding behaviourism, are more interesting, because even many Behaviourists will accept the premises. So at least we have a starting point.

Only if physicalism is true. But it’s the conclusion of the argument that it isn’t—it’s not just put in as a premise. To resist this, you’d have to refute the argument—just insisting that, anyhow, physicalism is true, is basically saying ‘Assume the argument is false. Then, the argument is false!’.

And it’s not the case that the only other option is some form of mysticism. Just to sketch an alternative, consider the following: in physicalism, there is only one sort of stuff—it’s a monist position. Furthermore, that stuff is material—it’s a form of materialism. What makes it physicalism is the assumption that all the properties of matter are physical. Property dualism (or, as Chalmers calls it to explicitly emphasize his rejection of mysticism, ‘naturalist dualism’), however, stipulates that we don’t actually know that this is the case.

Physics, at least as regards the simplest building blocks of matter, is concerned with properties such as mass, charge, spin, and so on. It describes how these properties interact; but nowhere does it say that these are necessarily the only sorts of properties. Here, property dualism comes in: this is a viewpoint that says, again, that there is only one sort of stuff—so it’s monist—and that it’s material—so it’s materialist. However, it stipulates that besides the physical properties, at least some of the fundamental building blocks of the world also have mental properties. This is not any more mysterious than stipulating they have only physical properties. In a way, both are simply material properties, just that we have so far emphasized the third-person accessible, objective, spatial, extrinsic ones while neglecting the first-person accessible, subjective, qualitative and intrinsic ones.

In such a conception, zombies are a simple possibility: they can exist in a world in which the fundamental building blocks lack the mental properties, but possess all the physical properties. But it is not necessarily any more ‘mystical’ than physicalism is.

Now, besides this possibility, numerous others have been proposed; so claiming that a rejection of physicalism, or an acceptance of the zombie argument, commits one to mysticism just ignores all those other options.

Are there even any behaviourists left? My understanding was that the viewpoint has basically died out.

Well, Daniel Dennett has described himself as a kind of Behaviourist.

And certainly here and in other debating forums, you will come across many people espousing a Behaviourist position.

(Although, frankly, a lot of them don’t grasp what the actual problem is WRT consciousness. What I mean is, for example, it’s intuitively obvious that “red” is something out there in the world, that we passively see. It takes a certain mental leap to appreciate that EM radiation of a particular wavelength is out there, but “red”, the colour, happens in your brain.
Many of the people arguing Behaviourist positions, IME, are pre-“penny drop moment” (though of course some aren’t))

I think you’re missing the point of what the illusion of consciousness is about. To quote a relevant psychologist:

Furthermore, Blackmore’s book Consciousness: An Introduction offers exercises that do allow you to see through the illusion and see more clearly how limited “consciousness” actually is. Once you start noticing how much of the action and behaviour you would have thought inextricably linked to consciousness actually isn’t, the concept of a p-zombie becomes more comprehensible.

If anything I would describe my position as ‘informationalist’; there is physical matter and information, and that information can be processed by matter but has a quality that transcends matter in the way that this hypothetical non-physical element to consciousness is alleged to do.

Information could be arranged into entire virtual worlds that can replicate the physical world or create entirely alien ones. There is plenty of room in the infosphere for the sort of imagined non-physical component that is included in consciousness.

However, there is no room for a p-zombie in an informationalist view of the world; if your processing substrate is identical then your experience of consciousness will be identical. I can imagine (with some difficulty) p-zombies which have physical substrates that are *not* identical to the original, although even there I would expect some sort of consciousness to exist.

Ah, I’ve generally associated him with more of a functionalist point of view (or, as he’s called it, ‘teleofunctionalist’), which I don’t necessarily associate with behaviourism. But I can see how one might consider his heterophenomenology kind of a behavioural approach. Nevertheless, I don’t think his behaviourism is that of Skinner et al.

Yes, it’s one of those things that can be hard to see, but, once seen, can’t be un-seen. Consciousness appears to us as being wholly transparent, as if we looked on the world as through a window. Magritte’s The Human Condition provides a good analogy, with the caveat that it’s itself seen from the point of view of a conscious observer; in reality, the surroundings and the environment beyond the window should be like nothing at all, with only the painting having intelligible qualities—but of course, that’d be hard to draw. But it helps to keep in mind that what you’re experiencing is always just the painting in the painting, and that thus, the qualities of your experience are not qualities of the world, but of the painting (and of course, that shouldn’t be taken too literally either, lest one encounter the dread homunculus).

To some extent, different people mean different things when talking about the illusory nature of consciousness. Proponents of eliminative materialism, such as the Churchlands, do indeed claim that consciousness is wholly illusory—that we’re fundamentally deceived about being conscious (or rather, that consciousness is such a confused and muddled notion that it basically doesn’t refer to anything in the world, and will, once we truly understand the relevant processes, simply vanish from our considerations the way élan vital did—I think that this is a misguided analogy, since élan vital was an explanatory hypothesis, while consciousness or being conscious is merely a name for a certain kind of state we may be in, but that’s another debate entirely).

Keith Frankish explains a related idea in this recent philosophy bites podcast; he doesn’t exactly call consciousness an illusion, but believes we’re deceived about its properties—that it isn’t actually the subjective, qualitative, experiential state we believe it is, but that this is merely illusory, a story we tell ourselves in order to feel special (which I think is question-begging). So there are some people who take consciousness, or its difficult aspects at least, to be illusory in the sense that I’ve been considering.

Regarding the idea that we’re deceived about the precise nature of our consciousness, I’ve got no qualms at all—the canonical example is the poverty of peripheral vision, which nevertheless seems to us to be perfectly sharp and accurate. This can be readily demonstrated by drawing a card at random from a deck and bringing it slowly to the center of your visual field from the outside while staring fixedly forward, noting when you can recognize its various features—its color, whether it’s a number or a picture, and finally, what it shows.

In fact, it’s been estimated by Tor Nørretranders in his book The User Illusion: Cutting Consciousness Down to Size that our conscious perception involves merely 40 bits per second*, which seems astonishingly little considering the richness and detail our phenomenal experience appears to have. But demonstrations like the above make it sound not quite so striking.

Well, I’m not sure you can separate physics and information. First of all, information is physical in the sense that one bit of information always needs a physical difference along at least one property—you can’t store a single bit with infinitely many red balls, you need to have the possibility of turning balls a different color (for example). Only in the difference between a red ball and a blue ball (say) can you store information, so it’s linked to physical properties.

Furthermore, information itself also has physical properties: deleting one bit of information, as Landauer has pointed out, needs an energy that’s proportional to the ambient temperature T (the proportionality factor being k*ln(2), where k is Boltzmann’s constant and ln is the natural logarithm). This is often simply referred to by the maxim ‘information is physical’, since you need to consume energy to manipulate it. So I don’t think there’s any good sense in which information ‘transcends matter’; it’s merely an abstract way to treat structural material differences.
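
For a concrete sense of scale, here’s a quick back-of-the-envelope calculation of that Landauer bound at an assumed ambient temperature of 300 K (the temperature and the variable names are just illustrative choices, not anything from the thread):

```python
import math

# Landauer's bound: erasing one bit dissipates at least E = k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann's constant, in J/K
T = 300.0            # assumed ambient temperature, in kelvin

E_min = k_B * T * math.log(2)   # minimum energy to erase one bit, in joules
print(f"Landauer limit at {T:.0f} K: {E_min:.3e} J per bit")
# Roughly 2.87e-21 J: a tiny amount, but not zero, which is the sense in which
# manipulating information always carries a physical cost.
```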

This, again, is simply rejecting the argument without addressing it. If the argument is sound, then the informationalist view is simply false, and refuted by it.

*Recently, Max Tegmark, a cosmologist who’s somewhat famous for occasionally writing papers grappling with ‘the big questions’, arrived at a similar figure (37 bits) in his paper ‘Consciousness as a State of Matter’ from theoretical considerations, but considered that way too little; I wrote him to point out that it’s actually appeared in the literature before, and he wrote back to tell me he’d ordered the book right away. :slight_smile:

‘Informationalism’ isn’t just about the states of matter, but about the way they change when processing occurs. You are different from the person you were yesterday only because additional data has been received and processing has occurred.

If there are two identical people, one who is a conscious person and the other one a p-zombie, then the only way that this difference can occur is if there is hidden processing going on somewhere that is otherwise inaccessible to investigators. Where do you suppose this hidden processing is occurring, and how does it differ from the processing that occurs in the physical brain?

Or is the difference something else (wibble, perhaps) which is unconnected to the processing of information?