If consciousness is an illusion, who is being illuded?

How can I be deceived into thinking there is an I if there is no I?

A magician can in principle produce any illusion you can imagine. Except the illusion that there is an audience observing his illusion. If there is no audience, there is no illusion.

So to me it sounds not so much like “consciousness is an illusion” is wrong, but rather that it is a nonsensical statement. Likely the problem is that I don’t understand it fully. Please explain.

It seems like a variation of the “if a tree falls in the forest and nobody is around to hear it, does it make a sound?”

Some phenomena are defined by being observed. If they’re not observed they don’t exist. And the act of observation requires an observer.

So I agree. The existence of an illusion requires the existence of an observer.

Where did you get the idea that consciousness is an illusion? In terms of Hindu non-dualistic metaphysics, shuddha chaitanya (pure consciousness) alone exists, everything *else* being an illusion. Names and forms, including those of the Gods, arise from it and subside in it.

It is a concept seriously put forward by cognitive researchers. I believe Daniel Dennett and Sam Harris are popular proponents of the idea.

The OP could be talking about this concept: that the brain is a machine and consciousness as we perceive it is an illusion. BTW, that is how it works. Traditionally we have imagined consciousness as some kind of magical ability, but there is no evidence that the human brain is anything but a machine, one that dreamed up this concept out of a lack of knowledge of how our brains work. Consciousness is a myth we invented to explain how our brains work.

My conscious brain knows I’m not gonna fall off the spinning earth. But somewhere in my subconsciousness there’s a little tiny neuron, or whatever they’re called, that fires a message. And any time I’m in a wide open space, it happens. So my conscious brain decides to throw a panic attack at me, in an attempt to make me run and get under something. That’s how I know consciousness is real.
YMMV

I agree with the OP’s referenced position (don’t know if the OP regards it as their own viewpoint or not). Whatever-the-fuck I may or may not be, I’m conscious. My consciousness can’t be an illusion to me.

If the machine that is the human brain dreamed up this concept, then I am that human brain-machine and it, and I, am conscious. Unconscious brains don’t dream things up.

If I write a book, and in the book is the sentence “This book is conscious”, then one could reasonably say “the book thinks that it is conscious”.

Is it?

Whatever it is, is the book’s status any different than that of a mind, that has written in it “This mind is conscious”?

You can’t reasonably conclude that the book is conscious. But if the book is consciously pondering the question, the book is entitled to conclude in the affirmative. You should not take the book’s word for it; the book’s consciousness could certainly be an illusion to you. But not to itself.

From two recent essays on precisely this topic, both of which have come out in the last month:

Michael Graziano

Keith Frankish

To do this of course they pretty much had to ignore a quarter-century’s worth of debate on said topic.

Where has that debate gotten us and why should we pay any more attention to it than what any other professional has to say?

These are serious questions, not snark.

It seems to me that you are being pretty much Cartesian here. I’m more of a fan of the Hofstadterian Gödel, Escher, Bach strange-loop approach. And that Strange Loop book is a faster read! The linked article condenses it well:

The framing of the OP is like declaring that the chicken had to come first for there to be an egg.

No, because you wrote the book. But if a book was capable of generating that line by itself then I’d be willing to concede it is conscious.

Honestly, though, the illusion of consciousness is a boring debate. Bottom line is: so what if it is? If consciousness is an illusion (emergent from a system that contains itself as part of its own set), it is a necessary one. “I” experience a self, be it an illusion or otherwise, and that experience is necessary for my self to exist. Eliminate that sensation and I am no longer “I” but an automaton.

I have no difficulty with the notion that consciousness is emergent as an outcome of the strange-loop kind of process described by Hofstadter, and yet I don’t consider that to make it an illusion. I still experience mine. If I’m a strange loop by virtue of being a program sufficiently complicated that it contains a model of itself as part of its overall world-model, I’m nevertheless a conscious strange loop.
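To make the “model of itself inside its own world-model” idea a bit more concrete, here is a minimal toy sketch in Python. All the class names and structure are my own invention for illustration, not anything taken from Hofstadter:

```python
# Toy illustration of the strange-loop idea: a system whose world-model
# contains an entry describing the system itself. Purely illustrative.

class WorldModel:
    """The agent's internal representation of things in the world."""

    def __init__(self):
        self.entities = {}

    def add(self, name, description):
        self.entities[name] = description


class Agent:
    def __init__(self):
        self.world = WorldModel()
        self.world.add("sun", "a bright thing in the sky")
        # The loop: among the things the agent models is the agent itself,
        # described as the very thing that maintains this world-model.
        self.world.add("self", "the agent maintaining this world-model")

    def reflect(self):
        # The "self" the agent can inspect is just its own entry in the model.
        return self.world.entities["self"]


if __name__ == "__main__":
    me = Agent()
    print(me.reflect())  # -> "the agent maintaining this world-model"
```

Nothing about calling that inner entry an “illusion” follows from the structure itself; the loop is just a model that happens to include its own modeller.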

I’d like to see some falsifiability if this is a scientific discussion. If the hypothesis “consciousness is an illusion” is true, what predictions can we make? If it is not true, what predictions can we make?

I think there are different senses of ‘illusion’ that need to be distinguished here. One sense is that of a non-veridical experience: I have the illusion of there being a black cat in the room if I have an experience of seeing a black cat in the room when there isn’t, in fact, a black cat in the room. In this sense, saying ‘conscious experience is an illusion’ is indeed nonsense: I can’t have a non-veridical experience of conscious experience—indeed, I don’t have any experience of experience at all, the experience is all there is!

But that experience can be systematically non-veridical, and thus, be illusory in a different sense. This sense concerns false conclusions that are only apparently implied by some data—in a literal sense, for instance, if I have a study of some illness indicating the effectiveness of a treatment which turns out to have been a statistical fluke (in which case I could rightfully say that its efficacy turned out to be illusory), or in a more figurative sense, if some items of knowledge—perceptions, memories, and the like—strongly imply something to be true that actually isn’t, as in when I think that the hairs on the cushion imply there was a black cat in the room, but there actually wasn’t.

So illusionism, in my opinion, means that experience is systematically non-veridical in the sense that it doesn’t have the properties that we normally, and perhaps unavoidably, associate with it, and thus, the word ‘experience’ does not actually refer to anything real in the world, but merely to our misguided beliefs. For instance, experience is often claimed to be ineffable—that is, I could never explain to a congenitally blind person ‘what it’s like’ to see the color red. In principle, however, that could be wrong: it could merely be really, really hard, involving the transfer of quantities of information that we can’t fathom transferring via language, or transferring that information in a way that we haven’t yet thought of. (For instance, there are people who claim to be able to visualize four-dimensional objects, despite obviously never having seen one; thus, the capacity for having that experience must be something that’s transferable without the need of actually having that experience, and that transfer presumably was enabled only by our development of the requisite mathematics.)

It’s this sense in which consciousness may turn out to be illusory: the properties we consider it to have may simply turn out not to apply to it, and hence, the term fails to refer to anything in particular—i.e., there’s nothing in the world of the sort that we believe ‘consciousness’ to be.

I have to say it doesn’t really concern me that much.

I have a mechanism inside my head that helps me navigate the world. It analyses input and makes decisions. I also have some sort of a view of that happening, which I consider to be my consciousness. Rather like a computer may handle inputs and crunch numbers and output the results on a screen. The actual processing happens behind the scenes and I only become “aware” of it when it is displayed. That “screen” is what I might call my consciousness; it is a manifestation of the processing, not the processing itself.
This seems reasonable to me because it is entirely possible for all that processing and decision-making to happen and for the “screen” to be turned off (when asleep, dreaming, sleepwalking etc.).
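A throwaway sketch of the screen analogy I have in mind (purely illustrative; the function names are made up): the processing runs whether or not anything gets “displayed”.

```python
# Toy version of the screen analogy: the real work happens behind the scenes,
# and only a summary ever reaches the "screen". Turning the screen off does
# not stop the processing. (Names invented; this is just the analogy.)

def process(inputs):
    """The hidden machinery: analyses input and reaches a decision."""
    return max(inputs)  # stand-in for whatever the underlying processing does

def display(decision, screen_on=True):
    """The 'screen': on this analogy, what I'd call consciousness."""
    if screen_on:
        print(f"Aware of the decision: {decision}")
    # screen_on=False models sleep or sleepwalking: the decision was still
    # made, it just never gets shown anywhere.

decision = process([3, 1, 4, 1, 5])
display(decision, screen_on=True)   # awake: the result gets "displayed"
display(decision, screen_on=False)  # asleep: processing happened, no display
```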

I realize you’re probably speaking metaphorically, but it’s important to note that whatever way consciousness works, it can’t be anything at all like that. Because viewing stuff on a screen requires being conscious of what’s being shown; so if consciousness were some internal screen or analogous instrument just ‘highlighting’ some data, then we’d be begging the question of how this data is itself perceived—is there some entity in the brain (called the ‘homunculus’ in this sort of debate) that perceives the display? If so, then how does its perception work in turn?

Trying to explain, or even describe, consciousness in terms of data displayed for internal consumption thus appeals to the phenomenon it’s trying to explain by way of explanation, leading to the so-called homunculus regress.

This is an important distinction. However, Daniel Dennett in particular speaks of the illusion as an illusion of experience, not an illusion of conclusion.