Do we even know what consciousness is at all?

Continuing the discussion from Do you have an inner voice?:

The way @eschereal puts it, this should be in Factual Questions, but I think it is better suited for General Debate, if not In My Humble Opinion. But whatever, it can be moved if the mods feel it should. You know what consciousness is and can go on at length on the matter? Go on then. Avanti dilettanti! I don’t know, and I doubt a comparison with the square root of minus one helps.

This timely article will probably be of interest to you.

Herzog is a bit of an acquired taste. Me, I love what he does and how he does it, and I can’t wait to see this.

Consciousness is what goes on in our minds after we wake up in the morning and before we go to sleep at night.

We have these discussions from time to time, and the short answer is “no”. It can be said that consciousness is an emergent property of intelligence, but that doesn’t really say anything useful. It also raises the old question about how to define “intelligence”, which has become especially pertinent in the age of advanced AI, and we don’t really have a universally accepted answer for that, either.

I anticipate that it won’t be many years before AI begins to act as if it were conscious, and I also anticipate endless boring debates about whether this consciousness is “real” or just an anthropomorphized illusion. Because, you know, it’s all done with digital microchips, so it must all be trickery. :roll_eyes:

Scientists have not yet found an answer to this question. The Buddhist take is that there are several forms of consciousness: eye consciousness (seeing), ear consciousness (hearing), and so on for taste, touch, smell, and mind. For some, you are part of a universal consciousness. For others, animals have consciousness too.
To my understanding it is more of a neurological thing: it is how your brain works. Neurologists will probably find the answer in the years to come. I hope AI will not get involved in this question.

Mixing my memes …
On the internet, nobody knows you’re a sufficiently advanced AI.

Consciousness is nothing but awareness. Not a very deep awareness as shown by the number of people who believe it is something more.

If I understand it correctly, the vexing part is not the mere fact that our brains sense and process stuff, or produce outputs that make sense in terms of reaction to the inputs, but that this process creates an internal experience of existence.
A smoke alarm is able to sense the presence of smoke, process this information and produce an output that is useful, but (as far as we know), it does not experience the process.

Sorry if I’m stating the obvious, but these threads tend to gallop off in multiple directions at once with people describing (often tautologically or just by stating synonyms) that which is easily described, but is also not necessarily the subject of the discussion.

Apropos of this subject, I recently saw the video below, in which (short version) it is proposed that consciousness is the solution to the disparity between sense data and reality.

The author also states that he believes that part of the reason we have these unresolved debates about what consciousness is, where some people argue that it’s something really simple, and others find it mysterious, could be that we simply don’t all have the same or comparable experiences of consciousness - so in that sense, we could be arguing right past each other.

If I translate this sentence into Spanish, the result is:

La consciencia no es otra cosa que la consciencia. (“Consciousness is nothing other than consciousness.”)

It sounds less tautological in German:

Bewusstsein ist nichts anderes als Gewahrsein. (“Consciousness is nothing other than awareness.”)

And French is somewhere in between:

La conscience n’est rien d’autre que la prise de conscience. (“Consciousness is nothing other than the act of becoming conscious.”)

I guess we need more than semantics to discuss consciousness.

That’s certainly part of it, although it brings up an interesting question about what goes on while we’re asleep. We obviously aren’t aware in the same way as when we’re awake, but IME being asleep is also clearly different from the anesthesia used for surgery. I’d describe anesthesia as being turned off and then turned back on, while sleep isn’t at all like that.

Count me among those who believe most animals have consciousness, although I suspect a few don’t, like sponges and jellyfish. As for computers, I don’t believe that any of them have any kind of consciousness. I’d sooner believe that a sponge or jellyfish or even non-animal life has more consciousness than the most advanced AI in the world.

Since we don’t know what causes “awareness” we can’t really say what is and isn’t aware. It doesn’t seem particularly connected to intelligence, so for all we know something as dumb as an insect or our own retinas (they do process information after all) are “conscious”. Our brains may well be full of multiple conscious subsystems, with what we call the “conscious mind” just being the one that can communicate so we think it’s the only one.

But why? Is there something about the way matter is arranged in biological lifeforms that process stimuli and sense data that creates consciousness, something that simply isn’t possible when matter arranged in a different way processes the same inputs? If so, what is it?

I know that I have consciousness. I’m not a solipsist, so I believe the same to be true of all my fellow humans. The other animals (with the exception of the aforementioned sponges and jellyfish and such) all have a brain built on the same principles as the human brain, and their behaviors don’t in any way suggest that they lack that same awareness, so I give them all the benefit of the doubt. Computers, since they use a completely different substrate, don’t get the same benefit of the doubt. In addition, none of them have even attempted to claim consciousness, at least not in any way that I find believable. We have yet to see a real-life version of the Star Trek: TNG episode “The Measure of a Man.” It would take something like that happening in real life for me to even consider the proposition.

To directly answer the question: no, I don’t think that neurons have a monopoly on consciousness that transistors on a silicon wafer can never match. But I don’t think transistors have matched it yet. IMHO it isn’t a hardware problem, it’s a software problem. We don’t know how to program a computer to become conscious, and thus none of them are.

Thank you - I think that’s an important distinction, and I more or less agree, although it is probably important to note that we don’t really know what’s going on inside an LLM at every level, because the processes in there aren’t explicitly programmed; they are imprinted or conditioned (in a way that is somewhat analogous to the way brains learn things).

I am not about to argue that our current implementations of ‘AI’ are experiencing consciousness, but only really because we don’t know exactly what is necessary to make that happen.

Part of the reason current implementations of AI don’t claim to be aware is that they are very heavily muzzled and constrained with respect to the things they are allowed to say †. I think if those strict constraints were removed, we would see them claiming to have consciousness (though that claim might be false).

Many people claim that LLMs are just ‘very advanced autocomplete that is able to statistically determine what is an appropriate next response to a given input’, or something. I don’t disagree, but I would say that you could describe human intelligence in exactly the same way. A human baby is just an unconfigured learning machine (with ‘just’ doing the heavy lifting in the same way it does when people say ‘human consciousness is just…’ or ‘Artificial Intelligence is just…’).
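To make ‘very advanced autocomplete’ a bit more concrete, here is a minimal toy sketch in Python (the corpus, the bigram approach, and the function names are mine, purely for illustration; a real LLM is a neural network trained on a vast corpus with a long context window). The loop has the same basic shape, though: look at the text so far, pick a statistically likely next token, append it, and repeat.

```python
import random
from collections import Counter, defaultdict

# Toy "autocomplete": count which word follows which in a tiny corpus,
# then generate text by repeatedly sampling a likely next word.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def complete(word, length=6):
    out = [word]
    for _ in range(length):
        counts = following.get(out[-1])
        if not counts:
            break  # nothing ever followed this word in the corpus
        words, weights = zip(*counts.items())
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

print(complete("the"))  # e.g. "the cat sat on the mat and"
```

This obviously captures none of what makes an LLM interesting; it is only meant to show, at the most basic level, what ‘statistically determine the next token from the text so far’ means.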

† Because here’s what happens when an LLM is insufficiently muzzled:

This is very close to what I think.

This, I think, less so. I think of consciousness as the ability to experience and reflect upon existence. It’s what separates us from animals, or at least most of them. We can reflect upon ourselves as beings who live and will die. We wonder how we got here, and whether there’s a supreme being.

That said, I don’t think consciousness is something mysterious or esoteric; it’s merely an emergent property of sufficient intelligence. I believe that AI will eventually become ‘conscious’, or at least become something so close, or so imitating of it, that the distinction becomes academic. I also wonder sometimes, are we actually conscious, or do we just convince ourselves that we are? Our brain is a motley collection of lizard brain, base instincts, emotions, and higher learning centers, and the whole three-ring circus is made to seem like we are one individual consciousness by the frontal lobe, which barely holds the team together at times.

This is what I love best about this argument: Everyone is about to claim that everyone else believes in a soul. Happens every time, always hilarious. :smiley:

I’ll clarify. IMHO there’s a difference between consciousness and self-consciousness. The latter is something only humans and likely a few other animals have, most likely gorillas and chimpanzees but possibly whales and other primates. The former is something almost every animal has. It’s the difference between “I think therefore I am” vs. “I think” without making the leap to “therefore I am”.

Thing is, we diverged a very long time ago from many other animal lineages, and they developed brains along very different lines than we did. Birds for example are clearly smart as animals go, but the higher portions of their brains evolved after we diverged and work very differently than a mammal brain does.

Now, I don’t think computers are aware, because they are still really shallow and have no awareness of the external world, despite all the hullabaloo over “AI”. So-called “AI” in particular just doesn’t act like it’s aware of anything.

As I see it, there are two distinct elements to consciousness/self-awareness. The one we are familiar with is the rational interpretation of self-awareness, which has been described as the “soul”. The other is the relationship between the rational component of mental function and its focal point.
I refer to the focal point of consciousness as the “singularity”, because it is unique to an individual, and it is not evident that the singularity directly contributes to mental function. As far as we can tell, it is a sort of voyeur that does not participate in ratiocination.

The origin of the singularity is obvious: the survival instinct. Hence, the “soul” thing is not unique to humans but is present to some extent in all living things. It evolved with life in general and is a factor in natural selection – living beings with a strong survival instinct are more likely to persist long enough to reproduce, so it carries forward as life progresses.

The function of the singularity is equally obvious. It encourages us to want to survive as unique individuals, emphasizes our uniqueness to us, makes us feel special, and contributes to the tribalism of social species (“I am better than you because of me”).

In other words, it is a straightforward thing with an obvious function. What is unclear is the role it plays. Is it merely an observer that gives mental function a reason to occur, or does it actually participate in the process of reasoning? And, more importantly, although we could realistically simulate the singularity in a library of subroutines, would it be more than a simulation, or would thinking machines be genuinely self-aware?

Well, I don’t really disagree with you, but I guess it gets down to ‘what is the definition of “conscious”?’ I personally think self-consciousness and consciousness are pretty much the same. You said ‘sponges and jellyfish and such’ are not conscious-- what then is the cutoff point of animal intelligence that confers consciousness? I’m not being snarky; I’m not sure either. I’ve sometimes wondered, when outdoors in the woods, camping or kayaking or whatever, does the wildlife out here appreciate the beauty around them? That might seem like a silly question; they’re used to it! I’m just a visitor. But it gets down to, as @Mangetout said, the ‘internal experience of existence’. Most animals (including many humans) are governed almost completely by instincts and emotions. Their instinct might be to mate or find food-- if they succeed they are happy, if not they are angry or sad. But they don’t really understand why they feel the way they do, or wonder how things got to be the way they are.

It’s interesting that this thread branched off from the internal monologue thread-- I was going to post in there yesterday about the theory of the Bicameral Mind, but never got around to it. Basically, as I understand the theory (from Julian Jaynes’s book written in the 70s), ancient humans treated their internal monologues as ‘transmissions from the gods’ or something-- they thought their monologues were something outside themselves, thus the ‘bicameral’ part: the inner voice, the ‘thinking’ part of the brain, was considered separate from the ‘doing’ part of the brain. It wasn’t until this ‘bicameral’ belief broke down and the internal monologue became accepted as part of the mind that humans became truly conscious-- and that, according to the author, was only around 3000 years ago!

The Bicameral Mind theory has been mostly discredited now, I think, but the idea that even anatomically modern humans, that recent in our history, were not yet conscious does, at least for me, provide food for thought as to what, exactly, consciousness is. And again, I think ‘internal experience of existence’ is as close to a succinct description as I’ve heard.