That’s the case if there is, in fact, a strange loop, which may be a possible resolution to the homunculus problem. But neither is it the only possible one, nor is it necessarily the correct one. My analogy was not between hens and consciousnesses, it was between hens and levels of (ap-)perception. These may close back in on themselves to form a strange loop, but they don’t have to by any means.
Which matter is that exactly? There are many things we don’t know about the mind, but some things we do.
I didn’t say that, exactly… just that there’s no reason to believe the mind has a magical ability to gain knowledge (even knowledge of itself) without perception.
Other options for what? I’ve given the only reasonable option for one thing (the mind’s knowledge of itself), and the other black box you seem so concerned about is entirely beside the point.
There are two common descriptions of consciousness (I wouldn’t call them theories). One is that the brain knows itself through magical (known to be impossible) means. The other is that a homunculus with its own mind accomplishes the task (though no explanation is given for how the homunculus does it, or why its method, whatever that may be, couldn’t work just as well for a human mind alone, without the need for any homunculus at all).
We are NOT at a loss for other explanations. What I offer, as does anyone else with a clue, is the simple straightforward theory that the mind gains knowledge of itself not through magic or homunculus, but through perception.
Does this explain everything about the mind? Nope. It isn’t intended to. It’s merely a rational alternative to believing in nonsense like magical self-knowledge.
Aw, c’mon now. The homunculus is used as an illustration of why it can not be simple perception that leads the mind to acquire knowledge about itself! It’s not a theory of consciousness, or even a description thereof. (Anyway, this hijack’s probably gone far enough; if there’s further interest in a discussion, maybe we should open a separate thread.)
That may be what you’d like it to be, but that’s not what it is.
I suppose my description was something of a caricature. I meant the word homunculus to stand in for any completely undefined (but not necessarily magical or impossible) process.
I don’t think this is a hijack. A thread has to have a topic of its own before it can be hijacked.
I would call this an ‘emergent topic’.
Actually, the only thing I’d like it to do is to go away, so if you have anything to invalidate it, I’d love to hear it. My problem remains this: if, in order to be aware of something, I need to perceive it, then that perception itself, in order for me to be aware of it, needs to be perceived, and so on and so on, which runs on forever without getting anywhere.
Heh. I can work with that.
It doesn’t have to run on and on forever unless the person in question is supposed to have 100% awareness of his own mind. Nobody does.
You are not consciously aware of all the thoughts in your own mind.
You are not consciously aware of all your perceptions.
No, it’s not. Just ask Daniel Dennett.
But that’s what I’m talking about - using the perceptive faculties that normally apply to your sense organs, and applying them to the virtual senses arising in your own mind.
That’s not what I mean, or what anyone else means, by the words “virtual environment”.
ETA
and there are no virtual senses arising in your own mind! gawd, did we just go through all that homunculus stuff for nothing?
I’m sorry, but I don’t see it. If, in order to be aware of something (any one, singular, thing), I need to perceive it; and, in order to be aware of my having perceived it, I need to similarly perceive my perception; and on and on – then, at no point do I actually become aware of it. I don’t see how different degrees of awareness even enter into it.
Not having read the entire discussion…
I would say that there is seeing, then “perception” - which is really something more like “recognition”. As in, your brain is given access to your ocular data, and then it either does, or doesn’t, recognize particular details in the image and incorporate that recognized knowledge into the cognitive knowledge base - thus becoming aware of it. And that’s all there is to it.
The brain also has access to data about the cognitive process itself. It then either does, or doesn’t, recognize particular details about its own thoughts and incorporate those details into its own knowledge base. But this isn’t a necessary step to become aware of the details of the ocular data; that’s already happened.
The brain can also, in reviewing its ongoing cognition, note that it had a thought about its thoughts. But this isn’t a required step to have the first thought in the first place; the later thoughts about the first thoughts are completely separate thoughts from the first thoughts.
Which is to say, a person can have the thoughts:
“That dot is red.”
“Hmm, I’m thinking about a red dot.”
“Why am I worrying about whether the dot is red?”
but the first thoughts don’t depend on the later thoughts occurring at all.
So I don’t see any loop at all. Data is available to the brain, and the brain does (or doesn’t) process it and have thoughts about it. No further steps are necessary.
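The model described above can be put in pseudo-program form. This is purely an illustration of the logical structure being argued (the function names are hypothetical, not a claim about how brains actually work): a first-order thought is produced whether or not any higher-order thought about it ever occurs, so no regress is required.

```python
def perceive(sense_data):
    """First-order processing: a thought formed from raw sense data."""
    return f"thought about {sense_data!r}"

def reflect(thought):
    """Optional higher-order step: a separate thought about an earlier thought."""
    return f"thought about {thought!r}"

first = perceive("red dot")   # awareness of the red dot is complete at this point
second = reflect(first)       # a further, independent thought; `first` never needed it

print(first)
print(second)
```

Note that `reflect` can be called zero times, once, or many times; `first` exists either way, which is the sense in which the later thoughts are “completely separate” from the first one.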
You do not need to perceive your awareness to be aware of something you perceive.
I don’t see what’s stopping you. Let’s start at the beginning:
1. You perceive X.
2. You are, as a result, aware of X.
Congratulations, you are now aware of X!
No “and on and on –” required.
You are not aware of your awareness unless you go on to perceive it, but that is in no way required to be aware of X.
Without awareness of your awareness, you would not be sentient, probably couldn’t even be called conscious, and I wouldn’t be surprised if all kinds of other vital processes of the mind broke down without it, but none of it is required for the awareness of X.
Once again - are you just in this to play semantic games? A virtual environment in the context of a discussion about cognition is obviously a not-real one in the same way a computer simulation is not a real environment. Only rather than running in a machine, this simulation runs in your head.
Are you claiming that you can’t close your eyes and “see” things in your imagination?
You and some other people went through that homunculus stuff. It’s got nothing to do with me.
So what are these “senses” and “perception” of which you speak?
Along the theme of words having meaning, I wonder how much of the argument above is occurring because no one has agreed on what “flavor” of the words to use. In common parlance, “perception” often means awareness. So in that sense, sure, you must perceive a thought in order to be aware of it. But that argument seems rather circular, and circles don’t have points.
In cognitive science, however, it specifically refers to our experience of sensation. And although “sensation” in the vernacular encompasses a wide range of experience (e.g., “a sensation of dread”), in cognitive studies it refers specifically to phenomena involving the sensory organs (sight, sound, taste, touch, smell - and our sense of balance, which most of us tend not to think about until something goes wrong with it - and sometimes proprioception, the sense of neighboring parts of the body, which tends not to be talked about as much because it’s rather mysterious and confusing). But anyway, by that definition, I don’t think it’s correct to say we perceive our own thoughts. To extend the definitions of perception and sensation that far, in my opinion, saps them of their usefulness in scientific contexts.
Which is why I’ve carefully tried to outline the non-real nature of the “perception” taking place. Perhaps a better term would be apprehension or introspection.
But the virtual environment I spoke of was NOT in the context of a discussion about cognition! That was what I was saying when I mentioned it! It was in the context of an article about virtual reality which I came across trying to chase down some meaningful interpretation of your “virtual perception” thing!
Who’s playing semantic games? Are you really trying to change the meaning of everyone else’s words to suit yourself?
Pretty much. You can perceive images, but it would be irrational to propose that the mechanism by which this is done is in any way analogous to seeing with your eyes.
It does now… because you have just made a homunculus argument.
I guess after going to all that trouble to explain what is not a homunculus argument, I now have to start all over and explain what is.
sigh
on preview:
I have to agree with one thing… whatever this virtual perception thing is you speak of, it’s definitely non-real.
Remembering a memory, or a song, or imagining a woman naked are all as real as feeling angry. Having a thought or a mental image is absolutely real perception, it is simply not real vision.
Further, the claim that cog sci only focuses on sensory input when it talks about perception is simply wrong. It’s silly to act as if cog sci only talks about sensory organ perception when there is a very clear delineation between P-consciousness, A-consciousness, S-consciousness and M-consciousness.
I disagree, and it’s not irrational. I believe that both visual perception and introspection of visual mental models use the same area of the brain - Baddeley’s “visuo-spatial sketchpad”. The reason I believe this is because it is well-known that visual perception is already a short-term memory thing, not an as-it-happens thing. The disjunct between the eyes’ reacting to light and the brain’s forming the image you perceive is notorious as both requiring time and being only an approximation mostly built up from internal data.
It’s like I said in my second post to this thread: Your senses are not really your senses like a camera or a Dictaphone, they’re what your brain tells you you sensed. So drawing a line between one sort of internal brain process and another strikes me as excluding quite a bit of middle for no real reason.
Unless you want to start throwing cites around that say different, of course?
Who or what is ‘looking at’ that sketchpad?
This is not a fallacy of believing without evidence. This is a fallacy of speculating in ways which can only lead you further from the truth. That’s right. It’s irrational to even speculate about these virtual senses of yours!
When your brain tells you what you sensed… who or what receives that message?
I am not excluding the middle, just accepting it as the Black Box that it is.
Logic does not need a cite.
I am however working on an explanation of the homunculus problem… But it’s taking some time since I have to take into account all your meaningless terms and shifting definitions.
Nothing is “looking at” it. It’s just a name for a connected series of brain regions related to visual memory and perception…
You have yet to prove any irrationality. Argument by assertion…
You do.
But it’s not a black box. We can see the brain at work.
Assertion certainly does.
I don’t have a homunculus problem. I don’t believe it exists.
I’m off to bed. PDF link for anyone interested in mental imagery vs. sense perception: http://www.wjh.harvard.edu/~kwn/Kosslyn_pdfs/2003Wraga_EncycOfCogSci2_Imagery.pdf. Lots more here