So conscious thoughts are thoughts that are thought as well as perceived, and subconscious thoughts are only thought? Fair enough, but then, what is my consciousness if not the sum of my conscious thoughts – and what perceives my thoughts if not my consciousness? One too easily gets tangled up in a Cartesian-theater-like fallacy when arguing like that: thoughts show up on a viewscreen in the brain, read by some homunculus, and thus perceived. But then, how is this homunculus himself aware of what he reads? Is there another homunculus in his brain?
‘Virtual perception’?
Did you just totally make this term up? If this term has ever been used to mean what you are describing here, I can find no mention of it on the intarwebs. …except for possibly one Google Books result from some quack of a philosopher. The only reason I can’t rule out the possibility that his meaning is similar to yours is that he’s so vague and convoluted I can’t discern any meaning from him at all.
I don’t think such a thing as virtual perception exists. Now that I think about it, I suspect the term was deliberately invented to accommodate this, wasn’t it?
Half Man Half Wit
Is your knowing of what I’m writing based on anything but my writing it?
I’m gonna have to guess yes.
I see no reason why knowledge of your thoughts would be exempt from the restrictions that apply to other knowledge.
Well, I see it, and based upon that, form an internal representation of it – within my thoughts. See the dilemma? It’s turtles all the way down.
Really? I can find at least 100 scholarly papers that use the term in a somewhat similar sense, mostly in the area of cognition and AI. Maybe you lack google-fu?
But anyway, even assuming I pulled it out of my ass, the meaning is
(a) reasonably clear from the two individual components of the phrase, to anyone with a dictionary or the Internet; and
(b) explained in a paragraph **immediately fucking following** my use of it.
…so why am I suspicious that you merely want to play “Semantic Nitpicker” rather than have a real discussion here?:dubious:
When you understand Strange Loops, what consciousness is, what thought-applied-to-thought is, what mentalese is and how perception, cognition and sentience are intertwined, I will nominate you for Philosopher King.
Seems like you’re already claiming that position for yourself, to be honest.
MrDibble:
What I found was all in the area of AI, and was obviously referred to as ‘virtual’ due to being a part of a computer simulated mind rather than a human one. This is a clearly different concept than the one you described.
a) your meaning was clear from the words and their context, but that meaning seemed to refer to something for which there was no justification for belief. I decided to give you the benefit of the doubt and assume there was either a justification or a less obvious meaning I was missing. One perhaps established elsewhere.
b) I saw the explanation. That’s how I know other people using that term don’t mean the same thing you do. When others use it, they do not mean non-sensory perception. They mean perception with senses similar to those of a human, but performed by a machine. (And every now and then, someone uses it to refer to normal human perception in a virtual environment.) There is no way to reconcile these concepts with what you have described. Since my google fu is indeed weak at times, I just wanted to confirm whether this was a unique definition of yours for the purposes of this discussion alone, and whether I should just stop looking elsewhere.
I have no problem accepting your term and your personal definition… but since nobody else has ever discussed this virtual perception you describe, we are left back where we started. What reason can you offer to believe such a thing exists?
Half Man Half Wit:
No, I don’t see a dilemma there, but I think I may see your problem in your last post.
There’s your problem. IF your consciousness is just the sum of your conscious thoughts (those thoughts you perceive), then NO, it is not your consciousness which perceives your thoughts. Your mind perceives your thoughts. There would be some recursiveness (being aware of your awareness of yourself), but, as we seem to be demonstrating right here, that awareness is somewhat lacking, so it’s hardly ‘turtles all the way down’. Two or three at the most.
snatches crown
Why?
Because I pointed out the fact that there is a functional and objective difference between thinking something and knowing that you have thought it?
It’s hardly a claim of some sort of philosophical dominance to point out that we don’t understand Strange Loops, what exactly mentalese is, what consciousness is, etc…
Those are just facts that pretty much any philosophy/linguistics/psychology 101 course will point out pretty quickly. I don’t think that Plato was going for “basic competence” when he was talking about the traits of the PK.
No, Justice, you’re making a homunculus argument, whether or not you try to disguise it by changing labels; putting the onus of perception on the ‘mind’ does not resolve it, it just makes the mind the homunculus.
Umm, no. I’m really not.
Which label do you think I’ve changed?
I misread you; I thought you were claiming that you, unlike me, knew the ins and outs of those concepts. The egg’s on my face here. Unfortunately, though, this apparently means that we’re kingless.
You’ve exchanged ‘mind’ for ‘consciousness’ and then claimed that that solves the problem. I don’t think it does.
Fuck no. I just know enough to know why they’re so confusing, paradox-ridden and resistant to easy answers. I know (some of) the questions. I know none of the answers.
Consciousness Explained is an interesting book, but it’s hardly gospel and Dennett is hardly infallible. Perhaps I’ll hunt down my copy somewhere, but it’s probably not worth my time.
The facts of the matter are that seeing is simply not the same thing as perceiving, at all. Imagine a camouflaged moth on a tree. If it covers a red dot and you do not see the red dot due to the moth being on top of it, you can objectively prove that you have seen the moth. But if you notice the moth, or someone points it out to you, and you perceive the moth as a moth, then that is objectively different from the mere act of viewing. That is, there is a definite, objective, verifiable difference between seeing and perceiving, and that difference is mediated by consciousness.
That doesn’t change if we look at change/attention blindness, or how the mind constructs perception out of saccades, or what have you.
Of course, that doesn’t tell us what the various ‘parts’ of a Strange Loop actually are, but it does point out that the Loop itself is not a fallacy. It’s a reality.
I didn’t exchange ‘mind’ for ‘consciousness’. I used BOTH terms in the explanation to refer to different things. I used the quite narrow and mundane definition of consciousness established in the quote to refer to the sum of the thoughts which you perceive, and I used the more general term ‘mind’ for the total sum of all thoughts and thinking processes, both perceived and subconscious (plus whatever other bits you may think a mind should include).
I made no suggestions as to what any subconscious thinking processes are, because that would inevitably lead to homunculus territory. That’s a mistake commonly made by those who attempt to refute the “the mind has magical abilities to know itself without need of perception” claim, but just because most people make that mistake doesn’t mean any refutation of the claim is automatically a homunculus argument.
Yes, seeing and perceiving are different things, though what you are saying seems to me more to relate to recognition – you could make the same argument if whatever ‘sees’ the moth on the tree covering the dot is just a sophisticated image recognition program. It can similarly succeed or fail to recognize the moth as a moth, yet I don’t think many people would claim that the program has perception in the sense a conscious human being has, just because it can put a little tag saying ‘moth’ onto the image.
So, you write something; I see it, which leads to a change in my conscious thoughts. The mind ‘sees’ that, causing a change within itself. And then what? Can I be aware of that change, if, as per your own arguments, awareness only arises through perception? Or do I get to claim ESP in that case, then?
If that’s all that was happening, then perhaps. But consciousness can and does also notice that it has noticed, notice (and analyze/question) the difference between noticing and not noticing, attach semantic and emotional significance to the concept of ‘moth’ (and “noticing”), analyze and question that significance, etc…
We cannot ignore the Strange Loop that has to go on for us to see something, perceive it as a specific something, perceive that we have perceived it as a specific something, and so on.
It’s not as per his arguments, it’s as per fact.
In order to be aware of thought processes, you must first perceive them. They are perceived by the action of mind-upon-mind or thought-upon-thought or consciousness-upon-consciousness or perception-upon-perception if you would prefer.
You cannot get away from the Strange Loop.
You might be aware of that change… if you perceive it, which would be another level of recursiveness, of course. One more turtle in a very unstable stack. And we can only build that stack so high in this fashion. Exactly how high, I have no idea, but the very fact we’re having this discussion demonstrates that we are extremely limited in our ability to perceive our own minds. Luckily, the human mind manages to do a pretty good job of understanding the world and making choices without really understanding itself.
Yes, and that’s where the magic comes in. (And, to pre-empt misunderstandings, by ‘the magic comes in’ I mean ‘I don’t have a clue what the heck’s going on’.)
I’m not sure if there is necessarily a strange loop, though. Just as an analogy (which of course proves nothing), consider how the mother of a hen is always a hen, just like there appears to always be another higher-level homunculus watching the internal state of the lower level homunculus. This is essentially an infinite regression of hens (let’s ignore the subject of eggs for the moment) – there’s always a hen that’s the mother of the previous hen. A strange loop would be to postulate that some hen is actually the mother (of the mother of the mother of the mother etc., perhaps) of itself – on the face of it, a plausible solution. But in reality, we find another resolution of the infinite regress realized, namely that of evolution – a self-organizing process of gradual changes terminating in the present-day hen as an emergent element of the evolving system. So there’s not necessarily a strange loop wherever we need to break an infinite regress.
Yes, that’s pretty much what I was trying to get at; but then just giving up at some level and postulating a black box is too defeatist for my tastes.
Which black box would that be?
The fact that one cannot describe the exact process by which the brain interprets information?
Or the idea that the mind’s knowledge of itself just appears out of nowhere without being perceived?
One of these black boxes is unavoidable for now, as any attempt to fill it in suffers from the homunculus problem.
The other is not only unnecessary, but in fact is known to be impossible, because it requires information to come out of a black box without anything entering the box to begin with.
Well, pointing out our ignorance on the matter was my sole intention; and the fact remains that this ignorance prohibits you from categorically stating that thought must be perceived in the same way everything else must be for us to be aware of it, because if that were the case and universally applicable, then the regress would indeed be infinite and unbreakable. At some point, something else must happen; what, I have no idea.
Also, just because you or I can’t think of any other options, it doesn’t mean that there are none – it was once thought that, regarding the hens, either god did it or the whole thing came together by chance, until Darwin pointed out that there’s this whole evolution thingy that explained matters rather neatly. Perhaps our present ignorance just means that we are in need of a similar breakthrough regarding a theory of consciousness.
Who knows? Maybe consciousness is just a fabrication of our memory, and our recall of it no different from a computer readout on a viewscreen no one reads?
It necessarily is, and analogy is always suspect anyway. It’s not a case of hen[sub]1[/sub] giving birth to hen[sub]2[/sub], but of consciousness[sub]1[/sub] perceiving the actions of consciousness[sub]1[/sub] and, in turn, having that perception evaluated by consciousness[sub]1[/sub], and that evaluation perceived and then evaluated by consciousness[sub]1[/sub], and so on.
It’s a Strange Loop because, for instance, whatever level of the hierarchy you get to, in order to be apprehended it must be perceived. But perception was the basic, first-level function.
This is not, at all, a convincing argument.
There’s nothing, at all, to suggest that the Strange Loop can’t just keep looping and various ‘functions’ of mind can’t be used upon each other or themselves, for that matter.
It is an ‘infinite regress’ in a manner similar to, but different from, mathematical infinity. You can always take the previous number and add one to it. Numbers go on forever, and we know this without having to count them. Thoughts, however, do not. You can always perceive that you have perceived something, and perceive that you have perceived that you perceived it, and so on. But at some point you die of old age, so it’s not truly infinite. It’s more accurate to say that it is limitlessly recursive.
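Just to illustrate that last distinction with a toy sketch of my own (a made-up Python example, not anyone’s actual theory – the function name and the depth cutoff are invented purely for illustration): the rule for wrapping one more ‘I perceive that…’ around the last level never runs out, but any actual run of it stops at some finite depth, which here stands in for the thinker’s finite lifespan.

[code]
# Toy illustration only: the wrapping rule has no built-in limit --
# you can always add another "I perceive that..." around the last level --
# but any actual run stops at an arbitrary finite depth, standing in
# for the fact that the thinker eventually dies of old age.

def meta_perceive(thought, max_depth):
    """Wrap a thought in successive levels of 'I perceive that...'."""
    levels = [thought]
    while len(levels) <= max_depth:
        levels.append("I perceive that: " + levels[-1])
    return levels

for level in meta_perceive("the moth is on the tree", max_depth=3):
    print(level)
[/code]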