The discussion seems to be wandering from the original question, which if I remember correctly is: what is Consciousness? Manipulation of abstractions is not a prerequisite to Awareness (or more precisely, to recognition of Self and of one's own mental, as well as physical, existence), and near-death experiences and intuition are red herrings. Quantum mechanics (like "Society" in the opposite direction) is probably involved, but at a level of analysis too far removed to concern us here (that is, the biochemistry is much more immediate).
Development and maintenance of Self-Awareness, or Consciousness, is one of the major things which separates simple minds from complex minds. It is explainable in terms of simple mental processes layered recursively on top of each other until they become complex and the mental fiction “This pattern is remembered as ‘I’ because it is always here and can be traced over time; all else is not” develops on top of the unavoidable physical truth, “This is the body I feel and control, and that’s not.” And yes, that’s all that’s needed to explain it (though I point out that Occam’s Razor does not generate proof positive).
For those with the fortitude, an explanation follows. I may not make myself clear, but I hope that it’ll at least give an idea. It’s not rocket science, and it doesn’t require magic or Goddess to explain.
I earlier proposed that "Consciousness" or "Awareness" is not a Thing but rather a state, like "aliveness" or "happiness." It is a name given to some of the emergent properties of a properly-functioning, sufficiently-complex Information Processing System or "Mind" at work. Historically these mind states and functions have been associated with a Central Nervous System (biological brain and spinal cord, and possibly more), but there is no reason to think they will always be so limited (Cecil's point). It could also be said that we recognize certain states and the functions/processes/mechanisms that produce them as "consciousness," but deny such categorization to other states et al.
So: what sort of mechanisms might be important to Consciousness? At a building-block level I would say Memory, Sensory (or other) Inputs, and some form of Decision-Making or Comparative process.
Okay, great, so our immune systems are conscious? They demonstrate all of those abilities.
Well, obviously our immune systems are not Aware within the meaning of the English word, so a definition of "Consciousness" requires some higher order or orders of function and complex process. Examples incorporating the basic building blocks could be Prediction and Recursion. Prediction is the comparison of (i) changing data over time and/or (ii) new sets of data against learned data in order to anticipate a coming state; examples might be (i) predicting the flight path and landing zone of a baseball, or (ii) recognizing that this man one has never met before is about to lose his temper. Recursion would be those processes by which a Mind checks itself and its "outputs" (decisions, behaviors, learnings, predictions) not only against new data and predictions but against desired states, etc. If you will, it is basic to the process of the "Mental Editor."
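Since I'm talking in terms of mechanisms, here is a toy sketch (in Python, purely illustrative; every class, name, and number is invented for the example and none of it is meant as a model of real brains) of what I mean by Prediction built out of Memory, Input, and Comparison, plus a recursive layer that watches the predictor's own outputs rather than the world directly:

```python
# Toy illustration only: "prediction" as comparison of remembered data
# with new data, and "recursion" as a process that monitors the mind's
# own outputs. All names are hypothetical.

class Predictor:
    """Remembers recent observations and extrapolates the next one."""
    def __init__(self):
        self.memory = []                      # the Memory building block

    def observe(self, value):                 # the Sensory Input building block
        self.memory.append(value)

    def predict(self):                        # Comparison of past vs. present
        if len(self.memory) < 2:
            return None
        trend = self.memory[-1] - self.memory[-2]
        return self.memory[-1] + trend        # e.g. where the baseball will land


class Editor:
    """Recursive layer: checks the predictor's outputs against outcomes."""
    def __init__(self, predictor):
        self.predictor = predictor
        self.errors = []

    def review(self, actual):
        guess = self.predictor.predict()      # the mind's own output...
        self.predictor.observe(actual)
        if guess is not None:
            self.errors.append(abs(actual - guess))

    def doing_well(self, tolerance=1.0):
        # Judging its own performance, not the world directly.
        return bool(self.errors) and sum(self.errors) / len(self.errors) < tolerance


# Usage: track a "ball" falling in a roughly straight line.
mind = Editor(Predictor())
for position in [10.0, 8.0, 6.1, 4.0, 2.2]:
    mind.review(position)
print(mind.doing_well())   # True: its predictions about the world hold up
```

The only point of this is that "checking your own outputs" is a different kind of operation from "checking the world," and that it can be stacked on top of the simpler operations.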
The level and complexity of functioning are critical to our characterization of any mind and its outputs, and to whether we would call that mind "Conscious." A basic mind (say, in those flies mentioned earlier) takes Inputs and generates, or incorporates them into, a transitory State which then (usually) engenders an Action (such as flying towards the light/green/warmth). (No "intention" necessarily involved.) When the actions generated by a simple mind are prevented (the fly hits the window), that mind doesn't have the complexity even to remember that it just tried that behavior, let alone to learn to recognize that there is such a thing as "glass." Presumably the Memory or Comparative powers necessary are lacking, and it just takes its new set of sensory inputs and tries again.
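To make the contrast concrete, here's an equally toy sketch of the fly-level mind: inputs produce a transitory state, the state produces an action, and nothing persists between moments, so the same window gets hit forever. (Again, everything here is invented for illustration, not a claim about actual fly neurology.)

```python
# Toy contrast only: a "mind" so simple that its state is transitory,
# so it cannot remember that the action it just tried has failed.

def simple_fly(senses):
    """Input -> transitory state -> action; nothing persists between calls."""
    state = "light ahead" if senses["brightness"] > 0.5 else "dark"
    return "fly toward light" if state == "light ahead" else "wander"

window = {"brightness": 0.9}      # bright, but behind glass
for attempt in range(3):
    action = simple_fly(window)   # same inputs, same action, every time
    print(attempt, action)        # it never learns that "glass" exists
```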
And below even this level we could argue pretty successfully that plants don't have Minds at all, for despite the fact that they can detect infections, seal off damage, grow towards water or light, determine what season it is, etc., there is no central clearing or decision-making point with alterable States dedicated to producing "plant behaviors." Plants don't really learn or remember or decide; their behaviors are the result of their natural, predetermined functioning, which reacts and/or produces different future "behaviors" more due to accident or gross physical changes (or, at the intergenerational level, due to evolutionary changes) than due to "experience."
In contrast, a relatively complex mind NOT ONLY interprets Input (say, modulated sounds or finger patterns - "speech"), compares it to previously encountered inputs (interprets the consistent past contextual meaning of said "speech"), generates a contextually-associated "understanding" of the intended meaning of the speech, makes a whole series of second- and higher-level associations (that's my wife's voice; she was happy this morning; in the past when she used that tone it meant more than she is now saying with words; etc.), and makes predictions (if I continue to act this way the outcome may be similar to the previous times) - but it is a mind that has become SELF-Aware. (Thus the 'I' in "if I continue to act this way...") It's about the 'I', not the speech - and not abstractness or the degree of separation between the intended communicative concept and a physical object or process. (We could argue that almost all speech includes varying levels of abstraction, even just at the grammatical level.)
The complex mind then recognizes a sense of Self vs. Other in a detailed way, over and through time and in a myriad of circumstances. This is where Recursion starts to come into play. The truly complex mind can not only compare the current (complex and evolving) present with the past and infer associations from those experiences, can not only analyze the evolving present for subtle variations and opportunities to move events towards a desired future outcome or state and then further evaluate the results of interim actions in terms of their positive or negative influence on the desired outcomes (this could almost apply to a frisbee dog gauging the flight of its target and adjusting for the wind as much as it could apply to a social encounter), BUT it can further make the (possibly false) distinction between outside events and internal states and can modulate itself as a tool for behavior control. ("I'm getting angry. I can't afford that. I must control myself. Think of something pleasant... It's not working; okay, depart this vicinity. But this is my boss; I can't just walk out. Okay, think of the long-term benefits. You can't afford to be fired now. Just get through this and you will...etc." In this example, the long-term benefits do not exist, other than as a predicted or imagined occurrence within the mind. They exist only "in" an assumed future and may or may not ever come to pass, yet they can have great power over the state of the Mind involved.)
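And here, in the same toy style, is that recursive self-modulation: the "mind" inspects and adjusts its own internal state, steered by a payoff that exists only as an imagined future. (All names and numbers are, once again, made up purely for illustration.)

```python
# Toy sketch of the "Mental Editor": a mind that treats its own internal
# state as something it can observe and act on, steered by an outcome
# that exists only as a prediction. Names and numbers are hypothetical.

class SelfMonitoringMind:
    def __init__(self):
        self.anger = 0.0
        self.imagined_payoff = 10.0   # the "long-term benefits": exist only as a prediction

    def provoke(self, amount):        # an outside event
        self.anger += amount

    def regulate(self):
        # The recursive step: the mind inspects its own state, not the world.
        while self.anger > 1.0:
            if self.imagined_payoff > self.anger:
                self.anger -= 1.0     # "think of the long-term benefits"
            else:
                return "leave the room"
        return "stay and keep talking"


boss_meeting = SelfMonitoringMind()
boss_meeting.provoke(3.0)             # the boss says something infuriating
print(boss_meeting.regulate())        # "stay and keep talking"
```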
Basically that's it: Consciousness of Self just requires a mind complex enough to remember and recognize its own separateness and its own continued existence over time. And of course if Awareness separates very simple minds from very complex minds, there is also a range on the continuum from less-complex to more-complex. I'd wager rats are Self-Aware on a basic level, dogs more so, humans (perhaps) most of all. Elephants? Of course. Human (or any other) infants? At first not really, but as their brains and minds develop, yes, within the limits of their species. So while the question may be open as to the abstractive ability of non-human ape minds, I have seen no one seriously propose that they are not Aware. Evidence strongly suggests that elephants and apes (at least) recognize and mourn Death - and while confusion at a sudden and complete Difference is understandable, mourning requires more than that: it requires remembering a particular individual as a continuing presence and recognizing that this presence is now gone, which is exactly the sort of Self-and-Other-over-time awareness described above.