[QUOTE=SentientMeat]
Hi, TibbyToes - apologies for the delay. I’ve been in Tokyo and did try to reply but the keyboard refused to behave itself.
[/QUOTE]
Ah, the ol’ “my Japanese keyboard didn’t work” dodge, eh?
After racking my brain over the course of many sleepless nights, trying to pinpoint the most accurate and literal graphic representation of my philosophy-of-mind model, I had my eureka moment when I found this.
Alright, perhaps not quite a “literal” representation (and no more far-fetched than Kekule’s silly dancing benzene-ring snake vision, IMHO), but a hamster in a habitrail will actually suffice well enough to represent the salient features of this model (#1), and not in a cortical homunculus sort of way, either.
The moment you become conscious, your brain lays down the first of very many habitrail sections and pops a baby hamster, named Hoppy, into it. Hoppy begins his (i.e. your) lifelong journey of consciousness through the habitrail, which is created and added to, in interlocking fashion, section after never-ending section, till you both die. Hoppy is always in the most current section, never quite falling into the abyss before the next section locks into place. While he has just enough room to turn his head and see his real (“hard-wired”) past, he can’t actually turn around and revisit it. Looking forward, Hoppy can perceive and anticipate a real (“hard-wired”) future ahead of him, and, though many branches of habitrail may present themselves before him, he can only commit to one: only one hamster per closed-system trail…unless you’re someone like Eve; she had three hamsters in her cage :).
I believe there is more than one level, or echelon, involved in consciousness, and the higher-order representation (HOR) theories* seem most logical to me. In this scenario, each section of habitrail represents a level-1 echelon of consciousness: mental states, i.e. thoughts and perceptions (along with the requisite accumulation of memories). These sections may be re-created anywhere, anytime, as generic assemblies of fundamental particles corresponding to any instant of a conscious being’s mental state, perfectly in alignment with the materialist worldview.
Acting upon these level-1 mental states is the supervening higher-order perception (HOP), or Hoppy, our hamster. While identical hamsters may certainly be created, down to the exact same assemblage of fundamental particles, and are valid hamsters in and of themselves, only Hoppy was created (as a process) and hardwired (or, should we say, hard-habitrailed) from a particular brain, composed of particular particles in space-time. That makes him (i.e. you) unique in the universe, though not in conflict with the materialist worldview.
Now, to carry this hamster metaphor further: representing what I believe to be your accepted model (#2) entails placing a hamster in each self-contained section of habitrail. Each instant, your brain lays down a new section of trail, but the section is sealed on both ends. As your brain creates the next section, it puts a new hamster in it, leaving all previous hamsters to wither and die. Just as this new hamster walks with joyful anticipation toward the next section, bam, he hits the sealed see-through plastic end and begins to suffocate, tormented by the sight of a new hamster impostor in what he perceived to be his future (his fate no better than had he traveled via a Star Trek transporter with a broken arrival pod). This model fails due to cruelty…and, because there are more hamsters than necessary, it fails the cut of Occam’s razor. (OK, beneath a thinly veiled appeal to emotion, there is some valid logic.)
Recap: I believe that, while memories and mental states (i.e. level-1 consciousness) may be re-created or copied ad infinitum, when and if they are, a new and unique higher order of consciousness is created at the same time, and from that point on it must follow its own future. There is no measurable difference between an original and its copy; they are both valid individuals. The copy has a real and unique future and an imagined past (take him back in time and he ceases to exist prior to the point of duplication); the original has a real future and a real past.
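(For anyone who’d rather read code than rodents, here’s a minimal toy sketch, in Python, of how I see the two models differing; every name in it is made up purely for illustration, not a claim about how brains actually compute. Model #1 keeps one persistent process identity riding across ever-appended mental states; model #2 mints a brand-new identity every instant.)
[CODE]
import itertools

# Toy illustration only: "sections" are level-1 mental states and the
# "hamster" is the higher-order process (HOP) riding on top of them.

_ids = itertools.count(1)

class Hamster:
    """A higher-order process with a persistent identity."""
    def __init__(self):
        self.identity = next(_ids)   # unique, fixed at creation
        self.sections_visited = []   # its real, hard-wired past

    def advance(self, new_section):
        self.sections_visited.append(new_section)

def model_1(mental_states):
    """Model #1: one hamster (process) persists across every appended section."""
    hoppy = Hamster()
    for state in mental_states:
        hoppy.advance(state)
    return [hoppy.identity] * len(mental_states)   # same identity throughout

def model_2(mental_states):
    """Model #2: a fresh hamster is created (and abandoned) for every section."""
    return [Hamster().identity for _ in mental_states]   # new identity each instant

states = ["wake up", "smell coffee", "post in the thread"]
print(model_1(states))   # [1, 1, 1] -> one continuous 'you'
print(model_2(states))   # [2, 3, 4] -> a new 'you' every instant
[/CODE]
The only thing to look at is the identity column: the same hamster all the way down in #1, a disposable hamster per section in #2.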
[QUOTE=SentientMeat]
I just don’t see the need for it in an Ockham’s Razor sense. If it was true, then swapping over a few atoms for identical atoms would surely cause some kind of ‘fading’ of the original consciousness. I know that evolution doesn’t happen to do this for deep CNS neurons, but I would ask you to imagine if this was the case. The rest of the brain and body regenerates continually without any diminution of your feeling of ‘you’. If we replaced, say, 1% of deep CNS atoms with identical atoms every day, would you think ‘you’ would gradually fade away somehow?
[/QUOTE]
OK, I’ll treat you to another fun analogy to help explain my thoughts on this (and, by the way, where is our friend Mangetout? If memory serves me correctly, he used to be thoroughly enthralled by my analogies…or maybe I’m thinking of someone else :p). As mentioned before, I liken consciousness in general to a unique process (“current”) that’s switched on in one’s third trimester, from a particular brain’s circuitry (CNS neurons in perpetual physical contact with each other, passing information…cascading, recruiting and whatnot, which, of course, at a deeper level corresponds to a particular arrangement of atoms). Once switched on, this current is a unique entity that doesn’t switch off completely until death, although it may exist in different states and may even be damaged…or enhanced…during its existence. I don’t believe it can be divided with both parts remaining viable, nor replicated while remaining completely identical. In HOR terminology, this current would be the Higher-Order Perception (or Thinking, if you prefer that version).
Sallying forth with the analogy: imagine 100 people with arms stretched overhead (i.e. the atomic configuration of the relevant neuronal circuitry), supporting a large slab of Jello, one with just enough tensile strength to remain intact under normal wear and tear (i.e. HO consciousness). As long as this slab remains aloft, wiggling and relatively intact, consciousness continues unabated. So, we have 1000 fingers supporting this material thing (Jello) engaged in a higher-order process (wiggling). Certainly, you may replace one or a few support people at a time with little to no effect on the Jello or its wiggle. In fact, it’s conceivable that you could eventually replace each and every person while keeping the slab and wiggle viable. Try to replace (or remove) too many people at one time, however, and the slab will be damaged in some deleterious form or fashion. Try to replace all 100 at once, for example, and it will fall and stop wiggling, even if a new crew of 100 attempts to pick it up immediately: it’s code [del]strawberry[/del] red for the Jello. Remove key supporters and chunks of Jello begin falling to the ground; lose too many over time and even the wiggle begins to falter, resulting in the slab sometimes forgetting what type of Jello it really is (e.g. loss of memory and personal identity from senile dementia, something I’m familiar with in my immediate family). Somewhat paradoxically, once in a great while, losing a significant amount of Jello shortly after it’s made may result in a new and improved wiggle (like a beneficial harmonic or overtone wiggle), ultimately more functional than the original (something else I’m familiar with in my immediate family, interestingly enough).
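(And for the same code-minded crowd, here’s a rough numerical toy of the Jello point, again in Python; the 100 supporters come from the analogy, but the ten-per-step limit is a number I invented purely for illustration. Swap out a few supporters per step and the wiggle survives even after the whole crew has turned over; swap them all at once and the slab drops.)
[CODE]
# Toy model of the Jello analogy: the slab's "wiggle" (the higher-order
# process) survives only if enough supporters hold it up at every instant.
# The 100 supporters come from the analogy; the 10-per-step limit is invented.

N_SUPPORTERS = 100
MAX_SWAP_PER_STEP = 10   # replace more than this at once and the slab falls

def replace_gradually(per_step):
    """Replace `per_step` supporters each step until all 100 have been swapped."""
    replaced, wiggling = 0, True
    while replaced < N_SUPPORTERS and wiggling:
        if per_step > MAX_SWAP_PER_STEP:
            wiggling = False          # too many hands leave at once: the slab drops
        replaced += per_step
    return wiggling

print(replace_gradually(per_step=1))     # True  -> whole crew swapped, wiggle intact
print(replace_gradually(per_step=100))   # False -> all at once, code red for the Jello
[/CODE]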
So, there you have it: my philosophy of mind can be fully explained with hamsters and Jello. I can’t imagine why you find it so hard to take seriously. :dubious:
*I’m not necessarily an advocate of the HOP model as opposed to the HOT or generic HOR, but “Hoppy” seems a better name for a hamster than “Hotty” or “Whore”.