A memory is a physical thing.

I thought you made up that list having agreed that amoebas, hot air balloons and kicked balls don’t have memories as such? That they don’t even attain the “at least” of your agreeable (to me) list?

On reread, it seems that I misrepresented your position, Sentient. A thousand apologies. I highlight my error here so no one will miss the correction.

The memory storage is connected to the knob that sets the temperature. If Mrs. Homeowner changes the setting, the thermostat remembers. If Mrs. Homeowner changes the setting, the thermostat remembers the new one. I admit that’s not remembering much, but we don’t remember everything either.
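For concreteness, that one stored datum can be sketched in a few lines of Python. The class and names below are mine, purely illustrative, not anything proposed in the thread:

```python
class Thermostat:
    """Toy thermostat: the setpoint is its single piece of stored state."""

    def __init__(self, setpoint_f=70.0):
        self.setpoint_f = setpoint_f  # persists until changed: the "memory"

    def set_knob(self, new_setpoint_f):
        # Mrs. Homeowner turns the knob; the new value overwrites the old.
        self.setpoint_f = new_setpoint_f

    def heater_on(self, current_temp_f):
        # Pure trigger: compares the present reading to the stored setting.
        return current_temp_f < self.setpoint_f


t = Thermostat()
t.set_knob(68.0)
print(t.heater_on(65.0))  # True: below the remembered setting
```

Note that the only thing "remembered" here is the setting itself; the device retains nothing about what temperatures it has seen.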

Err … I agree entirely. You’ve said we turn heel and part ways and then said how we are completely like-minded. Physicalism is the antithesis of panpsychism.

On preview, thanks :).

Ah, no it doesn’t, I’d say. That’s just setting up a consequence of a future state. It is not a memory of a past state - there is no way to access the states the thermostat has been in previously.

Now, as with Lib’s pressure gauge earlier, it could buffer values periodically. This would attain the first three of DS’s list. But it still does not satisfy the fourth: it still has no means of processing those memories in order to compare to other memories.
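That buffering idea can be sketched as a toy (the class and names are my own invention): the device now logs past readings, so it genuinely has past states, but nothing in it ever reads them back.

```python
from collections import deque

class BufferingThermostat:
    """Toy sketch: stores past temperature readings, but never consults them."""

    def __init__(self, setpoint_f=70.0, buffer_size=10):
        self.setpoint_f = setpoint_f
        self.history = deque(maxlen=buffer_size)  # past states: written, never read

    def sample(self, current_temp_f):
        self.history.append(current_temp_f)      # the buffer fills up...
        return current_temp_f < self.setpoint_f  # ...but the decision ignores it
```

The history is write-only: the heater decision depends solely on the present reading. That is precisely the missing fourth item - no processing of memories against other memories.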

A complex feedback-control system like in a nuclear power station, say, might just about attain the bare essentials of the list, thus perhaps just scraping past our “awareness” threshold like, say, a nematode worm. Even then, it would only be “aware” of temperature which, again, I suggest corresponds very poorly to biological cognition.

SentientMeat, at the end of the OP you suggest I ask myself: “Could I be a biological computer?” That answer, from me, is assuredly ‘yes.’ I would like to get that out of the way before we delve further into the chaos of my mind here. Which I probably won’t do until later tonight because I’m terribly busy just this moment.

Ah, so it’s you with whom I disagree, rather than Sentient! :smiley:

The rheostat is not a memory device. It is a trigger device. It isn’t that the thermostat remembers to come on at 70 degrees F. It’s that the thermostat MUST come on at 70 degrees F.

[QUOTE=Liberal]
A memory, in this sense then, is the picture that the brain gives you of what it has recorded but doesn’t know what to do with. It is by being aware of that picture that you make a decision. As I said before, it cannot be possible that you react simultaneously with any event because time must transpire between the event and your recognition of it. Once you have a clue that something happened, it has already happened and is in the past. Thus, there is more than just short term and long term memory; there is also immediate memory. The immediate memory is your first conscious encounter with an event. The more significant it is, and the more it relates to other events, the more likely it is to make its way to the more mediate short term and then long term memory. But most immediate memory simply disappears, being replaced from moment to moment by new immediate memory.
[/QUOTE]

Damn. I can’t tell if I’m wandering into Deep Insight Territory, or if I’m just flailing about in Nitpick Land.

Lib, there’s something woefully oxymoronish about the concept of “immediate” memory.

I (think I) totally get what you’re saying with your driving-a-car example. There are massive amounts of sensory processing going on that we normally are not aware of, primarily because we don’t need to be aware of it. When we do, it’s like someone bursting in the door saying “Boss, what am I supposed to do with this?”

But it’s not as if you’re non-conscious, completely unaware when you’re daydreaming or just toodling along. Is the daydreaming a memory? In what sense?

Your brain forms something for you to be aware of, and if it’s significant, your brain forms a memory. But what is the thing that’s aware and examining?

Your brain? Something your brain forms to examine what your brain forms? Is it comparing some brain-formations with other brain-formations, including itself? Is it brain-formation turtles all the way down? Can I have some of eris’s gin?

This whole thing seems to be walking and quacking exactly like a Cartesian homunculus.

Immediate merely in the philosophical sense (this is, after all, a philosophical discussion) — immediate as in not mediate; i.e., no mediation between the event and the memory other than the aforementioned involuntary and spontaneous processing by the brain to present it to you. As I stressed at quite some length, it is not immediate in a time sense. The event is in the past. There can be no immediate recognition in that sense, let alone memory or awareness. It takes time for neurons to do their thing. You don’t even know there was an event until that process is finished. The event, meanwhile, is gone.

I didn’t say you’re completely unaware. I said you’re unaware “about the road”. I’m beginning to feel like Mozart being criticized by the Austrian Emperor. Maybe I’ve put too many notes in my post, and some of them are going unheard. :wink:

The brain does not operate as a monolithic entity. Different parts of it tend to different things. That’s why I wrote that it “dispatches involuntary activity to certain less cerebral portions of itself”. There’s nothing recursive about the concept.

I was more agreeing to them not having “awareness” than not having “memory”. If an amoeba works solely via – what was the term used earlier? – deterministic forces, I’d agree that it is not aware. (Although I think there might be problems with this too; it’s just too early to bring them up.)

Now I’m debating somewhat. (And, on preview, I see that you’ve already addressed some of the following.) My point would be that “memory” is necessary, but not sufficient for awareness, which is why it only occupies one item on the list. “Memory” is a static thing that, when talking about the mind as a computational process, can be referred to as a symbol. This really is important and not me quibbling, as computational symbols are not fluid – for otherwise there can be no computation. The amoeba lacks the mechanism to compare (or even store?) a memory to its world/other memory/internal state, so it’s not aware. A thermostat (as opposed to a thermometer) does have such a mechanism. It’s not that I want to count a thermostat as aware, it’s that there’s still something lacking in my proposed list.

I think one might object to my positing that memory is static; after all, human memories grow fuzzy, are incomplete, etc. However, as a bridge to computation, this is necessary. If we throw off that yoke (that the mind really is a computer, that is, a Turing machine), that’s fine. But I think that’s contrary to (or at least an incomplete accounting of) what you’d like to get to.
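The point about symbols being static rather than fluid shows up clearly in a toy Turing-machine step (my own minimal example, not anything from the thread): each transition is an exact lookup on a discrete symbol, and a “fuzzy” symbol would leave that lookup undefined - no computation could proceed.

```python
def run(tape, steps=100):
    """Minimal single-state Turing machine that flips bits as it moves right.
    Transitions key on (state, symbol) exactly; symbols must be stable."""
    table = {("flip", "0"): ("1", 1, "flip"),
             ("flip", "1"): ("0", 1, "flip")}
    state, head = "flip", 0
    for _ in range(steps):
        if head >= len(tape):
            break
        key = (state, tape[head])
        if key not in table:  # an unrecognized (fuzzy) symbol halts the machine
            break
        symbol, move, state = table[key]
        tape[head] = symbol
        head += move
    return tape


print(run(list("0110")))  # → ['1', '0', '0', '1']
```

The machine only works because `"0"` today is exactly `"0"` tomorrow; that stability is the bridge from “memory” to computation.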

I have to read the posts made while I was putting this together to see if there’s more…

Thanks for addressing this; I think I’d have misinterpreted it also.

I’m not sure this addresses the point. Sure, some responses can be automatic. But what performs the examination of memory? How are decisions made regarding those examinations? If a thermostat does not make decisions because its actions are deterministic, what is it that is non-deterministic in awareness that can make them? I think you’d reject a simple rule-based mechanism. Would you also reject an incredibly complex rule-based mechanism (such as the nuclear reactor SentientMeat proposed)? I think so, but am trying to understand – why?

You’re disappointed that Sentient doesn’t hold that rocks are aware, aren’t you? You earlier stated that you were under the impression that physicalists believed as much. Can you point me toward a prominent physicalist saying such? As to whether I believe it, or I’m just trying out the idea to see if it works, I’m not sure. My philosophical position when I’m driving a car is naive realism. When talking philosophy, my position is that of a student.

You say: “It isn’t that the thermostat remembers to come on at 70 degrees F. It’s that the thermostat MUST come on at 70 degrees F.”

No, the thermostat does not remember to come on at 70; the thermostat remembers that 70 is the temperature to come on at - it’s the POSITION of the rheostat that is the memory.

But surely you can imagine providing a thermostat (or perhaps a more complex machine) with some mechanism to formulate memories. A CPU and some RAM, perhaps. If that is not sufficient, why not? Furthermore, it seems to me that comparing the sun and a tennis ball is ignoring the issue – if we were looking for yellow spheres, they would indeed both qualify. Just because one is bigger than the other doesn’t dismiss the similarity.

Are you taking an Intelligent Design position?

If Lib doesn’t mind me speaking for him, I think he would find it supremely edifying and welcome that, yet again, we are not leagues away from each other but almost at the same place from different directions. I think this thread is going swimmingly, and a great many misunderstandings are being swept away - I hope you do too.

Again, I think that stretches the word ‘memory’ past its breaking point: it is an IF-THEN consequence, not a past value - it indicates, stores or engenders precisely nothing of past states.

As I said in a reply to Liberal above: surely you can imagine providing a thermostat (or perhaps a more complex machine) with some mechanism to formulate memories (as you say, past states). A CPU and some RAM, perhaps. If that is not sufficient to qualify as memory, why not?
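As a sketch of what that added CPU and RAM might buy (again hypothetical, names invented for illustration): unlike the plain IF-THEN device, this one’s decision consults stored past states before acting.

```python
from collections import deque

class ComparingController:
    """Toy controller that stores past readings AND uses them: the decision
    compares the present reading against the most recent remembered state."""

    def __init__(self, setpoint_f=70.0, window=5):
        self.setpoint_f = setpoint_f
        self.history = deque(maxlen=window)  # remembered past states

    def decide(self, current_temp_f):
        # Is the temperature already rising relative to the last stored state?
        rising = bool(self.history) and current_temp_f > self.history[-1]
        self.history.append(current_temp_f)
        # The decision now depends on past states, not just the present:
        # if the temperature is already rising, don't fire the heater.
        return current_temp_f < self.setpoint_f and not rising
```

Whether this kind of memory-consulting machinery is sufficient for “memory” in the sense under debate is, of course, exactly the question.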

Yes D-S: I proposed the buffering mechanism myself not 10 posts ago, remember!

(To follow on from that, I think I was overly hasty ascribing memory to nematodes: roundworms are really just big amoebas, cognitively speaking. I think the absolute limit is likely to be spiders or insects, although the ‘nerve net’ of jellyfish just might comprise some rudimentary memory access, I suppose.)

I don’t think a whole lot is presently known about exactly how the brain works or what gets divvied out to what, but current opinion holds that it seems to go on in the lateral frontal cortex: the ventrolateral is concerned with updates and maintenance; the dorsolateral is concerned with selection, manipulation, and monitoring; and the anterior is concerned with secondary goals or processes. But in any case, one can deduce that some part of the brain handles it. A decision cannot be anything other than a response stimulus to an initial stimulus. Synapses fire to make you aware of an event, and more synapses fire in reaction to it. The reaction might take place in the frontal lobe in the form of an inference, or in the amygdala in the form of a religious experience, or in the medulla oblongata in the form of pain and then off to the language centers (varying by age and other factors) into an interjection.

Regarding determinism, I’ve tried to draw the line as clearly and succinctly as I can: moral decisions are expressions of free agency, and all else is fart-bang — deterministic. The brain will fire how the brain will fire. This or that synapse will do its thing whether in the brain of a man or a monkey. What sets man apart is his moral essence. That is the realm in which he is free — morality. But in the physical realm, he is just another animal, beholden to the laws of nature and physics as any other. When he touches a hot stove, he will recoil. When he curses his wife and blames her that he touched it, he is making a free moral decision. Man is a creature with a dual nature, one essential and the other merely existing.

I agree, understand and concur with the idea that sensory and/or pre-cognitive processing must take place before awareness of an event can occur. I have no problem calling this “sensory” memory, as long as it’s clear that there’s no awareness of it; at least, not yet. If this discussion drags on, I’d like to keep “sensory” memory and “post-awareness” memory cleanly separated.

In the example you gave, the end result of non-aware processing came into awareness by displacing a daydream. Did the daydream come into awareness by displacing something else you were aware of? It sounds as if you’re saying that “sensory” memory becomes awareness by displacing awareness. As if there were a part of your brain that acts as a container or pool: some things are temporarily in awareness, perhaps to be transformed into memory, perhaps to be nudged out by new input (Digital Stimulus touched on this a bit). Am I paraphrasing your stance correctly here?

Yes, I do. But the natural follow-up question is central to the debate of awareness (I’m not yet ready to equate awareness with cognition). If a thermostat/nuclear reactor has memory and can change its state based on that memory, does it qualify as aware?

Here, I think, is where you and Liberal really do part ways. I think (if I’m making a mistake Liberal, please correct me) he wants to say that there is something not-physical in (human) awareness that he’s characterizing as “morality”. I think you are trying to establish a purely physical basis for awareness. Is that right?

There are some similarities between computers and brains, but there are fundamental differences as well, just as there are between the sun and the tennis ball. And they are equally far removed. A CPU and some RAM is no more like a brain than a tennis ball is like the sun. For one thing, computers don’t emote. (The importance of emotion to intellectual processes is becoming widely known now.) They do not attach significance to results. They do not have experience, but merely history. They do not learn. (Unless you introduce an artificial randomness, for example, a computer will fall into the same chess trap over and over no matter how many times it encounters it.) They rely solely on logic gates, whether boolean or fuzzy, whereas brains rely on judgmental instinct and other features intrinsic to animal organs. They are not alive. RAM circuits do not replicate the way brain cells do. I could go on and on listing the differences, and yes, I know that you could list a lot of similarities. But the computer in the thermostat is just a fancy rheostat. The thermostat is not remembering anything; it is just doing what it must do when the computer does what IT must do.

The huh…? :smiley: No. Whether God designed the universe or not, it serves His purpose all the same.

Lib, upon reading your response to Digital Stimulus, let me try a re-paraphrase of what I think your stance is:

Crap just happens. Sensory stimulus comes in, motor output goes out. The only thing resembling the “container or pool” in my previous paraphrase would be a sort of holding tank for moral examination; a “staging area” for moral decision-making. Is that close?