Take what we know about mechanisms of the brain out of it. How about the brain just qua physical object, development, genes and so on, do you think that lends any weight (to others having qualia)?
No, you’re getting on the wrong path, man. Think about consciousness – it’s not a thing you put over there and put a probe into. That’s why I was right about taking a stand in terms of a definition of truth – we’re never going to figure out reflection as a property of consciousness. Except, probably, a formal recursive definition, but we’re talking about involvement in the world of realia.
Man, no.
As I said, I don’t think I can make any headway into what you are saying. But if you wish to persist, tell me what difference a “definition of truth” could make. Tell me, of two distinct such definitions, how they could differ on the question of qualia.
OK, that one I can answer. FTR I have to get packing pdq. But why is a definition important? That’s easy. Think of something you “know.” Then think of how you know it. How do you know that you know it? It’s from a formal definition. It’s all straight from Euclid.
And, oddly enough, that’s why that reflective “property” of consciousness is important. “It” “is” not very much like other things in the world. Wait, I might have to jet, but if you want philosophy of mind stuff – Daniel C Dennett has a good book from a while ago.
What, exactly, is your question? It sometimes helps me find a solution when I can rephrase a question more precisely.
There are many items of biology I can objectively measure – temperature, heart rate, and so on; qualia are not among them. I infer that others who are like me and behave like me are experiencing things as I do. Of course, what I am actually observing is the most superficial. What I propose should matter for that inference of “like me means experiencing something similar to what I experience” is the nature of the information-processing structure and its complexity, not superficial behaviors. Unfortunately, as of yet we only have some interesting speculations as to what features of our information-processing systems lead to qualia and conscious experience.
To those who would take the position that humans experience qualia while other animals do not (if anyone would take such a position), I would ask at what point they believe the primate-hominoid line began to experience qualia. Did our last common ancestor with the apes have qualia? Did Homo habilis? Australopithecus? At what point in the line did we suddenly have a ghost installed in the machine?
Take two distinct “definitions of truth”, and tell me how these different definitions might give different answers to the question “do animals experience qualia?”
OK, a minimalist Tarskian semantics is like “it is like” just where it is like. But an expansionist, robust-beyond-all-reason definition of truth is something like “where is that last rabbit at the foothills of Mont Blanc,” or the tallest Finnish spy, or the author of Waverley. A correct truthmaker-realist definition accounts for “fiat” declarations while not abandoning the necessity of formal descriptions. This way, one can have all the good of formal language, while…wait for it…using the world as a kind of map, a la Husserl’s third LU or Ingarden’s plot at the back of The Controversy over the Existence of the World. It’s the only way I can find that accounts for diverse phenomena. Basically, since you sound like you had some schooling, kind of a Thomistic account of the world with some jazz.
This does not answer the question. I can barely state it any more clearly than I already have. What different theories of truth would give different answers to the question “do animals experience qualia?”
The nature of this question is a simple one; it is about whether it is *true* that animals experience qualia. To answer the question requires giving one definition of truth where it follows that it is true that animals experience qualia, and another definition where it is not-true.
OK I get your question. But the reason I’m having difficulty is that your model is a little bit – to my mind, odd and even baroque.
Why in the hell should there be qualia in the first place? There are historical reasons for coming up with the empirico-idealist perspective, but to start by ignoring all that and just begin with some formal definitions (well, doesn’t have to be a formal language – plus at some point you end up where I am dealing with modalities and that’s where my point of view is coming from), we’re back in science and it all makes sense.
What model? I haven’t said much in this thread, and what I have said shouldn’t have committed me to much at all (if anything). I also don’t see how this has any bearing on the question I asked. It was straightforward and not loaded.
Who fucking knows. I haven’t even said anything about whether I think the concept makes sense or not.
I have no idea what you are trying to say, nor why you went on this little rant. I asked the question because, given an actual answer, I might understand something you were saying. As it is, I don’t.
I’m not trying to argue with you, man. Well, in a sense, but not making a rant. OK, what’s a model. My understanding of the term is … remember Rutherford and the atom with little spinny things around it? I’m just questioning whether the model of sensibilia and empirico-idealist qualia should be one accepted without question. I happen to think it’s a deficient model for the manner in which conscious entities relate to the world of which they are a part.
So. Now we’re back into a question about materialism. You dig, right? Some of this is just a convenient shorthand for talking about stuff. It’s all good – just some different ways of “expressing” “things” and now we’re back where we started.
As it is I’m going to have to think a little bit more about phrasing the Sachverhalte correctly – bear with me. But I’ll try to be more clear.
I cannot, with absolute certainty, say that humans other than myself experience these things.
I agree with Jaledin
Well, I’m going to assume that, say, rocks don’t experience things. I’m going to observe that I do, and I’ll be generous and say all humans (or at least average, neurotypical, non-vegetable humans – we can quibble about brain dead people I suppose) are thinking experiencing creatures. For simplicity, we’re going to also assume matters that make the question moot are false (everything is an illusion, you don’t experience because you don’t have free will so “experience” is meaningless etc).
From here, we observe that there must be some division between experiencing and non-experiencing. Not necessarily a line, it could be a gradient, though I’m not sure how exactly to define “kind of experiences” in a fuzzy logic sense. So the question becomes less clear cut. At what point does an object feel experience? That is the tricky question.
Since we observe that there is some division between an experiencing and a non-experiencing thing, we can say that there might be something special about humans. Even if you’re a materialist you can say that maybe there’s something inherent in human brain structure that makes us experience and other animals not. Or maybe some other animals have it (cats? dogs?) and some don’t (worms?), maybe anything with a brain does, maybe anything with a rudimentary nervous system does. Maybe anything living does.
I don’t think, even as a materialist, that accepting that humans experience commits you to accepting that non-human animals obviously experience too.
Of course, until we get a good physical definition of what, exactly, experience is we’re kind of stuck here.
I don’t think this question is as deep as you’re making it…It’s kinda obvious
I would also expect there to be pretty much a gradient, but if animals experience qualia it may be that they experience richer qualia in some cases. So putting us at the top of a gradient may be misleading in some cases.
(Obvious example: vision. If animals perceive colour as we do, the world would look much more colourful to them than it does to us, because I think tetrachromacy is most common outside mammals, with a genuine 4D colour space. Primates are trichromats, but our colour space is not even full 3D)
One useful division is on the basis of discretion. In humans stimuli that inexorably lead to a particular behaviour have no qualia (e.g. reflexes, autonomic functions).
So where we see that an organism always responds to a stimulus in exactly the same way, we might assume there’s no associated qualia.
…But it’s certainly too early to have confidence in such an assertion; it’s just another best guess.
Not at all, it’s [del]basically[/del] the Hard Problem of Consciousness. A very open question with absolutely no clear answer.
Of course, but stimuli don’t happen in a black box either. If something reacts differently to the same stimulus twice there are many possible reasons for that. Perhaps it just reacts differently when the same thing happens to it twice in a preprogrammed fashion, non-experiencing creatures can still learn, maybe it figured out another way to respond. And on and on. But I will grant that the whole “reflexes vs perceived stimuli” seems like a decent starting point for a definition.
I didn’t mean for that to sound so close-minded and bland; I mean the point at which one defines ‘experience’. One who is conscious, in my mind, is somebody who has self-awareness. I don’t believe animals are ‘conscious’, or at least not nearly as conscious as us, because of the way they behave – they behave stupidly, really. I don’t mean to say that animals don’t feel pain, but do they know they’re feeling pain? That is a deep question.
I agree - in fact, I’d say the gradient doesn’t just happen below the human level - some humans are acutely aware (of self and stimuli), others less so. Some of this is no doubt the result of habit and conditioning, but I don’t think it’s too much of a stretch to say that humans represent a region of the spectrum of awareness, rather than a point (or the endpoint).
I know plenty of people who claim to think that way right now. They are, of course, wrong, but they definitely assert what you say they don’t.