  #51  
Old 02-11-2019, 03:32 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Mijin View Post
The OP mentions consciousness. Well, there's no mystery there: it evolved as a feature of intelligent life; it's something brains do.
Lots we don't understand about consciousness, but not how it originated.
Never understood why consciousness is such a big problem for some.
Are we happy to say that other creatures have a degree of consciousness a little below ours, and others at a level below that, and others below that, back through other apes, primates, higher mammals, etc.?
If a physical or mental capability confers an evolutionary edge, then it'll be selected for. Now it just so happens that this capability ultimately allows the possessor species to notice it and ponder it; capabilities such as the sight processing of a raptor or the smell-sense of canines are equally incredible but don't cause the possessor to think too hard about them.
__________________
I'm saving this space for the first good insult hurled my way
  #52  
Old 02-11-2019, 06:17 AM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
Never understood why consciousness is such a big problem for some.
Are we happy to say that other creatures have a degree of consciousness a little below ours, and others at a level below that, and others below that, back through other apes, primates, higher mammals, etc.?
If a physical or mental capability confers an evolutionary edge, then it'll be selected for. Now it just so happens that this capability ultimately allows the possessor species to notice it and ponder it; capabilities such as the sight processing of a raptor or the smell-sense of canines are equally incredible but don't cause the possessor to think too hard about them.
Well, there's a reason that the question of how conscious experience arises is generally known as 'The Hard Problem' in philosophy. One way of approaching the difficulty is to try and imagine how you would go about explaining to a blind person what the experience of seeing red is like: if consciousness is reducible to the physical, then there should be a story, no matter how complicated it might be, that you could tell, such that the blind person knows as well as you do what red looks like. Such a story exists, and can be easily conceived of at least in outline, for anything else---say, how a gigantic supercomputer performs its calculations. You could never tell that story, or hold it in the mind all at once, but it's obvious that there is such a story.

However, with consciousness, nobody has yet even come up with a hint of how such a story plausibly might go, and indeed, pretty much everyone's initial reaction is that this story is impossible: a blind person will never know what it's like to see, because there is no amount of third person knowledge that suffices to transfer first person experience. But if that's the case, then consciousness is fundamentally different from anything else studied in science, and conceived of in physical terms.

Furthermore, this serves to call the evolutionary story into question: after all, it's perfectly possible to conceive of every action a living organism performs, in the complete absence of any conscious experience. Take stubbing your toe, and crying out in pain: a signal is transmitted from your toe to your brain, which activates the right sort of muscles to make the air vibrate just so. At no point in the explanation do I have to appeal to the conscious experience of pain: it's all just signals flipping switches, like dominoes toppling one another.

But then, evolution must be blind to whether any conscious experience crops up at all: it can only select for function, but functions can be performed unconsciously (we all experience that---from being 'in the flow' to phenomena like sleepwalking, or more extreme cases like blindsight). So if that's right, consciousness can't be evolutionarily selected for---at most, it's thus an accident, something we might just as well have ended up without.

Now, I think these questions can be answered; but they're nevertheless difficult, and I think it only hurts us if we try to handwave them away (much like invoking 'quantum fluctuations' for the origin of the universe).

Last edited by Half Man Half Wit; 02-11-2019 at 06:19 AM.
  #53  
Old 02-11-2019, 07:15 AM
SamuelA is offline
Guest
 
Join Date: Feb 2017
Posts: 2,902
Quote:
Originally Posted by Jay Z View Post
But even if it's impossible for there to be nothing due to sub-atomic rules, at some level there will be no explanation for WHY those rules exist. It is impossible to keep passing the buck. At some point, things exist for no reason, God or no God.
Precisely. Moreover, the rules don't just say any arbitrary thing. They are exceedingly specific about what kind of substance can come from nothing, specify countless very specific details about our universe, and so on. It is hard to see why these rules, and not some other set of rules, would spontaneously make themselves exist or have always existed.

This is ironically why theories of the multiverse, or theories where every possible permutation of our universe must also exist somewhere, do seem more plausible than "there is nothing but void and nothing but our own universe, which will in some number of trillion years cool itself to nothing but empty space". Even though this is all we can directly observe.

Last edited by SamuelA; 02-11-2019 at 07:15 AM.
  #54  
Old 02-11-2019, 07:20 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
if consciousness is reducible to the physical, then there should be a story, no matter how complicated it might be, that you could tell, such that the blind person knows as well as you do what red looks like. Such a story exists, and can be easily conceived of at least in outline, for anything else---say, how a gigantic supercomputer performs its calculations. You could never tell that story, or hold it in the mind all at once, but it's obvious that there is such a story.

However, with consciousness, nobody has yet even come up with a hint of how such a story plausibly might go, and indeed, pretty much everyone's initial reaction is that this story is impossible: a blind person will never know what it's like to see, because there is no amount of third person knowledge that suffices to transfer first person experience. But if that's the case, then consciousness is fundamentally different from anything else studied in science, and conceived of in physical terms.

Furthermore, this serves to call the evolutionary story into question: after all, it's perfectly possible to conceive of every action a living organism performs, in the complete absence of any conscious experience. Take stubbing your toe, and crying out in pain: a signal is transmitted from your toe to your brain, which activates the right sort of muscles to make the air vibrate just so. At no point in the explanation do I have to appeal to the conscious experience of pain: it's all just signals flipping switches, like dominoes toppling one another.
All of that is pure assertion; the so-called Hard Problem of consciousness seems no harder to explain in evolutionary terms than any other specialised function. Sight, smell, hearing, vibration, magnetism, etc.: we have no problem in accepting a continuum of capability and system sophistication for all of those, but somehow consciousness is different? No, I don't buy it at all. In fact, if consciousness confers evolutionary benefit we should expect to see a spectrum of capabilities across the species, and indeed we do.
Also, just because philosophers have given the problem a Capitalised Name doesn't give it any special status.


Quote:
But then, evolution must be blind to whether any conscious experience crops up at all: it can only select for function,
If consciousness is, as is thought, an emergent property of a complex, physical brain, then mutations will result in differing forms of consciousness, which will lead creatures to behave differently in the world. The outcomes of that behaviour will be positive, negative or neutral. Natural selection does the rest. It doesn't seem like a problem to me at all.

Quote:
but functions can be performed unconsciously (we all experience that---from being 'in the flow' to phenomena like sleepwalking, or more extreme cases like blindsight). So if that's right, consciousness can't be evolutionarily selected for---at most, it's thus an accident, something we might just as well have ended up without.
That simply does not follow. The ability to sub-contract behaviours to the unconscious brain leaving higher brain functions for other purposes seems like a potential heritable trait that can easily be beneficial and so be selected for.

Last edited by Novelty Bobble; 02-11-2019 at 07:23 AM.
  #55  
Old 02-11-2019, 09:02 AM
Mijin is offline
Guest
 
Join Date: Feb 2006
Location: Shanghai
Posts: 8,802
Novelty Bobble and Half Man Half Wit, I think you're talking past each other.

Hopefully we can all agree that:

1) It's pretty clear how consciousness evolved
2) There are significant aspects of consciousness that we cannot model in detail yet, i.e. that we don't understand very well yet

But even for (1), I was careful here, and in my previous post, to say how it evolved, not why.
Because I would tend to agree with Half Man Half Wit that at least in some cases it's not clear what subjective experience adds from a survival point of view. It may, at least partly, simply be a byproduct of useful traits.
Novelty Bobble alludes to some hypotheses about the utility of subjective experience, but I don't think anyone doubts there are candidate explanations; it's just that we have no idea which, if any, are right.

Last edited by Mijin; 02-11-2019 at 09:04 AM.
  #56  
Old 02-11-2019, 10:18 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Mijin View Post
But even for (1), I was careful here, and in my previous post, to say how it evolved, not why.
There doesn't have to be a "why" at all.

"why" is not necessarily a meaningful question because it seems to assume a purpose when really for any evolved feature "why" can be answered by saying "it is useful" and that is it. No further explanation needed and the "how" is the interesting part.
  #57  
Old 02-11-2019, 10:36 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Mijin View Post
would tend to agree with Half Man Half Wit that at least in some cases it's not clear what subjective experience adds from a survival point of view. It may, at least partly, simply be a byproduct of useful traits.
Seems pretty trivial to me to think up benefits. A subjective experience of the world may prompt a complex conscious brain to approach problems in a different way and create better strategies and designs for food and shelter or sexual display, or to enable higher-order thinking about the desires and motivations of oneself versus others.

But sure. It may also be a misfire, an outcome of having a big brain that is either neutral or helpful.
  #58  
Old 02-11-2019, 10:41 AM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
All of that is pure assertion; the so-called Hard Problem of consciousness seems no harder to explain in evolutionary terms than any other specialised function.
I didn't merely make assertions, but proposed arguments, such as explaining what sight is like to a blind man.

Quote:
Sight, smell, hearing, vibration, magnetism etc. etc. we have no problem in accepting a continuum of capability and system sophistication for all of those but somehow consciousness is different?
I'm not saying that consciousness doesn't occur on a spectrum; I'm saying that as yet, nobody has any idea at all how matter gives rise to conscious experience. To all appearances, we can tell a complete story of everything an organism does, on the level of individual molecules if need be, without mentioning consciousness at all, or being able to deduce what its conscious experience is like from just that story.

Quote:
In fact, if consciousness confers evolutionary benefit we should expect to see a spectrum of capabilities across the species and indeed we do.
But that's the problem: evolution is selective on the level of behavior; but behavior and consciousness are not obviously coupled---just because something acts a certain way is not sufficient to conclude that it is conscious, much less in what way.

Quote:
Also, just because philosophers have given the problem a Capitalised Name doesn't give it any special status.
Fair enough, but it does mean that those with the biggest claim towards being experts on the matter consider it kind of a tough nut to crack, which one would think at least merits some examination of the reasons why they do think that way.

Quote:
The outcomes of that behaviour will be positive, negative or neutral. Natural selection does the rest. It doesn't seem like a problem to me at all.
Consider the causal closure of the physical: that you scream when you stub your toe is fully determined by the facts of certain muscles moving, electrical signals racing along nerve fibers, neuron firings, and so on. Nothing else needs to be said---in particular, nothing about the fact that you feel pain needs to be said. You could show the exact same behavior, while feeling no pain at all. Indeed, it would be simple to construct a 'robot', that does nothing but emit a certain sound if certain sensors are triggered by an appropriate stimulus. Nobody would consider such a robot conscious.

But that's true for the sum total of your behaviors. Evolution can select for avoidance behavior---i.e. you turning away from a harmful stimulus. But, since that on its own, to all appearances, does not determine whether you feel pain, feel nothing (are as unconscious as the simple robot), or feel something else entirely, evolution can't select for you feeling pain.

Take the famous 'inverted qualia' thought experiment: my behavior would remain the same if, say, my color experiences were inverted. What I used to call red, I now call green, and vice versa; since I still call the same wavelength entering my eye by the same color name, nothing about my behavior will change. Evolution thus is blind regarding whether I experience a red-sensation while saying 'I see red', or whether I experience (what I would have earlier called) a green-sensation. Thus, my seeing red is not adaptive.

It gets worse than that. It's a bit more subtle, but once you think about it, you realize that it's deeply puzzling that our experiences should be appropriate to our circumstances at all. Evolution can't differentiate between me fleeing from a predator and being terrified, and me fleeing from a predator and being content, or me avoiding a pain stimulus and being in pain, and me avoiding said stimulus in a feeling of bliss. Again, that's just causal closure: that the particular pattern of neuron firings coupled to stubbing my toe should feel painful as opposed to blissful doesn't make a difference, as long as that pattern leads to me retracting my foot and shouting 'ow'.

Quote:
Originally Posted by Mijin View Post
Novelty Bobble and Half Man Half Wit I think you're talking past each other.

Hopefully we can all agree that:

1) It's pretty clear how consciousness evolved
I don't really think this is clear at all. How does evolution select for pain to feel painful? It can only select for behavior following a pain-stimulus being such as to minimize harm; but that doesn't entail anything about how it feels, or indeed, that it feels like anything at all.

What do you think we couldn't do if we were completely unconscious automata?

Last edited by Half Man Half Wit; 02-11-2019 at 10:42 AM.
  #59  
Old 02-11-2019, 11:19 AM
Exapno Mapcase is offline
Charter Member
 
Join Date: Mar 2002
Location: NY but not NYC
Posts: 30,849
Quote:
Originally Posted by Half Man Half Wit View Post
It's really this sort of answer that has to be retired. Quantum uncertainty doesn't really entail anything about virtual particles becoming real; virtual particles are a mathematical bookkeeping device that only exist in approximations to a full theory that help us calculate stuff.

All the sort of 'universe from nothing'-arguments in physics going back to Edward Tryon really start the same way, by defining some origin state as 'nothing' and then going on from there. But of course, no state is 'nothing'; there is no 'nothing'-state. The vacuum state in a quantum field theory is just that, not nothing. It has properties, such as a nonzero energy expectation value; nothing doesn't have that. Indeed, it's literally meaningless to say it does. Likewise with 'nothing is unstable'.

Besides, even if these ideas were sensible, they would fall short of the goal of explaining the universe: as soon as you rely on the laws of quantum mechanics, those themselves become explananda. So if quantum mechanics truly were to explain how 'something can come from nothing', that would just shift the question to 'why quantum mechanics?'.
If you say that "The vacuum state in a quantum field theory is just that, not nothing", I'm not seeing how that differs from what I said, which is that "nothing" is impossible.

It may or may not be true that the universe started from a vacuum fluctuation. Apparently some physicists postulate that, even if you don't. Even if it someday is shown not to be the answer, it works to gainsay the nonsensical "something comes from nothing."

But it does not merely shift the question to "why quantum mechanics." If there is always something, then there must be some scientific explanation. What that explanation is may be interesting but the explanation does not create the reality.
  #60  
Old 02-11-2019, 11:33 AM
Mijin is offline
Guest
 
Join Date: Feb 2006
Location: Shanghai
Posts: 8,802
Quote:
Originally Posted by Novelty Bobble View Post
There doesn't have to be a "why" at all.

"why" is not necessarily a meaningful question because it seems to assume a purpose when really for any evolved feature "why" can be answered by saying "it is useful" and that is it. No further explanation needed and the "how" is the interesting part.
But in this context, "why" is not some lofty philosophical question; we're asking what evolutionary purpose it serves. This species has ridges on its back: is it for defensive purposes, sexual selection, temperature regulation, all of the above? That kind of question.

Quote:
Seems pretty trivial to me to think up benefits. A subjective experience of the world may prompt a complex conscious brain to approach problems in different way and create better strategies and designs for food and shelter or sexual display or to enable higher-order thinking about the desires and motivations of you v others.

But sure. It may also be a misfire, an outcome of having a big brain that is either neutral or helpful.
I don't doubt we can trivially think of hypotheses: that's true of the vast majority of phenomena that do not yet have an explanatory theory.
The point is simply: we don't know yet. Your initial post began with asking "why consciousness is such a big problem for some". It's not "a problem" for me; it's simply that I acknowledge that there are a number of aspects of consciousness that we don't have a good model for.
  #61  
Old 02-11-2019, 11:36 AM
Mijin is offline
Guest
 
Join Date: Feb 2006
Location: Shanghai
Posts: 8,802
Quote:
Originally Posted by Half Man Half Wit View Post
I don't really think this is clear at all. How does evolution select for pain to feel painful? It can only select for behavior following a pain-stimulus being such as to minimize harm; but that doesn't entail anything about how it feels, or indeed, that it feels like anything at all.

What do you think couldn't we do if we were completely unconscious automata?
Well, you're kind of putting me between a rock and a hard place.
In terms of the OP, who possibly thinks of consciousness as something needing a separate explanation from other characteristics of life... (s)he's wrong; there's copious evidence it evolved.

But yes if we're having a more nuanced, scientific conversation about consciousness, there's plenty we don't know about the exact evolutionary path taken.
  #62  
Old 02-11-2019, 11:41 AM
CurtC is offline
Guest
 
Join Date: Dec 1999
Location: Texas
Posts: 6,733
Quote:
Originally Posted by Half Man Half Wit View Post
But that's the problem: evolution is selective on the level of behavior; but behavior and consciousness are not obviously coupled---just because something acts a certain way is not sufficient to conclude that it is conscious, much less in what way.
This goes along with what you said earlier, that behaviors could be selected for without consciousness, so every outcome would be the result of something like a chain of dominoes falling.

This reminds me of how, when I was in high school, pocket calculators were a new thing. I recall trying to imagine how they worked, and the only thing I could come up with was that the engineers had set up this chain of dominoes for every possible combination of inputs, to give the correct output. Somebody had to sit down and tell it that 57635 plus 87569 is 145204.

But later I learned about programming and algorithms, and I realized how absurd my earlier thoughts had been. It was MUCH easier and more straightforward to give it methods to find a response through generic algorithms.

That seems the same for your chain of dominoes that evolution could have used for our survival. The simpler explanation is that a generic brain algorithm to experience survival and strive for it is many orders of magnitude more straightforward to achieve.
  #63  
Old 02-11-2019, 12:19 PM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Exapno Mapcase View Post
If you say that "The vacuum state in a quantum field theory is just that, not nothing" I'm not seeing how that differs from what I said. Which is that "nothing" is impossible.
I don't see how you'd conclude that. Certainly, there's nothing in physics that forces you to; and if it's true that "nothing" is impossible, then we wouldn't really need any more arguments.

Quote:
It may or may not be true that the universe started from a vacuum fluctuation. Apparently some physicists postulate that, even if you don't. Even if it someday is shown not to be the answer, it works to gainsay the nonsensical "something comes from nothing."
Again, I don't see how. What these arguments show is essentially that there's a state of zero particles that can evolve to a state that contains particles, and a little more. This isn't really surprising, but it also simply doesn't have anything to do with the question of how the universe came about; merely with how an early stage of the universe may have evolved into the later stage we see now. This is hugely fascinating, but it's being sold as something it manifestly isn't.

David Albert put it well: essentially, these arguments show how a state of my hand that contains no fist may evolve into one that does contain a fist; but they don't do anything towards addressing the question of why there's a hand in the first place.

So as an answer to the question 'why is there something rather than nothing', this sort of thing simply never gets off the ground.
  #64  
Old 02-11-2019, 12:34 PM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by CurtC View Post
That seems the same for your chain of dominoes that evolution could have used for our survival. The simpler explanation is that a generic brain algorithm to experience survival and strive for it is many orders of magnitude more straightforward to achieve.
The analogy doesn't really work, though: you had a perfectly viable model of how calculators could do the job they do; it's inefficient, but given the necessary resources, one could, in fact, create a 'calculator' that simply pulls the result of every calculation from memory (which was in fact done with logarithm tables, and is still done whenever one hears the much more fancy term 'precomputation').
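[Editor's note: the lookup-table-versus-algorithm contrast discussed above can be made concrete with a short sketch. This is a toy illustration only, not anything proposed by the posters; the function names and the tiny operand range are invented for the example.]

```python
# Two ways a 'calculator' could add numbers:
# (1) precomputation -- store every answer in advance, like a logarithm table;
# (2) a generic algorithm -- compute any answer on demand.

def build_lookup_table(max_operand):
    """Precompute every sum of operands up to max_operand (the 'chain of dominoes')."""
    return {(a, b): a + b
            for a in range(max_operand + 1)
            for b in range(max_operand + 1)}

def add_via_table(table, a, b):
    """Pull the answer from memory; only works inside the precomputed range."""
    return table[(a, b)]

def add_via_algorithm(a, b):
    """Generic digit-by-digit addition with carry, like grade-school arithmetic."""
    result, carry, shift = 0, 0, 1
    while a or b or carry:
        digit = a % 10 + b % 10 + carry
        result += (digit % 10) * shift
        carry = digit // 10
        a, b, shift = a // 10, b // 10, shift * 10
    return result

table = build_lookup_table(99)                     # 100 x 100 entries, and that's the limit
assert add_via_table(table, 57, 87) == add_via_algorithm(57, 87) == 144
assert add_via_algorithm(57635, 87569) == 145204   # the example from the earlier post
```

Both are viable models of "how a calculator works"; the table is merely wasteful, which is exactly the point being made: for calculation we can at least sketch candidate mechanisms, efficient or not.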

With consciousness, we don't even have that rough initial guess. Worse, there are arguments that no such guess is even possible; certainly, no other problem has that sort of status. There are plenty of open problems we can point to, and in every case, it's easy to at least outline how a solution might look (as you did with the calculator). I don't know what the exact theory of quantum gravity might look like, but I've got a good idea what kind of theory it's going to be, and I can envision candidate solutions.

Something like a calculation is the performance of a certain function: everything that performs that function, calculates. But the body seems, in principle, to be able to perform its functions perfectly well without there being any conscious experience associated with that. Stub my toe, recoil, emit some choice profanity: all perfectly doable without anything such as a pain experience anywhere.

Any attempt to explain consciousness so far, thus, has stopped right at what's usually called the 'explanatory gap': the difference between the third person knowledge we can gather about the molecular structure of ammonia, and the way it smells. There's lots of things we can deduce from that molecular structure: its solubility in water, whether it's gaseous, liquid or solid at room temperature, its color, and so on (never mind that most of these things would be computationally infeasible to predict; we can take the perspective of an ideal reasoner here). But what it smells like? Whether it smells the same for me, as it does for you? I can't for the life of me imagine the sort of story we'd have to tell to even begin answering such questions.
  #65  
Old 02-11-2019, 12:50 PM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
Evolution can select for avoidance behavior---i.e. you turning away from a harmful stimulus. But, since that on its own, to all appearances, does not determine whether you feel pain, feel nothing (are as unconscious as the simple robot), or feel something else entirely, evolution can't select for you feeling pain.
If you turn away from a stimulus, the processing that detects and evaluates that signal is pain. If a creature were unable to process that signal as well, they would not be able to avoid that harmful stimulus, and natural selection does the rest.

Quote:
Take the famous 'inverted qualia' thought experiment: my behavior would remain the same if, say, my color experiences were inverted. What I used to call red, I now call green, and vice versa; since I still call the same wavelength entering my eye by the same color name, nothing about my behavior will change. Evolution thus is blind regarding whether I experience a red-sensation while saying 'I see red', or whether I experience (what I would have earlier called) a green-sensation. Thus, my seeing red is not adaptive.
Your brain doesn't bother attaching names to colours other than by linguistic convention. The reaction to different wavelengths is all that matters. If a poisonous berry is wavelength A and its non-poisonous cousin is wavelength B, then it does not matter one jot whether your subjective experience of A is the same as another person's. All that matters is that you eat the right berry, and if you can't, then you'll be dead and your crappy genes are dust.

So yes, actually seeing "red" (where "red" is the right berry) is adaptive.

Quote:
It gets worse than that. It's a bit more subtle, but once you think about it, you realize that it's deeply puzzling that our experiences should be appropriate to our circumstances at all.
I'm not sure what you mean by "experiences" here; do you mean what we feel?

Quote:
Evolution can't differentiate between me fleeing from a predator and being terrified, and me fleeing from a predator and being content, or me avoiding a pain stimulus and being in pain, and me avoiding said stimulus in a feeling of bliss.
You are using qualitative terms there, and evolution really doesn't care.

Quote:
Again, that's just causal closure: that the particular pattern of neuron firings coupled to stubbing my toe should feel painful as opposed to blissful doesn't make a difference, as long as that pattern leads to me retracting my foot and shouting 'ow'.
"just" causal closure? What you describe here is exactly what evolution cares about and acts upon. Is it possible that there are non-physical drivers of consciousness? I don't know, but there is certainly no necessity for them so it makes no sense to invoke them until required. It may be an interesting philosophical conversation for some but I don't think it leads anywhere useful.

Quote:
What do you think we couldn't do if we were completely unconscious automata?
How complicated is this automaton? Something simple like a prion or a virus or a bacterium? We can get them to react to stimuli in simple ways, and we wouldn't think them conscious, but as we step up the complexity we will reach more complex behaviour and will reach that point of consciousness eventually. If ultimately you created an atom-for-atom reconstruction of a human, they would be capable of everything that we are, including conscious thought; they would be human.
  #66  
Old 02-11-2019, 12:58 PM
Exapno Mapcase is offline
Charter Member
 
Join Date: Mar 2002
Location: NY but not NYC
Posts: 30,849
Quote:
Originally Posted by Half Man Half Wit View Post
David Albert put it well: essentially, these arguments show how a state of my hand that contains no fist may evolve into one that does contain a fist; but they don't do anything towards addressing the question of why there's a hand in the first place.
Albert is responding to Lawrence Krauss, who adamantly makes the case I am citing.

Albert quotes him as saying:

Quote:
He complains that “some philosophers and many theologians define and redefine ‘nothing’ as not being any of the versions of nothing that scientists currently describe,” and that “now, I am told by religious critics that I cannot refer to empty space as ‘nothing,’ but rather as a ‘quantum vacuum,’ to distinguish it from the philosopher’s or theologian’s idealized ‘nothing,’ ”
This appears to be exactly what you are doing now. You state unequivocally that "It has properties, such as a nonzero energy expectation value; nothing doesn't have that. Indeed, it's literally meaningless to say it does." Whether that state led to our current state is an interesting and disputed hypothesis. Albert doesn't refute it in any way. Possibly there has always been something and the rearrangement of energy is a matter of a closed fist and a hand showing fingers, as Albert puts it. Moreover, he states the argument in terms exactly equivalent to my own.

Quote:
And if what we formerly took for nothing turns out, on closer examination, to have the makings of protons and neutrons and tables and chairs and planets and solar systems and galaxies and universes in it, then it wasn’t nothing, and it couldn’t have been nothing, in the first place.
I think Albert is dead wrong in dismissing this argument as trivial, as he does in his last paragraph. I think that dismissing "nothing" as a possible argument has vast implications. It undoes 2000 years of philosophy at a stroke, so I don't wonder that a professor of philosophy would rail against it.
  #67  
Old 02-11-2019, 01:04 PM
Novelty Bobble Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Mijin View Post
I don't doubt we can trivially think of hypotheses: that's true of the vast majority of phenomena that do not yet have an explanatory theory.
The point is simply: we don't know yet. Your initial post began by asking "why consciousness is such a big problem for some". It's not "a problem" for me; it's simply that I acknowledge that there are a number of aspects of consciousness that we don't have a good model for.
I agree with you (and my original post just used your quote as a starting point; it wasn't a criticism of what you said). I am perfectly comfortable with "we don't know yet" as an answer. However, we can put forward lots of plausible pathways as to how consciousness might have been beneficial in evolutionary terms and so developed to the human level. We also have empirical evidence of brain complexity, increased capability, and increasing levels of consciousness.
I don't see any value in exploring anything beyond the purely physical, I see no reason why the mind should be anything other than the purely physical.

Last edited by Novelty Bobble; 02-11-2019 at 01:05 PM.
  #68  
Old 02-11-2019, 01:11 PM
Half Man Half Wit's Avatar
Half Man Half Wit Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
If you turn away from a stimulus, the processing that detects and evaluates that signal is pain. If a creature were unable to process that signal as well, they would not be able to avoid that harmful stimulus, and natural selection does the rest.
So, in my simple robot, does it feel pain? It processes the signal, and reacts; to other signals, it wouldn't have reacted.

And how does a subjective dimension arise? Why, for instance, does pain feel the particular way it does, and not some other way? Could an alien with different 'processing' feel something subjectively different, yet react in the same way?

Quote:
Your brain doesn't bother attaching names to colours other than by linguistic convention. The reaction to different wavelengths is all that matters.
But that reaction doesn't suffice to fix what a color looks like. As an example: suppose a newborn has, thanks to some genetic quirk, just the inverse wiring from ours. When they experience light of wavelength A, they have what we would call a green-experience, rather than the red-experience we have. However, growing up, they'll of course attach the name 'red' to that experience, identify the color red correctly when prompted, and react to red in the same way we do.

Now, my wiring is changed, to resemble theirs. Do I notice a difference?
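The inverted-spectrum setup can be put as a toy sketch; the wavelength names, experience labels, and behaviors below are invented for illustration only, not a model of real vision:

```python
# Toy model of the inverted-spectrum thought experiment.
# All names here (wavelengths, labels, behaviors) are invented placeholders.

BEHAVIOR = {"wavelength_A": "pick berry", "wavelength_B": "ignore"}

class Agent:
    def __init__(self, label_map):
        # wavelength -> private experience label (unobservable from outside)
        self.label_map = label_map

    def experience(self, wavelength):
        return self.label_map[wavelength]

    def act(self, wavelength):
        # Public behavior depends only on the wavelength, not on the label.
        return BEHAVIOR[wavelength]

normal = Agent({"wavelength_A": "red-experience",
                "wavelength_B": "green-experience"})
inverted = Agent({"wavelength_A": "green-experience",
                  "wavelength_B": "red-experience"})

# Every observable response is identical, though the private labels differ.
assert all(normal.act(w) == inverted.act(w) for w in BEHAVIOR)
assert normal.experience("wavelength_A") != inverted.experience("wavelength_A")
print("behaviorally indistinguishable")
```

No behavioral test distinguishes the two agents, which is exactly what the question above turns on: selection acts on `act`, never on `experience`.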

Quote:
So yes actually seeing "red" (where "red" is the right berry) is adaptive.
No; reacting to light of wavelength A in a particular way is adaptive. Whether that's accompanied by 'seeing red', 'seeing green', or not seeing anything---having no phenomenal experience at all---is completely immaterial.

Quote:
I'm not sure what you mean by "experiences" here; do you mean what we feel?
What's typically called 'phenomenal experience', or 'qualia': the way red looks like to us, for instance.

Quote:
you are using qualitative terms there and evolution really doesn't care.
Yet, you claim that evolution is responsible for these qualitative experiences.

Quote:
"just" causal closure? What you describe here is exactly what evolution cares about and acts upon.
Exactly; that's the problem. Evolution cares about how I react to wavelength A; it doesn't care about how seeing red looks to me. The latter is what we're concerned with.

Quote:
Is it possible that there are non-physical drivers of consciousness?
I don't think so: after all, the physical world is known to us only via its causal properties, i.e. the effects it has on us---so the idea of something non-physical impacting on the physical in some way seems incoherent from the outset.

Quote:
How complicated is this automaton? Something simple like a prion or a virus or a bacterium? We can get them to react to stimuli in simple ways and we wouldn't think them conscious, but as we step up the complexity we will reach more complex behaviour and will reach that point of consciousness eventually.
Why? How? This is always the sort of impasse these discussions reach: if you just bundle enough of these unconscious reactions together, I'm sure consciousness somehow just sparks up.

But the difference we're talking about is not a quantitative one; it's qualitative.

Quote:
If ultimately you created an atom-for-atom reconstruction of a human, they would be capable of everything that we are, including conscious thought; they would be human.
You'd think so, but examine your own premise: an interaction of this here atom with that one there doesn't contain a spark of consciousness. So why would it somehow arise if we put this interaction in the same room with a lot of others?

Yes, I know: water is liquid, while single water molecules aren't. But the fluidity of water is in fact readily apparent from the properties of a single water molecule: it's conceptually simple to derive the details of its bonding with other, identical molecules, given its configuration. Fluidity doesn't just happen; it's very clear how it emerges from the properties of single molecules.

With consciousness, however, the story is always: obviously simple processes aren't conscious, but then, you put enough of 'em together, something something something, ta-daa, consciousness. Saying 'consciousness emerges' isn't an answer: it's a re-statement of the question.

The problem is, as long as the systems are simple enough so that we can easily hold them in mind in their totality, it's completely clear that there's not any conscious experience necessary for them to perform their function. But then, once you get enough of that together, we can't easily do that anymore, so who knows, right? Maybe consciousness just sorta sparks up?

But in every case where something 'sparks up', we can easily see how and why it does. With consciousness, for some reason, people never even really attempt to address that.
  #69  
Old 02-11-2019, 01:19 PM
Half Man Half Wit's Avatar
Half Man Half Wit Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Exapno Mapcase View Post
I think Albert is dead wrong in dismissing this argument as trivial, as he does in his last paragraph. I think that dismissing "nothing" as a possible argument has vast implications. It undoes 2000 years of philosophy at a stroke, so I don't wonder that a professor of philosophy would rail against it.
Sure, if we dismiss the problem, it goes away (that's literally why this strategy is trivial). You can do that with every problem. But the question is, what justification do we have to dismiss it? And once we ask that question, the philosophers all come crawling back out of the woodwork.

On a related note, this is one thing I've never quite got about this board. Usually, people on here are quite big on at least fairly considering the opinion of the experts. On vaccines, people defer to doctors; on climate change, to climate scientists. But on philosophy? Everybody seems eager to listen to anybody but philosophers. For some reason, that lot falls most often on us poor physicists; but in reality, somebody trained in physics doesn't really have much more of an informed opinion on matters of philosophy than they do on matters of dentistry (something Krauss in particular is always eager to demonstrate).
  #70  
Old 02-11-2019, 02:07 PM
Novelty Bobble Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
Yes, I know: water is liquid, while single water molecules aren't. But the fluidity of water is in fact readily apparent from the properties of a single water molecule: it's conceptually simple to derive the details of its bonding with other, identical molecules, given its configuration. Fluidity doesn't just happen; it's very clear how it emerges from the properties of single molecules.

With consciousness, however, the story is always: obviously simple processes aren't conscious, but then, you put enough of 'em together, something something something, ta-daa, consciousness. Saying 'consciousness emerges' isn't an answer: it's a re-statement of the question.

The problem is, as long as the systems are simple enough so that we can easily hold them in mind in their totality, it's completely clear that there's not any conscious experience necessary for them to perform their function. But then, once you get enough of that together, we can't easily do that anymore, so who knows, right? Maybe consciousness just sorta sparks up?

But in every case where something 'sparks up', we can easily see how and why it does. With consciousness, for some reason, people never even really attempt to address that.
I'm not sure I can make any more clarifying points for the first part of your response as I'd just be simply re-stating what I'd already said.

For the parts that I've quoted here though I say that I have no problem in imagining consciousness emerging as sensory complexity increases. I don't see any need for it to suddenly "spark" into being. Like the 3D sonar pictures of bats: they wouldn't have just appeared fully formed but would have gradually evolved from a baseline hearing system, with each "improvement" conferring a benefit.
So it can be with consciousness. Go back through the proto-humans and early primates. Were we able to run tests to measure consciousness on them, I'd expect to see a gradual change over the millions of years involved. I wouldn't expect to see a single generational leap from "non-conscious" to "conscious", with Lucy-Junior being perplexed and frustrated as to why neither of his parents is able to grasp even the simplest of card games.

I think trying to draw an equivalence with water isn't helpful. A single neuron or even the simplest sensory structure of a prokaryote is orders of magnitude more complicated than a water molecule. Once life has bridged that gap to a sense organ, I think the evolution of a brain-like structure and some form of consciousness is pretty much a nailed-on certainty.
  #71  
Old 02-11-2019, 02:30 PM
Exapno Mapcase Exapno Mapcase is offline
Charter Member
 
Join Date: Mar 2002
Location: NY but not NYC
Posts: 30,849
Quote:
Originally Posted by Half Man Half Wit View Post
Sure, if we dismiss the problem, it goes away (that's literally why this strategy is trivial). You can do that with every problem. But the question is, what justification do we have to dismiss it? And once we ask that question, the philosophers all come crawling back out of the woodwork.

On a related note, this is one thing I've never quite got about this board. Usually, people on here are quite big on at least fairly considering the opinion of the experts. On vaccines, people defer to doctors; on climate change, to climate scientists. But on philosophy? Everybody seems eager to listen to anybody but philosophers. For some reason, that lot falls most often on us poor physicists; but in reality, somebody trained in physics doesn't really have much more of an informed opinion on matters of philosophy than they do on matters of dentistry (something Krauss in particular is always eager to demonstrate).
If you think that there's a split between physicists like Albert and Krauss, you've never looked at philosophy. Nobody agrees with anybody about anything. Add that to the humongous handicap that nothing any philosopher says can be proven or shown to have empirical backing. Perhaps that's why few outsiders take the word of a philosopher on any issue.
  #72  
Old 02-11-2019, 02:37 PM
Lemur866's Avatar
Lemur866 Lemur866 is offline
Charter Member
 
Join Date: Jul 2000
Location: The Middle of Puget Sound
Posts: 22,343
Quote:
Originally Posted by Half Man Half Wit View Post
With consciousness, however, the story is always: obviously simple processes aren't conscious, but then, you put enough of 'em together, something something something, ta-daa, consciousness. Saying 'consciousness emerges' isn't an answer: it's a re-statement of the question.

The problem is, as long as the systems are simple enough so that we can easily hold them in mind in their totality, it's completely clear that there's not any conscious experience necessary for them to perform their function. But then, once you get enough of that together, we can't easily do that anymore, so who knows, right? Maybe consciousness just sorta sparks up?

But in every case where something 'sparks up', we can easily see how and why it does. With consciousness, for some reason, people never even really attempt to address that.
Dude, this is not a hard problem. Thus, I refute it. You keep talking about how we could imagine a really sophisticated robot that could act exactly like a human being, but not have consciousness. It would just have a really complicated stimulus-response system: it screams when it stubs its toe but doesn't feel pain, it complains about Mondays but doesn't feel boredom, it goes out for a pizza but doesn't feel hunger.

Except no it doesn't. That is what consciousness IS. If you really could construct a robot that could act "as if" it were a person, then what that robot does IS consciousness. Consciousness is just a partial awareness of our internal state. I feel anger, but consciousness is when I'm aware that I'm angry. As for the internal qualia of whether red seems like red, here's how I refute it: there's no such thing as qualia. It's a nonsense word.

When I look at a red apple, something forms in my mind and I experience the color "red". Except I know for a fact that's not exactly the same thing that forms in your mind when you look at the same apple, because your mind isn't physically connected to my mind. If it's impossible to determine that we have the same qualia when looking at a red apple, then the only logical answer is that qualia don't exist, and are a bullshit way of thinking about the problem. They are invisible intangible fire-breathing dragons in my garage that disappear when you look for them. What's the difference between an invisible intangible undetectable fire-breathing dragon that exists in my garage and nothing? If there's no difference, then it's incoherent to say that the dragon exists. And also, I know that I experience colors differently than other people. I'll look at a shirt and call it brown, but my wife will roll her eyes and say it's green. Because my cones are slightly different, I have partial color blindness. Except I see red things, I see green things, I can tell you a green apple is green. But I can't have the same internal qualia as my wife, because I literally see differently than she does.

What makes you think I have consciousness? Because I react kinda like you? And you have consciousness? What makes you think you have consciousness? Because you experience internal states and are aware of those states? That's not a hard problem. Why do we understand our own internal states? Because we're social animals who live in a complex social system and we have to keep track of the internal states of the rest of the hairless primates around us. And understanding that Thag is angry gives us an advantage in dealing with Thag. There we go. It's not mysterious. And the "subjective feeling" we get is just how it works. Maybe Thag is like the Pyro, and when I believe he's angry, he's really experiencing lollypops and rainbows. But if he's really experiencing a completely different reality than I am, why is it that I can predict how Thag will react when he's angry?

There's nothing magic about the state of being angry. It's just a name we give to a particular internal state, and the reason we believe that others experience the same state is that they react the same way over and over again. There could be lots of human internal states that don't have names, because those states are idiosyncratic, and whenever Thag tries to explain how he feels to other people, he can't, because as far as Thag can tell nobody else feels like he does. Or maybe they do, and he just can't figure it out.

In any case, it's not super-mysterious, unless we redefine "consciousness" to mean something that nobody except a few philosophers agrees it means. There's glory for you. How do I know Thag is angry? Because he acts as if he's angry. There's no qualia there. How do I know that I'm angry? Hey, sometimes I'm angry and I don't even realize I'm angry. Where's the qualia then? How can I be angry if I don't have a subjective experience of being angry? Well, the human mind is complicated, and so is the chimpanzee mind, and so is the monkey mind, and so is the tree shrew mind, and so is the lizard mind, and so is the fish mind.

When we get down to the wormy-thing mind, maybe it's not so complicated, and we can map exactly the exact neurons that fire to each exact stimuli, and the exact physical response. And then we can say that the worm is "just" a meat robot, without consciousness. But consciousness is just a word we use to mean a creature that reacts like a human being, so whatever it is that causes humans to act like humans that's what we mean by consciousness. And so a meat robot that can act as if it were a human being is conscious, because that's what it means to be conscious. And of course, humans are those meat robots. But we're not "just" meat robots. You can't smuggle that "just" into there.

Last edited by Lemur866; 02-11-2019 at 02:42 PM.
  #73  
Old 02-11-2019, 05:03 PM
Half Man Half Wit's Avatar
Half Man Half Wit Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
For the parts that I've quoted here though I say that I have no problem in imagining consciousness emerging as sensory complexity increases.
Well then, explain it to me! Tell me how you tell the blind man what it's like to see red.

Quote:
I don't see any need for it to suddenly "spark" into being. Like the 3D sonar pictures of bats: they wouldn't have just appeared fully formed but would have gradually evolved from a baseline hearing system, with each "improvement" conferring a benefit.
So it can be with consciousness. Go back through the proto-humans and early primates. Were we able to run tests to measure consciousness on them, I'd expect to see a gradual change over the millions of years involved. I wouldn't expect to see a single generational leap from "non-conscious" to "conscious", with Lucy-Junior being perplexed and frustrated as to why neither of his parents is able to grasp even the simplest of card games.
Again, I'm not saying that consciousness can't be gradual, that it's some kind of all-or-nothing deal. I want to know how consciousness---any little bit of it---reduces to its material substrate. Imagine the simplest conscious being there could be. Tell me what it's conscious of, and how that consciousness comes about. Tell me how a subjective viewpoint arises; tell me what physical facts entail which phenomenal facts. Or at least give me a hint---after all, you seem sure that story can be told, so you'll have to have some idea of how it might go, right?

Quote:
I think trying to draw an equivalence with water isn't helpful. A single neuron or even the simplest sensory structure of a prokaryote is orders of magnitude more complicated than a water molecule. Once life has bridged that gap to a sense organ, I think the evolution of a brain-like structure and some form of consciousness is pretty much a nailed-on certainty.
So again: once things just get complex enough, consciousness.

Quote:
Originally Posted by Exapno Mapcase View Post
If you think that there's a split between physicists like Albert and Krauss, you've never looked at philosophy. Nobody agrees with anybody about anything. Add that to the humongous handicap that nothing any philosopher says can be proven or shown to have empirical backing. Perhaps that's why few outsiders take the word of a philosopher on any issue.
Agreements emerge among philosophers all the time. Mostly, that's when people start calling them scientists, though.

That's slightly facetious, of course. But if things were as you say, then we wouldn't have any science, because what science is, is a philosophical question---and yes, that definition evolves, from positivism to falsificationism to methodological anarchism to Kuhnian paradigm shifts and whatnot. Not everything has an immediate right answer that just has to be discovered; not every discussion just boils down to a game of twenty questions. The object of discovery may evolve as it is subject to discussion, but that doesn't mean there aren't better and worse answers. That everything should have empirical backing is philosophy; it just happens to be rather bad philosophy.

Quote:
Originally Posted by Lemur866 View Post
Dude, this is not a hard problem. Thus, I refute it. You keep talking about how we could imagine a really sophisticated robot that could act exactly like a human being, but not have consciousness.
Actually, I've talked about how we can take really simple systems that replicate really simple behaviors of conscious agents while clearly not being conscious themselves; this puts the onus of proof on those who claim that, by bundling such simple things together, consciousness ought to arise. Nobody has even tried to rise up to that, and nobody will.

Quote:
Except no it doesn't. That is what consciousness IS. If you really could construct a robot that could act "as if" it were a person, then what that robot does IS consciousness. Consciousness is just a partial awareness of our internal state.
Those are contradictory statements. Either acting 'as if' something is conscious is consciousness, or partial awareness of our internal state is consciousness.

And of course, the latter is question-begging: since awareness is a feature of consciousness, explaining consciousness by means of awareness is circular. The question is exactly how we can be aware of our internal state!

Quote:
I feel anger, but consciousness is when I'm aware that I'm angry. As for the internal qualia of whether red seems like red, here's how I refute it: there's no such thing as qualia. It's a nonsense word.

When I look at a red apple, something forms in my mind and I experience the color "red".
But that's all that qualia are: what forms in your mind when you experience the color red.

There are, of course, more sophisticated ways to attempt to formulate an eliminativist theory of mind, though; but ultimately, all of them must eventually rise to the challenge of explaining how the illusion of subjective experience comes about. Eliminativists typically believe that this will be easier than explaining how 'real' subjective experience comes about, but so far, none have managed to cash in on that intuition. Personally, I doubt this can be of help: there is no material difference between having subjective internal states, and being under the illusion that one does. Whether I have a migraine, or merely believe I do, the fact is, my head hurts.

Quote:
Except I know for a fact that's not exactly the same thing that forms in your mind when you look at the same apple, because your mind isn't physically connected to my mind.
Why does something need to be physically connected to be the same? If a computer on Mars performs a computation, and a computer on Earth does, they can be exactly the same without both ever having interacted.

Quote:
If it's impossible to determine that we have the same qualia when looking at a red apple, then the only logical answer is that qualia don't exist, and are a bullshit way of thinking about the problem.
That's not even close to the only logical answer; indeed, it's not actually a logical answer at all! It's perfectly logically possible that incomparable properties exist.

Besides, qualia are not a way of thinking about the problem; they are the problem. That we have subjective experience is just data; eliminativism is the failure of coming up with a theory that fits the data, and consequently, seeking the fault with the data rather than with one's theories. That such desperate moves are even considered speaks to the hardness of the problem. In no other discipline does one just try to throw out all the data we have, because we can't seem to make sense of them.

Quote:
They are invisible intangible fire-breathing dragons in my garage that disappear when you look for them.
No, quite to the contrary: they are the most basic, most immediate realities we ever come into contact with. Everything else, all that we know about the external world, all that we conclude about physics, other people, the sun and the stars comes first and foremost mediated via subjective experience; they are the things we ought to be certain of most of all, whereas we may doubt anything else.

Before I can begin to know what the Moon is, I know how it feels to me to see its light.

Quote:
Except I see red things, I see green things, I can tell you a green apple is green. But I can't have the same internal qualia as my wife, because I literally see differently than she does.
So now you're busy comparing those things that don't exist, and even if they existed, couldn't possibly be comparable!

Quote:
What makes you think I have consciousness? Because I react kinda like you? And you have consciousness? What makes you think you have consciousness? Because you experience internal states and are aware of those states? That's not a hard problem. Why do we understand our own internal states? Because we're social animals who live in a complex social system and we have to keep track of the internal states of the rest of the hairless primates around us. And understanding that Thag is angry gives us an advantage in dealing with Thag. There we go. It's not mysterious.
I agree, none of that is mysterious. It also has nothing to do with subjective experience, and the problems it poses: anger in others is displayed via certain behaviors, and one's own behavior can be adapted in response. This is all on the level of functions; what, if any, subjective experience accompanies this is entirely immaterial.

Quote:
And the "subjective feeling" we get is just how it works.
That's of course always a good answer. Very Aristotelian. Falling down is just how stones work, it's their nature to want to move downwards! No need for a Newton with a theory of how this works, much less an Einstein. It's just how it works!

Quote:
Where's the qualia then? How can I be angry if I don't have a subjective experience of being angry?
I keep getting confused by what it is you're trying to argue. Do qualia exist, or don't they? If they don't, then how come there's a separate experience of being angry in addition to simply being angry?

Quote:
But consciousness is just a word we use to mean a creature that reacts like a human being, so whatever it is that causes humans to act like humans that's what we mean by consciousness.
Picture a robot whose internal dynamics is solely given by a humongous lookup table. For any input, there's the appropriate output. All it ever does is match inputs to outputs.

With a huge enough lookup table, it can act like a conscious being for any given length of time. Does that establish that it's conscious in the way we are?
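The lookup-table robot is easy to make concrete. A minimal sketch, with invented stimuli and responses standing in for the "humongous" table:

```python
# Minimal lookup-table robot: behavior is pure input -> output matching,
# with no internal state whatsoever. The table entries are invented
# placeholders for illustration.

LOOKUP = {
    "stub toe": "scream",
    "see red apple": "say 'that apple is red'",
    "asked: are you conscious?": "say 'of course I am'",
}

def robot(stimulus):
    # A single dictionary lookup; nothing here that looks like experience.
    return LOOKUP.get(stimulus, "do nothing")

print(robot("stub toe"))                   # -> scream
print(robot("asked: are you conscious?"))  # -> say 'of course I am'
```

The point of the thought experiment is that, entry for entry, the table can mimic any finite stretch of conscious-seeming behavior without anything that resembles awareness.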
  #74  
Old 02-11-2019, 05:32 PM
DrCube DrCube is online now
Guest
 
Join Date: Oct 2005
Location: Caseyville, IL
Posts: 7,170
Quote:
Originally Posted by XT View Post
Well, it could, if you believe in the multiple membranes and spontaneous creation of new bubble universes theory, or even the bounce theory (i.e. big bang leading eventually to a big crunch, rinse and repeat). If we are talking membranes then it's turtles all the way down...where did the first ones come from? Sort of like the panspermia theory...you are just kicking the first life happening down the road. But as for our universe, the reason they say it started at a specific instant (and a lot of physicists were not happy this was the case, including old Einstein) is because of observations. We can see the various galaxies and clusters moving apart, even speeding up, and we can look back in time and see the progression backwards to smaller and tighter clusters, even back to the emergence of the first observable stars (just a few hundred thousand years after the big bang, IIRC...or perhaps a few million, drink and age has dimmed my memory). It's not just because of the human construct about time.
I don't see how those observations necessitate a beginning or kick any cans down the road. The observations show the universe used to be much more dense and look very different structurally. Then it exploded. We often equate the big bang with the "beginning" of the universe, but that seems unwarranted. It's just as likely it was one of many changes our universe has seen in its infinite history.
  #75  
Old 02-11-2019, 06:03 PM
Novelty Bobble Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
Well then, explain it to me! Tell me how you tell the blind man what it's like to see red.
I don't think you can, but nor do I think that the ability to do so tells you anything meaningful about the concept of consciousness.

Quote:
Again, I'm not saying that consciousness can't be gradual
Good, because that is empirically the case.

Quote:
I want to know how consciousness---any little bit of it---reduces to its material substrate.
Yes, it would be nice to know the exact mechanism, wouldn't it? We don't... yet.

Quote:
Imagine the simplest conscious being there could be. Tell me what it's conscious of, and how that consciousness comes about. Tell me how a subjective viewpoint arises; tell me what physical facts entail which phenomenal facts. Or at least give me a hint---after all, you seem sure that story can be told, so you'll have to have some idea of how it might go, right?
No, I don't claim that story can be told, nor do I think that being able to tell that "story" is at all relevant. The brain is a physical entity and incredibly complex. Its workings are mysterious and we have barely scratched the surface, but there is nothing to suggest that the mind is anything other than a product of that purely physical entity, and so it should, ultimately, be amenable to natural scientific analysis.

Quote:
So again: once things just get complex enough, consciousness.
Well, that seems to be the case, yes. Conscious thought and self-awareness seem to correlate pretty well with brain complexity.

Let me ask you a straight question.

Do you think that consciousness is a product arising purely from the physical matter of the brain and nervous system?
If not, what leads you to think otherwise?
  #76  
Old 02-11-2019, 06:24 PM
The Other Waldo Pepper The Other Waldo Pepper is online now
Guest
 
Join Date: Apr 2009
Posts: 16,128
Quote:
Originally Posted by Exapno Mapcase View Post
I think Albert is dead wrong in dismissing this argument as trivial, as he does in his last paragraph. I think that dismissing "nothing" as a possible argument has vast implications. It undoes 2000 years of philosophy at a stroke, so I don't wonder that a professor of philosophy would rail against it.

Two things: first, why would it undo 2000 years of philosophy?

Second: what if they don’t dismiss it as a possible argument, but instead go for something like unto the I-Have-No-Need-Of-That-Hypothesis line to say we can’t technically rule out the possibility; but, as far as we can tell, it ain’t so?
  #77  
Old 02-11-2019, 06:41 PM
Lemur866's Avatar
Lemur866 Lemur866 is offline
Charter Member
 
Join Date: Jul 2000
Location: The Middle of Puget Sound
Posts: 22,343
Quote:
Originally Posted by Half Man Half Wit View Post
Picture a robot whose internal dynamics is solely given by a humongous lookup table. For any input, there's the appropriate output. All it ever does is match inputs to outputs.

With a huge enough lookup table, it can act like a conscious being for any given length of time. Does that establish that it's conscious in the way we are?
"A huge enough lookup table"?

Dude, that's not how the brain works. Thing is, your stimulus response lookup table has to have some sort of memory, because how else can the robot respond sensibly to human conversation? It literally can't be a giant lookup table because it wouldn't be physically possible to map every possible human sentence and construct multiple plausible responses to each sentence.

The reason the Chinese Room won't work is that it can't be actually implemented, because you'd need a room as big as the solar system, and how can you give your snappy comebacks to "Working hard or hardly working?" when the answer is 12 light-seconds away?

But to bite the bullet, despite my protestations that a Chinese Room isn't physically possible, if you really could show me a Chinese Room that really could pass a Turing Test, then yes, I'd agree that the entire system is conscious. Not the book, but the system of memory and communication and all the dudes running back and forth fetching papyri. If you really could physically implement it, and it really did work, then yeah, it would be conscious. In the same way that your brain is conscious, despite the fact that your neurons and glial cells and what have you are not conscious.

This is like asking if a computer chess program can "really" play chess. It's not playing chess, it's just solving various math problems. Right? Except what's the difference between really playing chess and only acting as if you're playing chess? I hereby assert that there's no difference. A drunk badger wandering on a chess board isn't playing chess even if he's moving around the pieces. He's not playing really bad chess, he's not playing chess. But if he's sitting there obeying the rules of chess and moving the pieces around with his little paws or nose? Then yeah, he's playing chess.

Or if you assert that a computer program can't play chess, then how do you know a human being can play chess? How do you tell the difference between a human being who can play chess, and a human being who's just knocking over pieces like a drunk badger? If you can tell the difference between a human playing chess and a human being not playing chess, why can't you use the same heuristic to determine if some unknown entity is playing chess or not?
  #78  
Old 02-11-2019, 06:57 PM
Exapno Mapcase Exapno Mapcase is offline
Charter Member
 
Join Date: Mar 2002
Location: NY but not NYC
Posts: 30,849
Quote:
Originally Posted by The Other Waldo Pepper View Post
Two things: first, why would it undo 2000 years of philosophy?

Second: what if they don’t dismiss it as a possible argument, but instead go for something like unto the I-Have-No-Need-Of-That-Hypothesis line to say we can’t technically rule out the possibility; but, as far as we can tell, it ain’t so?
The question asked by the OP is the basis for much of philosophy. It was raised by the Greeks and it became essential to Christian belief, which powered the western world's philosophy for 2000 years. It certainly has a bearing on being and existence and other of the fundamental arguments in philosophy like ontology. (I say ontology recapitulates phylogeny: our beliefs are based on our history of beliefs.)

Laplace gave the I-Have-No-Need-Of-That-Hypothesis line to refute a Creator, i.e. religion, in favor of scientific explanation. This isn't even about competing scientific explanations. Even Albert, in his own words, acknowledges there is always something. Whether vacuum energy gave rise to the present universe is not the issue; that may be right or not. That there was never nothing is the issue. If acknowledged it must be incorporated. And note that no philosopher brought this into the discussion: it was forced upon them by science.*

*To my knowledge. I haven't encountered a philosopher who negated nothing out of hand. If there is a branch of philosophy that did so I definitely would like to hear about it.
  #79  
Old 02-11-2019, 07:34 PM
The Other Waldo Pepper The Other Waldo Pepper is online now
Guest
 
Join Date: Apr 2009
Posts: 16,128
Quote:
Originally Posted by Exapno Mapcase View Post
The question asked by the OP is the basis for much of philosophy. It was raised by the Greeks and it became essential to Christian belief, which powered the western world's philosophy for 2000 years. It certainly has a bearing on being and existence and other of the fundamental arguments in philosophy like ontology. (I say ontology recapitulates phylogeny: our beliefs are based on our history of beliefs.)

Laplace gave the I-Have-No-Need-Of-That-Hypothesis line to refute a Creator, i.e. religion, in favor of scientific explanation. This isn't even about competing scientific explanations. Even Albert, in his own words, acknowledges there is always something. Whether vacuum energy gave rise to the present universe is not the issue; that may be right or not. That there was never nothing is the issue. If acknowledged it must be incorporated. And note that no philosopher brought this into the discussion: it was forced upon them by science.*

*To my knowledge. I haven't encountered a philosopher who negated nothing out of hand. If there is a branch of philosophy that did so I definitely would like to hear about it.
I’m no expert, but: while I don’t recall Nietzsche negating it out of hand, I also don’t recall him relying on it. I don’t, off the top of my head, recall Hume ever negating the possibility; but I don’t recall him relying on it, either. What did Spinoza believe? What did Marx believe? What did Wittgenstein believe? What, in their writings, do you figure would’ve fallen apart if they hadn’t granted this?
  #80  
Old 02-11-2019, 08:03 PM
Voyager's Avatar
Voyager Voyager is offline
Charter Member
 
Join Date: Aug 2002
Location: Deep Space
Posts: 44,911
Quote:
Originally Posted by Half Man Half Wit View Post
Well then, explain it to me! Tell me how you tell the blind man what it's like to see red.
I assume your subconscious mind solves problems. Mine is very good at anagrams. How does it do it? Do you have visibility into its actions?
Now, I trust that when you write a post you are observing yourself writing it, analyzing what you have written, and sometimes going back to improve it. There is feedback between your conscious mind and your writing, feedback which doesn't exist for your subconscious mind.
Our subconscious minds work better with practice, and we can program them somehow (like for driving, which is obviously not inborn). But we need our conscious minds to radically alter a problem-solving strategy. Animals do this through natural selection; we can do it in a lifetime.
That should show the evolutionary advantage of consciousness pretty clearly.
I don't understand the pain thing. Non-conscious animals react to pain pretty much as we do, though we can do better at eliminating the source. Our reflexes don't even need higher brain function to work. You take your hand away from the stove long before you think about taking your hand away from the stove, after all.
  #81  
Old 02-11-2019, 11:12 PM
Mijin's Avatar
Mijin Mijin is offline
Guest
 
Join Date: Feb 2006
Location: Shanghai
Posts: 8,802
Quote:
Originally Posted by Exapno Mapcase View Post
If you think that there's a split between physicists like Albert and Krauss you've never looked at philosophy. Nobody agrees with anybody about anything. Add that to the humongous handicap that nothing any philosopher says can be proven or shown to have empirical backing. Perhaps that's why few outsiders take the word of a philosopher on any issue.
Well at the point where we are able to test philosophical ideas, we stop calling them "philosophy". So it's true by definition that philosophy is purely abstract, but that doesn't mean that philosophical discussion never leads anywhere.

Also, a thing to bear in mind for philosophy is that while it's often not possible to prove any particular proposition, it's often pretty easy to find flaws in one. So, IME, the people most dismissive of philosophy are the people who want to espouse a particular philosophical position while handwaving away or just ignoring solid philosophical arguments against that position.

This is often religious apologists, who want to use first-cause arguments, or stuff about God being necessary as a foundation of logic, while being dismissive of all the arguments against such lines of reasoning.

But unfortunately it is also sometimes "celebrity physicists" e.g. Hawking had many philosophical opinions and he was also quite dismissive of the philosophers, which is not a good combination. I think a lot of what he said could be shot down.
  #82  
Old 02-12-2019, 01:08 AM
Half Man Half Wit's Avatar
Half Man Half Wit Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
I don't think you can, but nor do I think that the ability to do so tells you anything meaningful about the concept of consciousness.
The two parts of this statement seem diametrically opposed to each other to me. If it's impossible to tell the blind man what red looks like, then knowledge of what red looks like is radically unlike knowledge of anything else---because for anything else, from how a computer works to how to ride a bicycle, knowledge can easily be communicated. So if it's true that you can't tell a blind man what it's like to see red, then consciousness differs from every other area of inquiry.

Quote:
no, I don't claim that story can be told, nor do I think that being able to tell that "story" is at all relevant. The brain is a physical entity and incredibly complex. Its workings are mysterious and we have barely scratched the surface but there is nothing to suggest that the mind is anything other than a product of that purely physical entity and so should, ultimately, be amenable to natural scientific analysis.
Again, the first and second part are in direct conflict. If consciousness is amenable to scientific analysis in the usual way, then you should be able to tell the blind man what it's like to see red, by simply teaching him that scientific analysis. "The brain works in mysterious ways" certainly isn't a good answer in any case.

Quote:
Do you think that consciousness is a product arising purely from the physical matter of the brain and nervous system?
Yes (although I'd wager my thoughts on what that means differ quite a bit from yours). But I also think this approach has big problems we won't get around by vaguely handwaving at the complexity of the brain.

Quote:
Originally Posted by Lemur866 View Post
"A huge enough lookup table"?

Dude, that's not how the brain works.
Well, that was sorta the point: to try and showcase a system that works very differently from a brain, yet serves to generate the same behavior, to test the claim that 'conscious is as conscious does', i. e. whether something is conscious is entailed by how it behaves.

This isn't the Chinese Room, by the way; Searle was concerned with intentionality, not phenomenal experience, which is a different, but also difficult, problem in the philosophy of mind. The Chinese Room is allowed to rely on all manner of processing in order to produce answers; the lookup table merely looks things up.

Also, the physical realizability of the lookup table has no bearing on whether the argument succeeds; we're solely interested in the truth of the counterfactual 'if such a lookup table existed, it would be conscious', which, like 'if it rained tomorrow, I would get wet' is true or false independently of whether it actually exists, or actually rains.

Quote:
Thing is, your stimulus response lookup table has to have some sort of memory, because how else can the robot respond sensibly to human conversation?
By having each entry in the lookup table be the entire history of the conversation up to that point. Again, trivially not physically realizable, but that just misses the point.
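For concreteness, a toy sketch of the "entire history as key" idea (Python, purely illustrative; the table entries and the function name are invented for the example, and a table covering all possible conversations would of course be astronomically large):

```python
# Each key is the whole conversation so far, not just the latest input,
# so the table needs no memory mechanism of its own.
# (Hypothetical entries; a real table would be astronomically large.)
TABLE = {
    ("Hello",): "Hi there!",
    ("Hello", "Hi there!", "How are you?"): "Fine, thanks.",
}

def respond(history):
    """Look up the canned reply for this exact conversation history."""
    return TABLE.get(tuple(history), "...")

conv = ["Hello"]
conv.append(respond(conv))      # table answers "Hi there!"
conv.append("How are you?")
print(respond(conv))            # prints "Fine, thanks."
```

The point the sketch makes concrete is that nothing here computes anything; every "response" is pure retrieval keyed on the full past.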

Quote:
But to bite the bullet, despite my protestations that a Chinese Room isn't physically possible, if you really could show me a Chinese Room that really could pass a Turing Test, then yes, I'd agree that the entire system is conscious.
Good. So you hold that consciousness is entirely determined by behavior. Then, are fish conscious? What about bats? What about a thermostat? What behavior is sufficient to decide conscious experience? Only human-equivalent behavior?

When does behaving a certain way need conscious experience? Because for any behavior at all, it seems easy to come up with a system that shows it, without that system being conscious. Right, we can't keep the entire bundle of such behaviors in mind, and similarly see that there's nothing conscious about that, so maybe, something unknown is doing we don't know what and the whole thing suddenly becomes conscious, but is that really a satisfying answer to you?

Quote:
This is like asking if a computer chess program can "really" play chess. It's not playing chess, it's just solving various math problems.
No. It's asking whether, in order to play chess, a system needs to experience itself playing chess. Playing chess manifestly is just behavioral: chess is defined by behaving a certain way. But you're arguing that this behavior suffices to determine whether an agent is conscious, that is, whether it experiences something while showing that behavior. This goes beyond simple behavioral analysis, and utilizes a hypothesis that anything that behaves like a conscious being must itself be conscious, which you haven't so far done anything to justify.

Quote:
Originally Posted by Exapno Mapcase View Post
The question asked by the OP is the basis for much of philosophy. It was raised by the Greeks and it became essential to Christian belief, which powered the western world's philosophy for 2000 years.
I don't think that's really historically accurate. Early philosophical thought was, at least when it comes to metaphysics, mostly concerned with what the world is made of---take Thales, and the idea that everything is water.

Quote:
Even Albert, in his own words, acknowledges there is always something.
In order to point out that this doesn't alleviate the explanatory burden: if there always is something, we still have to explain that something. The idea that the world is eternal does not do away with the question of why, or how, it exists.

Quote:
That there was never nothing is the issue. If acknowledged it must be incorporated. And note that no philosopher brought this into the discussion: it was forced upon them by science.
The idea that 'nothing' is, in some sense, impossible is very common throughout the history of philosophy. Parmenides is the first that comes to mind.

Quote:
Originally Posted by Voyager View Post
I assume your subconscious mind solves problems. Mine is very good at anagrams. How does it do it? Do you have visibility into its actions?
Now, I trust that when you write a post you are observing yourself writing it, analyzing what you have written, and sometimes going back to improve it. There is feedback between your conscious mind and your writing, feedback which doesn't exist for your subconscious mind.
Feedback isn't the same as conscious experience, though. A control loop changes its state based on feedback, but it does so (presumably, although if you're a panpsychist, you might beg to differ) without any conscious experience.

And everything in our mental problem solving that needs feedback can be formulated in terms of such control loops; complex ones, maybe, but still, there's no indication that conscious experience is a necessary consequence of complex control loops.
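As a concrete instance of feedback without any obvious consciousness, here's a minimal control loop in the thermostat sense (Python; the function, the setpoint numbers and the crude room model are all invented for the sketch):

```python
def thermostat_step(temp, setpoint, heater_on):
    """Bang-bang controller with a small hysteresis band: the next
    heater state depends on the measured temperature (feedback)."""
    if temp < setpoint - 0.5:
        return True     # too cold: switch heater on
    if temp > setpoint + 0.5:
        return False    # too warm: switch heater off
    return heater_on    # within the band: keep the current state

# Run the loop against a crude room model; the state adapts to the
# environment, yet nothing here plausibly experiences anything.
temp, heater = 15.0, False
for _ in range(20):
    heater = thermostat_step(temp, 20.0, heater)
    temp += 0.5 if heater else -0.2
```

The loop "changes its state based on feedback" in exactly the sense above, which is why feedback alone can't be what makes a process conscious.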

Quote:
That should show the evolutionary advantage of consciousness pretty clearly.
The evolutionary advantage of adapting to the environment by incorporating feedback into our behavior, yes. The advantage of being conscious of the whole process, no.

Quote:
I don't understand the pain thing. Non-conscious animals react to pain pretty much as we do, though we can do better at eliminating the source.
This seems confused. Non-conscious animals don't feel pain (as they don't feel anything, 'feeling anything' being what being conscious means), so they can't react to pain at all. They react to stimuli, in the same way a thermostat reacts to temperature. The thermostat doesn't feel hot when it decides to lower the temperature. Why do we? How do we?

Quote:
You take your hand away from the stove long before you think about taking your hand away from the stove, after all.
Exactly! Even complex behaviors don't need any conscious attendance. Sleepwalkers can perform highly complex tasks without being conscious of them. One could imagine evolutionary pressure shaping the behavior of sleepwalkers to be indistinguishable from that of a waking human; so how can evolution select for conscious experience?

Last edited by Half Man Half Wit; 02-12-2019 at 01:10 AM.
  #83  
Old 02-12-2019, 01:48 AM
eschereal's Avatar
eschereal eschereal is online now
Guest
 
Join Date: Aug 2012
Location: Frogstar World B
Posts: 15,602
Consciousness is not that big of a problem. I could observe it in my Nefurtari, or, for that matter, that one crow that lands on the tree in the back yard and squawks at me with an irritated tone.

It is entirely organic, rooted in the survival instinct and taking the form of the ectoplastic phanticulum. Living things developed it as a side effect of natural selection, though most living things compose no treatises on the subject. We probably will eventually build devices that can exhibit the symptoms of self-awareness, but it will not be consciousness because it cannot be holistic without the basis of need that creatures possess.
  #84  
Old 02-12-2019, 02:23 AM
Chimera Chimera is offline
Member
 
Join Date: Sep 2002
Location: In the Dreaming
Posts: 24,183
Quote:
Originally Posted by eschereal View Post
Is the universe a thing?
Everything is a thing, man.

Just as nothing involves no thing.
__________________
Tentatively and lightly dipping my toes back in the water.
  #85  
Old 02-12-2019, 02:25 AM
Voyager's Avatar
Voyager Voyager is offline
Charter Member
 
Join Date: Aug 2002
Location: Deep Space
Posts: 44,911
Quote:
Originally Posted by Half Man Half Wit View Post

Feedback isn't the same as conscious experience, though. A control loop changes its state based on feedback, but it does so (presumably, although if you're a panpsychist, you might beg to differ) without any conscious experience.

And everything in our mental problem solving that needs feedback can be formulated in terms of such control loops; complex ones, maybe, but still, there's no indication that conscious experience is a necessary consequence of complex control loops.
I never said that feedback and consciousness are equivalent - I said that consciousness is a form of feedback. Do you modify your thoughts based on examination of previous thoughts? For instance, do you come up with an idea, see flaws in it, and modify it? Voila, feedback.
I repeat my question about your experience of the subconscious. The ability to perceive our thoughts internally is what makes us aware that we are thinking beings. Is your subconscious aware that it is thinking? Is it aware of anything about what it is doing?
Maybe the right take on your blind man question is whether a non-blind person can imagine not knowing what red is. Likewise we as conscious people have a hard time understanding unconscious thought, thus the pathetic fallacy.
Quote:
The evolutionary advantage of adapting to the environment by incorporating feedback into our behavior, yes. The advantage of being conscious of the whole process, no.
Not just feedback from the environment but feedback from our mental processes. We as conscious beings can teach unconscious beings complex behaviors, but they can't teach themselves anything.
Quote:
This seems confused. Non-conscious animals don't feel pain (as they don't feel anything, 'feeling anything' being what being conscious means), so they can't react to pain at all. They react to stimuli, in the same way a thermostat reacts to temperature. The thermostat doesn't feel hot when it decides to lower the temperature. Why do we? How do we?
Really? We know that we feel pain because we are conscious, but animals certainly can feel pain. Any dog owner knows this. Reacting to stimuli is different. Dogs and other animals will avoid painful things. That's obviously evolutionarily advantageous by itself.
We have plenty of internal feedback mechanisms in our bodies outside of consciousness. That's why you can't hold your breath until you die, for example. We do temperature regulation without using higher functions.

Quote:
Exactly! Even complex behaviors don't need any conscious attendance. Sleepwalkers can perform highly complex tasks without being conscious of them. One could imagine evolutionary pressure shaping the behavior of sleepwalkers to be indistinguishable from that of a waking human; so how can evolution select for conscious experience?
So can dogs. I trained guide dogs, mostly socialization, but when they really got trained they could do things like refuse to let its partner cross the street when there was danger. I already mentioned that our subconscious can drive. But our subconscious cannot improve, as I said before. I'll believe you when you show me a sleepwalker write and revise a paper.
  #86  
Old 02-12-2019, 03:44 AM
Novelty Bobble Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
The two parts of this statement seem diametrically opposed to each other to me.
Not really. If we accept that consciousness is a brain function and nothing else, then of course we should not be surprised that it has limitations.

Quote:
If it's impossible to tell the blind man what red looks like, then knowledge of what red looks like is radically unlike knowledge of anything else---because for anything else, from how a computer works to how to ride a bicycle, knowledge can easily be communicated.
I'm incapable of imagining multiple dimensions but I can be given the mathematics that explain how it works. The concept of "red" in the mind of a blind person could also be defined in that way. If they have any mental imagery at all then it seems possible to get them to imagine light wavelength changes as corresponding to different mental image appearances.

Quote:
So if it's true that you can't tell a blind man what it's like to see red, then consciousness differs from every other area of inquiry.
I don't think you've made that case.

Quote:
Again, the first and second part are in direct conflict. If consciousness is amenable to scientific analysis in the usual way, then you should be able to tell the blind man what it's like to see red, by simply teaching him that scientific analysis.
I wouldn't rule out the possibility that one day we may be able to. First understand how the brain processes red light visual stimuli for sighted people and then replicate this within the brain for the blind person.
Careful with that "usual way" though. Science is a method of rational enquiry and I think it is the only way of understanding how the world works and that goes for the brain as well. However, advancements in techniques and knowledge ensure that the methods used in that scientific enquiry can change and the "usual" techniques at the moment may need enhancement in order to unlock the secrets of the brain. But whatever those techniques are they will be employed using science in "the usual way".

Quote:
"The brain works in mysterious ways" certainly isn't a good answer in any case.
It isn't an answer at all, it is merely a statement of our current knowledge.

Quote:
Yes (although I'd wager my thoughts on what that means differ quite a bit from yours).
Really? Does your definition of "physical" differ from mine?

Quote:
But I also think this approach has big problems we won't get around by vaguely handwaving at the complexity of the brain.
I don't promote "vague handwaving" as a method for understanding the brain.
__________________
I'm saving this space for the first good insult hurled my way
  #87  
Old 02-12-2019, 06:46 AM
Half Man Half Wit's Avatar
Half Man Half Wit Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Voyager View Post
I never said that feedback and consciousness are equivalent - I said that consciousness is a form of feedback.
Well, let's say that we're conscious of feedback mechanisms in our minds. I am conscious of pain, and that in itself doesn't seem to require any feedback.

Regardless, there evidently exists feedback without consciousness: take the thermostat. Hence, the presence of feedback mechanisms, such as learning and revising plans, are not sufficient to conclude the presence of consciousness. So, what is it that makes feedback conscious in one case, and not so in another?

Quote:
I repeat my question about your experience of the subconscious. The ability to perceive our thoughts internally is what makes us aware that we are thinking beings. Is your subconscious aware that it is thinking? Is it aware of anything about what it is doing?
I'm not sure I understand exactly what you're asking. Am I aware of my subconscious thought processes? Certainly not, that's what makes them subconscious. But what does that establish wrt the question whether behavioral analysis suffices to decide whether something is conscious, and thus, whether behavior determines consciousness, and hence, whether evolution, acting on the level of behavior, can 'select for' consciousness?

Quote:
Not just feedback from the environment but feedback from our mental processes. We as conscious beings can teach unconscious beings complex behaviors, but they can't teach themselves anything.
So is your claim that consciousness is necessary for (self-guided) learning? Is a neural net performing unsupervised learning necessarily conscious, then?

Quote:
Originally Posted by Novelty Bobble View Post
Not really. If we accept that consciousness is a brain function and nothing else, then of course we should not be surprised that it has limitations.
I don't know what you mean by that, or how it applies to my argument.

Quote:
I'm incapable of imagining multiple dimensions but I can be given the mathematics that explain how it works.
You're making my case for me: you're incapable of imagining what it would be like to experience multiple dimensions, but you're perfectly capable of knowing everything else about them. So it's the experiential aspect that you invoke via imagination that remains inaccessible to you even given all of the objective facts about multiple dimensions---that is, this experiential aspect is fundamentally different from all the other aspects of multiple dimensionality. You can know, for instance, how light propagates, what the ratio of the surface area to the volume of a higher dimensional sphere is, and so on, without ever having come into contact with higher dimensions. However, there is one kind of knowledge, and one only, that requires you to directly come into contact with its object, and that's experiential knowledge. Thus, this category is fundamentally set apart from every other.

Quote:
The concept of "red" in the mind of a blind person could also be defined in that way. If they have any mental imagery at all then it seems possible to get them to imagine light wavelength changes as corresponding to different mental image appearances.
But even this (which I think is misguided, like trying to build what something sounds like from colors) hinges on if they have any mental imagery at all. You thus share the intuition that without having an experience of mental imagery, knowledge about experience is forever inaccessible. This is unlike knowledge of anything else: I can know the sun without ever having seen it; I can know its size, its volume, its temperature, everything that makes the sun the sun, every objective fact about its existence. None of this will tell me what it's like to feel its warmth on my skin: for that, I need to come into direct contact with this experience, I need to actually have that experience (or some sufficiently similar one, i. e. any experience of warmth will do).

Or, to take another approach: it's easy, for anything, to come up with new exemplars once one has become acquainted with the general category. I know what characterizes the category of stars; I can invent stars that have never pierced the darkness of space. I can invent people that have never lived. Tell stories that have never happened.

But I can't invent a color that I've never seen. I can't imagine what it would be like, for instance, if my vision extended into the ultraviolet. I can't imagine what it would be like to be a tetrachromat. Or, to take the canonical example, a bat, sensing the world by echolocation.

I think your trouble comes from failing to distinguish between an object and the experience of that object---hence, your appeal to imagining multiple dimensions, despite the fact that this is obviously an experiential act. Perhaps it helps if you consider what you can write down about something. You can write down every fact of the physical composition of the sun, enough for anybody to construct a new one, without being able to explain what its light looks like to anybody who's never seen anything, or what its warmth feels like.

Quote:
I don't think you've made that case.
Well, you can quite easily dispel it: just find one other area of knowledge whose objects necessitate being in contact with them in order to grasp them; one bit of knowledge that can't be communicated without being related to experience in one way or another. (Or, of course, find a way to communicate knowledge about experiences!)

Quote:
It isn't an answer at all, it is merely a statement of our current knowledge.
It is what you proposed to ward off my probing regarding the need for answers to the questions posed by conscious experience---the brain is complex, and well, something something something.

Quote:
Really? Does your definition of "physical" differ from mine?
Probably. For starters, I think it's improper to consider the physical to be (just) the object of (the science of) physics; physics tells us things about structure, but the world doesn't exhaust itself in terms of structure. There may be physical entities that are not properly subject to physical science. Galen Strawson has written insightfully on the subject. (There's also a talk of his on YouTube, but I haven't gotten around to watching that yet.)

Quote:
I don't promote "vague handwaving" as a method for understanding the brain.
You do gesture in the direction of 'it's complicated' when it comes to the difficult questions rather than trying to face them head-on, though.

Last edited by Half Man Half Wit; 02-12-2019 at 06:47 AM.
  #88  
Old 02-12-2019, 08:00 AM
SigMan is offline
Guest
 
Join Date: Aug 2015
Location: Texas
Posts: 915
Quote:
Originally Posted by sisu View Post
Why can't it have always existed? Time is a man-made construct to understand the universe; one theory is that everything that is, has been, and will be already exists.
No, time is real. Time exists because events happen. Measuring time is a human concept.

This is an age-old question that cannot be answered. You say God created Heaven and Earth; I say, what created God?

Was there ever a beginning? If there's a beginning then there's an ending.
  #89  
Old 02-12-2019, 08:35 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
You do gesture in the direction of 'it's complicated' when it comes to the difficult questions rather than trying to face them head-on, though.
Hang on, I don't say "it's complicated" and leave it at that. The next step after admitting it is complicated is enquiry and science which very much is facing them head-on.

See, this is where I think we have a fundamental breakdown in communication.

I still do not accept that you have put forward any difficult questions that require anything beyond a better understanding of the workings of the brain and an acceptance that natural selection was the means of developing consciousness. If consciousness helps an organism, it is selected for; if subjective experience helps, it is selected for. The evolutionary benefits of those traits were probably greater than the ability to describe red to a blind man, so it is no wonder that the latter does not come easily.
Though of course the human brain does throw up anomalies such as synesthesia, where words and sounds and smells can be interpreted as colours and vice versa. In those circumstances you could give a blind man an experience of "red" by triggering it with the right verbal, aural or olfactory cue. Weird, certainly, but just what you'd expect from the misfiring of a complex brain; and if that trait conferred a strong enough evolutionary benefit then you would not be able to use the example of describing the experience of red to a blind man.

So I don't see the subjective experiential nature of consciousness as a particular and distinct mystery. A creature benefits from being able to experience its surroundings and will do so to a greater or lesser extent. Consciousness and subjective experiences seem to me to be trivial examples of the "greater" extent.

See, this is all a technically difficult question, and if it can be solved or understood (not a given) it will be through the scientific method and not through philosophy; I just don't see what philosophical musings add to this. Too often it is very clever people playing word games.
  #90  
Old 02-12-2019, 10:18 AM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Novelty Bobble View Post
Hang on, I don't say "it's complicated" and leave it at that. The next step after admitting it is complicated is enquiry and science which very much is facing them head-on.
Well, the thread of the conversation is basically, you claim that there's no fundamental problem with consciousness, I present some arguments that there is, which you then dismiss because 'it's complicated'.

Quote:
If consciousness helps an organism it is selected for, if subjective experiences helps it is selected for.
Sure, but again, selection only works on the level of behavior, and it's completely unclear in what way behavior and experience are related. In particular, it seems possible to have experiences completely removed from one's behavior; yet, that's not what we're seeing. This is a problem, and one that needs to be addressed if we're ever going to make headway on understanding consciousness.

Quote:
A creature benefits from being able to experience its surroundings and will do so to a greater or lesser extent.
How? What is it you can do with consciousness that you can't do without?

Quote:
See, this is all a technically difficult question and if it can be solved or understood (not a given) it will be through the scientific method and not through philosophy, I just don't see what philosophical musings add to this. Too often it is very clever people playing word games.
This isn't remotely as obvious as you make it out to be. Take the paper I linked above: Strawson proposes a coherent notion of how things might be, such that science can't tell us anything about experience, yet consciousness is still a perfectly natural part of the physical world. Claiming that we know a priori what form the answer will take just blinds us to other options, and may lead to chasing rabbits down blind alleys.
  #91  
Old 02-12-2019, 10:53 AM
Voyager is offline
Charter Member
 
Join Date: Aug 2002
Location: Deep Space
Posts: 44,911
Quote:
Originally Posted by Half Man Half Wit View Post
Well, let's say that we're conscious of feedback mechanisms in our minds. I am conscious of pain, and that in itself doesn't seem to require any feedback.

Regardless, there evidently exists feedback without consciousness: take the thermostat. Hence, the presence of feedback mechanisms, such as learning and revising plans, are not sufficient to conclude the presence of consciousness. So, what is it that makes feedback conscious in one case, and not so in another?
What part of "all consciousness is feedback but not all feedback is consciousness" don't you get?
Quote:
I'm not sure I understand exactly what you're asking. Am I aware of my subconscious thought processes? Certainly not, that's what makes them subconscious. But what does that establish wrt the question whether behavioral analysis suffices to decide whether something is conscious, and thus, whether behavior determines consciousness, and hence, whether evolution, acting on the level of behavior, can 'select for' consciousness?
Determining whether another entity is conscious is a known difficult problem. But it is irrelevant here, since we know we are conscious. I don't know what you mean by behavior determining consciousness. We might be able to infer the existence of consciousness from behavior (and we might be wrong), but I doubt we became conscious because our ancestors acted in a different way.
Quote:
So is your claim that consciousness is necessary for (self-guided) learning? Is a neural net performing unsupervised learning necessarily conscious, then?
A neural net "learns" through data. Maybe that is like learning through practice - which does not have to involve consciousness. But current neural nets don't examine themselves and change learning strategies based on self-analysis. I'm not saying that it is impossible to do that, just that current learning methods don't.
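The distinction can be sketched with a toy example (purely illustrative; the numbers and names are invented, and this is not a claim about how real neural nets or brains work): a learner with a fixed update rule versus one that monitors its own recent errors and changes strategy when progress stalls.

```python
import random

def train(data, self_monitor=False, steps=200, seed=0):
    """Toy learner that estimates the mean of `data` by stochastic updates.

    With self_monitor=False the update strategy is fixed.
    With self_monitor=True the learner inspects its own recent errors
    and halves its step size when progress stalls -- a crude stand-in
    for "changing learning strategy based on self-analysis".
    """
    rng = random.Random(seed)
    estimate, lr = 0.0, 0.5
    errors = []
    for _ in range(steps):
        x = rng.choice(data)
        err = x - estimate
        estimate += lr * err          # the ordinary, fixed-strategy update
        errors.append(abs(err))
        if self_monitor and len(errors) >= 20:
            recent = sum(errors[-10:]) / 10
            older = sum(errors[-20:-10]) / 10
            if recent >= older:       # no recent improvement: shrink the step
                lr *= 0.5
    return estimate
```

Here "self-analysis" is nothing more than comparing two windows of the learner's own error history; the point is only that such a mechanism is mechanically specifiable, not that having it would amount to consciousness.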
  #92  
Old 02-12-2019, 11:01 AM
Half Man Half Wit is online now
Guest
 
Join Date: Jun 2007
Posts: 6,477
Quote:
Originally Posted by Voyager View Post
What part of "all consciousness is feedback but not all feedback is consciousness" don't you get?
I get that perfectly well, which is why I asked you what determines whether feedback is conscious.

Also, 'all consciousness is feedback' is simply wrong: there's no feedback in my being conscious of a persistent itch in my left foot. There's just the itch, and me being conscious of it.

Quote:
Determining whether another entity is conscious is a known difficult problem. But it is irrelevant here, since we know we are conscious.
But how does, for want of a better image, evolution know we are conscious? If it's 'difficult' to conclude whether a being is conscious, then how does evolution do the determination? Because if it can't, then it can't select for consciousness. So if you hold that consciousness is adaptive, you must hold that consciousness is determinable via behavior. Hence, my asking how, precisely, you can tell whether something is conscious merely by observing its behavior.

Quote:
A neural net "learns" through data. Maybe that is like learning through practice - which does not have to involve consciousness. But current neural nets don't examine themselves and change learning strategies based on self-analysis. I'm not saying that it is impossible to do that, just that current learning methods don't.
But if they did, they'd be conscious?
  #93  
Old 02-12-2019, 11:30 AM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by Half Man Half Wit View Post
Well, the thread of the conversation is basically, you claim that there's no fundamental problem with consciousness, I present some arguments that there is, which you then dismiss because 'it's complicated'.
I don't dismiss it in that way. I say that the workings of the brain are complicated and that we don't know how consciousness works exactly, yet, but that it is a function arising from a complex brain is close to certain. There are plenty of unsolved problems in biology and consciousness is on the list along with all the others, not in a special section.

Quote:
Sure, but again, selection only works on the level of behavior, and it's completely unclear in what way behavior and experience are related.
If you can't see how subjective experience might drive behaviours which can be selected for, then I think you fundamentally don't understand evolution.

Quote:
In particular, it seems possible to have experiences completely removed from one's behavior; yet, that's not what we're seeing. This is a problem, and one that needs to be addressed if we're ever going to make way on understanding consciousness.
In what way is this a problem? Is it that you can't say how it happens, or why it might be selected for or escape selection?


Quote:
What is it you can do with consciousness that you can't do without?
Nothing, but to cover all the things that consciousness can do would involve creating a sufficiently complex machine that would be, in fact, conscious.

Quote:
This isn't remotely as obvious as you make it out to be. Take the paper I linked above: Strawson proposes a coherent notion of how things might be, such that science can't tell us anything about experience, yet consciousness is still a perfectly natural part of the physical world.
And as a natural part of the world, if anything can uncover the secrets of consciousness and experience it will be science; you can bet the house on that.
__________________
I'm saving this space for the first good insult hurled my way
  #94  
Old 02-12-2019, 01:43 PM
Voyager is offline
Charter Member
 
Join Date: Aug 2002
Location: Deep Space
Posts: 44,911
Quote:
Originally Posted by Half Man Half Wit View Post
I get that perfectly well, which is why I asked you what determines whether feedback is conscious.
Cogito ergo sum, to coin a phrase?

Quote:
Also, 'all consciousness is feedback' is simply wrong: there's no feedback in my being conscious about a persistent itch in my left foot. There's just the itch, and me being conscious of it.
The itch sensation is transmitted to your brain. That you know you have an itch stems from being able to observe that sensation. Do you think you never have an itch when asleep? Better, ever wake up hungry? Do you think you only became hungry when you awoke, or did you only realize you were hungry when you awoke?
Quote:
But how does, for want of a better image, evolution know we are conscious? If it's 'difficult' to conclude whether a being is conscious, then how does evolution do the determination? Because if it can't, then it can't select for consciousness. So if you hold that consciousness is adaptive, you must hold that consciousness is determinable via behavior. Hence, my asking how, precisely, you can tell whether something is conscious merely by observing its behavior.
C'mon now. Evolution doesn't "know" anything. If consciousness affects behavior (which I hope you accept) and lets us refine our behavior to be more successful in terms of reproduction, that's enough to have it be selected for.
Whether or not other people are actively conscious, they act as if they are, and that is good enough for evolution.

Quote:
But if they did, they'd be conscious?
They'd be closer.
One big problem with machine learning is that we don't know why the system makes a decision. It would be nice if we could ask it. It might just be telling us a story (like we do when we are asked a similar question about why we did something), but it would be a start. If I were a Turing test examiner it would be about the first thing I asked.
  #95  
Old 02-12-2019, 01:48 PM
RaftPeople is offline
Guest
 
Join Date: Jan 2003
Location: 7-Eleven
Posts: 6,438
Quote:
Originally Posted by Novelty Bobble View Post
If consciousness is, as is thought, an emergent property of a complex, physical brain, then mutations will result in differing forms of consciousness, which will lead creatures to behave differently in the world. The outcomes of that behaviour will be positive, negative or neutral. Natural selection does the rest. It doesn't seem like a problem to me at all.
I'm late to the thread and still reading, but wanted to point something out if it hasn't been already.

How does consciousness influence behavior?

We know in some cases how the physical states of the cells in the brain can influence or assist with behavior (e.g. pain receptor causing reaction/movement, or the grid of neurons helping with navigation), but where exactly is consciousness and how does it cause behavioral changes?

Some researchers are leaning more towards the "after the fact/tell a story" version, but even with that, there is still a question as to whether it's providing value and if so, how.
  #96  
Old 02-12-2019, 01:54 PM
RaftPeople is offline
Guest
 
Join Date: Jan 2003
Location: 7-Eleven
Posts: 6,438
Quote:
Originally Posted by Half Man Half Wit View Post
But how does, for want of a better image, evolution know we are conscious? If it's 'difficult' to conclude whether a being is conscious, then how does evolution do the determination? Because if it can't, then it can't select for consciousness. So if you hold that consciousness is adaptive, you must hold that consciousness is determinable via behavior. Hence, my asking how, precisely, you can tell whether something is conscious merely by observing its behavior.
Not directly related to your point but interesting:
Recent research on consciousness via MRI showed that conscious states tend to involve complex patterns of neural activity bouncing around, with inactivity in other areas; unconscious patterns tended to be much simpler.
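For what it's worth, one measure often used in this literature to quantify "pattern complexity" is the Lempel-Ziv complexity of a binarized activity trace (the perturbational complexity index, for instance, builds on Lempel-Ziv compression). A minimal sketch of an LZ78-style phrase count, with made-up signals purely for illustration:

```python
def lz_complexity(signal):
    """Count the phrases in an LZ78-style parse of a binary string.

    Each phrase is the shortest prefix of the remaining input not seen
    as a phrase before; repetitive signals produce few phrases,
    irregular signals produce many.
    """
    phrases, phrase, count = set(), "", 0
    for bit in signal:
        phrase += bit
        if phrase not in phrases:   # shortest new phrase found
            phrases.add(phrase)
            count += 1
            phrase = ""
    return count

# A flat ("simpler") activity trace versus an irregular one:
flat = "0" * 16
irregular = "0110100110010111"
```

A flat trace parses into a handful of phrases while an irregular one needs many; that gap is the kind of "complexity" contrast such imaging studies report.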
  #97  
Old 02-12-2019, 03:47 PM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by RaftPeople View Post
How does consciousness influence behavior?
Being able to consciously consider options and learn from past experience and put yourself into the position of other people may well help you construct optimal strategies for whatever situation you find yourself in.
__________________
I'm saving this space for the first good insult hurled my way
  #98  
Old 02-12-2019, 04:13 PM
RaftPeople is offline
Guest
 
Join Date: Jan 2003
Location: 7-Eleven
Posts: 6,438
Quote:
Originally Posted by Novelty Bobble View Post
Being able to consciously consider options and learn from past experience and put yourself into the position of other people may well help you construct optimal strategies for whatever situation you find yourself in.
But, to HMHW's point, all of that can happen from a functional perspective. That's not really answering "how" consciousness can influence behavior.

If consciousness is just a side effect of the electro-chemical states of the brain, then it isn't influencing behavior, it's just along for the ride.

If it's not just along for the ride, how do our cells etc. incorporate consciousness into their resolution of next behavior? What is the mechanism that provides transfer of information from consciousness into the mechanical workings of the brain?
  #99  
Old 02-12-2019, 04:38 PM
Novelty Bobble is offline
Member
 
Join Date: Nov 2009
Location: South East England
Posts: 8,074
Quote:
Originally Posted by RaftPeople View Post
But, to HMHW's point, all of that can happen from a functional perspective. That's not really answering "how" consciousness can influence behavior.
Could it happen from a purely functional perspective? Yes. That consciousness does influence behaviour is a working hypothesis that seems to fit the facts best thus far.

Quote:
If consciousness is just a side effect of the electro-chemical states of the brain, then it isn't influencing behavior, it's just along for the ride.
You'll have to explain that to me. What is it about being a side effect that prevents it from influencing behaviour?

Quote:
If it's not just along for the ride, how do our cells etc. incorporate consciousness into their resolution of next behavior? What is the mechanism that provides transfer of information from consciousness into the mechanical workings of the brain?
If you want an explanation of the exact mechanism you'll find plenty of admissions that we don't know yet. That's fine.
__________________
I'm saving this space for the first good insult hurled my way
  #100  
Old 02-12-2019, 05:30 PM
Pardel-Lux is offline
Guest
 
Join Date: Sep 2016
Location: Berlin
Posts: 71
Quote:
Originally Posted by Mijin View Post
The intellectually honest answer is we don't know.

But let's be clear it's a double don't know: We don't know whether nothing ever became something. Maybe our universe is part of a multiverse that has always existed?

But if there is a definite start point to explain, then we don't have that explanation yet, and it's hard to see how there could be one. We can't start with "Quantum mechanics says..." when we're asking the metaphysical question of why anything exists at all, including physical laws.

The OP mentions consciousness. Well, there's no mystery there: it evolved as a feature of intelligent life; it's something brains do.
Lots we don't understand about consciousness, but not how it originated.
Some people would disagree even on this and they claim not to be nuts. I admit my ignorance.
__________________
‘To a wise man, the whole earth is open, because the true country of a virtuous soul is the entire universe.’ Democritus