begbert2 | 05-22-2019, 03:34 PM
Quote:
Originally Posted by Half Man Half Wit
The way you frame this already presupposes a speculative and contentious particular view of consciousness, known as functionalism. But functionalism, while it has a large following, is hardly the only game in town when it comes to consciousness; even the straightforwardly physicalist account isn't functionalist. So on most current ideas of how consciousness works, that it's the function of a physical brain is simply false.

I think that both you and wolfpup should take a look at Integrated Information Theory, which posits that consciousness is either due to or constituted by the information distributed among the parts of a system in a certain way. In a simplified sense, integrated information can be thought of as the amount of information the whole of a system has over and above that which is locked in its parts. IIT then postulates that this is what consciousness either correlates with or comes down to.

Now, I think in doing so, IIT is massively begging the question. But that doesn't matter right now. What's important is that IIT gives a scientifically respectable reductive account of consciousness---it shows us exactly how (what it purports to amount to) consciousness emerges from the simpler parts, and gives us objective criteria for its presence. So, IIT provides the sort of story that's needed to make wolfpup's emergence reasonable.

Furthermore, on IIT, consciousness isn't a functional property of the brain. It's a property relating to the correlations between the brain's parts, or regions. So functionalism isn't really the sort of obvious truth begbert2 seems to take it to be, either.

More interesting, perhaps, is that on IIT, computationalism comes out straightforwardly false. While a brain has a high degree of integrated information, the typical computer, due to its modular architecture, has very little to none of it. So a computer implementing the simulation of a brain won't lead to conscious experience, even if the brain it simulates does.

This suffices to falsify many of the seemingly 'obvious' claims in this thread and shows that other options are possible.
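
(An aside first, to make the quoted "whole over and above its parts" idea concrete. The sketch below is my own toy illustration, not IIT's actual Φ, which is defined quite differently and is far more involved; it just uses mutual information across a split of a two-bit system as a stand-in for "information in the whole minus information locked in the parts".)

Code:
# Toy illustration only: mutual information across a bipartition as a
# crude stand-in for IIT's Phi. Real Phi is defined differently and is
# far more involved; this just shows "whole over and above the parts".
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a {outcome: probability} mapping."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(joint, keep):
    """Marginal distribution over the state indices listed in `keep`."""
    out = {}
    for state, p in joint.items():
        key = tuple(state[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def toy_phi(joint, part_a, part_b):
    """H(A) + H(B) - H(A,B): information the whole system carries
    beyond what its two parts carry separately."""
    return (entropy(marginal(joint, part_a))
            + entropy(marginal(joint, part_b))
            - entropy(joint))

# Two bits that are always equal: the whole "knows" a correlation
# that neither part holds on its own.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
# Two independent fair bits: the whole is exactly the sum of its parts.
independent = {s: 0.25 for s in product((0, 1), repeat=2)}

print(toy_phi(correlated, [0], [1]))   # 1.0 bit of "integration"
print(toy_phi(independent, [0], [1]))  # 0.0 bits

Two perfectly correlated bits score one bit of "integration"; two independent bits score zero. That's the basic intuition IIT builds its (much more elaborate) Φ measure on.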
By "functional" I just meant "it works at all". A non-functional brain is one that doesn't have a mind in it.

Don't try to impute any particular cognitive model to me. My position requires only the following three facts:

1) Brains produce minds in the real world.

2) Brains are entirely physical (and thus mechanical) in their operation*.

3) Computers with sufficient memory and processing power can accurately simulate physical objects and their mechanical behaviors.

That's it. That's all I need to prove that it's theoretically possible to emulate minds. You might have to emulate everything in the world with even a tangential physical effect on the brain (which could include the entire universe!), but that's still theoretically possible, and thus it's provable that minds can, in theory, be emulated on a computer.
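
Premise 3 isn't exotic, either; it's the everyday business of computational physics. Here's a throwaway sketch of the idea (a damped spring stepped forward with semi-implicit Euler integration; the specific system, function name, and numbers are just my illustration): given the mechanical rules and the current state, a computer grinds out what happens next. A brain differs only in having enormously more state and far messier rules.

Code:
# Premise 3 in miniature: given mechanistic rules (here F = -k*x - c*v),
# a computer can step a physical system's state forward in time.
# The spring is a stand-in; nothing here is specific to brains.
def simulate_oscillator(x=1.0, v=0.0, k=4.0, c=0.5, m=1.0,
                        dt=0.001, steps=5000):
    """Semi-implicit Euler integration of a damped spring-mass system."""
    for _ in range(steps):
        a = (-k * x - c * v) / m  # Newton's second law for this system
        v += a * dt               # update velocity first (semi-implicit)
        x += v * dt               # then position from the new velocity
    return x, v

print(simulate_oscillator())  # (position, velocity) after 5 simulated seconds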

Proven, that is, unless you can disprove one of the three premises. And note that I don't care how brains create minds. It doesn't matter. I don't care. It's utterly irrelevant. All that matters to me is that brains produce minds at all.



* Actually, I don't even require minds to exist entirely in the physical realm; they just have to exist in some realm that has reasonably consistent mechanistic rules. That isn't much to ask, because nothing can work, or even consistently exist, in a realm without reasonably consistent rules.