05-22-2019, 04:20 PM
Half Man Half Wit
Join Date: Jun 2007
Posts: 6,875
Originally Posted by begbert2
And now, on to attempting to see if HMHW's explanation makes some sense to poor old simple me, and to see if it accomplished the literally impossible and draws a distinction between the function of a physical brain and its functionally exact copy.
The way you frame this already presupposes a particular, speculative, and contentious view of consciousness, known as functionalism. But functionalism, while it has a large following, is hardly the only game in town when it comes to consciousness; even the straightforwardly physicalist account isn't functionalist. So on most current accounts of how consciousness works, the claim that it is simply the function of a physical brain is false.

I think that both you and wolfpup should take a look at Integrated Information Theory, which posits that consciousness is either due to, or constituted by, information distributed among the parts of a system in a certain way. In a simplified sense, integrated information (quantified by a measure called Φ) can be thought of as the amount of information the whole of a system carries over and above that which is locked in its parts. IIT then postulates that this is what consciousness either correlates with or comes down to.
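To make "information the whole has over and above its parts" concrete: IIT's actual measure, Φ, involves a search over system partitions and is expensive to compute, but a simpler, related quantity, total correlation (sum of the parts' entropies minus the whole system's entropy), gives the flavor. The function names below are my own illustration, not IIT proper:

```python
import math
from itertools import product

def entropy(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, i):
    """Marginal distribution of variable i from a joint distribution keyed by tuples."""
    out = {}
    for outcome, p in joint.items():
        out[outcome[i]] = out.get(outcome[i], 0.0) + p
    return out

def total_correlation(joint, n_vars):
    """Sum of marginal entropies minus joint entropy: a crude 'integration' proxy.

    Zero when the parts are independent; positive when the whole carries
    information not present in any part taken alone.
    """
    return sum(entropy(marginal(joint, i)) for i in range(n_vars)) - entropy(joint)

# Two perfectly correlated bits: each part looks like a fair coin (1 bit),
# but the whole only has 1 bit total, so 1 bit is 'shared' across the parts.
correlated = {(0, 0): 0.5, (1, 1): 0.5}
print(total_correlation(correlated, 2))   # 1.0

# Two independent fair bits: the whole is exactly the sum of its parts.
independent = {(a, b): 0.25 for a, b in product([0, 1], repeat=2)}
print(total_correlation(independent, 2))  # 0.0
```

On this crude proxy, a system of independent modules scores zero, which is the intuition behind the claim further down that a modular computer has little to no integrated information.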

Now, I think in doing so, IIT is massively begging the question. But that doesn't matter right now. What's important is that IIT gives a scientifically respectable reductive account of consciousness---it shows exactly how consciousness (or what it purports consciousness to amount to) emerges from simpler parts, and gives us objective criteria for its presence. So, IIT provides the sort of story that's needed to make wolfpup's emergence reasonable.

Furthermore, on IIT, consciousness isn't a functional property of the brain. It's a property relating to the correlations between the brain's parts, or regions. So functionalism isn't really the sort of obvious truth begbert2 seems to take it to be, either.

More interesting, perhaps, is that on IIT, computationalism comes out straightforwardly false. While a brain has a high degree of integrated information, the typical computer, due to its modular architecture, has little to no integrated information. So a computer implementing the simulation of a brain won't give rise to conscious experience, even if the brain it simulates does.

This suffices to falsify many of the seemingly 'obvious' claims in this thread and shows that other options are possible.