SamuelA, 12-30-2017, 03:54 PM
Originally Posted by wolfpup
The operative principle is that some mental processes appear to be computational -- that is, syntactic operations on symbolic representations called propositional representations -- while many others are not. How we process mental images is a classic case where the evidence is at least somewhat contradictory. There is very, very much about how the mind works that we currently don't understand. You, OTOH, are trying to argue that not only are all mental processes computational, but the brain itself is a computer, because ... signals!
Ok, maybe we can finally get some convergence.

I am saying that from observable sub-processes in the brain (those signals), we can show that the physical matter is performing something similar enough to computation as we understand it that we can mimic it.

And we know that if we have a black box we don't understand that emits signals, and we respond with signals close enough to the correct ones that [B]physical reality[/B] provides no reliable way of distinguishing them from the correct signals (because the ones we send are accurate to within the threshold allowed by noise), then we have replaced the black box with a box we do understand.
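To make that noise-threshold point concrete, here's a toy sketch (all the numbers are made up, and the "box" is just a linear function, not a model of anything biological): when the emulator's deviation sits well below the channel's noise floor, the real box's outputs and the emulator's outputs are statistically indistinguishable.

```python
import random

random.seed(0)

NOISE_STD = 0.1         # assumed noise floor of the channel (made-up value)
EMULATION_ERROR = 0.01  # emulator's deviation, well under the noise floor (made-up value)

def true_box(x):
    """The 'black box' we don't understand: some fixed response plus channel noise."""
    return 2.0 * x + 1.0 + random.gauss(0, NOISE_STD)

def emulator(x):
    """Our replacement: correct to within EMULATION_ERROR, plus the same channel noise."""
    return 2.0 * x + 1.0 + random.gauss(0, EMULATION_ERROR) + random.gauss(0, NOISE_STD)

# Compare the two over many trials: the emulator's extra deviation is
# swamped by the channel noise, so the output distributions overlap.
xs = [i / 100 for i in range(1000)]
a = [true_box(x) for x in xs]
b = [emulator(x) for x in xs]
mean_gap = abs(sum(a) / len(a) - sum(b) / len(b))
print(f"mean gap between boxes: {mean_gap:.4f} (noise std: {NOISE_STD})")
```

The point of the sketch is just the inequality: as long as the emulation error is buried under the channel noise, no receiver on the other end has a reliable way to tell the boxes apart.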

And if you can do that, you can get brain-equivalent outputs from a machine that isn't a brain, making what the brain does functionally the same as computation.

So yeah, the signals argument is crucially important, and it's also obviously correct.

You would have to discover a method of processing that a synapse does that produces output pulses you can't reliably emulate with a digital machine in order to disprove it.
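As a sketch of what "reliably emulate with a digital machine" means here, take a toy synapse model -- a bare exponential decay of the postsynaptic response, which is an assumption for illustration, not real synaptic dynamics -- and step it digitally with simple Euler updates. The discretization error lands orders of magnitude below any plausible noise floor:

```python
import math

# Toy model (assumption): a synapse's response to a spike decays
# exponentially, v(t) = v0 * exp(-t / tau). A digital machine emulates
# it by stepping the same dynamics at a finite timestep dt.
TAU = 5.0    # decay constant in ms (made-up value)
V0 = 1.0     # initial response amplitude
DT = 0.01    # emulator timestep in ms

def analytic(t):
    """The 'true' continuous-time response."""
    return V0 * math.exp(-t / TAU)

def digital_emulation(t_end):
    """Euler-step the same decay, the way a crude digital emulator would."""
    v, t = V0, 0.0
    while t < t_end - 1e-12:
        v *= 1.0 - DT / TAU   # first-order update: dv = -(v / tau) * dt
        t += DT
    return v

t_end = 20.0
err = abs(digital_emulation(t_end) - analytic(t_end))
print(f"emulation error after {t_end} ms: {err:.2e}")
```

Even this crudest-possible integrator is off by less than 1e-4 after 2,000 steps; shrink the timestep and the error shrinks with it, which is the sense in which digital emulation to within the noise threshold is the default expectation, not a miracle.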

As for the higher-level stuff - again, if you built a computer system using neural networks that was even 1% as complex as the brain, with a self-modifying architecture and all kinds of crazy deep connections between layers - you'd probably also notice strange outputs that are hard to correlate to any model of computation you understand.

Even trivial neural networks can easily become a black box to humans.
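A concrete illustration of that: the classic two-hidden-neuron XOR network below computes XOR exactly, yet no individual weight "means" XOR - the function only exists in the ensemble. (The weights here are hand-picked for clarity; a trained network would land on an equally opaque set of numbers.)

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hand-built two-layer network computing XOR. Stare at any single number
# below and it tells you nothing about what the network does as a whole.
W1 = [[20, 20], [-20, -20]]   # hidden-layer weights
B1 = [-10, 30]                # hidden-layer biases
W2 = [20, 20]                 # output-layer weights
B2 = -30                      # output bias

def net(a, b):
    h = [sigmoid(W1[i][0] * a + W1[i][1] * b + B1[i]) for i in range(2)]
    return sigmoid(W2[0] * h[0] + W2[1] * h[1] + B2)

# Prints the XOR truth table: 0, 1, 1, 0
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(net(a, b)))
```

If four weights and two biases are already this opaque, a self-modifying system with billions of them is a black box almost by definition.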

Anyways, instead of just repeating over and over that "signals" isn't a valid argument, think about it. Mentally isolate a single synapse. What if you were emulating that synapse badly? How badly can you emulate it before the receiver on the other end can tell you're "different" than before? If the environment had no noise, any deviation could be detected. But what if all the signals you send and receive are garbled by noise anyway?
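That detectability question has a standard back-of-envelope answer from statistics (nothing synapse-specific about it): to detect a mean deviation delta against Gaussian noise of standard deviation sigma, a receiver needs on the order of (z * sigma / delta)**2 samples, where z sets the confidence level.

```python
def samples_to_detect(delta, sigma, z=2.0):
    """Rough sample count needed to detect a mean shift `delta` buried in
    Gaussian noise of std `sigma`, at confidence set by `z` (z ~ 2 for ~95%)."""
    if delta == 0:
        return float("inf")   # a perfect emulation is never detectable
    return (z * sigma / delta) ** 2

# Noise-free channel: any deviation at all is immediately detectable
# (no averaging needed).
print(samples_to_detect(delta=0.001, sigma=0.0))   # -> 0.0

# Noisy channel: a deviation ten times smaller than the noise takes
# on the order of 400 samples to pin down.
print(samples_to_detect(delta=0.1, sigma=1.0))     # -> 400.0
```

So the noise in the channel directly buys the emulator slack: the smaller the emulation error relative to the noise, the more samples the receiver would need to notice, and below some threshold it effectively never can.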

And if you can subdivide the brain into trillions of tiny black boxes, one around each axon, and mentally swap those boxes for equivalent boxes, why would you not get the same outcome when you look at how the visual cortex processes things? What principle of physical reality allows the outcome to be different?

Last edited by SamuelA; 12-30-2017 at 03:58 PM.