Is the human brain the most complex object in the universe?

When I said A and B are identical after A transmitted information and altered B to match A’s state, you responded with “suppose B sees blue”.

To me that sounded like you thought A and B could have identical states while A sees red and B sees blue.

Was that not a fair reading of your statement?

  1. I don’t think I’ve ignored anything; I’ve responded to your points and offered my position, as well as my thought process, in good faith.

  2. You have claimed it’s tautological, but to get me to agree with you, you would need to respond to my objections in a way that convinces me (look, I don’t play games and if you are right that it’s tautological I’m ok with that, but it’s not clear that you are correct).

  3. They aren’t physically the same brain; that’s not helpful. Are two computers that are basically the same in every respect the same physical computer? Nope. And for communication to occur between computers, they must be speaking substantially the same language.

  4. I think there are valid points in the rest of my post worth responding to.

I think I see notbatman’s point about two brains. If they are two unaltered instances of the same brain, then obviously they would function identically, and that answers no questions about the nature of qualia. Since nobody questions that qualia remain consistent within one brain (so far), two different instances would be assumed to start with equivalent qualia that would not be affected by new experiences.

However, that is the point that I would make, and I think Raft is making. Different brains have different qualia, and they can’t communicate those things in absolute detail because they are different. Even the most minute detail extracted from an analysis of the physical structure of a brain doesn’t enable a different brain to experience the qualia. It can only examine the structure of the other brain and abstract its functionality, not process it directly.

Our brains are just not programmable in that sense. We can’t see UV or IR light; even understanding them, we never receive a stimulus response to them through the eyes and can’t construct qualia for them.

I understand you’re looking for an experiment or formal logic that proves this effect, but I don’t know how to provide that. My approach is to look at how qualia would be simulated by a machine, and it doesn’t look that difficult. When the difficult parts of intelligence are added, we would have a means of comparison between mind and machine to resolve some of the details.

This is where I was going to go next.

If we imagine our brains had a physical structure that could take words as input and use those words to alter the connections and weights of various neurons to arrive at the same structure as the sending brain, then it seems we all agree the receiving brain would be able to reproduce the same experience as the sending brain.
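To make the thought experiment concrete, here is a minimal toy sketch (purely my own illustration; all the names and numbers are made up): each “brain” is just a small list of connection weights, the transmitted “words” carry the sender’s weights, and the receiver rewires itself to match, after which both respond identically to the same stimulus.

```python
# Toy model only (illustration, not a claim about real brains): each "brain" is a
# list of connection weights; the message carries the sender's weights, and the
# receiver restructures itself to match them.
sender_weights = [0.2, -0.7, 1.3]
receiver_weights = [0.9, 0.1, -0.4]

def respond(weights, stimulus):
    # A stand-in for "having the experience": a simple weighted response.
    return sum(w * s for w, s in zip(weights, stimulus))

message = list(sender_weights)     # the "words" describing the sender's structure
receiver_weights = message         # the receiver alters its connections to match

stimulus = [1.0, 0.5, -1.0]
print(respond(sender_weights, stimulus) == respond(receiver_weights, stimulus))  # True
```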

So the next question is: is that “communicating the subjective experience”?

  1. Clearly we have communicated something (with words in this case)
  2. And clearly the subjective experience ended up being the same between sender and receiver

But maybe “communicating the subjective experience” requires something more than points #1 and #2?

I think this question gets to the heart of the matter and needs to be examined.

One simple way to convey the ‘colors’ of IR and UV could be to encapsulate a deep red and a bright violet (respectively, of course) with a thin but perceptible black border.
We are accustomed to accepting comics as metaphors and comedy. That kind of neural translation seems to be an inbuilt quality of humans. So, use it.

I’d say it was communicating the means of reproducing the subjective experience. I’m not sure that’s the same thing, though (I hadn’t even considered it). Since I suspect that qualia are not all that separate from other experiences, the result in the receiving brain may still be different unless we communicate the complete brain, and we end up in the ‘identical brain’ scenario again.

Through all of this I’m not sure what the mystery is in not being able to duplicate subjective experiences in a different subject. It’s what I would expect.

The more I think about this the more I am convinced that a clear definition of communication is required to even continue discussing this.

We like to say that various organisms or parts of organisms “communicate” via chemical signals.

But in reality, all that is happening is the transmission of something physical that causes the receiver to be altered in a manner that (possibly) aids one or both organisms in whatever programmed goal/outcome they are attempting to achieve (between neurons, or between bacteria, or between immune system cells, etc.).

This is no different than the transmission of instructions to a brain such that it alters its internal structure to match what the sender “wanted” it to match.

It’s not at all obvious to me how these things are categorically different or how they are different from transmitting the word “bird” and triggering a response in the receiver - I would be interested in hearing why someone thinks they are different.

I’m embarrassed to say, I don’t know the formal definition from Information Theory.

Information is basically the order or arrangement of matter or energy. It’s all involved with entropy, and with randomness. A random signal has no information in it. However, in the wonderful paradox of the field, a highly ordered and predictable signal – like a binary string of all 1’s – also has very little information in it. It can’t tell you anything new, since you know, just from looking at it, what the next bit will be. (“I’ll bet it’s a ‘1.’”)
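To put a number on that last point, here is a minimal sketch (just an illustration, using the standard empirical Shannon entropy of a string’s symbol frequencies): the all-1’s string scores zero bits per symbol - nothing new in the next bit - while a string whose next bit genuinely isn’t predictable carries a full bit per symbol.

```python
# Illustration only: empirical Shannon entropy (bits per symbol) of a bit string,
# computed from the observed frequencies of '0' and '1'.
from collections import Counter
from math import log2

def entropy(bits: str) -> float:
    counts = Counter(bits)
    n = len(bits)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("1111111111"))   # 0.0 -- perfectly predictable, nothing new in the next bit
print(entropy("1100101001"))   # 1.0 -- an even mix of 0's and 1's, one bit per symbol
```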

One of the very key points in information theory is that, while matter and energy can’t really be destroyed (although they can be converted into each other), information can be destroyed. I wrote a poem yesterday. It was so bad, I deleted it. There is no power in the cosmos that can restore it…

(…and the cosmos is duly thankful!)

No, not at all. You may want to actually read the post to which you refer. In the post of yours I responded to (quoted in the above link), you do not say that A and B are identical. You say, and I quote: “similar but not identical.” I then make an argument that your logic in that post is tautological. I never felt you addressed that argument, though ironically you did bring up a special case within its domain of applicability: A and B being identical. I again (actually at least three times) explained how this special case is tautological (TriPolar seems to get it; perhaps engage him if you still don’t). The fact that you continue to evade this point (in the above post by questioning whether my statement “I never said otherwise” was accurate, rather than simply and finally admitting the tautological logic I’ve tirelessly pointed to) not only makes this correspondence unnecessarily and unfairly tedious, but makes it very hard for me to take seriously your claims to be responding in good faith.

Look, this is ridiculous. I’m losing patience here, and think I’ll give up unless you at least go back to that post and critically reexamine what you said and what I responded to. And at least admit the tautology so we can move forward.

Say “bird” and the other person can say “yes bird: a warm-blooded egg-laying vertebrate distinguished by the possession of feathers, wings, and a beak and (typically) by being able to fly.”

Say “red” and the other person can say “660nm wavelength light.”

Say “qualia of red” and the other person can only say “yeah, uh, I think I know what you mean, except what I call red could really be what you call blue. I guess we have no way of communicating which qualia you are really seeing when you see 660nm light.”

  1. This is my statement 2 posts prior describing the conditions I think are required to make this kind of information communicable:
    “If the receiver had the same (as in exact) internal structures and we communicated the information properly, like direct stimulation of neurons - then yes it seems communicable.”

  2. This was my statement about our current reality (and why it’s tough to communicate some things because we don’t speak the same language):
    “And every single one of us has a unique language (internal brain structure/states) - similar but not identical”

  3. In the same post, again these are the conditions I think are required to communicate this type of information:
    “in a receiver that spoke the same language” (meaning identical structure, or ability to modify structure to reach the desired state)
    It’s not uncommon to have miscommunication, but saying “to actually read…” really isn’t required.

And I’ve stated multiple times that your point about it being tautological is far from made and you keep ignoring my points surrounding this issue.

This is what it looks like from my perspective:
you: it’s tautological
me: but X
you: [ignoring X] it’s tautological
me: but Y
you: [ignoring X+Y] it’s tautological and now I’m getting frustrated you don’t agree

I’m not going to admit something I don’t see or agree with. I’m now questioning whether there is even a good definition for communication - I’ve googled and I’m not finding one.

Why don’t you answer these questions - it will help us determine where we differ in our opinions about communication (this is the only way to proceed, to establish where we agree and disagree and why and move on from there):
If the receiver has an internal structure that allows it to modify itself at will, and the sender transmits words that the receiver uses to modify itself to match the state of the sender and thus the experience - do you think that this is again a tautological example of communication? Trivial and worthless?

If so, do you believe for there to be communication that the sender and receiver must be able to start and end with different internal structures, yet have, in some abstract mathematical sense, some form of equivalency between the key areas of the brain involved in storing this information/experience?

Is it fair to compare this to phobias? There seems, per some researchers, to be a physiological basis for some phobias: the patient’s actual physical brain has differences. So, you see a spider, and say, “My, my, a spider.” But I freak out, screaming, frothing at the mouth, and see Shelob.

I can communicate this, to a degree. And, of course, you can observe my behavioral reaction. But, ultimately, you won’t “see” spiders the way I do (and you ought to be very grateful for that small blessing!)

Okay, so what are X and Y here? I think notbatman is saying anything based on ‘identical’ brains will be tautological, and you’re saying it’s not based on ‘identical’ brains because of X and Y. So if it is identical brains, it’s a hidden local variable if they perceive identical qualia in response to identical stimulus, not spooky action at a distance. No real communication is necessary for them to perceive identical qualia. But if the brains are not identical, some communication must take place to get agreement on the perception of qualia.

I’m not sure that ‘communicate’ makes that much difference. Couldn’t we just say communication (in this case) is the transmission of information from one brain to another via a common coding system (language)? The more general concept of communicating qualia concerns me now. What does it mean to communicate qualia? Must the destination actually duplicate the experience of the source? Or is it sufficient to understand the subjective experience in some manner, like a cue which tells me if the ‘red’ I see is actually the ‘blue’ you see? Like you, I think there’s a structure that determines what we ‘see’.

Much of my position comes down to what mechanisms are available to humans to read their internal state and to receive information and modify their internal state. We have built in limits to our capabilities in these areas.

If we are simply lacking the capabilities due to our path through evolution, that’s a very different issue than saying something can’t be communicated in theory.

I agree, these are good questions and the type of thing I was trying to get at in the second part of my last set of questions.

Just to answer this more clearly:

If we had a mechanism that allowed us to restructure our brain at will to match the desired state based on incoming words, that seems like communication to me.

What I think I hear you and iamnotbatman saying is that if the only way to communicate experience is to end up in the exact same state, then we can’t really call that communication; or, alternatively, that it’s such a special case that, if we can’t find some other case in which identical structure is not required, we might as well just consider it not communicable.

This is what I would say to that position:

  1. Even if it’s a special case, it still seems like communication in practice. That’s why I was asking for formal definitions of communication that would show I’m making an error.

  2. Once we have established the extreme, that it can be communicated with a special case, the next step is to consider what conditions would allow us to ease the restrictions. Are there other configurations of neurons that would result in the same higher-level experience?

It would seem the answer is “yes” based on my experiences with dreams. I have substantially the same level of experience as when the real life stimulus is presented.

The next argument is that, even though there is more than one configuration of neurons that can result in these same experiences, it is still contained within the same brain, so we haven’t made much progress. But if we know there are two different configurations of neurons that can result in the same experience (the awake configuration receiving external stimulus, and the asleep configuration receiving only internal signals), it’s an easier step to imagine that there are more than two, and that a receiving brain could experience the same thing - if we had the proper mechanisms to make it happen.

A possible objection to all of this is that it relies on previous experience. But, as I stated with the alien, all communication relies on some shared knowledge.

As I pointed out to iamnotbatman previously and didn’t see a response to it - if the alien has no shared knowledge, we could send 1’s and 0’s all day long and he would never, ever understand what we are trying to communicate.

Claude Shannon made the claim that information reduces uncertainty.

I think consciousness is the brain’s awareness of consciousness.

The link doesn’t seem to work… I Googled about, and apparently there is a field of information theory called “uncertainty reduction theory.” Can you point to a simplified intro? (God, I miss Isaac Asimov!)

It’s hard to disagree with that, but, as a dictionary definition, it has one little problem…

this conversation is silly. are you people really trying to say a machine or contraption is more complex than the brain which conceived and built it…?

brains win, they can think up then create contraptions of infinite complexity. the first time a contraption makes a brain, we’ll talk…til then…how on earth is there a debate…?

I already addressed this in this post. You responded with your current question, which is just infuriating because it is already answered if you read critically the post you responded to. Of course if there is no shared knowledge then they can’t communicate. As I said in that post, “What is needed is a foundational shared vocabulary, but one that does not necessarily overlap with the information you are trying to convey”…

“Two posts prior” is not the post I responded to (as could be seen from the quoted text in my post)!

In any case when you say “If the receiver had the same (as in exact) internal structures” you are admitting a scenario in which you are attempting to describe communication between a brain and itself, between two objects that are already the same. You keep saying you have given a counterargument that somehow renders this problematic fact acceptable. I can’t find it.

I don’t disagree with the above statement…

See, you keep changing what you are saying, and appending it to what you said you meant in your previous post. I may have just missed it, but I never saw this “modify structure to reach the desired identical state” idea put forward until much more recent posts. I’ll address it below.

The fact that your above response starts with “This is my statement 2 posts prior,” rather than addressing what my own post was directly responding to, is not exactly inspiring my confidence in your good intentions, and is part of what I perceive of as a larger pattern. But I’ll try to put all that aside and have a last go at this.

Then please stop saying “I disagree” and please explain what is wrong about my argument. Your responses in the past have been vague and indirect and seem to slightly shift the argument each time, an example of which is your response to the above post:

When we communicate any idea with words it’s an approximation that really isn’t completely communicating what we think it is.

I’m not at all sure how this addresses my argument, since I wasn’t even describing a scenario involving verbal communication at all. The argument involved direct stimulation of the brain.

I’m still unclear on what your “X” and “Y” are.

Well, information theory is not trivial, but I think we can define communication as the transmittal of information. I think it’s totally overboard and distracting here to try to apply a technical definition of information. Why can’t we just use simple examples? If A and B have a shared vocabulary of arithmetic, but A has information that the number 6 is non-prime and B does not, can we agree that if A tells B that 6=2x3 then information has been transmitted, information that was not present already in B? This is distinct, as I have explained many times by now, from the situation with qualia, whereby a scenario analogous to the above is fundamentally impossible, because there is no shared vocabulary out of which qualia can be constructed that is not already maximally overlapping (ie tautological). For example, you can’t start out with the shared vocabulary of qualia “green” and “blue” and use that shared vocabulary to describe to someone what qualia “red” is. Again, compare this to other simple examples using numbers or nouns, that can be communicated in reality using a shared vocabulary that does not already tautologically include the information you are trying to transmit.
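To make that contrast as concrete as possible, here is a toy sketch (purely illustrative; the names are made up): B’s knowledge is a bare set of facts, A’s message is built entirely out of a shared arithmetic vocabulary, and after receiving it B holds a fact it did not hold before. That last step is exactly what has no analogue for qualia, since there is no non-overlapping shared vocabulary to build the message out of.

```python
# Toy illustration only (made-up names): A transmits a fact to B using a shared
# arithmetic vocabulary, and afterwards B holds information it did not have before.
shared_vocabulary = {"6", "2", "3", "equals", "times"}   # arithmetic only

message = "6 equals 2 times 3"
assert all(word in shared_vocabulary for word in message.split())

b_knows = {"6 is a number"}            # before: B does not know whether 6 is prime

def receive(knowledge, msg):
    # B decodes the message and derives a fact that was not already in its knowledge.
    if msg == "6 equals 2 times 3":
        knowledge.add("6 is non-prime")
    return knowledge

b_knows = receive(b_knows, message)
print("6 is non-prime" in b_knows)     # True: information was transmitted to B
```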

It depends on whether “match the state of the sender” means “the receiver and transmitter become equivalent” (i.e. the entire brain states become identical), or whether you mean some region of the receiver’s brain matches the state of some region of the sender’s brain. Suppose you are able to do something that causes the brain state of the receiver to modify itself to match the state of the sender. If the sender and receiver are now equivalent, then all the sender has done is to create a duplicate of himself, which I would argue is indeed tautological, because you haven’t transmitted information to B (B is no longer B, but A), but rather duplicated your own subjective experience. If instead only some region of the receiver’s brain is made to match some region of the sender’s brain, then my earlier argument applies.

But stepping back, even though I think it is tautological to speak in terms of communication between brain states made entirely physically identical, I do think I should take back my earlier agreement that if A and B are entirely physically identical, then their qualia must necessarily be the same. I think I was momentarily overwhelmed by my physicalist bias. The point of qualia is that they cannot be explained through physicalism, and that, even in the extreme case of two identical brain states, there is no way of determining that the qualia experienced are the same. Again, this is in contrast to the case of non-qualia information, which in two identical brain states can easily be confirmed to have been made consistent (I hope you have plenty of examples by now, such as those with the bird or the prime number or the wavelength of light).

Yes, there has to be some difference in order to avoid begging the question. I think the best way to approach the communication question is through simple real world examples of the exchange of information that have practical consequences whereby the two parties can validate the consistency of the transmitted and received information. It is simple to construct thought experiments as I have done that show that it is impossible for qualia to be communicated in this sense.

As a starting point, would you at least be willing to admit that qualia have the special property that, if they were to be communicable, it would require this special scenario of exactly reproducing the brain state of the transmitter?