Straight Dope Message Board (https://boards.straightdope.com/sdmb/index.php)
-   Great Debates (https://boards.straightdope.com/sdmb/forumdisplay.php?f=7)
-   -   Downloading Your Consciousness Just Before Death. (https://boards.straightdope.com/sdmb/showthread.php?t=875593)

Jim B. 05-15-2019 05:12 PM

Downloading Your Consciousness Just Before Death.
 
This very question was brought up in Star Trek: The Next Generation's "The Schizoid Man." But this is not a Star Trek question.

What if, just before you died, you downloaded your consciousness into a computer? Would that stored data be you? And could it literally give you immortality?

I also put this in GD because I assume it's up for debate.

Thank you in advance for your kind replies. :)

:):):)

Buck Godot 05-15-2019 05:36 PM

Need answer fast?

This really gets down to the fundamental definition of what consciousness is, which is still a very open philosophical question. So what you get will probably be a whole lot of IMHO. Those who believe in a soul would say that it would clearly not be you. I'm a bit more of a materialist, so I would say that if the data and associated software accurately mimicked your mental processes, including internal states, then that would be you.

However, I could also imagine a sort of Chinese room type simulation of my mind which externally behaves like me, but which arrives at that behavior in a totally different way. Some may say that such a simulation is effectively identical to me to all outside observers and should therefore be considered to be me. But I would argue that the internal thoughts that are not outwardly observable are what truly represent consciousness, and so a machine that failed to include those doesn't actually represent consciousness. I therefore reject the Turing test as a sufficient definition of artificial intelligence.

A question you didn't ask, but which automatically follows, is what happens if you download your consciousness long before you die, or even download multiple copies of your consciousness. This idea is explored about four seasons later in "Second Chances", with the answer being that both are you. However, as each has different experiences, they diverge into distinct individuals, but neither is more you than the other.

Wesley Clark 05-15-2019 05:55 PM

https://www.youtube.com/watch?v=IFe9wiDfb0E

Chronos 05-15-2019 06:20 PM

The person in the computer after upload is not the same person as you right now. But by the same token, you right now are not the same person as you yesterday, or even you five seconds ago. We say that you right now and all of the yous of the past are the same person only because the you of right now has the memories of the yous of the past. And by the same token, the person in the computer will have the memories of you right now.

begbert2 05-15-2019 07:02 PM

The uploaded consciousness would be a copy, but that wouldn't bother it any. It'd feel like it was me and carry on accordingly - even though it would know better!

panache45 05-16-2019 12:10 AM

Unfortunately, the consciousness would be JUST a consciousness, not a human being with senses. So strictly speaking, it could not be conscious of anything except itself. Eternal introspection, totally cut off from the external world. Schizophrenia... insanity.

Mijin 05-16-2019 01:37 AM

Jim B, this question is basically a specific phrasing of the general philosophical problem of Personal Identity.

This is one of the biggest, most-debated issues in philosophy. There is no clear answer at this time, although what normally happens is half the responses in a thread like this will be "Obviously the mind has been downloaded and anyone that thinks otherwise must think there is some magical soul or something" and the other half will be sure that "Obviously minds cannot be "moved", and anyone that thinks otherwise must think there is some magical soul or something".

Alessan 05-16-2019 02:15 AM

Quote:

Originally Posted by Mijin (Post 21645121)
Jim B, this question is basically a specific phrasing of the general philosophical problem of Personal Identity.

This is one of the biggest, most-debated issues in philosophy. There is no clear answer at this time, although what normally happens is half the responses in a thread like this will be "Obviously the mind has been downloaded and anyone that thinks otherwise must think there is some magical soul or something" and the other half will be sure that "Obviously minds cannot be "moved", and anyone that thinks otherwise must think there is some magical soul or something".

Not true - plenty of people will say, "Prove there isn't a magical soul or something."

Mijin 05-16-2019 04:12 AM

That is true.

It's just interesting to me that on this topic (and similar ones, like the transporter), you end up with so many people sure there is only one common-sense, scientific interpretation and anything else is magical thinking, yet falling on one of two sides as to which interpretation that is. It's the blue-and-black versus white-and-gold dress all over again :)

But yes there are also people who still believe in the soul, even though that is a position that scientifically doesn't hold up at all.

Fear Itself 05-16-2019 06:59 AM

Since the whole notion of downloading one's mind into a computer exists only in the imagination, can't we just imagine the results to be whatever we wish also? The mind does not exist in any sort of machine-readable format, so copying it to a digital medium such as a hard drive is just a philosophical exercise. If I want to imagine my soul or identity exists on a computer, then it does. It is as real as the conversion of the mind to computer data is, which is to say, not real at all.

eburacum45 05-16-2019 07:03 AM

Quote:

Originally Posted by panache45 (Post 21645069)
Unfortunately, the consciousness would be JUST a consciousness, not a human being with senses. So strictly speaking, it could not be conscious of anything except itself. Eternal introspection, totally cut off from the external world. Schizophrenia... insanity.

So you hook the 'consciousness' up to a set of simulated inputs and outputs. I expect we will have artificial eyes, ears and haptic sensors, artificial voice generators and remotely controlled limbs, long before we ever develop any kind of uploading/downloading (if we ever do).

eburacum45 05-16-2019 07:12 AM

Quote:

Originally Posted by Mijin (Post 21645121)
There is no clear answer at this time, although what normally happens is half the responses in a thread like this will be "Obviously the mind has been downloaded and anyone that thinks otherwise must think there is some magical soul or something" and the other half will be sure that "Obviously minds cannot be "moved", and anyone that thinks otherwise must think there is some magical soul or something".

When and if the technology for mind uploading becomes available, I think there will still be two opinions on this matter; but what will happen is that most, if not all of the people who get uploaded will be people who believe that they are being magically preserved in some way, whereas the ones who do not get uploaded will think the opposite. Since the uploaded 'consciousnesses' are effectively immortal, these will eventually outnumber the others by many orders of magnitude, and the 'minds cannot be moved' camp will become a small minority.

Even if they are right.

BwanaBob 05-16-2019 07:13 AM

The only way I'd even consider transference to another vessel, preferably a body, would be if the procedure were continuous, in the sense that as the transfer took place I would experience existence slowly shifting from Body A, to an amalgam of the two viewpoints, to Body B, with no discontinuity of consciousness at any point, and if the process were reversible at any point.

eburacum45 05-16-2019 07:14 AM

If that were possible, why not stick with the halfway stage and have two viewpoints? Sounds like fun.

Chronos 05-16-2019 07:19 AM

And then you have some people who think they have the answer, but don't think that it's at all obvious or intuitive.

Chronos 05-16-2019 07:20 AM

BwanaBob, you already lack that continuity. Does going to sleep every night terrify you?

begbert2 05-16-2019 12:39 PM

If you write a message on a piece of paper, then stick that paper in a copy machine and run off five copies, are the copies the same message?

Half Man Half Wit 05-16-2019 01:06 PM

Consciousness can't be downloaded into a computer, for the simple reason that computation is an act of interpretation, which itself depends on a mind doing the interpreting. Thinking one could download consciousness is the same category error as thinking the sentence 'it is raining' is the same sort of thing as it actually raining. But the former is merely a symbolic vessel, filled with content by an intentional mind; while the latter is water falling from the sky.

It's easy to see that a system only computes if it is properly interpreted. It's only our use of 'transparent' symbols that makes us think that what a computer computes is an inherent feature of the computer, when in truth, it's no more inherent to it than it's inherent to the word 'rain' to mean 'water falling from the sky'.

Consider a computer with less obviously transparent symbols. Say you find a device that's constructed as follows: it has four switches, and three lights. If you flip the switches, the lights come on. In your experimentation, you find that there are certain rules according to which these lights light up. If you consider the switches in groups of two, and consider a switch being 'up' to mean '1', while 'down' means '0', and furthermore, consider each light to mean '1' when it's lit, and '0' when it isn't, you can use the device to add binary numbers (of a value up to three).

Now, suppose somebody else examines that same system. They might well come up with an entirely different interpretation: they could, for example, consider 'switch down' to mean '1', and 'light out' likewise. Then, to them, the system would compute an entirely different function of binary numbers.

Many more interpretations are possible. You could take 'switch up' and 'light out' to mean '1'. You could interpret them as bits of different significance---say, you're used to reading Hebrew, and thus consider the rightmost light to map to 2^2, the middle one to give 2^1, and the leftmost one to yield 2^0.

And so on. Each change of interpretation will result in the device computing a perfectly sensible binary function; each person with a different interpretation could use it as a computer to compute that function.
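
To make this concrete, here's a rough sketch in Python (just an illustration, with made-up helper names like physical_device and read), assuming the box's fixed physical behavior is the one that reads as addition under the 'up = 1, lit = 1, leftmost light most significant' convention. The same physical table then supports both readings:

Code:

from itertools import product

# Fixed physical behavior: four switches ('up'/'down') in, three lights ('on'/'off') out.
# The behavior is chosen so that, read as up = 1, lit = 1, leftmost light most
# significant, the box adds two 2-bit numbers.
def physical_device(switches):
    a = 2 * (switches[0] == 'up') + (switches[1] == 'up')
    b = 2 * (switches[2] == 'up') + (switches[3] == 'up')
    total = a + b  # 0..6, fits in three lights
    return tuple('on' if (total >> k) & 1 else 'off' for k in (2, 1, 0))

# One observer's reading: which switch position and which light state count as '1'.
def read(switches, lights, one_switch, one_light):
    a = 2 * (switches[0] == one_switch) + (switches[1] == one_switch)
    b = 2 * (switches[2] == one_switch) + (switches[3] == one_switch)
    out = sum((lights[i] == one_light) << (2 - i) for i in range(3))
    return a, b, out

for switches in product(('up', 'down'), repeat=4):
    lights = physical_device(switches)
    a1, b1, o1 = read(switches, lights, one_switch='up', one_light='on')
    a2, b2, o2 = read(switches, lights, one_switch='down', one_light='off')
    assert o1 == a1 + b1      # first reading: the box computes a + b
    assert o2 == a2 + b2 + 1  # second reading: the very same box computes a + b + 1

print("Same switches, same lights, different functions, depending on the reader.")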

Thus, what computation a system performs is not inherent to that system, but is, exactly like what message a text conveys, a matter of interpretation. But if that's so, then computation can't be what underlies consciousness: if there's no fact of the matter regarding what mind a given system computes unless it is interpreted as implementing the right computation, then whatever does that interpreting can't itself be computational, as otherwise, we would have a vicious regress---needing ever higher-level interpretational agencies to fix the computation at the lower level. But if minds then have the capacity to interpret things (as they seem to), they have a capacity that can't be realized via computation, and thus are, on the whole, not computational entities.

Typo Negative 05-16-2019 01:15 PM

This is actually a plot point from The Prestige.

SPOILER:
The machine transports a copy to a nearby location, in this case a copy of the magician. When the copy is transported, both the magician and the copy exist at the same time (with the same experience, memory, knowledge). In the first test, the magician immediately shot and killed the copy.

When he designed the illusion, the machine transported a copy while the man on stage fell through the trap door into the tank of water and drowned. Then the man appeared away from the stage, to the delight of the crowd.

But the magician failed to realize that HE was the man going into the tank to be drowned. It was the copy that reappeared to delight the crowd. THAT copy would be drowned in the following performance.


The question is: are you the magician, or the copy?

If your consciousness is downloaded elsewhere, something may live believing it is you. But you will not be there.

CalMeacham 05-16-2019 01:26 PM

Quote:

Originally Posted by BwanaBob (Post 21645334)
The only way I'd even consider transference to another vessel, preferably a body, would be if the procedure were continuous, in the sense that as the transfer took place I would experience existence slowly shifting from Body A, to an amalgam of the two viewpoints, to Body B, with no discontinuity of consciousness at any point, and if the process were reversible at any point.

"You're me!" Roger said, looking at the other robot body, identical to his own, that clearly contained another download of his consciousness.
"No," the Other replied, "I am Roger downloaded from a year later. My knowledge extends beyond yours. There's already the stored backup, so I'm afraid that YOU'RE superfluous."
The Other pulled out a Disruptor gun, which would erase all of Roger's memories, and aimed it.
"Stop!" Roger said, but someone else was saying it at the same time.
Out stepped yet ANOTHER duplicate of Roger, with its own Disruptor.
"I am Roger downloaded from a year after you, and I have a more recent backup. I'm afraid you're BOTH superfluous."

"Wait!" shouted three voices....

Horatius 05-16-2019 01:28 PM

Quote:

Originally Posted by Mijin (Post 21645121)
"Obviously the mind has been downloaded and anyone that thinks otherwise must think there is some magical soul or something" and the other half will be sure that "Obviously minds cannot be "moved", and anyone that thinks otherwise must think there is some magical soul or something".


That's really the only reason I bother reading threads like this: to watch each side accuse the other of being the ones who believe in magic/the soul/god/whatever.



Quote:

Originally Posted by eburacum45 (Post 21645332)
When and if the technology for mind uploading becomes available, I think there will still be two opinions on this matter; but what will happen is that most, if not all of the people who get uploaded will be people who believe that they are being magically preserved in some way, whereas the ones who do not get uploaded will think the opposite. Since the uploaded 'consciousnesses' are effectively immortal, these will eventually outnumber the others by many orders of magnitude, and the 'minds cannot be moved' camp will become a small minority.

Even if they are right.


How about people like me? I don't think the copy will be the same "me" that's sitting here typing, but I'd undergo the procedure because I know the copy will enjoy everlasting(ish) life, since I know I'd enjoy it if it were possible to live forever.

It's akin to people wanting to have kids so that "some part of them lives on". In fact, I expect that will be the most likely outcome if this ever really happens: legally, the copy will be my child, and stands to inherit my estate, as any biological child would. This also explains why it would usually be done at the end of life. I wouldn't want to give that brat half my stuff right now, but if I were near to death, yeah, why not? It all goes to taxes otherwise.

Dallas Jones 05-16-2019 01:45 PM

Consciousness, personality, identity, self, call it what you will, is a chemical and slightly electrical phenomenon that does not translate into the digital world of IF>THEN. There could be, and already are, algorithms that can mimic human responses, but they are not actual thought. Not actual Self. How would you be able to translate chemical action into digital, storable personality?

Even our memories may not be what actually happened. Each time a memory is recalled, it overwrites the stored memory one more time. That favorite green truck toy you had when you were 5 years old may not have been green at all. You have remembered it as green so many times since you were 5 that for all purposes it is now green, even though it may never have been. Unless we are postulating an organic, chemical, emotional storage system, there is no way to store a human mind.

The storage and response systems are completely different.

That Don Guy 05-16-2019 01:48 PM

The problem with "consciousness" is, how could you tell? Do the memories go with it?

Also, if you can do this, then, theoretically, you can make a copy of it; would both of "you" think you are the "real" you? (Never mind "Schizoid Man"; how about "Second Chances"? They're both "the" William Riker.)

Horatius 05-16-2019 01:55 PM

Quote:

Originally Posted by That Don Guy (Post 21646238)
The problem with "consciousness" is, how could you tell? Do the memories go with it?

Also, if you can do this, then, theoretically, you can make a copy of it; would both of "you" think you are the "real" you? (Never mind "Schizoid Man"; how about "Second Chances"? They're both "the" William Riker.)



I figure this is in the same category of topics as self-driving cars. There are people working on it as we speak, and there are folks who will argue that it can or cannot work for some reason.

But the argument is kind of pointless. If it can't possibly work, then no one will ever do it, no matter how much effort they put into it. But, if it can work, and we eventually figure it out, then the answers to all these "Is it me, or a copy?" type questions will probably become quite obvious.

Omar Little 05-16-2019 02:12 PM

This is the foundational premise for the Bobiverse series of books by Dennis Taylor: the first in the series is We are Legion (We Are Bob).

Very good read as a hard sci fi series. Bob continues to make copies of his own consciousness as a means of creating a supply of individuals that are needed to complete various functions. Each new Bob has a separate identity but shares a history with the other Bobs from the time they were created.

eburacum45 05-16-2019 03:11 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646134)
But if minds then have the capacity to interpret things (as they seem to), they have a capacity that can't be realized via computation, and thus are, on the whole, not computational entities.

I usually agree with you on most things, but you are wrong about this. Human minds are entirely computational, even though they are not digital.

eburacum45 05-16-2019 03:16 PM

Quote:

Originally Posted by Horatius (Post 21646186)
How about people like me? I don't think the copy will be the same "me" that's sitting here typing, but I'd undergo the procedure because I know the copy will enjoy everlasting(ish) life, since I know I'd enjoy it if it were possible to live forever.

This is more-or-less exactly what I think. Even though the copy would be physically different from me, and would (no doubt) have many minor differences in data, it might be the closest I could get to immortality, unless a better alternative came along.

begbert2 05-16-2019 03:17 PM

Quote:

Originally Posted by Dallas Jones (Post 21646228)
Consciousness, personality, identity, self, call it what you will, is a chemical and slightly electrical phenomenon that does not translate into the digital world of IF>THEN. There could be, and already are, algorithms that can mimic human responses, but they are not actual thought. Not actual Self. How would you be able to translate chemical action into digital, storable personality?

Even our memories may not be what actually happened. Each time a memory is recalled, it overwrites the stored memory one more time. That favorite green truck toy you had when you were 5 years old may not have been green at all. You have remembered it as green so many times since you were 5 that for all purposes it is now green, even though it may never have been. Unless we are postulating an organic, chemical, emotional storage system, there is no way to store a human mind.

The storage and response systems are completely different.

With a sufficiently powerful computer system one could theoretically emulate reality at the molecular level, allowing a physical brain to be emulated with perfectly replicated behavior and functionality. And it probably wouldn't even take that much - much of the physical body's function is tangential or irrelevant to cognition and could be simplified out without impacting the accuracy of the emulation.

Quote:

Originally Posted by Horatius (Post 21646251)
I figure this is in the same category of topics as self-driving cars. There are people working on it as we speak, and there are folks who will argue that it can or cannot work for some reason.

But the argument is kind of pointless. If it can't possibly work, then no one will ever do it, no matter how much effort they put into it. But, if it can work, and we eventually figure it out, then the answers to all these "Is it me, or a copy?" type questions will probably become quite obvious.

I don't see how having it happen in front of you would make anything more obvious - unless the soul people are right and all clones turn up dead or something. In the materialist view the only difference between a person and a copy is that one would have continuity of existence and the other wouldn't - though the one that didn't would think that it did have continuity of existence thanks to its inherited memory and would only notice a discontinuity of location.

It seems obvious to me that continuity of existence is part of identity, so a copy of you isn't you by virtue of the fact that you've been over here the whole time and they haven't been. However in a world of star trek transporters where the original of you doesn't hang around to dispute the copy's claim to your identity there's no reason a copy can't step into your shoes and carry on where you left off.

Ashtura 05-16-2019 03:18 PM

IMO, there is no way to do this other than a complete brain transplant (which will be affected by age).

If it's done by any computerized method it will be a copy, not a transfer. Meaning you die, and there's a robot that thinks it's you, but you're still dead, lights out.

Ashtura 05-16-2019 03:25 PM

Also, I believe biological immortality (stopping the aging process through genetic engineering) will occur before "brain downloads" do.

Half Man Half Wit 05-16-2019 03:43 PM

Quote:

Originally Posted by eburacum45 (Post 21646394)
I usually agree with you on most things, but you are wrong about this. Human minds are entirely computational, even though they are not digital.

OK, thanks for telling me, I guess.

eburacum45 05-16-2019 03:48 PM

This is an example of Mijin's statement that both sides accuse each other of believing in souls. If there is anything non-computational in the human mind, what is that something? A soul? Something else? Perhaps we could call it wibble. So a human brain is a computer with wibble. How do you know that we can't make wibble and add it to the uploaded computational representation of a human mind?

begbert2 05-16-2019 03:57 PM

It eludes me why anybody would even want their minds to be "non-computational" - doesn't that just mean that it doesn't work in a rational or coherent manner? That it's totally random? My thoughts happen for reasons, thanks very much. And even if brains do include some small amount of randomity, computers can simulate randomity, so no problems there. Whatever a wibble does, however a wibble works, it works somehow, and that "somehow" is a process and that process can be imitated and simulated. Doesn't matter if the wibble is material or supernatural, that's still the case.

So yeah, brains can be emulated, given sufficient understanding of how they work and sufficient processor power. That's still a pretty weaksauce approach to immortality, though, because the person being copied isn't going to live any longer as a result. They'll still age and die, and experience aging and dying. Their copy may be off having fun in a virtual amusement park forever, but that's not going to help them any.

Half Man Half Wit 05-16-2019 03:58 PM

Quote:

Originally Posted by eburacum45 (Post 21646480)
This is an example of Mijin's statement that both sides accuse each other of believing in souls. If there is anything non-computational in the human mind, what is that something? A soul? Something else? Perhaps we could call it wibble. So a human brain is a computer with wibble. How do you know that we can't make wibble and add it to the uploaded computational representation of a human mind?

Well, I gave an argument demonstrating that computation is subjective, and hence, only fixed by interpreting a certain system as computing a certain function. If whatever does this interpreting is itself computational, then its computation needs another interpretive agency to be fixed, and so on, in an infinite regress; hence, whatever fixes computation can't itself be computational.

And there's no need for souls, or anything like that; anything non-material or non-physical. Computation is really concerned with structural properties: we can simulate something because we can instantiate the right sort of structural relationships within a computer. But relations imply something to bear them, something that actually stands in these relations; but that doesn't carry over to the simulation. After all, that's what makes simulations so useful: if they replicated every property of the thing simulated, they'd just be copies. A simulated tree and a tree aren't the same thing, and neither is a simulated mind and a mind.

bump 05-16-2019 04:00 PM

Quote:

Originally Posted by Omar Little (Post 21646286)
This is the foundational premise for the Bobiverse series of books by Dennis Taylor: the first in the series is We are Legion (We Are Bob).

Very good read as a hard sci fi series. Bob continues to make copies of his own consciousness as a means of creating a supply of individuals that are needed to complete various functions. Each new Bob has a separate identity but shares a history with the other Bobs from the time they were created.

It's also a piece of foundational technology in the Takeshi Kovacs novels by Richard K. Morgan ("Altered Carbon" is the most famous one). Basically everyone's got a 'stack' that records their conscious mind/memories in real-time. So if someone's killed, dies, etc... that data can be downloaded into another body (a 'sleeve' in book parlance). From their perspective, there's a certain level of discontinuity, in that they go from being killed to becoming aware in a different body, not even necessarily the same gender as they started with. They can also be backed up, much like computers today, in case their stack itself is destroyed somehow. In that case, their mind info can be downloaded into a new stack in a different body and they're back, minus the time between the last backup and whenever they died.

begbert2 05-16-2019 04:01 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646502)
Well, I gave an argument demonstrating that computation is subjective, and hence, only fixed by interpreting a certain system as computing a certain function. If whatever does this interpreting is itself computational, then its computation needs another interpretive agency to be fixed, and so on, in an infinite regress; hence, whatever fixes computation can't itself be computational.

And there's no need for souls, or anything like that; anything non-material or non-physical. Computation is really concerned with structural properties: we can simulate something because we can instantiate the right sort of structural relationships within a computer. But relations imply something to bear them, something that actually stands in these relations; but that doesn't carry over to the simulation. After all, that's what makes simulations so useful: if they replicated every property of the thing simulated, they'd just be copies. A simulated tree and a tree aren't the same thing, and neither is a simulated mind and a mind.

What makes simulations so useful is that they can be created for free and it doesn't matter how many times they crash/fail/explode as a result. Also being digital means you don't have a giant pile of crashed/failed/exploded things left lying around that you have to dispose of.

Inaccuracy of behavior or functionality, on the other hand, is not a valued aspect of a simulation, and it's bizarre to hear somebody say otherwise.

Half Man Half Wit 05-16-2019 04:03 PM

Quote:

Originally Posted by begbert2 (Post 21646499)
It eludes me why anybody would even want their minds to be "non-computational" - doesn't that just mean that it doesn't work in a rational or coherent manner? That it's totally random?

Huh? No, of course not. Causality is a perfectly distinct notion from computation, and what's not computable isn't therefore unreasonable. That'd be like saying that only what's written down has a logical structure, but not that which the written text describes. Indeed, computation merely replicates the logical structure of whatever it implements, so this is just kinda backwards.

Anyway, what I want or don't want really has no bearing on the issue of what I have grounds to believe is true.

eburacum45 05-16-2019 04:09 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646502)
A simulated tree and a tree aren't the same thing, and neither is a simulated mind and a mind.

No. A mind is a program running on a biological computer, so a simulated mind running on a simulated biological computer is still a mind.

Half Man Half Wit 05-16-2019 04:09 PM

Quote:

Originally Posted by begbert2 (Post 21646507)
What makes simulations so useful is that they can be created for free and it doesn't matter how many times they crash/fail/explode as a result. Also being digital means you don't have a giant pile of crashed/failed/exploded things left lying around that you have to dispose of.



Inaccuracy of behavior or functionality, on the other hand, is not a valued aspect of a simulation, and it's bizarre to hear somebody say otherwise.

Ok, so I guess you actually agree with my point, and are demonstrating the interpretation-dependent nature of symbolic reference by example. In that case, thanks!

Half Man Half Wit 05-16-2019 04:10 PM

Quote:

Originally Posted by eburacum45 (Post 21646520)
No. A mind is a program running on a biological computer, so a simulated mind running on a simulated biological computer is still a mind.

So, using your conclusion as a premise is cool again, I see. That tends to come round again and again

bump 05-16-2019 04:10 PM

Having taken midazolam, which causes anterograde amnesia (you can't form memories for a period after you take it), I can say it's like you weren't actually conscious at all. One minute you're in one place, talking about getting the drug, and <blink!> you're somewhere else. From your perspective, you skipped forward in time. From everyone else's, a period of time has passed, and you interacted with various things, etc. A roommate got it for an esophageal endoscopy once, and I dropped him off and picked him up after the procedure. From my perspective, he walked right out, complained about his throat being a little sore, and then suggested we go grab lunch. We did, and then headed home, where he took a nap. Later that afternoon, he emerged and asked me where we'd had lunch - he had no recollection of anything after they gave him the midazolam, and had deduced we'd gotten lunch because he was full, and had gone into the procedure hungry.

I suspect that from the perspective of one of the backed up and restored people, it would be similar- they'd remember everything up to the backup, and <blink> they'd be somewhere else.


Or if it was real-time, they'd recall having died, then <blink> they'd be somewhere else. In that case, it would (IMO) effectively be the same person, unlike the backup example, where there was a period of time where there were potentially diverging individuals sharing the same set of memories before a certain point.

begbert2 05-16-2019 04:13 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646510)
Huh? No, of course not. Causality is a perfectly distinct notion from computation, and what's not computable isn't therefore unreasonable. That'd be like saying that only what's written down has a logical structure, but not that which the written text describes. Indeed, computation merely replicates the logical structure of whatever it implements, so this is just kinda backwards.

I really have no idea what you mean by "computational", and it pretty much certainly has no bearing on how consciousness works, particularly in a materialist system. In a materialist system consciousness is an emergent consequence of the physical behavior of the brain. The brain is physical and follows physical rules. It can therefore be simulated, by simulating those physical rules. Doing that in an accurate simulation will necessarily give you the same emergent behaviors, because causality. Which means you'll get a mind simulated in the computer. Pretty straightforward, give or take the unbelievably massive amount of storage and processing power it will take to simulate the behavior of that much physical mass in detail. It certainly is theoretically possible, though.

begbert2 05-16-2019 04:14 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646521)
Ok, so I guess you actually agree with my point, and are demonstrating the interpretation-dependent nature of symbolic reference by example. In that case, thanks!

No. I find your point incoherent.

Voyager 05-16-2019 04:16 PM

Quote:

Originally Posted by begbert2 (Post 21646407)
With a sufficiently powerful computer system one could theoretically emulate reality at the molecular level, allowing a physical brain to be emulated with perfectly replicated behavior and functionality. And it probably wouldn't even take that much - much of the physical body's function is tangential or irrelevant to cognition and could be simplified out without impacting the accuracy of the emulation.

I agree. The confusion here is that there seems to be an assumption that the consciousness can be loaded into a general purpose computer, whereas if you could perfectly emulate the brain and all its peculiarities and inputs the consciousness would come with it more or less automatically.

The standard science-fictional treatment gets this wrong also.

Voyager 05-16-2019 04:19 PM

Quote:

Originally Posted by begbert2 (Post 21646507)
What makes simulations so useful is that they can be created for free and it doesn't matter how many times they crash/fail/explode as a result. Also being digital means you don't have a giant pile of crashed/failed/exploded things left lying around that you have to dispose of.

Inaccuracy of behavior or functionality, on the other hand, is not a valued aspect of a simulation, and it's bizarre to hear somebody say otherwise.

As someone who has written a lot of simulations, I've found the biggest benefit is that you can monitor the internals without interfering with the run. That's not something you can do in real life. Running a test on a real IC is fast, but seeing inside is damn difficult. Not true in a simulated version.
The big win for a brain simulation based on simulated neurons would be looking to see what happens in different psychological states.
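
To make that concrete, here's a minimal sketch in Python (a toy for illustration, nothing like a real IC flow): a gate-level full adder that hands back every internal wire on every evaluation, which is exactly the kind of visibility you don't get from a packaged part:

Code:

# Toy gate-level full adder; returns every internal wire, not just the outputs.
def full_adder(a, b, cin):
    w1 = a ^ b        # intermediate XOR
    s = w1 ^ cin      # sum bit
    w2 = a & b        # first carry term
    w3 = w1 & cin     # second carry term
    cout = w2 | w3    # carry out
    return {'a': a, 'b': b, 'cin': cin,
            'w1': w1, 'w2': w2, 'w3': w3,
            'sum': s, 'cout': cout}

# Exhaustive "test bench": every run exposes the full internal state for inspection.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            print(full_adder(a, b, cin))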

thorny locust 05-16-2019 04:20 PM

I very much doubt that what we think of as ourselves is only made up of our conscious minds. Seems to me that a great deal of thinking, remembering, etc. is being done by other portions of our minds entirely; and then interpreted by the conscious mind, which may (or may not) then claim the conclusion it came to was made by the conscious mind's processes only.

And the mind as a whole is also influenced by the rest of the body -- hormonal shifts, exhaustion, hunger, satiation, pain, exercise, physical joy, etc. all affect it.

So I think that what would be downloaded would only be a part of me; and would probably rapidly diverge even from that part of me as it is now, because it wouldn't have the inputs coming from the rest of me.

And the me that's the body (including the brain) would die when the body died. So no, sticking a bit of me in a computer wouldn't give me immortality. Even if you could get all of me copied somehow, that still wouldn't be immortality: this me would still die. (Would you be willing to have yourself copied, while in decent health, if someone were waiting to shoot you the minute the copy was finished?) Whether there'd be some other sense in doing one or both of those things I don't know. I also don't know how my conscious mind would take to being stuck in a box, even a moving one; but I find the idea uncomfortable enough that I'd hesitate to do that to a copy, or a partial copy.

begbert2 05-16-2019 04:20 PM

Quote:

Originally Posted by Voyager (Post 21646540)
I agree. The confusion here is that there seems to be an assumption that the consciousness can be loaded into a general purpose computer, whereas if you could perfectly emulate the brain and all its peculiarities and inputs the consciousness would come with it more or less automatically.

The standard science-fictional treatment gets this wrong also.

Well, thanks to Turing-completeness, any "general purpose" computer could be made to run a simulation program that could handle the intricacies of emulating the brain/mind and the environment it will be reacting to.

That simulation program would probably take more than one CD to install, though.

eburacum45 05-16-2019 04:20 PM

Quote:

Originally Posted by Half Man Half Wit (Post 21646529)
So, using your conclusion as a premise is cool again, I see. That tends to come round again and again

Like your infinite regress.

One theory (quite an old one now) is the idea that the regression of subjectivity is a 'strange loop'; you only need to have a self-referential loop which can examine itself.
Hofstadter came up with this idea in 2007 or thereabouts.
https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop

Perhaps it is only possible to create these self-referential loops inside a biological brain, or maybe only inside a human brain; but I do not see any reason to believe that, and I am surprised that you do.

Half Man Half Wit 05-16-2019 04:23 PM

Quote:

Originally Posted by begbert2 (Post 21646535)
I really have no idea what you mean by "computational", and it pretty much certainly has no bearing on how consciousness works, particularly in a materialist system. In a materialist system consciousness is an emergent consequence of the physical behavior of the brain. The brain is physical and follows physical rules. It can therefore be simulated, by simulating those physical rules. Doing that in an accurate simulation will necessarily give you the same emergent behaviors, because causality. Which means you'll get a mind simulated in the computer. Pretty straightforward, give or take the unbelievably massive amount of storage and processing power it will take to simulate the behavior of that much physical mass in detail. It certainly is theoretically possible, though.

You have the whole thing backwards. It is, ultimately, only the ability to interpret symbolic vessels that makes it possible to use any physical system to compute, or simulate, anything. So when you say that a system can be simulated, you're already appealing to mental capacities.

Let's leave out the middleman of computation, and talk about imagination instead. I can imagine robots, unicorns, trees, and even minds. Physical systems following physical rules---no problem there. By your argument, for a sufficiently powerful imagination, it should be possible to imagine an actual mind into existence.

So, well, imagination gives rise to minds! That's that, then. Except of course nobody's going to buy that: after all, I have just used a transparently mental capacity to explain the mental. That's of course a no-go; but that's exactly what computationalism does. That's the point of my above example that's being so studiously ignored.

Half Man Half Wit 05-16-2019 04:24 PM

Quote:

Originally Posted by eburacum45 (Post 21646550)
Like your infinite regress.

One theory (quite an old one now) is the idea that the regression of subjectivity is a 'strange loop'; you only need to have a self-referential loop which can examine itself.
Hofstadter came up with this idea in 2007 or thereabouts.
https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop

Perhaps it is only possible to create these self-referential loops inside a biological brain, or maybe only inside a human brain; but I do not see any reason to believe that, and I am surprised that you do.

Well, I've given the reason above. If you find fault with it, feel free to point it out.

