Downloading Your Consciousness Just Before Death.

This very question was brought up in Star Trek: The Next Generation’s “The Schizoid Man.” But this is not a Star Trek question.

What if, just before you died, you downloaded your consciousness into a computer? Would that stored data be you? And could it literally give you immortality?

I also put this in GD because I assume it’s up for debate.

Thank you in advance for your kind replies. :)

:):):slight_smile:

Need answer fast?

This really gets down to the fundamental definition of what consciousness is, which is still a very open philosophical question. So what you get will probably be a whole lot of IMHO. Those who believe in a soul would say that that would clearly not be you. I’m a bit more of a materialist, so I would say that if the data and associated software accurately mimicked your mental processes, including internal states, then that would be you.

However, I could also imagine a sort of Chinese room type simulation of my mind which externally behaves like me, but which arrives at that behavior in a totally different way. Some may say that such a simulation is effectively identical to me to all outside observers and so therefore should be considered to be me. But I would argue that the internal thoughts that are not outwardly observable are what truly represent consciousness, and so a machine that failed to include those doesn’t actually represent consciousness. I therefore reject the Turing test as a sufficient definition of artificial intelligence.

A question you didn’t ask, but which automatically follows, is what happens if you download your consciousness long before you die, or even download multiple copies of your consciousness. This idea is explored about four seasons later in “Second Chances,” with the answer being that both are you. However, as each has different experiences, they diverge into distinct individuals, but neither is more you than the other.

The person in the computer after upload is not the same person as you right now. But by the same token, you right now are not the same person as you yesterday, or even you five seconds ago. We say that you right now and all of the yous of the past are the same person only because the you of right now has the memories of the yous of the past. And by the same token, the person in the computer will have the memories of you right now.

The uploaded consciousness would be a copy, but that wouldn’t bother it any. It’d feel like it was me and carry on accordingly - even though it would know better!

Unfortunately, the consciousness would be JUST a consciousness, not a human being with senses. So strictly speaking, it could not be conscious of anything except itself. Eternal introspection, totally cut off from the external world. Schizophrenia… insanity.

Jim B, this question is basically a specific phrasing of the general philosophical problem of Personal Identity.

This is one of the biggest, most-debated issues in philosophy. There is no clear answer at this time, although what normally happens is that half the responses in a thread like this will be “*Obviously* the mind has been downloaded, and anyone who thinks otherwise must think there is some magical soul or something” and the other half will be sure that “*Obviously* minds cannot be ‘moved’, and anyone who thinks otherwise must think there is some magical soul or something.”

Not true - plenty of people will say, “Prove there isn’t a magical soul or something.”

That is true.

It’s just interesting to me that on this topic (and similar ones, like the transporter), you end up with so many people sure there is only one common-sense, scientific interpretation and that anything else is magical thinking, yet split between two sides over which interpretation that is. It’s the blue-and-black versus white-and-gold dress all over again :slight_smile:

But yes, there are also people who still believe in the soul, even though that is a position that doesn’t hold up scientifically at all.

Since the whole notion of downloading one’s mind into a computer exists only in the imagination, can’t we just imagine the results to be whatever we wish as well? The mind does not exist in any sort of machine-readable format, so copying it to a digital medium such as a hard drive is just a philosophical exercise. If I want to imagine my soul or identity exists on a computer, then it does. It is as real as the conversion of the mind to computer data is, which is to say, not real at all.

So you hook the ‘consciousness’ up to a set of simulated inputs and outputs. I expect we will have artificial eyes, ears and haptic sensors, artificial voice generators and remotely controlled limbs, long before we ever develop any kind of uploading/downloading (if we ever do).

When and if the technology for mind uploading becomes available, I think there will still be two opinions on this matter; but what will happen is that most, if not all, of the people who get uploaded will be people who believe that they are being magically preserved in some way, whereas the ones who do not get uploaded will think the opposite. Since the uploaded ‘consciousnesses’ are effectively immortal, they will eventually outnumber the others by many orders of magnitude, and the ‘minds cannot be moved’ camp will become a small minority.

Even if they are right.

The only way I’d even consider transference to another vessel, preferably a body, would be if the procedure were continuous, in the sense that as the transfer took place I would experience existence slowly shifting from Body A, to an amalgam of the two viewpoints, to Body B, with no discontinuity of consciousness, and that the process would be reversible at any point.

If that were possible, why not stick with the halfway stage and keep the two viewpoints? Sounds like fun.

And then you have some people who think they have the answer, but don’t think that it’s at all obvious or intuitive.

BwanaBob, you already lack that continuity. Does going to sleep every night terrify you?

If you write a message on a piece of paper, then stick that paper in a copy machine and run off five copies, are the copies the same message?

Consciousness can’t be downloaded into a computer, for the simple reason that computation is an act of interpretation, which itself depends on a mind doing the interpreting. Thinking one could download consciousness is the same category error as thinking the sentence ‘it is raining’ is the same sort of thing as actual rain: the former is merely a symbolic vessel, filled with content by an intentional mind, while the latter is water falling from the sky.

It’s easy to see that a system only computes if it is properly interpreted. It’s only our use of ‘transparent’ symbols that makes us think that what a computer computes is an inherent feature of the computer, when in truth it’s no more inherent to it than it’s inherent to the word ‘rain’ to mean ‘water falling from the sky’.

Consider a computer with less obviously transparent symbols. Say you find a device constructed as follows: it has four switches and three lights. When you flip the switches, certain lights come on, and in your experimentation you find that there are definite rules governing which ones light up. If you consider the switches in groups of two, take a switch being ‘up’ to mean ‘1’ and ‘down’ to mean ‘0’, and likewise take each light to mean ‘1’ when lit and ‘0’ when dark, you can use the device to add binary numbers (each with a value of up to three).

Now, suppose somebody else examines that same system. They might well come up with an entirely different interpretation: they could, for example, consider ‘switch down’ to mean ‘1’, and ‘light out’ likewise. Then, to them, the system would compute an entirely different function of binary numbers.

Many more interpretations are possible. You could take ‘switch up’ and ‘light out’ to mean ‘1’. You could interpret the lights as bits of different significance: say, you’re used to reading Hebrew, and thus consider the rightmost light to map to 2², the middle one to 2¹, and the leftmost one to 2⁰.

And so on. Each such change of interpretation will result in the device computing a perfectly sensible binary function, and each person with a different interpretation could use it as a computer to compute that function.
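
To make the point concrete, here is a short Python sketch (my own illustration, not anything from the thread) of the switch-and-light box described above. The physical device is just a fixed mapping from four switch positions to three light states; two observers who read the same switches and lights with opposite conventions extract two different, equally sensible arithmetic functions from it.

```python
from itertools import product

def bits3(n):
    """Big-endian three-bit representation of n (0..7) as booleans."""
    return ((n >> 2) & 1 == 1, (n >> 1) & 1 == 1, n & 1 == 1)

def device(switches):
    """The physics only: four switch positions in (True = up),
    three light states out (True = lit). Wired so that, under one
    reading, it adds two 2-bit numbers."""
    s0, s1, s2, s3 = switches
    return bits3((2 * s0 + s1) + (2 * s2 + s3))

def read_A(switches):
    """Observer A: 'up' and 'lit' mean 1, leftmost light is the high bit."""
    x = 2 * switches[0] + switches[1]
    y = 2 * switches[2] + switches[3]
    l2, l1, l0 = device(switches)
    return x, y, 4 * l2 + 2 * l1 + l0

def read_B(switches):
    """Observer B: 'down' and 'dark' mean 1, same bit order."""
    x = 2 * (not switches[0]) + (not switches[1])
    y = 2 * (not switches[2]) + (not switches[3])
    l2, l1, l0 = device(switches)
    return x, y, 4 * (not l2) + 2 * (not l1) + (not l0)

for s in product([False, True], repeat=4):
    xa, ya, za = read_A(s)
    xb, yb, zb = read_B(s)
    assert za == xa + ya        # A sees a two-bit adder
    assert zb == xb + yb + 1    # B sees "add, then add one"
```

Nothing about the hardware differs between the two readings; only the interpretation changes, and with it the function the box ‘computes’.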

Thus, what computation a system performs is not inherent to that system, but is, exactly like what message a text conveys, a matter of interpretation. But if that’s so, then computation can’t be what underlies consciousness: if there’s no fact of the matter regarding what mind a given system computes unless it is interpreted as implementing the right computation, then whatever does that interpreting can’t itself be computational, as otherwise, we would have a vicious regress—needing ever higher-level interpretational agencies to fix the computation at the lower level. But if minds then have the capacity to interpret things (as they seem to), they have a capacity that can’t be realized via computation, and thus are, on the whole, not computational entities.

This is actually a plot point from The Prestige.

[spoiler] The machine transports a copy to a nearby location, in this case a copy of the magician. When the copy is transported, both the magician and the copy exist at the same time (with the same experiences, memories, and knowledge). In the first test, the magician immediately shot and killed the copy.

When he designed the illusion, the machine transported a copy while the man on stage fell through the trap door, into the tank of water, and drowned. Then the man reappeared away from the stage, to the delight of the crowd.

But the magician failed to realize that HE was the man going into the tank to be drowned. It was the copy that reappeared to the delight of the crowd, and THAT copy would be the one drowned in the following performance. [/spoiler]

The question is: are you the magician, or the copy?

If your consciousness is downloaded elsewhere, something may live believing it is you. But you will not be there.

“You’re me!” Roger said, looking at the other robot body, identical to his own, that clearly contained another download of his consciousness.
“No,” the Other replied, “I am Roger downloaded from a year later. My knowledge extends beyond yours. There’s already the stored backup, so I’m afraid that YOU’RE superfluous.”
The Other pulled out a Disruptor gun, which would erase all of Roger’s memories, and aimed it.
“Stop!” Roger said, but someone else was saying it at the same time.
Out stepped yet ANOTHER duplicate of Roger, with its own Disruptor.
“I am Roger downloaded from a year after you, and I have a more recent backup. I’m afraid you’re BOTH superfluous.”

“Wait!” shouted three voices…