A test of faith in science - would you use a teleporter?

  1. How do you know that “this mind would exist in two places at once”?

How do you know that it is possible for one mind to ever exist in two places at once?

  3. What does it mean for a mind to exist in two places at once? Does it mean that it takes in two separate sets of sensory information? Does it mean that it is able to operate two separate bodies independently?

Ah, gotcha.

Well, I really don’t know, as my teleportation device has never quite worked right, but I can make some guesses.

The best analogy I can think of is a computer. Suppose I’ve had a computer for years, and it’s loaded up with all of my favorite games, documents, e-mails, website favorites, etc. Then I get an identical computer with a blank hard drive, and transfer all of the data from my old computer onto it. Now they’re separate but identical machines.

Suppose I gave you the new one. Immediately you’d probably start changing the settings, deleting some crap (to you) files, and installing some other stuff. They are no longer identical.

I would think that a copy of a brain (loaded up with the software of the mind) works in much the same way. It would start with an identical copy of the mind, but would become unique the second it starts getting some input.

So one copy of this mind is saying “I’m still in Boston” while the other says “Cool, I’m in Maui!” Different brains, different minds, but for now very very alike.
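To put the analogy in concrete terms, here’s a tiny Python sketch (the MindState class and its fields are made up purely for illustration, not any real model of a mind): the two copies are identical at the instant of duplication, but they are separate objects, and they diverge the moment they receive different input.

```python
# A toy sketch of the "copied computer" analogy. The MindState class is
# a made-up stand-in for "all the data in a brain"; nothing here claims
# to model a real mind.
import copy

class MindState:
    def __init__(self, memories):
        self.memories = list(memories)

    def perceive(self, event):
        # New sensory input changes the stored state.
        self.memories.append(event)

original = MindState(["grew up in Boston", "stepped into the teleporter"])
duplicate = copy.deepcopy(original)  # an exact copy of the data

print(original.memories == duplicate.memories)  # True: identical at the moment of copying
print(original is duplicate)                    # False: two separate "machines"

original.perceive("I'm still in Boston")
duplicate.perceive("Cool, I'm in Maui!")

print(original.memories == duplicate.memories)  # False: they diverge with the first new input
```

The only point of the sketch is that “identical data” and “the same thing” come apart as soon as the copies start getting separate input.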

Does that make sense?

It makes sense, but it doesn’t really get to the heart of the problem, which is whether you can destroy one of those sets of data without destroying an individual person.

What are you as a person/individual/persona? Are you just a collection of data (memories, etc.)? Or are you something more? It seems to me that there is something more than just the data in question, precisely because of the fact that we can conceive of making an exact copy of that data and having it exist in two places at once without giving one individual the point of view of both physical locations.

Well, now we’re getting into some deep stuff. You’re essentially asking me if I think we have souls, or some other form of consciousness that’s not just an emergent property of our brainwaves. I honestly don’t know, and I don’t think that anyone does. Personally, as an atheist, I think we don’t, but that’s just my opinion. What do you think?

I think the computer analogy is a very good one.

And I think for the sake of discussion we really have to leave spiritualism out of it, because even if we are something more than the sum of our parts we still wouldn’t be able to define it, so we have nothing to work with.

I am not asking if we have souls. (I too am an atheist.) I think the question is much more tangible and less faith-based than that, and it doesn’t require us to ask whether we have a “form of consciousness that’s not just an emergent property of our brainwaves.” The question is whether two physically separate consciousnesses can be said to constitute the same consciousness simply because they contain identical sets of data. To my mind, the question is easily analogized to identical twins, or even a photocopied document.

My thinking is that both copies are independent and would be considered two separate people. They are both capable of independent action the moment the second one is created. Even though their nurture history is identical in every facet, simply being in a different physical place means the influence of nature could make them react differently, similar to chaos theory.

If the two copies were of a singular mind and not independent, much of the ethical discussion we are having here would become moot, as far as I can see. The old copy would recognize the new copy’s existence and acquiesce to the need to be destroyed. Or at least it would be more willing, because of that continuity of consciousness.

The more I think about it, the more difficult a problem it becomes. At first it was a question just concerning whether you would take that ‘leap of faith’ (and it would be a leap of faith, because you’d have no way of knowing for sure that it worked out as planned) of jumping into a transporter, knowing you’ll be killed but will end up on the other side. The more we discuss it here, though, the more it seems like the copy is just some other guy who happens to have the same memories and life experiences as me.

And the more I think about it, the more I think it becomes close to chaos theory. I don’t know how I, personally, will react to a situation that will be presented to me 2 hours from now. And while a copy of me, made 1 hour and 59 minutes from now, might react similarly or identically, there’s no guarantee of that.

The twin analogy is an interesting one. Here you have two separate people, who were once identical, but became distinct people early on in the womb. While connected in many ways, they never actually shared a mind. And we’ve seen that identical twins can behave and think very much the same way later in life.

Yes, though I’ve known some who are very different. I’ve known a couple that, despite being very close, live very different lives. They don’t even look alike anymore. They’ve spent over 50 years not being the same person.

But if I step into the teleport machine, I’m creating an identical twin who has shared my exact thoughts and my exact experiences for 48 years. We’ll start on divergent paths immediately, but I don’t think that in 20 years we’ll be vastly different people. There’s too much shared history on the most personal level possible.

That’s not really what I was getting at at all. (I slept very poorly the last couple of nights, so I am not at my most articulate right now, but the libation I’m currently enjoying is helping some.)

I think that consciousness is an emergent property of the processing power of the brain. The data stored by the brain doesn’t have anything to do with it necessarily (except that it provides something for the processing power to chew on). Maybe imagine a big machine like in Contact: when it’s still, it’s just several hunks of metal. But when it’s moving, it turns into something that’s far beyond just several hunks of metal.

One implication of this is that if we eventually create a computer with the same processing power as the human brain, then that computer may develop a consciousness that is similar to human consciousness. Or maybe not. Maybe the emergent property of consciousness can only arise out of a complex set of biological processors.

Also, I’m sure someone somewhere has written about this; I make no claim to inventing this idea.

Identical twins each have their own consciousness (because they each have their own brains from which their consciousness emerges). So the transporter thing is like creating an identical twin, but the two people created would each have different consciousnesses (like identical twins).

Yep, I agree with this.

But it matters a great deal to me which one is killed. The copy is a new person with a separate consciousness, so if the old person is killed, I (i.e., my consciousness) am killed.

To put a legal/ethical twist on it: somebody commits murder and jumps in the teleporter. Can you now hold “him” criminally culpable for the murder?

Under my conception of all this, the answer is no: the person who emerges on the other end has a separate consciousness and is therefore a different person than the murderer.

One addendum to what I’ve said above:

The difference between a “human” consciousness and a “cat/dog/other animal” consciousness is only one of degree and not of kind. In other words, I’m not positing some human specialness vis-a-vis the emergent property of consciousness from processing power–animals have a consciousness as well. I’m not sure they are completely capable of knowing that “I am me,” and I think the differences between human consciousness and animal consciousness make it OK to eat them (but not OK to unnecessarily harm them).

That’s my instinct as well, but then I’m thinking you may end up having to hold him criminally responsible to stop the epidemic of murders by people who think they’ve found the “jump in the transporter” loophole.

But under my way of thinking, jumping in the transporter is committing suicide, so I don’t think the murderer is really gaming the system here.

Is the question about science or ethics?

Science: Would I trust a teleporter?
By the time I could afford it, yes. Only the rich can afford bleeding edge technology, and I am not rich. By the time I could afford it, the technology would be proven. (And my company would have audited the hardware and software companies as shipping vendors.)

Ethics: Me 1.0 dies after Me 1.1 materializes?
Theoretically, hell no. Scary as the thought is, better two of me in this world than that anyone (even me) dies for my convenience. No. Wrong. Re-write.
If those are my indescribable little energy fields being recreated using the ‘potential energy’ in my body, and self-awareness does not exist during the change, then I will try it.

Pragmatism: Why teleport people?
If technology has advanced that far, tele-conferencing technology will make physical presence obsolete. I can do 50% of my job off-site now. Actually, I could do all of my job off-site if people would answer their emails.

Teleporting will be used for materials, not people, until the military proves the technology.

I’d agree that it’s suicide and wouldn’t do it myself for any amount of money, but my concern is that if you didn’t hold the teleportee/clone criminally responsible in place of the deceased teleporter, word would get out among criminals who didn’t fully grasp the suicide-y aspects of teleportation that a trip through the teleporter was a get-out-of-jail-free card (I have seen prison writ writers express many ideas at least this dumb). I think we would eventually have to hold the teleporter clone criminally responsible to dissuade crime, and to stop a criminal class of legally untouchable teleporter clones from killing and robbing banks with impunity.

The really big question would be whether we also hold them responsible for crimes if they simultaneously teleport themselves and wipe their memory of the crime. I would be for this too, until I found out I too was a teleporter clone who committed a crime I didn’t remember.

Well, you could always use the transporter to switch places :wink:

This whole conversation reminds me of the pilot ep of Star Trek: Enterprise. This brand new starship is the first Starfleet starship to have a transporter pad. Nobody trusts the thing. It’s black magic, it’s witchcraft, the demon thing tears apart your molecules and fires you across space as a beam of energy. It only does the last part if it WORKS properly. They use it to bring supplies and spare parts up to the ship, and rely upon the shuttlepods almost exclusively for human and canine transport (did we mention that the Captain won’t even let his dog on the thing?).

Anyhow, plot ensues, and long story short, the Captain ends up trapped aboard the bad guy ship as it’s about to asplode or fall apart or something. The Enterprise beams him out in the ta-dah nick of time, just as we’d seen them do many times on the previous shows.

The Captain has this look of “What the HELL did you do THAT for?!” on his face, even knowing the alternative was dying on the enemy ship. :smiley: It does help that nobody TOLD him they were gonna throw him through the molecular-shredder-transmogrotron before they went and did it.

It’s also worth noting that in Star Trek: The Motion Picture, we get a first-hand look at what a horrifying, lifetime-nightmare-inducing experience it would be just to watch someone not make it through the transporter all the way. It didn’t look very comfortable for the guy.

That said, in the Real World of just far enough into the future to have transporters? I see it being used for transporting materials and supplies at first, then the military would probably go ahead and use it to send in troops when they needed to move quickly. Existential crises aside, you gotta admit it’d end a war pretty quick if a company of US Marines popped into existence in the middle of the bad guy HQ.

Come to think of it, why even send the Marines? Just have a bomb pop into existence there instead. Maybe give it a verbal countdown for full dramatic effect.

According to current law, probably not.

According to common sense, definitely. At the very least, you’ve got a psycho that you know with absolute certainty would have committed the murder if he was capable. There is just as much need to render him incapable of offending as there is for the original.

I agree with every part of your post except this. Mostly because I’m not sure I really understand it. Trying to parse it, I guess I might agree with it. But mostly now my own brain hurts.

This is one heady thread!

Absolutely, and I don’t just mean that from a perspective of crime prevention.

From my point of view, the copy is, for all intents and purposes, the original person. The copy has all of the intent and malice of the original.

I would not. I am not sure what makes me me. And to that extent I am not comfortable saying that the person on the other end is me.