Transporters and the destruction of the self: What am I missing here?

Shijinn, I see your point, but the problem with it is that, as you yourself noted, it is a hypothetical on a hypothetical. Your situation *requires* the clone paradox to even exist, then completely changes the functionality of the machine. It is apples and oranges at this point. If there is a logical counterargument to be made, it should be possible to make it without using a hypothetical that invalidates the parameters of the original question.

A better hypothetical might be: Suppose the machine has an error that prevents the transportation step from working. The machine then re-creates the user out of the original material, at the same end. Is that person the same? If not, then what is the difference?

I’m confused here about who you are claiming can and did give consent. You.1 or You.2? To whose death is consent given? Which you are you labeling surplus?

This also sounds suspiciously like it’s ok as long as it doesn’t hurt.

Let us suppose we have our DinoTeleporter.

Acid Lamp goes to use it and there is a malfunction. Acid Lamp (AL) is unaffected back on Earth. Acid Lamp Prime (AL’) is created on Alpha Centauri.

The Dinos did not design their teleporter for human specifications, so AL’ stands up and hits his head. I think we all agree that AL neither feels nor knows about AL’ bumping his head.

Back on Earth, the Dinos are shocked and one accidentally steps on the toe of AL. I think we again agree that AL’ does not have any issue with his toes, just a throbbing headache.

The Dinos then, per procedure, pull out a Desert Eagle .50 and place the barrel between the eyes of AL.

Acid Lamp, are you okay with the Dino pulling the trigger?
And once the trigger is pulled (and the brain has time to splatter on the wall), what do you next experience?

I would guess the answers (correct me if wrong) would be “no” and “nothing”.

I hope this points out the issue of consciousness. Sure, AL’ lives on but that does little for poor AL, rendered into oblivion.

A teleporter that works by copying and destroying me is all well and good for anyone who steps out but ends my ability to continue to experience reality going forth.

Alright, let’s examine this step by step.

I walk into the teleporter machine. The friendly voice asks me for my destination and I tell it where I’d like to go. The machine then informs me that there is a small chance that my teleportation may result in an accidental copying of my consciousness. It tells me that as a consequence of this possibility I will need to terminate the surplus “copy.” The machine will handle this on its own should the unthinkable occur, but I must consent to the termination of “myself.” It then asks me to confirm that I would like to continue and go to my destination. I tell it that I understand, that I do consent, and that I would like to continue. I arrive at my destination and any ghastly impossibility is taken care of.

Because at the point of entry there is only one consciousness/physical form known as “Acid Lamp,” I, the entry user, am the only one able to consent to this process. If the impossible mistake actually happens, I have already consented that I wish for my timeline to remain undivided and continued at the destination. Theoretically, I could refuse to use the machine, or consent to the dividual being left to exist if he so chooses. Because he is me, and does not exist except by my will and an error that should be impossible, he has no rights unless I choose to grant them to him. In this case, the “me” left at the entry point is the “copy.” Any machine of this type would be engaging in a painless, instant deconstruction, so there is no ethical problem with the termination of the surplus dividual left by the accident. A machine that operated to purposefully copy and leave the entry user to a lingering painful death would simply be unethical to operate.

So the only issue to you is the manner of death, not the fact of it?

If I confirmed to myself via Dino-vision that I was fine and well at my new location, then I logically should have no objection. “I” am already there; the one that needs to be shot is basically a past “me” who shouldn’t be around.

As to what happens next, I’ve no idea at all. I suspect nothing. That doesn’t frighten me though. “I” am already where I wanted to be.

I understand the point and distinction you are making, but this example is very different from the one posited in my OP. The Dino machine works differently. In my machine, AL MUST be disassembled to be “scanned.” No destruction, no copy. So all this clone nonsense doesn’t matter. Any user of such a device would have to accept that the distinction between AL and AL’ is nothing but semantics. That is to say that while AL’ may indeed be a separate physical and conscious entity, he is equipped to be only AL(c) (continued). He was, and could never be, anyone else. So long as the consciousness AL continues somewhere, “I” am not lost to oblivion.

The question, though, for those opposed to this idea, is what special distinction would make AL’ an *invalid* version of AL?

That is fairly accurate. “I” still exist somewhere. The pattern “Acid Lamp” hasn’t been lost to oblivion in the way it would be if, say, I was run over by a bus tomorrow. In that case, there is no way for “Acid Lamp” to continue.

If I was “saved” from minute to minute on, say, a computer system that would allow me to be revived from the last save point in the event of an accident, and I was convinced of its accuracy, I would not have a philosophical objection to blowing my head off other than the possibility of painful failure and the inconvenience of having to “catch up” with the stuff I missed while I was offline. I would have effectively “time traveled” into the future a little bit, and that would be annoying.

By changing the time of the destruction, we’re showing that it is an actual separate sentient being that’s being destroyed. Whether you do it before or after materializing a duplicate is irrelevant.

So what? That sentience is identical to my own and only exists at my pleasure. By using the machine I agreed to the division and destruction of the surplus. I gave my consent. That the division results in a separate sentience matters nothing at all to me. I only agreed for one to continue.

You agreed to the destruction of an Acid Lamp. I daresay that when there are two Acid Lamps, each will have good and valid reasons why he should continue and the other Acid Lamp is surplus. This is one of the few occasions where I believe that pain, or its prospect, will elicit truth.

It came into existence at your pleasure, but there’s no logical, ethical, or legal reason why its continued existence should be at your pleasure. The fact that it is a perfect copy of you doesn’t mean that its fate should be yours to control.

But once the copy is made, why should either be bound by your prior decision? Both are now sentient beings that should have the right to determine their own fates.

And your prior agreement matters nothing at all to the resulting sentient beings. They didn’t consent and there’s no reason why they shouldn’t be allowed to prevent their own destruction.

Scumpup, I’m happy to debate whatever you like, but I don’t appreciate being called a liar. I’m doing the best I can with the absurd scenarios presented. My best guess is that I’d be fine with it since I agreed to it.

Just because he believes that you’d answer differently in the actual situation doesn’t mean he’s calling you a liar.

Your guess has no reasonable basis. There’s no reason to believe that the likelihood of your agreeing to be killed in that situation would be any different from your willingness to be killed when traveling by taxi.

I’m not calling you a liar. Appy-lolly-ologies if it seemed that I was. I am simply pointing out that in fantasy situations it is easy to claim one would choose a particular course of action, whereas IRL ugly, painful death has a way of making even very idealistic people reevaluate their priorities.

So you seem to be more concerned with the lineage of Acid Lamp than with continuing to experience life, if I understand you correctly. I’d rather stick around as long as possible than have a legacy, myself.

I have to generally disagree with you here.

Legally speaking, I’ve no idea what the ramifications would or should be. We’ve never had such a case, and I imagine it would turn any judge inside out just as it does us. However, if we assume that suicide is a human right, issues of that might crop up along with contractual obligations. Suppose that using the machine is the same as consenting to death? Then what? Once you push the button, you are killed. I cannot conceive of a machine that would work differently.

Ethically, until the moment of division, only you can determine your future. If you decide you want your timeline to be undivided, that is your right. I suppose you could, what, sue your clone…? Sheesh.

Logically, there is a wealth of problems this would cause. Am I liable for what my renegade copy does? Does he have access to my funds, home, and other goods? How will he provide for himself if not? How will he exist independently, since he’s Me but not Me, now that we’re dividuals and I’m on vacation on Rigel Six? Moreover, what if he decides to go somewhere via transport? Am I, on the Prime Timeline, now responsible for all of his clones as well? It’s ridiculous.

Fair enough, but what is the difference? That is what I was asking in my OP. I don’t see one, and the few arguments I can think of seem to advocate for an irrevocable aspect of the self that makes a significant difference. I want to know what that is. A soul? Just the fear of death? What?

So you do recognise that you die when you step into those things. That is a salient point to a lot of people. That your offspring continues your legacy is all very well and good, but you are dead.

Well, first of all, there’s no basis in contemporary law to assume that suicide is a human right.

However, even allowing for that, you’ve now conceded that what we are talking about is not a transportation device but a suicide device. That changes the entire conversation and context of this discussion.

Furthermore, if a contract is allowed to mandate someone else’s death, then it’s not a suicide. It’s a contract to murder. Under current law, that’s illegal. Do you think that the legal system should be changed to allow a contract to effectuate a non-party’s death?

Sez who? We don’t have any such right under current law. And why should anyone accept your premise that what we’re talking about is an “undivided timeline” rather than “copy and kill”?

This is pure opinion on my part, but let me nevertheless assure you that the law is going to be far more agile in accommodating these legal problems than it is in allowing a binding contract to murder. All you will need is a set of rules, after all, to delineate rights and liabilities – much less difficult problems ethically, philosophically, and legally than what you’re proposing.

Sure. But I also continue. Either way I still exist. I stop, I start. The me that continues is no different from the one that stops. His experience will tell him that he was just transported to wherever, and he IS me in all things. There is no appreciable difference to me. I also accept that it is a salient point to many folks; I’m wondering WHY, though.