The Clone Paradox

If a copy of Huckleberry Finn with one typo is still Huckleberry Finn (and I would argue that it is), that does not mean that Huckleberry Finn with one typo is also The Catcher in the Rye.

I didn’t accuse you of saying anything (in that post, anyway). I’m saying that the twin claims of “he is transported (in whatever form)” and “he is not transported” cannot both be true at the same time.

This is why this discussion is frustrating for me. How debates are supposed to work is: we list out our premises, say which ones we agree on and which we don’t, point out which premises are implicit, and so on, and then proceed logically from there.
In this thread, this just doesn’t seem possible: every premise is getting disputed every time, even ones that are spelled out in the hypothetical, or should be uncontroversial, or that the person agreed to in another post.

Sure, and it’s that “up to some point” I’m interested in.
The point at which it’s no longer a bad copy of John, but simply not a copy at all, and John was never transported.

You’ve actually agreed with me about key points of my argument: that, for the “you are transported” position, this line exists somewhere; that we don’t know where it is; and that it’s a fact about the universe, not some arbitrary human-drawn line.

The claims we disagree on (whether this is analogous to a heap problem, and whether the position of the line is not merely unknown but unknowable) are not so critical anyway. The central point was that the “you are transported” position requires this extra physical fact.

Forget the transporter for a moment; John is seeing blue when an orchestra falls on him. The grand piano misses him, but the piccolo does not - it causes a penetrating head injury, damaging part of his frontal and parietal lobes. John’s personality and perceptual framework are somewhat altered by this injury.

Is John still seeing blue? Or is John partly seeing blue?

It’s the wrong question. The right question is:

To what extent is the thing that remains, seeing blue, still John?

It is already possible, in our real world that doesn’t have transporter cloning devices, for a situation to arise where we talk about losing or preserving parts of a person - for example, a TV information ad on recognising the signs of stroke ends with the phrase “The faster you act, the more of the person you save.”

IOW, the ‘heap’ problem already exists in our real world.

OK, I tried the time travel thing again and I am sorry, but I cannot really figure out what it is you’re saying or asking. You think you would ever answer ‘no’ to the question ‘did you feel alive’? I don’t get it.

Mangetout, I broadly agree with everything you just said (post #203).
But I think you are repeating back to me one of the premises of the N-errors hypothetical, and missing the point of it.

Let me try another stab at this.

John walks into the transporter pod on earth. A person walks out of the transporter pod on the moon. We can ask two questions:

Question_A: The third-person question. Is John still John as far as the world is concerned? Does he have all of John’s memories? Is little Junior happy playing catch with him?

Question_B: The first-person question. From the perspective of the guy who stepped into the pod on earth, does he have a future? Is he now seeing the moon, or seeing nothing, because he’s dead?

Now, I think some of the confusion here rests on an implicit assumption some are making that those questions are the same: if the answer to Question_A is yes, then the answer to Question_B must be yes too, and vice versa.

But no reason has been given why that should be the case, and indeed one crucial difference between the questions is that the answer to Question_A can be fuzzy, with a big gray area. There’s no doubt a huge uncanny valley in which John would seem different to others but still basically John. And it can be subjective: maybe his wife isn’t comfortable with him but his colleagues are, etc.

But the answer to whether he’s experiencing anything whatsoever, or is dead, cannot be fuzzy.
I’m saying “anything whatsoever” because for the point I’m making it doesn’t even matter if he lives on in a FUBAR state, that’s still a transport. He’s still experiencing. It’s not a gray area between dead and still_experiencing at all: it’s explicitly still_experiencing.
And the point is, if we agree that there’s a point where the answer to Question_B is “no”, then directly above that, there must be a binary flip to “yes”, because there is no possible answer to that question between yes and no.

Alternatively, if we say there’s never a time where the answer to Question_B is “no” then that has all the bizarre consequences, such as immortality for all.
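To put the flip-point argument in concrete terms, here is a minimal sketch in code. Everything in it is hypothetical: `survives` stands in for the unknown physical fact, `N_MAX` and the hidden threshold are arbitrary numbers I made up for illustration. The only load-bearing observation is that a boolean has no values between True and False, so any path from one to the other must contain an exact flip.

```python
# Hypothetical sketch of the "binary flip" in Question_B.
# Assumptions (not facts): survives(0) is True (a perfect transport
# works) and survives(N_MAX) is False (total garbage is death).

N_MAX = 1_000_000  # hypothetical upper bound on copy errors

def survives(n_errors: int) -> bool:
    """Stand-in for the unknown fact about the universe.
    The threshold below is arbitrary and purely illustrative."""
    return n_errors < 137_512

def first_flip(pred, lo, hi):
    """Smallest n in (lo, hi] where pred changes value. Such an n must
    exist whenever pred(lo) != pred(hi): a bool has no middle ground."""
    for n in range(lo + 1, hi + 1):
        if pred(n) != pred(n - 1):
            return n
    raise ValueError("pred never changed value")

assert survives(0) and not survives(N_MAX)
print(first_flip(survives, 0, N_MAX))  # 137512 - the line exists
```

Note that nothing in the sketch says where the flip is, or that we could ever find it in practice; only that, given the two end points, it has to exist.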

Some of those questions only make sense if you consider the personality/identity of a person to be something that can ‘go’ somewhere - e.g. John not experiencing anything because he has ‘departed’, and a new person who merely resembles John has ‘arrived’.

If that view is not taken as given, you absolutely can have a fuzzy answer as to whether John is dead, or still experiencing anything, because that is exactly what happens in the case of traumatic brain damage, progressive dementia, etc. Being alive, being a person, experiencing things as a person: these are not boolean conditions, even in the real world already, without transporters.

Ouch! May God preserve us all. Few things could be more horrible. Many of us would find death preferable…

The more I think about this, the less it looks like a materialist statement, and the more it appears to be an expression of the notion of self as an independent homunculus riding and controlling the body, rather than existing as an integral function of it.

I’m fairly certain you will say that’s not your position, but it really does look that way.

That humans die, and that multiple humans have entirely separate experiences, are not facts in dispute.
The questions do make sense; they are the normal kinds of questions discussed in the context of the transporter problem, and if you engaged with them, I think you’d find they raise an interesting point regarding the “you are transported” position.

Reduced function is not a superposition of alive and dead, it’s simply alive with reduced function. Every single time I’ve made this point I’ve said it’s about drawing a line between being alive in any form, and being dead.

Let’s make it simpler: you walk into the transporter on earth. The transporter vaporizes you, but then there’s an error and it creates Val Kilmer on the moon.
You’re now dead, right? Not experiencing anything, in any way, at all?

I don’t know why you’d think that; I’ve been arguing for the bodily continuity position in this thread, meaning consciousness is explicitly an integral part of a body.

Obviously so. How does this rebut what we “pragmatics” are saying in any possible way?

The difference between Val Kilmer and me is way beyond the “bright line” of unacceptable differences. It’s like a telegrapher trying to send Huckleberry Finn by Morse Code and getting Das Kapital instead. Ain’t nobody gonna claim it was a good transmission!

In itself of course it doesn’t. I’m simply getting you (both) to agree to a simple proposition so the argument can proceed.

On reflection (using my self-awareness skill :)) I see that a classical time machine will not achieve what I want it to in this experiment. Apparently, in a classical machine, you can go back and forth in time beyond your lifespan and you view everything from a 3rd-person perspective (even with regard to another “you” if you travel within your lifespan). That’s not what I want, because we’re seeking 1st-person experience only.

We want something that will allow you to fast forward into the future in a first-person perspective, make note of what you feel while there, then travel back to the present with the memory of your trip intact (that last part needs a special add-on app to achieve). IOW, we want to fast forward in our current arrow of time, experience qualia (or the lack of qualia), then fast backward with a reversed arrow of time.

Think of the time machine as a transporter in time, rather than a transporter in location, with one major difference—the time machine doesn’t break down the particles in your brain, while the location transporter does. Now we can use the time machine to measure the qualia experience of the location transported person.

Perhaps some type of sub-c/super-c time machine will work, with a recording device made of unobtainium whose recordings remain intact traveling back into the past. I’ll let you guys figure out how to build that. Or, you can just borrow the NSA’s machine.

Until we have this machine we can only make educated guesses about what this hypothetical time machine will reveal about these hypothetical situations (perfect clone and transported self).

So, given these parameters, if I time travel to the future in my non-location transported self and return, I expect to demonstrate that I experienced qualia after I reached my future destination.

If I time travel to a point beyond being location transported, I expect to return to the present with no record of qualia experience. The same applies to my location-transported self, but he can only time travel into the future and expect to record qualia; if he time traveled into the past (before being transported) he would have no record of qualia. Complicating matters is that he’ll have memories of having time traveled before being transported, so his opinion really doesn’t matter.

Given these new and improved time machine parameters, how would you answer the questions?

Carrying on, maybe using the computer analogy does simplify things:

Assume we now have computers with advanced AI. Trinopus exists in a Compaq AI-1000. It’s loaded with the Trinopus 1.0 operating system. It has various peripherals to input raw data (video, audio, touch…) into the computer, where it’s processed and stored on an external hard drive (named T-memories). Trinopus learns, grows, experiences qualia, and keeps creating memories. He has self-awareness within the Compaq.

Now, take a mirror copy of the external hard drive and plug that into a tricked-out Alienware AI-2000, pre-loaded with the Bill Gates 10.0 operating system. Does anything change with Compaq Trinopus when the Alienware computer is booted up with his memories? I answer, “no.” What do you answer?

Take it a step further, plug Trinopus’s memories into the Alienware pre-loaded with Trinopus 1.0 operating system. Now, does Compaq Trinopus feel any effect when the Alienware is booted up? Again, I say “no”. How about you?

One step further: load the same things onto another Compaq AI-1000. I still say the first Trinopus Compaq feels nothing when the second is booted up. How about you?
If you were the first Trinopus AI-1000, would you agree to be booted off for good if the second Trinopus remained booted? Let’s sweeten the deal and say there’s a Girls Gone Wild DVD in the second Compaq.
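Since we’re in computer-analogy territory anyway, here is a minimal sketch of why I keep answering “no.” All the names (Machine, boot, the memory strings) are hypothetical stand-ins for the analogy, not a claim about how real minds work: the point is just that booting a copy mutates nothing in the original, and the two diverge from the moment of copying.

```python
# Hypothetical sketch of the Compaq/Alienware analogy.
from copy import deepcopy
from dataclasses import dataclass, field

@dataclass
class Machine:
    model: str
    os: str
    memories: list = field(default_factory=list)  # the external drive
    running: bool = False

    def boot(self):
        self.running = True

compaq = Machine("Compaq AI-1000", "Trinopus 1.0",
                 memories=["first qualia", "learned to post"])
compaq.boot()

# Mirror the external hard drive (and, in the later step, the OS too)
# into a second chassis, then boot it.
alienware = Machine("Alienware AI-2000", compaq.os,
                    memories=deepcopy(compaq.memories))
alienware.boot()

# Nothing in the first machine changes when the second boots, and the
# two accumulate state independently from that moment on.
alienware.memories.append("woke up in new hardware")
assert "woke up in new hardware" not in compaq.memories
assert compaq.running  # the original is untouched
```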

My conclusion: “you are your memories” is only part of the consciousness story. To take Trinopus’s allusion to the Holy Trinity, I believe sentient (or, maybe just sapient) consciousness is a trinus of memories, feeling of qualia and self-awareness. In the computer model, I see memories residing in the external hard drive, qualia being experienced in the processor and self-awareness being formed in the operating system (I’m no computer expert, maybe the last two should be reversed).

Memories are non-local; they can exist anywhere, on any machine exactly alike. But more important to me is my feeling of qualia, and I view that as strictly a local event (unless you’re using a hardwire or wifi connection). I’m still undecided about the creation-of-self-awareness part of the triad, although I lean toward it also being only a local event (i.e. you can’t transfer a copy of the Trinopus operating system to another location without breaking continuity).

I also believe rebooting the original Trinopus AI-1000 breaks conscious continuity and forms a new self-aware consciousness (like the transported person).

So would I choose to remain in a Compaq AI-Tibby-1000, or shut down and exist in an Alienware AI-Tibby-1000 with my operating system and memories? No, not even with Girls Gone Wild in the DVD bay. I wouldn’t even choose the Alienware if the Compaq replaced my external hard drive with Bill Gates’s memories (particularly if his $ is part of the deal).

The position is that you can be transported, and remain - both things can happen in the same process, in the case of a duplicator (vs a transporter). That this is a surprising outcome is only because a duplicator/transporter would be a surprising machine.

Well, I just disagree with this. If you are a function of your brain, and a chunk of your brain is irretrievably lost, then a part of you has died.

I agree - you are dead in this scenario, just the same as you would be dead if you lost enough chunks of your brain in a transporter accident to reduce the function of your mind to zero.
What you have observed here is not that heap problems have a bright dividing line, but rather that they can have extremes (they need not, but they may).

I only perceive it because you appear to be arguing that what makes a thing what it is is more than the sum of its current state.

It’s like you’re arguing that objects have a property of ‘having been’ - regardless of, and additional to what they actually, entirely are right now, somehow, they are also indelibly (yet not in any way measurably) etched with an aura of what they have been.

Of course not; why should it? That’s never been any part of my claim here.

Of course not; this is irrelevant to my views.

Once again, of course not. This serves no argumentative purpose whatever.

I punch Will Riker in the nose; why should Tom Riker feel the pain?

No, because we aren’t “identical” any longer. You’ve made an offer to one of us that you haven’t made to the other.

Again and again and again, you keep making this fatal error: “The two are absolutely identical…except that there’s a difference between them.”

You just can’t do that.

If the two of us are absolutely identical, and you make the offer to “us” at the same time in the same way, then, yes, we (being me) will both say, “Yes, you may turn me off, for I know I will continue in the existence of the other.”

If you wait until there’s a difference between us – as in the example of the magician’s apprentice drowning in a box while his duplicate waits in the stage wings – then the premise fails: we aren’t identical any longer.

(If he and I are identical in all ways, and you say, “I’m going to drown one of you horribly, but the other will live,” we will both say “No, thanks,” because there’s a full fifty per cent chance that, at the point of divergence, the new “Me No. 1” will be the one drowning horribly.)

Then this is the point of our disagreement and the point that I find illogical.

At least we understand each other now.

But you’re still alive, and that’s all that matters to the point I’m making.
Because, in any case, I’m referring to a fully-formed, fully-functional person walking out of the transporter, and how similar to me they must be for me to live on. And whether I live on is clearly binary: either I’m seeing what the person who walks out of the transporter sees, or I’m dead.

OK

It’s not a heap problem at all because there can be no gray area. There is no point at which the statements “You did not transport, and are now dead” and “You transported, and are now seeing the moon” can be true at the same time.

No aura, but it would seem that when it comes to consciousness the past does matter.

When I asked you about the scenario of Val Kilmer walking out of the transporter, you were as sure as I am that you don’t live on as Val Kilmer at all. You are simply dead, end of story.
But Val Kilmer is some X% the same as you. Why do you not X% live on?
Why mourn Mozart, when he’s Y% seeing through my eyes right now?

Because X and Y are very, very, VERY low numbers.

Successful transportation/duplication would require transmission accuracy well above 99.999%. You don’t resemble Val Kilmer or Wolfgang Mozart more than 97% or so (i.e., the resemblance any human man has with pretty much any other human man).

We haven’t agreed on where the line is, exactly, but we have agreed there is a line, and 97% is way below it.
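For a rough feel of the gap between “one typo” and “a different book,” here is an illustrative sketch using text similarity as a stand-in for the resemblance percentages above. The sample texts and the metric (difflib’s ratio) are arbitrary choices of mine; nothing here is meant to pin down where the actual line sits, only to show that one error leaves you far above 99.9%, while a different text entirely does not.

```python
# Illustrative only: text similarity as a stand-in for "X% the same".
from difflib import SequenceMatcher

original = "It was the best of times, it was the worst of times. " * 100
one_typo = original.replace("worst", "wurst", 1)    # a single error
other_book = "Workers of the world, unite! " * 100  # a different text

def similarity(a, b):
    """Ratio in [0, 1]; 1.0 means the sequences are identical.
    autojunk is disabled so repeated characters aren't discarded."""
    return SequenceMatcher(None, a, b, autojunk=False).ratio()

print(f"one typo:   {similarity(original, one_typo):.5f}")    # ~0.9998
print(f"other book: {similarity(original, other_book):.5f}")  # far lower
```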