Willing to be teleported as in Star Trek? I'm not.

How the hell are we supposed to cite a claim like that?

Beats me, that is why I asked for a cite! :stuck_out_tongue: If I could easily find it myself I wouldn’t ask.

It is just such an incredible claim, and so obviously incredible I can’t believe anyone would make it. But everyone rolls with it.

Plato’s conundrum. Same question. How much difference between two objects can we overlook? So-called “identical twins” aren’t. Two peas in a pod aren’t either. No two objects, man-made or occurring in nature, are “identical” until you get down to fundamental particles.

Even the Star Trek transporter probably fails to account for the exact energy state of several trillion electrons. But I’d accept that as sufficiently close to identity (given how many electrons there are in a human body!)
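(Back-of-the-envelope, using round numbers of my own: a human body holds on the order of 10^28 electrons, so “several trillion,” call it 10^12, unaccounted for comes to roughly one part in 10^16. That’s sixteen 9s of identity right there.)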

99.9%? No, I think I would not accept that.

99.99… per cent, with twenty-five to thirty 9s following? Yeah, I’ll accept that without a moment’s qualm. I don’t hold on to that proportion of my own brain cells from one end of the day to the next!
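(Quick sanity check on the brain-cell remark, with rough figures: the brain has about 10^11 neurons, so losing even one cell in a day is a change of roughly one part in 10^11, enormously bigger than one part in 10^25. Any turnover at all blows right past twenty-five 9s.)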

I said the opposite: anesthesia separates your ‘soul’ from your body, but there’s no danger of it getting lost.
So to think there’s any danger of losing your ‘soul’ *from transporting* needs a cite.

I disagree that it’s the same problem.

It’s one thing to talk about where we arbitrarily draw the line when classifying objects. It’s another to try to draw a similar line when considering if I survive the process or not.

Let’s start from this:
Say I walk into the transporter and it scans and kills me. Then, when it’s time to create the new me on the arrival pod, it corrupts my data so much that the resulting human is identical to Hannah Montana, not me as I was.
Now, we wouldn’t say that I am now Hannah Montana, nor that I “K% survived” the process (where K is however many relative atom positions we happened to share). I’m just dead, the end. Agreed?

Sure. But if challenged to tell you how many 9s in the 99.999…% range is sufficient to declare that the transport was successful, I most certainly can’t. I’ll say that twenty-five 9s is good enough, and three 9s isn’t good enough, and that the line is to be drawn somewhere in between. But exactly where, I have no way to say. I don’t know how to construct that number.

Personally, I think it is the same problem: how many 9s of similarity are required before one can say it’s “the same thing”?

Take counterfeit money. The philosophical purist would say, “It is never real money.” But the pragmatist (me) would say: if it is 99.999… (twenty-five 9s follow) per cent the same as real money, then it is real money for every possible meaningful purpose. Eliot Ness himself couldn’t tell the difference, even if he had a scanning electron microscope to play with.

I’m not a philosophical purist; I’m a scientific pragmatist. I don’t go so far as to say “A difference which makes no difference is no difference.” But I do say “A difference which is no difference is no difference.” The philosophical purist is not ready to make that concession.

In Plato’s conundrum, the only issue is how we are choosing where to draw the line. It makes no material difference where that line is drawn.

In the case of whether you survive the transport, that line has life-or-death consequences. You agree that if we make Hannah Montana instead of you, then you did not survive the transport. But at some point there must be a binary flip where we say you did survive the transport (though not in an unchanged state).
This is a (known) significant problem for the Psychological Continuity position, but one that tends to get handwaved in debates here IME.

This is a very poorly-chosen example.
Counterfeit money is defined by the process that produced it, not by its current properties. If the Secret Service kicks down my door and finds me printing money, I will be convicted. It would be utterly irrelevant whether anyone could tell my money apart from the real thing (or, if it has any bearing, it would be that my crime may be perceived as more serious the better my forgeries are).

The government could hypothetically choose to use the confiscated money as legal tender (say their printers are on the fritz), but it is that process of the government defining and issuing the money that would make it “real” money. Nothing to do with the intrinsic qualities of the notes themselves.

Well, I personally do not believe in a soul, but even IF I did, I would not believe anesthesia separates your soul from your body. I’ve never seen any religious claim that tinkering with brain chemistry can separate the soul from the body.

I don’t think a soul exists, but I still would not use a Trek style transporter.

I find this to be a curious position, Trinopus. You seem to be implying that some very small and arbitrary increment of change in brain structure will make an enormous change in the subjective identity of an individual. A change of, say, 0.00000000…000000001% of the atoms could spell the difference between you being you, or someone else. Although I still don’t believe it, I could more readily accept someone’s sense of self turning into someone else gradually in lock-step with a gradual change of brain structure.

Some of you claim that “you” are your memories, that your memories may be converted to data, and that the data may simply be uploaded into another vessel, and that vessel would be “you.”

I argue that memories are only part of “you” (subjective you), and not even the most important part. I believe your consciousness is an amalgamation of memories, sensation and thought process—from these three things self-awareness emerges and supervenes.

For the sake of simplicity, let’s again use a computer as our model, along with **Trinopus’s** memories (because he’s a good sport :)). The computer: Dell model AI-1000. (No Wi-Fi is allowed, since that would equate to human telepathy.) There are 5 input devices hard-wired to the computer. All that is Trinopus fits nicely on a floppy disk :smiley: (I need many gigabytes to contain me!).

Hardware: microprocessor (equals human thought process); input devices (equal the 5 human sensory organs). Software: Floppy Trinopus (aka Floppy-T… how’s that for a good rap name!).

Let’s now designate each of these artificial-consciousness parts as either “local” or “non-local.” By this I mean: can it exist in only one location, attached to one thing, or can it exist in multiple locations, attached to many things?

It is easy to see that the software (Floppy-T) is non-local. You can put copies of this in any AI computer and it will believe itself to be Trinopus and the universe will believe that to be correct.

It is also easy to see that the hard-wired input devices (sense organs) are local. Those particular devices will and can only input the sensations they perceive into the computer they are hard-wired to, no other. It matters not whether two computers and their input devices are exactly the same down to the last quark; without Wi-Fi, they will not share their sensations. It’s a little less apparent, but still rather obvious, that the microprocessor “thoughts” are likewise local.

So, the hardware is local and the software is non-local (i.e., sensation and thought process are singular and limited to one location; memories may be replicated ad nauseam and can exist in more than one location simultaneously).
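To make that local/non-local split concrete, here’s a toy sketch in Python (entirely my own illustration; the class and variable names are made up):

```python
import copy

class Hardware:
    """Local: a unique physical machine. You can point at it, but not copy it."""
    def __init__(self, serial):
        self.serial = serial  # stands in for the processor and hard-wired sense organs

class Memories:
    """Non-local: pure data. Any number of identical copies can exist at once."""
    def __init__(self, contents):
        self.contents = contents

dell_1 = Hardware("AI-1000 unit 1")
dell_2 = Hardware("AI-1000 unit 2")  # identical spec, still a different machine

floppy_t = Memories(["first day of school", "how to whistle"])
floppy_t_copy = copy.deepcopy(floppy_t)  # the software copies freely

print(floppy_t.contents == floppy_t_copy.contents)  # True: the "memories" match
print(floppy_t is floppy_t_copy)                    # False: two separate copies exist
print(dell_1 is dell_2)                             # False: hardware is never the same object
```

The point being: equality of the data is cheap and copyable; identity of a particular machine isn’t transferable at all.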

The Big Question: Are “you” in the hardware, or in the software?

By “you” I mean that higher-order subjective sense of self that emerges from and supervenes on the lower-order parts of consciousness (lower-order consciousness [you] begins in the 3rd trimester; higher-order self-consciousness [“you”] begins ~2 years postpartum).

Imagine now that you are Floppy-T in the Dell AI-1000 and someone takes out the disk and says, “I have to destroy either the computer or the software.” Which would you choose?

I’d opt to have my software destroyed, because I believe “I” reside in the hardware. As the computer, it would be nice to have the destroyed **Tibby** 1.0 software replaced with **Tibby** 2.0 (the version where I recall all the people I care about, but also learned to play the piano and recall scoring with many more hot chicks). But if that option wasn’t available, I’d still choose keeping my original hardware and loading it with someone else’s software (memories), maybe Hugh Hefner’s. I’d rather be a live Hugh Hefner than a dead Tibby.

How about if the floppy drive were broken, and no one else’s memories could be loaded? I’d still choose the computer over the original software. As an AI computer, I’d expect the microprocessor’s thought processes to interact with the real-time input sensations (i.e., “short-term memories”) to form new long-term memories to reference on my hard drive. Same subjective awareness of self; new identity. How would that feel? It would feel like going back to my third trimester and being born again.

In keeping with the AI computer analogy, I believe my base level sense of self would remain intact no matter what software was loaded, or if I continued with no software and processed only new data to be referenced. Going into hibernation/sleep mode would not affect my sense of self, but being re-booted would destroy me and give birth to someone else.

Do you think “you” has to be you, or could “you” be “you” apart from you? Maybe you think you could be “you” in a ewe if the ewe had the memories of “you”. What do you think? How about “you”? How about you, ewe?

Just watched this episode (I’ve been working my way through TOS, watching an episode each week while doing laundry). You forgot the most important point: Spock remarks that he expects Redjac’s consciousness to persist “for some time,” even when scattered into billions of energy fragments. Admittedly, he was talking about a seemingly immortal entity, but why would he say that unless it fit his understanding of the nature of consciousness? So we have an explicit, in-universe declaration that consciousness is not disrupted by the transporter process.

That’s a rather unusual viewpoint, Tibby. For myself, I would rather keep the software and run it on different hardware, both because I think the software is you, not the hardware, and because I think it would be a fascinating experience.

I disagree. That’s the hell of this conversation; we can’t even agree on the terms of discussion!

No! I’m not saying that at all! See what I mean? We are simply not successfully communicating here!

This is why all the ‘shoot your duplicate’ thought experiments are wrong; you can’t duplicate quantum-level information, but you can move it.
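(That’s the no-cloning theorem, for anyone who wants the formal version; this statement is my paraphrase, not anything from upthread. No physical operation $U$ can copy an arbitrary unknown quantum state, i.e., no unitary $U$ satisfies

$$U\big(\lvert\psi\rangle \otimes \lvert 0\rangle\big) = \lvert\psi\rangle \otimes \lvert\psi\rangle \quad \text{for every state } \lvert\psi\rangle.$$

Quantum teleportation gets around this by destroying the original state in the act of reconstructing it elsewhere, which is exactly the “move, don’t duplicate” point.)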

Good catch!

Glad Biffy finally ended this debate once and for all. Consciousness persists because Spock said so. And who can argue with Spock?

Who else? The replica Spock from “Spock Must Die.” He was damned good at arguing with himself!

Wow! Six pages! I feel like the only thing missing is…
“If a transporter were on a treadmill, would it still transport you?”

I wouldn’t mind being teleported to a place where there is no Star Trek.

Do we? Why?

I should add that I do not believe the transporter (as usually described) actually transports you.
But the appeal to quantum phenomena always seemed to me an attempt to find something, anything, to justify our intuitions. Why is it necessary to move the quantum state?