John Locke, personal identity and the problem of duplication

I think there are lots of ways to get around the problem of duplication for Locke’s definition of self, and “I don’t care about philosophy” or “doesn’t seem like a problem the way I define it” are two of them. They do seem a bit out of place in a thread called John Locke, personal identity and the problem of duplication, though.

I am not me from 1 year ago, or even 1 second ago, because I constantly gain more experience as time goes on. If I were to be duplicated right now, neither of my future selves would be the same as I was in the past, but each would still have a continuity of consciousness that extends into the past. There is an exclusivity to my past that disappears when I am duplicated, but that’s all that changes, and it doesn’t affect either of our identities. We know who we are now, and who we were in the past. Of course people don’t usually share a common past like that, but people usually aren’t duplicated either. Still, there is no confusion over identity.

But would you have an infinite number of different experiences? Alternatively, if time stopped and never restarted, would that make you live forever? I’d say no, and for the same reason, I’d say that repeating the same life over and over wouldn’t count either.

If the universe is infinite in scope, then there are also an infinite number of copies of you out there right now (as much as “now” means anything when you are dealing with things that will never have intersecting light cones).

But I’d say that it’s more “causality of self”, in that the state of your consciousness now affects your consciousness in the next moment.

Does this change if AI achieves self awareness?

One of the problems with this question is the difficult-to-implausible notion of copying a human being, memories and all, and producing an identical duplicate. This is hard to imagine, but even harder (probably impossible) to do.

But with AI, it’s fairly simple to do. If I have trained a new AI system on my GPU cluster, then shut it down and make a thousand copies to distribute to others, are they the same? What if we network them together so they share the same experiences?

“Self” is what we call what we are experiencing the world with. If there is an identical copy to me, I wouldn’t call it “self” because it is having different experiences that I am not experiencing. If we had a telepathic link so that we were both experiencing the same thing, then we’d both be “self”.

The problem I see is that it comes down to poor definitions and false intuitions. We call ourselves “self”, so anything that is not ourself is not “self”, even if it is identical and having the same thoughts about me.

You are not qualitatively the same person, no.
However, the presumption most of us have, or at least live our lives as if we have*, is that you are numerically the same person.
And that’s where the difficulty comes in. In a hypothetical future where consciousness could be transferred, or duplicated, or split, questions of numerical identity become very hard.
Our current understanding of consciousness is still far too rudimentary to have a clear answer to such questions.

* In the same way that we live our lives as if the world around us is real. It simplifies things, plus of course even if we knew that the world was a simulation or whatever, that information in itself would be no guide at all on how to behave. So you may as well satisfy your hunger, for example.

I don’t know. Like I say, I don’t think we have the tools to start answering such questions; all we can do right now is speculate and maybe rule out inconsistent or incoherent models.

Of course, if strong AI is possible it raises many more questions than just personal identity.

If I can make consciousness in a computer can I perform the world’s worst ever war crime by making a trillion instances of a program capable of feeling a billion times more physical pain than a human? What does “times” physical pain even mean?
Sorry to give such a dark example, but I think it’s visceral examples like this that bring home that our “common sense” understanding of consciousness cannot yet avoid crazy implications. Or at least, it’s far too crude to rule them out.

All Locke meant is that what makes you ‘you’ is not your physical body, but the chain of memories and experiences in your conscious mind.

Basically, if you clone a body, you get two identical bodies, but two individual people, because their experiences will be different. But if you make a perfect copy of a mind with all its memories and learning, it would be the same person even if put in a different body. In short, it’s the mind, not the body, that makes you an individual.

Now, when talking about old philosophers you have to remember that they were dealing with more primitive science and therefore could be wrong about things. For example, Locke didn’t have an appreciation for how the mind and body work together, how the body affects the mind and vice versa. He didn’t have an understanding of how the brain learns and the physical nature of thought. Old philosophers must be read in the context of their time.

That said, I’m still not seeing a problem with the debate over self. If your mind gets duplicated, two people will both believe they are the same one. So what? As soon as the copy happens they begin to diverge, so they are now two different individuals with the same memories from before the split. Why is this hard to accept?

The thing you are possibly not getting is that it is not about “belief”, it is a question of being the (numerically) same consciousness.

Let’s say you make a perfect duplicate of me, on Mars. That duplicate might be identical to me, in the first nanosecond, but it’s not like the Mijin that lives in Bristol, UK, gets to see the Martian surface for a nanosecond. They are two entirely separate entities, as separate as me and King Henry VIII, that happen to have the same memories and personality.

Imagine that far beyond the observable universe there is an entity that, just by random chance, happens to have an identical brain state to yours. (Let’s just say it’s a Boltzmann brain, if you’re wondering how it could have identical memories in a different part of the universe).
So, was your consciousness just “split”? If you were to die now, would you say you live on, as the other Sam_Stone, far outside of the light cone of the first?

Imagine that Boltzmann brain is identical to yours now, apart from one bit of data being flipped. Is that still a “split” of your consciousness? What decides?

Or committing genocide by creating a trillion instances in a paradise… and then shutting it off.

I mean, the answer is that you can, assuming that we have that capability. Now the question is why would you?

If we live in a simulation, are our creators worried about the moral implications of all the suffering that exists?

I’ll agree that those are disturbing implications, but I don’t think that they are all that crazy.

Someone who believes they are me lives on. I don’t have any problem with that.

Also, the fact that my experience is different from theirs is perfectly acceptable.

It seems like the argument is that, if I can’t experience both “selves” simultaneously, then they can’t both be a self.

I only see a problem if self awareness requires some metaphysical object, like a soul, that can’t be copied. I don’t think that’s the case, though.

There is no decision to be made. Both me sitting here and the Boltzmann brain duplicate a googol-plus years in the future will be self aware and think of themselves as being “self”.

Once again, the question is not about “belief”. The fact that two entities initially share a belief is pretty much inherent to the premise and is completely trivial.

The question concerns the numerically same instance of consciousness.

I don’t personally believe it is the same instance in that sense…my consciousness does not “split”. There are two entities: I am me, and he is him. We happen to have the same memories. The same likes and dislikes. Maybe the same name. But that doesn’t make any magical connection between us, even before any “divergence”.

The copy has identical memories, but those memories are, in fact, mirages. Hallucinations, if you will. The copy truly believes in a continuity of consciousness, but that continuity is an illusion. They are an entity that has existed only for the few moments since duplication occurred, with a false memory of experiencing all kinds of things that he never actually experienced.

He’s qualitatively similar to an insane person who remembers hundreds of prior lives. They are elements of his mental continuity, but they just happen to be false.

The fact that the original and the copy are identical at the moment of duplication doesn’t change the fact that one of them has a bank of memories that actually occurred, and one a bank of delusions. I am likely missing something, but I don’t see how the duplication hypothetical creates a conflict for Locke. One person “owns” a flow of memories that actually occurred, and one “owns” an illusion that came into being when the duplicate was “born.” Locke wouldn’t argue that a delusional schizophrenic lacks the personal identity he defined, right?

The memories of the original are also delusions, partly because they are the memories of a fallible human being, and partly because they are ‘only’ memories.
Memories are data recorded partly in the brain and central nervous system, and partly in the body (‘muscle memory’ is a real, and important, aspect of memory). I assume that a truly accurate method of copying would result in a person with exact copies of all the memories in the brain tissue, and also in the body. A person with the exact same body and brain as the original would have the same memories.

I have great trouble imagining the sort of physical process that could create such a copy. The process of ‘uploading’ by electronic means would not be sufficient, I think. Even a very accurate ‘upload’ would only ever be a copy, especially if you consider the ‘no-cloning’ theorem that prevents exact quantum copying.

How about the Many-worlds interpretation? If you observe an event that splits you into two separate observers, you would be self-identical up to that point, then exist as two different instances. Locke doesn’t seem to consider that possibility.

He argued exactly that, and pointed to the lack of punishment for crimes while a person is insane as proof. When you are insane you aren’t the same person as when you are sane, because you do not have a continuity of consciousness.

Why do we care about that? It seems like you are trying to disconnect ‘consciousness’ from the physical person, such that there’s a conflict if you duplicate the person but only have one consciousness or something. Or, that people have a ‘soul’, which cannot be duplicated, and therefore duplicating a person creates a conflict in that only one of them can have a soul.

But if like me you have a mechanistic view of the brain and of consciousness, none of this seems to be particularly difficult. If you duplicate someone, you also duplicate the physical brain, which encodes everything having to do with the mind, including consciousness. So now you have two people with shared memories and two separate consciousnesses, who immediately begin to diverge and become individuals.

I suspect Locke’s problem, to the extent he had one, had to do with the inability to accept that consciousness is nothing more than a physical manifestation in the brain. Even deists of the time thought that there was something supernatural about consciousness, didn’t they?

Isn’t that the resolution of this “conflict,” then? Wouldn’t Locke categorize the copy as a blameless “lunatic,” having a baseless belief in a non-existent personal history?

I’m mostly just trying to explain the problem.
Several people are saying things like “once the two consciousnesses diverge, then they are separate” as if that solves the problem of personal identity, when in fact that’s basically the premise.

Maybe it is simpler to come at all this from the Star Trek transporter problem description (does a transporter move consciousness, or does it kill you and make an entirely separate copy elsewhere?). Because, in terms of Locke’s description, many here are mistakenly seeing this as purely a question of semantics.

The fascinating thing with this problem is that most people subscribe to one of two camps and both accuse the other of believing in souls.

If you believe that the transporter moves consciousnesses, you can rhetorically ask (as you essentially have done, here) What does the being at the transporter destination lack, in order to be you? Is it a soul?

If you believe that the transporter just makes a separate copy, you can rhetorically ask With most objects we do not consider an identical copy to literally *be* the original. What’s the difference with consciousness…what physical mechanism links the two entities? Is it a soul?

I have a master’s degree in neuroscience and work in neuropathology. I very much have a mechanistic view of the brain and consciousness, but I accept that our understanding of consciousness is insufficient at this time to answer this philosophical question.

I agree in a certain sense, but with different reasoning. We’re all constantly changing. I’m not the same person I was this morning, two weeks ago, or thirty years ago. It’s highly unlikely any of the same atoms in my brain right now were there thirty years ago. Even if they are, they’re probably doing something different (maybe a hydrogen atom that was part of a water molecule in my brain back then somehow worked its way back and is now part of an arachidonic acid molecule in the cell membrane of a neuron or some other random molecule). A copy is different in that same sense. None of the exact same neurons in my brain were there in 1980 when I was three, despite my still remembering going through Hurricane Allen. The continuity is an illusion even without getting into duplication.

Actually, Locke famously held it possible that mere matter could think, without the need for any immaterial thinking substance, although he considered the matter undecidable:

We have the ideas of matter and thinking, but possibly shall never be able to know whether any mere material being thinks or no ; it being impossible for us, by the contemplation of our own ideas, without revelation, to discover whether Omnipotency has not given to some systems of matter, fitly disposed, a power to perceive and think, or else joined and fixed to matter, so disposed, a thinking immaterial substance : it being, in respect of our notions, not much more remote from our comprehension to conceive that God can, if he pleases, superadd to matter a faculty of thinking, than that he should superadd to it another substance with a faculty of thinking ; since we know not wherein thinking consists, nor to what sort of substances the Almighty has been pleased to give that power […]

Indeed, as noted above, the question of personal identity or continuity is much simpler for the dualist, who can just identify it with the substantial continuity of the res cogitans. Locke’s notion can be considered an alternative that doesn’t rely on adding an immaterial substance to matter; the trouble of duplication isn’t that we then don’t know where that immaterial substance ends up, or anything of the sort, but simply that we can derive the contradictory proposition that someone and their double both are and aren’t the same person from it—they aren’t the same, because they aren’t psychologically continuous (my experiences are not my double’s), and they are the same, because both are psychologically continuous with the same pre-doubling person (and A = B and C = B implies A = C).
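To spell out that last step (a rough formal sketch of my own, not something Locke wrote: read B as the pre-doubling person, A as the original afterwards, C as the double, and “=” as Lockean identity via psychological continuity):

\begin{align*}
  &(1)\quad A = B    && \text{$A$ is psychologically continuous with $B$}\\
  &(2)\quad C = B    && \text{$C$ is psychologically continuous with $B$}\\
  &(3)\quad A = C    && \text{from (1) and (2), by symmetry and transitivity of $=$}\\
  &(4)\quad A \neq C && \text{$A$'s experiences after the doubling are not $C$'s}
\end{align*}

Lines (3) and (4) together are the contradiction the criterion lets us derive; the different camps above amount to rejecting different lines.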

But that’s just par for the course in these discussions, because what people really want isn’t to try and consider difficult problems, but rather, just not to be bothered by them; so labeling those who do find an issue problematic or just hold to a different point of view as ‘dualists’, with that being a position considered to be doomed from the start, neatly removes all need for examining one’s own preconceptions. So if you think anything about consciousness is problematic at all, that’s obviously because you’re secretly a feckless dualist who believes in immaterial souls, fairies, and the Easter bunny, because you’d rather hew to romantic delusions than accept the reality of whatever flavor of materialism everybody else uncritically believes in.

This still doesn’t address the problem of Many-worlds identities. In that concept, entities split and are multiplied constantly.

Perhaps Locke’s argument is evidence against the idea of Many-worlds, or Many-worlds is an argument against the continuity of personal identity, but whatever, they do not seem to be compatible.

Or maybe the truth is that the original and their double really are and aren’t the same person, and we just can’t imagine it with our limited conception of identity. It’s waves and particles all over again.

Agreed, and welcome back :slight_smile:

I would say another aspect is also some people don’t quite understand what the problem is, so make the assumption that it must be the other side throwing in unnecessary gubbins like souls.

(…a lesson for us all, perhaps. There are certainly philosophical problems which I think have been needlessly overcomplicated, so I’ll try to be self-aware enough to consider whether I am making the same error…)

It seems to me that an individual has memories of a sequence of events, A+B+C…X+Y+Z, and that is their identity, according to Locke;

but if the copy has memories that diverge after event Y, so they remember A+B+C…X+Y+Ꙃ instead, they are not the same individual. They are almost the same, but almost is not the same as identical. So there is no contradiction.