Ending death for most people in the United States is feasible. What's your excuse?

It does not stop death. If you take a picture of a person just before they die, you haven’t stopped their death.

@SamuelA : You’re still assuming that copying a brain is the same as transferring my consciousness; that we’re “stopping death”.
For the purpose of this thread it suffices to say that that’s actually a very contentious idea, and when it comes to the transporter problem, very compelling arguments can be given that the copy has no association with you (*and* very compelling arguments for the opposite view).

I think it’s OK to say “I think the copy really is the transfer of your consciousness” as long as you’re conceding that it’s your opinion and not a settled fact. Otherwise we’re going to end up revisiting very familiar territory here.

As I said earlier, I am in the camp that thinks uploads are possible - but I do agree that the original could (and probably will) die at some point, so something is lost in the process. If and when this technology becomes available (probably hundreds of years from now), society may split into two philosophical camps - the ones who think that uploading allows the survival of consciousness, and those who don’t.

Note that (even if the ones who think that consciousness does not survive are right) the ones who copy themselves may soon outnumber the ones who don’t. It’s a process of un-natural selection.

I’m saying that the argument is irrelevant because the same technology that would allow you to make copies would let you establish network links, such that you share thoughts and memories with any such copies. Since they are you and you are them (after some brief period of resync delay), if one of you dies, the total “you” has only lost a few hours or days or whatever of memories from one of your nodes. You might experience some annoyance at this but it would be a minor loss.
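Here’s a toy sketch of what that sync might look like, purely illustrative (the MindNode and Experience classes are invented for this post): each body keeps a log of timestamped experiences and periodically merges logs with its peers, so destroying one body only destroys whatever it logged since its last sync.

```python
# Illustrative only -- MindNode/Experience are invented for this sketch.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Experience:
    timestamp: float        # when it happened
    description: str        # what was experienced

@dataclass
class MindNode:
    name: str
    memories: set[Experience] = field(default_factory=set)

    def live(self, timestamp: float, description: str) -> None:
        self.memories.add(Experience(timestamp, description))

    def sync(self, other: "MindNode") -> None:
        """Merge memory logs so both nodes converge on the same state."""
        merged = self.memories | other.memories
        self.memories = merged
        other.memories = set(merged)    # separate copy, no aliasing

home = MindNode("home body")
work = MindNode("work body")
home.live(1.0, "breakfast")
work.live(2.0, "meeting")
home.sync(work)                  # both now hold breakfast + meeting
work.live(3.0, "coffee run")     # logged after the last sync...
del work                         # ...so this is all the total "you" loses
print(len(home.memories))        # 2
```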

Hundreds of sci-fi stories have been written about this idea, but they are just that - stories of evil doubles, etc. It wouldn’t really be an issue.

I guess what I’m saying is that I do not concede that it’s a copy at all if you have real-time sharing of experience and personality updates and knowledge and all the rest with your “copy”.

It’s an obvious additional step. Yeah, it’s a loss of being human, but as you can imagine, the beings doing this would almost immediately have such an overwhelming advantage over the people who don’t that they would soon become the vast majority of all sentient beings alive. There might be a natural group size: more members means less flexible, less creative “groupthink”, while fewer members means less collective knowledge and self-sufficiency.

Nothing wrong with engaging in this sort of imaginative exercise, but the more you talk about it, the more obviously absurd it becomes to claim, as your OP does, that this issue is somehow more relevant and important to present-day existence than, e.g., “Trump or taxes or gun control”. It seems overwhelmingly likely that the life of even the youngest grandchild of anybody on these boards at present is going to be far more seriously impacted by the issues of “Trump or taxes or gun control” than by cybernetic “consciousness copying”.

If you want to stop death, forget about the second you, just store your mind state every night and get yourself loaded into a new body when you die. There are a lot more sf stories using this premise than ones with multiple entities.

I think this makes the situation more complex, but don’t agree it necessarily makes it irrelevant or not an issue. I still think the best approach here is to just say it’s part of the premise of the thread that it’s a successful transfer of your consciousness. Your main point was not about debating that.

I wasn’t familiar with the term. Looked it up on Wikipedia then found the short story online and just now read it.
Thanks for the nightmares…

Seems like a pretty unfair characterization of the posts in this thread referencing old people. I read those same posts and nodded along because it seemed pretty clear they meant that young people replacing old people was a good thing because we were also replacing their *attitudes* and *worldview*.

Would the world really be a better place if the same legislators who drafted Jim Crow laws and fought against giving women the vote were still alive and passing legislation now?

Civil War plantation owners and slave-trading captains? European colonialists delighted to meet primitive savages and enslave them to mine gold because that’s the natural order?

We generally tend to think of ourselves as pretty progressive and reasonable, but take our leaders of today and throw them into the context of 2050 or 2100 or 2500 and they’ll seem shittier and shittier in comparison to people born in those periods.

I’m all for living forever; I’ll be pleased as punch if mind downloading happens and everyone lives in blissful VR for as long as the cloud stays up and they’re lucky enough to avoid power outages. But I’ll be equally horrified if all those dead people are still walking around in cloned or robot bodies carrying on like they never died. We *need* people taken out of the mix after a while to keep society progressing, bad ideas removed from the norm, and room made for new ideas to take over.

Resistance is futile.

Apparently they’re making it into a movie. Yes I know, a Stephen King short story being made into a film, whatever next?

The link is quite old though, so maybe not.

You’ll have to wait longer than you think.

Yes, and I’ve written a few myself. But there is a data-related problem associated with this strategy that we don’t really know how to address. If you take a new copy of the mind state every night, and store it until the next night, you are basically storing an inactive copy of the original, then rewriting over the top of it.

We don’t know how memories are stored, and it is possible that the new version is significantly different to the old one. By writing over the top of the old copy, you are basically destroying a fully coherent, viable (but inactive) human being. Do you destroy the old version, or should you keep it just in case? Maybe have numerous copies of various dates stored in different databanks in case of accident? You could, if you wished, reactivate any, or all, of these alternate versions.
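The difference between the two nightly policies is easy to make concrete. A minimal sketch, with every name in it invented for the example: overwriting keeps a single slot and destroys yesterday’s (inactive but viable) version, while versioned storage keeps every dated copy, any of which could in principle be reactivated.

```python
# Illustrative only -- all names here are made up for the example.
from datetime import date

def backup_overwrite(store: dict, snapshot: bytes) -> None:
    """Policy 1: rewrite over the top of the old copy each night."""
    store["latest"] = snapshot          # yesterday's version is destroyed

def backup_versioned(store: dict, snapshot: bytes, day: date) -> None:
    """Policy 2: keep every dated copy, possibly in different databanks."""
    store[day.isoformat()] = snapshot   # nothing is ever overwritten

single: dict[str, bytes] = {}
backup_overwrite(single, b"mind state, day 1")
backup_overwrite(single, b"mind state, day 2")   # day 1 is gone
print(sorted(single))                            # ['latest']

vault: dict[str, bytes] = {}
backup_versioned(vault, b"mind state, day 1", date(2018, 1, 1))
backup_versioned(vault, b"mind state, day 2", date(2018, 1, 2))
print(sorted(vault))                             # ['2018-01-01', '2018-01-02']
```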

I think the worst problem with making redundant copies is that the total level of neurodiversity goes down. Given enough reactivated copies you could populate a planet, or a universe, with copies of yourself. This would be a Bad Thing™.

There is an assumption here that I don’t agree with, that memories are part of consciousness. They are certainly used by consciousness, but I believe memories are just simple data used by the *process* of consciousness. The current state of my consciousness has been affected by my memories, but the process itself does not require them. My consciousness can be copied without my memories. Without any memories that consciousness may no longer work, but with some of my memories, or copies of others’ memories, it could continue to operate, and continue to develop. New experiences will change my consciousness; it’s a dynamic process.

King is the master…

Whenever someone says that it would be a good thing if we were all of the same mind, you know that what they are really thinking is that it would be nice if everyone was thinking like them.

You’re making a big point about how important memories are. Keep your memories, and “you” continue. But I forget stuff all the time. Does the old me die when I forget something?

Your memories are important, sure. But you don’t die when you forget things, you just forget things. It’s completely impossible to remember everything that happens to you, and even if you could store it all, there’s no way you could actually recall it all, since playing back the complete memory would take as long as it did to experience it in the first place. You don’t spend your whole life watching tapes of yourself in the past. Your memories are abbreviated recreations.

So what exactly is the point of synchronizing memories with duplicates of yourself? You don’t suffer because you can’t remember what you had for breakfast 5 years ago on Tuesday, why should you suffer because you can’t remember what your duplicate did yesterday?

The thing is, human beings fear death because we evolved that way. If we create a strong AI - which is what an “uploaded person” would really be - then why should we create a being that fears death, or thinks duplicating itself is creepy, or thinks remembering things is the key to immortality?

An AI that can be copied, edited, duplicated, deleted, or stored like our familiar programs of today is something that isn’t much like a human or animal brain. It’s not going to have the same instincts or compulsions, because it didn’t evolve them. It’s easy to imagine a human being who’s scared of death, so he makes some sort of digital copy of himself that will live on afterwards. Why should the digital copy fear death? If it fears death because it’s a copy of a human being who feared death, wouldn’t the correct treatment be for the emulated AI person to just edit themselves so that they don’t fear death rather than going through some elaborate gyrations to preserve themselves forever?

Yes, eventually, if this sort of thing really worked, there would be some sort of natural selection, such that AIs who cared if they survived and copied themselves would persist and spread, and AIs who didn’t would be marginalized. But that would only happen in an environment where natural selection is possible. That is, lots of AI instances are created, those instances vary, some variations are more likely to persist and duplicate themselves, and the variation is heritable. Except if we can just edit the AIs directly (like when they “sync” themselves in your example), then the cycle of natural selection is short-circuited.
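That short-circuiting is easy to see in a toy simulation (every number and name below is invented): selection pushes a “drive to persist and copy yourself” upward only when there is variation and heritable copying; if each generation is simply overwritten by a direct edit/sync, nothing evolves.

```python
# Toy model, all parameters invented. Each float is one AI's drive
# to persist and copy itself (0..1).
import random

random.seed(0)

def next_generation(pop: list[float], direct_edit: bool) -> list[float]:
    if direct_edit:
        # Everyone is edited/synced to the same state: no variation,
        # so there is nothing for selection to act on.
        return [pop[0]] * len(pop)
    # Otherwise: survival-driven AIs are more likely to leave copies,
    # and copies inherit the drive with a little mutation.
    survivors = [d for d in pop if random.random() < d]
    children = [min(1.0, max(0.0, d + random.gauss(0, 0.05)))
                for d in survivors for _ in range(2)]
    return children[:len(pop)] or pop

pop = [random.random() for _ in range(1000)]
for _ in range(50):
    pop = next_generation(pop, direct_edit=False)
print(f"mean drive after selection: {sum(pop) / len(pop):.2f}")  # climbs toward 1
```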

The point is, human beings have drives and instincts and needs that were created by evolution. AIs will have very different drives and instincts and needs because their evolutionary history is going to be completely different in every way compared to human beings. And this includes AIs that were deliberately created to emulate human brains.

If a strong AI is possible, and there’s no good reason to believe it is impossible, then an AI that will react as if it were a particular human being is certainly possible. And if you can’t tell the difference between an AI emulating a particular human being and the actual human being, then what exactly is the difference? An AI that can talk to you as if it were Grandma, and talk about Grandma’s experiences as if it remembered them, well, if it acts as if it remembers being Grandma how is that different than what regular human beings do? If you can’t tell the difference, then what is the difference?

But it doesn’t really matter. An AI that remembers being Grandma is fine and all, but why should Grandma care if there’s an AI that remembers being her after she dies? The only reason this would happen is that people who care create these AI duplicates, and people who don’t won’t, and so only the people who care will have AI copies of themselves floating around. How these AIs fit into the future ecosystem of competing AIs is anyone’s guess. I’m going to imagine though that AIs that emulate humans are going to be drastically inferior to AIs that work like AIs, and any AI that persists in acting as if it were a digital copy of Grandma is going to only survive in the future equivalent of a digital nature preserve.

…but entertaining. :wink:

I don’t care. I suffer from severe depression; I’m not interested in extending my life, nor am I interested in being part of a grope mind (too many voices in my head now).

Call me when they have something to help people with mental illness.

I donno, that sounds potentially interesting.