Who among you would want immortality via consciousness/memory recording?

Let’s say you have a brilliant friend, Shawna, totally unaffiliated with Rhymer Enterprises, who, using entirely unmagical technologies, has perfected a method of recording the consciousness and memories of a human person. The recording, until activated, is as quiescent as any other computer program is when not running; it will be activated upon the person’s death, whereupon it will exist in a simulation it finds indistinguishable from the real world until some other hoser devises an android body capable of carting it around, at which point, if the digital version desires, it will be uploaded into that.

You’ve known Shawna for years (you saved each other’s lives during the wars), and you know that she is as empty of shit as is possible for a human to be. If she says she can do shit like this, it’s true, and you have utter confidence in her probity and good intentions. Shawna explains the process to you in detail and tells you that it is painless and safe. She’s used it herself (or rather, uses it herself, as she updates the recording on a regular basis), and the only bad part is that it requires her to be hooked up to the recording gizmo for five minutes every month. She even activates her own recording briefly and hooks you into the virtual world, so you can see how convincing it is and interact with Virtual!Shawna.

The process is insanely expensive. Fortunately, you saved Shawna’s life six times while she only saved yours thrice, so she’s willing to underwrite not only your initial recording but also the monthly updates. The initial recording will take a while: about five minutes for every month of your life to date.
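
For scale (this back-of-envelope arithmetic is mine, not part of Shawna’s spec): five minutes per month works out to exactly one hour per year of life, so think of the initial session as roughly a 40-hour work week for a 40-year-old. A minimal sketch, with the example age invented:

```python
# Back-of-envelope: the initial recording runs 5 minutes per month
# of life, i.e. 12 * 5 = 60 minutes = exactly 1 hour per year.
def initial_recording_hours(age_years: float) -> float:
    minutes = age_years * 12 * 5  # 5 minutes for each month lived
    return minutes / 60

print(initial_recording_hours(40))  # -> 40.0 hours (example age of 40)
```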

Do you take Shawna up on her offer? Why or why not?

No.

My downloaded self isn’t me. I’ll still die. My identity will still either cease, or transition into an afterlife. (For the sake of not polluting this discussion with an argument about the afterlife, presume it’s the former.)

Oh, from an external perspective, it’s probably the same. I mean, one “me” is as good as another “me” to someone else. The downloaded personality will still respond similarly to how I used to. From a “gnoitall Turing test” perspective, it’d probably pass.

But that entity in the computer will be perceiving the universe through a sensorium that isn’t mine, and thinking thoughts that aren’t mine. I am me and not you for exactly the same reasons.

I’d take her up on it. I want to live forever. Even though in your scenario it would not actually be me with the potential to live forever, I would be giving birth to a being that thinks like me and would therefore be very happy to have this gift.
I would prefer that my recorded self never be activated until I am actually dead. To activate a recording without letting it go on to “live” forever seems cruel, and since I hold that opinion, so would my recording. If my recording knows it is activated, it must be able to assume that it will continue to exist indefinitely.

Probably. It would be a comfort for Husband and my family.

The results are public on account of me being a jerk.

I’m doing it. To not do so would deprive the world of the awesomeness of me and that would be cruel and unusual punishment.

Since I will not be around to enjoy the immortality, I’m not interested. What do I care if a copy of me gets to live forever?

Sigh. The problem with most people’s objections is they aren’t well thought out, and they don’t reflect the actual limitations of the technology.

  1. For the foreseeable future, you gotta die first. You gotta be stone cold dead, probably with a chemical cocktail of plastination agents or cryoprotectants injected while your heart was still beating (or on bypass). Someone slices your brain very, very finely using an ATLUM (an automatic tape-collecting lathe ultramicrotome), and then an array of electron beams scans each slice and reconstructs your synaptome. A very powerful set of circuitry emulates it.

So…in case #1, you were totally dead. Kaput. Oblivion forever. And now something of you is still around. To me that sounds like it’s better than the alternative, copy or not.
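
If the “emulates it” step sounds hand-wavy, here’s a toy sketch of the general idea: treat the reconstructed synaptome as a weighted connectivity matrix and step a leaky integrate-and-fire model over it. Every size and constant below is invented for illustration; a real synaptome would be on the order of 10^14 synapses, not a ten-neuron toy.

```python
import numpy as np

# Toy "run the synaptome" sketch: the scan yields a weighted
# connectivity matrix W; we step leaky integrate-and-fire neurons
# over it. Sizes and constants are invented for illustration only.
rng = np.random.default_rng(0)
n = 10                              # toy neuron count (a brain: ~8.6e10)
W = rng.normal(0, 0.5, (n, n))      # synaptic weights from the scan
v = np.zeros(n)                     # membrane potentials
leak, threshold = 0.9, 1.0

for step in range(50):
    spikes = (v >= threshold).astype(float)
    v[spikes > 0] = 0.0             # reset any neuron that just fired
    v = leak * v + W @ spikes + rng.normal(0, 0.2, n)  # decay + input
    if spikes.any():
        print(f"step {step}: neurons {np.flatnonzero(spikes)} fired")
```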

  2. If you have anything like the technology described in the OP, you could transition very slowly and gracefully. A “realistic” scenario (i.e., allowed by today’s knowledge of physics but unbelievably difficult to engineer) is a series of growing tendrils extending from a larger source machine into your brain. The tendrils might be composed of trillions of cuboid robots; these robots drive down internal tracks in the middle of the tendrils and weld themselves onto the “growth edge,” which is how a tendril pushes into your living brain. They use nanoscale saws and sensors and other tools to take your synapses apart one at a time, scanning each to an extremely fine level of detail as they go. You are still alive and conscious. (Good thing the brain itself cannot detect pain.) As they work, the tendrils also drive deeper into the axons connecting one synapse to another, and while a synapse is being dismantled, they emulate the electrical signals it would have sent had it still existed. As you can imagine, over time more and more of your brain is replaced, and the “virtual” part is emulated by a computer located close enough to your brain that the speed-of-light delays are too short for you to tell the difference.

So instead of dying all at once or being copied, you are just unable to move but can function normally during the years or decades this process takes. You can’t tell the difference and your continuity of existence is preserved.
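
A toy way to see the continuity argument: replace the “synapses” of a network one at a time with functionally identical emulated copies and check that the network’s behavior is unchanged at every step. The numbers and the tanh network below are invented for illustration; the engineering burden is in making each emulated piece faithful, not in the bookkeeping.

```python
import numpy as np

# Toy continuity check: swap "biological" synapses one at a time for
# emulated copies that compute the same function, and verify that the
# whole network's behavior never changes at any step of the process.
rng = np.random.default_rng(1)
n = 40
bio = rng.normal(0, 1, (n, n))      # biological synaptic weights
emulated = bio.copy()               # a perfect scan of each synapse
x = rng.normal(0, 1, n)             # a fixed test input pattern

hybrid = bio.copy()
baseline = np.tanh(hybrid @ x)      # behavior before any replacement

for i, j in np.ndindex(n, n):       # one synapse at a time
    hybrid[i, j] = emulated[i, j]   # rip it out, weld in the emulation
    assert np.allclose(np.tanh(hybrid @ x), baseline)

print("fully emulated; behavior identical at every intermediate step")
```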

It sounds scary, but your brain dies a little piece at a time every day anyway. Doing it this way, you would actually get smarter over time, as the virtual neurons would probably act more like your neurons would if they were perfectly healthy and had limitless energy. Once you finally transition, you could probably “cut loose”: install tons of patches and upgrades and crank your thinking speed up to millions of times what it is now. That, to me, sounds pretty awesome.

I voted no. I might be willing to have my memory backed up for the sake of saving some of my knowledge or memories for use by future generations, but I don’t think that’s a virtual me in the sense the OP means.

But simply put, when it’s my time to go, I want to simply die. And while I recognize that the virtual me isn’t me (and it seems the OP recognizes that via the demonstration), I also realize that the virtual me would, from its perspective, be a continuation of my consciousness. So unless my own desires and expectations somehow changed enough through the process to change my mind on that (which would be odd, since then it really wouldn’t be a good copy of me), I can’t imagine its existence would be a pleasant one.

Simply put, though I am a theist, my sense of the hereafter is decidedly different from the typical notion of an afterlife. How it differs isn’t particularly relevant here, but the point is my work is done when my life is done, and it’s time to pass the torch to the next generation. I really have no desire for immortality, or rather an indefinite continuation of my consciousness as it is now or might be if digitized.

Further, as much as that continuation of my consciousness might be similar to me, I also think the actual knowledge of being incorporeal would be a difficult thing to overcome. Even though I’m not particularly attached to my body, it is the very filter through which I’ve experienced life. Would a future digitized consciousness attempt to duplicate those bodily senses, or try to improve on them? I think such a thing would be shocking, perhaps even maddening.

And what if some minor variation in how I’m copied, or some unforeseen aspect, caused that consciousness to deviate in an unexpected way? I would bear direct moral responsibility for all of its actions, good or bad, despite having no control over it. Even if it starts out acting in a way morally consistent with my beliefs now, approximation errors will build over time; it might take days or it might take years, but we’d have to expect it to exist long enough to reach that point. Thereafter, though it is derived from my consciousness, it is no longer remotely like what I am or what I would have grown into organically under nearly identical circumstances. Perhaps it will exceed anything I can achieve in the flesh and become far more intelligent and moral, or it might devolve into something so foreign and grotesque as to be unrecognizable as a former human consciousness. Regardless, bearing moral responsibility for creating such a being and setting it free with that risk is not an act I’m willing to take.

The virtual me is not me… but then again, me after a traumatic brain injury is also not me. So I’m somewhat in favor of this, even though living longer/forever is not a big priority for me. It’s certainly a small sacrifice to have the process done.

I’d have to make the decision after deciding a few questions, though:

  1. what is there for the virtual me to do? Current law wouldn’t recognize the virtual me as a person, and I’d hate to be unable to work, own property, vote, etc., even if I do have a wonderful simulation to live in. I’d have to talk to my lawyer about corporations or trusts and see if that’s a possible way to solve some of these issues. I don’t want my simulation to be just someone’s data on a hard drive that they can wipe out any time (or worse, force into virtual slavery). If we can’t solve the legal-personhood issue, then even the android body won’t be so wonderful.
  2. would there be any interaction with real people? How would this decision affect my family and friends? If me as a simulation interferes with them having a good life, then I’m happy to pass on.
  3. what do they do if a virtual person goes crazy? Are there virtual drugs? Virtual shrinks? It’d suck to have the virtual me go all manic-depressive or OCD. And, the more a mental illness can be fixed, the higher the risk that someone intentionally tampers with my program to screw with me.

And that’s why religion is bad for kids, mkay. You simultaneously believe in an entity powerful enough to prevent your physical death yet too stupid to know the difference between what you consider a “computer copy” of you and the “real you.” If you’re that confident in your religious beliefs, why not die now? If your beliefs prevent direct suicide, go try to save people from ISIS or some other lethally risky endeavor. After all, the future you might commit a bunch of sins that you in turn are morally responsible for. What if you live to old age, go senile, and the old-man version of you commits a rape? I guess your god will hold you to account for that, right?

Sign me up. The copy wouldn’t, in some sense, be the same person as me… but then, in that same sense, the Chronos of tomorrow, or last week, or three nanoseconds from now, isn’t the same person as the Chronos of right now, either. If it were the same person, I’d be dead.

So, just what’s so special about those persons of the future or the past, that I identify all of them with the same name? I call future-me “me” because it has the full set of memories and experiences that present-me has, and I call past-me “me” because present-me has the full set of memories and experiences that past-me had. When I say that I want myself to live forever, the real meaning of that phrase, if it’s to have any meaning at all, is that I want there to be a being that retains the set of memories and experiences that I have.

Now, technology like what’s described makes it possible for there to be more than one such being, but that possibility doesn’t present any inconsistency. Any such being has an equal claim to be “me”, even though they would not be the same as each other (neither one’s memories and experiences would be a subset of the other’s). “Is the same person as” is apparently not a transitive relationship.
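
You can make this precise with a toy model (the memory sets below are invented for the example): call B a continuation of A if B retains all of A’s memories. Each copy then continues the original, but once the copies diverge, neither continues the other, so the relation fails transitivity.

```python
# Toy model of "is the same person as" via memory retention.
# The memory sets are invented for the example.
def continues(a: set, b: set) -> bool:
    """b is a continuation of a if it retains all of a's memories."""
    return a <= b

def same_person(a: set, b: set) -> bool:
    return continues(a, b) or continues(b, a)

past_me = {"childhood", "the wars"}
copy_1 = past_me | {"life in the simulation"}
copy_2 = past_me | {"life in the android body"}

assert same_person(past_me, copy_1)     # each copy has a claim to be "me"
assert same_person(past_me, copy_2)
assert not same_person(copy_1, copy_2)  # but they are not each other
print("'same person as' fails transitivity once the copies diverge")
```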

You know, the same technology would probably let you re-synchronize your virtual selves. Only in bad science fiction novels are digital copies of a person unable to be re-synced with each other.

I say yes. Ultimately, what’s the difference between “me” running on organic, living cell hardware, and “me” running on electric hardware?

It reminds me of that thread some months (years?) ago about anesthesia and whether you’re really “you” when you come out of it, or if that old “you” dies because your brain is essentially shut off entirely, and then you reboot, and a new “you” with new memories takes over the machinery.

As presented in the OP, sure: whether you regard VirtualMe as RealMe or some sort of offspring doesn’t matter; something of me gets to endure.

You force me to put on my evil hat.

:: dons Cobra-Commander helmet, shoots random Hungarian in leg to establish credibility ::

So, in my malevolent incarnation, I have stolen Shawna’s technology and used it on you. Now, obviously the no-hitting-chicks rule prevents me from murdering Shawna out of hand to assure RhE’s monopoly on the tech … but you’re a dude. Evil!Skald’s a dick, as you well know, so just to be a jerk, I am presenting you with a choice. I’m either going to erase Virtual!Chronos in a fashion that will be painless to [del]him[/del] it, or send in a throttlebot to beat the heck out of you, causing you a great deal of pain but not killing or even crippling you. You may choose which happens.

What’s your choice, my Noldorin friend?

There have been several threads on this idea (though not with the entertaining set-up) which usually descend into both sides staring blankly at each other going, huh?!?

Put me down in the ‘no thanks’ camp; having a virtual copy does the real me no good at all, and I’m not arrogant enough to think the universe needs another version of me either.

Also, I’ve read ‘Surface Detail’ by Iain M. Banks and ‘Altered Carbon’ by Richard Morgan; at least a real, physical me can only be killed once, while the horrors that could be inflicted on a sentient virtual entity defy imagining.

Sure, why not? The worst that could happen is it doesn’t work, right?

Challenge accepted!

Also, you undercut the Virtual!Me-Is-Acceptable argument with the phrase “real virtual me.”

Uh oh, what if I said I was female? You don’t hurt girls, right? (Ah, the joys of the internet; it’s impossible to tell.)

I don’t think I said ‘real virtual me’ at any point though? Just ‘real me’ and ‘virtual me’.

On thinking about this further, and having recently watched the underrated movie ‘Transcendence’, I think I could go for the slow-scan upload idea rather than memory backups.