I would need some questions answered, because I would expect to use the program while still alive – like an online secretary and assistant. Does it know it’s not me? Will it obey me? Can I establish limits to its behavior while I’m still walking around? For instance, if it gets pissed at me for some reason, I wouldn’t want it to empty my bank account buying QVC jewelry.
I…can’t really imagine that the world would benefit enough from a backup “me” to justify the trouble and expense of creating one, or that more than a handful of people would really miss me once I’m gone. Beyond that, I couldn’t personally consider it anything more than an act of vanity.
Plus, even if the Virtual Ranchoth really is a conscious entity, but indistinguishable from me in all respects…well, in a phrase, “who wants to live forever?”
Perhaps the implication is a kind of universalism, and everyone has a fair chance at downloaded immortality. “No mind left behind.”
How incredibly egocentric do you have to be for this to even occur to you as a ‘good idea’?
This has got to be among the creepiest hypotheticals ever suggested on these boards.
And: No, I don’t vote on ‘Public’ polls - take a guess what my vote would be.
Interesting - that book (one of my favourites) had pretty much the opposite effect on me.
Can you explain why, or is your argument solely based on something deep in your spleen somewhere?
I’m not in a good enough place to contemplate living forever. Maybe if things improved, but even then I wouldn’t want to take the risk of being deeply flawed for an eternity.
Maybe I should post when I’m in one of my mini-mania moods and I’d say of course I want to live forever.
Whether it’s really me or not is irrelevant to the question of whether I’d do it or not (I would, BTW, no hesitation) - even if all it means is I’m serving as a template for a different entity, I’d do it, because how cool would that be?
And yeah, I’ve read Surface Detail, but the same Banks who postulated the Pavulean Hell also postulated Infinite Fun Space…
My main problem with immortality is the eventual boredom of living. How bad that gets would depend on the actual timespan of the immortality in question. Does immortality end with the end of the universe? If so, and our universe ends in a Big Crunch, that wouldn’t be a problem—heck, I can do 100 billion years standing on my head. If it ends in a Big Freeze, 100 trillion years could get a wee bit tedious…but, still doable. But forever and ever? Even having a perpetually hot chick all that time would get wearisome.
I agree with those who believe that, under normal circumstances (as normal as one can get in a farcical hypothetical), the person uploaded into a synthetic realm would not actually be me. But I won’t fight the hypothetical, so consider both possibilities for whether my self-awareness successfully transfers to that bank of thoughts & memories at the time of my physical death. If my SA is not transferred, I’d accept uploading without hesitation, because having a version of me around for eternity would be a good thing for all mankind, and it’s up to that guy to deal with his ennui.
But, if my self-awareness does get transferred in the upload, I’d need some assurances before signing on. Will I be stuck with this virtual Shawna chick for all eternity? I mean, she may be young and beautiful and sweet-natured for now, but what happens in a few thousand years? Will she get digitally old, wrinkly and sourpussed? If so, I want the option to kick that battle ax’s virtual ass to the curb.
Back to the boredom thing again: does my environment continuously change in my virtual realm? Or does it stay in lockstep with the actual physical world? Given the relatively slow pace of change in the physical world (even with new versions of iPhones coming out all the time), I can’t imagine not getting bored after eons and eons. I’d want the option to turn myself off for maybe 100k years at a time, and then turn back on for 100 years at a time. That should give me enough change per lifespan to keep me interested, at least for a while. I’d also like the option to turn Shawna off at my discretion, in case she turns into a virtual nag. Can you imagine being nagged at constantly for eternity?
Better still, I’d like the ability to erase significant amounts of my memories at will (particularly the good ones). After 50 trillion years or so, with all memories intact, I can’t imagine being able to say, “wow, I never saw one of those before”, or “hmmm, that’s a position I never before considered.” I’d want to erase good memories so that I can experience them again as something new and exciting.
I wouldn’t choose to be uploaded into an android body until they develop a giant inter-galactic traveling octopus android with fingers and opposable thumbs at the tips of all 8 tentacles. I’d want Shawna to remain in human form, however. The thought of having sex with a giant octopus [del]squids[/del] squigs me out.
Also, I’d need the ability to erase Pauly Shore and the Kardashians from my virtual environment—that’s a deal-breaker.
Yes, absolutely. I want to be there to see what happens. I have fantasized about becoming a ghost when I die (yes, I know, they don’t exist), because I would be happy to have zero interaction with the rest of humanity if I could just be there to see how things turn out.
The sweep of history has always interested me and I am pissed off that I won’t get to see any more of it.
You don’t HAVE to live forever. Basically, it would be your choice. That’s all I’d like: to have the choice of more time if I want it (which I do…too much interesting shit is happening today and I want to see how all this plays out). If you get to the point where you’ve read all the cool books, seen all the cool movies, done everything and really are that bored (I can’t imagine that myself, but who knows?), then you could always have yourself switched off and erased or whatever.
Sign me up! There are too many books I haven’t read, too much music I haven’t heard, and too much food/drink I haven’t sampled.
I’d prefer it if I could flag some memories for deletion or at least give them ‘do not open’ tags.
Actually, it’s the kidneys which are killing me.
For me to even think ‘yeah, I really should stick around forever’, I would need an incredibly high opinion of myself. What is there about me that is so special that I should get to disobey the Universe’s Prime Directive for Living Things: you get X period of time, no more, no less? There is nothing so special about you that you should violate that very specific rule.
I am not that special. I am a bit above average in these areas, a bit below in those areas. Not so Special that I deserve eternal life.
And then there’s putting this ‘perfectly preserved duplicate’ of myself into an artificial ‘body’ so that I (who will have been dead for 6,000 years at that point) can inject my special, eternal thoughts/actions into situations I (the original, meat-and-blood Me) have never even imagined.
I’m not that special.
Keep me around and let me observe? Maybe. As long as I have easy access to an ETERNAL ‘off’ switch.
Nope. Nothing special. I just don’t want to die quite yet. If “not wanting to die, thank you” is “incredibly egocentric,” then so be it, although I think that’s far from demonstrated.
I also like having plenty to eat, lots to read, health care when I get sick or hurt, and a hell of a lot of other nifty things that “nature” never came up with. Am I “incredibly egocentric” for being happy that there’s food in the fridge, a car in the garage, and a hospital just up the street?
Why should nature’s cruel laws extend to our personal extinction if there’s a way around it?
I see that somebody read Heechee Rendezvous by Frederik Pohl.
How old are you?
Have you not seen your own, personal ‘Stop Sign’ yet?
After a certain point, I came to find great comfort in the knowledge that this life WILL end - I will not need to put up with the ‘slings and arrows’ forever.
YMM, quite obviously, V.
I’m about to turn 60, and I’m looking right down the barrel of that gun.
Damn straight. Also, asking for a couple more hundred years isn’t the same as asking for “forever.” Mathematical infinity isn’t in the cards, but that doesn’t mean I have to agree with dying in the next twenty years or so.