Would you transfer your mind into a computer?

The science behind the following proposal is still a few weeks off, but pretend for a moment that the future is now.

**The Setup:** Today, we know exactly how the brain works as an organ, and we fully understand the nature of consciousness. Our grasp of memory is so complete that we can actually create artificial memories–it’s tricky, but not an uncommon practice in the more upscale travel agencies.

Concurrent with the brain stuff, computer stuff has come a long way as well. The entirety of an average human mind can fit on a 3-D crystalline memory device roughly the size of a baseball. One advantage this storage device has over the older magnetic devices is that it is, comparatively speaking, immune to data loss. It’s not perfect, but it’s far better than 20th century data storage and notably superior to biological devices. CPUs and data access are at least as fast as a human brain in cheaper machines, and deluxe models are, well, mind-blowing.

**The Poll:**
You have the opportunity, for a mere 30,000 quatloos, to transfer the data contents of your current brain into the above technology. The transfer destroys the biological component, so there’s no going back. It’s been done before, and the result was that the receiving hardware expressed amazement and swore up and down it worked as advertised–the person’s mind was now in the box and experiencing fabulous memory retention and recall.

Would you do it? How about if your computer were linked to an autonomous…oh hell, they’ll stick it in a mid-sized android for another 30,000 quatloos.

Hell yes.

Weeeeell… subject to a lot of caveats that you seem to have addressed already. The substrate would need to be more durable than what I have at the moment… you imply it would be. It would need to be, at least in principle, capable of doing exactly what my current brain does (cognitively… fuck the hormone regulation and so on), but faster. And it would need to be less error prone*. And ideally, I would still need it to be ‘me’, but there would only be two ways of judging that: the first-person way, which you say is reliable (‘I’ would still think ‘I’ was ‘me’), and the third-person way, with lots of psychologists all giving me a thorough Turing test.

*To some extent this conflicts with the previous specification, since our brains are shaped to be error prone. I suppose I would settle for having a mind that was like my current one, but could farm out questions of pure logic and calculation to a more traditional microprocessor.

**Also I’m drunk. To be honest I don’t think I’d miss this.

I guess the big gray shady area is: we have the previous guy who said it worked, but how do we know his awareness is in the box, as opposed to there just being a computer program that appears to be him and thinks like him? Could be his awareness was actually snuffed out and what replaced it was something else.

That’s the risk. Would you still take it?

No; I recently read ‘Surface Detail’ by Iain M. Banks. In that book, copying people’s minds into a computer-generated virtual reality resulted in the creation of virtual heavens…and virtual hells…

So no thanks!

In the hypothetical situation proposed, would the mind of the person on the storage device be truly the same as the person who once upon a time posted on the SDMB under the name Gil-Martin, truly and (if I may use the word) metaphysically? Would my consciousness (whatever exactly that may be) somehow just move to the new iSoul or eMind or whatever? If yes, I could offer you a big fat maybe.

However, if the new **Gil-Martin** were simply a copy, however similar, the answer would be probably not. I say probably not because I think providing an exact duplicate of myself might be a nice thing to do for posterity. :rolleyes: But seriously, if the original me were destroyed, if there were a break in the continuity of my selfhood (again, whatever exactly that may be) during the transfer of my memories and thought processes as they were extracted from my squishy gray wetware, what would be the point? The answer to this, from my point of view, would be “not much.”

Sorry if this isn’t too clear. I’m not a philosopher, and at the moment I am considerably overmedicated. Have a nice night. :slight_smile:

Ohhh…batten down the hatches, I’ve read several threads here and elsewhere on this very subject and it tends to get heated.

For myself, I agree with you (my above post notwithstanding) that unless it is a relatively slow and seamless transference, the copy is not actually you, although it may think it is.

But we’re in the minority on this one.

Yes. Yes, I would.

Hell yes. My monkey brain sucks. Riddled with phobias, illness and psychological trauma. Having an intelligently designed brain would rock.

I once heard singularitarianism derided as ‘religion for the 140 IQ crowd’, which is pretty accurate.

Not unless I did a full audit of their quality systems, including their vendor qualification processes.

Unless I was over 90; then I’d risk it.

If I have 300,000 quatloos, can I get 10 copies of me in 10 computers? Or 5 computer brains + android bodies?

i’ll wait a few more weeks for version 2 where, for yet another 30q, i can upload to a digital copy without killing myself.

Relevant: today’s SMBC

Quoth Gil-Martin:

Everyone always seems so fascinated by the easy questions like this that they seem to ignore the hard ones. Like, is the Gil-Martin who woke up this morning the same person who went to bed last night? Is the one who’s reading this post the same person as the one who wrote the one it’s replying to? Is the Gil-Martin of right this exact moment the same person as the Gil-Martin of a nanosecond ago? The only way of answering any of these questions is the same way we answer them for the mind-in-a-box: By asking it. And we’re told that, when asked, the mind-in-a-box says that it is in fact the same person. If you’re going to worry about that, then you might as well worry about dying and being reborn every nanosecond.

Back to the OP, there are some things I would want settled before I’d be willing to go through the procedure. First, how do I interact with the world? Assuming I don’t get the android body, is the mind-box just put back into my skull and interfacing with my body the same way my meat-brain did, or is it living in some sort of virtual reality? And if I spring for the android body, is it capable of everything my meat-body is?

Second, I’d want to be absolutely sure that none of the “flaws” the mind-box fixes are actually features rather than bugs. For instance, if I have better deductive-reasoning skills, will that impact my capability for inductive reasoning or creativity? Are those chemicals which influence our meat brains maybe more important than we think? Do dreams still work the same way?

Nope. For the same reasons I’d not use a Star Trek transporter. It’s not a transfer; it’s a destructive copy.

Chronos, if you would consider this one: multiple parallel universes, time travel, and a convoluted hypothetical requiring that your present self be dead for whatever reason. So you’re facing your future self (from a nanosecond hence, if you prefer) and he kills you. Are you dead?

i shall cling to the flesh that i have deemed to be Me and reject all others that claim to be what i can see is not. i will not be assimilated.

My mind is a computer but it’s operating on Windows ME.

I’m screwed.

This is not correct, as we have a sense of continued consciousness (aka a soul) in real-life situations. The fact that the original person has to be destroyed means there is no transfer. Both would think they were the same person, and both would be correct from their own points of view, so we kill the one that actually wanted to make the transfer.

The only way this could work is if the baseball were implanted in our brains, and our bodies were allowed to slowly convert over until the ball had all of our memories, thus simulating the same effect we currently rely on to deal with the replacement of cells in our bodies.

And, even then, for those of us who are spiritual, who knows the implications?

Would think it was me, would NOT be me.

Maybe at the end of my life. But likely no.