Let’s say someday scientists create a Matrix-type simulated world that you could enter; and let’s say this simulated world is so much better than the real one you inhabit that you want to jack in and never go back. So you sign up to jack in.
Question: if you had the option of erasing your memory of this world and all the things that suck about it before you jacked into the new simulated one, would you? Also, you can erase your memory of ever having jacked into this simulated world, so that when you wake up in this simulated world you’ll never know it’s simulated (you’ll be living a lie and never know it). Would you do that too?
I believe if scientists ever create a simulated world like the one I dream of living in, I would erase my memory, at least most of it (the parts that suck), and I would also erase my awareness that this new world is not real, that the people around me are just programs, etc. Does that make me unethical or dishonest?
Humans are not rational beings; they are rationalizing beings.
Most people, if given that kind of choice, would do what it took to make the simulation compatible with their desires. Most, if not all, of the time that would mean restarting your awareness in medias res, in a sort of Last-Tuesday kind of concept. Why bother with your previous memories or even your talents if you are allowed to live in a boundless world? Of course, The Matrix posits an oppressive simulation of our already oppressive world, so perhaps your prior skills might be useful.
Okay, let me pose a more specific version of my original questions so I can get some different responses than the ones I’ve gotten already:
Say that you could jack into this simulated world while still keeping all the memories you have of this world, BUT you won’t realize when you wake up that you’ve just transitioned from the old world to the simulated version (that will start to morph into the better world you want for yourself, gradually). In other words, the only memories you erase are the ones involving your realization that a simulated world had been created, and your memories of then jacking into that simulation.
You could hardly argue that those specific memories are a significant part of your personality at the time you jack into the simulated world, right? You’ll still have all your other memories, just not the ones that remind you that you just woke up in a simulated version of your old world, one that will gradually become the better one that you wanted to live in.
The problem there is that, if you erase any trace of jacking into the simulation, then the simulation has to be seamless with the real world (and my last memories pre-erasure), or else I’m bound to immediately freak the fuck right out. But if the world of the simulation is so similar to my world that I’m expected not to notice the difference, why would I voluntarily jack in to the point of losing my “real” existence?
I mean, some assurances (from whom?) that it will gradually get better won’t be enough to make me sign up. For all I know, these motherfuckers haven’t even fully debugged their current-world simulation, let alone have a satisfying better-world one in the pipeline. The trust issues there are non-trivial, even more so if you throw tampering with my head meat into the mix.
Besides, the world I’d want to live in is not subtly different from the one I’m stuck with. It’s night and day. There might even be dragons :p. How do you drive a slow and steady change from Modern Day Europe to high fantasy cyberpunk space opera male power fantasy, and how long is it gonna take? I’m from Generation X, my people don’t like to wait, where is my immediate gratification goddammit?!
Let me guess, Came in like a Wrecking Ball.
You have already done this for those who are answering this thread and you are just testing continuity.
Timewinder and Der Trihs make a difficult-to-assail argument that I probably should have thought of when I posted my first reply. My excuse is that it was after midnight and I was tired.
If you had asked me this question 3 months ago, I would’ve said yes, wipe my mind. But now there’s a person in my life that I have no intention of forgetting, so no.
I don’t know if I would even feel a big need to. If I happen to be the sort of person who would happily live in a fantasy world voluntarily, I don’t know that I would care enough about what’s real to actually complete the delusion.
I’d need some pretty damn good reassurances that the new reality I’ll be experiencing really would be a paradise.
The problem with these virtualities is that there is really no limit to the horrors a suitably sadistic mind could come up with; at least in the real world you can only be killed once.
I admit it, ‘Surface Detail’ by Iain Banks (in which virtual heavens/hells featured prominently) freaked me right the hell out…
On the topic of the thread I wouldn’t wipe my mind, for the reasons cited above and also because part of the appreciation of paradise would be remembering the bad things about the world I’ve left behind.
I do think that being able to select, choose, and experience different realities would be my personal idea of paradise. The only ethical problem is: what about the other apparently sentient intellects that would have to be created to populate the new reality along with you? What right do you have to make things potentially unpleasant for them just to get your own jollies?
Ok, in this weaker form, I might consent to having that small amount of memory wiped if it were required to complete the transition, and there was some benefit to me of being in the simulation (e.g. I was going to die in the real world shortly, but could live on in the simulation indefinitely).
But if it were optional? Maybe I don’t understand the original question, but why would it matter to me if I remembered the transition from one world to the next? What you’re talking about is deliberate (albeit fairly minor) brain damage that removes a fact of extreme importance for me to know. There’d have to be a heck of an upside for me to damage even that small part of me voluntarily. “Making the transition seamless” just doesn’t seem that important. Specifically, it seems less important than being aware of a basic truth about the universe: that it’s just one world of at least two, and that I have experience in both of them.
Yeah, but the inevitable insanity has been a real headache for the rest of us. You’re on your 33rd restoration of the backup now, and we’re running out of kittens.
No. Part of enjoying a perfect world is knowing that an imperfect world previously existed. I might like the option to TEMPORARILY forget that I’m in a simulation (and not that the world just didn’t get a lot better), but that’s about it. And even that is something I regard as a failing of my mind, not something I should actually need.
The only reasons it makes sense that Cipher in The Matrix might want to forget are that, first off, this world is controlled by machines with malevolent intent, so it taints the world. Second, as long as he knows, he’ll have a feeling of duty to try to get people out. And, third, the world isn’t perfect, so that doesn’t as easily make up for reasons 1 and 2. Knowing it isn’t real by itself isn’t a good reason.
And, frankly, in his version, he was effectively killing himself, since he would become a completely different person, and I think the machines knew that. There’d be no real difference between having someone else go through the same thing and simply ending his life. If said person must be attached to his body, have it be a program who wants to be human. Or just kill another body but attach their mind to his body.