Personally, I’ve always thought that the whole “human vegetable” phenomenon is the ultimate Faustian bargain of modern Western society. We have people spending days on end wasting away in front of their televisions (and, increasingly, computers), not really thinking anything and not doing anything about it.
[true story example]I remember I was at a friend’s house once. We decided to go to the mall. On the way out the door, he told his little brother where we were going. The kid just sits there, playing Nintendo. It was very creepy, because the kid was just a zombie in front of that screen. “You know that movie, The Matrix?” my friend asks. “Yeah,” I say. “It’s reality,” he says.[/example]
And also…
Metropo:
I think the statement is BS. That kind of thinking made us slaves to the Catholic Church for over 1000 years and still enslaves some. If you think your whole reality is whatever you have been fed by some religious authority, you will be condemned to be a slave to some half-brained ape in religious vestments.
Beeruser:
Good points. There is always a point to knowing how things really work.
General Fact:
We use all of our brains (most of us, anyway). Snopes has something to say on the topic here.
Uhhmmmm, I don’t see how this follows from my statement. In fact, I’m not entirely sure how I can respond to it, because you aren’t being quite clear. How does thinking one’s “whole reality is whatever you have been fed by some religious authority” follow from my statement? I don’t see the connection at all.
In fact, what I’m trying to get across is that one shouldn’t form one’s conception of reality based on religious authorities. To take an example, fundie Christians tell you, “God doesn’t send people to hell, they send themselves there.” But according to their own conception of God, God made the rules, God made the universe, God designed the human mind, and God made the environment into which those humans would be born; God is therefore entirely responsible for every person who goes to hell.
I honestly don’t think In vino veritas would be appropriate for this situation. I think that wine and booze belong inside the cave of Plato’s Allegory of the Cave, not outside in the world of knowledge.
Neo: “Why do my eyes hurt?”
Morpheus: “Because you’ve never used them.”
This analogy doesn’t apply. A better analogy would be, let’s say I’m making $20/hour, but instead of making Federal Reserve Notes, I’m making Disney Dollars. But everyone everywhere accepts Disney Dollars. Cool. Now, let’s say my co-worker is making $10/hour, but he’s getting paid in Federal Reserve Notes, which are more “real.” Am I jealous of him? Hell no. I’m getting twice as much buying power with my “fake” money than he’s getting with his “real” money. Same thing. The “real” reality is worse than the “fake” one, it just has the appeal of being somehow more “real.” (And you have no way of proving that it is real, anyway, just like none of us can prove that our own personal reality is real. We might be drooling vegetables.)
**
If I could live as a drooling, euphoric vegetable within my own personal lucid dream for the rest of my life, I would say, sign me up. The only thing that would keep me from making that choice is that I like interacting with other people, who introduce elements of unpredictability not available in a purely personal reality. (Of course, I have no proof that “other people” really exist outside of my own mind either, but since it seems to me that they do, they might as well actually exist.)
**
You would have as much free will inside the Matrix as you would have outside the Matrix. The difference is that your free will is subject to different rules in each reality. Then there is the question of whether any free will is more than an illusion. You’re still subject to your hormones and neurons, exactly as you would be within the Matrix. The difference is purely one of scenery. Unless the same connection that’s creating your reality is forcing you to act in certain ways within that reality, your free will is no more illusory within the Matrix than without it.
Truth? What do you mean by truth? How do you know your current reality is the truth? You don’t, and you never will. Let’s say I wake up out of the Matrix, see that there is another reality, presumably the “real” one, then go back inside the Matrix because I like it better in there. I now know the “truth”; I have simply made a choice of which environment I want to experience. I’m no longer deceived, so what’s the problem?
Sometimes, after I wake up in the morning, I choose to go back to sleep for another hour, because I know when I do, I will probably be lucid dreaming. I still know which reality is the more “real,” my dream or wakefulness, but I choose to spend some time in my own mind. Am I “copping out” because I’m having a bit of fun in my personal reality, rather than being in the “real” one?
What about the issue of having an external power with absolute control over everyone’s lives? “All power tends to corrupt; absolute power corrupts absolutely.” Of course, the AIs aren’t human, so perhaps the aphorism doesn’t apply to them, but they don’t seem to have much compunction about killing their human “batteries” when necessary. I’m not just talking about the rebels, either: those Agents repeatedly jump into innocent bystanders, presumably killing them in the process. They also don’t seem to exercise a whole lot of fire-control discipline when blasting away at some fleeing outlaw. (Of course, you could argue that if there were no rebellion, there would be no need to have Agents going around zapping innocent bystanders, a self-serving argument often used by tyrants. And the rebels are pretty ruthless about innocent bystanders, too, but two wrongs don’t make a right.) Even if there were no organized opposition, and thus perhaps no need for active repression by Agents, I still get the distinct impression that if the AIs needed to turn down power production in Sector 17, they’d have no compunction whatsoever about offing a few million people.
To broaden this a bit from the exact scenario of The Matrix, is it all right for everyone to be living in a virtual reality, if someone else has control over all the life-support systems?
Something that seems to have been overlooked in this debate is that the rebels aren’t trying to free all of the people trapped in the Matrix. As they explain to Neo when they revive him, anyone taken from the Matrix after the age of five or so has a mental breakdown: they can’t handle reality. So all those people in the Matrix are stuck there for life, no matter what. What the rebels are fighting against is the continuing efforts of the AIs to wipe out all free humans. The question this raises is: if it becomes clear that the only way to defeat the AIs is to destroy the Matrix and everyone in it, is that a morally justifiable act, considering that the alternative is to be eventually hunted down and destroyed yourself?
Also, as to the side debate about human reproduction, the movie features a brief, Giger-esque f/x scene of tall, spidery robots harvesting pod-born fetuses from some sort of bio-mechanical plant, then pulls back to reveal similar plants stretching to the horizon. Morpheus also mentions in this scene that the AIs use humans “and a form of fusion” to get all their power. So the AIs aren’t relying solely on the dubious “human battery” concept for their power. Personally, I think the AIs, being computers, can’t innovate: they need to keep humans around for new ideas.
The human battery explanation seems dubious to me as well. If the AIs represent a computer program that achieved sentience, I speculate that they evolved from software whose original purpose was entertainment. Think Doom or Myst (or perhaps David Cronenberg’s eXistenZ).
So, a perfect world would be rejected by the AIs, since that would make a boring game. Perhaps a more fantastic world was rejected, since they lacked the creativity to design one: they had to start with a world that the humans were familiar with.
Nonetheless, the AIs are mostly benevolent; otherwise they would have dispensed with these squishy humans long ago. Morpheus et al. are rightly seen as potential terrorists, who might at a minimum introduce bugs into the AIs’ rather complicated computer system.