Submitted for your approval:
The X-Files, “Je Souhaite” season 7 episode 21
I think we get your point that “Powers Like These Are Meant For God Alone”. The trouble is that you are using episodes of television shows to prove your point … shows that rely on dramatic tension and/or a continuing cast of characters in a set situation. Of course the writers make sure something bad will happen to and/or because of those who receive god-like powers; otherwise the story is boring, and/or you offend some religious group, and/or you’ve killed the premise of the show.
Given the constraints of the OP (that I’d still have my own mind), my greatest concern would be my inability to foresee the consequences of changes I might make. As it is, not being a politician, my effect on those around me is more or less limited to what I can predict and contain. Obviously, I could still make disastrous mistakes, like causing a massive pile-up, with deaths, property damage, and wasted time and resources, but I have enough incentive not to do that that I wouldn’t do it on purpose.
Instead, with these sorts of powers, how could I reasonably predict the butterfly effect that could stem from even a thoroughly well-intentioned change? Without omniscience, I cannot, so any change would have to be thoroughly thought out, and even then made with great caution. For instance, there’s the mention of a volcano which, perhaps, I could stop, but what if that causes pressure to build and creates a more violent eruption or an earthquake? Without omniscience I can’t even predict that, and would I be able to act fast enough to do anything about it? It would basically be like taking God and turning him into a superhero: a lot of power, but only able to perceive and react as fast as a human mind can.
I also couldn’t make any changes I would consider unethical, and that particularly includes tampering with free will. Sure, playing the role of a superhero, I might act to prevent specific harms, but I wouldn’t just flat-out remove something that I consider a fundamental part of what makes us human.
I could see using the powers akin to how Dr. Manhattan does in Watchmen, or as Connor claims he will at the end of Highlander: helping to advance research and improve the human condition. I would be afraid to fundamentally change anything to make possible something that shouldn’t be, but there’s no reason I couldn’t help with experiments or with difficult engineering projects and that sort of thing. Maybe there are some things I could do to help with negotiations or whatnot in tense areas of the world, particularly the Middle East.
Or maybe, once you get powers like these, including the ability to see the future, you’ll realize why fixing everything and coddling everybody is not the right thing to do; and in the end, you’ll let Humanity fight it out against the environment in which it evolved, achieving a goal that was previously unknowable to you, and is presently beyond our understanding.
Or maybe not.
Seriously. You haven’t demonstrated that, and it reeks of Panglossia. (I agree the possibility is worth stating, however. I’m not ruling it out.)
So is it your contention that these television episodes were written by great philosophers trying to show us why we shouldn’t try to reach further than our grasp, and not by stock scriptwriters told that whatever they do must be undone by the end of the show and/or “Whatever you do, don’t offend the religious conservatives!”?
Build a world simulator and game out the consequences.
Problem: does the world simulator contain conscious entities? If no, then its accuracy would not be clear (though the exercise might be worth doing anyway). If yes, then you’ve increased the moral consequences, rather than reduced them.
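For what it’s worth, here is a deliberately crude sketch (in Python; every name and number in it is invented for illustration) of what “game out the consequences” could mean with a sim that contains no conscious entities: run a toy model many times, with and without the contemplated change, and compare the outcomes. Building a model that actually tracks the real world is, of course, exactly the hard part.

    import random

    def simulate(years, intervene, seed):
        """A toy, explicitly non-conscious 'world': one well-being score plus random shocks."""
        rng = random.Random(seed)
        wellbeing = 100.0
        for _ in range(years):
            shock = rng.gauss(0, 5)            # unpredictable events: the butterfly effect, crudely
            boost = 2.0 if intervene else 0.0  # the change being contemplated
            wellbeing += shock + boost - 0.5   # slow background decline
        return wellbeing

    # "Game out the consequences": many paired runs, with and without the change.
    baseline = [simulate(50, False, seed=s) for s in range(1000)]
    changed = [simulate(50, True, seed=s) for s in range(1000)]
    print("average effect of the change:",
          sum(changed) / len(changed) - sum(baseline) / len(baseline))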
Behold!
Star Trek, “The Squire of Gothos”, season 1 episode 17
Star Trek, “Errand of Mercy”, season 1 episode 26 (Organians!)
It’s not that I think scriptwriters, or sci-fi writers (some good ones worked on Star Trek), are great philosophers; it’s just that I think y’all aren’t (great philosophers), and I can see that events mirroring most of these TV examples would be more likely than not if any of you were granted God-like powers.
Also, Drama is Art … don’t underestimate Art. Art is Truth (or it’s not Art at all).
Godlike powers?
The people who bullied me without mercy in elementary, middle, and high school would get their comeuppance.
Wander around the planet cleaning up the worst sorts of polluted sites. Free hostages and slaves (if you want to fuck up your own life have at it but I will protect the victims and those with no say). Redirect governments from war and conflict to science and medical research/projects.
Long term: gradually and humanely reduce the human population to a more sustainable level, and encourage space travel/colonization.
Grin! I was gonna pop up with something vaguely similar: create a “sim” program (without true conscious self-awareness!) that would accurately model the world, so I could know the full consequences of the changes I’m contemplating making.
Once I know all the repercussions, then I can act.
(As a “strong AI” believer, I think that an accurate sim would have to have self-awareness and consciousness, but, hey. Even if so, I’d rather cause a new Ice Age on a simmed earth than on the real one.)
(So much for that remedy for global warming…)
First thing I would do: back up everything so I can restore to a known good state.
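(Roughly the idea, if you’ll forgive a toy Python sketch in which “the world” is just a dictionary and the keys are made up, is an ordinary checkpoint-and-rollback pattern:)

    import copy

    # Toy illustration only: "the world" is a plain dictionary with invented keys.
    world = {"population": 8_000_000_000, "ice_ages": 0}

    checkpoint = copy.deepcopy(world)   # back up everything first

    world["ice_ages"] += 1              # a change that turns out to be a mistake

    world = copy.deepcopy(checkpoint)   # emergency restore to the known good state
    assert world["ice_ages"] == 0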
Then I would start to investigate subtle uses of my power. Presumably I could just instantly wipe out cancer, but doing so would be pretty obvious. Instead, can I alter reality such that the drug I know a certain researcher is about to discover, which would have been 5% effective against certain types of cancer, is instead 99% effective against almost all kinds of cancer?
I think it’s fairly “easy” to ease a bunch of somewhat prosaic physical problems for humanity. Hey, some scientist, one who is fortunately working for some nice benevolent NGO, just stumbled upon a really, really awesome new breed of potato that grows in an incredible variety of climates and is very nutritious. Yay! Oh, and all the cane toads in Australia just caught a really horrible and deadly disease … good thing it affects ONLY them and isn’t going to mutate to affect other species. And, gosh bless it, a tiny remnant population of (some species plausibly thought to be extinct) was just found in (some plausible and remote location).
Much trickier are issues of social justice, policy, etc., particularly because I’m so far from certain that my beliefs are actually the correct ones, and it’s much harder to just kind of do things. That argues for the kind of sandbox/simulation approach that others are suggesting, although it raises some VERY serious ethical issues: if you terminate a simulation, how is that different from killing 8 billion people? (Of course you could argue the same is true of my save/restore, but that’s to be used only in case of emergencies.)
I really don’t want my actions to be interpreted as those of God (either God that people already believe in, or me). One way around that would be to conjure up some super-advanced aliens to show up and be benevolently helpful. Need a source of infinite clean energy? Aliens show up and give one to us just 'cause they’re nice, etc.
Give myself the unlimited wisdom to use my omnipotence properly.
Can anyone think of any negatives for my proposal from post #11 (below)?
Four-year-olds with guns need to be stopped. You can oppose spanking, for example, while still conceding that there are times when force needs to be used on children, just as on everyone else.
Rape: the option of withdrawing consent in the middle of the act might shift power dynamics somewhat. I suppose both genders could try that stunt, though it would be easier for one.
All bullets will hit their intended target and only their intended target.
(a) how is it decided what acts do and do not qualify? Do your powers extend to the ability to basically set up an infinite number of local AIs? Or do you yourself have to actually come in and monitor each individual situation?
(b) Like it or not, it does take away free will, which some object to on ethical grounds.
(c) It also will certainly be noticed that “hey, all of a sudden everyone who tries to rape anyone gets nauseous”, without any explanation. Which is going to throw society into massive spiritual/religious/ethical/scientific/philosophical chaos.
If I didn’t care about (c), I think I’d try to tackle the sample problem with more of a defensive solution… everyone has a portable injury-protection device that is on at all times and protects them from all types of physical damage, and also acts as a chastity belt, but they can consciously disable that function.
Can I make myself superintelligent, as well, or am I “just” omnipotent?
If it’s the former, that’s what I’d do, and then plan my actions accordingly from that point forward—well, assuming that I don’t become so enraptured by pure thought that I forget to eat and I starve to death. Which, honestly, if I’m capable of mentally absorbing the entire written works of humanity—and possibly the rest of the universe—of all time, is not an insignificant risk. :smack:
Maybe I’d spend my last days writing Aristophanes’ Babylonians/17th century Tau Cetiian Bondage Opera crossover slashfics on Livejournal. In droll, pun-filled Etruscan.
The OP says you have a human mind … kind of like the one you have now, just with omnipotent powers. You WANT to make the world a better place, but without, you know, stomping all over it and oppressing it. Your slashfic project could arguably make the world a better place … but perhaps not a LOT better, and certainly not what you’d expect from someone omnipotent.
Well, first off, the anti-technology and anti-education fanatics of Boko Haram will all get a painful and debilitating disease that is easily curable by modern medicine. Preferably curable only by a drug developed by a female African researcher.