Excellent question. The answer? While someone might create an evil robot, or more generally, evil artificial intelligence, it should be theoretically possible to create good AI, if that’s your goal, despite your own moral failings. At least that’s my understanding of Yudkowsky’s position, but I haven’t actually read all of it yet. :o You know, I think I may do that sometime soon. Anyway, here it is:
I’m guessing that BZ is talking about robots with at least human intelligence, capable of creative thinking, etc. Obviously “mere automatons” couldn’t carry on human culture. The whole idea behind robots as successors for humans is that the important thing isn’t propagating our genes, it’s having intelligent “offspring”, so we might as well make them as smart, strong, moral, and generally capable as possible. Isn’t that what the book Mind Children is all about? I haven’t read it, so I can’t say for sure.
It all depends on what’s important to you. Is it
- Preserving the human species?
- Preserving human culture?
- Preserving sentient intelligent beings in general?
And most importantly, why?
Humans need not all die off. We could upload our minds into a new substrate. But we might still be vastly inferior to artificial beings. We could increase our intelligence and modify our personalities, but then what exactly would we be preserving? Our memories? Those could all be kept in a central repository somewhere, with everyone having access to the memories of all the uploaded humans. Yet I think that many people would not consider their minds to have survived in that case. But that gets into the whole issue of what constitutes personal identity, which is a whole other topic.
Some people raise the point that to fully preserve human culture we would have to preserve all of the bad things too. Personally, I think that we should get rid of the bad things and keep the good things. That sounds pretty obvious in retrospect, doesn’t it? Of course, since “bad” and “good” are relative terms, it’s possible that every aspect of our culture could eventually be replaced with something far superior. If that happens, I couldn’t be happier. (I feel the same way about the preservation – or non-preservation – of my own mind.)
I’m still trying to figure out whether BZ00000 was serious in that “Dog Owners” thread. If he wasn’t, could someone please explain to me what he really meant? BZ?
Oh, and one more thing, off topic: could someone please tell me what “OP” stands for? Pardon my ignorance.