I actually posted this on the discussion page of a science website, but it bears repeating.
Some day, computers are going to become more intelligent than people. It is just a matter of time, a 'when', not an 'if'. I think there is even a name for that moment: the 'singularity', if I am not mistaken.
Most people seem to welcome this prospect, as do I. Computers could take over many of the mundane things humans do now, and they don't even need a break.
But one question lingers for me: isn't there an inherent danger here as well?
As I said on the discussion page: all intelligence, but no emotion. That is practically the definition of a sociopath, isn't it? Unless there is an objective, non-emotional basis for human morality that I am not aware of.
I don't know for sure what, if anything, computers would do if they developed independent, superhuman intelligence. They would still have been programmed and built by humans. But you never know.
What do the rest of you think? :)