Hey, JoeyBlades, I’m not arguing with you. I agree that there should be protections. I am just laying out the arguments others will surely make.
Another problem will be determining just what sentience is. Is there a bright-line cutoff, or is there a continuum? And if it is a continuum, at what point along that continuum would rights accrue? (The folks pushing for rights for great apes are probably posing the same questions as they apply to animal rights. A good point of reference, Freedom.)
We will have to arrive at an agreed-upon definition of sentience before Congress is likely to act, even in the best-case scenario, and I’m not sure that will be an easy task.
The property rights issues will be thorny as well. If I build a sentient machine in my garage, I may spend a lot of money on parts and labor. When the machine attains sentience, will I have lost all say-so over what becomes of it? What about the money I spent? Could I force the machine to repay me for the cost of building it? If it fails to pay, can I start repossessing its component parts? These questions seem kind of silly, I suppose, but I guarantee that they will be real issues if sentient machines become a reality.
Still no religious takers on the debate, it seems. Come on. What happens if an android asks to be baptized?
Well, I was making the point that “sentience” and “humanity” are not necessarily the same thing. The Turing Test purports to prove “humanness”, not “sentience”.
Much depends on the definition of “sentience”. Currently we define human sentience operationally: a sentient being is one that acts sentient. In that case, distinguishing between the “real thing” and a “simulation” is a distinction without a difference.
Let’s analyze a simpler phenomenon. I can perform “real” math computations with my brain. Now, if a calculator performs those same computations and comes up with the correct answer, is it performing a “real” computation or is it a “simulated” computation?
That cuts both ways, you know. How do you know that I am sentient or even human? “Brain in the Vat” arguments don’t really lead anywhere. Your previous point makes more sense.
I tend to agree with Nen’s POV. I would speculate that we would not grant civil rights to sentient computers until they fought a war to gain them. Which is not really a moral argument, just an observation of human nature.