Do Sentient Computers Have Rights?

To me, one of the criteria for a computer that would (even theoretically) be on par with humans is the capacity for original thought.

The ability to do complex math says nothing about human rights. Both Data and HAL looked at the information available to them and reached genuinely original conclusions, far beyond any patterns they were programmed to look for.

I have not seen any evidence that real computers have this ability, or that they ever will in the future. If anyone has any good links which show otherwise, please post them.

The thing is, if we get into a war between sentient computers and humans, humans could very well lose. Wouldn’t it be better to head off a conflict beforehand?

{6sense: Ha! Utopian!} :wink:

Interesting thoughts everyone :slight_smile:
On the subject of computer “rights”:

What is it that endows me with my “rights”? Hoping that you will all be charitable with me and not quibble about the nature of “rights” (as they seem to be understood within the context of this debate), I propose that it is my body–not my mind–that endows me with them.

I am an independent organism, capable of sustained life requiring no care or input other than my own (if that were required). I can feed and care for myself. My cells regenerate themselves. My body powers itself, heals itself, defends itself, regulates itself, and carries within it half of the genetic material necessary to create more beings such as myself. My body makes me human.

My mind, of course, is what makes me feel human–in that it allows me an awareness of myself. The combination of advanced intellect and self-awareness is the one-two punch that has made humankind the stewards of this planet.

So, naturally, discussions of computer “life” always revolve around the mind. Can we create a computer that truly thinks? Feels? One that makes moral and value judgements that are non-empirical? Those things would be truly remarkable, not to mention frightening, if they ever came to pass.

But would they constitute life?

In my view, No.

Life is a property of the universe. Human life is the product of billions of years of chemistry, requiring no forethought or conception (in the mental sense ;)). We are the “magic” embedded within the very materials that comprise our physical world. Computers, on the other hand, are a product of humanity. We conceived of them. We create them, build them and maintain them.

If humankind were expunged from the Earth, it is reasonable to believe that, millions of years from now, we would once again emerge (or at least something very much like us). We are a natural extension of the process that gave rise to every other kind of life on this planet. No amount of time, however, could provide for the evolution of computers. They require the willful organization of materials into a system that is not, and cannot be, self-replicating or spontaneously generated. They are not natural.

But one day, you say, computers will be capable of regenerating and replicating themselves–capable of “living” without the aid of humans. I, for one, doubt that will ever be possible. But if it does come to pass someday, I propose that it will render this debate moot. Computers like that will no longer be computers–they will be a new life form. And, given the remarkable abilities they would have to possess, they would not need our sanction or approval. They would simply do as they saw fit.

So, no matter how intelligent, how cogent, how self-aware, how remarkable you make a computer, it is still a machine. Its parts degrade and cannot regenerate. It cannot power itself by natural means. It cannot reproduce itself (at least physically). A human-like computer will not be human.

It will be a parasite, and I argue that it is illogical to place the needs of a parasite on par with those of the host.

Led Zeppelin comes to mind, as I “ramble oooooon” :slight_smile:

“If a computer thinks like a human, acts like a human, and reacts like a human, is it human?”

I think you underestimate the inner workings of the human mind. Humanity isn’t just how we perceive an object. Is a drug-induced phantom that acts, looks and reacts like a human a human? I am not religious, but I think a human is more than the sum of its parts, and although we may soon be able to synthesize all of those parts, I think we have a looooonnnng time to think about a question like the OP before putting it into practice.
There, that was my attempt at completely avoiding the question.

aschrott: Interesting! By your definition, I’m not human!

I am not an independent organism. I have become so completely specialized that without a technological society, I could not survive. Throw me on the prairie with a bag of seeds and a hoe and I’ll starve to death. I am a “parasite”, and so is everyone who is not a farmer. Even farmers depend on our technological society; I suspect that few but the Amish could survive without machinery, fertilizers, pesticides and other technological help.

I can’t make value judgements that are non-empirical. All my value judgements rest on the empiricism of my psychology, and on the resemblance of my psychology to that of others with similar “hardware” (brains) and “software” (cultural upbringing).

My body has considerable powers of self-repair, but one day it will indeed degrade and will no longer be capable of regeneration.

I cannot reproduce myself (at least physically). Even without the vasectomy, I would require another person to undertake the task.

Even Keeves’ definition seems flawed. There seem to be a lot of people absolutely incapable of original thought.

I submit that you must either call me non-human or adjust your definition of humanity.


Free the Indianapolis 500!

From Singledad:

I was not so bold as to offer a definition of humanity in my post.

It is a foregone conclusion that our society has changed considerably since the days when every man could wholly provide for himself. I, along with you, would be among the first to go if I could no longer rely on my fellow humans and upon technology for support.

But these are signs of specialization–not of any inherent inability of mankind to care for itself. Surely you do not deny that our species, as a whole, is capable of self-sustenance?

[quote]
I can’t make value judgements that are non-empirical. All my value judgements rest on the empiricism of my psychology, and resemblance of my psychology to that of others with similar “hardware” (brains) and “software” (cultural upbringing).

[/quote]

Semantics. To me, love is non-empirical. It can neither be proven nor disproven to me by observation or experiment. It is something I accept on a level that is not rational. Your statement reduces all things to their basic biochemical nature–all things are empirical because all things arise from material and circumstance. That’s fine. No need to quibble here.

Yes, we all will die someday. But show me a computer that can heal itself physically when damaged. As for things reproductive, what’s your point? I am talking about humanity–not any one person.
Now as for my use of the word parasite. Let me clarify.

Of course all things depend on other things for life. Of course any life is part of a complex system requiring input from many sources. No man is an island, so to speak.

My point is that all life is part of a self-sustaining cycle. When you and I die, our bodies will return water, minerals, and nutrients to the earth. Those things will feed bacteria and insects, which will in turn feed other creatures and enrich the soil for plant life. Plants will then supply oxygen, etc, etc…

We are part of that cycle. We contribute as much as we take out in terms of the basic elements that make up our persons. No computer can be part of that cycle. Computers are inorganic. They do not participate in the cycle. They consume resources while not contributing any in return. I propose that a system like that is not sustainable, and therefore not logical.

I share the enthusiasm of many posters here for the idea of artificial intelligence, or even sentient machines. But intelligence is not life. Minds are not life. Thoughts and feelings are not life.

We are life, trees are life, grass, birds, whatever–not computers.

Mozart’s thoughts, feelings, and genius live on in his music, but Mozart himself is dead. The same can be said of any of our greatest thinkers, artists, historians and leaders.

I, for one, am not prepared to accept the disembodied mind as something equal to my own life, or to that of anyone else.


Ignorant since 1972

Whoops. The second quote in the above post actually contains two separate quotes and one response. The paragraph beginning with the word “semantics” is mine, not Singledad’s. Sorry for the goof.


Ignorant since 1972

I think aschrott has hit on something here. Everybody seems to be assuming that sentience, or whatever it is that makes you a person and qualifies you for rights, is centered in the mind.

I don’t believe I agree that the body is the source of these rights, but it seems clear to me that it’s more than just the brain. Religion, I think, teaches that humans consist of body, mind, soul, and spirit. (Or are soul and spirit the same thing?)

At any rate, that which makes us human, and therefore which qualifies us for rights, is more than mere sentience. Maybe it’s just body, like aschrott says, or [cue spooky X-Files soundtrack] maybe we really do have some metaphysical dimension that machines can never replicate. [/spooky X-Files soundtrack]


Sincerely,
Hardwood Paneling

If the essence of humanity is in the **body**, do I lose part of my humanity when I lose a finger? An arm? Both arms and both legs? My gonads? If my heart is replaced by a mechanical pump? My kidneys by a dialysis machine? If I receive a transplanted heart, am I partly that other person?

Pshaw! Our brain and mind form the essential definition of Homo sapiens, “wise man”, not our body.

When I lose part of my brain, I do lose part of my self. If I lose my whole brain, even if the rest of my body is working perfectly, I am dead; my body is just a machine with no purpose other than to provide spare parts for others.

However much of my body I lose, I will still cling tenaciously to life, and make as much of my experience as I can. If I lose my brain, there is no “I” to cling to life, just a blind machine; I have already given permission, should I become brain-dead, for the doctors to stop the sham of life and use what’s left of me for the benefit of others.


Free the Indianapolis 500!

Absolutely. I appreciate your point.
But let me pose the following hypothetical:

Computers have advanced to the point of sentience.

You have a sentient computer in your “family” (it is essentially like a son or daughter, say).

You yourself have been badly injured and require life support to keep your body alive, but your mind is as sharp and aware as ever.

Something goes awry with the power supply, and someone must choose: either sacrifice the support for your failing (worthless, if I read your post correctly) body, or unplug the computer and “kill” it.

What is the correct choice? Since you and the computer are both only minds at this point, are you of equal value? Would you respect your family’s choice to save it instead of you?

I realize that my hypothetical is as contrived as they come, but I’m trying to get at your point. Do you truly believe that the biological component of life–the very source of the mind–is so worthless?


Ignorant since 1972

If computers become sentient, will we be their “God”?

The Creator?

aschrott wrote:

Boy, it’s just that kind of talk that’s going to get you thrown into one of the “camps” when the machines take over!

(Don’t you listen to him, 'puter. You’re no parasite! You and I are buddies, right?)

:wink:

Spoke–your edited quoting of my post is so much more coherent than the real thing. Keep it up, man!

:slight_smile:


Ignorant since 1972

Just my two cents, but no computer should be granted “rights” by a society without also having “responsibilities” to that society. This is easier to see with regard to “animal rights”: to my mind, since animals have no responsibilities toward us, they therefore have no rights.

Not to disagree or agree with Patrick, but that logic seems to be in place today.

Police dogs and seeing-eye dogs are two examples where animals have “increased rights” over other animals.

But…

There is some protection of all animals in America against cruelty and torture.

aschrott:

You’ve proven that computers are not organic life. I maintain, however, that such a proof is trivial. If you “define” life as organic, the OP goes away.

But what if I create a computer out of organic, degradable components? Is it then “life”? The whole point of my post is that the details of the body are irrelevant, only the brain and mind defines me as being human.

Let me now take my “disappearing body” analogy another step, and demolish the brain.

Suppose I suffer a small bit of brain damage or neurological degeneration. Suppose a small piece of silicon exactly duplicates the function of the lost neurons. I still feel like me. My friends detect absolutely no difference.

Again, I lose another small piece of neurons, and again they are replaced by silicon. And again, I feel no different, and my friends again detect no difference.

We repeat the procedure, bit by bit, until my brain has been completely replaced by silicon. At what point do I stop being human and become a machine with no “humanity”?


I sucked up to Wally and all I got was this lousy sig line!

This is a really fascinating topic, and I thought I’d point out that “Society of the Mind” by Eric L. Harry addresses many of the issues being discussed here. It’s a wonderful novel which I’ve read and re-read several times over the years. I highly recommend it!


SingleDad, I appreciate the point for which you are attempting to argue, but your argument commits a logical fallacy. How many hairs must one remove from one’s head to be considered bald? How much money must one have to be considered rich? You’re utilizing a sorites-style argument (the paradox of the heap) for a subjective quality. At the moment (I’m just taking a break from writing a paper), I don’t have any grand suggestions pertaining to a valid argument, but I don’t think your argument is admissible. No offense is intended.

Okay, I’m jumping into deep water here, don’t know how to swim, and there’s not a Jet-Ski in sight. But:

SingleDad, you say that to define life as organic would exclude computer, mechanical, and electronic systems from ‘life’. But, and correct me if I’m wrong, and I know you will, what is the definition of organic? Is the root, as it seems, to possess and to be constructed of organs? If so, then would a computer not be organic, also? What is an organ, but a part of the whole, specialized, and reserved to serve a special purpose of which the end result is the sustainment of the ‘organism’? I can then describe a computer as an organic being. I’m sure you follow.

–Tim


You can’t accidentally create a handicapped baby whilst smoking pot. - Coldfire

Homer, I think that SingleDad is referring to the term organic with regard to carbon based compounds rather than describing something that is “of the organ.” SingleDad, am I correct in my assumption?