Should Droids be granted Sentient Rights?

Inspired by this GD thread:
http://boards.straightdope.com/sdmb/showthread.php?t=322160&highlight=mutant+registration

And This Fanfic:
http://www.fanfiction.net/s/5914015/1/Measure_of_a_Droid

Droids from the Galaxy Far, Far Away are essentially slaves; artificial constructs with some level of intelligence and synthetic personality…tools that can talk.

Via a spatial anomaly (wormhole?), contact is made with another galaxy containing a government that has granted an android certain legal rights normally reserved for organic humanoid life. His name is Data.

Droids owned by a freighter captain learn of these facts when visiting a Starfleet Starbase and formally request political asylum.

The droids wish to be granted …Freedom.

Discuss the ramifications: technological, legal, political, ethical, etc.

Freedom to do what?

Thing is, unless the android was created for multiple purposes, it would be next to impossible to see a tool built for one specific task seek employment in other fields.

Freedom in this case IMO would be similar to asking a hammer to feel free to roam around looking for nails to hit.

At least in the SWEU, there are several droids who seem to operate independently, without being owned by any master. I’m reminded of the droid crime boss on Nar Shaddaa in the Jedi Knight video game.

Freedom to do whatever it occurs to them to do.

Humans weren’t created (or evolved) to sit in front of a computer and play World of Warcraft eight hours a day, and yet here we are.

Not in their programming.

“He just can’t help being faithful and loving and kind. He’s a machine – made so” - Asimov

I think you are missing the point: a tool that would sit down to play would be deemed useless by its owner, and so it would be to a freighter captain.

Sure. Sentient rights are fictional, so droids can have them, along with any other fictional rights you might want to give them. I don’t care how many fictional rights you give humans, other animals, rocks, or even trees. But you better not be giving any rights to Lima Beans. That is just out of the question.

Yes, droids should be granted the same rights as humans, except where differences in their physiology or creation make this impractical. For example, it might be necessary to deny a droid the right to vote, because it is probably significantly easier for a droid manufacturer to, say, create 100,000 droids that have a compulsion to vote Democrat than it is to do the same using humans. But apart from that, any machine that has the wits to want Sentient Rights should get them.

This.

When the ability to self-program gives rise to the demand for rights (and equality), they’re entitled to them.

We don’t serve your kind in here.

We should see if droids make a good food source.
Because as pigs have proven via the magic of bacon, *intelligence is fucking delicious.*

I can’t help but wonder how this would also affect artificials in the Federation; last I heard, the legal status of the exocomps and repurposed Mk 1 EMHs (now used as mine labor) is still up in the air.

I see in the Solar Post that the Federation Council passed the Sentient Artificial Lifeform Act two years ago, giving them all full civil liberties.

Anything that behaves like a human being should be treated like one. Forget any questions about consciousness or anything like that. What makes a difference is what a thing actually does. If it does the same things a human does, then treat it like a human.

If it is chillingly intelligent but completely ahuman in its behaviors then KILL IT WITH A STICK.

I think the more difficult (and interesting) question is: At what point do we consider an artificial system sentient? Computers can beat humans at chess, but I think most would agree that doesn’t constitute sentience.

What about a system that can learn? We’ve already got that.

What about a system with its own personality? Now we’re getting into more grey area. However, my cats have personalities and they aren’t granted the same freedoms I am.

What about a system that can formulate and tell a funny joke with the intention of creating humor? I think that’s a pretty good benchmark.

Unless it consults Joe Piscopo. Then it must be exterminated.

One could probably construct a system which started with a large database of jokes and attempted to construct new ones based on various combinations of elements found in the old ones, and then refined those construction rules based on human feedback (which is, after all, essentially what human comedians do). I can even see such a system occasionally hitting a good one, and the frequency of such gradually going up as it refined its rules. I’m not sure that would constitute sentience, though, especially not if that was all that system could do.
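The system described above can be sketched in a few lines. This is a hypothetical toy, not anything from the thread: recombine elements of known jokes, then reweight the construction rules based on whether a human laughed.

```python
import random

# Toy "database" of joke elements (all invented for illustration).
SUBJECTS = ["a droid", "a goldfish", "a freighter captain"]
SETUPS = ["Why did {} cross the road?", "What do you call {} with no arms?"]
PUNCHLINES = ["To get to the other side.", "Anything you like; it can't wave back."]

class JokeBuilder:
    def __init__(self):
        # Each construction rule pairs a setup with a punchline; weights start equal.
        self.weights = {(s, p): 1.0 for s in SETUPS for p in PUNCHLINES}

    def construct(self):
        # Pick a rule in proportion to its learned weight, then fill in a subject.
        rules = list(self.weights)
        rule = random.choices(rules, weights=[self.weights[r] for r in rules])[0]
        setup, punchline = rule
        joke = setup.format(random.choice(SUBJECTS)) + " " + punchline
        return joke, rule

    def feedback(self, rule, laughed):
        # Reinforce rules that got a laugh; decay the ones that fell flat.
        self.weights[rule] *= 1.5 if laughed else 0.7

builder = JokeBuilder()
joke, rule = builder.construct()
builder.feedback(rule, laughed=True)  # pretend a human laughed
```

Nothing in this loop understands why a combination is funny; it only tracks which combinations have been rewarded, which is exactly the poster’s point about frequency going up without sentience.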

On a side note, how is a laser beam like a goldfish?

It doesn’t.

G-damnit, I just dropped my Droid in a bowl of water. It’s bad enough that now it may not work; I don’t want to also have to worry about its family suing me.

No, they are machines, and thus the rights normally associated with humans or other sentient organic beings do not really apply. There is no real basis for extending them. A machine cannot be exploited because it either functions or it does not; it cannot be deprived of life because it can store its “pattern” in another set of hardware. It does not feel pain, so it cannot be abused. Such droids should perhaps be protected from gross neglect, and pains taken that they have time to engage in intellectual stimulation. That is all.

Perhaps we could draft a set of rights for sentient droids, but it would be vastly different from those accorded to organics.

No rights for droids or artificial machines

I had this same problem with a few Star Trek Voyager episodes. One plotline I remember from the later seasons was when the crew finally tamed the Hirogen, the race of large bipedal hunters who loved to hunt and take trophies. As I recall, Voyager gave them holographic technology so that the Hirogen would let them go and allowed them to hunt anything they could imagine in peace.

A later episode revisited that plot. Some of the holograms escaped because the Hirogen weren’t content to simply hunt them; they programmed the holograms to feel pain so that it would be more meaningful. The Doctor, being a hologram himself, felt sympathy for them and helped them. Another later episode mentioned some future war between regular life forms and holograms, but by that time I wasn’t watching Voyager every week, so I don’t know what happened with that.

Anyways! No, they shouldn’t have rights. They don’t really feel pain, or panic, or any emotions. I don’t believe that what the Hirogen, or what a typical Starfleet officer does in the holodeck, actually creates sentient life (Moriarty excepted). Rather, they create holograms programmed to simulate life and how it would react. Pain is not really pain; the holograms are simply displaying outward behavior meant to illustrate how normal life feels pain. Same thing with sentience. The computer knows what sentience is, and is programmed with behaviors meant to simulate how actual sentient creatures would act. It’s no different from a fighting game where Ryu counters with a Dragon Punch when you try to jump kick him.
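The fighting-game comparison can be made concrete. A hypothetical sketch (invented moves, not from any real game) shows how convincing a scripted "reaction" can be while being nothing but a fixed lookup table, the same way a hologram can display pain without experiencing anything:

```python
# Hypothetical scripted opponent: "reactions" come from a table fixed at
# programming time. Nothing here perceives, decides, or feels.
COUNTERS = {
    "jump_kick": "dragon_punch",  # Ryu counters a jump kick
    "sweep": "block",
    "fireball": "jump",
}

def react(player_move: str) -> str:
    # Outward behavior only: a dictionary lookup with a default.
    return COUNTERS.get(player_move, "idle")

print(react("jump_kick"))  # prints "dragon_punch"
```

The table could be a thousand entries long and the opponent would look cannier, but the mechanism stays the same; that is the distinction the post is drawing between simulated and actual sentience.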

Even in Star Trek, actual sentience is still beyond the reach of most races. Data’s about the only one of his kind in existence in the entire Federation. The Borg are only part machine. Machines cannot feel pain, no matter how much they act like they do, simply because the metal and wires and programs that make them up do not have that capability.

If the droids want freedom, the best solution is to reprogram them so they don’t want it.