In that case the clone would have your memories, and would look like you. But would it be you?
I think the ethics of the use and abuse of Transporter Technology for the achievement of immortality would merit an entire thread of its own.
Rights are social constructs intended to serve particular human needs.
If humans had a different evolutionary history, and a substantially different neurology, our notion of what constituted a right would be similarly different.
So, no, it’s meaningless to give droids human rights. Their different cognition will result in drives and motivations so alien that the concept simply wouldn’t apply to them.
Honestly, I know it’s sci-fi and you can pretty much write anything you want into it, but I believe they were simply programmed to react AS IF they felt pain.
What evidence? How do you rule out the possibility that they are simply programmed to react as if they felt those emotions rather than really feeling them? People right now can write pretty convincing chat programs that can fool people, and we can already beat the best chess master with a computer. Those programs don’t think; they simply have a bunch of probability codes programmed into them to react a certain way.
Because programming a non-sentient machine to react as if it felt emotions is generally stupid. Unless it’s a protocol droid or holodeck “actor”, all you’ve done is program it to act with suboptimal effectiveness for no gain.
Trek is inconsistent on this. Supposedly the Federation can’t replicate living things, since trying results in a corpse due to “single bit errors”. Except of course when they suddenly can “restore from save” because the plot demands it, like in the TNG episode when they reversed a disease that caused rapid aging.
You’d need more than just the transporter for that, you’d need some sort of continuous backup. Otherwise you’d still lose everything since your last transport, which for some people could be years.
Then what is the point of torturing the droids?
So they will try really hard to prevent themselves from being harmed.
Emotions are very useful hardwired responses. They exist to protect us from behaviors that are physically or socially dangerous. Pain is what protects me from chewing off my own finger. Anger is what protects me from being exploited by ruthless acquaintances.
And Star Wars droids don’t appear to be noticeably smarter than humans, except in narrow areas like data recall and performing calculations. They’ll need the same kind of intellectual shortcuts we use, or an equivalent.
Stupid or sadistic. The Hirogen from Voyager programmed the machines to react as if they felt pain because, to them, it made for a more satisfying hunt. You cannot simply dismiss it on the basis of stupidity. If it’s possible, then someone will do it, stupidity be damned.
Sadism, such as in the Hirogen example.
If it’s so easy to simply program unfeeling machines to act as if they feel pain, then the easiest way to solve the problem is to reprogram them so they can’t want freedom, isn’t it? People act as if machines can’t fool the living, but that is essentially what is happening here. You cannot in any way state for certain that the machines aren’t simply programmed to react as if there is pain, because we do that all the time.
Besides, there are also other good reasons to program a machine to act like it’s in pain. Medical tests that depend on realistic living reactions would require the machine to behave as realistically as possible, including shying away from pain. Or perhaps it would be less creepy on the Uncanny Valley scale if a machine reacts as a living being does, so that the living can relate to it better. Or whatever job the machine is designed for may be helped by a more life-like response.

There are countless episodes of Star Trek where the humans run some diagnostic on the computer to find something when the advanced programming of the 24th-century computer should have been able to instantly pinpoint the problem and fix it. It shouldn’t require the chief engineer to tell the computer to reroute coolant through tube J to prevent a warp core breach. The computer should be able to do that itself.
The way we understand pain cannot be so easily translated to a computer. Poke a human with a sharp pin and you know what’s going on physically: pressure concentrated on a point damages cells once it exceeds the capacity of the flesh to deflect it, causing precious liquids and other fleshy bits to spill out of the hole. Pain sensors get excited, and signals get sent to the brain.
Poke the metal skin of a robot and all you’ve done is press one piece of metal against another. It is contrary to all our notions of damage and feelings to assume that touch would be harmful, let alone painful, to a robot. And even then, you cannot be certain that the expression of pain is anything more than a mask, a behavior the robot performs because of its programming rather than out of actual feeling.
Yeah, I know. I’m handwaving the existence of such tech so we can make a valid comparison. Let us just say that everyone has a neural implant that automatically saves them each night when they sleep. Even if it were available, organic life should enjoy a class of protection above droids due to its nature. Organic life can be harmed both physically and mentally. Droids, by their nature, have all sorts of protections built into their systems that make certain critical abuses, as perceived by organics, impossible.
Sure, why wouldn’t it? I am the sum of my physical and mental experiences. In fact, the clone might be better because it would repair minor damages I might have endured over my lifetime.
You know, you stretch this line far enough, and you might be able to apply it to autistics, too. :eek:
Anyways, backing off from the riling up, and to the subject at hand, I checked on the official SW wiki (“Wookiepedia”) for their Droid article for some more technical info (I’m a Star Trek fan, primarily, so I’m unfamiliar with some SW stuff), and came across a few little tidbits:
That quote is unattributed. However, the fact that it includes “self-aware” and “consciousness” right up in the definition is telling. And a bit alarming. :eek:
Also:
(Bolding mine)
So the highest degrees of droids, if this is true/canon, are capable of “creative, complicated thought,” and if the other definition is true, all possess some form of consciousness and self-awareness. (And it seems like the lower degrees are less “mindless machines” than “well-trained artificial animals.”)
Jeez. Even Janeway wasn’t a slaver.
That may be, but regardless of intelligence droids remain artificial life, free of the motivations inherent to organics. Provided you have a backup of the droid’s pattern, you cannot kill it. You cannot injure it; it can only be disabled or put to inconvenience. If it is made to serve a specific function, then you cannot deprive it of opportunity, for it never had any to start with. Other than access to power and the necessary materials to sustain its existence, a droid needs nothing; you cannot deprive it of property since it has no use for property by its very nature. Droids have no resources to protect, no offspring to care for, and no reason to desire true freedom as defined by organic life. Most of them would be useless outside of their programmed field, or the one defined by their hardware; there is no reason for them to vote. Moreover, every droid is really nothing more than a direct competitor with every other droid for resources. They have no need for a social system or for the benefits we grant organics in one. They should not be allowed to form businesses, because they can perform constantly without rest; they would have an unfair advantage, and have no logical use for the money such a business would generate.
The best I can think of is that droids should have the right to be cared for properly by their owners. They must be provided with the proper fuel, oils, etc., necessary for function. They have the right not to be reconfigured to perceive pain and be tortured. They have the right to shut down when there is no work to perform, to prevent sensory deprivation. You could make the argument that the higher-functioning ones have the right to pursue intellectual stimulation when not performing their specified tasks. I would allow that a truly exceptional droid that can demonstrate “Data”-type thought, behaviour, and action be allowed to purchase itself should it so choose. It would have the status of “Free droid” and be able to pursue whatever course of existence it chooses. It still may not vote, incorporate, etc.; but it would be able to secure housing, own property, conduct limited business, and have the right to travel unescorted.
You are describing a theoretical form of machine life; not Star Wars droids. Star Wars droids think and act like people made out of metal more than anything else.
And the underlined part is no more true of machines than it is of us.
Acid Lamp, you still don’t seem to understand the concept of rights. You do not need to prove a need to be permitted to take advantage of a right, or it would not be a right. Rights are granted by default, and anyone who objects without a damn good reason can go pound sand.
Humans, at least, are social animals. Our “rights” have been determined by the necessities of our biology. Our social order follows suit. We have to nurture our young, and we have a collective interest in the continuation of the species. Because of how we function, we have an instinct for hoarding resources to give our offspring, or ourselves, a better chance in life. A life that has a finite limit. Humans compete, certainly, but very few humans would be just as happy in a world without any others to interact with.
If SW droids act like people, it is because they have been programmed to do so. Their behaviour is not the natural result of their nature; it is what we felt comfortable with. We programmed them to be like us, because that makes us feel comfortable. It does not equate to the same needs, nor their logical extension: rights.
By WHO?
We agree on certain rights, but they are not the same as they have always been. Rights are granted by social collective agreement, nothing more. They are enforced by law, or governmental action. They are not bestowed from an imaginary god, nor are they granted externally as a law of nature or biology. They are a social construct, nothing more.
Their “nature” isn’t to be a Von Neumann machine; their nature is to be a lump of dead metal sitting in the ground. They weren’t tamed or domesticated, they were created in the image of the mind of man. If I took 100-odd pounds of water and chemicals, and distilled them into a human via alchemy, does that mean I’m justified in keeping the homunculus in a chemical tank?
That’s a false comparison. The droids weren’t turned human; they were made to resemble humans. The analogy would be more apt if you were to turn the water and chemicals into something vaguely humanoid, but still distinctively water and chemicals.