An episode of Star Trek: Voyager

The English teacher in me is crawling the walls over this use of amoral.

Amoral means neutral or without sense of right and wrong, not really, really wrong. The experiments in question, whether done by Cardassians or Nazis, were not morally neutral, but in our sense of morality, evil.
If immoral isn’t strong enough for you, add a modifier; don’t change the meaning of a word. And I mean that literally.

Using the enemy’s evil experiments in order to survive, if for nothing else than to thumb your nose at them, makes sense to me. Why not redeem the evil and find a way to use it for good? Just make sure the punishment for such evil is severe enough to be a deterrent to anyone else who would think to try it.

Why not dematerialize both of them and reassemble them separately? They couldn’t filter out the alien? :confused:

The prefix a- can mean without, and the more common meaning of amoral is showing no concern for morality, not the technical philosophical sense of being neutral to morality. We’re concerned here about research that’s conducted without concern for any collateral harm that’s done, without following ethical guidelines; not just cases where there is a primary objective to inflict torture. So amoral is an apt descriptor, similar to unethical.

To suggest that such use of the word amoral in any way implies that (say) Nazi research was not also utterly evil is ridiculous. You are drawing a distinction that you would like to exist, not one that actually exists in the way that the words amoral and immoral are commonly used outside of technical philosophy.

But the Nazi medical “research” wasn’t amoral. It wasn’t that they really wanted to learn about drowning and hypothermia and whatever and just didn’t care about the Jews. Rather, they cared about the Jews very much, and deliberately performed their “experiments” for purposes of torturing Jews. That’s why they kept such poor scientific records, because science was never the point.

Wernher von Braun is a better example of Nazi amorality. He only cared about rocketry, and was just as happy doing rocketry for the Americans after the war as he was for the Nazis during the war. He never wanted London to suffer; he just considered that an acceptable consequence, if that’s what it took to get rockets developed and launched.

I haven’t seen the episode of Voyager in question, so I can’t say which category applies to the Cardassians.

I don’t agree.

Firstly, we were discussing more than just Nazis. The philosophical question of whether research data should be used encompasses more than just situations where torture was the objective; it includes situations where the primary objective was to obtain scientific data, but without any regard for incidental harm.

But as for the word amoral in colloquial use, I simply think CelticKnot’s (and apparently your) definitions of amoral vs immoral are empirically wrong. If I describe someone as amoral, that’s not value-neutral. Outside of a technical philosophical sense, it does not mean that you are operating on some plane independent of morals; it means that you don’t care about moral aspects, i.e. you are a bad person. If you like, evil is a more extreme subset of amoral, but amoral is not remotely neutral.

Let’s put it this way. How much money do you guys want to bet on the outcome of the following poll:

“Were the Nazi experiments on human subjects amoral?”

  • Yes
  • No

“Amoral” is a value judgement when applied to humans, because humans are expected to be moral. But it’s not a value judgement inherently, because there are plenty of things (almost everything other than humans) which are not expected to be moral. Computer programs, hammers, trees, and most animals are all amoral, and there’s nothing wrong with that.

And here I was obviously talking about humans (or other sentient beings such as Cardassians) doing bad things (whether incidentally or as a primary objective) in the course of their experiments.

That’s the whole point: I agree that there’s a technical philosophical meaning of amoral that can come up when you’re talking about the actions of non-human animals or perhaps non-sentient computers. But you can’t impose that technical philosophical meaning of “neutral to morals” on colloquial usage when we’re talking about humans. When you’re talking about humans, the word amoral usually means without concern for morals, or lacking in morals, i.e. a bad person.

I strongly disagreed with deleting the research from the database. It would be terrible to condemn people to death because the research that can save them is tainted by tragedy.

I understand that the simulation of Crell Moset, the Cardassian doctor, needed to be deleted.

But the database of information could be referenced as needed. You don’t need Crell Moset to understand the findings of his research. Voyager’s doctor is capable of understanding and applying that information.

On the issue of not accepting the data so as not to encourage it: that also punishes the innocent people whose lives could be saved by using that data. So I would argue that, if there is any other way to discourage the creation of such data, it should be used before deleting the data. You should not punish the innocent people unless that’s the only possible way to deal with it.

As for amoral vs. immoral: An immoral person is someone who has morals but chooses to violate them, or who has warped morals. An amoral person is someone who acts without any morals.

As for the reason for the hologram: it was a separate database about exobiology, and the Doctor apparently has limits. His matrix can only hold so much data at once. So the idea was to create a new hologram that could just be focused on that one database, and only process that.

From the episode:

EMH: If I’m to have any hope of devising a treatment I’ll need to brush up on my exobiology.
PARIS: What do you mean, brush up? Don’t you have all this information in your database?
EMH: I may be a walking medical encyclopaedia, but even I don’t know everything. My matrix simply isn’t large enough.
JANEWAY: Maybe we can do better than giving you a crash course in exobiology. Maybe we can provide you with a consultant.
EMH: I’d be delighted, but how?
JANEWAY: By isolating the computer’s exobiology data files and merging them into an interactive matrix.
PARIS: A hologram.
JANEWAY: Exactly.
EMH: That may not be as simple as it sounds. It would need to be nearly as sophisticated as I am. Tactile interfaces, personality subroutines.
PARIS: Harry could do it.
JANEWAY: Search the database for the leading exobiologist. If you want to add a personality, it may as well be based on a real person. I’ll have Harry meet you in holodeck two.