Suicide pods now legal in Switzerland

This news has been making its way around a couple forums and chats that I frequent: Suicide pods being developed in Switzerland, providing users with a painless death - National | Globalnews.ca

I’m not sure how I feel about this. On one hand, I support medically-assisted deaths, as I don’t feel people with terminal illnesses and similar conditions should be forced to endure their suffering, and I think if someone is committed to ending their life it is best if it can be done in as clean and painless a way as possible. On the other hand, I feel uneasy about normalizing suicide in general.

The makers of the device had the following to say:

“We want to remove any kind of psychiatric review from the process and allow the individual to control the method themselves,” Nitschke said. “Our aim is to develop an artificial intelligence screening system to establish the person’s mental capacity. Naturally, there is a lot of skepticism, especially on the part of psychiatrists.”

“The benefit for the person who uses it is that they don’t have to get any permission, they don’t need some special doctor to try and get a needle in, and they don’t need to get difficult drugs,” Nitschke said in a Sarco demonstration last year.

Do you think that anyone should be able to access such devices to be able to end their life solely based on their own determination, or do you think people should be forced to consult with medical professionals before their lives can be ended?

All I can say about this is it sounds like one guy hungry for publicity/controversy, and not something that’s actually going to catch on in any meaningful way.

I think a psychiatric review is required.

If someone is committing suicide because they have treatable depression, you should be treating the depression. However, if they have untreatable depression, then I’m fine with it. No reviews could mean a high suicide rate among people with treatable depression.

Using an AI screening system to … what? I doubt any currently existing computer is nearly that advanced. And if the computer is that advanced, it’s effectively doing a psychiatric review of its own.

I saw a headline about this recently, but I haven’t checked the veracity. Assuming for the moment that it’s true, and it’s a painless, reliable way to end life… I’m thrilled to hear it because I may want to use it one day.

I’ve seen a fair amount of suffering at end of life. Among others, my best friend died from a brain tumor that ravaged his body while leaving him fully aware and present mentally. I won’t be going that way, and I fear not having the means to do it when the time comes.

Sure, anyone can buy a firearm. But to paraphrase the Mikado, shooting your own head off is a difficult, and also dangerous, thing to attempt. To say nothing of burdening others with finding you and cleaning up the mess.

Bring on the death pods as a means for people to choose to end their suffering. I’m all for it.

In what way is the AI evaluation not a psychiatric evaluation?

I could never make use of such a device, and I could never deny it to others that deem it necessary.

This. If I had a dog that was dying and suffering, with no chance for recovery, I’d have that dog put down. I don’t know why we don’t love humans like that…we force loved ones to suffer through similar hopeless situations.

I can see that some people choose suicide as a permanent solution to a temporary problem. So there should be a reasonably brief waiting period. But I think that each person has a right to choose to end his/her life.

The fact that it’s not being handled by an accredited psychologist, but by a type of software that tends to have something like a 20% failure rate?

I definitely have a problem with this as described. I don’t see any reason you need autonomous suicide pods for people to choose to end their lives early. The last thing we want to do is make the idea attractive. We just want the ability to be there for those who genuinely need it.

I do believe that there must be a right to end one’s life. But I’m also aware that wanting to do so is a symptom of a disease. That sort of thing needs to be addressed, and I do not think any AI is capable of proper diagnosis, or that people won’t learn to game the AI. We’re not there yet.

I also don’t like the idea of sanitizing suicide. Even if you make it physically not messy, the real mess is with those you leave behind. That needs to be considered, and not have some shiny pod that promises to do it “cleanly” to make it more attractive.

I’m much more a fan of it being a non-flashy procedure, enacted with care.

Not to mention that making suicide easy gives governments a perverse incentive not to care about the wellbeing of their people. They can just start pushing the idea of going to a suicide booth if you can’t stand it. That’s why they are portrayed that way in dystopian science fiction.

I support the right to die. But, based on what I’ve seen, I don’t support this project. And I do hope it doesn’t go very far. Maybe it can just pressure others to provide the non-flashy service to those who need it.

Yes, I should have placed sarcasm tags.

This reminds me of a banned poster (I won’t name them) who was constantly saying society was only valid if it provided suicide booths.

I have nothing else to add at this time.

This is pretty much exactly what I was going to write, though it’s likely more articulate than what I would have said.

I’m waiting until I see sufficient five-star Amazon reviews from satisfied users.

It’s like the suicide booths from Futurama have become real.

It seems like a lot of the reason people don’t want their loved ones to commit suicide is the pain that they themselves feel as a result, regardless of the pain being suffered by the suicidal person. They seem to think the guilt and loss they feel justifies making it harder for their loved one to commit suicide. I would argue that that guilt and loss is coming from themselves and is not the responsibility of the person committing suicide.

The only restriction I would put on it is that the person should be fully aware, intellectually and (if possible) emotionally, of what they are proposing to do (I suspect that’s what the AI is for).

Anyway, now that the thing is legal in Switzerland, the rest of us can wait and see what happens. Will it be a huge tragedy that wouldn’t have happened otherwise? Tune in and find out.

All the mess would be eliminated if the pods doubled as caskets.

Suicide should never be encouraged or enabled, let alone by an AI or for a profit. This is just another step on the slow walk towards state-mandated euthanasia.

They do. See the article.

People who kill themselves don’t seem to care much, if at all, about the pain they cause other people as a result of their actions. Why doesn’t the pain of the survivors count?

^ This.

History tells me that what started as a means to ease suffering in the ’20s and ’30s morphed fairly rapidly into killing off the disabled, to “ease suffering”, rather than seeking means to make the lives of the disabled better, and then evolved into industrial murder on a scale of millions.

In a world where too many can’t get medical care unless they’re rich, and too much of that care that does exist is doled out by heartless corporations that care only for the bottom line, it would be far, far too easy to “encourage”, if not outright coerce, people to kill themselves rather than to actually care for and help people.

That’s leaving aside the inevitable cases where the elderly are pressured to kill themselves so their heirs can inherit, or the inevitable murder case using one of these to get rid of someone “inconvenient” to someone else.

I find the idea of making suicide neat and antiseptic and easy to be utterly abhorrent. It should never be the first choice, or even the second. It is never a good thing, at best it might be the lesser of two evils.

Are Bending Robots also legal?

Does the company sell gift cards?