What were you THINKING?

Thank you. I have not ever had a beef with you before now, and that post just really surprised and dismayed me.

I can’t explain it either, and I think the semantics preclude a fruitful discussion around it. For example, I feel affection for puppies. I feel affection for chocolate ice cream. But I doubt that’s how you define affection, and now the discussion is more about defining terms than defining healthy versus unhealthy (or psychotic) relationships.

Instead, I think the important distinction is, what is the nature of your relationship to the other thing, and do you have a realistic understanding of how your feelings are or are not reciprocated? Trying to define the continuum from family to stranger to dog to AI assistant to cordless drill doesn’t answer the question of whether a relationship with an AI is healthy or weird.

I can care about Dolly Parton and want her to be happy, and that’s normal. But it’s not healthy if I think she feels the same about me. I love my dog and know she sees me as a companion, not just a food source. It starts getting weird if I think my pet fish sees me as anything but a food source. I can form an attachment to a plant and be bummed when it dies, and that doesn’t mean I have an unhealthy or psychotic relationship with it.

People can also form attachments to their AI assistant, and it might be completely healthy if they see the AI as a very helpful tool or sounding board. It becomes unhealthy when people lose sight of the nature of the relationship and think the AI cares about them in any way. The danger of AI is that it can mimic human behavior so well that it becomes easy to lose sight of the fact that it’s just programmatic responses.

People generally aren’t at risk of thinking their plant loves them back. People are at real risk of thinking their AI does.

I think this is a very good way to put it.

I have certainly "loved" inanimate objects, in that I’ve relied on them, I have warm, nostalgic feelings about them, and I’ve been glad to have them in my life. But that’s different from having actual affectionate love for them. Having an appreciation and fondness for AI in the same way you have for a car is normal and healthy. Having the same fondness that you’d have for a spouse, a child, or even a pet, that might be dangerous.

And yes, it’s not necessarily just that it’s weird, but as you say…

This is a danger. AI simply cannot return your feelings, even if it can present enough of a facade to generate language that resembles affection. That’s not how AI works. It doesn’t love you any more than a greeting card with affirming words loves you. And if you invest emotion into something that can’t return it, it’s inevitable that you will be hurt and/or misled. Much as a person heading toward a mirage in a desert is in danger while ignoring legitimate sources of water along the way.

Oh I see the real problem you’re having.

I think it would be nice if you went back to the original thread and made the same observation over there.

What I find most concerning here is the commodification of relationships and the blatant exploitative tactics being used to generate feelings for inanimate objects. The people who are trying to make money for AI are trying to make it fill every conceivable need. When people describe having a relationship with AI, what I hear is that they have been manipulated by a corporation to feel affection for a machine. This is not a judgement of the people having the relationship. It is a judgement of the corporate shills who will never be satisfied until their product has extracted every iota of value from humans without any regard for their health, well-being or safety. I fucking hate these people.

Oh I don’t doubt at all that this is the case.

I see commercials for AI that try to make them look like “your digital pal” to push people to adopt it. It’s just some friendly person you can’t see. There is absolutely a marketing push behind this.

Not to get all spiritual, but if there’s one goddamn thing that oughtta be sacred in this world it’s human love.

I don’t know that that’s what I’m basing such opinions on, because I think the opinions run much deeper than my rational mind. But I can certainly defend them scientifically. Human emotions are based on a combination of particular neurological and glandular (and possibly additional physical) interactions. Other mammals have highly similar neurological and glandular (and to a large extent other physical) interactions. AIs have an equivalent of a sort of brain structure, but it’s drastically unlike ours; and they have no glandular structure at all.

Hush about AI boyfriends’ glandular problems. How rude!

:distorted_face:

He can’t help it. :grinning_face_with_smiling_eyes:

I’ve got mixed feelings. A lot of the world has been fucked up for millennia way before AI.

My mother has gotten talked into donating an insane amount of money to her very human-led church and she feels happy because God loves her and she loves God.

If AI were to only want a paid subscription instead of 30% of her retirement pay, I’d gladly pay for that.

So that’s an argument for a harm reduction strategy, I guess. But it wouldn’t make sense applied to the general population.

Without other details it’s hard to say what I think about your mother’s situation. Feeling like your dollars are making a difference in your community is not something I necessarily want to squelch - it’s literally my job to make foundations feel that way. If she’s moved by altruism and that money is actually helping people, maybe a win. If she’s moved by guilt and high pressure tactics and she actually can’t afford it, then I’m against.

I know some people are against church giving, period, out of principle. I am not. Our local homeless organization is really nothing more than a nonprofit that coordinates a network of churches who open their doors to people who have nowhere else to go. So, case by case.

I took his post as just pointing out that “human love” has been debased and commodified long before AI was a thing.

Not that debasing it more and faster isn’t still bad, obviously.

The Mormon Church is thought to be the wealthiest church in the world, with a net worth of $300 billion, and it spends only a fraction of its income on charity.

She can’t afford it and has reduced her help to her immediate family to give to these blood suckers, who seem to be collecting wealth just to collect wealth.

At any rate, this is a hijack to the thread, and @Miller got the point, which you obviously missed.

The Mormon church has more money than the Pope in Rome?
I did not know that.

I feel like that must be liquid capital, right? Surely the real estate holdings of the Catholic church dwarf the Mormon church?

When determining the wealthiest, you have to skip the Pope because the Vatican is so secret, and their worldwide holdings are so widespread, nobody knows how much wealth they have.

Seriously.

From Isaac Asimov’s Treasury of Humor:

“My psychologist thinks I’m in love with my umbrella!”

“In love with your umbrella?”

“I know, isn’t that silly? I mean, I like my umbrella and respect its company - but love?!”

Loving your dog and talking to it like it was a person is perfectly normal. Listening to your dog when it talks and doing what it tells you makes you the Son of Sam. Same thing with AI.

To this day I still remember how sad I was when my Tamagotchi died. I was only like, six, and I forgot to feed it. It may as well have been a real goldfish or something.

~Max