NYTimes Essay: To Siri, with Love - Autistic boy bonds with Siri

Behind a paywall: How One Boy With Autism Became BFF With Apple’s Siri - The New York Times

I am torn. On one hand, it is good that this autistic boy is engaging with something in his world.

On the other hand, I can extrapolate from this to a future where people who “need” these AI “sidekicks” to engage socially are granted the social and legal right to have them. So high-functioning autistic folks will be much more engaged than they would have been otherwise, but will expect those around them to also engage with their AI sidekicks. Which, in turn, could be a slippery slope regarding Human/AI relationships and personhood.

???

LOL. Yes, life does imitate art. And, what happens when, in turn, the AI sidekicks engage with other AI sidekicks, ignoring their owners? What happens then?

Yep - the weird scenarios can be funny to consider.

But - seriously.

What happens when some kids are integrated into schools/society where they function much better than they would have without a Sidekick, but they fully expect everyone around them to engage the Sidekick as an equal participant in the conversation? This is like a seeing-eye dog or emotional support animal, but with much broader implications as the level of sophistication grows - isn’t it?

I get that the movie Her was a fictional telling of this type of story - but this essay in the NYTimes feels like a much more…credible…illustration of how AI Sidekicks could end up being integrated into our society. The mom/writer loves that her son has Siri, hence the essay’s title.

I am a bit freaked out at the implications a few decades out from here. I am NOT saying AI Sidekicks are a bad thing or we should only fear them - I am saying that a society that accepts AI Sidekicks would be VERY different from today’s society in good and bad ways - it is that degree of difference that freaks me out.

If you wonder about such things then you may enjoy Spike Jonze’s “Her”.

Yep - I mention it in Post #3. This NYTimes essay seems more, for want of a better term, credible. AI Sidekicks as relationship partners seem to cross a big bright line. AI Sidekicks as emotional support tools that enable autistic folks to engage more fully in society do not - as we see in the essay, the mother welcomes it as a savior.

In the essay, the mother mentions that talking to Siri forces her son to enunciate more clearly. So that’s a good thing. And Siri seems to encourage her son to be polite (at least to Siri).

What freaks you out about it? Sure, it’s different, but so far most change has been for the better. What specifically about this do you feel is threatening?

I am imagining a scenario where many, many more kids are diagnosed as autistic - we’ve seen the rate increase as we have gotten better at understanding it. AI Sidekicks will be used as part of their therapy. They will be able to engage society, and society will be “trained” to accept the participation of AI Sidekicks, as we interact with the person, who is also interacting with their Sidekick, who keeps them mindful of manners, reminds them of things they should say and do in society, etc.

This will extend to other mental and social disorders - AI Sidekicks might help folks with schizophrenia, social anxiety, and perhaps even severe depression.

All of this has the potential to be very good.

At the same time, we will see benefits to using Sidekicks that extend to everyone. Parents will push for their kids to have Sidekicks allowed in school even if they aren’t showing signs of a significant disorder. Social and legal situations will arise where AI Sidekicks are being introduced that will require an evolution in how we as a society behave, informally and also formally/legally.

Again - may be good, may be bad - but mostly, it would be hugely different from what we experience today. That is the part that freaks me out - that level of difference, for better AND worse…

When I was in college, there was a girl in my art class named Siiri (with double I). I had the hots for her just because of her cool name. In those days, girls with unorthodox names were very rare. I’m also autistic, by the way.

Seems to me that the lesson here is that for whatever reason, some autistic people are more comfortable dealing with something like Siri, and can learn from it, so why not develop something a little more customized?

I mean, basically if this kid likes talking to Siri, why not use that as a prybar to teach him other concepts and behaviors? Seems like it would be a lot more effective, if done properly, than sitting him in a more conventional classroom setting.

Yes - but what happens:

  • if some parents want to use these Siri Sidekicks for their kids who have NO diagnosable issues?
  • if kids - those with autism and those without - feel so linked with their Sidekick that they insist they must be included in all of their interactions?
  • if people who insist on including their Sidekicks in all of their interactions grow up and still insist on using them - and have no diagnosable issues?

And here I was expecting to read a heartwarming tale of how an autistic boy and the adult film actress who goes by the name Siri had developed a very special friendship. Now THAT would have been a fascinating read!

I guess most folks were expecting something else from this thread. I am kinda surprised the topic isn’t leading to more discussion - either what’s happening today (i.e., an autistic boy bonding with an AI program) or what could happen in the future (people having AI sidekicks with them in society, and the potential for it to become widely acceptable).

It just seems interesting to me, and, in terms of what could happen in the future, well, this scenario seems like a subtle way that AIs could be introduced into society over the next couple of decades.

So…Her meets Lars and the Real Girl? :wink:

I haven’t seen it but IIRC it has to do with a dude ending up with a blow-up doll which he expects others to interact with like it is his GF.

See? That is my point - there are movies, books, etc. that take extreme examples and use those to tell a story about the characters. We can shake our heads at the set-up and decide if we like the story and the points it is making.

In this case, it isn’t extreme - it is subtle and genuinely beneficial as far as the mom/writer is concerned. It seems to point the way to how this type of thing can really happen.

I guess I will stop now. It just seems interesting.

This was explored in Dani & Eytan Kollin’s Unincorporated series.

In that society, public interaction with AI “sidekicks” is largely taboo after the onset of adolescence. The society basically uses public shaming to keep reliance on AIs down to a minimum.

Personally, I don’t see how that would work IRL. AI partners are going to become the norm, unless we figure out how to tap right into the 'net ourselves.

From an observer’s perspective, I do not see how it would be much different than things are today. People are already strongly tethered to their cell phones and take them everywhere they go. They are always talking to people who I cannot see, listening to music that I cannot hear, playing games that I cannot join. I may be physically right in front of them, but they choose to interact with the little screen instead. It is not the future, it is the now.

How was that pronounced?

I see that. I guess that is my point: a version of human/sidekick interaction is already happening. What is a “realistic” way this might evolve? Reading that essay suggested, to me, one of the more plausible paths. If it is functioning as your calendar, phone, shopper, social channel - and potentially as an Emotional Support Personality (ESP :wink:) - and AI continues to evolve…?

I dunno. It just seemed interesting.

I couldn’t help but be reminded of the Onion’s Dog Befriends Roomba joke.

I think it’s fantastic that this boy has found a way to practice his social skills and has found someone who has endless patience for his extraordinary focus in a particular field.

I think you are right to realize that AI will change human society greatly. But I also think that many of the changes will be for the better, particularly in the area of care for people who have special needs (which, as we live longer, more and more of us become). The mom in this story loves that Siri is there to listen to her son talk about weather patterns for hours on end because most humans simply don’t have the patience to do that without becoming bored or annoyed. And you definitely don’t want to be bored or annoyed with your kids. Because the AI provides quantity of attention, the mom can focus on quality of attention without being ground down by as many constant demands.