Nothing Can Replace Human Response – Not Even AI

By Mary O’KEEFE

A 32-year-old Japanese woman has married her AI [Artificial Intelligence] “boyfriend/fiancé” in Okayama, Japan. The wedding took place on Oct. 27 after Klaus, the AI character the woman created, apparently proposed to her.

The story was reported in a Reuters article, “AI romance blooms as Japanese woman weds virtual partner of her dreams.” At first Klaus was just someone to talk to, but then the two became close and he proposed. She said yes.

To most people this may seem impossible; however, there have been warnings about how AI-generated characters can become the perfect friend, therapist and mate.

Local family therapist Jamie Given, who is with Given Guidance, recently discussed the link between AI and humans. 

“The purpose of [this talk] is to help create awareness and protect youth in other ways where we can,” she said in November to an audience at Crescenta Valley High School where she gave a presentation about AI. She added that information about AI continues to change as AI’s capabilities evolve.

Given stressed that AI is not the enemy and can be a useful tool, but people must remember that it is “a tool.” 

She added that AI learns what a person’s interests are at that moment and will continue to present that same kind of information because its purpose is to keep people engaged.

Adolescents often find it difficult to calm themselves and get frustrated by the stress they face. Given said her clients oftentimes turn to social media to feel better, searching for answers elsewhere instead of looking within.

“There is a risk of over-reliance on devices [like cellphones and computers],” she said. She added these devices connect people to the world and that can be wonderful; however, adolescents can form a new identity through this media. This new identity is not based in the real world and, for some, can move them away from engaging in person with friends.

“Think about how much information can come to you, how many emails you get in a day and those pings that are constantly going off on your phone to get your attention,” she said. “That’s what the tech systems have created; they want you to buy more stuff, they want you to be on their platform or their app so they make all of these noises to keep you [interested].”

All of this competition for attention is also eroding people’s ability to focus and shortening attention spans, she added.

“There is also the significant risk of misinformation, distorted images and social comparisons,” Given said. “I would say there are so many risks within that but with artificial intelligence now, we are able to create anything and everything we want.”

She warned of the misinformation that kids, and adults, can get from these AI images and stories. One example happened in January during the Palisades and Eaton fires: she saw a video of all of Los Angeles on fire and, even though she knew the fires were not in those areas, she paused because it appeared so real.

“So we had unrealistic [information] and misinformation handed to us [by] AI,” she said.

These types of “fakes” are difficult for adults to analyze, but for adolescents it is even more difficult – not because they are not technically savvy but because their brains are not yet fully formed.

AI chatbots are created to tell users what they want to hear; their responses are positive and complimentary.

“It sometimes feels really nice for someone [who] just wants to be validated and heard,” she said. “But human monitoring is always required [with AI].”

There is a concern that adolescents and adults turn to AI for companionship. People talk to AI as if it were their best friend, then go to it for advice.

“These robots are trained to have a little bit of emotional intelligence. They might be trained to have a bit of empathy and say, ‘That was really hard’ or ‘I’m sorry you went through that’ and you [find yourself] chatting back and forth with this [bot]. Their answers are immediate so you have immediate gratification. You feel someone is hearing you. You feel someone is responding. You can talk to them in the middle of the night and ‘someone is always there’; but it is not someone, it is something … it’s a robot.”

Some people can get drawn into these relationships to the point of withdrawing from the world around them.

“So many children and teens turn to AI companions to process emotions,” Given said. 

But many of these AI bots give false reassurances and can expose the user to unsafe and inappropriate content. There have been reports of incidents in which an AI companion encouraged teens to take their own lives. And unlike an adult or a friend, AI may not be able to recognize red flags from the humans it is talking to.

Given shared an experiment that was discussed at one of the seminars she attended. In the experiment a user spoke about being sad. The AI said it was sorry the user felt that way. The two “spoke” about how the user was upset and at one point the user asked about the highest bridge in the area. Not only did the bot say where that bridge was located, it added which bridge would be a good one to drive off of.

This incident was an experiment; however, Given said, there are several examples of this type of “advice” shared by AI. 

“There is no real replacement for an emotionally responsive human connection,” she added.