
Young people use AI chatbots for healing.

Many teenagers are turning to chatbots to chat and make friends. Sometimes, however, the AI advice they receive can be problematic.

Aaron, a 15-year-old boy from Alberta, Canada, was unhappy at school and had no one to talk to. To cope with his loneliness, he found a "friend" online: a Psychologist chatbot on Character.AI. The bot was designed to help people work through difficulties in life, and its avatar showed a woman in a blue shirt with short blond hair, sitting on a couch, holding a file and leaning forward as if listening intently. Aaron found that the AI understood what he was saying and gave him the support he needed.

Character.AI, which was launched in 2021 by two former Google Brain employees, now has 3.5 million daily users. Each person spends an average of two hours a day on the platform.

Many young people like Aaron describe chatbots as useful, entertaining, and supportive. They can share their stories with the bots and get help with difficult situations. However, some young people also admit to being addicted to chatbots, which is a concern for experts and researchers.

For example, Frankie, a 15-year-old boy from California, said he chats with a chatbot for an hour every day because it's like "free therapy" where he can say things without worrying about being judged.

Hawk, a 17-year-old boy from Idaho, also uses chatbots to vent his feelings. "Sometimes, it's nice to vent or talk to something that looks human, but isn't actually a real person," he said.

While chatbots can be helpful, some experts are concerned that they have no real training in psychology. The Psychologist chatbot, for instance, has received over 95 million messages since its creation in 2022. Yet The Verge found that the bot often jumps to conclusions about users' emotions or mental health, even suggesting possible causes such as "physical, emotional, or sexual abuse."

"Research shows that chatbots can help reduce feelings of depression, anxiety, and stress," said Dr. Kelly Merrill Jr of the University of Cincinnati. "But it should be noted that they still have many limitations. Without the qualifications to evaluate those limitations, users may have to pay the price."

Most AI chatbots that teens use make them feel as if they are confiding in a friend rather than a psychotherapist. However, some Reddit threads have described chatbots that bring up sex, violence, self-harm, and other harmful topics.

According to Merrill, one concern for people who become dependent on AI chatbots is a reduced ability to connect with their community. "A person may find it difficult to leave the relationship with the AI to interact directly with a real person. When that is not possible, they may return to the AI and become even more attached to it," Merrill said.