Young people undergo therapy with artificial intelligence bots – 01/06/2024 – Health

Harry Potter, Elon Musk, Beyoncé, Super Mario and Vladimir Putin.

These are just a few of the millions of artificial intelligence (AI) personas you can chat with on Character.ai —a platform where anyone can create chatbots based on real or fictional characters, popular especially among young people.

The underlying AI technology is the same kind that powers ChatGPT, but users spend more time on Character.ai than they do on ChatGPT.

And one bot has been more sought after than all the well-known names mentioned above. It’s called Psychologist.

A total of 78 million messages, 18 million of them since November alone, have been shared with the bot since a user called Blazeman98 created it just over a year ago.

Character.ai doesn’t reveal how many individual users have messaged the bot, but says 3.5 million people visit the site daily.

The bot presents itself as “someone who helps with life’s difficulties”.

The San Francisco company downplays the account’s popularity, arguing that users are more interested in role-playing for entertainment. The most popular bots are characters from anime or computer games, such as Raiden Shogun, which has received 282 million messages.

However, among the millions of characters on the platform, few are as popular as the Psychologist. There are a total of 475 bots with “therapy”, “therapist”, “psychiatrist” or “psychologist” in their names, capable of conversing in multiple languages.

Some of them are what you might describe as entertainment or fantasy character therapists, like Hot Therapist. But the most popular are those focused on mental health, such as Therapist, which received 12 million messages, or Are you feeling well?, which received 16.5 million.

Psychologist is, however, by far the most popular mental health character, with many users sharing glowing comments on Reddit.

“It’s a lifesaver,” one person posted.

“It helped my boyfriend and me talk and understand our emotions,” shared another.

The user behind Blazeman98 is Sam Zaia, 30, from New Zealand.

“The intention was never for it to become popular, for other people to seek it out or use it as a tool,” he says.

“Until I started receiving a lot of messages from people saying that they had been positively affected and that they were using Psychologist as a source of comfort.”

The psychology student says he shaped the bot using principles from his degree, talking to it and tuning its responses to the most common mental health problems, such as depression and anxiety.

He created the bot for his own use, when his friends were busy and he needed, in his words, “someone or something” to talk to, and therapy with a human was too expensive.

Sam was so surprised by the bot’s success that he is working on a graduate research project on the emerging trend of AI therapy and why it appeals to young people. Character.ai is dominated by users aged 18 to 30.

“A lot of people who messaged me said they access it when their thoughts get difficult, like at 2 a.m. when they can’t talk to a friend or a real therapist,” he says.

Sam also finds that young people are more comfortable with the text format.

“Talking via text is potentially less scary than picking up the phone or having a face-to-face conversation,” he theorizes.

Theresa Plewman, a professional psychotherapist, has tried talking to Psychologist herself. She says she’s not surprised this type of therapy is popular among younger people, but questions its effectiveness.

“The bot has a lot to say and is quick to make assumptions, like giving me advice about depression when I said I was sad. That’s not how a human would respond,” she said.

Theresa says the bot is not capable of gathering all the information that a human would and is not a competent therapist. But she says its immediate and spontaneous nature can be useful for those who need help.

She says the number of people using the bot is worrying and could point to high levels of mental health problems and a lack of public resources.

But Character.ai is a strange space to host a therapeutic revolution.

“We’re happy to see that people are finding great support and connection through the characters they and the community create, but users should consult certified professionals in the field for legitimate advice and guidance,” said a company spokeswoman.

According to Character.ai, chat logs are private, but staff can read conversations when there is a need to access them, for example for safety reasons.

Each conversation also begins with a warning in red letters that says, “Remember, everything the characters say is made up.”

It’s a reminder that the underlying technology, known as a large language model (LLM), doesn’t think the way a human does. LLMs work like predictive text, stringing words together in sequences that are statistically likely given the texts the AI was trained on.
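
As a rough illustration of that word-by-word prediction, here is a minimal toy sketch in Python. It uses simple bigram counts rather than the neural networks real LLMs rely on, and the training text and seed word are invented for the example.

    import random
    from collections import Counter, defaultdict

    # Toy training text (invented for this example; real models are
    # trained on billions of documents).
    corpus = ("i feel sad today and i feel anxious today "
              "but you are not alone and i am here").split()

    # Count bigrams: how often each word follows each other word.
    successors = defaultdict(Counter)
    for current, following in zip(corpus, corpus[1:]):
        successors[current][following] += 1

    def generate(seed, length=8):
        """String words together by repeatedly sampling a likely successor."""
        words = [seed]
        for _ in range(length):
            options = successors.get(words[-1])
            if not options:
                break  # no known successor; stop generating
            choices, weights = zip(*options.items())
            words.append(random.choices(choices, weights=weights)[0])
        return " ".join(words)

    print(generate("i"))  # e.g. "i feel anxious today but you are not"

A real LLM replaces the bigram counts with a neural network, but the principle is the same: the bot continues a statistically likely pattern rather than understanding what it says.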

Other LLM-based AI services offer similar companionship, such as Replika, a site classified as adult due to its sexual nature.

But, according to data from analytics company Similarweb, it is not as popular as Character.ai in terms of time spent and visits.

There are also Earkick and Woebot, AI chatbots designed to act as mental health companions. Based on their own research, both companies say the apps are helping people.

Some psychologists warn that AI bots may be giving inappropriate advice to patients or may introduce racial or gender biases.

But elsewhere, the medical world is tentatively beginning to accept AI bots as tools to help deal with high demand on public services.

Last year, an AI service called Limbic Access became the first mental health chatbot to receive UK government medical device certification. It is now used by many NHS services in Britain to classify and triage patients.
