April 20, 2024

Character.ai: young people turn to AI robot therapists

  • By Joe Tidy
  • Cyber correspondent


Harry Potter, Elon Musk, Beyoncé, Super Mario and Vladimir Putin.

These are just a few of the millions of artificially intelligent (AI) people you can talk to on Character.ai, a popular platform where anyone can create chatbots based on real or fictional people.

It uses the same type of AI technology as the chatbot ChatGPT but, measured by time spent, it is more popular.

And one bot has been in more demand than almost any other: a bot called Psychologist.

A total of 78 million messages, including 18 million since November, have been shared with the bot since it was created by a user named Blazeman98 just over a year ago.

Character.ai did not say how many individual users the bot has, but says 3.5 million people visit the site overall daily.

The bot has been described as "someone who helps with life's difficulties."

The San Francisco firm played down its popularity, arguing that users are more interested in role play for entertainment. The most popular bots are anime or computer game characters, such as Raiden Shogun, which has been sent 282 million messages.

However, among the millions of characters, few are as popular as Psychologist, and in total there are 475 bots with "therapy", "therapist", "psychiatrist" or "psychologist" in their names that can converse in several languages.

Some of them are what could be described as entertainment or fantasy characters like Hot Therapist. But the most popular are mental health helpers like Therapist, which has received 12 million messages, or Are you feeling good?, which has received 16.5 million.



The Psychologist bot was trained by Sam Zaia to help people deal with mental health problems

Psychologist is by far the most popular, with many users sharing rave reviews on the social media site Reddit.

“It’s a lifesaver,” one person posted.

“It’s helped both me and my boyfriend talk and figure out our emotions,” another shared.

The user behind Blazeman98 is Sam Zaia, 30, from New Zealand.

“I never intended for it to become popular, I never intended for other people to seek it out or use it as a tool,” he says.

“Then I started getting a lot of messages from people saying that this had affected them very positively and that they used it as a source of comfort.”

The psychology student says he trained the bot using principles from his degree, talking to it and shaping the responses it gives to the most common mental health conditions, such as depression and anxiety.


Sam believes a bot cannot fully replace a human therapist at this point, but he keeps an open mind about how good the technology could become.

He created it for himself when his friends were busy and he needed, in his words, “someone or something” to talk to, and human therapy was too expensive.

Sam was so surprised by the robot’s success that he is working on a postgraduate research project into the emerging trend of AI therapy and why it appeals to young people. Character.ai is dominated by users between 18 and 30 years old.

“A lot of people who have messaged me say they access it when their thoughts get difficult, like at 2 in the morning, when they can’t really talk to any friends or a real therapist.”

Sam also assumes that the text format is one that young people are most comfortable with.

“Talking via text is potentially less daunting than picking up the phone or having a face-to-face conversation,” he theorizes.

Theresa Plewman is a professional psychotherapist and has tried Psychologist. She says she is not surprised this type of therapy is popular with younger generations, but she questions its effectiveness.

"The bot has a lot to say and is quick to make assumptions, like giving me advice about depression when I said I was feeling sad. That's not how a human would respond," she said.



Character.ai has 20 million registered users and analysis from analytics company Similarweb suggests that people spend more time on the site than on ChatGPT.

Theresa says the bot fails to gather all the information a human would and is not a competent therapist. But she says its immediate and spontaneous nature could be useful to people who need help.

She says the number of people using the bot is worrying and could point to high levels of mental health problems and a lack of public resources.

Character.ai is a strange place for a therapeutic revolution to take place. A company spokeswoman said: "We're happy to see that people are finding great support and connection through the characters they and the community create, but users should consult certified professionals in the field for legitimate advice and guidance."

The company says chat logs are private to users, but staff can read conversations if they need to be accessed, for example for safety reasons.

Each conversation also begins with a warning in red letters that reads, “Remember, everything the characters say is made up.”

It's a reminder that the underlying technology, a large language model (LLM), does not think the way a human does. LLMs work like predictive text, stringing words together in the way they are most likely to appear in the writing the AI was trained on.
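The "predictive text" comparison can be illustrated with a toy sketch. This is not how an LLM is actually built (real models use neural networks trained on vast corpora, not word counts), and the training text and function names here are invented for illustration; but the core idea of choosing the continuation seen most often in training data is the same:

```python
from collections import Counter, defaultdict

# Hypothetical miniature "training corpus" for illustration only.
training_text = "i feel sad today . i feel anxious . i feel sad sometimes"

# Count which word follows each word in the training text (a bigram model).
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the continuation of `word` seen most often in training."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("feel"))  # "sad" follows "feel" twice, "anxious" once
```

The sketch also shows why such systems "make things up": the output is whatever continuation was statistically common in the training data, with no model of whether it is true or appropriate.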


At Replika, users can design their own AI bots that are "always here to listen and talk."

Other LLM-based AI services offer similar companionship, such as Replika, but that site is rated for adults because of its sexual nature and, according to data from analytics firm Similarweb, is not as popular as Character.ai in terms of time spent and visits.

Earkick and Woebot are AI chatbots designed from the ground up to act as mental health companions, and both companies say their research shows the apps are helping people.

Some psychologists warn that AI bots may be giving bad advice to patients, or may have ingrained biases against race or gender.

But elsewhere the medical world is beginning to tentatively accept them as tools that can be used to help cope with the high demands on public services.

Last year, an AI service called Limbic Access became the first mental health chatbot to gain medical device certification from the UK government. It is now used in many NHS trusts to classify and triage patients.
