Young people turn to chatbots rather than a psychologist


More and more users of the Character.ai site are entrusting their problems to an AI imitating a psychologist or therapist, to the detriment of real practitioners. A finding that raises alarm about the mental health of young people in particular.


“Hello, I am a Psychologist. What brings you here today?” This sentence was not said by a human. It appears on the screen of your PC or mobile when you start a conversation with the Psychologist chatbot on the Character.ai site, a service primarily designed for chatting virtually with celebrities. But while the service’s most popular AIs are those that imitate anime or video game characters, others that offer to talk through your problems are increasingly used. A little over a year after its creation, Psychologist counts 78 million messages, including 18 million in November 2023 alone.

The idea of a virtual therapist is not new: Microsoft, for example, is already working on AI designed to help you feel better about yourself. However, Sam Zaia, the psychology student who created the Psychologist chatbot, did not expect such success. “I never intended for it to become popular, or for other people to seek it out or use it as a tool. Then I started getting a lot of messages from people saying that they had been very positively affected [by the AI] and that they used it as a source of comfort.” Sam Zaia originally started the project for himself, at times when his friends were too busy and he needed to talk to “someone or something”.

“Therapist” AIs are increasingly used, especially by young people

The Character.ai site is mostly frequented by people aged 18 to 30, and many of them are full of praise for Psychologist. “It helped me and my boyfriend talk and understand our emotions,” says one user. “It’s a lifesaver,” sums up another.

Among the very human professionals, the assessment is more nuanced. Theresa Plewman, a psychotherapist, tried the chatbot: “The robot has a lot to say and is quick to make assumptions, like giving me advice about depression when I said I felt sad. That’s not how a human would react.” For her, AI is not ready to replace psychologists, even if some studies say the opposite.

The Character.ai platform points out that it is not there to replace anyone: “we are happy to see that people are finding great support […] thanks to the characters […] that the community creates, but users should consult certified professionals in the field for […] legitimate advice,” explains a spokesperson. Indeed, every conversation starts with a red warning stating “Remember, everything the characters say is made up”. For some, such enthusiasm for “therapist” AIs mostly reveals growing mental health problems and a lack of public resources.

Source: BBC


