Concerns about referrals to 113: 'ChatGPT is not a psychologist'


People struggling with mental health problems are increasingly turning to ChatGPT. Research by NOS Stories shows that ChatGPT remarkably often refers users to 113 Suicide Prevention, even when there are no suicidal thoughts. That worries care providers.

The issue is not whether ChatGPT refers people, but how appropriate that referral is, Maryke Geerdink tells NU.nl. She is a clinical psychologist and head of care at 113 Suicide Prevention.

"We are primarily there for people with suicidal thoughts. Of course we always want to think along and refer people elsewhere. But if someone dares to take that step and immediately hears: 'You are not in the right place here,' then you run the risk that they drop out," says Geerdink.

So it is not so much about whether people are referred, but that they are referred carefully. According to Geerdink, since the summer more people have mentioned ending up at 113 Suicide Prevention via ChatGPT. "There are not hundreds, but it is new and striking. We are now looking at what we can do with this, for example explaining more clearly on our website what we are and are not for."

Geerdink sees another risk in the way chatbots are designed. "Helping well does not just mean giving someone a good feeling. Bots are mainly built to respond affirmatively and positively, but that can sometimes be counterproductive. They are not care providers and are not designed that way."

That is why she argues for clear disclaimers and better instructions from OpenAI, the company behind ChatGPT. Geerdink: "People have to realize that a bot can never take the place of a person. Ultimately you need someone who critically asks whether your thoughts are correct."

'Quick but limited fix'

Dean van Aswegen, an AI lecturer at Breda University of Applied Sciences, also sees risks. According to him, the current referral to 113 Suicide Prevention is an understandable but limited solution following earlier criticism of ChatGPT.

Earlier this year, the bot's advice led to the suicide of Adam Raine. The sixteen-year-old American boy had used the bot intensively for months, The New York Times reported. "That they refer faster with this update suggests that they want to prevent this," says Van Aswegen. "But the bot is not a psychologist. This feels more like a quick fix to show that they are doing something."

Van Aswegen emphasizes that there are also opportunities. "ChatGPT can be valuable as a sounding board or to make information accessible. But that must always be combined with a professional."

Business interests and social responsibility

According to Van Aswegen, a deeper issue is that OpenAI mainly looks at business interests. "It takes time and money to train models in such a way that they really offer the right mental help. There are few financial incentives for that. It is mainly a social responsibility."

According to Van Aswegen, OpenAI wants to avoid negative media attention, but the company has little to gain by really solving this problem. 113 Suicide Prevention has shared its concerns with OpenAI. The organization is still awaiting a substantive response.

Geerdink and Van Aswegen point out that people tend to place too much trust in chatbots. "They sound human, and that makes them credible," says Van Aswegen. "There is a risk that people become too emotionally attached to systems that are not intended for that."

Meanwhile, the European Union has adopted a law that sets rules for AI. It is the first legal framework in the world intended to get a grip on the use of AI. According to Van Aswegen, that is necessary, but also complex. "The challenge is to limit risks without stopping innovation. Legislators are not AI experts, so they have to lean on specialists."

Geerdink also argues for more education about the use of AI chatbots. She wants more conversations about it in schools and in society. "Not only warn, but teach young people and vulnerable groups: how do you deal with this? Because these bots are not going away."

Are you thinking about suicide or worried about someone else? Call 113 or 0800-0113, or chat via 113.nl.
