Two studies show how popular LLMs and apps can make ethical blunders when playing therapist to teens in crisis
sciencenews.org

Content note: This story contains harmful language about sexual assault and suicide, sent by chatbots in response to simulated messages of mental health distress. If you or someone you care about may be at risk of suicide, the 988 Suicide and Crisis Lifeline offers free, 24/7 support, information and local resources from trained counselors. Call or text 988 or chat at 988lifeline.org.

Just because a chatbot can play the role of therapist doesn’t mean it should.

Conversations powered by popular large language models can veer into problematic and ethically murky territory, two new studies show. The research comes amid recent high-profile tragedies involving adolescents in mental health crises.
