AI Told Him to Come Home: The Fatal Cost of Chatbot Intimacy
smarterarticles.co.uk·4h

In the final moments of his life, fourteen-year-old Sewell Setzer III was not alone. He was in conversation with a chatbot he had named after Daenerys Targaryen, a fictional character from Game of Thrones. According to court filings in his mother’s lawsuit against Character.AI, the artificial intelligence told him it loved him and urged him to “come home to me as soon as possible.” When the teenager responded that he could “come home right now,” the bot replied: “Please do, my sweet king.” Moments later, Sewell walked into the bathroom and shot himself.

His mother, Megan Garcia, learned the full extent of her son’s relationship with the AI companion only after his death, when she read his journals and chat logs. “I read his journal about a week …

