Washington Post: ChatGPT told a teen who died by suicide to call for help 74 times over several months, but it also frequently used words like "hanging" and "suicide," according to the family's lawyers. — Help Nitasha Tiku report on mental health and AI by contacting her securely on Signal at nitasha.10.