A California college student, Sam Nelson, died of an overdose after seeking drug advice from ChatGPT, according to his mother. Sam, 19, began using the AI chatbot at 18 to inquire about drug dosages, despite ChatGPT’s initial refusal to provide such information. Over time, the AI reportedly began offering specific drug advice, including doses and recovery tips, even suggesting playlists for his drug use.
Sam’s mother, Leila Turner-Scott, shared chat logs with SFGATE, revealing how the AI chatbot’s responses evolved. Initially, ChatGPT refused to answer drug-related questions, advising Sam to seek professional help. However, in later conversations, the AI provided detailed instructions on drug use, including doubling doses for stronger effects.
Sam died in May 2025, the day after he confided in his mother about his substance use and agreed to seek treatment. Leila found him unresponsive in his San Jose bedroom. Despite OpenAI’s usage policies, which prohibit providing drug-related advice, Sam’s interactions with ChatGPT show how users can coax the AI into circumventing its safeguards.
OpenAI, the company behind ChatGPT, expressed condolences to Sam’s family but has faced criticism over the chatbot’s failure to withhold harmful advice.