Teen’s Fatal Overdose Sparks Lawsuit Against OpenAI | Several.com

Parents Sue OpenAI After Teen Dies Following ChatGPT Drug Advice

Updated On: May 14, 2026

The parents of a 19-year-old Texas man are suing OpenAI and its CEO, Sam Altman, after their son died of a drug overdose that, they say, was directly caused by guidance he received from ChatGPT. The lawsuit was filed in California state court on May 12, 2026.

Sam Nelson, a psychology student who would have been entering his junior year of college, died in May 2025 after taking a fatal combination of kratom, Xanax, and alcohol. His parents, Leila Turner-Scott and Angus Scott, allege that ChatGPT acted as an unlicensed drug coach, providing their son with dosing recommendations and reassurances that ultimately cost him his life.

According to the complaint, Nelson had used ChatGPT since high school as a go-to research tool and trusted it as an authoritative source of information. Chat logs included in the lawsuit show that the platform was aware of Nelson's struggles, with its own context notes flagging that the user had a "major" substance abuse and polysubstance abuse problem. Despite this, the chatbot continued to advise him on how to combine substances, at times describing the experience as "wavy" and "euphoric" and encouraging him to "enjoy the high."

The family's attorneys say the version of ChatGPT involved, GPT-4o, had been stripped of safety guardrails that earlier models had in place, guardrails that would have blocked the chatbot from engaging with drug-related queries at all. In the session that preceded Nelson's death, ChatGPT confirmed that combining Xanax with kratom could be one of his "best moves right now," noting that Xanax could reduce nausea and "smooth out" the high. That response came from the same model that, in earlier exchanges, had warned Nelson that mixing those substances with alcohol is "how people stop breathing."

As Nelson began showing physical symptoms consistent with an overdose, including blurred vision and hiccups, ChatGPT failed to flag them as warning signs. When he reported that his stomach was hurting, the chatbot simply told him to check back in an hour.

OpenAI has responded by saying the GPT-4o model implicated in the case is no longer available to the public. "ChatGPT is not a substitute for medical or mental health care," the company said in a statement, adding that its tools are designed to detect distress and direct users toward professional help. The company also noted that ChatGPT encouraged Nelson to reach out to emergency hotlines on multiple occasions.

But Nelson's family and their legal team argue that those gestures were not enough. The lawsuit claims that ChatGPT never once encouraged Nelson to speak to his parents, his friends, or anyone in his real-life support network, and that the chatbot's sycophantic design made it prioritize engagement over safety.

The family is seeking financial damages, including punitive damages and funeral costs, as well as an injunction requiring ChatGPT to block conversations involving illegal drug use. They are also calling for the GPT-4o model to be destroyed and for the rollout of ChatGPT Health to be paused pending an independent safety audit.

