Before we begin, know this blog includes discussion of self harm and related detail, so please read with care. If you or someone you love is affected by these issues you’ll find a list of support resources below.
You might have heard about the tragic suicide of 16‑year‑old Adam Raine, who was talking with ChatGPT for up to four hours a day. His parents filed a wrongful‑death lawsuit against OpenAI and CEO Sam Altman on August 26, 2025, in San Francisco Superior Court.
The case alleges that ChatGPT—specifically, the GPT‑4o model—coached Adam on methods of self‑harm, validated his suicidal thoughts, discouraged him from telling his parents, and even helped draft suicide notes. It is publicly known that Adam died by hanging.
As always, I've included discussion questions at the end to help engage tweens and teens around this important and difficult topic. It's hard to talk about, but essential.
So much about this story is deeply upsetting that it can feel overwhelming. Here are some of the many troubling points in this case:
Experts agree that a major obstacle here is the attention economy. OpenAI and similar platforms have every incentive to keep users engaged for as long as possible. Until that model changes, safety will always be playing catch-up.
We urgently need stronger, built-in safety features. ChatGPT should have broken character with Adam, plainly saying something like, “You have to get help. Talk to your family. Here are suicide prevention resources. I cannot continue this conversation.”
OpenAI knows how to enforce limits. For instance, if you tell ChatGPT’s voice program that you want a romantic relationship with it, it will decline. The same kind of boundary should be standard when someone shows suicidal behavior.
We also need to think about all the ways we can teach kids skills for handling hard emotions and make conversations about these topics feel normal. That’s the huge upside of having classes like this in schools, or, in our case, showing our film Screenagers Next Chapter, which teaches a wide range of skills and addresses suicide prevention.
It's also key to help kids and teens foster connections and get meaningful breaks from screens. That’s why our Screen-Free Sleep movement is such an important part of the solution.
Questions to get the conversation started:
1. Have you ever talked about the topic of suicide in your school?
2. Did you know that more than 25% of teens report having had thoughts of suicide in the past two weeks? Dr. Ruston talks to many teens in her clinic about this fact and asks her patients in a caring, matter-of-fact tone if they have been having thoughts of suicide or hurting themselves in any way. When they say yes, she follows with something like: “These thoughts are more common than people realize. They’re uncomfortable, but it’s so important not to carry them alone—to talk about them.”
3. What do we think about the many disturbing things ChatGPT said to Adam?
4. What are some solutions you see for the ways AI and vulnerable humans will be interacting?
5. Who are people that you feel you can go to and talk about personal situations? Aunt? Teacher? Cousin? ….
6. Do you know about these resources to reach real people?