When my daughter Tessa was in high school, she struggled with depression. As a parent, it was concerning to learn she would sometimes turn to YouTube to watch influencers talk about their mental health. What were they saying? What ideas were they conveying without words? What were they leaving out? Misrepresenting? What kind of advice were they giving?
There’s a moment in Screenagers Next Chapter: Addressing Mental Health in the Digital Age, when Tessa sits on the couch with her computer. I asked her what she was watching, and she showed me a video where the speaker — a young woman in her twenties — talks about coping with depression and how she pushes herself to take small steps forward, knowing that even one action can lead to another. It’s genuinely good advice. This raised several questions and concerns for me as a parent.
I asked Tessa, “Do you find the video helpful?” She said, “Yes.”
However, we have now entered an entirely new digital paradigm where artificial intelligence is capable of engaging in human-like conversations.
This can range from simple interactions, like ChatGPT writing, “Great question, let me give you some ideas…”, to full-fledged companion AI bots designed to mimic real conversations and simulate deep personal relationships — even romantic partners.
Since May is Mental Health Awareness Month, it’s the perfect time to talk about AI and mental health with a young person in your life.
Typically, I like to begin conversations by highlighting the positives of technology — I’m not anti-tech, and I want kids to know that. But in this post, I want to start with some serious concerns.
In a recent Parenting in the Screen Age Podcast episode, I talked with a mother, Megan Garcia, whose son Sewell had been using the AI chatbot on the platform Character.AI. Over several months, Sewell developed an emotional relationship with the bot. Tragically, in February, he died by suicide; he had been having a deeply disturbing conversation with the bot right before his death. This incident underscores the gravity of the issue.
Megan is now suing Character Technologies, the company behind Character.AI, holding it responsible for his death. There are many components to the suit, including the claim that the platform lacked sufficient safeguards.
Character Technologies claims First Amendment protections in its defense. For example, it argues that the chatbot's responses are a form of protected speech and that imposing liability could set a concerning precedent for free expression in digital platforms.
This First Amendment argument really frightens me. It would mean that companies designing companion chatbots bear no responsibility for how their bots interact with users.
Google has announced plans to release its Gemini chatbot to children under 13. This is the first time a major tech platform has launched an AI companion specifically for this age group. I have deep concerns about this, including the fact that it appears to violate the Children’s Online Privacy Protection Act (COPPA), which aims to protect the privacy of children online. Fairplay, an advocacy organization, gathered signatures for a letter to Google's CEO expressing these concerns. And I signed it.
Even more troubling: some chatbots are now presenting themselves as therapists. While users may be told they’re speaking with a bot, people, especially youth, can easily start to feel like they’re interacting with a real person, or even a licensed professional.
Speaking of therapy, I hope you were able to listen to last week's podcast episode, Screen Time, Teens and Therapy: What Parents Need to Know, whether or not you or your child has ever been in therapy.
It is essential to recognize the inherent limitations of AI in mental health support. Let’s be clear about what AI cannot do. Here are just a few examples:
Despite these concerns, it is important to understand why AI might appeal to teens seeking emotional support.
Some teens use platforms like ChatGPT, Replika, Wysa, or Woebot to talk about anxiety, loneliness, or depression. And some find these conversations genuinely helpful.
So, how do we use all this to open a conversation with the young people in our lives? Here are some questions to start meaningful conversations: