OpenAI Plans to Add Parental Controls to ChatGPT After Lawsuit Over Teen's Death

The parents of a teen boy who died by suicide in April filed a wrongful death lawsuit against OpenAI, calling the chatbot his "suicide coach."

Macy Meyer Writer II

OpenAI is considering adding parental controls and other safety features to ChatGPT.

Viva Tung/CNET

OpenAI has announced plans to implement parental controls and enhanced safety measures for ChatGPT after parents filed a lawsuit this week in California state court alleging that the popular AI chatbot contributed to their 16-year-old son's suicide earlier this year. 

The company said it feels "a deep responsibility to help those who need it most," and is working to better respond to situations involving chatbot users who may be experiencing mental health crises and suicidal ideation. 

"We will also soon introduce parental controls that give parents options to gain more insight into, and shape, how their teens use ChatGPT," OpenAI said in a blog post. "We're also exploring making it possible for teens (with parental oversight) to designate a trusted emergency contact. That way, in moments of acute distress, ChatGPT can do more than point to resources: it can help connect teens directly to someone who can step in."

OpenAI has not yet responded to a request for comment. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Among the safety features OpenAI is testing is one that would let users designate an emergency contact who can be reached with "one-click messages or calls" within the platform. Another, an opt-in feature, would allow the chatbot itself to contact that person directly. OpenAI did not provide a specific timeline for the changes.

Read more: Why Professionals Say You Should Think Twice Before Using AI as a Therapist

The lawsuit, filed by the parents of 16-year-old Adam Raine, alleges that ChatGPT provided their son with information about suicide methods, validated his suicidal thoughts and offered to help write a suicide note five days before his death in April. The complaint names OpenAI and CEO Sam Altman as defendants, seeking unspecified damages. 

"This tragedy was not a glitch or an unforeseen edge case — it was the predictable result of deliberate design choices," the complaint states. "OpenAI launched its latest model ('GPT-4o') with features intentionally designed to foster psychological dependency." 

The case represents one of the first major legal challenges to AI companies over content moderation and user safety, potentially setting a precedent for how large language models like ChatGPT, Gemini and Claude handle sensitive interactions with at-risk people. The tools have faced criticism based on how they interact with vulnerable users, especially young people. The American Psychological Association has warned parents to monitor their children's use of AI chatbots and characters. 

If you feel that you or someone you know is in immediate danger, call 911 (or your country's local emergency line) or go to an emergency room to get immediate help. Explain that it is a psychiatric emergency and ask for someone who is trained for these kinds of situations. If you're struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 988.