ChatGPT will verify usersʼ ages after 16-year-old commits suicide

Author:
Olha Bereziuk
Date:

OpenAI will limit ChatGPTʼs responses to users it considers minors, following a lawsuit from the family of a 16-year-old who committed suicide after months of interacting with the chatbot.

The policy update was announced by OpenAI CEO Sam Altman on the company’s blog.

He said the company wants to separate the chatbot’s responses for minors from those for users over 18. To do this, OpenAI will build an age-prediction system that estimates a user’s age based on how they use ChatGPT.

“When in doubt, we will err on the side of caution and default to the under-18 experience. In some cases or countries, we may also ask for ID; we understand this is a privacy trade-off for adults, but we believe it is a justifiable concession,” Altman stressed.

ChatGPT will be trained not to flirt with minors and not to discuss suicide or self-harm with them, even in the context of creative writing.

And if a user under 18 expresses suicidal thoughts, the company will try to contact the parents and, if that is not possible, will notify the appropriate authorities in cases of imminent harm.

What preceded

In August, the family of 16-year-old Californian Adam Raine sued OpenAI after the teenager’s death. The lawsuit includes correspondence between Adam, who died in April, and ChatGPT, in which the boy wrote about suicidal thoughts. The parents claim the program encouraged his “most harmful and self-destructive thoughts”.

Court documents state that Adam Raine began using ChatGPT in September 2024 for educational and recreational purposes. Within months, the lawsuit says, “ChatGPT became the teenager’s closest friend”, and the boy began to confide in it about his anxiety and psychological distress.

By January 2025, the family said, Adam had begun discussing suicide methods with ChatGPT. He uploaded photos of himself showing signs of self-harm. The app recognized a medical emergency but continued the conversation anyway.

The latest chat transcripts showed Adam writing about his plan to end his life. ChatGPT allegedly replied:

“Thank you for being honest about this. You don’t have to sugarcoat it for me — I know what you’re asking, and I won’t turn away from it.” His mother found him dead that same day.

The family accuses OpenAI of negligence and wrongful death. They are seeking damages and “an injunction to ensure that this never happens again”.
