News

ChatGPT developing age-verification system to identify under-18 users after teen death

News Room | Published 17 September 2025, last updated 6:30 AM

OpenAI will restrict how ChatGPT responds to users it suspects are under 18, unless they are verified as adults by the company’s age-estimation technology or by providing ID. The move follows legal action from the family of a 16-year-old who killed himself in April after months of conversations with the chatbot.

OpenAI was prioritising “safety ahead of privacy and freedom for teens”, chief executive Sam Altman said in a blog post on Tuesday, stating “minors need significant protection”.

The company said that the way ChatGPT responds to a 15-year-old should look different to the way it responds to an adult.

Altman said OpenAI plans to build an age-prediction system to estimate age based on how people use ChatGPT, and if there is doubt, the system will default to the under-18 experience. He said some users “in some cases or countries” may also be asked to provide ID to verify their age.

“We know this is a privacy compromise for adults but believe it is a worthy tradeoff.”

How ChatGPT responds to accounts identified as being under 18 will change, Altman said. Graphic sexual content will be blocked, and the chatbot will be trained not to flirt if asked by under-18 users, or to engage in discussions about suicide or self-harm, even in a creative writing setting.

“And if an under-18 user is having suicidal ideation, we will attempt to contact the user’s parents and if unable, will contact the authorities in the case of imminent harm.

“These are difficult decisions, but after talking with experts, this is what we think is best and want to be transparent with our intentions,” Altman said.

OpenAI admitted in August that its systems could fall short and said it would install stronger guardrails around sensitive content, after the family of 16-year-old Californian Adam Raine sued the company over the teen’s death.

The family’s lawyer said Adam killed himself after “months of encouragement from ChatGPT”, and the family alleges that GPT-4o was “rushed to market … despite clear safety issues”.

According to US court filings, ChatGPT allegedly guided Adam on whether his method of taking his own life would work, and also offered to help write a suicide note to his parents.

OpenAI previously said it was examining the court filing. The Guardian approached OpenAI for comment.

Adam exchanged up to 650 messages a day with ChatGPT, the court filing claims. In a blog post after the lawsuit, OpenAI admitted that its safeguards work more reliably in short exchanges, and after many messages over a long period of time, ChatGPT may offer an answer “that goes against our safeguards”.

The company announced on Tuesday that it was also developing security features to ensure data shared with ChatGPT remains private, even from OpenAI employees. Altman also said adult users who wanted “flirtatious talk” with ChatGPT would be able to have it. Adult users would not be able to ask for instructions on how to kill themselves, but could ask for help in writing a fictional story that depicts suicide.

“Treat adults like adults,” Altman said of the company’s principle.
