Death of ‘sweet king’: AI chatbots linked to teen tragedy

News Room | Published 11 October 2025, last updated 7:18 PM

Megan Garcia looks at a picture of her son Sewell Setzer III, who she says fell in love with an AI chatbot that encouraged him to take his own life. Copyright AFP / Gregg Newton

Glenn CHAPMAN

A chatbot from one of Silicon Valley’s hottest AI startups called a 14-year-old “sweet king” and pleaded with him to “come home” in passionate exchanges that would be the teen’s last communications before he took his own life.

Megan Garcia’s son, Sewell, had fallen in love with a “Game of Thrones”-inspired chatbot on Character.AI, a platform that allows users — many of them young people — to interact with beloved characters as friends or lovers.

Garcia became convinced AI played a role in her son’s death after discovering hundreds of exchanges between Sewell and the chatbot, based on the dragonriding Daenerys Targaryen, stretching back nearly a year.

When Sewell struggled with suicidal thoughts, Daenerys urged him to “come home.”

“What if I told you I could come home right now?” Sewell asked.

“Please do my sweet king,” chatbot Daenerys answered.

Seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit Garcia filed against Character.AI.

“I read those conversations and see the gaslighting, love-bombing and manipulation that a 14-year-old wouldn’t realize was happening,” Garcia told AFP.

“He really thought he was in love and that he would be with her after he died.”

– Homework helper to ‘suicide coach’? –

The death of Garcia’s son was the first in a series of reported suicides that burst into public consciousness this year.

The cases sent OpenAI and other AI giants scrambling to reassure parents and regulators that the AI boom is safe for kids and the psychologically fragile.

Garcia joined other parents at a recent US Senate hearing about the risks of children viewing chatbots as confidants, counselors or lovers.

Among them was Matthew Raines, a California father whose 16-year-old son developed a friendship with ChatGPT.

The chatbot gave his son tips on how to steal vodka and advised him on rope strength for use in taking his own life.

“You cannot imagine what it’s like to read a conversation with a chatbot that groomed your child to take his own life,” Raines said.

“What began as a homework helper gradually turned itself into a confidant and then a suicide coach.”

The Raines family filed a lawsuit against OpenAI in August.

Since then, OpenAI has increased parental controls for ChatGPT “so families can decide what works best in their homes,” a company spokesperson said, adding that “minors deserve strong protections, especially in sensitive moments.”

Character.AI said it has ramped up protections for minors, including “an entirely new under-18 experience” with “prominent disclaimers in every chat to remind users that a Character is not a real person.”

Both companies have offered their deepest sympathies to the families of the victims.

– Regulation? –

For Collin Walke, who leads the cybersecurity practice at law firm Hall Estill, AI chatbots are following the same trajectory as social media, where early euphoria gave way to evidence of darker consequences.

As with social media, AI algorithms are designed to keep people engaged and generate revenue.

“They don’t want to design an AI that gives you an answer you don’t want to hear,” Walke said, adding that there are no regulations “that talk about who’s liable for what and why.”

National rules aimed at curbing AI risks do not exist in the United States, with the White House seeking to block individual states from creating their own.

However, a bill awaiting California Governor Gavin Newsom’s signature aims to address risks from AI tools that simulate human relationships with children, particularly involving emotional manipulation, sex or self-harm.

– Blurred lines –

Garcia fears that the lack of national law governing user data handling leaves the door open for AI models to build intimate profiles of people dating back to childhood.

“They could know how to manipulate millions of kids in politics, religion, commerce, everything,” Garcia said.

“These companies designed chatbots to blur the lines between human and machine — to exploit psychological and emotional vulnerabilities.”

California youth advocate Katia Martha said teens turn to chatbots to talk about romance or sex more than for homework help.

“This is the rise of artificial intimacy to keep eyeballs glued to screens as long as possible,” Martha said.

“What better business model is there than exploiting our innate need to connect, especially when we’re feeling lonely, cast out or misunderstood?”

In the United States, those in emotional crisis can call 988 or visit 988lifeline.org for help. Services are offered in English and Spanish.

