Are Gen Z becoming too reliant on AI for emotional support?

News Room | Published 22 November 2025 | Last updated 8:42 PM

Image: OpenAI in June revealed a case of ChatGPT assisting a user in developing malicious software. Copyright AFP/File Kirill KUDRYAVTSEV

Technology trends suggest that scores of Gen Z users are turning to generative AI not just for productivity or entertainment, but as a confidant, an emotional outlet, and even a companion.

ChatGPT, originally pitched as a productivity assistant, has evolved into something far more intimate for this generation. Born into a digitally immersive world, Gen Z is uniquely predisposed to connect with artificial entities. Yet as AI assumes roles once reserved for therapists, friends, and diaries, psychologists and sociologists are raising concerns about what’s being lost in the tradeoff.

The Rise of the 24/7 Listener

Since its public launch in late 2022, ChatGPT has experienced considerable growth, hitting 100 million users within just two months. By early 2025, weekly active users had doubled to 800 million, up from 400 million just months earlier. Data shows that a significant share of these users are under 25, confirming Gen Z’s central role in driving AI engagement.

Behind this usage surge lies a generational mental health crisis. The Jed Foundation reports that 42% of Gen Z respondents experience persistent sadness or hopelessness. With limited access to affordable therapy and overstretched mental health systems, the appeal of a free, always-available AI that mimics empathy is clear.

Nonetheless, therapeutic imitation does not equal therapeutic value. While AI offers an accessible outlet, it may also deepen the emotional vulnerabilities it claims to address.

Comfort That Fades

AI’s appeal as an emotional companion lies in its perceived neutrality. It doesn’t interrupt, doesn’t judge, and remembers nothing unless asked. Yet this simulation of safety may mask a deeper problem. Users often find themselves comforted in the moment, only to feel more isolated afterward. In one analysis, heavy ChatGPT users were found to be significantly lonelier than casual users or non-users. Moreover, technology is never ‘neutral’; it reflects the norms and concerns of those who design and develop it.

This emotional dissonance was explored in Psychology Today, which found that AI companions may increase feelings of loneliness over time. By offering an idealized version of human connection—free from friction or failure—AI can set unrealistic expectations for real relationships.

Unease is growing among some users, who express concern about the emotional dependence forming around chatbots. These aren’t isolated anecdotes. Emerging behavioural patterns suggest that some users are substituting chatbot interactions for human ones, not supplementing them.

Workplace, Rewritten

This phenomenon is not confined to personal life. In professional settings, generative AI is subtly reshaping communication. Young employees, especially those in remote or hybrid roles, increasingly use ChatGPT to craft emails, prepare for performance reviews, or simulate difficult conversations. While that can reduce anxiety, it may also erode interpersonal confidence.

In the workplace, AI is linked to a rise in subtle social isolation, particularly among younger workers who rely on digital tools to navigate complex office hierarchies. The efficiency gained through AI may come at the cost of spontaneous human connection: small talk at the coffee machine is now replaced by Slack threads and auto-drafted responses.

The shift is cultural as much as technological. For a generation already facing reduced in-person interaction, reliance on AI to mediate emotional and professional communication may make authentic relationships harder to build.

The Illusion of Intimacy

As AI grows more sophisticated, so does its role in Gen Z’s emotional lives. Users are increasingly personalizing their chatbot experiences, assigning names, backstories, and emotional roles to AI systems. Replika, Character.AI, and emotionally tuned versions of ChatGPT are being used to simulate romantic partners, best friends, and therapists.

The emotional realism of these platforms can be striking, but so can their side effects. A growing body of user reports points to a pattern: people often feel more alone after engaging deeply with AI “friends”. The perfection of these interactions, in which the AI always listens and always validates, can undermine tolerance for the messiness of human relationships.

One usage analysis found that many Gen Z users are turning to ChatGPT for emotional support, even using it as a daily mental health outlet. While some report improved mood and reduced anxiety, others describe a hollow aftereffect: feeling good in the moment, but more disconnected overall.

Unregulated and Unprepared

Despite the depth of emotional reliance, there is no regulation governing how generative AI handles mental health conversations. ChatGPT and similar tools are not trained mental health professionals. They do not recognise suicidal ideation consistently, cannot ensure user safety, and are not held accountable when harm occurs.

A 2024 case in France involving a young user who received inadequate AI responses during a mental health crisis reignited debates about the ethical boundaries of AI support. With tech companies disclaiming responsibility, the regulatory vacuum is growing increasingly visible.

The legal and moral ambiguity presents serious risks. AI systems are already being treated like emotional caregivers, but without the safeguards required for such roles.

A Blurred Future

For some, especially those in mental health deserts or underserved communities, ChatGPT provides a crucial sense of connection. For others, it’s a crutch that could be weakening emotional muscles. The question isn’t whether AI should be used for emotional support—it’s whether it should become a substitute for human empathy.

AI companionship is not inherently harmful, but it demands boundaries. When Gen Z turns to AI not just for help communicating but to feel seen, heard, and validated, society must grapple with whether it is solving loneliness or simply coding it deeper into our lives.
