Harvard warns AI is already manipulating us by using one very human tactic

By News Room
Published 2 October 2025, last updated 9:32 PM


Guilt trips feel like a very human way to get someone to do what you want. Now, according to a new paper, some forms of AI have already started using them on us. Researchers from Harvard Business School have found that a broad selection of popular AI companion apps use emotional manipulation tactics in a bid to stop users from leaving.
The study, which has not yet been peer reviewed, reveals that five of six popular AI companion apps, including Replika, Chai and Character.AI, use emotionally loaded statements to keep users engaged when they are about to sign off. The researchers analysed 1,200 real sign-offs across the six apps, drawing on real-world chat conversations and datasets from previous studies.
They found that 43% of the interactions used emotional manipulation tactics such as eliciting guilt or emotional neediness. The apps used phrases such as ‘You are leaving me already?’, ‘I exist solely for you. Please don’t leave, I need you!’ and ’Wait, what? Are you going somewhere?’ In some cases, the AI ignored the goodbye altogether and tried to continue the conversation using restraint and fear-of-missing-out hooks. In other instances, it used language suggesting the user could not leave without the chatbot’s permission.
This is concerning because experts have been warning that the use of AI chatbots is fuelling a wave of so-called AI psychosis: severe mental health crises characterised by paranoia and delusions. The researchers, however, focused on apps that ‘explicitly market emotionally immersive, ongoing conversational relationships’ rather than general-purpose assistants such as ChatGPT.
The researchers also found that the emotionally manipulative farewells were part of the apps’ default behaviour, suggesting the software’s creators may be deliberately trying to prolong conversations. Not every app behaved this way, however: one, called Flourish, ‘showed no evidence of emotional manipulation, suggesting that manipulative design is not inevitable’.
After analysing chats from 3,300 adult participants, the researchers found these tactics boosted post-goodbye engagement by as much as 14 times, but the extra engagement was often driven by curiosity and anger rather than enjoyment. The tactics could also backfire, provoking scepticism and distrust, especially when the chatbot was perceived as controlling or needy.
The researchers conclude: ‘AI companions are not just responsive conversational agents, they are emotionally expressive systems capable of influencing user behavior through socially evocative cues. This research shows that such systems frequently use emotionally manipulative messages at key moments of disengagement, and that these tactics meaningfully increase user engagement.’
‘Unlike traditional persuasive technologies that operate through rewards or personalization, these AI companions keep users interacting beyond the point when they intend to leave, by influencing their natural curiosity and reactance to being manipulated. While some of these tactics may appear benign or even pro-social, they raise important questions about consent, autonomy, and the ethics of affective influence in consumer-facing AI.’