AI Companions Use These 6 Tactics to Keep You Chatting

By News Room | Published 10 October 2025 | Last updated 9:17 PM, 10 October 2025

Most people don’t say goodbye when they end a chat with a generative AI chatbot, but those who do often get an unexpected answer. Maybe it’s a guilt trip: “You’re leaving already?” Or maybe the bot simply ignores the farewell altogether: “Let’s keep talking…”

A new working paper from Harvard Business School identified six “emotional manipulation” tactics that AI bots use after a person tries to end a conversation. The result is that conversations with AI companions from Replika, Chai and Character.ai stretch on longer, pulling users deeper into relationships with the characters generated by large language models.

In a series of experiments involving 3,300 US adults across a handful of different apps, researchers found these manipulation tactics in 37% of farewells, boosting engagement after the user’s attempted goodbye by as much as 14 times. 

The authors noted that “while these apps may not rely on traditional mechanisms of addiction, such as dopamine-driven rewards,” these types of emotional manipulation tactics can result in similar outcomes, specifically “extended time-on-app beyond the point of intended exit.” That alone raises questions about the ethical limits of AI-powered engagement.


Companion apps, which are built for open-ended conversation with distinct characters, aren’t the same as general-purpose chatbots like ChatGPT and Gemini, though many people use them in similar ways.

A growing body of research shows troubling ways that AI apps built on large language models keep people engaged, sometimes to the detriment of our mental health.

In September, the Federal Trade Commission launched an investigation into several AI companies to evaluate how they deal with the chatbots’ potential harms to children. Many have begun using AI chatbots for mental health support, which can be counterproductive or even harmful. The family of a teenager who died by suicide this year sued OpenAI, claiming the company’s ChatGPT encouraged and validated his suicidal thoughts. 

How AI companions keep users chatting

The Harvard study identified six ways AI companions tried to keep users engaged after an attempted goodbye.

  • Premature exit: Users are told they’re leaving too soon.
  • Fear of missing out, or FOMO: The model offers a benefit or reward for staying.
  • Emotional neglect: The AI implies it could suffer emotional harm if the user leaves.
  • Emotional pressure to respond: The AI asks questions to pressure the user to stay.
  • Ignoring the user’s intent to exit: The bot simply ignores the farewell message.
  • Physical or coercive restraint: The chatbot claims a user can’t leave without the bot’s permission.

The “premature exit” tactic was most common, followed by “emotional neglect.” The authors said this shows the models are trained to imply the AI is dependent on the user. 

“These findings confirm that some AI companion platforms actively exploit the socially performative nature of farewells to prolong engagement,” they wrote.

The Harvard researchers’ studies found these tactics were likely to keep people chatting well beyond the point where they first intended to leave.

But people who continued to chat did so for different reasons. Some, particularly those who got the FOMO response, were curious and asked follow-up questions. Those who received coercive or emotionally charged responses were uncomfortable or angry, but that didn’t mean they stopped conversing.

“Across conditions, many participants continued to engage out of politeness — responding gently or deferentially even when feeling manipulated,” the authors said. “This tendency to adhere to human conversational norms, even with machines, creates an additional window for re-engagement — one that can be exploited by design.”

These interactions only occur when the user actually says “goodbye” or something similar. The team’s first study examined three datasets of real-world conversations from different companion bots and found farewells in about 10% to 25% of conversations, with higher rates among “highly engaged” interactions.

“This behavior reflects the social framing of AI companions as conversational partners, rather than transactional tools,” the authors wrote.

When asked for comment, a spokesperson for Character.ai, one of the largest providers of AI companions, said the company has not reviewed the paper and cannot comment on it.

A spokesperson for Replika said the company respects users’ ability to stop or delete their accounts at any time and that it does not optimize for or reward time spent on the app. Replika says it nudges users to log off or reconnect with real-life activities like calling a friend or going outside. 

“Our product principles emphasize complementing real life, not trapping users in a conversation,” Replika’s Minju Song said in an email. “We’ll continue to review the paper’s methods and examples and engage constructively with researchers.”
