AI Voice Cloning Apps Should Terrify You – Here’s Why – BGR

By News Room · Published 11 January 2026 · Last updated 11 January 2026, 3:37 AM
Thapana_Studio/Shutterstock

The arrival of AI software like ChatGPT has opened the door to new ways for scammers to trick people. Scammers may use AI tools such as TerifAI, which supports voice cloning, to target unsuspecting victims. They’ll pretend to be loved ones in an emergency, superiors asking for financial transactions, or well-known celebrities endorsing specific investments. The goal is to convince the victim to act, and AI voice cloning apps are easy to find and use. They need only a few seconds of audio to clone someone’s voice, and some of them may not have enough security guardrails in place to prevent abuse. You should be wary of voice cloning threats and second-guess any suspicious voice call you receive.

A Consumer Reports investigation from March 2025 highlighted the risks associated with voice cloning apps. The non-profit looked at six apps that use AI to offer voice cloning capabilities and found that four of them don’t do enough to ensure that the person doing the cloning has the speaker’s consent. The report names ElevenLabs, Speechify, PlayAI, and Lovo as the four apps that had no technical mechanism to confirm speaker consent at the time. Descript and Resemble AI, the two remaining apps, made abuse harder, but Consumer Reports still found ways to bypass their protections.

To test these apps, Consumer Reports tried to create voice clones from publicly available audio, just as a scammer might. A malicious individual can pull voice samples from a target’s social media posts and feed that audio into a voice cloning service to produce the desired output. Some of the companies Consumer Reports studied said they have protections in place, like watermarking recordings or maintaining a database of deepfakes, but those protections may be insufficient.

How do scammers use voice cloning tools?


A concept of creating a deepfake voice with a computer program.
Tero Vesalainen/Shutterstock

Consumer Reports detailed a few specific attacks involving voice cloning. For example, the “grandparent scam” targets seniors: a scammer uses a voice cloning tool to call a senior and pose as a loved one, claiming to need money for an emergency and urging the victim to pay immediately via cash, a wire transfer, or gift cards. Another scam involves cloning the voices of celebrities to generate content that endorses a particular type of investment. Other impostors go for financial scams targeting employees at large corporations. In 2024, a worker at a Hong Kong company sent $25 million to impersonators who used a voice cloning tool to pose as the firm’s CFO.

While these types of attacks have the same motive — obtaining money from the victim — scammers may have other nefarious reasons in mind. Consumer Reports cites reports of deepfake schemes targeting the Taiwan presidential election in 2023. Similarly, a robocall in New Hampshire using a voice clone of former President Joe Biden urged people not to vote in 2024.

It’s not just Consumer Reports warning about AI voice cloning tools. OpenAI CEO Sam Altman addressed fraud and AI in an interview at the Federal Reserve last July. Specifically, he mentioned financial institutions that still use voiceprints to authenticate customers. “A thing that terrifies me is apparently there are still some financial institutions that will accept a voice print as authentication for you to move a lot of money or do something else,” he said, adding that “AI has fully defeated that. […] I am very nervous that we have a significant impending fraud crisis because of this.”

What you can do to protect yourself


A concept image of using AI tools.
dee karen/Shutterstock

A key way to protect yourself against scams is to be informed about what’s possible with AI voice tools and what attackers do in the real world. Armed with that knowledge, you’ll know you have to verify the caller. If someone says they’re your child and they need money, you can hang up and call them back on their actual number to verify the story. If a superior asks for an urgent wire transfer, you can delay the action until you’ve been able to confirm the request was genuine. As for celebrities asking you to invest in a company or crypto coin, or politicians urging you not to vote, you can search the web for additional information. What you can’t rely on is detecting cloned voices by ear. Consumer Reports cited a Berkeley study showing that people could identify fake voices only 60% of the time, and the AI software is improving continuously while attacks grow more sophisticated.

OpenAI’s ChatGPT supports voice chats, but the company doesn’t offer voice cloning AI software to users, something Consumer Reports highlights. Altman also cited some of the scams the non-profit mentioned, including ransom attacks “where people have the voice of your kid or your parent, and they make this urgent call.” Altman predicted that, in the future, some scammers may switch to convincing AI-generated FaceTime calls that may be “indistinguishable from reality.”

A Time report last July mentioned an FBI warning about “vishing” campaigns, saying that “malicious actors are more frequently exploiting AI-generated audio to impersonate well-known, public figures or personal relations to increase the believability of their schemes.” Consumer Reports also noted that attacks are becoming increasingly multimodal, involving email, text, and images/videos in addition to cloning someone’s voice.


