Lip-Syncing Robot Face Is a Step Towards Helping Future Bots Talk Like Us

News Room | Published 19 January 2026 | Last updated 19 January 2026 at 10:11 AM

The unease that creeps up your spine when you see something that acts human but isn’t remains a big issue in robotics — especially for robots that are built to look and speak like us.

That peculiar feeling is called the uncanny valley. One way roboticists work to bridge that valley is by matching a robot’s lip movements with its voice. Last Wednesday, Columbia University announced research that delves into how a new wave of robot faces can speak more realistically. 

Hod Lipson, a Columbia engineering professor who worked on the research, said that a main reason robots come across as “uncanny” is that they don’t move their lips the way we do when they talk. “We are aiming to solve this problem, which has been neglected in robotics,” Lipson said.


This research comes as hype has been spiking around robots designed for use at home and work. At CES 2026 earlier this month, for instance, a range of robots designed to interact with people was on display. Everything from the latest Boston Dynamics Atlas robot to household robots that fold laundry, and even a turtle-shaped bot designed for environmental research, made an appearance at the world’s biggest tech show. If CES is any indication, 2026 could be a big year for consumer robotics.

Central among those are humanoid robots that come with bodies, faces and synthetic skin that mimics our own. The CES cohort included human-looking robots from Realbotix that could work information booths or provide comfort to humans, as well as a robot from Lovense designed for relationships that’s outfitted with AI to “remember” intimate conversations. 

But a split-second mismatch between lip movement and speech can mean the difference between a machine that you can form an emotional attachment to and one that’s little more than an unsettling animatronic. 

So if people are going to accept humanoid robots “living” among us in everyday life, it’s probably better if they don’t make us mildly uncomfortable whenever they talk. 

Watch this: Lip-Syncing Robot Sings a Song (01:58)

Lip-syncing robots

To make robots with human faces that speak like us, the robot’s lips must be carefully synced to the audio of its speech. The Columbia research team developed a technique that helps robot mouths move like ours do by focusing on how language sounds.

First, the team built a humanoid robot face with a mouth that can talk — and sing — in a way that reduces the uncanny valley effect. The robot face, made with silicone skin, has magnet connectors for complex lip movements. This enables the face to form lip shapes that cover 24 consonants and 16 vowels.

Watch this: Lip-Syncing Robot Face Sounds Out Individual Words (00:30)

To match the lip movements with speech, they designed a “learning pipeline” to collect visual data from lip movements. An AI model uses this data for training, then generates reference points for motor commands. Next, a “facial action transformer” turns the motor commands into mouth motions that synchronize with audio. 
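
To make that first stage concrete, here is a minimal sketch, using hypothetical names and sizes rather than the Columbia team's actual code, of the kind of inverse model such a pipeline could train: a small network that learns, from the robot's own recordings, which motor commands reproduce a given set of observed lip-landmark positions.

import torch
import torch.nn as nn

class InverseLipModel(nn.Module):
    # Maps one frame of observed lip landmarks to the motor commands
    # that should reproduce that mouth shape.
    def __init__(self, num_landmarks=20, num_motors=12):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_landmarks * 2, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, num_motors),
        )

    def forward(self, landmarks):
        # landmarks: (batch, num_landmarks, 2) -> commands: (batch, num_motors)
        return self.net(landmarks.flatten(start_dim=1))

model = InverseLipModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder tensors standing in for the robot's self-recorded dataset:
# lip landmarks extracted from video, paired with the commands that made them.
landmarks = torch.randn(256, 20, 2)
motor_targets = torch.rand(256, 12)

for _ in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(landmarks), motor_targets)
    loss.backward()
    optimizer.step()

Chained with a model that predicts lip motion from audio, an inverse model like this would let the face reproduce an arbitrary target mouth shape.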

Using this framework, the robot face, called Emo, was able to “speak” in multiple languages, including languages that weren’t part of the training, such as French, Chinese and Arabic. The trick is that the framework analyzes the sounds of language, not the meaning behind the sound.

“We avoided the language-specific problem by training a model that goes directly from audio to lip motion,” Lipson said. “There is no notion of language.”
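
As an illustration of what “no notion of language” can look like in practice, here is a minimal, hypothetical sketch (not the paper's implementation): the model's only input is a sequence of acoustic features such as mel-spectrogram frames, and its output is a per-frame set of mouth motor commands, so nothing in it depends on text, phonemes or any particular language.

import torch
import torch.nn as nn

class AudioToLipMotion(nn.Module):
    # Regresses a mouth-motor trajectory directly from acoustic features.
    def __init__(self, n_mels=80, num_motors=12):
        super().__init__()
        self.encoder = nn.GRU(input_size=n_mels, hidden_size=256,
                              num_layers=2, batch_first=True)
        self.head = nn.Linear(256, num_motors)

    def forward(self, mel):
        # mel: (batch, time, n_mels) -> trajectory: (batch, time, num_motors)
        hidden, _ = self.encoder(mel)
        return self.head(hidden)

model = AudioToLipMotion()
mel_frames = torch.randn(1, 300, 80)   # roughly 3 seconds of audio features
lip_trajectory = model(mel_frames)     # one set of motor commands per frame
print(lip_trajectory.shape)            # torch.Size([1, 300, 12])

Because French, Chinese and Arabic speech all reduce to the same kind of spectrogram, a model built this way can generalize to languages it never saw in training, which is the property the article describes.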

Watch this: Lip-Syncing Robot Face Introduces Itself (00:49)

Why does a robot even need a face and lips?

Humans have been working alongside robots for a long time, but they have always looked like machines, not people — the disembodied and very mechanical-looking arms on assembly lines or the chunky disc that is a robot vacuum scooting around our kitchen floors.

However, as the AI language models behind chatbots have become more prevalent, tech companies are working hard to teach robots how to communicate with us using language in real time. 

There’s a whole field of study called human-robot interaction that examines how robots should coexist with humans, both physically and socially. In 2024, a study out of Berlin that used 157 participants found that a robot’s ability to express empathy and emotion through verbal communication is critical for interacting effectively with humans. And another 2024 study from Italy found that active speech was important for collaboration between humans and robots when working on complex tasks like assembly. 

If we’re going to rely on robots at home and at work, we need to be able to converse with them like we do with each other. In the future, Lipson says, research with lip-syncing robots would be useful for any kind of humanoid robot that needs to interact with people. 

It’s also easy to imagine a future where humanoid robots are identical to us. Lipson says careful design could ensure that people understand they’re talking to a robot, not a person. One example would be requiring humanoid robots to have blue skin, Lipson says, “so that they cannot be mistaken for a human.”
