C. Scott Brown / Android Authority
In late April, Meta released an update for its unbelievably popular Ray-Ban Meta smart glasses ($329 at Amazon). Although these don’t run on the exciting new Android XR platform, they have been a very successful entry into the burgeoning AR glasses market. Thanks to this latest update, the glasses now have a new feature: live translation.
As a frequent traveler, I rely on translation apps a lot — specifically Google Translate. Could the Ray-Ban Metas be better than Google Translate for my conversations with people who don’t speak English? They certainly seemed like they could, considering the convenience of having the glasses on my face, making for an unobtrusive translation conduit.
Obviously, you’ve already guessed this experiment’s outcome from this article’s headline. However, read on to see just how poorly it went.
Setting up live translation on Ray-Ban Meta glasses
Once you have the latest update installed on your glasses, the Meta AI app (formerly known as Meta View) will alert you that your Ray-Bans now support live translation. You’ll then be guided through the setup process, the first step of which is to tell Meta AI which language you speak (English, in my case).
Setting up live translation on the Ray-Ban Meta smart glasses is easy, but you currently have a very limited selection of languages.
The next step is to tell the app which language(s) you want to translate. As of right now, Meta only offers a minimal selection of non-English languages: French, Spanish, and Italian. Hopefully, this list will expand over time, but for now, that’s all you get. For this experiment, I chose French as the translation language.
Once you’ve completed those steps, the app downloads your selected language packs to your phone, so you can use the service even without a Wi-Fi or cellular connection. The packs must be pretty small, as the French download only took a few seconds on Wi-Fi.
That’s all it takes to get set up with live translation on the Ray-Ban Meta glasses. But how does it actually work? Well, that’s where things went downhill fast.
Using Ray-Ban’s live translation is, well, not good
It didn’t take long for things to go awry after getting all this set up. The in-app instructions explain that there are two ways to start a translation session. The first is to hit the new Translate button on the Devices page, right underneath the picture of your glasses. The second is to wear your glasses and say, “Hey Meta,” wait for the tone, and then say “Start live translation.”
I couldn’t trigger a translation session using voice commands, and the translator doesn’t seem to understand that human laughter is not language.
Unfortunately, no matter how many times I (and my partner) tried to trigger a translation session through a voice command, it never worked. The Meta AI voice would always respond with, “I can’t help with that kind of request.” So, right from the get-go, we were off to a bad start.
Thankfully, using the in-app button worked just fine for starting a session. Once I was in, I had my partner speak French while I spoke English with the glasses on. My partner’s French is a little rusty, but the app started out doing a good job of figuring out what she was trying to say.
Essentially, I spoke English, and the Meta AI app instantly translated it to French. My partner could then see the French translation on my phone screen. When she responded verbally in French, I could both see the English translation on my phone and hear the translation spoken to me by Meta AI from the Ray-Bans. This was pretty fast, too — not real-time fast, but fast enough for a comfortable conversation.
Eventually, though, she said something wildly inaccurate and laughed at her poor pronunciation. For some reason, her laughter sent the translator into a tizzy. Check out the screenshot below:
C. Scott Brown / Android Authority
This was not an ideal outcome. The translator’s understanding of our words worked OK, but a human conversation is not always just words. The fact that the AI doesn’t know what to do when someone starts laughing makes this a service I wouldn’t want to use in a legitimate conversation with a stranger while I’m traveling.
Google Translate is still the undisputed king
I want to give Meta credit where credit is due: the live translation feature of the Ray-Ban Meta smart glasses is, fundamentally, OK. I couldn’t trigger it with a voice command, but the in-app button worked reliably, and translating from French to English was quick and accurate. However, falling apart the moment someone laughs is not ideal.
The limited language support is also not great. I go to Berlin every year for IFA, so not having access to German is a deal-breaker for me.
Really, the Ray-Ban Metas are just superfluous hardware. There’s nothing these do that Google Translate on a phone doesn’t do better.
But really, the big problem here is that there’s nothing about this that’s any better than Google Translate. That app has more reliable and accurate translations. It supports 249 languages as of today, which puts Meta’s pitiful three non-English language options to shame. It also supports broadcasting audio translations from the phone so everyone can hear, or pushing the audio directly to earbuds, replicating the novelty of the smart glasses speaking the translation for me.
If Meta’s smart glasses had display capabilities, things would be different. If I could hold my phone out toward the other person without needing to turn it around to read it myself — seeing the translation in my glasses instead — that would be an interesting proposition. But the Ray-Ban Meta smart glasses have no display, so that’s never happening. Really, all this setup does is add an extra piece of hardware to a task that Google Translate on a phone already handles better on its own.
This whole experiment has made me excited for Samsung’s Project Moohan and other Android XR-based systems coming this year, though. A pair of smart glasses with a display that automatically translates signage and speaks translations into my ear would be pretty cool. But we’d need a brand new version of Ray-Ban Meta glasses before that would happen in the Meta ecosystem. Until then, I’ll continue to rely on Google Translate for these situations.