The Meta Ray-Ban Display is built around Meta’s services and, wherever possible, only Meta’s services. The AI is Meta AI, the messaging apps beyond phone text messages are Messenger and WhatsApp, and the photo-sharing app is Instagram; those are the only choices you get, and Meta owns them all. You can’t talk to Gemini, message over Discord or Slack, or post photos to Bluesky. You can connect your Amazon Music, Shazam, and Spotify accounts, but that’s probably only because Meta doesn’t have a music streaming service of its own. Even without them, you can at least control any audio playing on your phone through the glasses’ Music app, as if it were a widget on your lock screen.
Notifications are the biggest issue. Text messages and voice calls through your phone are supported, and you’ll get notifications for them. But those, and messages from Meta apps, are the only notifications the glasses will show. Unlike every other pair of waveguide smart glasses I’ve tested, these won’t read your phone’s push notifications.
In addition, all of the Meta apps, including Instagram, are primarily for communication, not for browsing your social feeds. The Instagram app only brings you to your messages, so you can’t browse stories, and Facebook isn’t on the glasses at all.
Calendar support is also limited. There’s no dedicated calendar app, so you have to ask Meta AI what your appointments are. You can link a Google or Outlook calendar, but not a work account with any kind of managed IT security. And you have to ask out loud every time you want to check your next meeting.
I had initially planned to take the glasses to CES and write an account of covering the show with them. I didn’t, because the inability to see incoming Slack messages meant I couldn’t use them to keep up with coverage discussions, and since my Google-based work calendar is protected by IT policies, I couldn’t ask Meta AI when or where I needed to be for my many appointments. Simply supporting push notifications from third-party phone apps would have solved both problems. That feature is available on the Even Realities G2, which I ended up taking to CES instead.
If you’re a regular Meta user and the software limitations don’t bother you, the Meta Ray-Ban Display executes its core functions quite well. Closed captions are quick and accurate most of the time, and the text is easy to read. Like all AI-powered voice transcription, it depends on good sound quality, so it can make mistakes when speech isn’t completely clear or there’s significant background noise, but even then, it’s still very usable.
Translation is also effective, within its very limited scope. I watched some Spanish-language soccer programming on my TV, and the glasses translated it into English with surprising accuracy. They can likely do the same with French and Italian, but those are the only options, which is paltry next to the Even G2 (31 languages) and the Rokid Glasses (89 languages). There’s no Chinese, German, Japanese, Korean, Portuguese, or Vietnamese. The camera can visually translate other languages, but voice translation is limited to those three.
You also have to commit to one language at a time. The translation function on the glasses doesn’t offer a language picker; it relies on the phone app to load a single language pack, a process that can take half a minute.

These are the first smart glasses I’ve used whose navigation feature is genuinely useful and provides a readable map. Opening the Maps app on the glasses pops up a large, easily understandable map of your location. Only a few major streets are labeled, but notable nearby locations, like movie theaters, appear as pins, and you can use the knob-turn gesture to zoom in for additional landmarks. From this view, you can dictate a search for a location or tap buttons for nearby cafes, restaurants, parks, or attractions. Selecting a destination displays the route as a blue line, and from there you can start navigation, send the location to your phone, or, if it’s a business with a phone number, call it. I found the navigation direct and accurate, with the map tracking my location and orientation as it gave me turn-by-turn directions.
The Music app is simple, with only track-forward, track-back, and play/pause buttons, plus a tile showing your position in the track. You’ll also see album art if it’s available and the app is compatible; no art came through from Pocket Casts or YouTube Music on my Android phone, even though both show album art and podcast icons on the phone itself. As mentioned, the app’s lock-screen-widget-like universality means it can control any audio playing from your phone. It isn’t as convenient as a phone widget, though, because it only shows those controls when the app is open on the display. An icon in the quick settings menu shows the current track, but to do anything with it, you have to tap it to open the app first. A glanceable widget on the glasses’ central tab would have been really helpful here, rather than requiring you to open the app’s full view.

You can play/pause and skip tracks with single and double taps on the glasses’ touch strip, but that’s the extent of the audio gesture support. The Neural Band doesn’t give you any audio controls, or even a shortcut to quickly bring up the Music app. This is baffling, because the Meta AI app lets you assign the double-tap gesture to “your favorite feature,” yet the only options are the default Meta AI activation or disabling it completely.
