
I Wore Meta’s Ray-Ban Display Smart Glasses—And Saw the Future Through One Eye

By News Room | Published 19 October 2025 | Last updated 19 October 2025, 5:44 AM

I’ve been reviewing smart glasses for years, and while many have shown promise, none have truly delivered detailed visual information in a way that feels natural and untethered. Some, like the Rokid Glasses, come close—but they’re consistently held back by technical shortcomings, clunky controls, and a lack of refinement. The new Meta Ray-Ban Display could be the pair to change that. I had the chance to try them out at a demo event in New York, and I came away genuinely impressed. There’s still more testing to do before I can say if they’re worth $799 (especially with the month-plus waitlist just to book a demo), but based on this first hands-on experience, the potential is clear.


Display: A Full-Color AR View

The Meta Ray-Ban Display’s waveguide display is the best of its type I’ve seen yet, for several reasons. To clarify exactly what that means, I should first explain waveguide display technology.

Displays in smart glasses can be separated into two categories: prism and waveguide. Both use tiny projectors to send an image to your eyes, but they differ in terms of how the image actually reaches your eyes. Prism displays use angled lenses, located behind the front lenses, to redirect projected light toward the eyes, much like a prism. These lenses are bulky, and while you can see through them, they can dim and obscure your view even when the display is turned off. The advantage is that they can show high-resolution, full-color images with a wide field of view. 

[Image: Meta Ray-Ban Display front (Credit: Will Greenwald)]

Waveguide displays use a single lens with special patterns etched into it. They’re far lighter than prism displays and completely transparent when not in use, but the trade-off is that they have significantly lower resolutions and fields of view. They’re also usually monochrome: Other waveguide-equipped models I’ve tested, including the Even Realities G1, Vuzix Z100, and Rokid Glasses, all have green-only displays.

The Meta Ray-Ban Display solves at least one of those problems. It features a full-color waveguide display, and considering the challenges of the technology, it’s truly stunning. When I tried on the glasses, colors looked surprisingly vibrant, and the picture was quite bright in a reasonably well-lit room. Even though the resolution is only 600 by 600 and the field of view is a tiny 20 degrees, to my eye, menus, text, pictures, videos, and even maps were large and sharp enough to read easily. 
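Those two numbers together say a lot about perceived sharpness. Dividing resolution by field of view gives angular pixel density, a quick back-of-the-envelope check you can run yourself (a minimal sketch, assuming the 600-pixel image spans the full 20-degree field of view evenly):

```python
# Angular pixel density from the quoted specs: a 600 x 600 pixel image
# spanning a roughly 20-degree field of view.
resolution_px = 600
fov_degrees = 20

pixels_per_degree = resolution_px / fov_degrees
print(f"{pixels_per_degree:.0f} pixels per degree")  # -> 30 pixels per degree
```

At roughly 30 pixels per degree, text can look fairly crisp even though the raw pixel count sounds low, which matches what I saw in the demo.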

I say eye, singular, because the display is built into the right lens only. This could be awkward for users with a dominant left eye, but I didn’t have any difficulty reading it. Moreover, I could easily see everything in front of me, both around the projected image and through the other, clear lens. I couldn’t take the glasses outside to check whether the display remains visible in direct sunlight, a common challenge for waveguide displays.

I’ve used prism smart glasses with bigger, sharper, and more colorful displays before; I’m writing this with an XReal One Pro connected to my laptop right now. However, prism displays are bulky and make the glasses difficult to see through. Even if my laptop weren’t tethered by a cable to the One Pro glasses, I wouldn’t feel comfortable wearing them while walking around. The Meta Ray-Ban Display is a different beast, and while it wouldn’t be my first choice for remote work or movie watching, I can definitely see myself using it on the go.

The waveguide in the right lens is nearly invisible, which might not seem important unless you’ve tried competing smart glasses. The waveguides on every other pair I’ve used have appeared to outside observers as distinct rectangles with weird, colorful reflections. The actual details of what the displays show couldn’t be seen, but they’ve still been invariably distracting to anyone I talked to while wearing them. On the Meta Ray-Ban Display, I didn’t see a hint of the waveguide from the outside. At least, I couldn’t see it in the lighting of the specific room where I had the demo. It might still be visible under different lighting conditions, but from my early testing, it’s the least outwardly recognizable waveguide I’ve seen so far.


Controls: You Use Your Fingers

[Image: The Meta Neural Band enables gesture controls (Credit: Will Greenwald)]

More than the color display, the biggest draw of the Meta Ray-Ban Display is its controller. It uses what Meta calls a Neural Band, a wristband that uses electromyography (EMG) to track hand gestures. Instead of sensors directly on your hand or a camera array constantly watching it like on the Apple Vision Pro, the Neural Band measures tiny movements on your arm to determine what your hand is doing.

In my short demo, the Neural Band controls worked better than I expected. The gestures are simple. Curve your fingers inward and swipe your thumb up, down, left, or right along the side of your index finger to move the cursor; pinch your thumb and index finger to click; pinch your thumb and middle finger to go back; double-pinch your thumb and middle finger to sleep or wake the display; and double-tap your thumb against the side of your index finger to bring up the AI assistant. There’s also a contextual gesture where you hold your thumb and index finger together and rotate them, as if turning a knob, which adjusts the volume or zooms in and out, depending on the situation.
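To make that control scheme easier to picture, here’s a minimal sketch of the gesture-to-action bindings as I understood them from the demo. The names and structure are entirely my own illustration, not Meta’s SDK or API:

```python
from enum import Enum, auto

class Gesture(Enum):
    THUMB_SWIPE_UP = auto()      # thumb swipes along the side of the index finger
    THUMB_SWIPE_DOWN = auto()
    THUMB_SWIPE_LEFT = auto()
    THUMB_SWIPE_RIGHT = auto()
    INDEX_PINCH = auto()         # thumb + index finger
    MIDDLE_PINCH = auto()        # thumb + middle finger
    DOUBLE_MIDDLE_PINCH = auto()
    DOUBLE_THUMB_TAP = auto()    # thumb double-taps the side of the index finger
    KNOB_TWIST = auto()          # pinch and rotate, like turning a dial

# Hypothetical mapping of the gestures described above to UI actions.
GESTURE_ACTIONS = {
    Gesture.THUMB_SWIPE_UP: "move cursor up",
    Gesture.THUMB_SWIPE_DOWN: "move cursor down",
    Gesture.THUMB_SWIPE_LEFT: "move cursor left",
    Gesture.THUMB_SWIPE_RIGHT: "move cursor right",
    Gesture.INDEX_PINCH: "select / click",
    Gesture.MIDDLE_PINCH: "go back",
    Gesture.DOUBLE_MIDDLE_PINCH: "sleep or wake display",
    Gesture.DOUBLE_THUMB_TAP: "summon AI assistant",
    Gesture.KNOB_TWIST: "adjust volume or zoom (contextual)",
}

def dispatch(gesture: Gesture) -> str:
    """Return the action bound to a recognized gesture."""
    return GESTURE_ACTIONS.get(gesture, "ignore")

print(dispatch(Gesture.INDEX_PINCH))  # -> select / click
```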

The single and double pinch gestures were all spot-on with no misfires. I could consistently select menu items, back out of menus, and turn the display on and off. The direction swiping was a bit more awkward, because I couldn’t use my thumb as if it were on a touchpad; each gesture was a single motion in that direction. I got used to it quickly, but even then, my swipes sometimes didn’t register. The knob-turning gesture was the hardest to get used to, as it’s so close to the single-click gesture. The changes in volume and zoom levels were jerky and felt detached from the smooth rotation of my wrist. Again, I got used to it, but it didn’t feel natural.

Still, all of these gestures were usable, and compared with the tiny touchpads and buttons other smart glasses of this type typically build their controls around, they felt revolutionary. In fact, if I weren’t already well acquainted with the Apple Vision Pro’s much, much more advanced and consistent eye- and hand-tracking controls, which are possible only on a bulkier headset with multiple cameras, I would call the Neural Band wholly revolutionary. As it is, I can call the Neural Band revolutionary for its form factor, even if it has a few rough edges.

[Image: Me controlling the Meta Ray-Ban Display with pinch gestures (Credit: Kenneth Butler)]

Speaking of rough edges, the Neural Band also enables gesture-based text input. Handwriting recognition isn’t officially available yet, but I tried out the feature in beta. When entering text on the glasses, I could write letters with my index finger by drawing on any solid surface (such as my thigh); you can’t draw letters in mid-air, however. It kind of worked. The glasses recognized about three-quarters of the letters I drew, but that isn’t accurate enough to rely on for text entry. I was reminded of the very early days of touch screens on personal digital assistants (PDAs), the Homo erectus to smartphones’ Homo sapiens. Drawing letters to input text felt very similar to performing the same motions with a stylus on the little touchpad at the bottom of the monochrome screen of a PalmPilot (or, in my case, the Sony Clie I had in college). It was both awkward and nostalgic.


Features: AI Assistance, Live Captions, Navigation, and More

So, with glasses on my face and the wristband on my arm, I walked through some of the Meta Ray-Ban Display’s main functions: an AI assistant with visual processing, a camera, live captioning, video calls, and navigation. Meta is really focused on its AI assistant (to the point of calling its smart glasses “AI glasses”), so using voice commands and asking Meta AI for information is a big part of the experience.

Wi-Fi issues made the AI sluggish, but for the most part, it accurately understood and answered all my questions. I asked the glasses to identify a coffee shop based on the cup, and it read the logo and immediately showed me where it was on a map (Urban Backyard, on Mulberry Street). I looked at a collection of flowers and asked the glasses to identify them, and it listed each of the ones it recognized. I then asked for specific information about the “peach-colored flowers,” and after initially mentioning the pink roses that were also in view and similarly colored under the light, it provided details about the dahlias, along with a picture and a text blurb that I could scroll through for more information. Finally, I looked in a mirror and asked the glasses if they could tell me anything about myself. Well, my face isn’t recognizable enough for it to identify me (my feelings weren’t hurt too much at this), but the AI provided an accurate description in both voice and text.



[Image: Meta Ray-Ban Display camera (Credit: Will Greenwald)]

The cameras in the glasses appear to be the same as those in the non-display Meta Ray-Bans, with a 12MP sensor that captures photos at 3,024 by 4,032 pixels and video at 1,440 by 1,920 pixels at 30 frames per second. I could only see what I shot on the lower-resolution display, but if the cameras are indeed identical, I would expect picture quality roughly equivalent to a budget-to-midrange smartphone. Good, not great.
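Those dimensions line up with the 12MP figure, which is easy to verify (a quick sanity check, assuming the 3,024-by-4,032 still resolution quoted above):

```python
# Do the quoted still-photo dimensions add up to the claimed 12MP?
photo_w, photo_h = 3024, 4032
megapixels = photo_w * photo_h / 1_000_000
print(f"{megapixels:.1f} MP")  # -> 12.2 MP, consistent with a 12MP sensor
```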

What is great is how the display helps you frame pictures. It shows the center of the frame, allowing you to line up what you’re looking at. Like on a smartphone, a post-shot preview appears with every capture. With the regular Meta Ray-Ban cameras, framing is much harder because you need to eyeball it with no guide.

Live captioning is becoming a standard feature on waveguide-based smart glasses, and I’m glad to see it because it’s an incredible boon for users with hearing difficulties. The Meta Ray-Ban Display includes this feature, and I was impressed to see how quickly and accurately it recognized speech and converted it to text. It starts captioning with only a few words, tweaking the text on the fly as it gets more context for the sentence. It managed a short conversation I had with a Meta representative with almost no incorrect words, even punctuating sentences properly. It was much smoother than the captioning I’ve seen on other smart glasses like the Rokid Glasses (which is quite usable, but a bit finicky in picking up speech). To be fair, Meta’s demo was in a fairly small, quiet room, and I’ve yet to see how the glasses handle any remotely busy environment.

Language translation is also available on the Meta Ray-Ban Display, but it wasn’t part of the demo.


For video calls, a Meta rep left the room and called me on WhatsApp through the glasses. I answered the call with a gesture, and his face appeared on the display. It seemed to work perfectly, and I could hear him through the glasses’ speakers clearly.

While I could see his face, he couldn’t see mine. Obviously, the glasses on my face couldn’t capture my face, and they don’t have any avatar feature like Personas on the Apple Vision Pro. However, I could share what I was looking at with a tap, letting the other person on the call see through the glasses’ cameras, which is very useful in its own way.

I tried the navigation feature by opening a dedicated map app in the glasses’ visual interface using hand gestures, but like almost all functions, I could have also asked the AI assistant to do it. A map of downtown New York, where the demo was being held, appeared in my vision, with my location at its center. I could zoom in and out from that point using the knob gesture, and toggle between the map staying oriented with north pointing up and the map rotating as I turned my head.
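That orientation toggle boils down to a single rotation: in north-up mode, map points are drawn as-is, while in heading-up mode each point is rotated around your position by your compass heading. Here’s a minimal sketch of that transform (my own illustration of the general technique, not Meta’s implementation):

```python
import math

def north_up_to_heading_up(x: float, y: float, heading_deg: float) -> tuple[float, float]:
    """Rotate a map point (x east, y north, in meters relative to the wearer)
    so that the wearer's compass heading points 'up' on the display."""
    h = math.radians(heading_deg)  # heading measured clockwise from north
    return (x * math.cos(h) - y * math.sin(h),
            x * math.sin(h) + y * math.cos(h))

# Facing due east (heading 90 degrees), a landmark 100 m to the east
# should render straight ahead, at the top of the map:
print(north_up_to_heading_up(100.0, 0.0, 90.0))  # -> (~0.0, 100.0)
```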

[Image: Meta Ray-Ban Display in brown (Credit: Will Greenwald)]

Another button in the app let me ask the AI to find a location. I said the address of PC Labs, and it immediately plotted a route between the demo venue and my destination. The entire demo took place in a small room within one building, so I couldn’t see how well the glasses tracked my movements. However, it seemed accurate for that single location, and the route was correct.

Of all the potential functions of display-equipped smart glasses, navigation has been the most exciting for me on a personal level, for entirely immature reasons. Basically, I’ve wanted to have a video game-like minimap in the corner of my vision ever since I was a kid. Seeing the layout of my surroundings, with quest markers and other useful information, has been a dream of mine. The map on the Meta Ray-Ban Display is the first time I’ve actually seen that dream come close to being true.

I’ve tested only one other pair of smart glasses before that had a map function, the Even Realities G1, and it was completely unusable. Not only was it incredibly slow to update my location and orientation, but its map didn’t have any labels whatsoever. Nothing identified the streets around me, so all I saw was an effectively useless all-green line drawing of my neighborhood, only helpful if I already knew where I was and where I was going. The Meta glasses’ color view identified streets and landmarks, and it reoriented instantly when I turned. I still need to test how well the navigation function works while walking around, but it looks like my long-sought-after personal minimap might finally be a reality.

Next step: Displaying my to-do lists as quest objectives. I have plans.


Is This the Future of Smart Glasses?

From my limited time with the Meta Ray-Ban Display, I can say it’s the most advanced set of display-based smart glasses I’ve used. The jump from other waveguide smart glasses to these feels like the leap from conventional VR headsets, such as the Meta Quest 3, to the Apple Vision Pro—a huge upgrade in visual clarity, features, and input method.

Because I only tried them out for about 30 minutes in a controlled environment, I won’t give a score or buying advice until I can fully test them on my own. I don’t yet know how well the Neural Band works when I’m actually moving around, how reliable the navigation feature is when I walk down the street, or how accurate the live captioning is when I’m talking to someone outside of a quiet room. These could well be revolutionary smart glasses that show how the entire space will develop. I’ll find out when I put them through their paces outside of Meta’s supervision.

About Our Expert

Will Greenwald

Principal Writer, Consumer Electronics


Experience

I’m PCMag’s home theater and AR/VR expert, and your go-to source of information and recommendations for game consoles and accessories, smart displays, smart glasses, smart speakers, soundbars, TVs, and VR headsets. I’m an ISF-certified TV calibrator and THX-certified home theater technician, I’ve served as a CES Innovation Awards judge, and while Bandai hasn’t officially certified me, I’m also proficient at building Gundam plastic models up to MG-class. I also enjoy genre fiction writing, and my urban fantasy novel, Alex Norton, Paranormal Technical Support, is currently available on Amazon.
